NASA Technical Reports Server (NTRS)
Schwarzenberg, M.; Pippia, P.; Meloni, M. A.; Cossu, G.; Cogoli-Greuter, M.; Cogoli, A.
1998-01-01
The purpose of this paper is to present the results obtained in our laboratory with both instruments, the FFM [free fall machine] and the RPM [random positioning machine], to compare them with the data from earlier experiments with human lymphocytes conducted in the FRC [fast rotating clinostat] and in space. Furthermore, the suitability of the FFM and RPM for research in gravitational cell biology is discussed.
Stylianou, Neophytos; Akbarov, Artur; Kontopantelis, Evangelos; Buchan, Iain; Dunn, Ken W
2015-08-01
Predicting mortality from burn injury has traditionally employed logistic regression models. Alternative machine learning methods have been introduced in some areas of clinical prediction as the necessary software and computational facilities have become accessible. Here we compare logistic regression and machine learning predictions of mortality from burn injury. An established logistic mortality model was compared to machine learning methods (artificial neural network, support vector machine, random forests and naïve Bayes) using a population-based (England & Wales) case-cohort registry. Predictive performance was evaluated using the area under the receiver operating characteristic curve, sensitivity, specificity, positive predictive value, and Youden's index. All methods had comparable discriminatory abilities and similar sensitivities, specificities and positive predictive values. Although some machine learning methods performed marginally better than logistic regression, the differences were seldom statistically significant and were clinically insubstantial. Random forests were marginally better for high positive predictive value and reasonable sensitivity. Neural networks yielded slightly better prediction overall. Logistic regression gives an optimal mix of performance and interpretability. The established logistic regression model of burn mortality performs well against more complex alternatives. Clinical prediction with a small set of strong, stable, independent predictors is unlikely to gain much from machine learning outside specialist research contexts. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
Zemp, Roland; Tanadini, Matteo; Plüss, Stefan; Schnüriger, Karin; Singh, Navrag B; Taylor, William R; Lorenzetti, Silvio
2016-01-01
Occupational musculoskeletal disorders, particularly chronic low back pain (LBP), are ubiquitous due to prolonged static sitting or nonergonomic sitting positions. Therefore, the aim of this study was to develop an instrumented chair with force and acceleration sensors to determine the accuracy of automatically identifying the user's sitting position by applying five different machine learning methods (Support Vector Machines, Multinomial Regression, Boosting, Neural Networks, and Random Forest). Forty-one subjects were requested to sit four times in seven different prescribed sitting positions (total 1148 samples). Sixteen force sensor values and the backrest angle were used as the explanatory variables (features) for the classification. The different classification methods were compared by means of a Leave-One-Out cross-validation approach. The best performance was achieved using the Random Forest classification algorithm, producing a mean classification accuracy of 90.9% for subjects with which the algorithm was not familiar. The classification accuracy varied between 81% and 98% for the seven different sitting positions. The present study showed the possibility of accurately classifying different sitting positions by means of the introduced instrumented office chair combined with machine learning analyses. The use of such novel approaches for the accurate assessment of chair usage could offer insights into the relationships between sitting position, sitting behaviour, and the occurrence of musculoskeletal disorders.
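The evaluation protocol described above (testing the classifier on subjects it has never seen) can be sketched with leave-one-subject-out cross-validation. The following is a minimal illustration with placeholder data, not the authors' code; the array shapes mirror the study design (41 subjects, 16 force sensors plus backrest angle, 7 positions):

```python
# Illustrative sketch: leave-one-subject-out evaluation of a random forest
# on chair-sensor features. X, y and groups are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1148, 17))          # 41 subjects x 4 trials x 7 positions
y = np.tile(np.arange(7), 164)           # 7 prescribed sitting positions
groups = np.repeat(np.arange(41), 28)    # subject ID for each sample

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean accuracy over held-out subjects: {scores.mean():.3f}")
```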
The influence of negative training set size on machine learning-based virtual screening.
Kurczab, Rafał; Smusz, Sabina; Bojarski, Andrzej J
2014-01-01
The paper presents a thorough analysis of the influence of the number of negative training examples on the performance of machine learning methods. The impact of this rather neglected aspect of applying machine learning methods was examined for sets containing a fixed number of positive examples and a varying number of negative examples randomly selected from the ZINC database. An increase in the ratio of positive to negative training instances was found to greatly influence most of the investigated evaluation parameters of ML methods in simulated virtual screening experiments. In a majority of cases, substantial increases in precision and MCC were observed in conjunction with some decreases in hit recall. The analysis of the dynamics of those variations led us to recommend an optimal composition of training data. The study was performed on several protein targets, 5 machine learning algorithms (SMO, Naïve Bayes, IBk, J48 and Random Forest) and 2 types of molecular fingerprints (MACCS and CDK FP). The most effective classification was provided by the combination of CDK FP with the SMO or Random Forest algorithms. The Naïve Bayes models appeared to be hardly sensitive to changes in the number of negative instances in the training set. In conclusion, the ratio of positive to negative training instances should be taken into account during the preparation of machine learning experiments, as it might significantly influence the performance of a particular classifier. What is more, the optimization of the negative training set size can be applied as a boosting-like approach in machine learning-based virtual screening.
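The experimental design is easy to reproduce in outline: hold the positive training set fixed, grow the negative set, and watch precision, recall and MCC move. A hedged sketch on synthetic data follows (the paper used ZINC-derived fingerprints and Weka classifiers; everything below is a stand-in):

```python
# Hedged illustration of the design above: fixed positives, varying numbers
# of random negatives, tracking precision, recall and MCC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import matthews_corrcoef, precision_score, recall_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20000, n_features=50, weights=[0.95],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
pos = np.flatnonzero(y_tr == 1)
neg = np.flatnonzero(y_tr == 0)

for ratio in (1, 2, 5, 10):                      # negatives per positive
    idx = np.concatenate([pos, neg[:ratio * len(pos)]])
    clf = RandomForestClassifier(random_state=0).fit(X_tr[idx], y_tr[idx])
    p = clf.predict(X_te)
    print(ratio, precision_score(y_te, p), recall_score(y_te, p),
          matthews_corrcoef(y_te, p))
```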
The influence of negative training set size on machine learning-based virtual screening
2014-01-01
Background The paper presents a thorough analysis of the influence of the number of negative training examples on the performance of machine learning methods. Results The impact of this rather neglected aspect of applying machine learning methods was examined for sets containing a fixed number of positive examples and a varying number of negative examples randomly selected from the ZINC database. An increase in the ratio of positive to negative training instances was found to greatly influence most of the investigated evaluation parameters of ML methods in simulated virtual screening experiments. In a majority of cases, substantial increases in precision and MCC were observed in conjunction with some decreases in hit recall. The analysis of the dynamics of those variations led us to recommend an optimal composition of training data. The study was performed on several protein targets, 5 machine learning algorithms (SMO, Naïve Bayes, IBk, J48 and Random Forest) and 2 types of molecular fingerprints (MACCS and CDK FP). The most effective classification was provided by the combination of CDK FP with the SMO or Random Forest algorithms. The Naïve Bayes models appeared to be hardly sensitive to changes in the number of negative instances in the training set. Conclusions The ratio of positive to negative training instances should be taken into account during the preparation of machine learning experiments, as it might significantly influence the performance of a particular classifier. What is more, the optimization of the negative training set size can be applied as a boosting-like approach in machine learning-based virtual screening. PMID:24976867
Some history and use of the random positioning machine, RPM, in gravity related research
NASA Astrophysics Data System (ADS)
van Loon, Jack J. W. A.
The first experiments using machines and instruments to manipulate gravity, and thus learn about the impact of this force on living systems, were performed by Sir Thomas Andrew Knight in 1806, exactly two centuries ago. What have we learned from these experiments, and in particular what have we learned about the use of instruments to reveal the impact of gravity and rotation on plants and other living systems? In this essay I examine the use of instruments in gravity related research, with emphasis on the Random Positioning Machine, RPM. Going from water wheel via clinostat to RPM, we will address the usefulness and possible working principles of these hypergravity and, as they are usually called, microgravity (or better, micro-weight) simulation techniques.
Assessment of various supervised learning algorithms using different performance metrics
NASA Astrophysics Data System (ADS)
Susheel Kumar, S. M.; Laxkar, Deepak; Adhikari, Sourav; Vijayarajan, V.
2017-11-01
Our work presents a comparison of the performance of supervised machine learning algorithms on a binary classification task. The supervised machine learning algorithms taken into consideration in the following work are Support Vector Machine (SVM), Decision Tree (DT), K Nearest Neighbour (KNN), Naïve Bayes (NB) and Random Forest (RF). This paper focuses on comparing the performance of the above-mentioned algorithms on one binary classification task by analysing metrics such as accuracy, F-measure, G-measure, precision, misclassification rate, false positive rate, true positive rate, specificity and prevalence.
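For reference, all of the listed metrics derive from the four cells of a binary confusion matrix. A short sketch follows; G-measure is taken here as the geometric mean of precision and recall, an assumption, since definitions vary across papers:

```python
# Minimal sketch: the listed metrics computed from a binary confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

accuracy    = (tp + tn) / (tp + tn + fp + fn)
tpr         = tp / (tp + fn)                  # recall / sensitivity
fpr         = fp / (fp + tn)
specificity = tn / (tn + fp)
precision   = tp / (tp + fp)
f_measure   = 2 * precision * tpr / (precision + tpr)
g_measure   = (precision * tpr) ** 0.5        # one common definition
misclass    = 1 - accuracy
prevalence  = (tp + fn) / len(y_true)
print(accuracy, f_measure, g_measure, precision, misclass, fpr, tpr,
      specificity, prevalence)
```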
Shamshirsaz, Alireza Abdollah; Kamgar, Mohammad; Bekheirnia, Mir Reza; Ayazi, Farzam; Hashemi, Seyed Reza; Bouzari, Navid; Habibzadeh, Mohammad Reza; Pourzahedgilani, Nima; Broumand, Varshasb; Shamshirsaz, Amirhooshang Abdollah; Moradi, Maziyar; Borghei, Mehrdad; Haghighi, Niloofar Nobakht; Broumand, Behrooz
2004-01-01
Background Hepatitis C virus (HCV) infection is a significant problem among patients undergoing maintenance hemodialysis (HD). We conducted a prospective multi-center study to evaluate the effect of dialysis machine separation on the spread of HCV infection. Methods Twelve randomly selected dialysis centers in Tehran, Iran were randomly divided into two groups; those using dedicated machines (D) for HCV infected individuals and those using non-dedicated HD machines (ND). 593 HD cases including 51 HCV positive (RT-PCR) cases and 542 HCV negative patients were enrolled in this study. The prevalence of HCV infection in the D group was 10.1% (range: 4.6%– 13.2%) and it was 7.1% (range: 4.2%–16.8%) in the ND group. During the study conduction 5 new HCV positive cases and 169 new HCV negative cases were added. In the D group, PCR positive patients were dialyzed on dedicated machines. In the ND group all patients shared the same machines. Results In the first follow-up period, the incidence of HCV infection was 1.6% and 4.7% in the D and ND group respectively (p = 0.05). In the second follow-up period, the incidence of HCV infection was 1.3% in the D group and 5.7% in the ND group (p < 0.05). Conclusions In this study the incidence of HCV in HD patients decreased by the use of dedicated HD machines for HCV infected patients. Additional studies may help to clarify the role of machine dedication in conjunction with application of universal precautions in reducing HCV transmission. PMID:15469615
Jardine, Luke Anthony; Sturgess, Barbara Ruth; Inglis, Garry Donald Trevor; Davies, Mark William
2009-04-01
To determine whether the time from blood culture inoculation to positive growth (total time to positive) and the time from blood culture machine entry to positive growth (machine time to positive) are altered by delayed entry into the automated blood culture machine, and whether the total time to positive differs by the concentration of organisms inoculated into blood culture bottles. Staphylococcus epidermidis, Escherichia coli and group B beta-haemolytic streptococci were chosen as clinically significant representative organisms. Two concentrations (≥10 colony-forming units per millilitre and <1 colony-forming unit per millilitre) were inoculated into PEDS BacT/Alert blood culture bottles and randomly allocated to one of three delayed automated blood culture machine entry times (30 min/8.5 h/15.5 h). For all organisms at all concentrations, except Staphylococcus epidermidis, the machine time to positive was significantly decreased by delayed entry. For all organisms at all concentrations, the mean total time to positive significantly increased with increasing delay in entry into the blood culture machine. Higher concentrations of group B beta-haemolytic streptococci and Escherichia coli grew significantly faster than lower concentrations. Bacterial growth in inoculated bottles stored at room temperature continues, although at a slower rate than in blood culture bottles immediately entered into the machine. If a blood culture specimen has been stored at room temperature for greater than 15.5 h, the currently allowed safety margin of 36 h (before declaring a result negative) may be insufficient.
Machine-z: Rapid Machine-Learned Redshift Indicator for Swift Gamma-Ray Bursts
NASA Technical Reports Server (NTRS)
Ukwatta, T. N.; Wozniak, P. R.; Gehrels, N.
2016-01-01
Studies of high-redshift gamma-ray bursts (GRBs) provide important information about the early Universe such as the rates of stellar collapsars and mergers, the metallicity content, constraints on the re-ionization period, and probes of the Hubble expansion. Rapid selection of high-z candidates from GRB samples reported in real time by dedicated space missions such as Swift is the key to identifying the most distant bursts before the optical afterglow becomes too dim to warrant a good spectrum. Here, we introduce 'machine-z', a redshift prediction algorithm and a 'high-z' classifier for Swift GRBs based on machine learning. Our method relies exclusively on canonical data commonly available within the first few hours after the GRB trigger. Using a sample of 284 bursts with measured redshifts, we trained a randomized ensemble of decision trees (random forest) to perform both regression and classification. Cross-validated performance studies show that the correlation coefficient between machine-z predictions and the true redshift is nearly 0.6. At the same time, our high-z classifier can achieve 80 per cent recall of true high-redshift bursts, while incurring a false positive rate of 20 per cent. With 40 per cent false positive rate the classifier can achieve approximately 100 per cent recall. The most reliable selection of high-redshift GRBs is obtained by combining predictions from both the high-z classifier and the machine-z regressor.
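The two-part design (a regressor for machine-z plus a thresholded high-z classifier) can be sketched as follows. The features are synthetic stand-ins for the Swift trigger-time observables, and the redshift cut of 4.0 defining "high-z" is an assumption for illustration only:

```python
# Illustrative sketch: random forest regression for machine-z and a random
# forest "high-z" classifier whose probability threshold trades recall
# against false positive rate. Data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(284, 10))                 # 284 bursts, mock features
z = np.abs(rng.normal(2.0, 1.5, size=284))     # mock redshifts

z_pred = cross_val_predict(RandomForestRegressor(random_state=0), X, z, cv=10)
print("corr(machine-z, z) =", np.corrcoef(z_pred, z)[0, 1])

high_z = (z > 4.0).astype(int)                 # assumed "high-z" definition
proba = cross_val_predict(RandomForestClassifier(random_state=0), X, high_z,
                          cv=10, method="predict_proba")[:, 1]
for thr in (0.2, 0.5):                         # lower threshold -> more recall
    pred = proba >= thr
    recall = pred[high_z == 1].mean()
    fpr = pred[high_z == 0].mean()
    print(f"threshold {thr}: recall={recall:.2f}, FPR={fpr:.2f}")
```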
Chaotic sources of noise in machine acoustics
NASA Astrophysics Data System (ADS)
Moon, F. C., Prof.; Broschart, Dipl.-Ing. T.
1994-05-01
In this paper a model is posited for deterministic, random-like noise in machines with sliding rigid parts impacting linear continuous machine structures. Such problems occur in gear transmission systems. A mathematical model is proposed to explain the random-like structure-borne and air-borne noise from such systems when the input is a periodic deterministic excitation of the quasi-rigid impacting parts. An experimental study is presented which supports the model. A thin circular plate is impacted by a chaotically vibrating mass excited by a sinusoidal moving base. The results suggest that the plate vibrations might be predicted by replacing the chaotic vibrating mass with a probabilistic forcing function. Prechaotic vibrations of the impacting mass show classical period doubling phenomena.
Hsieh, Chung-Ho; Lu, Ruey-Hwa; Lee, Nai-Hsin; Chiu, Wen-Ta; Hsu, Min-Huei; Li, Yu-Chuan Jack
2011-01-01
Diagnosing acute appendicitis clinically is still difficult. We developed random forests, support vector machines, and artificial neural network models to diagnose acute appendicitis. Between January 2006 and December 2008, patients who had a consultation session with surgeons for suspected acute appendicitis were enrolled. Seventy-five percent of the data set was used to construct models including random forest, support vector machines, artificial neural networks, and logistic regression. Twenty-five percent of the data set was withheld to evaluate model performance. The area under the receiver operating characteristic curve (AUC) was used to evaluate performance, which was compared with that of the Alvarado score. Data from a total of 180 patients were collected, 135 used for training and 45 for testing. The mean age of patients was 39.4 years (range, 16-85). Final diagnosis revealed 115 patients with and 65 without appendicitis. The AUC of random forest, support vector machines, artificial neural networks, logistic regression, and Alvarado was 0.98, 0.96, 0.91, 0.87, and 0.77, respectively. The sensitivity, specificity, positive, and negative predictive values of random forest were 94%, 100%, 100%, and 87%, respectively. Random forest performed better than artificial neural networks, logistic regression, and Alvarado. We demonstrated that random forest can predict acute appendicitis with good accuracy and, deployed appropriately, can be an effective tool in clinical decision making. Copyright © 2011 Mosby, Inc. All rights reserved.
Machine-z: Rapid machine-learned redshift indicator for Swift gamma-ray bursts
Ukwatta, T. N.; Wozniak, P. R.; Gehrels, N.
2016-03-08
Studies of high-redshift gamma-ray bursts (GRBs) provide important information about the early Universe such as the rates of stellar collapsars and mergers, the metallicity content, constraints on the re-ionization period, and probes of the Hubble expansion. Rapid selection of high-z candidates from GRB samples reported in real time by dedicated space missions such as Swift is the key to identifying the most distant bursts before the optical afterglow becomes too dim to warrant a good spectrum. Here, we introduce ‘machine-z’, a redshift prediction algorithm and a ‘high-z’ classifier for Swift GRBs based on machine learning. Our method relies exclusively on canonical data commonly available within the first few hours after the GRB trigger. Using a sample of 284 bursts with measured redshifts, we trained a randomized ensemble of decision trees (random forest) to perform both regression and classification. Cross-validated performance studies show that the correlation coefficient between machine-z predictions and the true redshift is nearly 0.6. At the same time, our high-z classifier can achieve 80 per cent recall of true high-redshift bursts, while incurring a false positive rate of 20 per cent. With 40 per cent false positive rate the classifier can achieve ~100 per cent recall. As a result, the most reliable selection of high-redshift GRBs is obtained by combining predictions from both the high-z classifier and the machine-z regressor.
Relative optical navigation around small bodies via Extreme Learning Machine
NASA Astrophysics Data System (ADS)
Law, Andrew M.
To perform close proximity operations in a low-gravity environment, relative and absolute positions are vital information for the maneuver. Hence, navigation is an integral part of space travel. Extreme Learning Machine (ELM) is presented as an optical navigation method around small celestial bodies. Optical navigation uses visual observation instruments such as a camera to acquire useful data and determine spacecraft position. The required input data for operation are merely a single image strip and a nadir image. ELM is a machine learning single-layer feedforward network (SLFN), a type of neural network (NN). The algorithm is developed on the premise that input weights and biases can be randomly assigned and do not require back-propagation. The learned model is the set of output-layer weights, which are used to calculate a prediction. Together, Extreme Learning Machine Optical Navigation (ELM OpNav) utilizes optical images and the ELM algorithm to train the machine to navigate around a target body. In this thesis the asteroid Vesta is the designated celestial body. The trained ELMs estimate the position of the spacecraft during operation with a single data set. The results show the approach is promising and potentially suitable for on-board navigation.
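The ELM training step described here reduces to a single least-squares solve: hidden weights stay random, only the output weights are fitted. A compact sketch on synthetic data (not the Vesta imagery):

```python
# Sketch of an Extreme Learning Machine: random, untrained hidden weights;
# output weights solved by least squares (no back-propagation).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                # inputs (e.g., image features)
Y = X @ rng.normal(size=(20, 3))               # targets (e.g., position)

n_hidden = 200
W = rng.normal(size=(20, n_hidden))            # random input weights
b = rng.normal(size=n_hidden)                  # random biases
H = np.tanh(X @ W + b)                         # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, Y, rcond=None)   # learned output weights

Y_hat = np.tanh(X @ W + b) @ beta              # prediction
print("train RMSE:", np.sqrt(np.mean((Y_hat - Y) ** 2)))
```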
Cheng, Zhanzhan; Zhou, Shuigeng; Wang, Yang; Liu, Hui; Guan, Jihong; Chen, Yi-Ping Phoebe
2016-05-18
Prediction of compound-protein interactions (CPIs) aims to find new compound-protein pairs in which a protein is targeted by at least one compound, a crucial step in new drug design. Currently, a number of machine learning based methods have been developed to predict new CPIs in the literature. However, as there is not yet any publicly available set of validated negative CPIs, most existing machine learning based approaches use the unknown interactions (not validated CPIs) selected randomly as the negative examples to train classifiers for predicting new CPIs. Obviously, this is not quite reasonable and unavoidably impacts the CPI prediction performance. In this paper, we simply take the unknown CPIs as unlabeled examples, and propose a new method called PUCPI (the abbreviation of PU learning for Compound-Protein Interaction identification) that employs biased-SVM (Support Vector Machine) to predict CPIs using only positive and unlabeled examples. PU learning is a class of learning methods that learns from positive and unlabeled (PU) samples. To the best of our knowledge, this is the first work that identifies CPIs using only positive and unlabeled examples. We first collect known CPIs as positive examples and then randomly select compound-protein pairs not in the positive set as unlabeled examples. For each CPI/compound-protein pair, we extract protein domains as protein features and compound substructures as chemical features, then take the tensor product of the corresponding compound features and protein features as the feature vector of the CPI/compound-protein pair. After that, biased-SVM is employed to train classifiers on different datasets of CPIs and compound-protein pairs. Experiments over various datasets show that our method outperforms six typical classifiers, including random forest, L1- and L2-regularized logistic regression, naive Bayes, SVM and k-nearest neighbor (kNN), and three types of existing CPI prediction models. Source code, datasets and related documents of PUCPI are available at: http://admis.fudan.edu.cn/projects/pucpi.html.
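The biased-SVM idea can be sketched with an ordinary class-weighted SVM: unlabeled pairs are treated as tentative negatives while errors on known positives are penalized more heavily. A minimal illustration with random stand-in features (the paper's real features are tensor products of compound substructures and protein domains; the weight of 10 is an arbitrary assumption):

```python
# Hedged sketch of biased-SVM PU learning: unlabeled pairs as tentative
# negatives, heavier penalty on positive-class errors via class_weight.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_pos = rng.normal(1.0, 1.0, size=(200, 30))     # known CPIs
X_unl = rng.normal(0.0, 1.0, size=(2000, 30))    # unlabeled pairs
X = np.vstack([X_pos, X_unl])
y = np.r_[np.ones(200), np.zeros(2000)]          # unlabeled treated as 0

# The asymmetric class weight implements the "bias" toward positives.
clf = SVC(kernel="rbf", class_weight={1: 10.0, 0: 1.0}).fit(X, y)
scores = clf.decision_function(X_unl)            # rank unlabeled pairs
print("top candidate indices:", np.argsort(scores)[-5:])
```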
Method and apparatus for precision laser micromachining
Chang, Jim; Warner, Bruce E.; Dragon, Ernest P.
2000-05-02
A method and apparatus for micromachining and microdrilling which results in a machined part of superior surface quality is provided. The system uses a near diffraction limited, high repetition rate, short pulse length, visible wavelength laser. The laser is combined with a high speed precision tilting mirror and suitable beam shaping optics, thus allowing a large amount of energy to be accurately positioned and scanned on the workpiece. As a result of this system, complicated, high resolution machining patterns can be achieved. A cover plate may be temporarily attached to the workpiece. Then as the workpiece material is vaporized during the machining process, the vapors condense on the cover plate rather than the surface of the workpiece. In order to eliminate cutting rate variations as the cutting direction is varied, a randomly polarized laser beam is utilized. A rotating half-wave plate is used to achieve the random polarization. In order to correctly locate the focus at the desired location within the workpiece, the position of the focus is first determined by monitoring the speckle size while varying the distance between the workpiece and the focussing optics. When the speckle size reaches a maximum, the focus is located at the first surface of the workpiece. After the location of the focus has been determined, it is repositioned to the desired location within the workpiece, thus optimizing the quality of the machined area.
Effects of promotional materials on vending sales of low-fat items in teachers' lounges.
Fiske, Amy; Cullen, Karen Weber
2004-01-01
This study examined the impact of an environmental intervention in the form of promotional materials and increased availability of low-fat items on vending machine sales. Ten vending machines were selected and randomly assigned to one of three conditions: control, or one of two experimental conditions. Vending machines in the two intervention conditions received three additional low-fat selections. Low-fat items were promoted at two levels: labels (intervention I), and labels plus signs (intervention II). The number of individual items sold and the total revenue generated was recorded weekly for each machine for 4 weeks. Use of promotional materials resulted in a small, but not significant, increase in the number of low-fat items sold, although machine sales were not significantly impacted by the change in product selection. Results of this study, although not statistically significant, suggest that environmental change may be a realistic means of positively influencing consumer behavior.
Ellis, Katherine; Godbole, Suneeta; Marshall, Simon; Lanckriet, Gert; Staudenmayer, John; Kerr, Jacqueline
2014-01-01
Active travel is an important area in physical activity research, but objective measurement of active travel is still difficult. Automated methods to measure travel behaviors will improve research in this area. In this paper, we present a supervised machine learning method for transportation mode prediction from global positioning system (GPS) and accelerometer data. We collected a dataset of about 150 h of GPS and accelerometer data from two research assistants following a protocol of prescribed trips consisting of five activities: bicycling, riding in a vehicle, walking, sitting, and standing. We extracted 49 features from 1-min windows of this data. We compared the performance of several machine learning algorithms and chose a random forest algorithm to classify the transportation mode. We used a moving average output filter to smooth the output predictions over time. The random forest algorithm achieved 89.8% cross-validated accuracy on this dataset. Adding the moving average filter to smooth output predictions increased the cross-validated accuracy to 91.9%. Machine learning methods are a viable approach for automating measurement of active travel, particularly for measuring travel activities that traditional accelerometer data processing methods misclassify, such as bicycling and vehicle travel.
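The moving-average output filter is a small but effective step. In the hedged sketch below, per-class probabilities are averaged over a sliding window of 1-min epochs before taking the argmax; the inputs are synthetic placeholders for the 49 GPS/accelerometer features, and the 5-min window is an assumption:

```python
# Sketch: random forest predictions smoothed by a moving average over
# consecutive 1-min windows, suppressing isolated misclassifications.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(500, 49)), rng.integers(0, 5, 500)
X_test = rng.normal(size=(120, 49))              # 120 consecutive 1-min windows

clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
proba = clf.predict_proba(X_test)                # (120, 5) class probabilities

k = 5                                            # smoothing window, minutes
kernel = np.ones(k) / k
smoothed = np.column_stack([np.convolve(proba[:, c], kernel, mode="same")
                            for c in range(proba.shape[1])])
labels = smoothed.argmax(axis=1)                 # smoothed mode predictions
print(labels[:20])
```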
Machine Learning Techniques for Prediction of Early Childhood Obesity.
Dugan, T M; Mukhopadhyay, S; Carroll, A; Downs, S
2015-01-01
This paper aims to predict childhood obesity after age two, using only data collected prior to the second birthday by a clinical decision support system called CHICA. Analyses of six different machine learning methods (RandomTree, RandomForest, J48, ID3, Naïve Bayes, and Bayes) trained on CHICA data show that an accurate, sensitive model can be created. Of the methods analyzed, the ID3 model trained on the CHICA dataset showed the best overall performance, with an accuracy of 85% and a sensitivity of 89%. Additionally, the ID3 model had a positive predictive value of 84% and a negative predictive value of 88%. The structure of the tree also gives insight into the strongest predictors of future obesity in children. Many of the strongest predictors seen in the ID3 modeling of the CHICA dataset have been independently validated in the literature as correlated with obesity, thereby supporting the validity of the model. This study demonstrated that data from a production clinical decision support system can be used to build an accurate machine learning model to predict obesity in children after age two.
Bozkurt, Selen; Bostanci, Asli; Turhan, Murat
2017-08-11
The goal of this study is to evaluate the results of machine learning methods for the classification of OSA severity in patients with suspected sleep-disordered breathing as normal, mild, moderate and severe, based on non-polysomnographic variables: 1) clinical data, 2) symptoms and 3) physical examination. In order to produce classification models for OSA severity, five different machine learning methods (Bayesian network, Decision Tree, Random Forest, Neural Networks and Logistic Regression) were trained while relevant variables and their relationships were derived empirically from observed data. Each model was trained and evaluated using 10-fold cross-validation and, to evaluate the classification performance of all methods, true positive rate (TPR), false positive rate (FPR), Positive Predictive Value (PPV), F measure and Area Under Receiver Operating Characteristics curve (ROC-AUC) were used. Results of 10-fold cross-validated tests with different variable settings promisingly indicated that the OSA severity of suspected OSA patients can be classified using non-polysomnographic features, with a true positive rate as high as 0.71 and a false positive rate as low as 0.15. Moreover, the test results of different variable settings revealed that the accuracy of the classification models was significantly improved when physical examination variables were added to the model. Study results showed that machine learning methods can be used to estimate the probabilities of no, mild, moderate, and severe obstructive sleep apnea and such approaches may improve accurate initial OSA screening and help referring only the suspected moderate or severe OSA patients to sleep laboratories for the expensive tests.
Ellis, Katherine; Godbole, Suneeta; Marshall, Simon; Lanckriet, Gert; Staudenmayer, John; Kerr, Jacqueline
2014-01-01
Background: Active travel is an important area in physical activity research, but objective measurement of active travel is still difficult. Automated methods to measure travel behaviors will improve research in this area. In this paper, we present a supervised machine learning method for transportation mode prediction from global positioning system (GPS) and accelerometer data. Methods: We collected a dataset of about 150 h of GPS and accelerometer data from two research assistants following a protocol of prescribed trips consisting of five activities: bicycling, riding in a vehicle, walking, sitting, and standing. We extracted 49 features from 1-min windows of this data. We compared the performance of several machine learning algorithms and chose a random forest algorithm to classify the transportation mode. We used a moving average output filter to smooth the output predictions over time. Results: The random forest algorithm achieved 89.8% cross-validated accuracy on this dataset. Adding the moving average filter to smooth output predictions increased the cross-validated accuracy to 91.9%. Conclusion: Machine learning methods are a viable approach for automating measurement of active travel, particularly for measuring travel activities that traditional accelerometer data processing methods misclassify, such as bicycling and vehicle travel. PMID:24795875
Can machine-learning improve cardiovascular risk prediction using routine clinical data?
Kai, Joe; Garibaldi, Jonathan M.; Qureshi, Nadeem
2017-01-01
Background Current approaches to predict cardiovascular risk fail to identify many people who would benefit from preventive treatment, while others receive unnecessary intervention. Machine-learning offers opportunity to improve accuracy by exploiting complex interactions between risk factors. We assessed whether machine-learning can improve cardiovascular risk prediction. Methods Prospective cohort study using routine clinical data of 378,256 patients from UK family practices, free from cardiovascular disease at outset. Four machine-learning algorithms (random forest, logistic regression, gradient boosting machines, neural networks) were compared to an established algorithm (American College of Cardiology guidelines) to predict first cardiovascular event over 10-years. Predictive accuracy was assessed by area under the ‘receiver operating curve’ (AUC); and sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) to predict 7.5% cardiovascular risk (threshold for initiating statins). Findings 24,970 incident cardiovascular events (6.6%) occurred. Compared to the established risk prediction algorithm (AUC 0.728, 95% CI 0.723–0.735), machine-learning algorithms improved prediction: random forest +1.7% (AUC 0.745, 95% CI 0.739–0.750), logistic regression +3.2% (AUC 0.760, 95% CI 0.755–0.766), gradient boosting +3.3% (AUC 0.761, 95% CI 0.755–0.766), neural networks +3.6% (AUC 0.764, 95% CI 0.759–0.769). The highest achieving (neural networks) algorithm predicted 4,998/7,404 cases (sensitivity 67.5%, PPV 18.4%) and 53,458/75,585 non-cases (specificity 70.7%, NPV 95.7%), correctly predicting 355 (+7.6%) more patients who developed cardiovascular disease compared to the established algorithm. Conclusions Machine-learning significantly improves accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment, while avoiding unnecessary treatment of others. PMID:28376093
Can machine-learning improve cardiovascular risk prediction using routine clinical data?
Weng, Stephen F; Reps, Jenna; Kai, Joe; Garibaldi, Jonathan M; Qureshi, Nadeem
2017-01-01
Current approaches to predict cardiovascular risk fail to identify many people who would benefit from preventive treatment, while others receive unnecessary intervention. Machine-learning offers opportunity to improve accuracy by exploiting complex interactions between risk factors. We assessed whether machine-learning can improve cardiovascular risk prediction. Prospective cohort study using routine clinical data of 378,256 patients from UK family practices, free from cardiovascular disease at outset. Four machine-learning algorithms (random forest, logistic regression, gradient boosting machines, neural networks) were compared to an established algorithm (American College of Cardiology guidelines) to predict first cardiovascular event over 10-years. Predictive accuracy was assessed by area under the 'receiver operating curve' (AUC); and sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) to predict 7.5% cardiovascular risk (threshold for initiating statins). 24,970 incident cardiovascular events (6.6%) occurred. Compared to the established risk prediction algorithm (AUC 0.728, 95% CI 0.723-0.735), machine-learning algorithms improved prediction: random forest +1.7% (AUC 0.745, 95% CI 0.739-0.750), logistic regression +3.2% (AUC 0.760, 95% CI 0.755-0.766), gradient boosting +3.3% (AUC 0.761, 95% CI 0.755-0.766), neural networks +3.6% (AUC 0.764, 95% CI 0.759-0.769). The highest achieving (neural networks) algorithm predicted 4,998/7,404 cases (sensitivity 67.5%, PPV 18.4%) and 53,458/75,585 non-cases (specificity 70.7%, NPV 95.7%), correctly predicting 355 (+7.6%) more patients who developed cardiovascular disease compared to the established algorithm. Machine-learning significantly improves accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment, while avoiding unnecessary treatment of others.
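A schematic version of this comparison, with synthetic data in place of the UK cohort, the same four algorithm families, and the same 7.5% predicted-risk threshold for initiating statins:

```python
# Hedged sketch: compare classifiers by AUC and evaluate sensitivity/PPV at
# a 7.5% predicted-risk threshold. Data are synthetic (about 6.6% events).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=20000, n_features=20, weights=[0.934],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {"RF": RandomForestClassifier(random_state=0),
          "LR": LogisticRegression(max_iter=1000),
          "GBM": GradientBoostingClassifier(random_state=0),
          "NN": MLPClassifier(max_iter=500, random_state=0)}
for name, m in models.items():
    risk = m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    pred = risk >= 0.075                         # 7.5% risk threshold
    tp, fp = (pred & (y_te == 1)).sum(), (pred & (y_te == 0)).sum()
    fn = (~pred & (y_te == 1)).sum()
    print(name, f"AUC={roc_auc_score(y_te, risk):.3f}",
          f"sens={tp / (tp + fn):.2f}", f"PPV={tp / max(tp + fp, 1):.2f}")
```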
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dayman, Ken J; Ade, Brian J; Weber, Charles F
High-dimensional, nonlinear function estimation using large datasets is a current area of interest in the machine learning community, and applications may be found throughout the analytical sciences, where ever-growing datasets are making more information available to the analyst. In this paper, we leverage the existing relevance vector machine, a sparse Bayesian version of the well-studied support vector machine, and expand the method to include integrated feature selection and automatic function shaping. These innovations produce an algorithm that is able to distinguish variables that are useful for making predictions of a response from variables that are unrelated or confusing. We test the technology using synthetic data, conduct initial performance studies, and develop a model capable of making position-independent predictions of the core-averaged burnup using a single specimen drawn randomly from a nuclear reactor core.
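Scikit-learn has no relevance vector machine, but its ARDRegression implements the same sparse Bayesian idea (automatic relevance determination pruning uninformative inputs), so a hedged stand-in for the integrated feature selection described above looks like this:

```python
# Stand-in sketch, not the authors' algorithm: a sparse Bayesian linear
# model whose ARD prior drives coefficients of irrelevant inputs to zero.
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 30))
y = 3 * X[:, 0] - 2 * X[:, 4] + rng.normal(scale=0.1, size=500)  # 2 relevant

model = ARDRegression().fit(X, y)
relevant = np.flatnonzero(np.abs(model.coef_) > 0.1)
print("features kept by ARD:", relevant)         # expect [0, 4]
```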
Graph Kernels for Molecular Similarity.
Rupp, Matthias; Schneider, Gisbert
2010-04-12
Molecular similarity measures are important for many cheminformatics applications like ligand-based virtual screening and quantitative structure-property relationships. Graph kernels are formal similarity measures defined directly on graphs, such as the (annotated) molecular structure graph. Graph kernels are positive semi-definite functions, i.e., they correspond to inner products. This property makes them suitable for use with kernel-based machine learning algorithms such as support vector machines and Gaussian processes. We review the major types of kernels between graphs (based on random walks, subgraphs, and optimal assignments, respectively), and discuss their advantages, limitations, and successful applications in cheminformatics. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
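A toy random-walk kernel illustrates the construction: common walks of all lengths are counted on the direct product of two graphs, k(G1, G2) = e^T (I - λ·kron(A1, A2))^(-1) e. Real cheminformatics kernels additionally match atom and bond labels, which this unlabeled sketch omits:

```python
# Minimal geometric random-walk graph kernel on unlabeled graphs. lam must
# be small enough that the Neumann series converges.
import numpy as np

def rw_kernel(A1, A2, lam=0.1):
    Ax = np.kron(A1, A2)                       # adjacency of product graph
    n = Ax.shape[0]
    M = np.linalg.inv(np.eye(n) - lam * Ax)    # sums lam^k * (#walks of len k)
    return M.sum()

A_path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])      # path graph
A_tri  = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])      # triangle
print(rw_kernel(A_path, A_path), rw_kernel(A_path, A_tri))
```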
Molecular Basis of Mechano-Signal Transduction in Vascular Endothelial Cells
NASA Technical Reports Server (NTRS)
Jo, Hanjoong
2004-01-01
Simulated microgravity studies were performed using a random positioning machine (RPM). One RPM has been built for us by Fokker Science in the Netherlands. Using the device, we have developed an in vitro system to examine the effect of simulated microgravity on osteoblastic bone cells. Using this system, we have carried out gene chip studies to determine the gene expression profiles of osteoblasts cultured under simulated microgravity conditions in comparison to static controls. From this study, we have identified numerous genes, some of which are expected ones inducing bone loss, but many of which are unexpected and unknown. These findings are being prepared for publication.
Li, Ning; Cao, Chao; Wang, Cong
2017-06-15
Supporting simultaneous access of machine-type devices is a critical challenge in machine-to-machine (M2M) communications. In this paper, we propose an optimal scheme to dynamically adjust the Access Class Barring (ACB) factor and the number of random access channel (RACH) resources for clustered machine-to-machine (M2M) communications, in which Delay-Sensitive (DS) devices coexist with Delay-Tolerant (DT) ones. In M2M communications, since delay-sensitive devices share random access resources with delay-tolerant devices, reducing the resources consumed by delay-sensitive devices means that there will be more resources available to delay-tolerant ones. Our goal is to optimize the random access scheme, which can not only satisfy the requirements of delay-sensitive devices, but also take the communication quality of delay-tolerant ones into consideration. We discuss this problem from the perspective of delay-sensitive services by adjusting the resource allocation and ACB scheme for these devices dynamically. Simulation results show that our proposed scheme realizes good performance in satisfying the delay-sensitive services as well as increasing the utilization rate of the random access resources allocated to them.
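The role of the ACB factor can be seen in a toy simulation: each backlogged device passes the barring check with probability p and then picks one of M random-access preambles, and a preamble succeeds only if exactly one device chose it. The classical rule p ≈ min(1, M/N) maximizes expected successes; the paper's optimization is more elaborate (DS/DT coexistence), so this is only a sketch under that simplified model:

```python
# Toy ACB simulation: barring probability p, M preambles, N backlogged
# devices; a preamble succeeds when exactly one device selects it.
import numpy as np

rng = np.random.default_rng(0)

def successes(n_devices, n_preambles, p, trials=2000):
    total = 0
    for _ in range(trials):
        active = rng.random(n_devices) < p           # pass the ACB check
        picks = rng.integers(0, n_preambles, active.sum())
        counts = np.bincount(picks, minlength=n_preambles)
        total += (counts == 1).sum()                 # collision-free preambles
    return total / trials

N, M = 300, 54
for p in (0.05, M / N, 0.5, 1.0):
    print(f"p={p:.2f}: mean successes = {successes(N, M, p):.1f}")
```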
Accurate Diabetes Risk Stratification Using Machine Learning: Role of Missing Value and Outliers.
Maniruzzaman, Md; Rahman, Md Jahanur; Al-MehediHasan, Md; Suri, Harman S; Abedin, Md Menhazul; El-Baz, Ayman; Suri, Jasjit S
2018-04-10
Diabetes mellitus is a group of metabolic diseases in which blood sugar levels are too high. About 8.8% of the world was diabetic in 2017. It is projected that this will reach nearly 10% by 2045. The major challenge is that applying machine learning-based classifiers to such data sets for risk stratification leads to lower performance. Thus, our objective is to develop an optimized and robust machine learning (ML) system under the assumption that missing values or outliers, if replaced by a median configuration, will yield higher risk stratification accuracy. This ML-based risk stratification is designed, optimized and evaluated, where: (i) the features are extracted and optimized from the six feature selection techniques (random forest, logistic regression, mutual information, principal component analysis, analysis of variance, and Fisher discriminant ratio) and combined with ten different types of classifiers (linear discriminant analysis, quadratic discriminant analysis, naïve Bayes, Gaussian process classification, support vector machine, artificial neural network, Adaboost, logistic regression, decision tree, and random forest) under the hypothesis that both missing values and outliers, when replaced by computed medians, will improve the risk stratification accuracy. The Pima Indian diabetic dataset (768 patients: 268 diabetic and 500 controls) was used. Our results demonstrate that replacing the missing values and outliers by group median and median values, respectively, and further using the combination of random forest feature selection and random forest classification yields an accuracy, sensitivity, specificity, positive predictive value, negative predictive value and area under the curve of 92.26%, 95.96%, 79.72%, 91.14%, 91.20%, and 0.93, respectively. This is an improvement of 10% over previously developed techniques published in the literature. The system was validated for its stability and reliability. The RF-based model showed the best performance when outliers are replaced by median values.
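In outline, the winning pipeline is median replacement followed by random-forest feature selection and a random-forest classifier. A hedged sketch with synthetic data standing in for the Pima records (the outlier rule below is an arbitrary assumption):

```python
# Sketch: median imputation, then RF-based feature selection feeding an RF
# classifier, evaluated with 10-fold cross-validation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=768, n_features=8, random_state=0)
X[X > 3] = np.nan                                 # pretend these were outliers

med = np.nanmedian(X, axis=0)
X = np.where(np.isnan(X), med, X)                 # median replacement

pipe = make_pipeline(
    SelectFromModel(RandomForestClassifier(random_state=0)),  # RF selection
    RandomForestClassifier(random_state=0))                   # RF classifier
print("CV accuracy:", cross_val_score(pipe, X, y, cv=10).mean())
```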
NASA Astrophysics Data System (ADS)
Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie
2015-08-01
The business failure of numerous companies results in financial crises. The high social costs associated with such crises have driven the search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with the support vector machine. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among different modelling means. We reviewed research on forecasting business risk with the support vector machine, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, support vector machine, random sampling, and group decision under the general frame in forecasting business risk. Under the APMF-WHAE frame with the support vector machine as the base predictive model, four specific predictive models were produced, namely, a pure support vector machine, a hybrid support vector machine involving principal components analysis, a support vector machine ensemble involving random sampling and group decision, and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines were able to produce dominating performance over the pure support vector machine and the support vector machine ensemble.
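The four model families under the APMF-WHAE frame map naturally onto scikit-learn building blocks; the sketch below is a stand-in, with bagging playing the role of random sampling and majority vote as the group decision:

```python
# Hedged sketch of the four compared families: pure SVM; PCA+SVM hybrid;
# bagged SVM ensemble; bagged ensemble of PCA+SVM hybrids.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=30, random_state=0)

models = {
    "pure SVM": SVC(),
    "hybrid PCA+SVM": make_pipeline(PCA(n_components=10), SVC()),
    "SVM ensemble": BaggingClassifier(SVC(), n_estimators=25, random_state=0),
    "ensemble of hybrids": BaggingClassifier(
        make_pipeline(PCA(n_components=10), SVC()),
        n_estimators=25, random_state=0),
}
for name, m in models.items():
    print(name, cross_val_score(m, X, y, cv=5).mean())
```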
Forsyth, Alexander W; Barzilay, Regina; Hughes, Kevin S; Lui, Dickson; Lorenz, Karl A; Enzinger, Andrea; Tulsky, James A; Lindvall, Charlotta
2018-06-01
Clinicians document cancer patients' symptoms in free-text format within electronic health record visit notes. Although symptoms are critically important to quality of life and often herald clinical status changes, computational methods to assess the trajectory of symptoms over time are woefully underdeveloped. To create machine learning algorithms capable of extracting patient-reported symptoms from free-text electronic health record notes. The data set included 103,564 sentences obtained from the electronic clinical notes of 2695 breast cancer patients receiving paclitaxel-containing chemotherapy at two academic cancer centers between May 1996 and May 2015. We manually annotated 10,000 sentences and trained a conditional random field model to predict words indicating an active symptom (positive label), absence of a symptom (negative label), or no symptom at all (neutral label). Sentences labeled by human coder were divided into training, validation, and test data sets. Final model performance was determined on 20% test data unused in model development or tuning. The final model achieved precision of 0.82, 0.86, and 0.99 and recall of 0.56, 0.69, and 1.00 for positive, negative, and neutral symptom labels, respectively. The most common positive symptoms were pain, fatigue, and nausea. Machine-based labeling of 103,564 sentences took two minutes. We demonstrate the potential of machine learning to gather, track, and analyze symptoms experienced by cancer patients during chemotherapy. Although our initial model requires further optimization to improve the performance, further model building may yield machine learning methods suitable to be deployed in routine clinical care, quality improvement, and research applications. Copyright © 2018 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
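A linear-chain conditional random field of the kind described can be sketched with the third-party sklearn-crfsuite package (pip install sklearn-crfsuite); the sentences, features and labels below are toy stand-ins for the annotated clinical notes, not the authors' feature set:

```python
# Hedged sketch: token-level symptom labeling (positive/negative/neutral)
# with a linear-chain CRF from the sklearn-crfsuite package.
import sklearn_crfsuite

def word_feats(sent, i):
    w = sent[i]
    return {"lower": w.lower(), "is_first": i == 0,
            "prev": sent[i - 1].lower() if i else "<s>"}

sents = [["Patient", "reports", "nausea"],
         ["Denies", "pain"],
         ["Followup", "in", "two", "weeks"]]
labels = [["neutral", "neutral", "positive"],
          ["neutral", "negative"],
          ["neutral", "neutral", "neutral", "neutral"]]

X = [[word_feats(s, i) for i in range(len(s))] for s in sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, labels)

test = ["reports", "fatigue"]
print(crf.predict([[word_feats(test, i) for i in range(len(test))]]))
```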
2018-11-09
Retreatment and EMS were completed using a dental operating microscope (Zeiss OPMI PROergo) and contemporary materials and techniques. Retreatment...paralleling technique and external cone positioning device (XCP) using size 2 digital sensors (Kodak RVG 6100). A dental x-ray machine (Planmeca...EMS and retreatment were calculated. Examiners used MiPACS dental enterprise viewer (LEAD Technologies Inc, Charlotte, NC) to interpret randomized
A tale of two "forests": random forest machine learning AIDS tropical forest carbon mapping.
Mascaro, Joseph; Asner, Gregory P; Knapp, David E; Kennedy-Bowdoin, Ty; Martin, Roberta E; Anderson, Christopher; Higgins, Mark; Chadwick, K Dana
2014-01-01
Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including (in the latter case) x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (i.e., called "out-of-bag"), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% by Random Forest without spatial context. With the 60% improvement in explained variation, RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha(-1) when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.
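The spatial-context comparison reduces to fitting the same random forest with and without x, y coordinates as predictors, then scoring on a held-out half of the data. A hedged sketch on a synthetic, spatially structured surface (all values are placeholders, not the Amazon data):

```python
# Sketch: random forest regression with and without map coordinates as
# features, validated on a held-out half, reporting R^2 and RMSE.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 4000
xy = rng.uniform(0, 100, size=(n, 2))                 # map coordinates
covars = rng.normal(size=(n, 5))                      # e.g., satellite layers
carbon = (covars[:, 0] * 10 + np.sin(xy[:, 0] / 10) * 20
          + rng.normal(scale=5, size=n))              # spatially structured

half = n // 2                                         # hold out half the data
for X in (covars, np.hstack([covars, xy])):           # without / with x, y
    rf = RandomForestRegressor(random_state=0).fit(X[:half], carbon[:half])
    pred = rf.predict(X[half:])
    print(r2_score(carbon[half:], pred),
          np.sqrt(mean_squared_error(carbon[half:], pred)))
```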
A Tale of Two “Forests”: Random Forest Machine Learning Aids Tropical Forest Carbon Mapping
Mascaro, Joseph; Asner, Gregory P.; Knapp, David E.; Kennedy-Bowdoin, Ty; Martin, Roberta E.; Anderson, Christopher; Higgins, Mark; Chadwick, K. Dana
2014-01-01
Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including (in the latter case) x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (i.e., called “out-of-bag”), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% by Random Forest without spatial context. With the 60% improvement in explained variation, RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha−1 when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation. PMID:24489686
Xie, X S; Qi, C; Du, X Y; Shi, W W; Zhang, M
2016-02-20
To investigate the features of hand-transmitted vibration of common vibration tools in the workplace for automobile casting and assembly. From September to October 2014, measurement and spectral analysis were performed for 16 typical hand tools (including percussion drill, pneumatic wrench, grinding machine, internal grinder, and arc welding machine) in 6 workplaces for automobile casting and assembly according to ISO 5349-1-2001 Mechanical vibration-Measurement and evaluation of human exposure to hand-transmitted vibration-Part 1: General requirements and ISO 5349-2-2001 Mechanical vibration-Measurement and evaluation of human exposure to hand-transmitted vibration-Part 2: Practical guidance for measurement in the workplace. The vibration acceleration waveforms of the shearing machine, arc welding machine, and pneumatic wrench were mainly impact and random waves, while those of the internal grinder, angle grinder, percussion drill, and grinding machine were mainly long- and short-period waves. The daily exposure duration to vibration for the electric wrench, pneumatic wrench, shearing machine, percussion drill, and internal grinder was about 150 minutes, while that for the plasma cutting machine, angle grinder, grinding machine, bench grinder, and arc welding machine was about 400 minutes. The range of the vibration total value (ahv) was as follows: pneumatic wrench 0.30-11.04 m/s^2, grinding wheel 1.61-8.97 m/s^2, internal grinder 1.46-8.70 m/s^2, percussion drill 11.10-14.50 m/s^2, and arc welding machine 0.21-2.18 m/s^2. The workers engaged in cleaning had the longest daily exposure duration to vibration, and the effective value of 8-hour energy-equivalent frequency-weighted acceleration [A(8)] for them was 8.03 m/s^2, while this value for workers engaged in assembly was 4.78 m/s^2. The frequency spectrogram with a 1/3-octave frequency interval showed that the grinding machine, angle grinder, and percussion drill had high vibration accelerations, and the vibration limit curve was recommended for those used for more than 400 min/d. The workers who are engaged in cleaning, grinding, and a few positions of assembly, and who use the grinding machine, angle grinder, internal grinder, and percussion drill, are exposed to vibrations with a high vibration acceleration and at a high position of the frequency spectrum. The hand-transmitted vibration in the positions of cutting, polishing, and cleaning in automobile casting causes great harm, and the harm caused by the pneumatic wrench in automobile assembly should be taken seriously.
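The A(8) values quoted above follow the ISO 5349-1 energy-equivalence rule A(8) = ahv · sqrt(T/T0) with T0 = 8 h. A quick check with round numbers; the choice of tool and its ahv below are assumptions for illustration, not figures attributed to specific workers in the paper:

```python
# ISO 5349-1 daily vibration exposure: A(8) = a_hv * sqrt(T / T0), T0 = 8 h.
a_hv = 8.8          # m/s^2, assumed vibration total value of a grinding tool
T = 400 / 60        # hours of daily exposure (about 400 min/d)
A8 = a_hv * (T / 8) ** 0.5
print(f"A(8) = {A8:.2f} m/s^2")   # ~8.0 m/s^2, near the cleaning figure
```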
Feed mechanism and method for feeding minute items
Stringer, Timothy Kent; Yerganian, Simon Scott
2012-11-06
A feeding mechanism and method for feeding minute items, such as capacitors, resistors, or solder preforms. The mechanism is adapted to receive a plurality of the randomly-positioned and randomly-oriented extremely small or minute items, and to isolate, orient, and position the items in a specific repeatable pickup location wherefrom they may be removed for use by, for example, a computer-controlled automated assembly machine. The mechanism comprises a sliding shelf adapted to receive and support the items; a wiper arm adapted to achieve a single even layer of the items; and a pushing arm adapted to push the items into the pickup location. The mechanism can be adapted for providing the items with a more exact orientation, and can also be adapted for use in a liquid environment.
Feed mechanism and method for feeding minute items
Stringer, Timothy Kent [Bucyrus, KS; Yerganian, Simon Scott [Lee's Summit, MO
2009-10-20
A feeding mechanism and method for feeding minute items, such as capacitors, resistors, or solder preforms. The mechanism is adapted to receive a plurality of the randomly-positioned and randomly-oriented extremely small or minute items, and to isolate, orient, and position one or more of the items in a specific repeatable pickup location wherefrom they may be removed for use by, for example, a computer-controlled automated assembly machine. The mechanism comprises a sliding shelf adapted to receive and support the items; a wiper arm adapted to achieve a single even layer of the items; and a pushing arm adapted to push the items into the pickup location. The mechanism can be adapted for providing the items with a more exact orientation, and can also be adapted for use in a liquid environment.
Fluid dynamics during Random Positioning Machine micro-gravity experiments
NASA Astrophysics Data System (ADS)
Leguy, Carole A. D.; Delfos, René; Pourquie, Mathieu J. B. M.; Poelma, Christian; Westerweel, Jerry; van Loon, Jack J. W. A.
2017-06-01
A Random Positioning Machine (RPM) is a device used to study the role of gravity in biological systems. This is accomplished through continuous reorientation of the sample such that the net influence of gravity is randomized over time. The aim of this study is to predict fluid flow behavior during such RPM simulated microgravity studies, which may explain differences found between RPM and space flight experiments. An analytical solution is given for a cylinder as a model for an experimental container. Then, a dual-axis rotating frame is used to mimic the motion characteristics of an RPM with sinusoidal rotation frequencies of 0.2 Hz and 0.1 Hz, while Particle Image Velocimetry is used to measure the velocity field inside a flask. To reproduce the same experiment numerically, a Direct Numerical Simulation model is used. The analytical model predicts that an increase in the Womersley number leads to higher shear stresses at the cylinder wall and a decrease in fluid angular velocity inside the cylinder. The experimental results show that periodic single-axis rotation induces a fluid motion parallel to the wall and that a complex flow is observed for two-axis rotation, with a maximum wall shear stress of 8.0 mPa (80 mdyne/cm^2). The experimental and numerical results show that oscillatory motion inside an RPM induces flow motion that can, depending on the experimental samples, reduce the quality of the simulated microgravity. Thus, it is crucial to determine the appropriate oscillatory frequency of the axes when designing biological experiments.
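For orientation, the Womersley number compares oscillation frequency with viscous diffusion, α = R · sqrt(ω/ν). The values below are illustrative assumptions (water-like medium, centimetre-scale container), not parameters reported in the study:

```python
# Womersley number for the two RPM rotation frequencies, under assumed
# container radius and fluid viscosity.
import math

R = 0.015                      # container radius, m (assumed)
nu = 1.0e-6                    # kinematic viscosity of water, m^2/s
for f in (0.1, 0.2):           # RPM rotation frequencies from the study, Hz
    omega = 2 * math.pi * f
    alpha = R * math.sqrt(omega / nu)
    print(f"f = {f} Hz -> Womersley number alpha = {alpha:.0f}")
```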
ERIC Educational Resources Information Center
Kocken, Paul L.; Eeuwijk, Jennifer; van Kesteren, Nicole M.C.; Dusseldorp, Elise; Buijs, Goof; Bassa-Dafesh, Zeina; Snel, Jeltje
2012-01-01
Background: Vending machines account for food sales and revenue in schools. We examined 3 strategies for promoting the sale of lower-calorie food products from vending machines in high schools in the Netherlands. Methods: A school-based randomized controlled trial was conducted in 13 experimental schools and 15 control schools. Three strategies…
The use of instruments for gravity related research
NASA Astrophysics Data System (ADS)
van Loon, J. J. W.
The first experiments using machines and instruments to manipulate gravity, and thus to learn about the impact of gravity on living systems, were performed by T. A. Knight in 1806, exactly two centuries ago. What have we learned from these experiments, and in particular what have we learned about the use of instruments to reveal the impact of gravity and rotation on plants and other living systems? In this overview paper I will introduce the use of various instruments for gravity related research. From water wheel to Random Positioning Machine (RPM), from clinostat to Free Fall Machine (FFM) and Rotating Wall Vessel (RWV), the usefulness and working principles of these microgravity simulators will be discussed. We will also discuss the question whether the RPM is a useful microgravity simulator and how to interpret experimental results. This work is supported by NWO-ALW-SRON grant MG-057.
NASA Astrophysics Data System (ADS)
Warnke, Elisabeth; Kopp, Sascha; Wehland, Markus; Hemmersbach, Ruth; Bauer, Johann; Pietsch, Jessica; Infanger, Manfred; Grimm, Daniela
2016-06-01
The ground-based facilities 2D clinostat (CN) and Random Positioning Machine (RPM) were designed to simulate microgravity conditions on Earth. With the support of the CORA-ESA-GBF program we could use both facilities to investigate the impact of simulated microgravity on normal and malignant thyroid cells. In this review we report on the current knowledge of thyroid cancer cells and normal thyrocytes grown under altered gravity conditions, with a special focus on growth behaviour, changes in the gene expression pattern and protein content, as well as on altered secretion behaviour of the cells. We reviewed data obtained from normal thyrocytes and cell lines (two poorly differentiated follicular thyroid cancer cell lines, FTC-133 and ML-1, as well as the normal thyroid cell lines Nthy-ori 3-1 and HTU-5). Thyroid cells cultured under conditions of simulated microgravity (RPM and CN) and in space showed similar changes with respect to spheroid formation. In static 1 g control cultures no spheroids were detectable. Changes in the regulation of cytokines are discussed as being involved in the formation of multicellular spheroids (MCS). The ESA-GBF program helps scientists to prepare future spaceflight experiments and, furthermore, it might help to identify targets for drug therapy against thyroid cancer.
NASA Astrophysics Data System (ADS)
Gershovich, J. G.; Buravkova, L. B.
2008-06-01
Recent studies have shown that simulated microgravity (SMG) not only alters the proliferation and differentiation of osteoblasts but also affects the osteogenic capacity of mesenchymal stem cells (MSCs) from various sources. For the present study we used a system that simulates the effects of microgravity, the Random Positioning Machine (RPM). Cultured MSCs from human bone marrow and human osteoblasts (OBs) were exposed to SMG on the RPM for 10-40 days. Induced osteogenesis of these progenitor cells was compared with the appropriate static (1 g) and dynamic (horizontal shaker) controls. Clinorotated OBs and MSCs showed a lower proliferation rate than the static and dynamic control groups during the early period of SMG. A significant reduction of ALP activity was detected after 10 days of clinorotation of MSCs. There was no such dramatic difference in ALP activity of MSC-derived cells between the SMG and control groups after 20 days of clinorotation, but the expression of ALP was still reduced. Moreover, virtually no matrix mineralization was found in OBs cultured under SMG conditions in the presence of differentiation stimuli. A similar effect was observed when we assayed matrix calcification of MSC-derived cultures. Thus, our results confirm the low-gravity-mediated reduction of osteogenesis in different osteogenic precursor cells and may help clarify the mechanisms of bone loss during spaceflight.
Wuest, Simon L; Richard, Stéphane; Kopp, Sascha; Grimm, Daniela; Egli, Marcel
2015-01-01
Random Positioning Machines (RPMs) have been used for many years as a ground-based model to simulate microgravity. In this review we discuss several aspects of the RPM. Recent technological development has expanded the operative range of the RPM substantially. New possibilities of live cell imaging and partial gravity simulations, for example, are of particular interest. For obtaining valuable and reliable results from RPM experiments, the appropriate use of the RPM is of utmost importance. The simulation of microgravity requires that the RPM's rotation is faster than the biological process under study, but not so fast that undesired side effects appear. It remains a legitimate question, however, whether the RPM can accurately and reliably simulate microgravity conditions comparable to real microgravity in space. We attempt to answer this question by mathematically analyzing the forces working on the samples while they are mounted on the operating RPM and by comparing data obtained under real microgravity in space and simulated microgravity on the RPM. In conclusion and after taking the mentioned constraints into consideration, we are convinced that simulated microgravity experiments on the RPM are a valid alternative for conducting examinations on the influence of the force of gravity in a fast and straightforward approach.
Wuest, Simon L.; Richard, Stéphane; Kopp, Sascha
2015-01-01
Random Positioning Machines (RPMs) have been used for many years as a ground-based model to simulate microgravity. In this review we discuss several aspects of the RPM. Recent technological development has expanded the operative range of the RPM substantially. New possibilities of live cell imaging and partial gravity simulations, for example, are of particular interest. For obtaining valuable and reliable results from RPM experiments, the appropriate use of the RPM is of utmost importance. The simulation of microgravity requires that the RPM's rotation is faster than the biological process under study, but not so fast that undesired side effects appear. It remains a legitimate question, however, whether the RPM can accurately and reliably simulate microgravity conditions comparable to real microgravity in space. We attempt to answer this question by mathematically analyzing the forces working on the samples while they are mounted on the operating RPM and by comparing data obtained under real microgravity in space and simulated microgravity on the RPM. In conclusion and after taking the mentioned constraints into consideration, we are convinced that simulated microgravity experiments on the RPM are a valid alternative for conducting examinations on the influence of the force of gravity in a fast and straightforward approach. PMID:25649075
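To make the force analysis mentioned in this review concrete, here is a minimal numerical sketch (assumed geometry and rotation rates, not the authors' derivation) of why dual-axis rotation averages the gravity vector toward zero in the sample frame:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

# Assumed setup: the outer frame rotates about the lab X axis, the inner
# frame about the outer frame's Y axis; rates are incommensurate so the
# orientation never repeats exactly. All numbers are illustrative.
g_lab = np.array([0.0, 0.0, -1.0])        # gravity in the lab frame [g]
w_outer, w_inner = 1.0, 0.61803           # angular rates [rad/s]
t = np.arange(0.0, 1800.0, 0.05)          # half an hour, 0.05 s steps

g_sample = np.empty((t.size, 3))
for i, ti in enumerate(t):
    # gravity expressed in the rotating sample frame
    g_sample[i] = rot_y(w_inner * ti).T @ rot_x(w_outer * ti).T @ g_lab

mean_g = g_sample.mean(axis=0)
print("time-averaged gravity vector:", np.round(mean_g, 4))
print("residual magnitude: %.4f g" % np.linalg.norm(mean_g))
```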
Probability machines: consistent probability estimation using nonparametric learning machines.
Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A
2012-01-01
Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
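The abstract points to R packages with ready-made implementations; as a hedged transliteration of the probability-machine idea into Python with scikit-learn (a tooling assumption, not the paper's code), a regression forest fitted to a 0/1 response estimates P(Y = 1 | X) directly, because E[Y | X] equals that probability for binary Y:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; any binary-outcome dataset works the same way.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Regression forest on the 0/1 labels: its predictions are individual
# probability estimates, the core of the "probability machine" idea.
rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(X_tr, y_tr.astype(float))
p_hat = rf.predict(X_te)
print("first five estimated probabilities:", np.round(p_hat[:5], 3))
```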
Comparison of two different types of heat and moisture exchangers in ventilated patients.
Ahmed, Syed Moied; Mahajan, Jyotsna; Nadeem, Abu
2009-09-01
To compare the efficacy of two different types of Heat and Moisture Exchangers (HME filters) in reducing transmission of infection from the patient to the ventilator and vice versa, and also their cost effectiveness. Randomized, controlled, double-blind, prospective study. Sixty patients of either sex, aged between 20 and 60 years, admitted to the ICU from May 1, 2007 to July 31, 2007 and requiring mechanical ventilation were screened for the study. Following intubation of the patients, the HME device was attached to the breathing circuit randomly by the chit-in-a-box method. The patients were divided into two groups according to the HME filters attached. Both groups were comparable with respect to age and sex ratio. Of the Type A HME filters, 80% showed growth on the patient end within 24 h, and in 27% of filters the culture was positive on both the patient and the machine ends. The organisms detected were Staphylococcus aureus, Escherichia coli and Pseudomonas aeruginosa, and they correlated with the endotracheal aspirate culture. After 48 h, 87% of filters developed organisms on the patient end, whereas 64% of filters were culture positive on both the patient and the machine ends. Of the Type B HME filters, 70% showed growth on the patient end after 24 h. The organisms detected were S. aureus, E. coli, P. aeruginosa and Acinetobacter. Thirty percent of filters were culture negative on both the patient and machine ends. No growth was found on the machine end in any of the filters after 24 h. After 48 h, 73% of the filters had microbial growth on the patient end, whereas only 3% of filters had growth (S. aureus) on the machine end only. Seven percent had growth on both the patient and the machine ends. The microorganisms detected on the HME filters correlated with the endotracheal aspirate cultures. HME filter Type B (study group) was significantly better at reducing contamination of the ventilator from the patient than Type A (control group), which was routinely used in our ICU. The Type B filter was found to be effective for at least 48 h. This study can also be applied to patients coming to the emergency department (ED) and requiring emergency surgery and postoperative ventilation, and to trauma patients (e.g., flail chest, head injury) requiring ventilatory support, to prevent them from acquiring ventilator-associated pneumonia (VAP).
NASA Astrophysics Data System (ADS)
Olory Agomma, R.; Vázquez, C.; Cresson, T.; De Guise, J.
2018-02-01
Most algorithms that detect and identify anatomical structures in medical images require either to be initialized close to the target structure, to know that the structure is present in the image, or to be trained on a homogeneous database (e.g. all full body or all lower limbs). Detecting these structures when there is no guarantee that the structure is present in the image, or when the image database is heterogeneous (mixed configurations), is a challenge for automatic algorithms. In this work we compared two state-of-the-art machine learning techniques in order to determine which one is the most appropriate for predicting target locations based on image patches. Knowing the positions of thirteen landmark points, labelled by an expert in EOS frontal radiographs, we learn the displacement between salient points detected in the image and these thirteen landmarks. The learning step is carried out with two machine learning approaches: Convolutional Neural Network (CNN) and Random Forest (RF). The automatic detection of the thirteen landmark points in a new image is then obtained by averaging the positions of each landmark estimated from all the salient points in the new image. For CNN and RF respectively, we obtain an average prediction error (mean ± standard deviation, in mm) of 29 ± 18 and 30 ± 21 for the thirteen landmark points, indicating the approximate location of anatomical regions. On the other hand, the learning time is 9 days for CNN versus 80 minutes for RF. We provide a comparison of the results between the two machine learning approaches.
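A hedged sketch of the Random Forest variant of this displacement-learning scheme (data, feature dimensions, and the salient-point detector are stand-ins invented for illustration, not the study's setup):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Training rows: one descriptor per salient point; the target is the 2-D
# displacement from that point to one landmark. At test time, every
# salient point in a new image casts a vote (its position plus its
# predicted displacement), and the votes are averaged.
n_points, n_feat = 5000, 32
descriptors = rng.normal(size=(n_points, n_feat))      # invented patch features
true_disp = descriptors[:, :2] * 3.0 + rng.normal(scale=0.5,
                                                  size=(n_points, 2))

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(descriptors, true_disp)                         # multi-output RF

# New image: salient point coordinates plus their descriptors (stand-ins).
new_xy = rng.uniform(0, 500, size=(40, 2))
new_desc = rng.normal(size=(40, n_feat))
landmark_votes = new_xy + rf.predict(new_desc)         # one vote per point
print("estimated landmark position:", landmark_votes.mean(axis=0))
```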
Paging memory from random access memory to backing storage in a parallel computer
Archer, Charles J; Blocksome, Michael A; Inglett, Todd A; Ratterman, Joseph D; Smith, Brian E
2013-05-21
Paging memory from random access memory (`RAM`) to backing storage in a parallel computer that includes a plurality of compute nodes, including: executing a data processing application on a virtual machine operating system in a virtual machine on a first compute node; providing, by a second compute node, backing storage for the contents of RAM on the first compute node; and swapping, by the virtual machine operating system in the virtual machine on the first compute node, a page of memory from RAM on the first compute node to the backing storage on the second compute node.
Analysis of Machine Learning Techniques for Heart Failure Readmissions.
Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M
2016-11-01
The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and the remaining patients formed a validation set, evaluated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistics, 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than with LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
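A minimal sketch of this style of evaluation (synthetic data, not the trial's variables): fit LR and a random forest on a derivation set, then bootstrap the validation set 100 times to obtain mean C statistics:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced stand-in data (readmission-like event rate).
X, y = make_classification(n_samples=3000, n_features=20, weights=[0.8],
                           random_state=1)
X_d, X_v, y_d, y_v = train_test_split(X, y, test_size=0.5, random_state=1)

models = {"LR": LogisticRegression(max_iter=1000),
          "RF": RandomForestClassifier(n_estimators=300, random_state=1)}
rng = np.random.default_rng(1)
for name, model in models.items():
    model.fit(X_d, y_d)
    p = model.predict_proba(X_v)[:, 1]
    aucs = []
    for _ in range(100):                   # bootstrap the validation set
        idx = rng.integers(0, len(y_v), len(y_v))
        aucs.append(roc_auc_score(y_v[idx], p[idx]))
    print(f"{name}: mean C statistic = {np.mean(aucs):.3f}")
```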
NASA Astrophysics Data System (ADS)
Ksoll, Victor F.; Gouliermis, Dimitrios A.; Klessen, Ralf S.; Grebel, Eva K.; Sabbi, Elena; Anderson, Jay; Lennon, Daniel J.; Cignoni, Michele; de Marchi, Guido; Smith, Linda J.; Tosi, Monica; van der Marel, Roeland P.
2018-05-01
The Hubble Tarantula Treasury Project (HTTP) has provided unprecedented photometric coverage of the entire starburst region of 30 Doradus down to the half-solar-mass limit. We use the deep stellar catalogue of HTTP to identify all the pre-main-sequence (PMS) stars of the region, i.e., stars that have not yet started their lives on the main sequence. The photometric distinction of these stars from the more evolved populations is not a trivial task due to several factors that alter their colour-magnitude diagram positions. The identification of PMS stars thus requires sophisticated statistical methods. We employ machine learning classification techniques on the HTTP survey of more than 800,000 sources to identify the PMS stellar content of the observed field. Our methodology consists of (1) carefully selecting the most probable low-mass PMS stellar population of the star-forming cluster NGC 2070, (2) using this sample to train classification algorithms to build a predictive model for PMS stars, and (3) applying this model in order to identify the most probable PMS content across the entire Tarantula Nebula. We employ Decision Tree, Random Forest and Support Vector Machine classifiers to categorise the stars as PMS and non-PMS. The Random Forest and Support Vector Machine provided the most accurate models, predicting about 20,000 sources with a candidate probability higher than 50 percent, and almost 10,000 PMS candidates with a probability higher than 95 percent. This is the richest and most accurate photometric catalogue of extragalactic PMS candidates across the extent of a whole star-forming complex.
Lima, Fabiano F; Camillo, Carlos A; Gobbo, Luis A; Trevisan, Iara B; Nascimento, Wesley B B M; Silva, Bruna S A; Lima, Manoel C S; Ramos, Dionei; Ramos, Ercy M C
2018-03-01
The objectives of the study were to compare the effects of resistance training using either low-cost, portable elastic tubing or conventional weight machines on muscle force, functional exercise capacity, and health-related quality of life (HRQOL) in middle-aged to older healthy adults. In this clinical trial, twenty-nine middle-aged to older healthy adults were randomly assigned to one of three a priori defined groups: resistance training with elastic tubing (ETG; n = 10), conventional resistance training with weight machines (CTG; n = 9), or a control group (CG; n = 10). Both the ETG and CTG followed a 12-week resistance training programme (3x/week, upper and lower limbs). Muscle force, functional exercise capacity and HRQOL were evaluated at baseline, 6 and 12 weeks. The CG underwent the three evaluations with no formal intervention or activity counseling provided. The ETG and CTG similarly and significantly increased muscle force (Δ16-44% in the ETG and Δ25-46% in the CTG, p < 0.05 for both) and functional exercise capacity (ETG Δ4 ± 4% and CTG Δ6 ± 8%; p < 0.05 for both). Improvement in the "pain" domain of HRQOL was observed only in the CTG (Δ21 ± 26%, p = 0.037). The CG showed no statistical improvement in any of the variables investigated. Resistance training using elastic tubing (a low-cost and portable tool) and conventional resistance training using weight machines promoted similar positive effects on peripheral muscle force and functional exercise capacity in middle-aged to older healthy adults.
Tensor manifold-based extreme learning machine for 2.5-D face recognition
NASA Astrophysics Data System (ADS)
Chong, Lee Ying; Ong, Thian Song; Teoh, Andrew Beng Jin
2018-01-01
We explore the use of the Gabor regional covariance matrix (GRCM), a flexible matrix-based descriptor that embeds the Gabor features in the covariance matrix, as a 2.5-D facial descriptor and an effective means of feature fusion for 2.5-D face recognition problems. Despite its promise, matching is not a trivial problem for the GRCM, since it is a special instance of a symmetric positive definite (SPD) matrix that resides in non-Euclidean space as a tensor manifold. This implies that the GRCM is incompatible with existing vector-based classifiers and distance matchers. Therefore, we bridge the gap between the GRCM and the extreme learning machine (ELM), a vector-based classifier, for the 2.5-D face recognition problem. We put forward a tensor-manifold-compliant ELM and its two variants by embedding the SPD matrix randomly into a reproducing kernel Hilbert space (RKHS) via tensor kernel functions. To preserve the pairwise distances of the embedded data, we orthogonalize the randomly embedded SPD matrix. Hence, classification can be done using a simple ridge regressor, an integrated component of ELM, on the random orthogonal RKHS. Experimental results show that our proposed method is able to improve the recognition performance and further enhance the computational efficiency.
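The following is a simplified sketch of the ingredients named above, not the authors' tensor-kernel ELM: SPD descriptors are mapped to vectors with the matrix logarithm (the log-Euclidean map, a common alternative to the RKHS embedding used here), pushed through an ELM-style random hidden layer whose weights are orthogonalised by QR, and classified with ridge regression:

```python
import numpy as np
from sklearn.linear_model import RidgeClassifier

rng = np.random.default_rng(0)

def random_spd(d):
    # random symmetric positive definite matrix (stand-in descriptor)
    a = rng.normal(size=(d, d))
    return a @ a.T + d * np.eye(d)

def spd_log_vector(m):
    # log-Euclidean map: matrix logarithm via eigendecomposition,
    # then the upper triangle as a plain feature vector
    w, v = np.linalg.eigh(m)
    log_m = v @ np.diag(np.log(w)) @ v.T
    return log_m[np.triu_indices(m.shape[0])]

d, n = 8, 200
mats = [random_spd(d) for _ in range(n)]
y = rng.integers(0, 2, n)                        # dummy labels

vecs = np.array([spd_log_vector(m) for m in mats])
W = rng.normal(size=(vecs.shape[1], 32))         # random ELM hidden weights
Q, _ = np.linalg.qr(W)                           # orthogonalised columns
H = np.tanh(vecs @ Q)                            # hidden-layer activations

clf = RidgeClassifier(alpha=1.0).fit(H, y)       # ELM output = ridge solve
print("training accuracy:", clf.score(H, y))
```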
Modeling Music Emotion Judgments Using Machine Learning Methods
Vempala, Naresh N.; Russo, Frank A.
2018-01-01
Emotion judgments and five channels of physiological data were obtained from 60 participants listening to 60 music excerpts. Various machine learning (ML) methods were used to model the emotion judgments inclusive of neural networks, linear regression, and random forests. Input for models of perceived emotion consisted of audio features extracted from the music recordings. Input for models of felt emotion consisted of physiological features extracted from the physiological recordings. Models were trained and interpreted with consideration of the classic debate in music emotion between cognitivists and emotivists. Our models supported a hybrid position wherein emotion judgments were influenced by a combination of perceived and felt emotions. In comparing the different ML approaches that were used for modeling, we conclude that neural networks were optimal, yielding models that were flexible as well as interpretable. Inspection of a committee machine, encompassing an ensemble of networks, revealed that arousal judgments were predominantly influenced by felt emotion, whereas valence judgments were predominantly influenced by perceived emotion. PMID:29354080
Modeling Music Emotion Judgments Using Machine Learning Methods.
Vempala, Naresh N; Russo, Frank A
2017-01-01
Emotion judgments and five channels of physiological data were obtained from 60 participants listening to 60 music excerpts. Various machine learning (ML) methods were used to model the emotion judgments inclusive of neural networks, linear regression, and random forests. Input for models of perceived emotion consisted of audio features extracted from the music recordings. Input for models of felt emotion consisted of physiological features extracted from the physiological recordings. Models were trained and interpreted with consideration of the classic debate in music emotion between cognitivists and emotivists. Our models supported a hybrid position wherein emotion judgments were influenced by a combination of perceived and felt emotions. In comparing the different ML approaches that were used for modeling, we conclude that neural networks were optimal, yielding models that were flexible as well as interpretable. Inspection of a committee machine, encompassing an ensemble of networks, revealed that arousal judgments were predominantly influenced by felt emotion, whereas valence judgments were predominantly influenced by perceived emotion.
A machine learning-based framework to identify type 2 diabetes through electronic health records
Zheng, Tao; Xie, Wei; Xu, Liling; He, Xiaoying; Zhang, Ya; You, Mingrong; Yang, Gong; Chen, You
2016-01-01
Objective: To discover diverse genotype-phenotype associations affiliated with Type 2 Diabetes Mellitus (T2DM) via genome-wide association study (GWAS) and phenome-wide association study (PheWAS), more cases (T2DM subjects) and controls (subjects without T2DM) need to be identified (e.g., via Electronic Health Records (EHR)). However, existing expert-based identification algorithms often suffer from a low recall rate and could miss a large number of valuable samples under conservative filtering standards. The goal of this work is to develop a semi-automated framework based on machine learning, as a pilot study, to liberalize filtering criteria to improve the recall rate while keeping a low false-positive rate. Materials and methods: We propose a data-informed framework for identifying subjects with and without T2DM from EHR via feature engineering and machine learning. We evaluate and contrast the identification performance of widely used machine learning models within our framework, including k-Nearest-Neighbors, Naïve Bayes, Decision Tree, Random Forest, Support Vector Machine and Logistic Regression. Our framework was applied to 300 patient samples (161 cases, 60 controls and 79 unconfirmed subjects), randomly selected from a diabetes-related cohort of 23,281 subjects retrieved from a regional distributed EHR repository covering 2012 to 2014. Results: We apply the top-performing machine learning algorithms to the engineered features. We benchmark and contrast the accuracy, precision, AUC, sensitivity and specificity of the classification models against the state-of-the-art expert algorithm for identification of T2DM subjects. Our results indicate that the framework achieved high identification performance (∼0.98 in average AUC), much higher than the state-of-the-art algorithm (0.71 in AUC). Discussion: Expert algorithm-based identification of T2DM subjects from EHR is often hampered by high missing rates due to conservative selection criteria. Our framework leverages machine learning and feature engineering to loosen such selection criteria to achieve a high identification rate of cases and controls. Conclusions: Our proposed framework demonstrates a more accurate and efficient approach for identifying subjects with and without T2DM from EHR. PMID:27919371
A machine learning-based framework to identify type 2 diabetes through electronic health records.
Zheng, Tao; Xie, Wei; Xu, Liling; He, Xiaoying; Zhang, Ya; You, Mingrong; Yang, Gong; Chen, You
2017-01-01
To discover diverse genotype-phenotype associations affiliated with Type 2 Diabetes Mellitus (T2DM) via genome-wide association study (GWAS) and phenome-wide association study (PheWAS), more cases (T2DM subjects) and controls (subjects without T2DM) need to be identified (e.g., via Electronic Health Records (EHR)). However, existing expert-based identification algorithms often suffer from a low recall rate and could miss a large number of valuable samples under conservative filtering standards. The goal of this work is to develop a semi-automated framework based on machine learning, as a pilot study, to liberalize filtering criteria to improve the recall rate while keeping a low false-positive rate. We propose a data-informed framework for identifying subjects with and without T2DM from EHR via feature engineering and machine learning. We evaluate and contrast the identification performance of widely used machine learning models within our framework, including k-Nearest-Neighbors, Naïve Bayes, Decision Tree, Random Forest, Support Vector Machine and Logistic Regression. Our framework was applied to 300 patient samples (161 cases, 60 controls and 79 unconfirmed subjects), randomly selected from a diabetes-related cohort of 23,281 subjects retrieved from a regional distributed EHR repository covering 2012 to 2014. We apply the top-performing machine learning algorithms to the engineered features. We benchmark and contrast the accuracy, precision, AUC, sensitivity and specificity of the classification models against the state-of-the-art expert algorithm for identification of T2DM subjects. Our results indicate that the framework achieved high identification performance (∼0.98 in average AUC), much higher than the state-of-the-art algorithm (0.71 in AUC). Expert algorithm-based identification of T2DM subjects from EHR is often hampered by high missing rates due to conservative selection criteria. Our framework leverages machine learning and feature engineering to loosen such selection criteria to achieve a high identification rate of cases and controls. Our proposed framework demonstrates a more accurate and efficient approach for identifying subjects with and without T2DM from EHR. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
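A hedged sketch of the benchmarking loop such a framework implies (synthetic stand-in data, since the EHR features themselves are not public; the model list follows the abstract):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score, roc_auc_score
from sklearn.model_selection import cross_val_predict
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# 300 samples to mirror the study size; features are synthetic stand-ins.
X, y = make_classification(n_samples=300, n_features=25, random_state=7)
models = {
    "kNN": KNeighborsClassifier(),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=7),
    "Random Forest": RandomForestClassifier(random_state=7),
    "SVM": SVC(probability=True, random_state=7),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    # cross-validated probabilities keep the comparison honest
    p = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
    print(f"{name:20s} AUC={roc_auc_score(y, p):.3f} "
          f"sensitivity={recall_score(y, p > 0.5):.3f}")
```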
NASA Astrophysics Data System (ADS)
Peana, A. T.; Marzocco, S.; Bianco, G.; Autore, G.; Pinto, A.; Pippia, P.
2008-06-01
The aim of this work is to evaluate rat intestinal transit as well as the expression of enzymes involved in this process and in gastrointestinal homeostasis: cyclooxygenase (COX-1 and COX-2), the inducible isoform of nitric oxide synthase (iNOS), ICAM-1, and the heat shock proteins HSP70 and HSP90. Modeled microgravity conditions were produced using a three-dimensional clinostat, the Random Positioning Machine (RPM). Our results indicate that modeled microgravity significantly reduces rat intestinal transit. Western blot analysis of small intestine tissues from RPM rats reveals a significant increase in iNOS expression, a significant reduction in COX-2 levels (while COX-1 expression remains unaltered), and a significant increase in ICAM-1 and HSP70 expression. A significant increase in HSP90 expression in the stomach also indicates a strong effect of simulated low g on gastrointestinal homeostasis.
Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS
NASA Astrophysics Data System (ADS)
Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.
2018-05-01
State-of-the-art exoplanet transit surveys are producing ever-increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12,368 injected planets against 27,496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.
Interferometric correction system for a numerically controlled machine
Burleson, Robert R.
1978-01-01
An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
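As a rough illustration of the correction logic described above (all constants and names are invented; the patent specifies hardware, not this code), the decision rule reduces to comparing the commanded and measured positions against a preselected error and injecting or deleting pulses:

```python
# Minimal sketch, assuming illustrative values: the commanded position is
# accumulated from the command pulse train, the measured position comes
# from the laser interferometer, and pulses are added when the tool lags
# by more than the preselected error or deleted when it leads.
PULSE = 0.001          # tool travel per pulse [mm], assumed
ERROR_LIMIT = 0.004    # preselected error threshold [mm], assumed

def correct(commanded_mm, measured_mm):
    """Return the number of pulses to add (+) or delete (-)."""
    error = commanded_mm - measured_mm      # positive means the tool lags
    if abs(error) <= ERROR_LIMIT:
        return 0                            # within tolerance: no change
    return round(error / PULSE)

# tool lagging 6 um behind the command -> 6 extra pulses are injected
print(correct(commanded_mm=10.000, measured_mm=9.994))
```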
Linear positioning laser calibration setup of CNC machine tools
NASA Astrophysics Data System (ADS)
Sui, Xiulin; Yang, Congjing
2002-10-01
The linear positioning laser calibration setup for CNC machine tools is capable of executing machine tool laser calibration and backlash compensation. Using this setup, hole locations on CNC machine tools will be correct, and machine tool geometry can be evaluated and adjusted. Machine tool laser calibration and backlash compensation is a simple and straightforward process. First, the setup 'finds' the stroke limits of the axis and the laser head is brought into correct alignment. Second, the machine axis is moved to the other extreme and the laser head alignment is refined using rotation and elevation adjustments. Finally, the machine is moved back to the start position and final alignment is verified. The stroke of the machine and the machine compensation interval dictate the amount of data required for each axis; these factors determine the time required for a thorough compensation of the linear positioning accuracy. The Laser Calibrator System monitors the material temperature and the air density, taking into consideration machine thermal growth and laser beam frequency. This linear positioning laser calibration setup can be used on CNC machine tools, CNC lathes, horizontal centers and vertical machining centers.
Method and system for controlling a permanent magnet machine
Walters, James E.
2003-05-20
Method and system for controlling the start of a permanent magnet machine are provided. The method assigns a parameter value indicative of an estimated initial rotor position of the machine. The machine is then energized with a level of current sufficiently high to start rotor motion in a desired direction, provided the initial rotor position estimate is sufficiently close to the actual rotor position. A sensing action determines whether any incremental changes in rotor position occur in response to the energizing action. If no changes in rotor position are sensed, the estimated rotor position is incrementally adjusted by a first set of angular values until changes in rotor position are sensed. Once changes in rotor position are sensed, a rotor alignment signal is provided as rotor motion continues; the alignment signal aligns the estimated rotor position relative to the actual rotor position. This alignment allows the machine to be operated over a wide speed range.
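A hedged sketch of this start-up sequence as a control loop (the angular step, signal names, and sensing interface are invented for illustration, not taken from the patent):

```python
# Minimal sketch, assuming a caller-supplied drive and sensing interface:
# energize at the estimated rotor angle; if no incremental motion is
# sensed, step the estimate through angular offsets until motion begins,
# then align the estimate to the sensed rotor position.
STEP_DEG = 30.0                      # assumed angular search increment

def start_machine(estimate_deg, energize, motion_sensed, sensed_angle):
    """energize/motion_sensed/sensed_angle are caller-supplied callables."""
    for k in range(int(360 / STEP_DEG)):
        candidate = (estimate_deg + k * STEP_DEG) % 360.0
        energize(candidate)          # current high enough to start motion
        if motion_sensed():
            return sensed_angle()    # align estimate as rotation continues
    raise RuntimeError("rotor did not move; check drive current")
```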
Zdravevski, Eftim; Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger
2017-01-01
Assessment of the health benefits associated with physical activity depends on the activity duration, intensity and frequency; therefore their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: Logistic Regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods relative to the total time including the missed ones, was up to 0.875. It could be further improved, up to 0.967, by applying post-classification rules that considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be achieved from either accelerometer position. Machine learning techniques can be used for automatic activity recognition, as they provide very accurate activity recognition, significantly more accurate than keeping a diary. Identification of jogging periods in adolescents can be performed using only one accelerometer. Performance-wise, there is no significant benefit from using accelerometers at both locations.
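For illustration, a minimal sliding-window pipeline in the spirit of the study (window length, features and labels are synthetic stand-ins; Extremely Randomized Trees is one of the four classifiers named above):

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

# Assumed sampling rate and the 60 s window from the abstract.
FS, WIN = 10, 60                              # 10 Hz, 60 s windows
rng = np.random.default_rng(3)
signal = rng.normal(size=FS * 3600)           # one hour of one-axis data
labels = rng.integers(0, 2, 3600 // WIN)      # dummy per-window labels

# Segment into fixed windows and compute simple per-window features.
windows = signal.reshape(-1, FS * WIN)
feats = np.column_stack([windows.mean(axis=1), windows.std(axis=1),
                         np.abs(np.diff(windows, axis=1)).mean(axis=1)])

clf = ExtraTreesClassifier(n_estimators=300, random_state=3)
clf.fit(feats, labels)
print("windows classified as jogging:", int(clf.predict(feats).sum()))
```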
Hanlon, John A.; Gill, Timothy J.
2001-01-01
Machine tools can be accurately measured and positioned on manufacturing machines within very small tolerances by use of an autocollimator on a 3-axis mount on the manufacturing machine, positioned so as to focus on a reference tooling ball or a machine tool; a digital camera connected to the viewing end of the autocollimator; and a marker and measure generator for receiving digital images from the camera, then displaying or measuring distances between the projection reticle and the reference reticle on the monitoring screen and relating the distances to the actual position of the autocollimator relative to the reference tooling ball. The images and measurements are used to set the position of the machine tool, to measure the size and shape of the machine tool tip, and to examine cutting edge wear.
FEATURE B. MACHINE GUN POSITION WITH LEWIS MOUNT, VIEW FACING ...
FEATURE B. MACHINE GUN POSITION WITH LEWIS MOUNT, VIEW FACING NORTHWEST. - Naval Air Station Barbers Point, Battery-Machine Gun Positions, South of Point Cruz Road & west of Coral Sea Road, Ewa, Honolulu County, HI
FEATURE C. MACHINE GUN POSITION WITH REMNANT OF MOUNT, VIEW ...
FEATURE C. MACHINE GUN POSITION WITH REMNANT OF MOUNT, VIEW FACING SOUTH-SOUTHEAST. - Naval Air Station Barbers Point, Battery-Machine Gun Positions, South of Point Cruz Road & west of Coral Sea Road, Ewa, Honolulu County, HI
FEATURE B. MACHINE GUN POSITION WITH LEWIS MOUNT, VIEW FACING ...
FEATURE B. MACHINE GUN POSITION WITH LEWIS MOUNT, VIEW FACING NORTHWEST (with scale stick). - Naval Air Station Barbers Point, Battery-Machine Gun Positions, South of Point Cruz Road & west of Coral Sea Road, Ewa, Honolulu County, HI
Health Promotion and Healthier Products Increase Vending Purchases: A Randomized Factorial Trial.
Hua, Sophia V; Kimmel, Lisa; Van Emmenes, Michael; Taherian, Rafi; Remer, Geraldine; Millman, Adam; Ickovics, Jeannette R
2017-07-01
The current food environment has a high prevalence of nutrient-sparse foods and beverages, most starkly seen in vending machine offerings. There are currently few studies that explore different interventions that might lead to healthier vending machine purchases. To examine how healthier product availability, price reductions, and/or promotional signs affect sales and revenue of snack and beverage vending machines. A 2×2×2 factorial randomized controlled trial was conducted. Students, staff, and employees on a university campus. All co-located snack and beverage vending machines (n=56, 28 snack and 28 beverage) were randomized into one of eight conditions: availability of healthier products and/or 25% price reduction for healthier items and/or promotional signs on machines. Aggregate sales and revenue data for the 5-month study period (February to June 2015) were compared with data from the same months 1 year prior. Analyses were conducted July 2015. The change in units sold and revenue between February through June 2014 and 2015. Linear regression models (main effects and interaction effects) and t test analyses were performed. The interaction between healthier product guidelines and promotional signs in snack vending machines documented increased revenue (P<0.05). Beverage machines randomized to meet healthier product guidelines documented increased units sold (P<0.05) with no revenue change. Price reductions alone had no effect, nor were there any effects for the three-way interaction of the factors. Examining top-selling products for all vending machines combined, pre- to postintervention, we found an overall shift to healthier purchasing. When healthier vending snacks are available, promotional signs are also important to ensure consumers purchase those items in greater amounts. Mitigating potential loss in profits is essential for sustainability of a healthier food environment. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
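A sketch of how a 2×2×2 factorial design like this can be analysed (synthetic machine-level data with invented variable names; the healthier-products × signs interaction reported above corresponds to the `healthier:signs` term):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# 56 machines to mirror the study; each factor is a 0/1 indicator of
# random assignment. The outcome and effect sizes are invented.
rng = np.random.default_rng(5)
df = pd.DataFrame({f: rng.integers(0, 2, 56)
                   for f in ("healthier", "price_cut", "signs")})
df["revenue_change"] = (5.0 * df["healthier"] * df["signs"]
                        + rng.normal(scale=2.0, size=56))

# '*' expands to all main effects and interactions of the three factors.
fit = smf.ols("revenue_change ~ healthier * price_cut * signs", df).fit()
print(fit.params.round(2))
```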
Minati, Ludovico; Nigri, Anna; Rosazza, Cristina; Bruzzone, Maria Grazia
2012-06-01
Previous studies have demonstrated the possibility of using functional MRI to control a robot arm through a brain-machine interface by directly coupling haemodynamic activity in the sensory-motor cortex to the position of two axes. Here, we extend this work by implementing interaction at a more abstract level, whereby imagined actions deliver structured commands to a robot arm guided by a machine vision system. Rather than extracting signals from a small number of pre-selected regions, the proposed system adaptively determines at the individual level how to map representative brain areas to the input nodes of a classifier network. In this initial study, a median action recognition accuracy of 90% was attained on five volunteers performing a game consisting of collecting randomly positioned coloured pawns and placing them into cups. The "pawn" and "cup" instructions were imparted through four mental imagery tasks, linked to robot arm actions by a state machine. With the current implementation in the MATLAB language, the median action recognition time was 24.3 s and the robot execution time was 17.7 s. We demonstrate the notion of combining haemodynamic brain-machine interfacing with computer vision to implement interaction at the level of high-level commands rather than individual movements, which may find application in future fMRI approaches relevant to brain-lesioned patients, and provide source code supporting further work on larger command sets and real-time processing. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
Position feedback control system
Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.
2003-01-01
Disclosed is a system and method for independently evaluating the spatial positional performance of a machine having a movable member, comprising an articulated coordinate measuring machine comprising: a first revolute joint; a probe arm, having a proximal end rigidly attached to the first joint, and having a distal end with a probe tip attached thereto, wherein the probe tip is pivotally mounted to the movable machine member; a second revolute joint; a first support arm serially connecting the first joint to the second joint; and coordinate processing means, operatively connected to the first and second revolute joints, for calculating the spatial coordinates of the probe tip; means for kinematically constraining the articulated coordinate measuring machine to a working surface; and comparator means, in operative association with the coordinate processing means and with the movable machine, for comparing the true position of the movable machine member, as measured by the true position of the probe tip, with the desired position of the movable machine member.
Proteome Analysis of Thyroid Cancer Cells After Long-Term Exposure to a Random Positioning Machine
NASA Astrophysics Data System (ADS)
Pietsch, Jessica; Bauer, Johann; Weber, Gerhard; Nissum, Mikkel; Westphal, Kriss; Egli, Marcel; Grosse, Jirka; Schönberger, Johann; Eilles, Christoph; Infanger, Manfred; Grimm, Daniela
2011-11-01
Annulling gravity during cell culturing triggers various types of cells to change their protein expression in a time-dependent manner. We therefore decided to determine gravity-sensitive proteins and their period of sensitivity to the effects of gravity. In this study, thyroid cancer cells of the ML-1 cell line were cultured under normal gravity (1 g) or in a random positioning machine (RPM), which simulated near weightlessness, for 7 and 11 days. Cells were then sonicated, and proteins released into the supernatant were separated from those that remained attached to the cell fragments. Subsequently, both types of proteins were fractionated by free-flow isoelectric focussing (FF-IEF). The fractions obtained were further separated by sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE), to which comparable FF-IEF fractions derived from cells cultured either under 1 g or on the RPM had been applied side by side. The separation resulted in pairs of lanes on which a number of identical bands were observed. Selected gel pieces were excised and their proteins determined by mass spectrometry. Identical proteins from cells cultured under normal gravity and on the RPM, respectively, were detected in comparable gel pieces. However, many of these proteins had received different Mascot scores. Quantifying heat shock cognate 71 kDa protein, glutathione S-transferase P, nucleoside diphosphate kinase A and annexin-2 by Western blotting using whole cell lysates indicated the usefulness of Mascot scores for selecting the most efficient antibodies.
NASA Astrophysics Data System (ADS)
Matía, Isabel; van Loon, Jack W. A.; Carnero-Díaz, Eugénie; Marco, Roberto; Medina, Francisco Javier
2009-01-01
The study of the modifications induced by altered gravity in the functions of plant cells is a valuable tool for addressing the survival of terrestrial organisms in conditions different from those of the Earth. We have used the system "cell proliferation-ribosome biogenesis", two inter-related essential cellular processes, for the purpose of studying these modifications. Arabidopsis seedlings belonging to a transformed line containing the reporter gene GUS, under the control of the promoter of the cyclin gene CYCB1, a cell cycle regulator, were grown in a Random Positioning Machine, a device known to accurately simulate microgravity. Samples were taken at 2, 4 and 8 days after germination and subjected to biometrical analysis and to cellular morphometrical, ultrastructural and immunocytochemical studies in order to determine the rates of cell proliferation and ribosome biogenesis, together with an estimation of the expression of the cyclin gene as an indication of the state of cell cycle regulation. Our results show that cells divide more in simulated microgravity in a Random Positioning Machine than under control gravity, but the cell cycle appears significantly altered as early as 2 days after germination. Furthermore, the higher proliferation is not accompanied by an increase in ribosome synthesis, as is the rule on Earth; instead, the functional markers of this process appear depleted in samples grown in simulated microgravity. Therefore, the alteration of the gravitational environmental conditions results in considerable stress for plant cells, including those not specialized in gravity perception.
Tewary, S; Arun, I; Ahmed, R; Chatterjee, S; Chakraborty, C
2017-11-01
In the prognostic evaluation of breast cancer, the immunohistochemical (IHC) markers oestrogen receptor (ER) and progesterone receptor (PR) are widely used. The expert pathologist qualitatively investigates the stained tissue slide under the microscope to provide the Allred score, which is used clinically for therapeutic decision making. Such qualitative judgment is time-consuming, tedious and often suffers from interobserver variability. As a result, it leads to imprecise IHC scores for ER and PR. To overcome this, there is an urgent need to develop a reliable and efficient IHC quantifier for high-throughput decision making. In view of this, our study aims at developing an automated IHC profiler for quantitative assessment of ER and PR molecular expression from stained tissue images. We propose to use the CMYK colour space for extracting positively and negatively stained cells for the proportion score. Colour features are also used for quantitative assessment of intensity scoring among the positively stained cells. Five different machine learning models, namely artificial neural network, Naïve Bayes, K-nearest neighbours, decision tree and random forest, are considered for learning the colour features, using the average red, green and blue pixel values of positively stained cell patches. Fifty cases of ER- and PR-stained tissues were evaluated for validation against the expert pathologist's score. All five models perform adequately, with random forest showing the best correlation with the expert's score (Pearson's correlation coefficient = 0.9192). In the proposed approach, the average variation of the diaminobenzidine (DAB)-to-nuclear area from the expert's score is found to be 7.58%, compared to 27.83% for the state-of-the-art ImmunoRatio software. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
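A hedged sketch of the CMYK step (the conversion formula is standard; the threshold, and the assumption that DAB brown loads strongly on the yellow channel, are illustrative choices, not the authors' calibrated values):

```python
import numpy as np

def rgb_to_cmyk(rgb):
    """rgb: float array in [0, 1], shape (..., 3) -> CMYK in [0, 1]."""
    k = 1.0 - rgb.max(axis=-1)
    denom = np.where(k < 1.0, 1.0 - k, 1.0)       # avoid division by zero
    c, m, y = [(1.0 - rgb[..., i] - k) / denom for i in range(3)]
    return np.stack([c, m, y, k], axis=-1)

# Stand-in image patch; in practice this would be a stained-tissue tile.
patch = np.random.default_rng(2).random((64, 64, 3))
cmyk = rgb_to_cmyk(patch)
positive_mask = cmyk[..., 2] > 0.4                # assumed Y-channel threshold
print("positive pixel fraction:", positive_mask.mean())
```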
Computational work and time on finite machines.
NASA Technical Reports Server (NTRS)
Savage, J. E.
1972-01-01
Measures of the computational work and computational delay required by machines to compute functions are given. Exchange inequalities are developed for random access, tape, and drum machines to show that product inequalities between storage and time, number of drum tracks and time, number of bits in an address and time, etc., must be satisfied to compute finite functions on bounded machines.
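As a purely illustrative rendering of the shape such exchange inequalities take (an assumption about their general form, not a quotation of the paper's theorems):

```latex
% Illustrative form only: with storage of S bits cycled T times, a
% bounded machine performs at most on the order of S*T units of
% combinational work, so a finite function f of computational work W(f)
% is computable only if
\[
  c \, S \, T \;\ge\; W(f),
\]
% and analogous products (drum tracks x time, address bits x time)
% must likewise dominate W(f).
```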
Method for measuring the contour of a machined part
Bieg, L.F.
1995-05-30
A method is disclosed for measuring the contour of a machined part with a contour gage apparatus, having a probe assembly including a probe tip for providing a measure of linear displacement of the tip on the surface of the part. The contour gage apparatus may be moved into and out of position for measuring the part while the part is still carried on the machining apparatus. Relative positions between the part and the probe tip may be changed, and a scanning operation is performed on the machined part by sweeping the part with the probe tip, whereby data points representing linear positions of the probe tip at prescribed rotation intervals in the position changes between the part and the probe tip are recorded. The method further allows real-time adjustment of the apparatus machining the part, including real-time adjustment of the machining apparatus in response to wear of the tool that occurs during machining. 5 figs.
Method for measuring the contour of a machined part
Bieg, Lothar F.
1995-05-30
A method for measuring the contour of a machined part with a contour gage apparatus, having a probe assembly including a probe tip for providing a measure of linear displacement of the tip on the surface of the part. The contour gage apparatus may be moved into and out of position for measuring the part while the part is still carried on the machining apparatus. Relative positions between the part and the probe tip may be changed, and a scanning operation is performed on the machined part by sweeping the part with the probe tip, whereby data points representing linear positions of the probe tip at prescribed rotation intervals in the position changes between the part and the probe tip are recorded. The method further allows real-time adjustment of the apparatus machining the part, including real-time adjustment of the machining apparatus in response to wear of the tool that occurs during machining.
Binny, Diana; Lancaster, Craig M; Trapp, Jamie V; Crowe, Scott B
2017-09-01
This study utilizes process control techniques to identify action limits for TomoTherapy couch positioning quality assurance tests. A test was introduced to monitor the accuracy of applied couch offset detection in the TomoTherapy Hi-Art treatment system using the TQA "Step-Wedge Helical" module and the MVCT detector. Individual X-charts, process capability (cp), probability (P), and acceptability (cpk) indices were used to monitor 4 years of couch IEC offset data to detect systematic and random errors in couch positional accuracy at different action levels. Process capability tests were also performed on the retrospective data to define tolerances based on user-specified levels. A second study was carried out whereby physical couch offsets were applied using the TQA module and the MVCT detector was used to detect the observed variations. Random and systematic variations were observed relative to the SPC-based upper and lower control limits, and investigations were carried out to maintain the ongoing stability of the process over a 4-year and a three-monthly period. Local trend analysis showed mean variations of up to ±0.5 mm in the three-monthly analysis period for all IEC offset measurements. Variations were also observed between detected and applied offsets using the MVCT detector in the second study, largely in the vertical direction, and actions were taken to remediate this error. Based on the results, it was recommended that imaging shifts in each coordinate direction be applied only after assessing the machine with the applied-versus-detected test results using the step helical module. User-specified tolerance levels of at least ±2 mm were recommended, at a test frequency of once every 3 months, to improve couch positional accuracy. SPC enables the detection of systematic variations before machine tolerance levels are reached. Couch encoding system recalibrations reduced variations to user-specified levels, and a three-month monitoring period using SPC facilitated the detection of systematic and random variations. SPC analysis of couch positional accuracy enabled greater control in the identification of errors, thereby increasing confidence levels in daily treatment setups. © 2017 Royal Brisbane and Women's Hospital, Metro North Hospital and Health Service. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
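A minimal sketch of the control-chart quantities involved (synthetic offsets; the ±2 mm tolerance is the user-specified level from the abstract; sigma is estimated from the sample, a simplification of the moving-range estimate usually used for individual X-charts):

```python
import numpy as np

# Synthetic couch-offset data: detected minus applied offset [mm].
rng = np.random.default_rng(11)
offsets = rng.normal(loc=0.1, scale=0.4, size=120)
usl, lsl = 2.0, -2.0                                 # user tolerance [mm]

mu, sigma = offsets.mean(), offsets.std(ddof=1)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma            # individual X-chart limits
cp = (usl - lsl) / (6 * sigma)                       # process capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)          # accounts for centering
print(f"X-chart limits: [{lcl:.2f}, {ucl:.2f}] mm, cp={cp:.2f}, cpk={cpk:.2f}")
```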
Do warning signs on electronic gaming machines influence irrational cognitions?
Monaghan, Sally; Blaszczynski, Alex; Nower, Lia
2009-08-01
Electronic gaming machines are popular among problem gamblers; in response, governments have introduced "responsible gaming" legislation incorporating the mandatory display of warning signs on or near electronic gaming machines. These signs are designed to correct irrational and erroneous beliefs through the provision of accurate information on probabilities of winning and the concept of randomness. There is minimal empirical data evaluating the effectiveness of such signs. In this study, 93 undergraduate students were randomly allocated to standard and informative messages displayed on an electronic gaming machine during play in a laboratory setting. Results revealed that a majority of participants incorrectly estimated gambling odds and reported irrational gambling-related cognitions prior to play. In addition, there were no significant between-group differences, and few participants recalled the content of messages or modified their gambling-related cognitions. Signs placed on electronic gaming machines may not modify irrational beliefs or alter gambling behaviour.
Beccaria, Marco; Mellors, Theodore R; Petion, Jacky S; Rees, Christiaan A; Nasir, Mavra; Systrom, Hannah K; Sairistil, Jean W; Jean-Juste, Marc-Antoine; Rivera, Vanessa; Lavoile, Kerline; Severe, Patrice; Pape, Jean W; Wright, Peter F; Hill, Jane E
2018-02-01
Tuberculosis (TB) remains a global public health malady that claims almost 1.8 million lives annually. Diagnosis represents perhaps one of the most challenging aspects of tuberculosis control. The gold standards for diagnosis of active TB (culture and nucleic acid amplification) are sputum-dependent; however, in up to a third of TB cases an adequate biological sputum sample is not readily available. The analysis of exhaled breath, as an alternative to sputum-dependent tests, has the potential to provide a simple, fast, non-invasive and readily available diagnostic service that could positively change TB detection. Human breath was evaluated in the setting of active tuberculosis using thermal desorption-comprehensive two-dimensional gas chromatography-time-of-flight mass spectrometry. From the entire spectrum of volatile metabolites in breath, three random forest machine learning models were applied, leading to the generation of a panel of 46 breath features. The twenty-two features common to all three random forest models were selected as a set that could distinguish subjects with confirmed pulmonary M. tuberculosis infection from people with pathologies other than TB. Copyright © 2018 Elsevier B.V. All rights reserved.
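A hedged sketch of the common-feature selection step (synthetic data; the panel size of 46 comes from the abstract, everything else is invented):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Stand-in breath data: many candidate volatile features, few subjects.
X, y = make_classification(n_samples=120, n_features=300, n_informative=15,
                           random_state=9)
rng = np.random.default_rng(9)

top_sets = []
for seed in range(3):                                  # three RF models
    idx = rng.integers(0, len(y), len(y))              # bootstrap resample
    rf = RandomForestClassifier(n_estimators=500, random_state=seed)
    rf.fit(X[idx], y[idx])
    # keep each model's 46 highest-importance features
    top_sets.append(set(np.argsort(rf.feature_importances_)[-46:]))

common = set.intersection(*top_sets)                   # shared breath features
print(f"{len(common)} features common to all three models")
```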
LTE-advanced random access mechanism for M2M communication: A review
NASA Astrophysics Data System (ADS)
Mustafa, Rashid; Sarowa, Sandeep; Jaglan, Reena Rathee; Khan, Mohammad Junaid; Agrawal, Sunil
2016-03-01
Machine Type Communications (MTC) enables one or more self-sufficient machines to communicate directly with one another without human interference. MTC applications include the smart grid, security, e-Health and intelligent automation systems. To support huge numbers of MTC devices, one of the challenging issues is to provide an efficient way of handling massive simultaneous access to the network and to minimize network overload. In this article, the different control mechanisms for random access overload are reviewed to avoid the congestion caused by the random access channel (RACH) of MTC devices. Past and present wireless technologies have been engineered for Human-to-Human (H2H) communications, in particular for the transmission of voice. Long Term Evolution (LTE)-Advanced, though optimized for H2H communications, is expected to play a central role in Machine-to-Machine (M2M) communication. The distinct and unique characteristics of M2M communications create challenges different from those of H2H communications. In this article, we investigate the impact of massive numbers of M2M terminals attempting random access to LTE-Advanced all at once. We discuss and review the solutions proposed by the Third Generation Partnership Project (3GPP) to alleviate the overload problem. We then evaluate and compare these solutions, which can effectively relieve congestion on the random access channel for M2M communications without affecting H2H communications.
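As a toy illustration of one of the reviewed 3GPP overload-control schemes, the sketch below simulates contention on a slotted RACH with and without Access Class Barring (ACB), under which each device may attempt access in a slot only with probability p. The device count and preamble pool size are assumptions for illustration, not values from the article.

# Illustrative sketch (not from the article): per-slot RACH success rate
# with and without Access Class Barring.
import random

def rach_success(n_devices, n_preambles=54, acb_factor=1.0, seed=0):
    """Fraction of devices whose randomly chosen preamble collides with no other."""
    rng = random.Random(seed)
    active = [d for d in range(n_devices) if rng.random() < acb_factor]
    choices = [rng.randrange(n_preambles) for _ in active]
    counts = {}
    for c in choices:
        counts[c] = counts.get(c, 0) + 1
    successes = sum(1 for c in choices if counts[c] == 1)
    return successes / n_devices if n_devices else 0.0

print("no barring:", rach_success(500))                   # heavy contention
print("ACB p=0.1 :", rach_success(500, acb_factor=0.1))   # fewer collisions per slot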
L.R. Iverson; A.M. Prasad; A. Liaw
2004-01-01
More and better machine learning tools are becoming available for landscape ecologists to aid in understanding species-environment relationships and to map probable species occurrence now and potentially into the future. To that end, we evaluated three statistical models: Regression Tree Analysis (RTA), Bagging Trees (BT) and Random Forest (RF) for their utility in...
Calibrating random forests for probability estimation.
Dankowski, Theresa; Ziegler, Andreas
2016-09-30
Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
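A minimal sketch of the general idea, under simplifying assumptions: the authors' second approach translates the forest's terminal nodes into logistic models, whereas the version below simply refits one logistic regression on the forest's predicted log-odds using data from the new center. All data and names are synthetic placeholders.

# Simplified logistic re-calibration of random-forest probabilities.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_old, y_old = rng.normal(size=(500, 5)), rng.integers(0, 2, 500)  # source center
X_new, y_new = rng.normal(size=(200, 5)), rng.integers(0, 2, 200)  # target center

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_old, y_old)

p = np.clip(rf.predict_proba(X_new)[:, 1], 1e-6, 1 - 1e-6)
logit = np.log(p / (1 - p)).reshape(-1, 1)     # forest log-odds as sole predictor

recal = LogisticRegression().fit(logit, y_new) # intercept/slope update
p_updated = recal.predict_proba(logit)[:, 1]   # re-calibrated probabilities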
Deep Learning Accurately Predicts Estrogen Receptor Status in Breast Cancer Metabolomics Data.
Alakwaa, Fadhl M; Chaudhary, Kumardeep; Garmire, Lana X
2018-01-05
Metabolomics holds promise as a new technology for diagnosing highly heterogeneous diseases. Conventionally, metabolomics data analysis for diagnosis is done using various statistical and machine learning based classification methods. However, it remains unknown whether deep neural networks, an increasingly popular class of machine learning methods, are suitable for classifying metabolomics data. Here we use a cohort of 271 breast cancer tissues, 204 estrogen receptor positive (ER+) and 67 estrogen receptor negative (ER-), to test the accuracy of feed-forward networks, a deep learning (DL) framework, as well as six widely used machine learning models, namely random forest (RF), support vector machines (SVM), recursive partitioning and regression trees (RPART), linear discriminant analysis (LDA), prediction analysis for microarrays (PAM), and generalized boosted models (GBM). The DL framework had the highest area under the curve (AUC), 0.93, in classifying ER+/ER- patients, compared with the other six machine learning algorithms. Furthermore, the biological interpretation of the first hidden layer reveals eight commonly enriched significant metabolomics pathways (adjusted P-value <0.05) that cannot be discovered by the other machine learning methods. Among them, the protein digestion and absorption and ATP-binding cassette (ABC) transporters pathways are also confirmed in integrated analysis between metabolomics and gene expression data in these samples. In summary, the deep learning method shows advantages for metabolomics-based breast cancer ER status classification, with both the highest prediction accuracy (AUC = 0.93) and better revelation of disease biology. We encourage the adoption of feed-forward network based deep learning methods in the metabolomics research community for classification.
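The comparison can be sketched as follows with synthetic stand-ins for the metabolomics matrix; the layer sizes, solver settings, and feature count are illustrative assumptions, not the paper's configuration.

# Hedged sketch: feed-forward network vs. random forest, scored by AUC.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.normal(size=(271, 162))                     # 271 samples x metabolite features
y = np.r_[np.ones(204), np.zeros(67)].astype(int)   # 204 ER+, 67 ER- as in the study

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

dl = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=1).fit(X_tr, y_tr)
rf = RandomForestClassifier(n_estimators=500, random_state=1).fit(X_tr, y_tr)

for name, model in [("DL", dl), ("RF", rf)]:
    print(name, roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))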
Identifying Wrist Fracture Patients with High Accuracy by Automatic Categorization of X-ray Reports
de Bruijn, Berry; Cranney, Ann; O’Donnell, Siobhan; Martin, Joel D.; Forster, Alan J.
2006-01-01
The authors performed this study to determine the accuracy of several text classification methods for categorizing wrist x-ray reports. We randomly sampled 751 textual wrist x-ray reports. Two expert reviewers rated the presence (n = 301) or absence (n = 450) of an acute fracture of the wrist. We developed two information retrieval (IR) text classification methods and a machine learning method using a support vector machine (TC-1). In cross-validation on the derivation set (n = 493), TC-1 outperformed the two IR-based methods and six benchmark classifiers, including Naive Bayes and a Neural Network. In the validation set (n = 258), TC-1 demonstrated consistent performance with 93.8% accuracy; 95.5% sensitivity; 92.9% specificity; and 87.5% positive predictive value. TC-1 was easy to implement and superior in performance to the other classification methods. PMID:16929046
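The abstract does not detail TC-1's internal pipeline, so the following is only a generic sketch of an SVM text classifier for radiology reports: bag-of-words TF-IDF features feeding a linear support vector machine, with two toy reports standing in for the 751 originals.

# Generic SVM report classifier sketch; not the study's TC-1 implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

reports = [
    "transverse fracture of the distal radius with dorsal angulation",
    "no acute fracture or dislocation identified",
]                        # toy stand-ins for the wrist x-ray reports
labels = [1, 0]          # 1 = acute wrist fracture present

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(reports, labels)
print(clf.predict(["comminuted fracture of the wrist"]))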
NASA Astrophysics Data System (ADS)
Xu, Chong; Dai, Fuchu; Xu, Xiwei; Lee, Yuan Hsi
2012-04-01
Support vector machine (SVM) modeling is based on statistical learning theory. It involves a training phase with associated input and target output values. In recent years, the method has become increasingly popular. The main purpose of this study is to evaluate the mapping power of SVM modeling in earthquake-triggered landslide-susceptibility mapping for a section of the Jianjiang River watershed using Geographic Information System (GIS) software. The river was affected by the Wenchuan earthquake of May 12, 2008. Visual interpretation of colored aerial photographs of 1-m resolution and extensive field surveys provided a detailed landslide inventory map containing 3147 landslides related to the 2008 Wenchuan earthquake. Elevation, slope angle, slope aspect, distance from seismogenic faults, distance from drainages, and lithology were used as the controlling parameters. For modeling, three groups of positive and negative training samples were used in concert with four different kernel functions. Positive training samples included the centroids of 500 large landslides, the centroids of all 3147 landslides, and 5000 randomly selected points in landslide polygons. Negative training samples included 500, 3147, and 5000 randomly selected points on slopes that remained stable during the Wenchuan earthquake. The four kernel functions were linear, polynomial, radial basis, and sigmoid. In total, 12 cases of landslide susceptibility were mapped. Comparative analyses of landslide-susceptibility probability and area relation curves show that both the polynomial and radial basis functions suitably classified the input data as either landslide positive or negative, though the radial basis function was more successful. The 12 generated landslide-susceptibility maps were compared with known landslide centroid locations and landslide polygons to verify the success rate and predictive accuracy of each model. The 12 results were further validated using area-under-curve analysis. Group 3, with 5000 randomly selected points in the landslide polygons and 5000 randomly selected points along stable slopes, gave the best results, with a success rate of 79.20% and predictive accuracy of 79.13% under the radial basis function. Of all the results, the sigmoid kernel function was the least skillful when used in concert with the centroid data of all 3147 landslides as positive training samples and 3147 randomly selected points in regions of stable slope as negative training samples (success rate = 54.95%; predictive accuracy = 61.85%). This paper also provides suggestions and reference data for selecting appropriate training samples and kernel function types for earthquake-triggered landslide-susceptibility mapping using SVM modeling. Predictive landslide-susceptibility maps could be useful in hazard mitigation by helping planners understand the probability of landslides in different regions.
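A hedged sketch of the Group-3-style setup follows: an RBF-kernel SVM trained on balanced samples of landslide and stable-slope points described by the six controlling parameters. All feature values are synthetic placeholders, and the sample counts are reduced from the study's 5000 per class for brevity.

# Illustrative RBF-SVM susceptibility sketch on synthetic points.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
# columns: elevation, slope angle, aspect, dist. to fault, dist. to drainage, lithology code
X_pos = rng.normal(loc=1.0, size=(1000, 6))   # points inside landslide polygons
X_neg = rng.normal(loc=-1.0, size=(1000, 6))  # points on slopes that stayed stable
X = np.vstack([X_pos, X_neg])
y = np.r_[np.ones(1000), np.zeros(1000)]

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
svm.fit(X, y)
susceptibility = svm.predict_proba(X[:5])[:, 1]   # mapped as a probability per cell
print(susceptibility)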
Brown, Raymond J.
1977-01-01
The present invention relates to a tool setting device for use with numerically controlled machine tools, such as lathes and milling machines. A reference position of the machine tool relative to the workpiece along both the X and Y axes is utilized by the control circuit for driving the tool through its program. This reference position is determined for both axes by displacing a single linear variable displacement transducer (LVDT) with the machine tool through a T-shaped pivotal bar. The use of the T-shaped bar allows the cutting tool to be moved sequentially in the X or Y direction for indicating the actual position of the machine tool relative to the predetermined desired position in the numerical control circuit by using a single LVDT.
Mikhchi, Abbas; Honarvar, Mahmood; Kashan, Nasser Emam Jomeh; Aminafshar, Mehdi
2016-06-21
Genotype imputation is an important tool for the prediction of unknown genotypes for both unrelated individuals and parent-offspring trios. Several imputation methods are available and can either employ universal machine learning methods or deploy algorithms dedicated to inferring missing genotypes. In this research, the performance of eight machine learning methods (Support Vector Machine, K-Nearest Neighbors, Extreme Learning Machine, Radial Basis Function, Random Forest, AdaBoost, LogitBoost, and TotalBoost) was compared in terms of imputation accuracy, computation time, and the factors affecting imputation accuracy. The methods were applied to real and simulated datasets to impute the un-typed SNPs in parent-offspring trios. The tests showed that imputation of parent-offspring trios can be accurate. Random Forest and Support Vector Machine were more accurate than the other machine learning methods, while TotalBoost performed slightly worse than the others. The running times differed between methods: ELM was always the fastest algorithm, whereas RBF required long imputation times as the sample size increased. The tested methods can be an alternative for imputation of un-typed SNPs when the rate of missing data is low. However, it is recommended that other machine learning methods also be evaluated for imputation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Enhanced networked server management with random remote backups
NASA Astrophysics Data System (ADS)
Kim, Song-Kyoo
2003-08-01
In this paper, the model is focused on available server management in network environments. The (remote) backup servers are hooked up by a VPN (Virtual Private Network) and replace broken main servers immediately. A virtual private network (VPN) is a way to use a public network infrastructure to connect long-distance servers within a single network infrastructure. The servers can be represented as "machines"; the system then deals with unreliable main machines and randomly available auxiliary spare (remote backup) machines. When the system performs mandatory routine maintenance, the auxiliary machines are used for backups during idle periods. Unlike other existing models, the availability of the auxiliary machines changes with each activation in this enhanced model. Analytically tractable results are obtained using several mathematical techniques, and the results are demonstrated in the framework of optimized networked server allocation problems.
Method and apparatus for characterizing and enhancing the dynamic performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F
2013-12-17
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include dynamic one axis positional accuracy of the machine tool, dynamic cross-axis stability of the machine tool, and dynamic multi-axis positional accuracy of the machine tool.
Support vector machine learning-based fMRI data group analysis.
Wang, Ze; Childress, Anna R; Wang, Jiongjiong; Detre, John A
2007-07-15
To explore the multivariate nature of fMRI data and to consider the inter-subject brain response discrepancies, a multivariate and brain response model-free method is fundamentally required. Two such methods are presented in this paper by integrating a machine learning algorithm, the support vector machine (SVM), and the random effect model. Without any brain response modeling, SVM was used to extract a whole brain spatial discriminance map (SDM), representing the brain response difference between the contrasted experimental conditions. Population inference was then obtained through the random effect analysis (RFX) or permutation testing (PMU) on the individual subjects' SDMs. Applied to arterial spin labeling (ASL) perfusion fMRI data, SDM RFX yielded lower false-positive rates in the null hypothesis test and higher detection sensitivity for synthetic activations with varying cluster size and activation strengths, compared to the univariate general linear model (GLM)-based RFX. For a sensory-motor ASL fMRI study, both SDM RFX and SDM PMU yielded similar activation patterns to GLM RFX and GLM PMU, respectively, but with higher t values and cluster extensions at the same significance level. Capitalizing on the absence of temporal noise correlation in ASL data, this study also incorporated PMU in the individual-level GLM and SVM analyses accompanied by group-level analysis through RFX or group-level PMU. Providing inferences on the probability of being activated or deactivated at each voxel, these individual-level PMU-based group analysis methods can be used to threshold the analysis results of GLM RFX, SDM RFX or SDM PMU.
Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger
2017-01-01
Background: The assessment of health benefits associated with physical activity depends on the activity duration, intensity and frequency; therefore, their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study were to develop an algorithm for automatic identification of intended jogging periods, and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared with using only one at either position. Methods: The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: Logistic Regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. Results: The reported jogging periods were verified by visual inspection and used as the gold standard. After feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods relative to the total time including the missed ones, was up to 0.875. It could be further improved, up to 0.967, by applying post-classification rules that considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be achieved from either accelerometer position. Conclusions: Machine learning techniques can be used for automatic activity recognition, as they provide very accurate activity recognition, significantly more accurate than keeping a diary. Identification of jogging periods in adolescents can be performed using only one accelerometer. Performance-wise there is no significant benefit from using accelerometers at both locations. PMID:28880923
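The shape of such a pipeline can be sketched as below, with an assumed sampling rate and simple summary features in place of the study's automated feature engineering; the 60 s windows follow the segmentation strategy mentioned above, and all data are synthetic.

# Minimal sliding-window activity-recognition sketch (assumptions noted above).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 50                       # assumed samples per second
WIN = 60 * FS                 # one 60 s window

def window_features(signal, labels):
    """Summary features and majority label per non-overlapping window."""
    feats, ys = [], []
    for start in range(0, len(signal) - WIN + 1, WIN):
        seg = signal[start:start + WIN]
        feats.append([seg.mean(), seg.std(), np.percentile(seg, 90)])
        ys.append(int(labels[start:start + WIN].mean() > 0.5))
    return np.array(feats), np.array(ys)

rng = np.random.default_rng(3)
signal = rng.normal(size=FS * 3600)          # one hour of synthetic magnitude data
signal[FS * 600:FS * 1800] *= 3.0            # jogging has larger movement amplitude
labels = np.zeros(FS * 3600, dtype=int)
labels[FS * 600:FS * 1800] = 1               # a simulated 20-minute jogging bout

X, y = window_features(signal, labels)
clf = RandomForestClassifier(n_estimators=200, random_state=3).fit(X, y)
print("windows:", len(y), "jogging windows:", int(y.sum()))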
NASA Astrophysics Data System (ADS)
Aleshcheva, Ganna; Hauslage, Jens; Hemmersbach, Ruth; Infanger, Manfred; Bauer, Johann; Grimm, Daniela; Sahana, Jayashree
Chondrocytes are the only cell type found in human cartilage, which consists of proteoglycans and type II collagen. Several studies on chondrocytes cultured either in space or on ground-based facilities for the simulation of microgravity revealed that these cells are very resistant to adverse effects and stress induced by altered gravity. Tissue engineering of chondrocytes is a new strategy for cartilage regeneration. Using a three-dimensional Random Positioning Machine and a 2D rotating clinostat, devices designed to simulate microgravity on Earth, we investigated the early effects of microgravity exposure on human chondrocytes of six different donors after 30 min, 2 h, 4 h, 16 h, and 24 h and compared the results with the corresponding static controls cultured under normal gravity conditions. As little as 30 min of exposure resulted in increased expression of several genes responsible for cell motility, structure and integrity (beta-actin); control of cell growth, cell proliferation, cell differentiation and apoptosis; and cytoskeletal components such as microtubules (beta-tubulin) and intermediate filaments (vimentin). After 4 h, disruptions in the vimentin network were detected. These changes were less dramatic after 16 h, when human chondrocytes appeared to reorganize their cytoskeleton. However, the gene expression and protein content of TGF-β1 remained enhanced for 24 h. Based on these results, we suggest that chondrocytes exposed to simulated microgravity change their extracellular matrix production behavior while rearranging their cytoskeletal proteins prior to forming three-dimensional aggregates.
Wang, Pin-Chieh; Ritz, Beate R; Janowitz, Ira; Harrison, Robert J; Yu, Fei; Chan, Jacqueline; Rempel, David M
2008-03-01
To determine whether an adjustable chair with a curved or a flat seat pan improved monthly back and hip pain scores in sewing machine operators. This 4-month intervention study randomized 293 sewing machine operators with back and hip pain. The participants in the control group received a placebo intervention, and participants in the intervention groups received the placebo intervention and one of the two intervention chairs. Compared with the control group, mean pain improvement for the flat chair intervention was 0.43 points (95% CI = 0.34, 0.51) per month, and mean pain improvement for the curved chair intervention was 0.25 points (95% CI = 0.16, 0.34) per month. A height-adjustable task chair with a swivel function can reduce back and hip pain in sewing machine operators. The findings may be relevant to workers who perform visual- and hand-intensive manufacturing jobs.
Lima, Fabiano F.; Camillo, Carlos A.; Gobbo, Luis A.; Trevisan, Iara B.; Nascimento, Wesley B. B. M.; Silva, Bruna S. A.; Lima, Manoel C. S.; Ramos, Dionei; Ramos, Ercy M. C.
2018-01-01
The objectives of the study were to compare the effects of resistance training using either low-cost, portable elastic tubing or conventional weight machines on muscle force, functional exercise capacity, and health-related quality of life (HRQOL) in middle-aged to older healthy adults. In this clinical trial, twenty-nine middle-aged to older healthy adults were randomly assigned to one of three a priori defined groups: resistance training with elastic tubing (ETG; n = 10), conventional resistance training with weight machines (CTG; n = 9), and a control group (CG; n = 10). Both ETG and CTG followed a 12-week resistance training program (3x/week - upper and lower limbs). Muscle force, functional exercise capacity and HRQOL were evaluated at baseline, 6 and 12 weeks. The CG underwent the three evaluations with no formal intervention or activity counseling provided. ETG and CTG showed similar and significant increases in muscle force (Δ16-44% in ETG and Δ25-46% in CTG, p < 0.05 for both) and functional exercise capacity (ETG Δ4 ± 4% and CTG Δ6 ± 8%; p < 0.05 for both). Improvement in the "pain" domain of HRQOL was observed only in the CTG (Δ21 ± 26%, p = 0.037). The CG showed no statistical improvement in any of the variables investigated. Resistance training using elastic tubing (a low-cost and portable tool) and conventional resistance training using weight machines promoted similar positive effects on peripheral muscle force and functional exercise capacity in middle-aged to older healthy adults. Key points: There is compelling evidence linking resistance training to health. Elastic resistance training improves the functionality of middle-aged to older healthy adults. Elastic resistance training was shown to be as effective as conventional resistance training in middle-aged to older healthy adults. PMID:29535589
Positional reference system for ultraprecision machining
Arnold, Jones B.; Burleson, Robert R.; Pardue, Robert M.
1982-01-01
A stable positional reference system for use in improving the cutting tool-to-part contour position in numerical controlled-multiaxis metal turning machines is provided. The reference system employs a plurality of interferometers referenced to orthogonally disposed metering bars which are substantially isolated from machine strain induced position errors for monitoring the part and tool positions relative to the metering bars. A microprocessor-based control system is employed in conjunction with the plurality of position interferometers and part contour description data inputs to calculate error components for each axis of movement and output them to corresponding axis drives with appropriate scaling and error compensation. Real-time position control, operating in combination with the reference system, makes possible the positioning of the cutting points of a tool along a part locus with a substantially greater degree of accuracy than has been attained previously in the art by referencing and then monitoring only the tool motion relative to a reference position located on the machine base.
Positional reference system for ultraprecision machining
Arnold, J.B.; Burleson, R.R.; Pardue, R.M.
1980-09-12
A stable positional reference system for use in improving the cutting tool-to-part contour position in numerical controlled-multiaxis metal turning machines is provided. The reference system employs a plurality of interferometers referenced to orthogonally disposed metering bars which are substantially isolated from machine strain induced position errors for monitoring the part and tool positions relative to the metering bars. A microprocessor-based control system is employed in conjunction with the plurality of position interferometers and part contour description data inputs to calculate error components for each axis of movement and output them to corresponding axis drives with appropriate scaling and error compensation. Real-time position control, operating in combination with the reference system, makes possible the positioning of the cutting points of a tool along a part locus with a substantially greater degree of accuracy than has been attained previously in the art by referencing and then monitoring only the tool motion relative to a reference position located on the machine base.
Machine learning study for the prediction of transdermal peptide
NASA Astrophysics Data System (ADS)
Jung, Eunkyoung; Choi, Seung-Hoon; Lee, Nam Kyung; Kang, Sang-Kee; Choi, Yun-Jaie; Shin, Jae-Min; Choi, Kihang; Jung, Dong Hyun
2011-04-01
In order to develop a computational method to rapidly evaluate transdermal peptides, we report approaches for predicting the transdermal activity of peptides on the basis of peptide sequence information using an Artificial Neural Network (ANN), Partial Least Squares (PLS) and a Support Vector Machine (SVM). We identified 269 transdermal peptides by the phage display technique and used them as the positive controls to develop and test the machine learning models. Combinations of three descriptor sets with different neural network architectures, numbers of latent variables and kernel functions were tried in training to obtain appropriate predictions. The capacity of the models was evaluated by means of statistical indicators including sensitivity, specificity, and the area under the receiver operating characteristic curve (ROC score). In the ROC score-based comparison, all three methods proved capable of providing a reasonable prediction of transdermal peptides. The best result was obtained by the SVM model with a radial basis function and VHSE descriptors. The results indicate that it is possible to discriminate between transdermal peptides and random sequences using our models. We anticipate that our models will be applicable to the prediction of transdermal peptides in large peptide databases, facilitating efficient transdermal drug delivery through intact skin.
Yao, Shi; Guo, Yan; Dong, Shan-Shan; Hao, Ruo-Han; Chen, Xiao-Feng; Chen, Yi-Xiao; Chen, Jia-Bin; Tian, Qing; Deng, Hong-Wen; Yang, Tie-Lin
2017-08-01
Although genome-wide association studies (GWASs) have identified many susceptibility genes for osteoporosis, a large part of the missing heritability remains to be discovered. Integrating regulatory information and GWASs could offer new insights into the biological link between the susceptibility SNPs and osteoporosis. We generated five machine learning classifiers with osteoporosis-associated variants and regulatory feature data. We selected the optimal classifier and applied it to genome-wide SNPs to discover susceptibility regulatory variants. We further utilized the Genetic Factors for Osteoporosis Consortium (GEFOS) and three in-house GWAS samples to validate the associations for predicted positive SNPs. The random forest classifier performed best among all machine learning methods, with an F1 score of 0.8871. Using the optimized model, we predicted 37,584 candidate SNPs for osteoporosis. According to the meta-analysis results, a list of regulatory variants was significantly associated with osteoporosis after multiple-testing correction and contributed to the expression of known osteoporosis-associated protein-coding genes. In summary, combining GWASs and regulatory elements through machine learning could provide additional information for understanding the mechanism of osteoporosis. The regulatory variants we predicted will provide novel targets for etiology research and treatment of osteoporosis.
SU-E-J-191: Motion Prediction Using Extreme Learning Machine in Image Guided Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jia, J; Cao, R; Pei, X
Purpose: Real-time motion tracking is a critical issue in image guided radiotherapy due to the time latency caused by image processing and system response. It is therefore of great necessity to predict quickly and accurately the future position of the respiratory motion and the tumor location. Methods: The prediction of respiratory position was based on the positioning and tracking module in the ARTS-IGRT system, which was developed by FDS Team (www.fds.org.cn). An approach involving the extreme learning machine (ELM) was adopted to predict the future respiratory position as well as the tumor's location by training on the past trajectories. For the training process, a feed-forward neural network with a single hidden layer was used for the learning. First, the number of hidden nodes was determined for the single layered feed forward network (SLFN). Then the input weights and hidden layer biases of the SLFN were randomly assigned to calculate the hidden neuron output matrix. Finally, the predicted movements were obtained by applying the output weights and compared with the actual movements. Breathing movement acquired from external infrared markers was used to test the prediction accuracy, and implanted marker movement for prostate cancer was used to test the implementation of the tumor motion prediction. Results: The accuracy of the predicted motion against the actual motion was tested. Five volunteers with different breathing patterns were tested. The average prediction time was 0.281 s, and the standard deviation of prediction accuracy was 0.002 for the respiratory motion and 0.001 for the tumor motion. Conclusion: The extreme learning machine method can provide an accurate and fast prediction of the respiratory motion and the tumor location and therefore can meet the requirements of real-time tumor tracking in image guided radiotherapy.
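The ELM steps outlined above (random input weights and hidden biases, a hidden neuron output matrix, output weights solved by least squares) can be sketched as follows on a synthetic breathing trace; the window length and hidden-layer size are assumptions, not the study's settings.

# Minimal extreme learning machine for respiratory position prediction.
import numpy as np

def elm_fit(X, Y, n_hidden=40, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden-layer biases
    H = np.tanh(X @ W + b)                        # hidden neuron output matrix
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # output weights via least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Predict the next respiratory position from the 10 previous samples.
t = np.linspace(0, 60, 1500)
trace = np.sin(2 * np.pi * t / 4)                 # ~4 s synthetic breathing cycle
X = np.stack([trace[i:i + 10] for i in range(len(trace) - 11)])
Y = trace[10:-1]

W, b, beta = elm_fit(X[:1000], Y[:1000])
pred = elm_predict(X[1000:], W, b, beta)
print("RMS error:", np.sqrt(np.mean((pred - Y[1000:]) ** 2)))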
Daniel Bowker; Jeff Stringer; Chris Barton; Songlin Fei
2011-01-01
Sediment mobilized by forest harvest machine traffic contributes substantially to the degradation of headwater stream systems. This study monitored forest harvest machine traffic to analyze how it affects sediment delivery to stream channels. Harvest machines were outfitted with global positioning system (GPS) dataloggers, recording machine movements and working status...
Experimental Investigation – Magnetic Assisted Electro Discharge Machining
NASA Astrophysics Data System (ADS)
Kesava Reddy, Chirra; Manzoor Hussain, M.; Satyanarayana, S.; Krishna, M. V. S. Murali
2018-04-01
Emerging technologies need advanced machined parts with high strength, temperature resistance and fatigue life, produced at low cost and with good surface quality, to fit various industrial applications. The electro discharge machine is one of the most extensively used machines for manufacturing, with high precision and accuracy, advanced parts that cannot be machined by traditional machines. The machining of DIN 17350-1.2080 (high carbon, high chromium steel) using electro discharge machining is discussed in this paper. In the present investigation, an effort is made to place a permanent magnet at various positions near the spark zone to improve the surface quality of the machined surface. Taguchi methodology is used to obtain the optimal choice for each machining parameter, such as peak current, pulse duration, gap voltage and servo reference voltage. The process parameters have a significant influence on the machining characteristics and surface finish. Improvement in surface finish is observed when the process parameters are set at the optimum condition under the influence of the magnetic field at various positions.
Zhang, Xiaodong; Zeng, Zhen; Liu, Xianlei; Fang, Fengzhou
2015-09-21
Freeform surfaces are promising for next-generation optics; however, they need high form accuracy for excellent performance. A closed loop of fabrication-measurement-compensation is necessary for improving the form accuracy. It is difficult to perform off-machine measurement during freeform machining because remounting inaccuracy can result in significant form deviations. On the other hand, on-machine measurement may hide the systematic errors of the machine because the measuring device is placed in situ on the machine. This study proposes a new compensation strategy based on the combination of on-machine and off-machine measurement. The freeform surface is measured off-machine with nanometric accuracy, and the on-machine probe obtains the accurate relative position between the workpiece and the machine after remounting. The compensation cutting path is generated according to the calculated relative position and shape errors, avoiding extra manual adjustment or a highly accurate reference-feature fixture. Experimental results verified the effectiveness of the proposed method.
A Technique for Machine-Aided Indexing
ERIC Educational Resources Information Center
Klingbiel, Paul H.
1973-01-01
The technique for machine-aided indexing developed at the Defense Documentation Center (DDC) is illustrated on a randomly chosen abstract. Additional text is provided in coded form so that the reader can more fully explore this technique. (2 references) (Author)
Operating System For Numerically Controlled Milling Machine
NASA Technical Reports Server (NTRS)
Ray, R. B.
1992-01-01
OPMILL program is operating system for Kearney and Trecker milling machine providing fast easy way to program manufacture of machine parts with IBM-compatible personal computer. Gives machinist "equation plotter" feature, which plots equations that define movements and converts equations to milling-machine-controlling program moving cutter along defined path. System includes tool-manager software handling up to 25 tools and automatically adjusts to account for each tool. Developed on IBM PS/2 computer running DOS 3.3 with 1 MB of random-access memory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ken L. Stratton
The objective of this project is to investigate the applicability of a combined Global Positioning System and Inertial Measurement Unit (GPS/IMU) for information-based displays on earthmoving machines and for automated earthmoving machines in the future. This technology has the potential of allowing an information-based product like Caterpillar's Computer Aided Earthmoving System (CAES) to operate in areas with satellite shading. Satellite shading is an issue in open pit mining because machines are routinely required to operate close to high walls, which significantly reduces the amount of sky visible to the GPS antenna mounted on the machine. An inertial measurement unit is a product which provides data for the calculation of position based on sensing accelerations and rotation rates of the machine's rigid body. When this information is coupled with GPS, it results in a positioning system that can maintain positioning capability during periods of shading.
Aoun, Bachir
2016-05-05
A new Reverse Monte Carlo (RMC) package "fullrmc" for atomic or rigid body and molecular, amorphous, or crystalline materials is presented. fullrmc's main purpose is to provide a fully modular, fast and flexible software, thoroughly documented, complex-molecule enabled, written in a modern programming language (python, cython, C and C++ when performance is needed) and complying with modern programming practices. fullrmc's approach to solving an atomic or molecular structure is different from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves endorsed with reinforcement machine learning to groups of atoms. While fullrmc allows running traditional RMC modeling, the uniqueness of this approach resides in its ability to customize the grouping of atoms in any convenient way with no additional programming effort and to apply smart and more physically meaningful moves to the defined groups of atoms. In addition, fullrmc provides a unique way, with almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring, through and beyond disallowed positions and energy barriers, the unrestricted three-dimensional space around a group. © 2016 Wiley Periodicals, Inc.
Aoun, Bachir
2016-01-22
Here, a new Reverse Monte Carlo (RMC) package "fullrmc" for atomic or rigid body and molecular, amorphous or crystalline materials is presented. fullrmc's main purpose is to provide a fully modular, fast and flexible software, thoroughly documented, complex-molecule enabled, written in a modern programming language (python, cython, C and C++ when performance is needed) and complying with modern programming practices. fullrmc's approach to solving an atomic or molecular structure is different from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves endorsed with reinforcement machine learning to groups of atoms. While fullrmc allows running traditional RMC modelling, the uniqueness of this approach resides in its ability to customize the grouping of atoms in any convenient way with no additional programming effort and to apply smart and more physically meaningful moves to the defined groups of atoms. Also, fullrmc provides a unique way, with almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring, through and beyond disallowed positions and energy barriers, the unrestricted three-dimensional space around a group.
Walking Machine Control Programming
1983-08-31
configuration is useful for two reasons: first, the machine won't fit through the garage door unless it is in the tuck position, and second, a principal way ... machine out of its garage. We call the garage a "laboratory" even though the shorter term is more apt. We regularly run the machine in the parking ... comes down from a high push-up. The natural position for the feet as the machine comes out of the garage is the "tuck" in which each knee is bent in as ...
Effect of conditions of three dimensional clinostating on testicular cell machinery
NASA Astrophysics Data System (ADS)
Uva, Bianca Maria; Strollo, Felice; Ricci, Franco; Pastorino, Martina; Mason, Ian J.; Angela Masini, Maria
2007-02-01
Our aim was to study the effects of on-ground random positioning machine rotation on swine testicular cells in culture. Cells of the 2n karyotype line, from trypsinized swine testes, were submitted to modeled microgravity (μG) using a 3D RPM for 15 min to 24 h. The cultured cells were then fixed and submitted to immunohistochemistry using antibodies to steroid dehydrogenases, heat shock proteins and the sodium pump (Na+/K+ ATPase). The results revealed that, after 15 min at modeled μG, all the cells showed damage at the cytoskeletal level. The immunoreactions for 3βHSD, 17βHSD and Na+/K+ ATPase were almost abolished. After 24 h of treatment, the presence of the enzymes was restored, and small heat shock proteins were strongly immunostainable. In contrast, in the 1×G cultures, the expression of HSPs was very weak. We conclude that μG modeled by random positioning affects testicular cells in culture in the short term, while the normal activity of the cells is restored after 24 h.
Shan, Juan; Alam, S Kaisar; Garra, Brian; Zhang, Yingtao; Ahmed, Tahira
2016-04-01
This work identifies effective computable features from the Breast Imaging Reporting and Data System (BI-RADS) to develop a computer-aided diagnosis (CAD) system for breast ultrasound. Computerized features corresponding to ultrasound BI-RADS categories were designed and tested using a database of 283 pathology-proven benign and malignant lesions. Features were selected based on classification performance using a "bottom-up" approach for different machine learning methods, including decision tree, artificial neural network, random forest and support vector machine. Using 10-fold cross-validation on the database of 283 cases, the highest area under the receiver operating characteristic (ROC) curve (AUC) was 0.84, from a support vector machine with 77.7% overall accuracy; the highest overall accuracy, 78.5%, was from a random forest with an AUC of 0.83. Lesion margin and orientation were the optimum features common to all of the different machine learning methods. These features can be used in CAD systems to help distinguish benign from worrisome lesions. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. All rights reserved.
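A "bottom-up" selection loop of the kind described can be sketched with a greedy forward selector that repeatedly adds the feature yielding the biggest cross-validated improvement. The feature matrix, counts, and classifier settings below are placeholders, not the study's actual BI-RADS variables.

# Greedy forward feature selection sketch with 10-fold cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(4)
X = rng.normal(size=(283, 8))        # 283 lesions x candidate computable features
y = rng.integers(0, 2, 283)          # benign (0) vs. malignant (1)

selector = SequentialFeatureSelector(
    RandomForestClassifier(n_estimators=200, random_state=4),
    n_features_to_select=3, direction="forward", cv=10)
selector.fit(X, y)
print("selected feature indices:", np.flatnonzero(selector.get_support()))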
Lawrence Livermore National Laboratory ULTRA-350 Test Bed
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopkins, D J; Wulff, T A; Carlisle, K
2001-04-10
LLNL has many in-house designed high precision machine tools. Some of these tools include the Large Optics Diamond Turning Machine (LODTM) [1], Diamond Turning Machine No. 3 (DTM-3) and two Precision Engineering Research Lathes (PERL-I and PERL-II). These machines have accuracy in the sub-micron range and in most cases position resolution in the couple-of-nanometers range. All of these machines are built with similar underlying technologies. The machines use capstan drive technology, laser interferometer position feedback, tachometer velocity feedback, permanent magnet (PM) brush motors and analog velocity and position loop servo compensation [2]. The machine controller does not perform any servo compensation; it simply computes the difference between the commanded position and the actual position (the following error) and sends this to a D/A for the analog servo position loop. LLNL is designing a new high precision diamond turning machine, called the ULTRA 350 [3]. In contrast to many of the proven technologies discussed above, the plan for the new machine is to use brushless linear motors, high precision linear scales, machine controller motor commutation and digital servo compensation for the velocity and position loops. Although none of these technologies are new and all have been in use in industry, applications of these technologies to high precision diamond turning are limited. To minimize the risks of these technologies in the new machine design, LLNL has established a test bed to evaluate them for application in high precision diamond turning. The test bed is primarily composed of commercially available components, including the slide with opposed hydrostatic bearings, the oil system, the brushless PM linear motor, the two-phase input three-phase output linear motor amplifier and the system controller. The linear scales are not yet commercially available but use a common electronic output format. As of this writing, the final verdict on the use of these technologies is still out, but the first part of the work has been completed with promising results. The goal of this part of the work was to close a servo position loop around a slide incorporating these technologies and to measure the performance. This paper discusses the tests that were set up for system evaluation and the results of the measurements made. Some very promising results include: slide positioning at the nanometer level and slow speed slide direction reversal at less than 100 nm/min with no observed discontinuities. This is very important for machine contouring in diamond turning. As a point of reference, at 100 nm/min it would take the slide almost 7 years to complete the full designed travel of 350 mm. This speed has been demonstrated without the use of a velocity sensor; the velocity is derived from the position sensor. With what has been learned on the test bed, the paper finishes with a brief comparison of the old and new technologies, with emphasis on servo performance as illustrated with Bode plot diagrams.
Feature genes predicting the FLT3/ITD mutation in acute myeloid leukemia
LI, CHENGLONG; ZHU, BIAO; CHEN, JIAO; HUANG, XIAOBING
2016-01-01
In the present study, gene expression profiles of acute myeloid leukemia (AML) samples were analyzed to identify feature genes with the capacity to predict the mutation status of FLT3/ITD. Two machine learning models, namely the support vector machine (SVM) and random forest (RF) methods, were used for classification. Four datasets were downloaded from the European Bioinformatics Institute, two of which (containing 371 samples, including 281 FLT3/ITD mutation-negative and 90 mutation-positive samples) were randomly defined as the training group, while the other two datasets (containing 488 samples, including 350 FLT3/ITD mutation-negative and 138 mutation-positive samples) were defined as the test group. Differentially expressed genes (DEGs) were identified by significance analysis of the microarray data using the training samples. The classification efficiency of the SVM and RF methods was evaluated using the following parameters: sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and the area under the receiver operating characteristic curve. Functional enrichment analysis was performed for the feature genes with DAVID. A total of 585 DEGs were identified in the training group, of which 580 were upregulated and five were downregulated. The classification accuracy rates of the two methods for the training group, the test group and the combined group using the 585 feature genes were >90%. For the SVM and RF methods, the rates of correct determination, specificity and PPV were >90%, while the sensitivity and NPV were >80%. The SVM method produced a slightly better classification effect than the RF method. A total of 13 biological pathways were overrepresented by the feature genes, mainly involving energy metabolism, chromatin organization and translation. The feature genes identified in the present study may be used to predict the mutation status of FLT3/ITD in patients with AML. PMID:27177049
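A minimal sketch of the evaluation protocol follows, with random matrices standing in for the expression data of the 585 feature genes: train the SVM and RF on the training group, then compute sensitivity, specificity, PPV and NPV on the test group.

# Illustrative train/test evaluation with the metrics named above.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(5)
X_train, y_train = rng.normal(size=(371, 585)), rng.integers(0, 2, 371)
X_test, y_test = rng.normal(size=(488, 585)), rng.integers(0, 2, 488)

for name, model in [("SVM", SVC()), ("RF", RandomForestClassifier(random_state=5))]:
    y_pred = model.fit(X_train, y_train).predict(X_test)
    tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
    print(name,
          f"sens={tp/(tp+fn):.2f} spec={tn/(tn+fp):.2f} "
          f"ppv={tp/(tp+fp):.2f} npv={tn/(tn+fn):.2f}")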
Kim, Dong Wook; Kim, Hwiyoung; Nam, Woong; Kim, Hyung Jun; Cha, In-Ho
2018-04-23
The aim of this study was to build and validate five types of machine learning models that can predict the occurrence of bisphosphonate-related osteonecrosis of the jaw (BRONJ) associated with dental extraction in patients taking bisphosphonates for the management of osteoporosis. A retrospective review of the medical records was conducted to obtain cases and controls for the study. A total of 125 patients, consisting of 41 cases and 84 controls, were selected for the study. Five machine learning prediction algorithms, including a multivariable logistic regression model, decision tree, support vector machine, artificial neural network, and random forest, were implemented. The outputs of these models were compared with each other and also with conventional methods, such as serum CTX level. The area under the receiver operating characteristic (ROC) curve (AUC) was used to compare the results. The performance of the machine learning models was significantly superior to conventional statistical methods and single predictors. The random forest model yielded the best performance (AUC = 0.973), followed by the artificial neural network (AUC = 0.915), support vector machine (AUC = 0.882), logistic regression (AUC = 0.844), decision tree (AUC = 0.821), drug holiday alone (AUC = 0.810), and CTX level alone (AUC = 0.630). Machine learning methods showed superior performance in predicting BRONJ associated with dental extraction compared with conventional statistical methods using drug holiday and serum CTX level. Machine learning can thus be applied in a wide range of clinical studies. Copyright © 2017. Published by Elsevier Inc.
Development of techniques to enhance man/machine communication
NASA Technical Reports Server (NTRS)
Targ, R.; Cole, P.; Puthoff, H.
1974-01-01
A four-state random stimulus generator, considered to function as an ESP teaching machine, was used to investigate an approach to facilitating interactions between man and machines. A subject tries to guess which of four states the machine is in. The machine offers the user feedback and reinforcement as to the correctness of his choice. Using this machine, 148 volunteer subjects were screened under various protocols. Several whose learning slope and/or mean score departed significantly from chance expectation were identified. Direct physiological evidence of perception of remote stimuli not presented to any known sense of the percipient was also studied, using electroencephalographic (EEG) output when a light was flashed in a distant room.
Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro
2016-12-15
MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. Therefore, miRNAs are involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology nowadays. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts at biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure for coping with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and delivered two-fold, 20-fold, and 6-fold increases in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold by the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
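The class-imbalance mechanism named above can be sketched as below, assuming the imbalanced-learn package and synthetic hairpin-derived features: oversample the scarce true pre-miRNA class with SMOTE, then train a random forest on the balanced set.

# Hedged SMOTE + random forest sketch; features are synthetic placeholders.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(6)
X = rng.normal(size=(2000, 12))                        # e.g. free energy, stem length, ...
y = np.r_[np.ones(100), np.zeros(1900)].astype(int)    # few real pre-miRNAs, many pseudo-hairpins

X_bal, y_bal = SMOTE(random_state=6).fit_resample(X, y)   # balance the classes
clf = RandomForestClassifier(n_estimators=300, random_state=6).fit(X_bal, y_bal)
print("balanced class counts:", np.bincount(y_bal))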
AUTOCLASSIFICATION OF THE VARIABLE 3XMM SOURCES USING THE RANDOM FOREST MACHINE LEARNING ALGORITHM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, Sean A.; Murphy, Tara; Lo, Kitty K., E-mail: s.farrell@physics.usyd.edu.au
In the current era of large surveys and massive data sets, autoclassification of astrophysical sources using intelligent algorithms is becoming increasingly important. In this paper we present the catalog of variable sources in the Third XMM-Newton Serendipitous Source catalog (3XMM) autoclassified using the Random Forest machine learning algorithm. We used a sample of manually classified variable sources from the second data release of the XMM-Newton catalogs (2XMMi-DR2) to train the classifier, obtaining an accuracy of ∼92%. We also evaluated the effectiveness of identifying spurious detections using a sample of spurious sources, achieving an accuracy of ∼95%. Manual investigation of a random sample of classified sources confirmed these accuracy levels and showed that the Random Forest machine learning algorithm is highly effective at automatically classifying 3XMM sources. Here we present the catalog of classified 3XMM variable sources. We also present three previously unidentified unusual sources that were flagged as outlier sources by the algorithm: a new candidate supergiant fast X-ray transient, a 400 s X-ray pulsar, and an eclipsing 5 hr binary system coincident with a known Cepheid.
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1984-01-01
Detailed descriptions of the data and format of the machine-readable astronomical catalog are given. The machine version is identical in data content to the published edition, but minor modifications in the data format were made in order to effect uniformity with machine versions of other astronomical catalogs. Stellar motions and positions at epoch and equinox 1950.0 are reported.
NASA Technical Reports Server (NTRS)
Patel, Mamta J.; Liu, Wenbin; Sykes, Michelle C.; Ward, Nancy E.; Risin, Semyon A.; Risin, Diana; Hanjoong, Jo
2007-01-01
Microgravity of spaceflight induces bone loss due in part to decreased bone formation by osteoblasts. We have previously examined the microgravity-induced changes in gene expression profiles in 2T3 preosteoblasts using the Random Positioning Machine (RPM) to simulate microgravity conditions. Here, we hypothesized that exposure of preosteoblasts to an independent microgravity simulator, the Rotating Wall Vessel (RWV), induces similar changes in differentiation and gene transcript profiles, resulting in a more confined list of gravi-sensitive genes that may play a role in bone formation. In comparison to static 1g controls, exposure of 2T3 cells to the RWV for 3 days inhibited alkaline phosphatase activity, a marker of differentiation, and downregulated 61 genes and upregulated 45 genes by more than two-fold as shown by microarray analysis. The microarray results were confirmed with real-time PCR for the downregulated genes osteomodulin, bone morphogenetic protein 4 (BMP4), runx2, and parathyroid hormone receptor 1. Western blot analysis validated the expression of three downregulated genes, BMP4, peroxiredoxin IV, and osteoglycin, and one upregulated gene, peroxiredoxin I. Comparison of the microarrays from the RPM and the RWV studies identified 14 gravi-sensitive genes that changed in the same direction in both systems. Further comparison of our results to a published database of gene transcript profiles of mechanically loaded mouse tibiae revealed 16 genes upregulated by the loading that were downregulated by the RWV and RPM. These mechanosensitive genes identified by the comparative studies may provide novel insights into the mechanisms regulating bone formation and potential targets of countermeasures against decreased bone formation, both in astronauts and in the general population of patients with musculoskeletal disorders.
Richardson, Alice M; Lidbury, Brett A
2017-08-14
Data mining techniques such as support vector machines (SVMs) have been successfully used to predict outcomes for complex problems, including for human health. Much health data is imbalanced, with many more controls than positive cases. The impact of three balancing methods and one feature selection method is explored, to assess the ability of SVMs to classify imbalanced diagnostic pathology data associated with the laboratory diagnosis of hepatitis B (HBV) and hepatitis C (HCV) infections. Random forests (RFs) for predictor variable selection, and data reshaping to overcome the large imbalance of negative to positive test results in HBV and HCV immunoassay results, are examined. The methodology is illustrated using data from ACT Pathology (Canberra, Australia), consisting of laboratory test records from 18,625 individuals who underwent hepatitis virus testing over the decade from 1997 to 2007. Overall, the prediction of HCV test results by immunoassay was more accurate than for HBV immunoassay results associated with identical routine pathology predictor variable data. HBV and HCV negative results were vastly in excess of positive results, so three approaches to handling the negative/positive data imbalance were compared. Generating datasets by the Synthetic Minority Oversampling Technique (SMOTE) resulted in significantly more accurate prediction than single downsizing or multiple downsizing (MDS) of the dataset. For downsized datasets, applying an RF for predictor variable selection had a small effect on performance, which varied depending on the virus. For SMOTE, an RF had a negative effect on performance. An analysis of variance of the performance across settings supports these findings. Finally, age and assay results for alanine aminotransferase (ALT), together with sodium for HBV and urea for HCV, were found to have a significant impact upon laboratory diagnosis of HBV or HCV infection using an optimised SVM model. Laboratories looking to include machine learning via SVMs as part of their decision support need to be aware that the balancing method, the predictor variable selection, and the virus type interact to affect the laboratory diagnosis of hepatitis virus infection with routine pathology laboratory variables in different ways, depending on which combination is being studied. This awareness should lead to careful use of existing machine learning methods, thus improving the quality of laboratory diagnosis.
Alignment verification procedures
NASA Technical Reports Server (NTRS)
Edwards, P. R.; Phillips, E. P.; Newman, J. C., Jr.
1988-01-01
In the alignment verification procedures, each laboratory is required to align its test machines and gripping fixtures to produce a nearly uniform tensile stress field on an un-notched sheet specimen. The blank specimens (50 mm wide x 305 mm long x 2.3 mm thick) supplied by the coordinators were strain gauged, and strain gauge readings were taken at all gauges (n = 1 through 10). The alignment verification procedures are as follows: (1) zero all strain gauges while the specimen is in a free-supported condition; (2) put the strain-gauged specimen in the test machine so that the specimen front face (face 1) is in contact with the reference jaw (standard position of specimen), tighten the grips, and at zero load measure the strains on all gauges (ε_nS0 is the strain at gauge n, standard position, zero load); (3) with the specimen in the machine and at a tensile load of 10 kN, measure the strains with the specimen in the standard position (strain = ε_nS10); (4) remove the specimen from the machine, put the specimen back in the machine so that the specimen back face (face 2) is in contact with the reference jaw (reverse position of specimen), tighten the grips, and at zero load measure the strains on all gauges (strain = ε_nR0); and (5) with the specimen in the machine and at a tensile load of 10 kN, measure the strains with the specimen in the reverse position (ε_nR10 is the strain at gauge n, reverse position, 10 kN load).
Towards large-scale FAME-based bacterial species identification using machine learning techniques.
Slabbinck, Bram; De Baets, Bernard; Dawyndt, Peter; De Vos, Paul
2009-05-01
In the last decade, bacterial taxonomy witnessed a huge expansion. The swift pace of bacterial species (re-)definitions has a serious impact on the accuracy and completeness of first-line identification methods. Consequently, back-end identification libraries need to be synchronized with the List of Prokaryotic names with Standing in Nomenclature. In this study, we focus on bacterial fatty acid methyl ester (FAME) profiling as a broadly used first-line identification method. From the BAME@LMG database, we selected FAME profiles of individual strains belonging to the genera Bacillus, Paenibacillus and Pseudomonas, retaining only those profiles resulting from standard growth conditions. The corresponding data set covers 74, 44 and 95 validly published bacterial species, respectively, represented by 961, 378 and 1673 standard FAME profiles. Through the application of machine learning techniques in a supervised strategy, different computational models were built for genus and species identification. Three techniques were considered: artificial neural networks, random forests and support vector machines. Nearly perfect identification was achieved at genus level. Notwithstanding the known limited discriminative power of FAME analysis for species identification, the computational models yielded good species identification results for the three genera. For Bacillus, Paenibacillus and Pseudomonas, random forests achieved sensitivity values of 0.847, 0.901 and 0.708, respectively. The random forest models outperform those of the other machine learning techniques, and our machine learning approach also outperformed the Sherlock MIS (MIDI Inc., Newark, DE, USA). These results show that machine learning proves very useful for FAME-based bacterial species identification. Besides good bacterial identification at species level, speed and ease of taxonomic synchronization are major advantages of this computational species identification strategy.
Method and system for controlling start of a permanent magnet machine
Walters, James E.; Krefta, Ronald John
2003-10-28
A method and system for controlling a permanent magnet machine are provided. The method provides a sensor assembly for sensing rotor sector position relative to a plurality of angular sectors, and a sensor for sensing angular increments in rotor position. The method allows starting the machine in a brushless direct current mode of operation using an initial rotor position calculated from the initial angular sector position information from the sensor assembly. Upon determining a transition from the initial angular sector to the next angular sector, the method allows switching to a sinusoidal mode of operation using rotor position information from the incremental sensor.
Virtual screening by a new Clustering-based Weighted Similarity Extreme Learning Machine approach
Kudisthalert, Wasu
2018-01-01
Machine learning techniques are becoming popular in virtual screening tasks. One of the powerful machine learning algorithms is the Extreme Learning Machine (ELM), which has been applied to many applications and has recently been applied to virtual screening. We propose the Weighted Similarity ELM (WS-ELM), which is based on a single-layer feed-forward neural network in conjunction with 16 different similarity coefficients as activation functions in the hidden layer. It is known that the performance of a conventional ELM is not robust due to random weight selection in the hidden layer. Thus, we propose a Clustering-based WS-ELM (CWS-ELM) that deterministically assigns weights by utilising clustering algorithms, i.e., k-means clustering and support vector clustering. The experiments were conducted on one of the most challenging datasets, the Maximum Unbiased Validation dataset, which contains 17 activity classes carefully selected from PubChem. The proposed algorithms were then compared with other machine learning techniques such as support vector machine, random forest, and similarity searching. The results show that CWS-ELM in conjunction with support vector clustering yields the best performance when utilised together with the Sokal/Sneath(1) coefficient. Furthermore, the ECFP_6 fingerprint presents the best results in our framework compared to the other types of fingerprints, namely ECFP_4, FCFP_4, and FCFP_6. PMID:29652912
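For orientation, here is a compact sketch of a conventional ELM, the starting point the paper modifies: random hidden weights, a fixed nonlinearity, and output weights solved in closed form. WS-ELM swaps the activation for similarity coefficients and CWS-ELM replaces the random weights with cluster-derived ones; neither variant is reproduced here, and the data are placeholders:

```python
# Conventional ELM: the hidden layer is random and untrained; only the output
# weights beta are fitted, via the pseudoinverse of the hidden activations.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 16))            # placeholder fingerprints
y = (X[:, 0] + X[:, 1] > 0).astype(float)     # placeholder activity labels

n_hidden = 64
W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
b = rng.standard_normal(n_hidden)                 # random biases
H = np.tanh(X @ W + b)                            # hidden-layer activations
beta = np.linalg.pinv(H) @ y                      # closed-form output weights

pred = (np.tanh(X @ W + b) @ beta > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```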
Mateen, Bilal Akhter; Bussas, Matthias; Doogan, Catherine; Waller, Denise; Saverino, Alessia; Király, Franz J; Playford, E Diane
2018-05-01
To determine whether tests of cognitive function and patient-reported outcome measures of motor function can be used to create a machine learning-based predictive tool for falls. Prospective cohort study. Tertiary neurological and neurosurgical center. In all, 337 in-patients receiving neurosurgical, neurological, or neurorehabilitation-based care. Measures were binary (Y/N) for falling during the in-patient episode, the Trail Making Test (a measure of attention and executive function), and the Walk-12 (a patient-reported measure of physical function). The principal outcome was a fall during the in-patient stay (n = 54). The Trail Making Test was identified as the best predictor of falls; moreover, the addition of other variables did not improve the prediction (Wilcoxon signed-rank P < 0.001). Classical linear statistical modeling methods were then compared with more recent machine learning-based strategies, for example, random forests, neural networks, and support vector machines. The random forest was the best modeling strategy when utilizing just the Trail Making Test data (Wilcoxon signed-rank P < 0.001), with 68% (±7.7) sensitivity and 90% (±2.3) specificity. This study identifies a simple yet powerful machine learning (random forest) based predictive model for an in-patient neurological population, utilizing a single neuropsychological test of cognitive function, the Trail Making Test.
NASA Astrophysics Data System (ADS)
Ma, Zhichao; Hu, Leilei; Zhao, Hongwei; Wu, Boda; Peng, Zhenxing; Zhou, Xiaoqin; Zhang, Hongguo; Zhu, Shuai; Xing, Lifeng; Hu, Huang
2010-08-01
The theories and techniques for improving machining accuracy via position control of the diamond tool's tip and for raising the resolution of cutting depth on precision CNC lathes have attracted intense attention. A new piezo-driven ultra-precision machine tool servo system is designed and tested to improve the manufacturing accuracy of workpieces. The mathematical model of the machine tool servo system is established, and finite element analysis is carried out on the parallel plate flexure hinges. The output position of the diamond tool's tip driven by the machine tool servo system is measured via a contact capacitive displacement sensor. Proportional-integral-derivative (PID) feedback is also implemented to accommodate and compensate for dynamic changes owing to cutting forces, as well as the inherent non-linearity of the piezoelectric stack, during the cutting process. With this closed-loop feedback control strategy, the tracking error is limited to 0.8 μm. Experimental results have shown that the proposed machine tool servo system can provide a tool positioning resolution of 12 nm, which is much finer than the inherent CNC resolution. A stepped aluminum shaft specimen with a cutting-depth increment of 1 μm per step was machined, and the obtained contour illustrates that the displacement command output from the controller is accurately reflected on the machined part in real time.
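A generic discrete PID loop of the kind the abstract describes, as an illustration rather than the authors' controller; the gains, units, and signal names are assumptions:

```python
# Discrete PID on the displacement error: the capacitive sensor reading is
# compared against the commanded depth and the correction drives the piezo.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint_um, measured_um):
        error = setpoint_um - measured_um
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.8, ki=20.0, kd=0.002, dt=1e-4)            # illustrative gains
drive = pid.update(setpoint_um=5.0, measured_um=4.2)     # signal to the stack
print(drive)
```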
Forecasting Space Weather-Induced GPS Performance Degradation Using Random Forest
NASA Astrophysics Data System (ADS)
Filjar, R.; Filic, M.; Milinkovic, F.
2017-12-01
Space weather and ionospheric dynamics have a profound effect on the positioning performance of the Global Navigation Satellite System (GNSS). However, the quantification of that effect is still the subject of scientific activities around the world. In the latest contribution to the understanding of space weather and ionospheric effects on satellite-based positioning performance, we conducted a study of several candidate methods for forecasting space weather-induced GPS positioning performance deterioration. First, a 5-day set of experimentally collected data was established, encompassing space weather and ionospheric activity indices (including readings of the Sudden Ionospheric Disturbance (SID) monitors, components of geomagnetic field strength, the global Kp index, the Dst index, GPS-derived Total Electron Content (TEC) samples, the standard deviation of TEC samples, and sunspot number) and observations of GPS positioning error components (northing, easting, and height positioning error) derived from the Adriatic Sea IGS reference stations' RINEX raw pseudorange files in quiet space weather periods. This data set was split into training and test sub-sets. Then, a selected set of supervised machine learning methods based on Random Forest was applied to the experimentally collected data set in order to establish appropriate regional (Adriatic Sea) forecasting models for space weather-induced GPS positioning performance deterioration. The forecasting models were developed in the R/rattle statistical programming environment. The forecasting quality of the regional models was assessed, and conclusions were drawn on the advantages and shortcomings of regional forecasting models for space weather-caused GNSS positioning performance deterioration.
Reversible micromachining locator
Salzer, Leander J.; Foreman, Larry R.
2002-01-01
A locator with a part support is used to hold a part onto the kinematic mount of a tooling machine so that the part can be held in, or replaced in, exactly the same position relative to the cutting tool for machining different surfaces of the part or for performing different machining operations on the same or different surfaces of the part. The locator has disposed therein a plurality of steel balls placed at equidistant positions around the planar surface of the locator, and the kinematic mount has a plurality of magnets which alternate with grooves that accommodate the portions of the steel balls projecting from the locator. The part support holds the part to be machined securely in place in the locator. The locator can be easily detached from the kinematic mount, turned over, and replaced onto the same kinematic mount or another kinematic mount on another tooling machine without removing the part to be machined from the locator, so that there is no need to touch or reposition the part within the locator, thereby assuring exact replication of the position of the part in relation to the cutting tool on the tooling machine for each machining operation on the part.
NASA Technical Reports Server (NTRS)
Burke, Gary R.; Taft, Stephanie
2004-01-01
State machines are commonly used to control sequential logic in FPGAs and ASICs. An errant state machine can cause considerable damage to the device it is controlling. For example, in space applications the FPGA might be controlling pyros, which, if fired at the wrong time, will cause a mission failure. Even a well-designed state machine can be subject to random errors as a result of single-event upsets (SEUs) from the radiation environment in space. There are various ways to encode the states of a state machine, and the type of encoding makes a large difference in the susceptibility of the state machine to radiation. In this paper we compare four methods of state machine encoding, determine which method gives the best fault tolerance, and determine the resources needed for each method.
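One way to see why encoding matters, sketched under simple assumptions: the minimum Hamming distance between valid state codes indicates how many upset bits it takes to turn one legal state into another. With binary encoding a single SEU can silently reach another legal state, while one-hot codes keep legal states two flips apart, so a single upset always lands in a detectable illegal state:

```python
# Minimum pairwise Hamming distance between valid state codes, for a
# 4-state machine under binary and one-hot encodings.
from itertools import combinations

def min_hamming(codes):
    return min(bin(a ^ b).count("1") for a, b in combinations(codes, 2))

binary = [0b00, 0b01, 0b10, 0b11]           # 4 states, binary encoded
one_hot = [0b0001, 0b0010, 0b0100, 0b1000]  # 4 states, one-hot encoded

print("binary min distance :", min_hamming(binary))   # 1: single SEU is silent
print("one-hot min distance:", min_hamming(one_hot))  # 2: single SEU detectable
```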
Taxi-Out Time Prediction for Departures at Charlotte Airport Using Machine Learning Techniques
NASA Technical Reports Server (NTRS)
Lee, Hanbong; Malik, Waqar; Jung, Yoon C.
2016-01-01
Predicting the taxi-out times of departures accurately is important for improving airport efficiency and takeoff time predictability. In this paper, we attempt to apply machine learning techniques to actual traffic data at Charlotte Douglas International Airport for taxi-out time prediction. To find the key factors affecting aircraft taxi times, surface surveillance data is first analyzed. From this data analysis, several variables, including terminal concourse, spot, runway, departure fix, and weight class, are selected for taxi time prediction. Then, various machine learning methods, such as linear regression, support vector machines, k-nearest neighbors, random forest, and neural network models, are applied to actual flight data. Different traffic flow and weather conditions at Charlotte airport are also taken into account for more accurate prediction. The taxi-out time prediction results show that the linear regression and random forest techniques can provide the most accurate prediction in terms of root-mean-square error. We also discuss the operational complexity and uncertainties that make it difficult to predict taxi times accurately.
Reversible micromachining locator
Salzer, Leander J.; Foreman, Larry R.
1999-01-01
This invention provides a device which includes a locator, a kinematic mount positioned on a conventional tooling machine, a part carrier disposed on the locator and a retainer ring. The locator has disposed therein a plurality of steel balls, placed in an equidistant position circumferentially around the locator. The kinematic mount includes a plurality of magnets which are in registry with the steel balls on the locator. In operation, a blank part to be machined is placed between a surface of a locator and the retainer ring (fitting within the part carrier). When the locator (with a blank part to be machined) is coupled to the kinematic mount, the part is thus exposed for the desired machining process. Because the locator is removably attachable to the kinematic mount, it can easily be removed from the mount, reversed, and reinserted onto the mount for additional machining. Further, the locator can likewise be removed from the mount and placed onto another tooling machine having a properly aligned kinematic mount. Because of the unique design and use of magnetic forces of the present invention, positioning errors of less than 0.25 micrometer for each machining process can be achieved.
Predicting healthcare associated infections using patients' experiences
NASA Astrophysics Data System (ADS)
Pratt, Michael A.; Chu, Henry
2016-05-01
Healthcare associated infections (HAI) are a major threat to patient safety and are costly to health systems. Our goal is to predict the HAI performance of a hospital using patients' experience responses as input. We use four classifiers, viz. random forest, naive Bayes, artificial feedforward neural networks, and the support vector machine, to perform the prediction of six types of HAI, including bloodstream, urinary tract, surgical site, and intestinal infections. Experiments show that the random forest and support vector machine perform well across the six types of HAI.
30 CFR 70.207 - Bimonthly sampling; mechanized mining units.
Code of Federal Regulations, 2012 CFR
2012-07-01
... sampling device as follows: (1) Conventional section using cutting machine. On the cutting machine operator or on the cutting machine within 36 inches inby the normal working position; (2) Conventional section shooting off the solid. On the loading machine operator or on the loading machine within 36 inches inby the...
30 CFR 70.207 - Bimonthly sampling; mechanized mining units.
Code of Federal Regulations, 2014 CFR
2014-07-01
... sampling device as follows: (1) Conventional section using cutting machine. On the cutting machine operator or on the cutting machine within 36 inches inby the normal working position; (2) Conventional section shooting off the solid. On the loading machine operator or on the loading machine within 36 inches inby the...
30 CFR 70.207 - Bimonthly sampling; mechanized mining units.
Code of Federal Regulations, 2013 CFR
2013-07-01
... sampling device as follows: (1) Conventional section using cutting machine. On the cutting machine operator or on the cutting machine within 36 inches inby the normal working position; (2) Conventional section shooting off the solid. On the loading machine operator or on the loading machine within 36 inches inby the...
Source localization in an ocean waveguide using supervised machine learning.
Niu, Haiqiang; Reeves, Emma; Gerstoft, Peter
2017-09-01
Source localization in ocean acoustics is posed as a machine learning problem in which data-driven methods learn source ranges directly from observed acoustic data. The pressure received by a vertical linear array is preprocessed by constructing a normalized sample covariance matrix and used as the input for three machine learning methods: feed-forward neural networks (FNN), support vector machines (SVM), and random forests (RF). The range estimation problem is solved both as a classification problem and as a regression problem by these three machine learning algorithms. The results of range estimation for the Noise09 experiment are compared for FNN, SVM, RF, and conventional matched-field processing and demonstrate the potential of machine learning for underwater source localization.
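A hedged sketch of the preprocessing step described here: forming a normalized sample covariance matrix (SCM) from vertical-array snapshots and flattening it into a feature vector for the classifiers. The array size, snapshot count, and exact normalization are assumptions:

```python
# Normalized SCM from complex array snapshots; the upper triangle (real and
# imaginary parts) becomes the input feature vector for FNN/SVM/RF models.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_snapshots = 16, 64
p = (rng.standard_normal((n_sensors, n_snapshots))
     + 1j * rng.standard_normal((n_sensors, n_snapshots)))  # placeholder data

p = p / np.linalg.norm(p, axis=0, keepdims=True)   # normalize each snapshot
C = (p @ p.conj().T) / n_snapshots                 # sample covariance matrix

iu = np.triu_indices(n_sensors)                    # unique entries only
features = np.concatenate([C[iu].real, C[iu].imag])
print(features.shape)
```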
Mannil, Manoj; von Spiczak, Jochen; Manka, Robert; Alkadhi, Hatem
2018-06-01
The aim of this study was to test whether texture analysis and machine learning enable the detection of myocardial infarction (MI) on non-contrast-enhanced low radiation dose cardiac computed tomography (CCT) images. In this institutional review board-approved retrospective study, we included non-contrast-enhanced electrocardiography-gated low radiation dose CCT image data (effective dose, 0.5 mSv) acquired for the purpose of calcium scoring of 27 patients with acute MI (9 female patients; mean age, 60 ± 12 years), 30 patients with chronic MI (8 female patients; mean age, 68 ± 13 years), and 30 subjects (9 female patients; mean age, 44 ± 6 years) without cardiac abnormality, hereafter termed controls. Texture analysis of the left ventricle was performed using free-hand regions of interest, and texture features were classified twice (Model I: controls versus acute MI versus chronic MI; Model II: controls versus acute and chronic MI). For both classifications, six commonly used machine learning classifiers were used: decision tree C4.5 (J48), k-nearest neighbors, locally weighted learning, RandomForest, sequential minimal optimization, and an artificial neural network employing deep learning. In addition, two blinded, independent readers visually assessed noncontrast CCT images for the presence or absence of MI. In Model I, the best classification results were obtained using the k-nearest neighbors classifier (sensitivity, 69%; specificity, 85%; false-positive rate, 0.15). In Model II, the best classification results were found with the locally weighted learning classifier (sensitivity, 86%; specificity, 81%; false-positive rate, 0.19), with an area under the curve from receiver operating characteristic analysis of 0.78. In comparison, neither reader was able to identify MI in any of the noncontrast, low radiation dose CCT images. This study indicates the ability of texture analysis and machine learning to detect MI on noncontrast low radiation dose CCT images that is not visible to the radiologist's eye.
Machine learning models in breast cancer survival prediction.
Montazeri, Mitra; Montazeri, Mohadeseh; Montazeri, Mahdieh; Beigzadeh, Amin
2016-01-01
Breast cancer is one of the most common cancers, with a high mortality rate among women. With early diagnosis, breast cancer survival increases from 56% to more than 86%. Therefore, an accurate and reliable system is necessary for the early diagnosis of this cancer. The proposed model is a combination of rules and different machine learning techniques. Machine learning models can help physicians reduce the number of false decisions. They try to exploit patterns and relationships among a large number of cases and predict the outcome of a disease using historical cases stored in datasets. The objective of this study is to propose a rule-based classification method with machine learning techniques for the prediction of different types of breast cancer survival. We use a dataset with eight attributes that includes the records of 900 patients, of whom 876 (97.3%) were female and 24 (2.7%) were male. Naive Bayes (NB), Trees Random Forest (TRF), 1-Nearest Neighbor (1NN), AdaBoost (AD), Support Vector Machine (SVM), RBF Network (RBFN), and Multilayer Perceptron (MLP) machine learning techniques were used with the proposed model, under 10-fold cross-validation, for the prediction of breast cancer survival. The performance of the machine learning techniques was evaluated with accuracy, precision, sensitivity, specificity, and area under the ROC curve. Out of 900 patients, 803 were alive and 97 were dead. In this study, the Trees Random Forest (TRF) technique showed better results in comparison to the other techniques (NB, 1NN, AD, SVM, RBFN, MLP). The accuracy, sensitivity, and area under the ROC curve of TRF are 96%, 96%, and 93%, respectively. In contrast, the 1NN machine learning technique provided poor performance (accuracy 91%, sensitivity 91%, and area under the ROC curve 78%). This study demonstrates that the Trees Random Forest (TRF) model, a rule-based classification model, was the best model with the highest level of accuracy. Therefore, this model is recommended as a useful tool for breast cancer survival prediction as well as medical decision making.
Detecting false positive sequence homology: a machine learning approach.
Fujimoto, M Stanley; Suvorov, Anton; Jensen, Nicholas O; Clement, Mark J; Bybee, Seth M
2016-02-24
Accurate detection of homologous relationships of biological sequences (DNA or amino acid) amongst organisms is an important and often difficult task that is essential to various evolutionary studies, ranging from building phylogenies to predicting functional gene annotations. There are many existing heuristic tools, most commonly based on bidirectional BLAST searches, that are used to identify homologous genes and combine them into two fundamentally distinct classes: orthologs and paralogs. Because these methods use only heuristic filtering based on significance score cutoffs and have no cluster post-processing tools available, they can often produce multiple clusters constituting unrelated (non-homologous) sequences. Therefore, sequencing data extracted from incomplete genome/transcriptome assemblies originating from low-coverage sequencing, or produced by de novo processes without a reference genome, are susceptible to high false positive rates of homology detection. In this paper we develop biologically informative features that can be extracted from multiple sequence alignments of putative homologous genes (orthologs and paralogs) and further utilized in the context of guided experimentation to verify false positive outcomes. We demonstrate that our machine learning method, trained on both known homology clusters obtained from OrthoDB and randomly generated sequence alignments (non-homologs), successfully determines apparent false positives inferred by heuristic algorithms, especially among proteomes recovered from low-coverage RNA-seq data. Approximately 42% and 25% of the putative homologies predicted by InParanoid and HaMStR, respectively, were classified as false positives on the experimental data set. Our process increases the quality of output from other clustering algorithms by providing a novel post-processing method that is both fast and efficient at removing low-quality clusters of putative homologous genes recovered by heuristic-based approaches.
Prediction of drug synergy in cancer using ensemble-based machine learning techniques
NASA Astrophysics Data System (ADS)
Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder
2018-04-01
Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents, and it can be developed as a pre-processing tool for therapeutic successes. Examination of different drug-drug interactions can be done via the drug synergy score, which calls for efficient regression-based machine learning approaches that minimize prediction error. Numerous machine learning techniques, such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to meet this requirement. However, these techniques individually do not provide significant accuracy in drug synergy score prediction. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques were implemented on the drug synergy data. Based on the accuracy of each model, four techniques with high accuracy were selected to develop the ensemble-based machine learning model: random forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), the Adaptive-Network-Based Fuzzy Inference System (ANFIS), and the Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved by a biased weighted aggregation of the predictions of the selected models (i.e., adding more weight to a model with a higher prediction score). The proposed and existing machine learning techniques were evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error, and coefficient of correlation.
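A minimal sketch of the biased weighted aggregation step, under the assumption that each selected model's prediction is weighted in proportion to its validation score; the scores and predictions below are placeholders:

```python
# Weighted ensemble: models with higher validation scores contribute more
# to the final drug synergy prediction.
import numpy as np

# Placeholder validation scores for the four selected models
# (random forest, GFS.GCCL, ANFIS, DENFIS).
scores = np.array([0.82, 0.78, 0.75, 0.71])
weights = scores / scores.sum()                 # normalize to sum to 1

# Per-model synergy score predictions for a batch of three drug pairs.
preds = np.array([
    [12.1, 3.4, -1.0],
    [11.6, 2.9, -0.4],
    [12.8, 3.8, -1.5],
    [11.0, 3.1, -0.8],
])
ensemble = weights @ preds                      # weighted average per pair
print(ensemble)
```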
Reversible micromachining locator
Salzer, L.J.; Foreman, L.R.
1999-08-31
This invention provides a device which includes a locator, a kinematic mount positioned on a conventional tooling machine, a part carrier disposed on the locator and a retainer ring. The locator has disposed therein a plurality of steel balls, placed in an equidistant position circumferentially around the locator. The kinematic mount includes a plurality of magnets which are in registry with the steel balls on the locator. In operation, a blank part to be machined is placed between a surface of a locator and the retainer ring (fitting within the part carrier). When the locator (with a blank part to be machined) is coupled to the kinematic mount, the part is thus exposed for the desired machining process. Because the locator is removably attachable to the kinematic mount, it can easily be removed from the mount, reversed, and reinserted onto the mount for additional machining. Further, the locator can likewise be removed from the mount and placed onto another tooling machine having a properly aligned kinematic mount. Because of the unique design and use of magnetic forces of the present invention, positioning errors of less than 0.25 micrometer for each machining process can be achieved. 7 figs.
Support vector machine in machine condition monitoring and fault diagnosis
NASA Astrophysics Data System (ADS)
Widodo, Achmad; Yang, Bo-Suk
2007-08-01
Recently, the issue of machine condition monitoring and fault diagnosis as part of a maintenance system has become global, due to the potential advantages to be gained from reduced maintenance costs, improved productivity, and increased machine availability. This paper presents a survey of machine condition monitoring and fault diagnosis using the support vector machine (SVM). It attempts to summarize and review the recent research and developments of SVM in machine condition monitoring and diagnosis. Numerous methods have been developed based on intelligent systems such as artificial neural networks, fuzzy expert systems, condition-based reasoning, random forests, etc. However, the use of SVM for machine condition monitoring and fault diagnosis is still rare. SVM has excellent generalization performance, so it can produce high classification accuracy for machine condition monitoring and diagnosis. Up to 2006, the use of SVM in machine condition monitoring and fault diagnosis tended to develop towards expertise orientation and problem-oriented domains. Finally, the ability to continually change and obtain novel ideas for machine condition monitoring and fault diagnosis using SVM remains future work.
Resistance characteristics of innovative eco-fitness equipment: a water buoyancy muscular machine.
Chen, Wei-Han; Liu, Ya-Chen; Tai, Hsing-Hao; Liu, Chiang
2018-03-01
This paper proposes innovative eco-fitness equipment: a water buoyancy muscular machine (WBM machine) in which resistance is generated by buoyancy and fluid resistance. The resistance characteristics during resistance adjustment and under isometric, concentric, eccentric, and isokinetic conditions were investigated and compared to those of a conventional machine with metal weight plates. The results indicated that the isometric, concentric, and eccentric resistances could be adjusted by varying the water volume; the maximum resistances under isometric, concentric, and eccentric conditions were 163.8, 338.5, and 140.9 N, respectively. The isometric resistances at different positions remained constant in both machines; however, the isometric resistance was lower for the WBM machine at a position corresponding to 5% of total displacement. The WBM machine has lower resistance under eccentric conditions and higher resistance under concentric conditions. Although the conventional machine showed an identical trend, the variation was minor (within 4 N). In the WBM machine, the eccentric resistance was approximately 30-45% of the concentric resistance. Concentric resistances increased with an increase in velocity in both machines; however, the eccentric resistances decreased with an increase in velocity. In summary, the WBM machine, a piece of innovative eco-fitness equipment, has unique resistance characteristics and expansibility.
Yadav, Kabir; Sarioglu, Efsun; Choi, Hyeong Ah; Cartwright, Walter B; Hinds, Pamela S; Chamberlain, James M
2016-02-01
The authors have previously demonstrated highly reliable automated classification of free-text computed tomography (CT) imaging reports using a hybrid system that pairs linguistic (natural language processing) and statistical (machine learning) techniques. Previously performed for identifying the outcome of orbital fracture in unprocessed radiology reports from a clinical data repository, the performance has not been replicated for more complex outcomes. To validate automated outcome classification performance of a hybrid natural language processing (NLP) and machine learning system for brain CT imaging reports. The hypothesis was that our system has similar performance characteristics for identifying pediatric traumatic brain injury (TBI). This was a secondary analysis of a subset of 2,121 CT reports from the Pediatric Emergency Care Applied Research Network (PECARN) TBI study. For that project, radiologists dictated CT reports as free text, which were then deidentified and scanned as PDF documents. Trained data abstractors manually coded each report for TBI outcome. Text was extracted from the PDF files using optical character recognition. The data set was randomly split evenly for training and testing. Training patient reports were used as input to the Medical Language Extraction and Encoding (MedLEE) NLP tool to create structured output containing standardized medical terms and modifiers for negation, certainty, and temporal status. A random subset stratified by site was analyzed using descriptive quantitative content analysis to confirm identification of TBI findings based on the National Institute of Neurological Disorders and Stroke (NINDS) Common Data Elements project. Findings were coded for presence or absence, weighted by frequency of mentions, and past/future/indication modifiers were filtered. After combining with the manual reference standard, a decision tree classifier was created using data mining tools WEKA 3.7.5 and Salford Predictive Miner 7.0. Performance of the decision tree classifier was evaluated on the test patient reports. The prevalence of TBI in the sampled population was 159 of 2,217 (7.2%). The automated classification for pediatric TBI is comparable to our prior results, with the notable exception of lower positive predictive value. Manual review of misclassified reports, 95.5% of which were false-positives, revealed that a sizable number of false-positive errors were due to differing outcome definitions between NINDS TBI findings and PECARN clinically important TBI findings, and report ambiguity not meeting definition criteria. A hybrid NLP and machine learning automated classification system continues to show promise in coding free-text electronic clinical data. For complex outcomes, it can reliably identify negative reports, but manual review of positive reports may be required. As such, it can still streamline data collection for clinical research and performance improvement. © 2016 by the Society for Academic Emergency Medicine.
Yadav, Kabir; Sarioglu, Efsun; Choi, Hyeong-Ah; Cartwright, Walter B.; Hinds, Pamela S.; Chamberlain, James M.
2016-01-01
Background The authors have previously demonstrated highly reliable automated classification of free text computed tomography (CT) imaging reports using a hybrid system that pairs linguistic (natural language processing) and statistical (machine learning) techniques. Previously performed for identifying the outcome of orbital fracture in unprocessed radiology reports from a clinical data repository, the performance has not been replicated for more complex outcomes. Objectives To validate automated outcome classification performance of a hybrid natural language processing (NLP) and machine learning system for brain CT imaging reports. The hypothesis was that our system has similar performance characteristics for identifying pediatric traumatic brain injury (TBI). Methods This was a secondary analysis of a subset of 2,121 CT reports from the Pediatric Emergency Care Applied Research Network (PECARN) TBI study. For that project, radiologists dictated CT reports as free text, which were then de-identified and scanned as PDF documents. Trained data abstractors manually coded each report for TBI outcome. Text was extracted from the PDF files using optical character recognition. The dataset was randomly split evenly for training and testing. Training patient reports were used as input to the Medical Language Extraction and Encoding (MedLEE) NLP tool to create structured output containing standardized medical terms and modifiers for negation, certainty, and temporal status. A random subset stratified by site was analyzed using descriptive quantitative content analysis to confirm identification of TBI findings based upon the National Institute of Neurological Disorders and Stroke Common Data Elements project. Findings were coded for presence or absence, weighted by frequency of mentions, and past/future/indication modifiers were filtered. After combining with the manual reference standard, a decision tree classifier was created using data mining tools WEKA 3.7.5 and Salford Predictive Miner 7.0. Performance of the decision tree classifier was evaluated on the test patient reports. Results The prevalence of TBI in the sampled population was 159 out of 2,217 (7.2%). The automated classification for pediatric TBI is comparable to our prior results, with the notable exception of lower positive predictive value (PPV). Manual review of misclassified reports, 95.5% of which were false positives, revealed that a sizable number of false-positive errors were due to differing outcome definitions between NINDS TBI findings and PECARN clinically important TBI findings, and report ambiguity not meeting definition criteria. Conclusions A hybrid NLP and machine learning automated classification system continues to show promise in coding free-text electronic clinical data. For complex outcomes, it can reliably identify negative reports, but manual review of positive reports may be required. As such, it can still streamline data collection for clinical research and performance improvement. PMID:26766600
Bucak, Ihsan Ömür
2010-01-01
In the automotive industry, electromagnetic variable reluctance (VR) sensors have been extensively used to measure engine position and speed through a toothed wheel mounted on the crankshaft. In this work, an application that already uses the VR sensing unit for the engine and/or transmission has been chosen to infer, this time, the indirect position of the electric machine in a parallel Hybrid Electric Vehicle (HEV) system. A VR sensor has been chosen to correct the position of the electric machine, mainly because it may still become critical in the operation of HEVs to avoid possible vehicle failures during start-up and on-the-road operation, especially when the machine is used with an internal combustion engine. The proposed method uses the Chi-square test and is adaptive in the sense that it derives the compensation factors during shaft operation and updates them in a timely fashion.
Bucak, İhsan Ömür
2010-01-01
In the automotive industry, electromagnetic variable reluctance (VR) sensors have been extensively used to measure engine position and speed through a toothed wheel mounted on the crankshaft. In this work, an application that already uses the VR sensing unit for the engine and/or transmission has been chosen to infer, this time, the indirect position of the electric machine in a parallel Hybrid Electric Vehicle (HEV) system. A VR sensor has been chosen to correct the position of the electric machine, mainly because it may still become critical in the operation of HEVs to avoid possible vehicle failures during start-up and on-the-road operation, especially when the machine is used with an internal combustion engine. The proposed method uses the Chi-square test and is adaptive in the sense that it derives the compensation factors during shaft operation and updates them in a timely fashion. PMID:22294906
Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro
2018-05-09
Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We incorporated a human cognitive model into machine learning algorithms and compared their performance with that of the currently most popular methods: naïve Bayes, support vector machines, neural networks, logistic regression, and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with the other representative machine learning methods.
MLACP: machine-learning-based prediction of anticancer peptides
Manavalan, Balachandran; Basith, Shaherin; Shin, Tae Hwan; Choi, Sun; Kim, Myeong Ok; Lee, Gwang
2017-01-01
Cancer is the second leading cause of death globally, and use of therapeutic peptides to target and kill cancer cells has received considerable attention in recent years. Identification of anticancer peptides (ACPs) through wet-lab experimentation is expensive and often time consuming; therefore, development of an efficient computational method is essential to identify potential ACP candidates prior to in vitro experimentation. In this study, we developed support vector machine- and random forest-based machine-learning methods for the prediction of ACPs using the features calculated from the amino acid sequence, including amino acid composition, dipeptide composition, atomic composition, and physicochemical properties. We trained our methods using the Tyagi-B dataset and determined the machine parameters by 10-fold cross-validation. Furthermore, we evaluated the performance of our methods on two benchmarking datasets, with our results showing that the random forest-based method outperformed the existing methods with an average accuracy and Matthews correlation coefficient value of 88.7% and 0.78, respectively. To assist the scientific community, we also developed a publicly accessible web server at www.thegleelab.org/MLACP.html. PMID:29100375
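To make one of the listed feature types concrete, here is a hedged sketch of amino acid composition extraction feeding a random forest; the peptide sequences and labels are placeholders, not the Tyagi-B data:

```python
# Amino acid composition: the fraction of each of the 20 residues in a
# peptide, used as a fixed-length feature vector for classification.
from collections import Counter
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq):
    counts = Counter(seq)
    return [counts.get(aa, 0) / len(seq) for aa in AMINO_ACIDS]

# Placeholder peptides and labels (1 = anticancer peptide).
peptides = ["FLPIVGKLLSGLL", "GLWSKIKEVGKEAAKAAAKAAGKAALGAVSEAV"]
labels = [1, 0]

X = [aa_composition(p) for p in peptides]
clf = RandomForestClassifier(n_estimators=500).fit(X, labels)
print(clf.predict(X))
```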
Vidotti, Vanessa G; Costa, Vital P; Silva, Fabrício R; Resende, Graziela M; Cremasco, Fernanda; Dias, Marcelo; Gomi, Edson S
2012-06-15
Purpose. To investigate the sensitivity and specificity of machine learning classifiers (MLC) and spectral domain optical coherence tomography (SD-OCT) for the diagnosis of glaucoma. Methods. Sixty-two patients with early to moderate glaucomatous visual field damage and 48 healthy individuals were included. All subjects underwent a complete ophthalmologic examination, achromatic standard automated perimetry, and RNFL imaging with SD-OCT (Cirrus HD-OCT; Carl Zeiss Meditec, Inc., Dublin, California, USA). Receiver operating characteristic (ROC) curves were obtained for all SD-OCT parameters. Subsequently, the following MLCs were tested: Classification Tree (CTREE), Random Forest (RAN), Bagging (BAG), AdaBoost M1 (ADA), Ensemble Selection (ENS), Multilayer Perceptron (MLP), Radial Basis Function (RBF), Naive-Bayes (NB), and Support Vector Machine (SVM). Areas under the ROC curves (aROCs) obtained for each parameter and each MLC were compared. Results. The mean age was 57.0±9.2 years for healthy individuals and 59.9±9.0 years for glaucoma patients (p=0.103). Mean deviation values were -4.1±2.4 dB for glaucoma patients and -1.5±1.6 dB for healthy individuals (p<0.001). The SD-OCT parameters with the greatest aROCs were the inferior quadrant (0.813), average thickness (0.807), 7 o'clock position (0.765), and 6 o'clock position (0.754). The aROCs from the classifiers varied from 0.785 (ADA) to 0.818 (BAG). The aROC obtained with BAG was not significantly different from the aROC obtained with the best single SD-OCT parameter (p=0.93). Conclusions. The SD-OCT showed good diagnostic accuracy in a group of patients with early glaucoma. In this series, MLCs did not improve the sensitivity and specificity of SD-OCT for the diagnosis of glaucoma.
Identifying QT prolongation from ECG impressions using a general-purpose Natural Language Processor
Denny, Joshua C.; Miller, Randolph A.; Waitman, Lemuel Russell; Arrieta, Mark; Peterson, Joshua F.
2009-01-01
Objective Typically detected via electrocardiograms (ECGs), QT interval prolongation is a known risk factor for sudden cardiac death. Since medications can promote or exacerbate the condition, detection of QT interval prolongation is important for clinical decision support. We investigated the accuracy of natural language processing (NLP) for identifying QT prolongation from cardiologist-generated, free-text ECG impressions compared to corrected QT (QTc) thresholds reported by ECG machines. Methods After integrating negation detection into a locally developed natural language processor, the KnowledgeMap concept identifier, we evaluated NLP-based detection of QT prolongation compared to the calculated QTc on a set of 44,318 ECGs obtained from hospitalized patients. We also created a string query using regular expressions to identify QT prolongation. We calculated sensitivity and specificity of the methods using manual physician review of the cardiologist-generated reports as the gold standard. To investigate causes of “false positive” calculated QTc, we manually reviewed randomly selected ECGs with a long calculated QTc but no mention of QT prolongation. Separately, we validated the performance of the negation detection algorithm on 5,000 manually-categorized ECG phrases for any medical concept (not limited to QT prolongation) prior to developing the NLP query for QT prolongation. Results The NLP query for QT prolongation correctly identified 2,364 of 2,373 ECGs with QT prolongation with a sensitivity of 0.996 and a positive predictive value of 1.000. There were no false positives. The regular expression query had a sensitivity of 0.999 and positive predictive value of 0.982. In contrast, the positive predictive value of common QTc thresholds derived from ECG machines was 0.07–0.25 with corresponding sensitivities of 0.994–0.046. The negation detection algorithm had a recall of 0.973 and precision of 0.982 for 10,490 concepts found within ECG impressions. Conclusions NLP and regular expression queries of cardiologists’ ECG interpretations can more effectively identify QT prolongation than the automated QTc intervals reported by ECG machines. Future clinical decision support could employ NLP queries to detect QTc prolongation and other reported ECG abnormalities. PMID:18938105
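An illustrative regular-expression query in the spirit of the one evaluated (the study's actual pattern is not given in the abstract), including a crude negation guard; both patterns are assumptions:

```python
# Flag ECG impressions mentioning QT prolongation, skipping negated mentions.
import re

QT_LONG = re.compile(r"\b(prolonged\s+qt|qtc?\s+(interval\s+)?prolong\w*)", re.I)
NEGATED = re.compile(
    r"\b(no|without|not)\s+(evidence\s+of\s+)?"
    r"(prolonged\s+qt|qtc?\s+prolong\w*)", re.I)

def flags_qt_prolongation(impression):
    return bool(QT_LONG.search(impression)) and not NEGATED.search(impression)

print(flags_qt_prolongation("Sinus rhythm with prolonged QT interval."))  # True
print(flags_qt_prolongation("No QT prolongation noted."))                 # False
```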
NASA Technical Reports Server (NTRS)
Byman, J. E.
1985-01-01
A brief history of aircraft production techniques is given. A flexible machining cell is then described: a computer-controlled system capable of performing 4-axis machining, part cleaning, dimensional inspection, and materials handling functions in an unmanned environment. The cell was designed to: allow processing of similar and dissimilar parts in random order without disrupting production; allow serial (one-shipset-at-a-time) manufacturing; reduce work-in-process inventory; maximize machine utilization through remote set-up; and maximize throughput while minimizing labor.
Machine learning with quantum relative entropy
NASA Astrophysics Data System (ADS)
Tsuda, Koji
2009-12-01
Density matrices are a central tool in quantum physics, but they are also used in machine learning. A positive definite matrix called the kernel matrix is used to represent the similarities between examples, and positive definiteness assures that the examples are embedded in a Euclidean space. When a positive definite matrix is learned from data, one has to design an update rule that maintains positive definiteness. Our update rule, called the matrix exponentiated gradient update, is motivated by the quantum relative entropy. Notably, the relative entropy is an instance of the Bregman divergences, which are asymmetric distance measures specifying theoretical properties of machine learning algorithms. Using the calculus commonly used in quantum physics, we prove an upper bound on the generalization error of online learning.
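A numerical sketch of the matrix exponentiated gradient update as commonly stated, W_{t+1} ∝ exp(log W_t − η G_t) renormalized to unit trace so the iterate remains a density matrix; the loss gradient G here is a placeholder:

```python
# Matrix exponentiated gradient (MEG) update: working in log-space keeps the
# iterate in the positive definite cone; trace renormalization keeps it a
# density matrix.
import numpy as np
from scipy.linalg import expm, logm

def meg_update(W, G, eta=0.1):
    M = expm(logm(W) - eta * G)
    return M / np.trace(M)

d = 4
W = np.eye(d) / d                      # maximally mixed starting density matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((d, d))
G = (A + A.T) / 2                      # symmetric placeholder gradient

W = meg_update(W, G)
print(np.trace(W).real, np.all(np.linalg.eigvalsh(W) > 0))  # 1.0, True
```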
Prediction of Baseflow Index of Catchments using Machine Learning Algorithms
NASA Astrophysics Data System (ADS)
Yadav, B.; Hatfield, K.
2017-12-01
We present the results of eight machine learning techniques for predicting the baseflow index (BFI) of ungauged basins using a surrogate of catchment-scale climate and physiographic data. The tested algorithms include ordinary least squares, ridge regression, least absolute shrinkage and selection operator (lasso), elastic net, support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees. Our work seeks to identify the dominant controls on BFI that can be readily obtained from ancillary geospatial databases and remote sensing measurements, such that the developed techniques can be extended to ungauged catchments. More than 800 gauged catchments spanning the continental United States were selected to develop the general methodology. The BFI calculation was based on baseflow separated from the daily streamflow hydrograph using the HYSEP filter. The surrogate catchment attributes were compiled from multiple sources, including a digital elevation model, soil, land use, and climate data, and other publicly available ancillary and geospatial data. 80% of the catchments were used to train the ML algorithms, and the remaining 20% were used as an independent test set to measure the generalization performance of the fitted models. K-fold cross-validation using exhaustive grid search was used to tune the hyperparameters of each model. Initial model development was based on 19 independent variables, but after variable selection and feature ranking, we generated revised sparse models of BFI prediction based on only six catchment attributes. These key predictive variables, selected after careful evaluation of the bias-variance tradeoff, include average catchment elevation, slope, fraction of sand, permeability, temperature, and precipitation. The most promising algorithms, exceeding an accuracy score (r-square) of 0.7 on the test data, include support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees. Considering both the accuracy and the computational complexity of these algorithms, we identify extremely randomized trees as the best performing algorithm for BFI prediction in ungauged basins.
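A minimal sketch of the tuning-and-evaluation loop described (k-fold cross-validation with exhaustive grid search, here over an extremely randomized trees regressor); the feature matrix stands in for the six catchment attributes and the grid is illustrative:

```python
# Grid search with k-fold CV over an extremely randomized trees regressor,
# with a held-out 20% test split for generalization performance.
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_regression(n_samples=800, n_features=6, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

grid = GridSearchCV(
    ExtraTreesRegressor(random_state=0),
    param_grid={"n_estimators": [200, 500], "max_depth": [None, 10, 20]},
    cv=5,
    scoring="r2",
)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```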
Machine Learning to Differentiate Between Positive and Negative Emotions Using Pupil Diameter
Babiker, Areej; Faye, Ibrahima; Prehn, Kristin; Malik, Aamir
2015-01-01
Pupil diameter (PD) has been suggested as a reliable parameter for identifying an individual's emotional state. In this paper, we introduce a machine learning technique to detect and differentiate between positive and negative emotions. We presented 30 participants with positive and negative sound stimuli and recorded pupillary responses. The results showed a significant increase in pupil dilation during the processing of negative and positive sound stimuli, with a greater increase for negative stimuli. We also found a more sustained dilation for negative compared to positive stimuli at the end of the trial, which was utilized to differentiate between positive and negative emotions using a machine learning approach that gave an accuracy of 96.5%, with a sensitivity of 97.93% and a specificity of 98%. The obtained results were validated using another dataset, designed for a different study, which was recorded while 30 participants processed word pairs with positive and negative emotions. PMID:26733912
30 CFR 70.220 - Status change reports.
Code of Federal Regulations, 2014 CFR
2014-07-01
... to all miners expected to wear a CPDM. The training shall be completed prior to a miner wearing a... cutting machine. On the cutting machine operator or on the cutting machine within 36 inches inby the normal working position; (2) Conventional section blasting off the solid. On the loading machine operator...
Exploring prediction uncertainty of spatial data in geostatistical and machine learning Approaches
NASA Astrophysics Data System (ADS)
Klump, J. F.; Fouedjio, F.
2017-12-01
Geostatistical methods such as kriging with external drift as well as machine learning techniques such as quantile regression forest have been intensively used for modelling spatial data. In addition to providing predictions for target variables, both approaches are able to deliver a quantification of the uncertainty associated with the prediction at a target location. Geostatistical approaches are, by design, well suited to providing such prediction uncertainties, and their behaviour is well understood. However, they often require significant data pre-processing and rely on assumptions that are rarely met in practice. Machine learning algorithms such as random forest regression, on the other hand, require less data pre-processing and are non-parametric. This makes the application of machine learning algorithms to geostatistical problems an attractive proposition. The objective of this study is to compare kriging with external drift and quantile regression forest with respect to their ability to deliver reliable prediction uncertainties for spatial data. In our comparison we use both simulated and real-world datasets. Apart from classical performance indicators, the comparisons make use of accuracy plots, probability interval width plots, and visual examination of the uncertainty maps provided by the two approaches. By comparing random forest regression to kriging, we found that both methods produced comparable maps of estimated values for our variables of interest. However, the measure of uncertainty provided by random forest seems to be quite different from the measure of uncertainty provided by kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. These preliminary results raise questions about assessing the risks associated with decisions based on the predictions from geostatistical and machine learning algorithms in a spatial context, e.g. mineral exploration.
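One common way to approximate the forest-based prediction uncertainty discussed here is to collect per-tree predictions and take empirical quantiles, in the spirit of quantile regression forest. This per-tree shortcut is an approximation rather than the exact QRF estimator, and the data below are synthetic.

```python
# Sketch: empirical prediction intervals from per-tree random forest outputs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (500, 2))                  # placeholder spatial covariates
y = np.sin(X[:, 0]) + 0.3 * rng.normal(size=500)  # noisy target

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
X_new = rng.uniform(0, 10, (5, 2))
per_tree = np.stack([t.predict(X_new) for t in rf.estimators_])  # (trees, points)
lo, hi = np.quantile(per_tree, [0.05, 0.95], axis=0)             # ~90% interval
print(rf.predict(X_new), lo, hi)
```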
A Boltzmann machine for the organization of intelligent machines
NASA Technical Reports Server (NTRS)
Moed, Michael C.; Saridis, George N.
1990-01-01
A three-tier structure consisting of organization, coordination, and execution levels forms the architecture of an intelligent machine, following the principle of increasing precision with decreasing intelligence from hierarchically intelligent control. This system has been formulated as a probabilistic model, where uncertainty and imprecision can be expressed in terms of entropies. The optimal strategy for decision planning and task execution can be found by minimizing the total entropy in the system. The focus is on the design of the organization level as a Boltzmann machine. Since this level is responsible for planning the actions of the machine, the Boltzmann machine is reformulated to use entropy as the cost function to be minimized. Simulated annealing, expanding subinterval random search, and the genetic algorithm are presented as search techniques to efficiently find the desired action sequence, and are illustrated with numerical examples.
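For illustration, a toy simulated-annealing search over action sequences; the action set, cost function (standing in for the machine's total entropy), and cooling schedule are invented for the sketch.

```python
# Toy simulated annealing over fixed-length action sequences. The cost is an
# arbitrary placeholder for the entropy to be minimized, not the paper's model.
import math, random

actions = list(range(8))                        # hypothetical primitive actions

def cost(seq):                                  # placeholder "entropy"
    return sum((a - 3.5) ** 2 for a in seq) + len(set(seq))

def anneal(length=5, T0=10.0, cooling=0.995, steps=5000):
    seq = [random.choice(actions) for _ in range(length)]
    best, T = list(seq), T0
    for _ in range(steps):
        cand = list(seq)
        cand[random.randrange(length)] = random.choice(actions)  # local move
        d = cost(cand) - cost(seq)
        if d < 0 or random.random() < math.exp(-d / T):          # Metropolis rule
            seq = cand
            if cost(seq) < cost(best):
                best = list(seq)
        T *= cooling                                             # cool down
    return best, cost(best)

print(anneal())
```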
Depth indicator and stop aid machining to precise tolerances
NASA Technical Reports Server (NTRS)
Laverty, J. L.
1966-01-01
Attachment for machine tools provides a visual indication of the depth of cut and a positive stop to prevent overcutting. This attachment is used with drill presses, vertical milling machines, and jig borers.
Chip breaking system for automated machine tool
Arehart, Theodore A.; Carey, Donald O.
1987-01-01
The invention is a rotary selectively directional valve assembly for use in an automated turret lathe for directing a stream of high-pressure liquid machining coolant to the interface of a machine tool and workpiece for breaking up ribbon-shaped chips during their formation, so as to inhibit scratching or other marring of the machined surfaces by these ribbon-shaped chips. The valve assembly is provided by a manifold arrangement having a plurality of circumferentially spaced-apart ports, each coupled to a machine tool. The manifold is rotatable with the turret when the turret is positioned for alignment of a machine tool in a machining relationship with the workpiece. The manifold is connected to a non-rotational header having a single passageway therethrough which conveys the high-pressure coolant to only the port in the manifold which is in registry with the tool disposed in a working relationship with the workpiece. To position the machine tools, the turret is rotated and one of the tools is placed in a material-removing relationship with the workpiece. The passageway in the header and one of the ports in the manifold arrangement are then automatically aligned to supply the machining coolant to the machine tool-workpiece interface for breaking up the chips as well as cooling the tool and workpiece during the machining operation.
Prediction and Identification of Krüppel-Like Transcription Factors by Machine Learning Method.
Liao, Zhijun; Wang, Xinrui; Chen, Xingyong; Zou, Quan
2017-01-01
The Krüppel-like factors (KLFs) are a family of transcription factors containing zinc finger (ZF) motifs, with 18 members in the human genome; among them, KLF18 is predicted by bioinformatics. KLFs possess various physiological functions and are involved in a number of cancers and other diseases. Here we perform a binary classification of KLFs and non-KLFs by machine learning methods. The protein sequences of KLFs and non-KLFs were retrieved from UniProt and randomly separated into a training dataset (containing positive and negative sequences) and a test dataset (containing only negative sequences); after extracting 188-dimensional (188D) feature vectors, we carried out classification with four classifiers (GBDT, libSVM, RF, and k-NN). On the human KLFs, we further dig into the evolutionary relationships and motif distribution, and finally we analyze the conserved amino acid residues of the three zinc fingers. The classifier models from the training dataset were well constructed; the highest specificity (Sp) was 99.83%, from a library for support vector machines (libSVM), and all the correctly classified rates were over 70% for 10-fold cross-validation on the test dataset. The 18 human KLFs can be further divided into 7 groups; the zinc finger domains were located at the carboxyl terminus, with many conserved amino acid residues including cysteine and histidine, and the span of and interval between them were consistent across the three ZF domains. Two classification models for KLF prediction have been built by novel machine learning methods. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
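As a simplified illustration of this kind of pipeline, the sketch below extracts 20-dimensional amino-acid composition features (a stand-in for the paper's 188D descriptor) and runs 10-fold cross-validation with an SVM; the sequences and labels are toy data, not the UniProt sets.

```python
# Sketch: sequence -> composition features -> SVM with 10-fold CV.
from collections import Counter
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """Fraction of each of the 20 amino acids in the sequence (20 features)."""
    c = Counter(seq)
    return [c.get(a, 0) / len(seq) for a in AMINO]

seqs = ["MKTLLV" * 10, "GGHHCC" * 10, "MKVVHH" * 10, "CCHHGG" * 10] * 5  # toy data
labels = [1, 0, 1, 0] * 5                                               # KLF vs non-KLF
X = [composition(s) for s in seqs]
print(cross_val_score(SVC(), X, labels, cv=10).mean())
```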
Enhancement of plant metabolite fingerprinting by machine learning.
Scott, Ian M; Vermeer, Cornelia P; Liakata, Maria; Corol, Delia I; Ward, Jane L; Lin, Wanchang; Johnson, Helen E; Whitehead, Lynne; Kular, Baldeep; Baker, John M; Walsh, Sean; Dave, Anuja; Larson, Tony R; Graham, Ian A; Wang, Trevor L; King, Ross D; Draper, John; Beale, Michael H
2010-08-01
Metabolite fingerprinting of Arabidopsis (Arabidopsis thaliana) mutants with known or predicted metabolic lesions was performed by (1)H-nuclear magnetic resonance, Fourier transform infrared, and flow injection electrospray-mass spectrometry. Fingerprinting enabled processing of five times more plants than conventional chromatographic profiling and was competitive for discriminating mutants, other than those affected in only low-abundance metabolites. Despite their rapidity and complexity, fingerprints yielded metabolomic insights (e.g. that effects of single lesions were usually not confined to individual pathways). Among fingerprint techniques, (1)H-nuclear magnetic resonance discriminated the most mutant phenotypes from the wild type and Fourier transform infrared discriminated the fewest. To maximize information from fingerprints, data analysis was crucial. One-third of distinctive phenotypes might have been overlooked had data models been confined to principal component analysis score plots. Among several methods tested, machine learning (ML) algorithms, namely support vector machine or random forest (RF) classifiers, were unsurpassed for phenotype discrimination. Support vector machines were often the best performing classifiers, but RFs yielded some particularly informative measures. First, RFs estimated margins between mutant phenotypes, whose relations could then be visualized by Sammon mapping or hierarchical clustering. Second, RFs provided importance scores for the features within fingerprints that discriminated mutants. These scores correlated with analysis of variance F values (as did Kruskal-Wallis tests, true- and false-positive measures, mutual information, and the Relief feature selection algorithm). ML classifiers, as models trained on one data set to predict another, were ideal for focused metabolomic queries, such as the distinctiveness and consistency of mutant phenotypes. Accessible software for use of ML in plant physiology is highlighted.
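The comparison between random-forest importance scores and ANOVA F values can be sketched as follows, using synthetic fingerprint data in place of the study's spectra.

```python
# Sketch: correlate random-forest feature importances with univariate ANOVA F
# values on synthetic multi-class "fingerprint" data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 30))           # 120 samples x 30 fingerprint features
y = rng.integers(0, 3, size=120)         # three phenotype classes
X[:, 0] += y                             # make feature 0 informative

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
F, _ = f_classif(X, y)
print(np.corrcoef(rf.feature_importances_, F)[0, 1])   # clearly positive correlation
```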
Evaluation of an Integrated Multi-Task Machine Learning System with Humans in the Loop
2007-01-01
machine learning components, natural language processing, and optimization...was examined with a test explicitly developed to measure the impact of integrated machine learning when used by a human user in a real-world setting...study revealed that integrated machine learning does produce a positive impact on overall performance. This paper also discusses how specific machine learning components contributed to human-system
Sundstrup, Emil; Jakobsen, Markus D; Andersen, Christoffer H; Jay, Kenneth; Andersen, Lars L
2012-08-01
Swiss ball training is recommended as a low-intensity modality to improve joint position, posture, balance, and neural feedback. However, proper training intensity is difficult to obtain during Swiss ball exercises, whereas strengthening exercises on machines are usually performed to induce high levels of muscle activation. To compare muscle activation, as measured by electromyography (EMG), of global core and thigh muscles during abdominal crunches performed on a Swiss ball with elastic resistance or on an isotonic training machine when normalized for training intensity. 42 untrained individuals (18 men and 24 women) aged 28-67 years participated in the study. EMG activity was measured in 13 muscles during 3 repetitions with a 10 RM load during both abdominal crunches on a training ball with elastic resistance and in the same movement on a training machine (seated crunch, Technogym, Cesena, Italy). The order of performance of the exercises was randomized, and EMG amplitude was normalized to maximum voluntary isometric contraction (MVIC) EMG. When comparing between muscles, normalized EMG was highest in the rectus abdominis (P<0.01) and the external obliques (P<0.01). However, crunches on the Swiss ball with elastic resistance showed higher activity of the rectus abdominis than crunches performed on the machine (104±3.8 vs 84±3.8% nEMG, respectively, P<0.0001). By contrast, crunches performed on the Swiss ball induced lower activity of the rectus femoris than crunches on the training machine (27±3.7 vs 65±3.8% nEMG, respectively, P<0.0001). Further, gender, age, and musculoskeletal pain did not significantly influence the findings. Crunches on a Swiss ball with added elastic resistance induce high rectus abdominis activity accompanied by low hip flexor activity, which could be beneficial for individuals with low back pain. In contrast, the lower rectus abdominis activity and higher rectus femoris activity observed on the machine warrant caution for individuals with lumbar pain. Importantly, both men and women, younger and elderly, and individuals with and without pain benefitted equally from the exercises.
Detection of molecular particles in live cells via machine learning.
Jiang, Shan; Zhou, Xiaobo; Kirchhausen, Tom; Wong, Stephen T C
2007-08-01
Clathrin-coated pits play an important role in removing proteins and lipids from the plasma membrane and transporting them to the endosomal compartment. It is, however, still unclear whether there exist "hot spots" for the formation of clathrin-coated pits or whether the pits and arrays form randomly on the plasma membrane. To answer this question, first of all, many hundreds of individual pits need to be detected accurately and separated in live-cell microscope movies to capture and monitor how pits and vesicles are formed. Because of the noisy background and the low contrast of the live-cell movies, existing image analysis methods, such as single thresholding, edge detection, and morphological operations, cannot be used. Thus, this paper proposes a machine learning method, based on Haar features, to detect the particles' positions. Results show that this method can successfully detect most of the particles in the image. In order to get accurate boundaries of these particles, several post-processing methods are applied, and signal-to-noise ratio analysis is also performed to rule out weak spots. Copyright 2007 International Society for Analytical Cytology.
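Haar-like features of the kind used here are typically computed from an integral image, which makes each box sum a constant-time operation; a minimal sketch on a toy image follows (the window coordinates and sizes are arbitrary).

```python
# Sketch: a center-surround Haar-like feature via an integral image.
import numpy as np

img = np.random.rand(64, 64)                 # toy image, not live-cell data
ii = img.cumsum(0).cumsum(1)                 # integral image

def box_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) using the integral image."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0: total -= ii[r0 - 1, c1 - 1]
    if c0 > 0: total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0: total += ii[r0 - 1, c0 - 1]
    return total

# bright 4x4 center versus its 8x8 surround: a particle-like blob responds strongly
center = box_sum(ii, 30, 30, 34, 34)
surround = box_sum(ii, 28, 28, 36, 36) - center
print(center / 16 - surround / 48)           # mean contrast at this location
```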
Steps in the bacterial flagellar motor.
Mora, Thierry; Yu, Howard; Sowa, Yoshiyuki; Wingreen, Ned S
2009-10-01
The bacterial flagellar motor is a highly efficient rotary machine used by many bacteria to propel themselves. It has recently been shown that at low speeds its rotation proceeds in steps. Here we propose a simple physical model, based on the storage of energy in protein springs, that accounts for this stepping behavior as a random walk in a tilted corrugated potential that combines torque and contact forces. We argue that the absolute angular position of the rotor is crucial for understanding step properties and show this hypothesis to be consistent with the available data, in particular the observation that backward steps are smaller on average than forward steps. We also predict a sublinear speed versus torque relationship for fixed load at low torque, and a peak in rotor diffusion as a function of torque. Our model provides a comprehensive framework for understanding and analyzing stepping behavior in the bacterial flagellar motor and proposes novel, testable predictions. More broadly, the storage of energy in protein springs by the flagellar motor may provide useful general insights into the design of highly efficient molecular machines.
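The proposed picture, a random walk in a tilted corrugated potential, can be simulated with a few lines of overdamped Langevin dynamics. The parameter values below are illustrative only; the 26 periods per revolution echo the step count reported for the flagellar motor.

```python
# Toy overdamped Langevin dynamics in V(x) = -torque*x + A*cos(N*x).
import numpy as np

N, A, torque, kT, dt = 26, 3.0, 1.0, 1.0, 1e-4   # illustrative parameters
rng = np.random.default_rng(0)

def force(x):
    return torque + A * N * np.sin(N * x)         # -dV/dx: tilt plus corrugation

x, traj = 0.0, []
for _ in range(200_000):                          # Euler-Maruyama integration
    x += force(x) * dt + np.sqrt(2 * kT * dt) * rng.normal()
    traj.append(x)
print("net revolutions:", traj[-1] / (2 * np.pi))
```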
Parker, David L; Brosseau, Lisa M; Samant, Yogindra; Xi, Min; Pan, Wei; Haugan, David
2009-01-01
Metal fabrication employs an estimated 3.1 million workers in the United States. The absence of machine guarding and related programs such as lockout/tagout may result in serious injury or death. The purpose of this study was to improve machine-related safety in small metal-fabrication businesses. We used a randomized trial with two groups: management only and management-employee. We evaluated businesses for the adequacy of machine guarding (machine scorecard) and related safety programs (safety audit). We provided all businesses with a report outlining deficiencies and prioritizing their remediation. In addition, the management-employee group received four one-hour interactive training sessions from a peer educator. We evaluated 40 metal-fabrication businesses at baseline and 37 (93%) one year later. Of the three nonparticipants, two had gone out of business. More than 40% of devices required for adequate guarding were missing or inadequate, and 35% of required safety programs and practices were absent at baseline. Both measures improved significantly during the course of the intervention. No significant differences in changes occurred between the two intervention groups. Machine-guarding practices and programs improved by up to 13% and safety audit scores by up to 23%. Businesses that added safety committees or those that started with the lowest baseline measures showed the greatest improvements. Simple and easy-to-use assessment tools allowed businesses to significantly improve their safety practices, and safety committees facilitated this process.
Light-operated machines based on threaded molecular structures.
Credi, Alberto; Silvi, Serena; Venturi, Margherita
2014-01-01
Rotaxanes and related species represent the most common implementation of the concept of artificial molecular machines, because the supramolecular nature of the interactions between the components and their interlocked architecture allow a precise control on the position and movement of the molecular units. The use of light to power artificial molecular machines is particularly valuable because it can play the dual role of "writing" and "reading" the system. Moreover, light-driven machines can operate without accumulation of waste products, and photons are the ideal inputs to enable autonomous operation mechanisms. In appropriately designed molecular machines, light can be used to control not only the stability of the system, which affects the relative position of the molecular components but also the kinetics of the mechanical processes, thereby enabling control on the direction of the movements. This step forward is necessary in order to make a leap from molecular machines to molecular motors.
Eroglu, Duygu Yilmaz; Ozmutlu, H Cenk
2014-01-01
We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job sequence- and machine-dependent setup times and with the job splitting property. The first contribution of this paper is to introduce novel algorithms which perform splitting and scheduling simultaneously, with a variable number of subjobs. We propose a simple chromosome structure constituted by random key numbers in a hybrid genetic-local search algorithm (GAspLA). Random key numbers are used frequently in genetic algorithms, but they create additional difficulty when hybrid factors in local search are implemented. We developed algorithms that adapt the results of local search into the genetic algorithm with a minimum of relocation operations on the genes' random key numbers. This is the second contribution of the paper. The third contribution is three new MIP models which perform splitting and scheduling simultaneously. The fourth contribution is the implementation of GAspLAMIP. This implementation lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature, and the results validate the effectiveness of the proposed algorithms.
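The random-key idea itself is compact: each gene is a real number, and sorting the keys decodes the chromosome into a job order. A minimal sketch, with machine assignment and job splitting omitted for brevity:

```python
# Sketch: decoding a random-key chromosome into a job sequence.
import random

jobs = ["J1", "J2", "J3", "J4", "J5"]
chromosome = [random.random() for _ in jobs]        # one random key per job
order = [j for _, j in sorted(zip(chromosome, jobs))]
print(chromosome, "->", order)                      # smallest key is scheduled first
```

Because any vector of keys decodes to a valid sequence, crossover and mutation never produce infeasible offspring, which is the usual motivation for this encoding.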
2011-01-01
Background Dementia and cognitive impairment associated with aging are a major medical and social concern. Neuropsychological testing is a key element in the diagnostic procedures of Mild Cognitive Impairment (MCI), but at present has limited value in the prediction of progression to dementia. We advance the hypothesis that newer statistical classification methods derived from data mining and machine learning, like Neural Networks, Support Vector Machines, and Random Forests, can improve the accuracy, sensitivity, and specificity of predictions obtained from neuropsychological testing. Seven nonparametric classifiers derived from data mining methods (Multilayer Perceptron Neural Networks, Radial Basis Function Neural Networks, Support Vector Machines, CART, CHAID and QUEST Classification Trees, and Random Forests) were compared to three traditional classifiers (Linear Discriminant Analysis, Quadratic Discriminant Analysis, and Logistic Regression) in terms of overall classification accuracy, specificity, sensitivity, area under the ROC curve, and Press' Q. Model predictors were 10 neuropsychological tests currently used in the diagnosis of dementia. Statistical distributions of classification parameters obtained from a 5-fold cross-validation were compared using Friedman's nonparametric test. Results Press' Q test showed that all classifiers performed better than chance alone (p < 0.05). Support Vector Machines showed the largest overall classification accuracy (median (Me) = 0.76) and an area under the ROC curve of Me = 0.90. However, this method showed high specificity (Me = 1.0) but low sensitivity (Me = 0.3). Random Forests ranked second in overall accuracy (Me = 0.73), with high area under the ROC curve (Me = 0.73), specificity (Me = 0.73), and sensitivity (Me = 0.64). Linear Discriminant Analysis also showed acceptable overall accuracy (Me = 0.66), with acceptable area under the ROC curve (Me = 0.72), specificity (Me = 0.66), and sensitivity (Me = 0.64). The remaining classifiers showed overall classification accuracy above a median value of 0.63, but for most, sensitivity was around or even lower than a median value of 0.5. Conclusions When taking into account sensitivity, specificity, and overall classification accuracy, Random Forests and Linear Discriminant Analysis rank first among all the classifiers tested in the prediction of dementia using several neuropsychological tests. These methods may be used to improve the accuracy, sensitivity, and specificity of dementia predictions from neuropsychological testing. PMID:21849043
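The evaluation protocol (per-fold scores for several classifiers compared with Friedman's nonparametric test) can be sketched as follows, with synthetic data in place of the neuropsychological scores and only three of the ten classifiers shown.

```python
# Sketch: 5-fold CV accuracies for three classifiers, compared with the
# Friedman test. Data are synthetic stand-ins for the test battery.
import numpy as np
from scipy.stats import friedmanchisquare
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
models = {"SVM": SVC(), "RF": RandomForestClassifier(random_state=0),
          "LDA": LinearDiscriminantAnalysis()}
scores = {k: cross_val_score(m, X, y, cv=5) for k, m in models.items()}
stat, p = friedmanchisquare(*scores.values())      # ranks folds across models
print({k: v.mean().round(3) for k, v in scores.items()}, "p =", round(p, 4))
```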
NASA Technical Reports Server (NTRS)
Warren, Wayne H., Jr.
1990-01-01
The machine-readable version of the catalog, as it is currently being distributed from the Astronomical Data Center, is described. The Basic FK5 provides improved mean positions and proper motions for the 1535 classical fundamental stars that had been included in the FK3 and FK4 catalogs. The machine version of the catalog contains the positions and proper motions of the Basic FK5 stars for the epochs and equinoxes J2000.0 and B1950.0, the mean epochs of individual observed right ascensions and declinations used to determine the final positions, and the mean errors of the final positions and proper motions for the reported epochs. The cross identifications to other designations used for the FK5 stars that are given in the published catalog were not included in the original machine versions, but the Durchmusterung numbers have been added at the Astronomical Data Center.
Coordinated joint motion control system with position error correction
Danko, George [Reno, NV
2011-11-22
Disclosed are an articulated hydraulic machine, a supporting control system, and a control method for the same. The articulated hydraulic machine has an end effector for performing useful work. The control system is capable of controlling the end effector for automated movement along a preselected trajectory. The control system has a position error correction system to correct discrepancies between the actual end effector trajectory and the desired end effector trajectory. The correction system can employ one or more absolute position signals provided by one or more acceleration sensors supported by one or more movable machine elements. Good trajectory positioning and repeatability can be obtained. A two-joystick controller system is enabled, which can in some cases facilitate the operator's task and enhance their work quality and productivity.
Coordinated joint motion control system with position error correction
Danko, George L.
2016-04-05
Disclosed are an articulated hydraulic machine, a supporting control system, and a control method for the same. The articulated hydraulic machine has an end effector for performing useful work. The control system is capable of controlling the end effector for automated movement along a preselected trajectory. The control system has a position error correction system to correct discrepancies between the actual end effector trajectory and the desired end effector trajectory. The correction system can employ one or more absolute position signals provided by one or more acceleration sensors supported by one or more movable machine elements. Good trajectory positioning and repeatability can be obtained. A two-joystick controller system is enabled, which can in some cases facilitate the operator's task and enhance their work quality and productivity.
Approximating prediction uncertainty for random forest regression models
John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne
2016-01-01
Machine learning approaches such as random forest have seen increasing use for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
Disruption Warning Database Development and Exploratory Machine Learning Studies on Alcator C-Mod
NASA Astrophysics Data System (ADS)
Montes, Kevin; Rea, Cristina; Granetz, Robert
2017-10-01
A database of about 1800 shots from the 2015 campaign on the Alcator C-Mod tokamak is assembled, including disruptive and non-disruptive discharges. The database consists of 40 relevant plasma parameters with data taken from 160k time slices. In order to investigate the possibility of developing a robust disruption prediction algorithm that is tokamak-independent, we focused machine learning studies on a subset of dimensionless parameters such as βp, n/nG, etc. The Random Forests machine learning algorithm provides insight on the available data set by ranking the relative importance of the input features. Its application on the C-Mod database, however, reveals that virtually no one parameter has more importance than any other, and that its classification algorithm has a low rate of successfully predicted samples, as well as poor false positive and false negative rates. Comparing the analysis of this algorithm on the C-Mod database with its application to a similar database on DIII-D, we conclude that disruption prediction may not be feasible on C-Mod. This conclusion is supported by empirical observations that most C-Mod disruptions are caused by radiative collapse due to molybdenum from the first wall, which happens on just a 1-2 ms timescale. Supported by the US Dept. of Energy under DE-FC02-99ER54512 and DE-FC02-04ER54698.
Techniques for Combined Arms for Air Defense
2016-07-29
rockets can be expected on the initial attack run, while cannon and machine-gun fire will likely be used in the follow-on attack. THREAT...wing aircraft with 8 Stinger missiles and an M3P .50 caliber machine gun. This system is highly mobile and can be used to provide SHORAD security for...position. If cover and concealment are less substantial, use the low kneeling position. When using the M240 machine gun, the gunner will also fire from a
[Evaluation of Medical Instruments Cleaning Effect of Fluorescence Detection Technique].
Sheng, Nan; Shen, Yue; Li, Zhen; Li, Huijuan; Zhou, Chaoqun
2016-01-01
To compare the cleaning effect of an automatic cleaning machine and manual cleaning on coupling-type surgical instruments. A total of 32 cleaned medical instruments were randomly sampled from the disinfection supply centers of medical institutions in Putuo District. The Hygiena System SUREII ATP was used to monitor the ATP value, and the cleaning effect was evaluated. The surface ATP values of the medical instruments cleaned manually were higher than those cleaned by the automatic cleaning machine. The automatic cleaning machine achieves a better cleaning effect on coupling-type surgical instruments before disinfection; its application is recommended.
Kocken, Paul L; Eeuwijk, Jennifer; Van Kesteren, Nicole M C; Dusseldorp, Elise; Buijs, Goof; Bassa-Dafesh, Zeina; Snel, Jeltje
2012-03-01
Vending machines account for food sales and revenue in schools. We examined 3 strategies for promoting the sale of lower-calorie food products from vending machines in high schools in the Netherlands. A school-based randomized controlled trial was conducted in 13 experimental schools and 15 control schools. Three strategies were tested within each experimental school: increasing the availability of lower-calorie products in vending machines, labeling products, and reducing the price of lower-calorie products. The experimental schools introduced the strategies in 3 consecutive phases, with phase 3 incorporating all 3 strategies. The control schools remained the same. The sales volumes from the vending machines were registered. Products were grouped into (1) extra foods containing empty calories, for example, candies and potato chips, (2) nutrient-rich basic foods, and (3) beverages. They were also divided into favorable, moderately unfavorable, and unfavorable products. Total sales volumes for experimental and control schools did not differ significantly for the extra and beverage products. Proportionally, the higher availability of lower-calorie extra products in the experimental schools led to higher sales of moderately unfavorable extra products than in the control schools, and to higher sales of favorable extra products in experimental schools where students have to stay during breaks. Together, availability, labeling, and price reduction raised the proportional sales of favorable beverages. Results indicate that when the availability of lower-calorie foods is increased and is also combined with labeling and reduced prices, students make healthier choices without buying more or fewer products from school vending machines. Changes to school vending machines help to create a healthy school environment. © 2012, American School Health Association.
Ranjith, G; Parvathy, R; Vikas, V; Chandrasekharan, Kesavadas; Nair, Suresh
2015-04-01
With the advent of new imaging modalities, radiologists are faced with handling increasing volumes of data for diagnosis and treatment planning. The use of automated and intelligent systems is becoming essential in such a scenario. Machine learning, a branch of artificial intelligence, is increasingly being used in medical image analysis applications such as image segmentation, registration, and computer-aided diagnosis and detection. Histopathological analysis is currently the gold standard for classification of brain tumors. The use of machine learning algorithms along with extraction of relevant features from magnetic resonance imaging (MRI) holds promise of replacing conventional invasive methods of tumor classification. The aim of the study was to classify gliomas into benign and malignant types using MRI data. Retrospective data from 28 patients diagnosed with glioma were used for the analysis. WHO Grade II (low-grade astrocytoma) was classified as benign, while Grade III (anaplastic astrocytoma) and Grade IV (glioblastoma multiforme) were classified as malignant. Features were extracted from MR spectroscopy. The classification was done using four machine learning algorithms: multilayer perceptrons, support vector machine, random forest, and locally weighted learning. Three of the four machine learning algorithms gave an area under the ROC curve in excess of 0.80. Random forest gave the best performance in terms of AUC (0.911), while sensitivity was best for locally weighted learning (86.1%). The performance of different machine learning algorithms in the classification of gliomas is promising. An even better performance may be expected by integrating features extracted from other MR sequences. © The Author(s) 2015 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Jones, Charles W.
1981-04-07
A machine for pressing loose powder into pellets using a series of reciprocating motions has an interchangeable punch and die as its only accurately machined parts. The machine reciprocates horizontally between powder-receiving and pressing positions. It reciprocates vertically to press, strip, and release a pellet.
Using machine learning for sequence-level automated MRI protocol selection in neuroradiology.
Brown, Andrew D; Marotta, Thomas R
2018-05-01
Incorrect imaging protocol selection can lead to important clinical findings being missed, contributing to both wasted health care resources and patient harm. We present a machine learning method for analyzing the unstructured text of clinical indications and patient demographics from magnetic resonance imaging (MRI) orders to automatically protocol MRI procedures at the sequence level. We compared 3 machine learning models - support vector machine, gradient boosting machine, and random forest - to a baseline model that predicted the most common protocol for all observations in our test set. The gradient boosting machine model significantly outperformed the baseline and demonstrated the best performance of the 3 models in terms of accuracy (95%), precision (86%), recall (80%), and Hamming loss (0.0487). This demonstrates the feasibility of automating sequence selection by applying machine learning to MRI orders. Automated sequence selection has important safety, quality, and financial implications and may facilitate improvements in the quality and safety of medical imaging service delivery.
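A minimal sketch of such a text-to-protocol pipeline: TF-IDF features from the free-text indication feeding a gradient boosting classifier, with Hamming loss among the reported metrics. The order texts and protocol labels are invented examples, not the study's data.

```python
# Sketch: free-text MRI orders -> TF-IDF -> gradient boosting -> protocol label.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import hamming_loss

orders = ["headache rule out mass", "seizure new onset", "ms follow up",
          "pituitary lesion", "chronic headache", "demyelination follow up"]
protocols = ["brain", "brain", "ms", "pituitary", "brain", "ms"]   # invented labels

clf = make_pipeline(TfidfVectorizer(), GradientBoostingClassifier(random_state=0))
clf.fit(orders, protocols)
pred = clf.predict(["new seizure disorder"])
print(pred, hamming_loss(["brain"], pred))   # Hamming loss on a single prediction
```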
Classification of large-sized hyperspectral imagery using fast machine learning algorithms
NASA Astrophysics Data System (ADS)
Xia, Junshi; Yokoya, Naoto; Iwasaki, Akira
2017-07-01
We present a framework of fast machine learning algorithms in the context of large-sized hyperspectral image classification, from a theoretical to a practical viewpoint. In particular, we assess the performance of random forest (RF), rotation forest (RoF), and extreme learning machine (ELM), and the ensembles of RF and ELM. These classifiers are applied to two large-sized hyperspectral images and compared to support vector machines. To provide a quantitative analysis, we pay attention to comparing these methods when working with high input dimensions and a limited/sufficient training set. Moreover, other important issues such as computational cost and robustness against noise are also discussed.
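The speed of the extreme learning machine comes from its structure: a fixed random hidden layer followed by a closed-form least-squares readout. A compact sketch, with placeholder data and an arbitrary ridge penalty:

```python
# Sketch of an extreme learning machine: random hidden layer, ridge readout.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))                 # placeholder spectral features
y = (X[:, 0] + X[:, 1] > 0).astype(float)     # toy binary labels

W = rng.normal(size=(30, 100))                 # random input weights (never trained)
b = rng.normal(size=100)                       # random biases
H = np.tanh(X @ W + b)                         # hidden-layer activations

# readout solved in closed form: (H'H + lambda*I) beta = H'y
beta = np.linalg.solve(H.T @ H + 1e-2 * np.eye(100), H.T @ y)
pred = np.tanh(X @ W + b) @ beta > 0.5
print("train accuracy:", (pred == y.astype(bool)).mean())
```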
NASA Astrophysics Data System (ADS)
Wu, Mingtao; Guo, Bing; Zhao, Qingliang; Fan, Rongwei; Dong, Zhiwei; Yu, Xin
2018-06-01
Micro-structured surfaces on diamond are widely used in microelectronics, optical elements, MEMS and NEMS components, ultra-precision machining tools, etc. The efficient micro-structuring of diamond material is still a challenging task. In this article, the influence of the focus position on laser machining and laser micro-structuring of a monocrystalline diamond surface was researched. First, the ablation threshold and its incubation effect for monocrystalline diamond were determined and discussed. As the accumulated laser pulses ranged from 40 to 5000, the laser ablation threshold decreased from 1.48 J/cm2 to 0.97 J/cm2. Subsequently, the variation of the ablation width and ablation depth in laser machining was studied. With enough pulse energy, the ablation width depended mainly on the laser propagation attributes, while the ablation depth was a complex function of the focus position. Raman analysis was used to detect changes in the laser-machined diamond surface after the laser machining experiments. Graphite formation was discovered on the machined diamond surface, and graphitization was enhanced once the defocusing quantity exceeded 45 μm. Finally, several micro-structured surfaces were successfully fabricated on the diamond surface with the defined micro-structure patterns and structuring ratios just by adjusting the defocusing quantity. The experimental structuring ratio was consistent with the theoretical analysis.
2018-01-01
Background Many studies have tried to develop predictors for return-to-work (RTW). However, since complex factors have been demonstrated to predict RTW, it is difficult to use them practically. This study investigated whether factors used in previous studies could predict whether an individual had returned to his/her original work by four years after termination of the worker's recovery period. Methods An initial logistic regression analysis of 1,567 participants of the fourth Panel Study of Worker's Compensation Insurance yielded odds ratios. The participants were divided into two subsets, a training dataset and a test dataset. Using the training dataset, logistic regression, decision tree, random forest, and support vector machine models were established, and important variables of each model were identified. The predictive abilities of the different models were compared. Results The analysis showed that only earned income and company-related factors significantly affected return-to-original-work (RTOW). The random forest model showed the best accuracy among the tested machine learning models; however, the difference was not prominent. Conclusion It is possible to predict a worker's probability of RTOW using machine learning techniques with moderate accuracy. PMID:29736160
NASA Astrophysics Data System (ADS)
Tang, Jie; Liu, Rong; Zhang, Yue-Li; Liu, Mou-Ze; Hu, Yong-Fang; Shao, Ming-Jie; Zhu, Li-Jun; Xin, Hua-Wen; Feng, Gui-Wen; Shang, Wen-Jun; Meng, Xiang-Guang; Zhang, Li-Rong; Ming, Ying-Zi; Zhang, Wei
2017-02-01
Tacrolimus has a narrow therapeutic window and considerable variability in clinical use. Our goal was to compare the performance of multiple linear regression (MLR) and eight machine learning techniques in pharmacogenetic algorithm-based prediction of tacrolimus stable dose (TSD) in a large Chinese cohort. A total of 1,045 renal transplant patients were recruited, 80% of which were randomly selected as the “derivation cohort” to develop dose-prediction algorithm, while the remaining 20% constituted the “validation cohort” to test the final selected algorithm. MLR, artificial neural network (ANN), regression tree (RT), multivariate adaptive regression splines (MARS), boosted regression tree (BRT), support vector regression (SVR), random forest regression (RFR), lasso regression (LAR) and Bayesian additive regression trees (BART) were applied and their performances were compared in this work. Among all the machine learning models, RT performed best in both derivation [0.71 (0.67-0.76)] and validation cohorts [0.73 (0.63-0.82)]. In addition, the ideal rate of RT was 4% higher than that of MLR. To our knowledge, this is the first study to use machine learning models to predict TSD, which will further facilitate personalized medicine in tacrolimus administration in the future.
A machine learning system to improve heart failure patient assistance.
Guidi, Gabriele; Pettenati, Maria Chiara; Melillo, Paolo; Iadanza, Ernesto
2014-11-01
In this paper, we present a clinical decision support system (CDSS) for the analysis of heart failure (HF) patients, providing various outputs such as an HF severity evaluation, HF-type prediction, as well as a management interface that compares the different patients' follow-ups. The whole system is composed of an intelligent core and an HF special-purpose management tool, which also acts as an interface for artificial intelligence training and use. To implement the intelligent functions, we adopted a machine learning approach. In this paper, we compare the performance of a neural network (NN), a support vector machine, a system with genetically produced fuzzy rules, and a classification and regression tree and its direct evolution, the random forest, in analyzing our database. The best performance in both the HF severity evaluation and HF-type prediction functions is obtained using the random forest algorithm. The management tool allows the cardiologist to populate a "supervised database" suitable for machine learning during his or her regular outpatient consultations. The idea comes from the fact that there are few databases of this type in the literature, and they are not scalable to our case.
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1984-01-01
A detailed description of the machine-readable astronomical catalog as it is currently being distributed from the Astronomical Data Center is given. Stellar motions and positions are listed herein in tabular form.
Control system for, and a method of, heating an operator station of a work machine
Baker, Thomas M.; Hoff, Brian D.; Akasam, Sivaprasad
2005-04-05
There are situations in which an operator remains in an operator station of a work machine when an engine of the work machine is inactive. The present invention includes a control system for, and a method of, heating the operator station when the engine is inactive. A heating system of the work machine includes an electrically-powered coolant pump, a power source, and at least one piece of warmed machinery. An operator heat controller is moveable between a first and a second position, and is operable to connect the electrically-powered coolant pump to the power source when the engine is inactive and the operator heat controller is in the first position. Thus, by deactivating the engine and then moving the operator heat controller to the first position, the operator may supply electrical energy to the electrically-powered coolant pump, which is operably coupled to heat the operator station.
ERIC Educational Resources Information Center
Golino, Hudson F.; Gomes, Cristiano M. A.
2016-01-01
This paper presents a non-parametric imputation technique, named random forest, from the machine learning field. The random forest procedure has two main tuning parameters: the number of trees grown in the prediction and the number of predictors used. Fifty experimental conditions were created in the imputation procedure, with different…
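One common way to realize random-forest imputation is scikit-learn's iterative imputer with a random-forest estimator; whether this matches the paper's exact procedure is an assumption, and the data and missingness pattern below are synthetic.

```python
# Sketch: iterative imputation with a random-forest estimator.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[rng.random(X.shape) < 0.1] = np.nan        # ~10% missing at random

imp = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=100, random_state=0),
    max_iter=5, random_state=0)              # each feature regressed on the others
X_filled = imp.fit_transform(X)
print(np.isnan(X_filled).sum())              # 0: all gaps imputed
```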
Manual actuator. [for spacecraft exercising machines
NASA Technical Reports Server (NTRS)
Gause, R. L.; Glenn, C. G. (Inventor)
1974-01-01
An actuator for an exercising machine employable by a crewman aboard a manned spacecraft is presented. The actuator is characterized by a force delivery arm projected from a rotary input shaft of an exercising machine and having a force input handle extended orthogonally from its distal end. The handle includes a hand-grip configured to be received within the palm of the crewman's hand and a grid pivotally supported for angular displacement between a first position, wherein the grid is disposed in overlying juxtaposition with the hand-grip, and a second position, angularly displaced from the first, for affording access to the hand-grip, and a latching mechanism fixed to the sole of a shoe worn by the crewman for latching the shoe to the grid when the grid is in the first position.
Xing, Haifeng; Hou, Bo; Lin, Zhihui; Guo, Meifeng
2017-10-13
MEMS (Micro Electro Mechanical System) gyroscopes have been widely applied in various fields, but MEMS gyroscope random drift has nonlinear and non-stationary characteristics. Modeling and compensating the random drift has attracted much attention because it can improve the precision of inertial devices. This paper proposes using wavelet filtering to reduce noise in the original MEMS gyroscope data, then reconstructing the random drift data with PSR (phase space reconstruction), and establishing a model for the reconstructed data by LSSVM (least squares support vector machine), whose parameters were optimized using CPSO (chaotic particle swarm optimization). Comparing the modeling of MEMS gyroscope random drift by BP-ANN (back propagation artificial neural network) and by the proposed method, the results showed that the latter had better prediction accuracy. After compensation of three groups of MEMS gyroscope random drift data, the standard deviations of the three groups of experimental data dropped from 0.00354°/s, 0.00412°/s, and 0.00328°/s to 0.00065°/s, 0.00072°/s, and 0.00061°/s, respectively, which demonstrates that the proposed method can reduce the influence of MEMS gyroscope random drift and verifies its effectiveness for modeling MEMS gyroscope random drift.
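The phase space reconstruction step can be illustrated with a simple time-delay embedding that turns the 1-D drift series into feature vectors for a downstream regressor such as the LSSVM; the delay, embedding dimension, and toy signal are illustrative choices.

```python
# Sketch: time-delay embedding (phase space reconstruction) of a 1-D series.
import numpy as np

def embed(x, dim=3, tau=2):
    """Stack delayed copies: row i = [x[i], x[i+tau], ..., x[i+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

drift = np.cumsum(np.random.default_rng(0).normal(size=1000))  # toy drift signal
X_full = embed(drift, dim=3, tau=2)
X, y = X_full[:-1], drift[(3 - 1) * 2 + 1 :]   # predict the next sample
print(X.shape, y.shape)                         # (995, 3) (995,)
```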
Predictors of return rate discrimination in slot machine play.
Coates, Ewan; Blaszczynski, Alex
2014-09-01
The purpose of this study was to investigate the extent to which accurate estimates of payback percentages and volatility combined with prior learning, enabled players to successfully discriminate between multi-line/multi-credit slot machines that provided differing rates of reinforcement. The aim was to determine if the capacity to discriminate structural characteristics of gaming machines influenced player choices in selecting 'favourite' slot machines. Slot machine gambling history, gambling beliefs and knowledge, impulsivity, illusions of control, and problem solving style were assessed in a sample of 48 first year undergraduate psychology students. Participants were subsequently exposed to a choice paradigm where they could freely select to play either of two concurrently presented PC-simulated slot machines programmed to randomly differ in expected player return rates (payback percentage) and win frequency (volatility). Results suggest that prior learning and cognitions (particularly gambler's fallacy) but not payback, were major contributors to the ability of a player to discriminate volatility between slot machines. Participants displayed a general tendency to discriminate payback, but counter-intuitively placed more bets on the slot machine with lower payback percentage rates.
Field precision machining technology of target chamber in ICF lasers
NASA Astrophysics Data System (ADS)
Xu, Yuanli; Wu, Wenkai; Shi, Sucun; Duan, Lin; Chen, Gang; Wang, Baoxu; Song, Yugang; Liu, Huilin; Zhu, Mingzhi
2016-10-01
In ICF lasers, many independent laser beams are required to be positioned on target with a very high degree of accuracy during a shot. The target chamber provides a precision platform and datum reference for the final optics assembly and the target collimation and location system. The target chamber consists of a shell with welded flanges, a reinforced concrete pedestal, and a lateral support structure. The field precision machining technology of the target chamber in ICF lasers has been developed based on ShenGuangIII (SGIII). The same center of the target chamber is adopted in the processes of design, fabrication, and alignment. The technologies of beam collimation and datum reference transformation were developed for the fabrication, positioning, and adjustment of the target chamber. A supporting and rotating mechanism and a special drilling machine were developed to bore the holes of the ports. An adjustment mechanism was designed to accurately position the target chamber. In order to ensure the collimation requirements of beam leading and focusing and of target positioning, custom-machined spacers are used to accurately correct the alignment error of the ports. Finally, this paper describes the chamber center, orientation, and centering alignment error measurements of SGIII. The measurements show that the field precision machining of the SGIII target chamber meets its design requirements. This information can be used on similar systems.
Foods Sold in School Vending Machines are Associated with Overall Student Dietary Intake
Rovner, Alisha J.; Nansel, Tonja R.; Wang, Jing; Iannotti, Ronald J.
2010-01-01
Purpose To examine the association between foods sold in school vending machines and students' dietary behaviors. Methods The 2005-2006 US Health Behavior in School Aged Children (HBSC) survey was administered to 6th to 10th graders and school administrators. Students' dietary intake was estimated with a brief food frequency measure. Administrators completed questions about foods sold in vending machines. For each food intake behavior, a multilevel regression analysis modeled students (level 1) nested within schools (level 2), with the corresponding food sold in vending machines as the main predictor. Control variables included gender, grade, family affluence, and school poverty. Analyses were conducted separately for 6th to 8th and 9th to 10th grades. Results Eighty-three percent of schools (152 schools, 5,930 students) had vending machines which primarily sold foods of minimal nutritional value (soft drinks, chips, and sweets). In younger grades, availability of fruits/vegetables and chocolate/sweets was positively related to the corresponding food intake, with vending machine content and school poverty explaining 70.6% of between-school variation in fruit/vegetable consumption and 71.7% in sweets consumption. In older grades, there was no significant effect of foods available in vending machines on reported consumption of those foods. Conclusions Vending machines are widely available in US public schools. In younger grades, school vending machines were related to students' diets positively or negatively, depending on what was sold in them. Schools are in a powerful position to influence children's diets; therefore attention to the foods sold in them is necessary in order to improve children's diets. PMID:21185519
Food sold in school vending machines is associated with overall student dietary intake.
Rovner, Alisha J; Nansel, Tonja R; Wang, Jing; Iannotti, Ronald J
2011-01-01
To examine the association between food sold in school vending machines and the dietary behaviors of students. The 2005-2006 U.S. Health Behavior in School-aged Children survey was administered to 6th to 10th graders and school administrators. Dietary intake in students was estimated with a brief food frequency measure. School administrators completed questions regarding food sold in vending machines. For each food intake behavior, a multilevel regression analysis modeled students (level 1) nested within schools (level 2), with the corresponding food sold in vending machines as the main predictor. Control variables included gender, grade, family affluence, and school poverty index. Analyses were conducted separately for 6th to 8th and 9th to 10th grades. In all, 83% of the schools (152 schools; 5,930 students) had vending machines that primarily sold food of minimal nutritional value (soft drinks, chips, and sweets). In younger grades, availability of fruit and/or vegetables and chocolate and/or sweets was positively related to the corresponding food intake, with vending machine content and school poverty index providing an explanation for 70.6% of between-school variation in fruit and/or vegetable consumption and 71.7% in sweets consumption. Among the older grades, there was no significant effect of food available in vending machines on reported consumption of those foods. Vending machines are widely available in public schools in the United States. In younger grades, school vending machines were either positively or negatively related to the diets of the students, depending on what was sold in them. Schools are in a powerful position to influence the diets of children; therefore, attention to the food sold at school is necessary to try to improve their diets. Copyright © 2011 Society for Adolescent Health and Medicine. All rights reserved.
Barkman, William E.; Dow, Thomas A.; Garrard, Kenneth P.; Marston, Zachary
2016-07-12
Systems and methods for performing on-machine measurements and automatic part alignment, including: a measurement component operable for determining the position of a part on a machine; and an actuation component operable for adjusting the position of the part by contacting the part with a predetermined force responsive to the determined position of the part. The measurement component consists of a transducer. The actuation component consists of a linear actuator. Optionally, the measurement component and the actuation component consist of a single linear actuator operable for contacting the part with a first lighter force for determining the position of the part and with a second harder force for adjusting the position of the part. The actuation component is utilized in a substantially horizontal configuration and the effects of gravitational drop of the part are accounted for in the force applied and the timing of the contact.
USDA-ARS?s Scientific Manuscript database
Palmer amaranth (Amaranthus palmeri S. Wats.) invasion negatively impacts cotton (Gossypium hirsutum L.) production systems throughout the United States. The objective of this study was to evaluate canopy hyperspectral narrowband data as input into the random forest machine learning algorithm to dis...
Fernandes, Henrique; Zhang, Hai; Figueiredo, Alisson; Malheiros, Fernando; Ignacio, Luis Henrique; Sfarra, Stefano; Ibarra-Castanedo, Clemente; Guimaraes, Gilmar; Maldague, Xavier
2018-01-19
The use of fiber reinforced materials such as randomly-oriented strands has grown in recent years, especially for manufacturing of aerospace composite structures. This growth is mainly due to their advantageous properties: they are lighter and more resistant to corrosion when compared to metals and are more easily shaped than continuous fiber composites. The resistance and stiffness of these materials are directly related to their fiber orientation. Thus, efficient approaches to assess their fiber orientation are in demand. In this paper, a non-destructive evaluation method is applied to assess the fiber orientation on laminates reinforced with randomly-oriented strands. More specifically, a method called pulsed thermal ellipsometry combined with an artificial neural network, a machine learning technique, is used in order to estimate the fiber orientation on the surface of inspected parts. Results showed that the method can be potentially used to inspect large areas with good accuracy and speed.
Gram staining with an automatic machine.
Felek, S; Arslan, A
1999-01-01
This study was undertaken to develop a new Gram-staining machine controlled by a micro-controller and to investigate the quality of slides stained in the machine. The machine was designed and produced by the authors. It uses standard 220 V AC. Staining, washing, and drying periods are controlled by a timer built into the micro-controller. Software was written that contains a certain algorithm and time intervals for the staining mode. One hundred and forty smears were prepared from Escherichia coli, Staphylococcus aureus, Neisseria sp., blood culture, trypticase soy broth, and direct pus and sputum smears for comparison studies. Half of the slides in each group were stained with the machine, the other half by hand, and then examined by four different microbiologists. Machine-stained slides had higher clarity and less debris than the hand-stained slides (p < 0.05). In hand-stained slides, some Gram-positive organisms showed poor Gram-positive staining features (p < 0.05). In conclusion, we suggest that Gram staining with the automatic machine increases staining quality and helps to decrease the workload in a busy diagnostic laboratory.
Adaptation response of Arabidopsis thaliana to random positioning
NASA Astrophysics Data System (ADS)
Kittang, A.-I.; Winge, P.; van Loon, J. J. W. A.; Bones, A. M.; Iversen, T.-H.
2013-10-01
Arabidopsis thaliana seedlings were exposed on a Random Positioning Machine (RPM) under light conditions for 16 h and the samples were analysed using microarray techniques as part of the preparation for a space experiment on the International Space Station (ISS). The results demonstrated a moderate to low regulation of 55 genes (<0.2% of the analysed genes). Genes encoding proteins associated with the chaperone system (e.g. heat shock proteins, HSPs) and enzymes of the flavonoid biosynthesis pathway were induced. Most of the repressed genes were associated with light and sugar responses. Significant up-regulation of selected HSP genes was found by quantitative real-time PCR in one-week-old plants after RPM exposure, both in light and in darkness. Fluorescence microscopy revealed stronger DPBA (diphenylboric acid 2-amino-ethyl ester) staining in the whole root and in the root elongation zone of seedlings exposed on the RPM, indicating higher flavonoid content. The regulated genes and the increase in flavonoids are related to several stresses, but increased occurrence of HSPs and flavonoids is also representative of normal growth (e.g. gravitropism). The response could be a direct stress response or an integrated response of the two signalling pathways of light and gravity, resulting in an overall light response.
Development of 300 mesh Soy Bean Crusher for Tofu Material Processing
NASA Astrophysics Data System (ADS)
Lee, E. S.; Pratama, P. S.; Supeno, D.; Jeong, S. W.; Byun, J. Y.; Woo, J. H.; Park, C. S.; Choi, W. S.
2018-03-01
A machine such as a bean crusher is subjected to various loads and vibrations. These vibrations cause deformations that can adversely affect the performance of the machine. This paper proposes a vibration analysis of a bean crusher machine using ANSYS. The effect of vibration on the structure was studied with finite element analysis in order to ensure safety. This research supports the machine designer in creating a better product at lower cost and with faster development time. First, a CAD model was prepared using Inventor. Second, the analysis was carried out using ANSYS 15, comprising a modal analysis and a random vibration analysis of the structure. The analysis shows that the proposed design exhibits minimal deformation when vibration is applied under normal operating conditions.
Machine learning algorithms for the creation of clinical healthcare enterprise systems
NASA Astrophysics Data System (ADS)
Mandal, Indrajit
2017-10-01
Clinical recommender systems are increasingly becoming popular for improving modern healthcare systems. Enterprise systems are persuasively used for creating effective nurse care plans to provide nurse training, clinical recommendations and clinical quality control. A novel design of a reliable clinical recommender system based on a multiple classifier system (MCS) is implemented. A hybrid machine learning (ML) ensemble based on the random subspace method and random forest is presented. The performance accuracy and robustness of the proposed enterprise architecture are quantitatively estimated to be above 99% and 97%, respectively (above the 95% confidence interval). The study then extends to an experimental analysis of the clinical recommender system in a noisy data environment. The ranking of items in the nurse care plan is demonstrated using machine learning algorithms (MLAs) to overcome the drawback of the traditional association rule method. The promising experimental results are compared against state-of-the-art approaches to highlight the advancement in recommendation technology. The proposed recommender system is experimentally validated using five benchmark clinical datasets to reinforce the research findings.
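A hybrid ensemble of the random subspace method and a random forest, the combination named in the abstract, can be sketched with scikit-learn. This is a generic illustration rather than the paper's implementation: the synthetic data, parameter values and soft-voting combination are assumptions, and the `estimator=` keyword assumes scikit-learn >= 1.2 (older versions use `base_estimator=`).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=30, random_state=0)

# Random subspace method: every tree sees a random half of the features,
# but all of the samples (bootstrap=False).
subspace = BaggingClassifier(estimator=DecisionTreeClassifier(),
                             n_estimators=100, max_features=0.5,
                             bootstrap=False, random_state=0)

# Combine the two component learners by soft (probability) voting.
ensemble = VotingClassifier(
    estimators=[("subspace", subspace),
                ("rf", RandomForestClassifier(n_estimators=100,
                                              random_state=0))],
    voting="soft")

print(cross_val_score(ensemble, X, y, cv=5).mean())
```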
NASA Astrophysics Data System (ADS)
Kwintarini, Widiyanti; Wibowo, Agung; Arthaya, Bagus M.; Yuwana Martawirya, Yatna
2018-03-01
The purpose of this study was to improve the accuracy of three-axis vertical CNC milling machines through a general approach based on mathematical modeling of machine tool geometric errors. Geometric errors are an important source of CNC machine inaccuracy, arising both during the manufacturing process and during the assembly phase, and controlling them is a prerequisite for building high-accuracy machines. The accuracy of the three-axis vertical milling machine is improved by identifying the geometric errors and their positional parameters in the machine tool and arranging them in a mathematical model. The geometric error of the machine tool consists of twenty-one parameters: nine linear error parameters, nine angular error parameters and three squareness error parameters. The mathematical model combines the alignment and angular errors of the components supporting the machine motion, namely the linear guideways and linear drives. The purpose of this modeling approach is the identification of geometric errors, which serves as a reference during the design, assembly and maintenance stages to improve the accuracy of CNC machines. Mathematically modeling the geometric errors of a CNC machine tool illustrates the relationship between the alignment, position and angular errors on the linear guideways of a three-axis vertical milling machine.
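The small-angle homogeneous-transform formalism commonly used for such 21-parameter models can be sketched as follows. This is a generic illustration under the usual rigid-body and small-angle assumptions, not the authors' specific model: all numeric error values are invented, and the three squareness errors would be folded into the angular terms of the downstream axes.

```python
import numpy as np

def error_transform(dx, dy, dz, ex, ey, ez):
    """Small-angle homogeneous transform for one axis: three linear
    errors (dx, dy, dz, in mm) and three angular errors (ex, ey, ez,
    in rad)."""
    return np.array([
        [1.0, -ez,  ey, dx],
        [ ez, 1.0, -ex, dy],
        [-ey,  ex, 1.0, dz],
        [0.0, 0.0, 0.0, 1.0]])

# Tool-tip error: compose per-axis error transforms (X, then Y, then Z)
# and compare the perturbed position with the nominal one.
E = (error_transform(5e-3, 1e-3, 2e-3, 10e-6, 20e-6, 30e-6) @
     error_transform(1e-3, 4e-3, 1e-3, 20e-6, 10e-6, 20e-6) @
     error_transform(2e-3, 2e-3, 6e-3, 30e-6, 10e-6, 10e-6))

p_nominal = np.array([100.0, 50.0, 20.0, 1.0])   # position in mm
print((E @ p_nominal - p_nominal)[:3])           # volumetric error in mm
```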
Slot Machines: Pursuing Responsible Gaming Practices for Virtual Reels and Near Misses
ERIC Educational Resources Information Center
Harrigan, Kevin A.
2009-01-01
Since 1983, slot machines in North America have used a computer and virtual reels to determine the odds. Since at least 1988, a technique called clustering has been used to create a high number of near misses, failures that are close to wins. The result is that what the player sees does not represent the underlying probabilities and randomness,…
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1983-01-01
A description of the machine-readable catalog, including detailed format and tape file characteristics, is given. The machine file comprises computed mean values of position and magnitude at a mean epoch of observation for each unique star in the Oxford, Paris, Bordeaux, Toulouse and Northern Hemisphere Algiers zones. The format was changed to permit more efficient data searching by position, and additional duplicate entries were removed. The final catalog contains data for 997,311 stars.
Piette, Elizabeth R; Moore, Jason H
2018-01-01
Machine learning methods and conventions are increasingly employed for the analysis of large, complex biomedical data sets, including genome-wide association studies (GWAS). Reproducibility of machine learning analyses of GWAS can be hampered by biological and statistical factors, particularly so for the investigation of non-additive genetic interactions. Application of traditional cross validation to a GWAS data set may result in poor consistency between the training and testing data set splits due to an imbalance of the interaction genotypes relative to the data as a whole. We propose a new cross validation method, proportional instance cross validation (PICV), that preserves the original distribution of an independent variable when splitting the data set into training and testing partitions. We apply PICV to simulated GWAS data with epistatic interactions of varying minor allele frequencies and prevalences and compare performance to that of a traditional cross validation procedure in which individuals are randomly allocated to training and testing partitions. Sensitivity and positive predictive value are significantly improved across all tested scenarios for PICV compared to traditional cross validation. We also apply PICV to GWAS data from a study of primary open-angle glaucoma to investigate a previously-reported interaction, which fails to significantly replicate; PICV however improves the consistency of testing and training results. Application of traditional machine learning procedures to biomedical data may require modifications to better suit intrinsic characteristics of the data, such as the potential for highly imbalanced genotype distributions in the case of epistasis detection. The reproducibility of genetic interaction findings can be improved by considering this variable imbalance in cross validation implementation, such as with PICV. This approach may be extended to problems in other domains in which imbalanced variable distributions are a concern.
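Assuming PICV amounts to stratifying the train/test partition on the genotype variable itself rather than on the phenotype, a minimal sketch is possible with scikit-learn's StratifiedKFold; the genotype coding and class frequencies below are illustrative.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

def picv_splits(genotype, n_splits=5, seed=0):
    """Sketch of proportional instance CV: stratify the train/test split
    on the genotype variable itself (rather than on the phenotype), so
    each fold preserves the original genotype distribution."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    dummy_X = np.zeros((len(genotype), 1))
    yield from skf.split(dummy_X, genotype)

# Two-locus genotype combinations coded 0..8, with some rarer classes.
rng = np.random.default_rng(0)
genotype = rng.choice(9, size=300,
                      p=[.28, .2, .15, .1, .1, .05, .04, .04, .04])
for train_idx, test_idx in picv_splits(genotype):
    # Rare genotypes keep (roughly) their marginal frequency per fold.
    print(np.bincount(genotype[test_idx], minlength=9))
```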
Enhancement of Plant Metabolite Fingerprinting by Machine Learning
Scott, Ian M.; Vermeer, Cornelia P.; Liakata, Maria; Corol, Delia I.; Ward, Jane L.; Lin, Wanchang; Johnson, Helen E.; Whitehead, Lynne; Kular, Baldeep; Baker, John M.; Walsh, Sean; Dave, Anuja; Larson, Tony R.; Graham, Ian A.; Wang, Trevor L.; King, Ross D.; Draper, John; Beale, Michael H.
2010-01-01
Metabolite fingerprinting of Arabidopsis (Arabidopsis thaliana) mutants with known or predicted metabolic lesions was performed by 1H-nuclear magnetic resonance, Fourier transform infrared, and flow injection electrospray-mass spectrometry. Fingerprinting enabled processing of five times more plants than conventional chromatographic profiling and was competitive for discriminating mutants, other than those affected in only low-abundance metabolites. Despite their rapidity and complexity, fingerprints yielded metabolomic insights (e.g. that effects of single lesions were usually not confined to individual pathways). Among fingerprint techniques, 1H-nuclear magnetic resonance discriminated the most mutant phenotypes from the wild type and Fourier transform infrared discriminated the fewest. To maximize information from fingerprints, data analysis was crucial. One-third of distinctive phenotypes might have been overlooked had data models been confined to principal component analysis score plots. Among several methods tested, machine learning (ML) algorithms, namely support vector machine or random forest (RF) classifiers, were unsurpassed for phenotype discrimination. Support vector machines were often the best performing classifiers, but RFs yielded some particularly informative measures. First, RFs estimated margins between mutant phenotypes, whose relations could then be visualized by Sammon mapping or hierarchical clustering. Second, RFs provided importance scores for the features within fingerprints that discriminated mutants. These scores correlated with analysis of variance F values (as did Kruskal-Wallis tests, true- and false-positive measures, mutual information, and the Relief feature selection algorithm). ML classifiers, as models trained on one data set to predict another, were ideal for focused metabolomic queries, such as the distinctiveness and consistency of mutant phenotypes. Accessible software for use of ML in plant physiology is highlighted. PMID:20566707
Ozmutlu, H. Cenk
2014-01-01
We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job-sequence- and machine-dependent setup times and with the job splitting property. The first contribution of this paper is to introduce novel algorithms which perform splitting and scheduling simultaneously with a variable number of subjobs. We proposed a simple chromosome structure constituted by random key numbers in the hybrid genetic-local search algorithm (GAspLA). Random key numbers are used frequently in genetic algorithms, but they create additional difficulty when hybrid local search factors are implemented. We developed algorithms that adapt the results of the local search into the genetic algorithm with a minimum of relocation operations on the genes' random key numbers. This is the second contribution of the paper. The third contribution is three new MIP models which perform splitting and scheduling simultaneously. The fourth contribution is the implementation of GAspLAMIP, which lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature, and the results validate the effectiveness of the proposed algorithms. PMID:24977204
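A common random-key decoding of the kind the abstract relies on, Bean's classic scheme where the integer part of each gene selects the machine and the fractional part orders the jobs, can be sketched as below. This is not necessarily the exact GAspLA chromosome; the job and machine counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
n_jobs, n_machines = 8, 3

# Random-key chromosome: one real-valued gene per job.  The integer part
# encodes the machine, the fractional part the position in its sequence.
chromosome = rng.uniform(0, n_machines, size=n_jobs)

def decode(chromosome):
    schedule = {m: [] for m in range(n_machines)}
    for job in np.argsort(chromosome % 1.0):        # order by fractional part
        schedule[int(chromosome[job])].append(job)  # assign by integer part
    return schedule

print(decode(chromosome))
```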
Casimir rack and pinion as a miniaturized kinetic energy harvester
NASA Astrophysics Data System (ADS)
Miri, MirFaez; Etesami, Zahra
2016-08-01
We study a nanoscale machine composed of a rack and a pinion with no contact, but intermeshed via the lateral Casimir force. We adopt a simple model for the random velocity of the rack subject to external random forces, namely, a dichotomous noise with zero mean value. We show that the pinion, even when it experiences random thermal torque, can do work against a load. The device thus converts the kinetic energy of the random motions of the rack into useful work.
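Zero-mean dichotomous (telegraph) noise of the kind adopted here for the rack velocity is easy to simulate; the amplitude and mean switching rate below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 1e-3, 100_000
v0, rate = 1.0, 50.0        # amplitude and mean switching rate (assumed)

# Telegraph process: the velocity flips between +v0 and -v0 at
# exponentially distributed times, giving a zero-mean random drive.
v = np.empty(n_steps)
state = v0
for i in range(n_steps):
    if rng.random() < rate * dt:  # switching probability in one time step
        state = -state
    v[i] = state

print(v.mean())  # close to 0: the drive has zero mean value
```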
Time dependent variation of carrying capacity of prestressed precast beam
NASA Astrophysics Data System (ADS)
Le, Tuan D.; Konečný, Petr; Matečková, Pavlína
2018-04-01
The article deals with the evaluation of the time dependent carrying capacity of a precast concrete element. Variation of the resistance is an inherent property of laboratory as well as in-situ members. Specifying the highest possible resistance of a laboratory sample is therefore important when laboratory experiments are evaluated against the loading capacity of the test machine. The ultimate capacity is evaluated through the bending moment resistance of a simply supported prestressed concrete beam. A probabilistic assessment is applied, considering the scatter of the random variables of concrete compressive strength and effective height of the cross section. The Monte Carlo simulation technique is used to investigate the performance of the beam's cross section under changes of the tendons' positions and the compressive strength of concrete.
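A minimal Monte Carlo sketch of such an assessment, using a simplified rectangular stress block for the bending resistance, is shown below. The distribution parameters and section dimensions are invented for illustration; they are not the authors' data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Scatter of the two random variables considered in the paper
# (illustrative distribution parameters, not the authors' values).
f_c = rng.normal(45e6, 5e6, n)      # concrete compressive strength [Pa]
d   = rng.normal(0.45, 0.01, n)     # effective depth of cross section [m]

A_p, f_p, b = 1.0e-3, 1.4e9, 0.30   # tendon area [m2], stress [Pa], width [m]

# Simplified rectangular stress block for the bending moment resistance.
x   = A_p * f_p / (0.8 * b * f_c)   # depth of the compression zone [m]
M_R = A_p * f_p * (d - 0.4 * x)     # bending moment resistance [Nm]

print(np.percentile(M_R / 1e3, [5, 50, 95]))  # resistance quantiles [kNm]
```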
A Real-Time Tool Positioning Sensor for Machine-Tools
Ruiz, Antonio Ramon Jimenez; Rosas, Jorge Guevara; Granja, Fernando Seco; Honorato, Jose Carlos Prieto; Taboada, Jose Juan Esteve; Serrano, Vicente Mico; Jimenez, Teresa Molina
2009-01-01
In machining, natural oscillations and elastic, gravitational or temperature deformations are still a problem for guaranteeing the quality of fabricated parts. In this paper we present an optical measurement system designed to track and localize in 3D a reference retro-reflector close to the machine tool's drill. The complete system and its components are described in detail. Several tests, some static (including impacts and rotations) and others dynamic (executing linear and circular trajectories), were performed on two different machine tools. For the first time, a laser tracking system has been integrated into the position control loop of a machine tool. Results indicate that oscillations and deformations close to the tool can be estimated with micrometric resolution and a bandwidth from 0 to more than 100 Hz. This sensor therefore opens the possibility of on-line compensation of oscillations and deformations. PMID:22408472
Stochastic scheduling on a repairable manufacturing system
NASA Astrophysics Data System (ADS)
Li, Wei; Cao, Jinhua
1995-08-01
In this paper, we consider some stochastic scheduling problems with a set of stochastic jobs on a manufacturing system with a single machine that is subject to multiple breakdowns and repairs. When the machine fails while processing a job, the job processing must restart some time later, once the machine is repaired. For this typical manufacturing system, we find the optimal policies that minimize the following objective functions: (1) the weighted sum of the completion times; (2) the weighted number of late jobs having constant due dates; (3) the weighted number of late jobs having exponentially distributed random due dates. These results generalize some previous ones.
Chen, Gongbo; Li, Shanshan; Knibbs, Luke D; Hamm, N A S; Cao, Wei; Li, Tiantian; Guo, Jianping; Ren, Hongyan; Abramson, Michael J; Guo, Yuming
2018-09-15
Machine learning algorithms have very high predictive ability. However, no study has used machine learning to estimate historical concentrations of PM2.5 (particulate matter with aerodynamic diameter ≤ 2.5 μm) at a daily time scale in China at a national level. To estimate daily concentrations of PM2.5 across China during 2005-2016. Daily ground-level PM2.5 data were obtained from 1479 stations across China during 2014-2016. Data on aerosol optical depth (AOD), meteorological conditions and other predictors were downloaded. A random forests model (a non-parametric machine learning algorithm) and two traditional regression models were developed to estimate ground-level PM2.5 concentrations. The best-fit model was then utilized to estimate the daily concentrations of PM2.5 across China at a resolution of 0.1° (≈10 km) during 2005-2016. The daily random forests model showed much higher predictive accuracy than the two traditional regression models, explaining the majority of spatial variability in daily PM2.5 [10-fold cross-validation (CV) R² = 83%, root mean squared prediction error (RMSE) = 28.1 μg/m³]. At the monthly and annual time scales, the explained variability of average PM2.5 increased up to 86% (RMSE = 10.7 μg/m³ and 6.9 μg/m³, respectively). Taking advantage of a novel modeling framework and the most recent ground-level PM2.5 observations, the machine learning method showed higher predictive ability than previous studies. The random forests approach can be used to estimate historical exposure to PM2.5 in China with high accuracy. Copyright © 2018 Elsevier B.V. All rights reserved.
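The core of such a study, a random forest regression assessed by 10-fold cross-validation, can be sketched with scikit-learn. The synthetic predictors below merely stand in for AOD and the meteorological fields; the R² and RMSE computations mirror the metrics quoted in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

# Synthetic stand-ins for the predictors (AOD, meteorology, ...) and PM2.5.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 10))
y = 30 + 20 * X[:, 0] - 5 * X[:, 1] ** 2 + rng.normal(0, 5, 5000)

rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
pred = cross_val_predict(rf, X, y, cv=10)   # 10-fold CV, as in the study

ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("CV R2 :", 1 - ss_res / ss_tot)
print("RMSE  :", np.sqrt(ss_res / len(y)))
```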
Flexible drive allows blind machining and welding in hard-to-reach areas
NASA Technical Reports Server (NTRS)
Harvey, D. E.; Rohrberg, R. G.
1966-01-01
Flexible power and control unit performs welding and machining operations in confined areas. A machine/weld head is connected to the unit by a flexible transmission shaft, and a locking-indexing collar is incorporated onto the head to allow it to be placed and held in position.
30 CFR 70.207 - Bimonthly sampling; mechanized mining units.
Code of Federal Regulations, 2011 CFR
2011-07-01
... air will be used to determine the average concentration for that mechanized mining unit. (e) Unless... sampling device as follows: (1) Conventional section using cutting machine. On the cutting machine operator or on the cutting machine within 36 inches inby the normal working position; (2) Conventional section...
14 CFR 382.3 - What do the terms in this rule mean?
Code of Federal Regulations, 2014 CFR
2014-01-01
... devices and medications. Automated airport kiosk means a self-service transaction machine that a carrier... machine means a continuous positive airway pressure machine. Department or DOT means the United States..., emotional or mental illness, and specific learning disabilities. The term physical or mental impairment...
Kevlar: Transitioning Helix for Research to Practice
2016-03-01
entropy randomization techniques, automated program repairs leveraging highly-optimized virtual machine technology, and developing a novel framework...attacker from exploiting residual vulnerabilities in a wide variety of classes. Helix/Kevlar uses novel, fine-grained, high-entropy diversification...the Air Force, and IARPA). Salient features of Helix/Kevlar include developing high-entropy randomization techniques, automated program repairs
Signature Verification Using N-tuple Learning Machine.
Maneechot, Thanin; Kitjaidure, Yuttana
2005-01-01
This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen tip position (x-y position), pen tip pressure, and pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
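An N-tuple (RAM-based) learning machine is simple enough to sketch from scratch. The version below assumes the pen features (x-y position, pressure, altitude angles) have already been binarized into fixed-length bit vectors; the tuple count and size are arbitrary choices, and the Gaussian thresholding step is omitted.

```python
import numpy as np

class NTupleClassifier:
    """Minimal N-tuple (RAM) classifier sketch for binary feature vectors."""

    def __init__(self, n_bits, n_tuples=50, tuple_size=8, seed=0):
        rng = np.random.default_rng(seed)
        # Each tuple reads a fixed random subset of bit positions.
        self.tuples = [rng.choice(n_bits, tuple_size, replace=False)
                       for _ in range(n_tuples)]
        self.memory = set()   # (class, tuple_index, address) triples seen

    def _addresses(self, x):
        for i, idx in enumerate(self.tuples):
            yield i, tuple(x[idx])

    def fit(self, X, y):
        for x, label in zip(X, y):
            for i, addr in self._addresses(x):
                self.memory.add((label, i, addr))
        self.classes = sorted(set(y))
        return self

    def predict(self, X):
        out = []
        for x in X:
            addrs = list(self._addresses(x))
            scores = {c: sum((c, i, a) in self.memory for i, a in addrs)
                      for c in self.classes}
            out.append(max(scores, key=scores.get))
        return out

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(100, 64))   # binarized signature features
y = ["genuine"] * 50 + ["forged"] * 50
clf = NTupleClassifier(n_bits=64).fit(X, y)
print(clf.predict(X[:3]))
```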
Ecological interactions and the Netflix problem.
Desjardins-Proulx, Philippe; Laigle, Idaline; Poisot, Timothée; Gravel, Dominique
2017-01-01
Species interactions are a key component of ecosystems but we generally have an incomplete picture of who-eats-who in a given community. Different techniques have been devised to predict species interactions using theoretical models or abundances. Here, we explore the K nearest neighbour (KNN) approach, with a special emphasis on recommendation, along with a supervised machine learning technique. Recommenders are algorithms developed for companies like Netflix to predict whether a customer will like a product given the preferences of similar customers. These machine learning techniques are well-suited to studying binary ecological interactions since they focus on positive-only data. By removing a prey from a predator, we find that recommenders can guess the missing prey around 50% of the time on the first try, with up to 881 possibilities. Traits do not significantly improve the results for the K nearest neighbour approach, although a simple test with a supervised learning approach (random forests) shows we can predict interactions with high accuracy using only three traits per species. This result shows that binary interactions can be predicted without regard to the ecological community given only three variables: body mass and two variables for the species' phylogeny. These techniques are complementary: recommenders can predict interactions in the absence of traits, using only information about other species' interactions, while supervised learning algorithms such as random forests base their predictions on traits only but do not exploit other species' interactions. Further work should focus on developing custom similarity measures specialized for ecology to improve the KNN algorithms, and on using richer data to capture indirect relationships between species.
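A minimal KNN recommender over a binary interaction matrix, in the positive-only spirit described above and without any trait data, might look like the following sketch; the toy food web and the choice of Tanimoto similarity are assumptions.

```python
import numpy as np

def tanimoto(a, b):
    """Similarity suited to binary, positive-only interaction vectors."""
    either = np.sum(a | b)
    return np.sum(a & b) / either if either else 0.0

def recommend_prey(interactions, predator, k=5):
    """Rank candidate prey for one predator from the diets of its k most
    similar predators (KNN recommender sketch)."""
    sims = np.array([tanimoto(interactions[predator], other)
                     for other in interactions])
    sims[predator] = -1.0                        # exclude self
    neighbours = np.argsort(sims)[-k:]
    scores = sims[neighbours] @ interactions[neighbours]
    scores[interactions[predator] == 1] = -1.0   # skip already-known prey
    return np.argsort(scores)[::-1]              # best guesses first

web = (np.random.default_rng(0).random((40, 40)) < 0.15).astype(int)
print(recommend_prey(web, predator=3)[:5])       # top-5 candidate prey
```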
Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism
NASA Technical Reports Server (NTRS)
Onyebueke, Landon; Ameye, Olusesan
2002-01-01
A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is the case in the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and retains the plug in the retracted position. The design of the current retract mechanism was based on the deterministic approach, which puts emphasis on the factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable, since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism, and demonstrates how the use of PDM led to the minimization of weight and cost and the maximization of reliability and performance.
Bacillus thuringiensis Conjugation in Simulated Microgravity
NASA Astrophysics Data System (ADS)
Beuls, Elise; van Houdt, Rob; Leys, Natalie; Dijkstra, Camelia; Larkin, Oliver; Mahillon, Jacques
2009-10-01
Spaceflight experiments have suggested a possible effect of microgravity on the plasmid transfer among strains of the Gram-positive Bacillus thuringiensis, as opposed to no effect recorded for Gram-negative conjugation. To investigate these potential effects in a more affordable experimental setup, three ground-based microgravity simulators were tested: the Rotating Wall Vessel (RWV), the Random Positioning Machine (RPM), and a superconducting magnet. The bacterial conjugative system consisted of biparental matings between two B. thuringiensis strains, where the transfer frequencies of the conjugative plasmid pAW63 and its ability to mobilize the nonconjugative plasmid pUB110 were assessed. Specifically, potential plasmid transfers in a 0-g position (simulated microgravity) were compared to those obtained under 1-g (normal gravity) conditions in each device. Statistical analyses revealed no significant difference in the conjugative and mobilizable transfer frequencies between the three different simulated microgravity conditions and our standard laboratory condition. These important ground-based observations emphasize the fact that, though no stimulation of plasmid transfer was observed, no inhibition was observed either. In the case of Gram-positive bacteria, this ability to exchange plasmids in weightlessness, as occurs under Earth's conditions, should be seen as particularly relevant in the scope of the spread of antibiotic resistances and bacterial virulence.
Locking devices on cigarette vending machines: evaluation of a city ordinance.
Forster, J L; Hourigan, M E; Kelder, S
1992-01-01
OBJECTIVES. Policymakers, researchers, and citizens are beginning to recognize the need to limit minors' access to tobacco by restricting the sale of cigarettes through vending machines. One policy alternative that has been proposed by the tobacco industry is a requirement that vending machines be fitted with electronic locking devices. This study evaluates such a policy as enacted in St. Paul, Minn. METHODS. A random sample of vending machine locations was selected for cigarette purchase attempts conducted before implementation and at 3 and 12 months postimplementation. RESULTS. The rate of noncompliance by merchants was 34% after 3 months and 30% after 1 year. The effect of the law was to reduce the ability of a minor to purchase cigarettes from locations originally selling cigarettes through vending machines from 86% at baseline to 36% at 3 months. The purchase rate at these locations rose to 48% at 1 year. CONCLUSIONS. Our results suggest that cigarette vending machine locking devices may not be as effective as vending machine bans and require additional enforcement to ensure compliance with the law. PMID:1503160
Zeng, Xueqiang; Luo, Gang
2017-12-01
Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method, and show that compared to a state-of-the-art automatic selection method, it can significantly reduce search time, classification error rate, and the standard deviation of the error rate due to randomization. This is major progress towards enabling fast turnaround in identifying the high-quality solutions required by many machine learning-based clinical data analysis tasks.
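The flavor of progressive sampling can be conveyed by a successive-halving loop over candidate algorithms and hyper-parameter values: score every surviving candidate on a growing subsample and keep only the better half. The paper couples progressive sampling with Bayesian optimization; plain pruning stands in for it here, and the candidate grid is illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)

candidates = [RandomForestClassifier(n_estimators=n, random_state=0)
              for n in (50, 200)]
candidates += [SVC(C=c) for c in (0.1, 1, 10)]
candidates += [LogisticRegression(C=c, max_iter=1000) for c in (0.1, 1)]

# Progressive sampling: evaluate on ever-larger subsamples and discard
# the weaker half of the candidates at each stage.
for n in (500, 2000, 8000):
    scores = [cross_val_score(m, X[:n], y[:n], cv=3).mean()
              for m in candidates]
    keep = np.argsort(scores)[::-1][: max(1, len(candidates) // 2)]
    candidates = [candidates[i] for i in keep]

print("selected:", candidates[0])
```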
Wullems, Jorgen A; Verschueren, Sabine M P; Degens, Hans; Morse, Christopher I; Onambélé, Gladys L
2017-01-01
Accurate monitoring of sedentary behaviour and physical activity is key to investigating their exact role in healthy ageing. To date, accelerometers using cut-off point models are the preferred means of doing so; however, machine learning seems a highly promising future alternative. Hence, the current study compared cut-off point and machine learning algorithms for optimal quantification of sedentary behaviour and physical activity intensities in the elderly. In a heterogeneous sample of forty participants (aged ≥60 years, 50% female), energy expenditure during laboratory-based activities (ranging from sedentary behaviour through to moderate-to-vigorous physical activity) was estimated by indirect calorimetry whilst wearing triaxial thigh-mounted accelerometers. Three cut-off point algorithms and a Random Forest machine learning model were developed and cross-validated using the collected data. Detailed analyses were performed to check algorithm robustness, and to examine and benchmark both overall and participant-specific balanced accuracies. This revealed that all four models can at least be used to confidently monitor sedentary behaviour and moderate-to-vigorous physical activity. Nevertheless, the machine learning algorithm outperformed the cut-off point models, being robust to all individuals' physiological and non-physiological characteristics and showing acceptable performance over the whole range of physical activity intensities. Therefore, we propose that Random Forest machine learning may be optimal for objective assessment of sedentary behaviour and physical activity in older adults using thigh-mounted triaxial accelerometry.
Machine tools error characterization and compensation by on-line measurement of artifact
NASA Astrophysics Data System (ADS)
Wahid Khan, Abdul; Chen, Wuyi; Wu, Lili
2009-11-01
Most manufacturing machine tools are utilized for mass or batch production with high accuracy under a deterministic manufacturing principle. The volumetric accuracy of a machine tool depends on the positional accuracy of the cutting tool, probe or end effector relative to the workpiece in the workspace volume. In this research paper, a methodology is presented for volumetric calibration of machine tools by on-line measurement of an artifact or an object of a similar type. The machine tool geometric error characterization was carried out with a standard or an artifact having geometry similar to the mass- or batch-production product. The artifact was measured at an arbitrary position in the volumetric workspace with a calibrated Renishaw touch trigger probe system. Positional errors were stored in a computer for compensation purposes, so that the manufacturing batch could subsequently be run with compensated codes. This methodology proved quite effective for manufacturing high-precision components with greater dimensional accuracy and reliability. Calibration by on-line measurement offers the advantage of improving the manufacturing process through the deterministic manufacturing principle; it was found efficient and economical, but is limited to the workspace or envelope surface of the measured artifact's geometry or profile.
Speed-Selector Guard For Machine Tool
NASA Technical Reports Server (NTRS)
Shakhshir, Roda J.; Valentine, Richard L.
1992-01-01
Simple guardplate prevents accidental reversal of direction of rotation or sudden change of speed of lathe, milling machine, or other machine tool. Custom-made for specific machine and control settings. Allows control lever to be placed at only one setting. Operator uses handle to slide guard to engage or disengage control lever. Protects personnel from injury and equipment from damage occurring if speed- or direction-control lever inadvertently placed in wrong position.
NASA Astrophysics Data System (ADS)
Bai, Ting; Sun, Kaimin; Deng, Shiquan; Chen, Yan
2018-03-01
High-resolution image change detection is one of the key technologies of remote sensing application, and is of great significance for resource survey, environmental monitoring, precision agriculture, military mapping and battlefield environment detection. In this paper, Random Forest (RF), Support Vector Machine (SVM), Deep Belief Network (DBN) and Adaboost models were established for high-resolution satellite imagery to verify the applicability of different machine learning methods to change detection. In order to compare the detection accuracy of the four machine learning methods, we applied them to two high-resolution images. The results show that, with small samples, SVM has higher overall accuracy than RF, Adaboost and DBN for binary and from-to change detection. With an increase in the number of samples, RF achieves higher overall accuracy than Adaboost, SVM and DBN.
Predicting the dissolution kinetics of silicate glasses using machine learning
NASA Astrophysics Data System (ADS)
Anoop Krishnan, N. M.; Mangalathu, Sujith; Smedskjaer, Morten M.; Tandia, Adama; Burton, Henry; Bauchy, Mathieu
2018-05-01
Predicting the dissolution rates of silicate glasses in aqueous conditions is a complex task as the underlying mechanism(s) remain poorly understood and the dissolution kinetics can depend on a large number of intrinsic and extrinsic factors. Here, we assess the potential of data-driven models based on machine learning to predict the dissolution rates of various aluminosilicate glasses exposed to a wide range of solution pH values, from acidic to caustic conditions. Four classes of machine learning methods are investigated, namely, linear regression, support vector machine regression, random forest, and artificial neural network. We observe that, although linear methods all fail to describe the dissolution kinetics, the artificial neural network approach offers excellent predictions, thanks to its inherent ability to handle non-linear data. Overall, we suggest that a more extensive use of machine learning approaches could significantly accelerate the design of novel glasses with tailored properties.
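The headline observation, a linear model failing where a small neural network succeeds on non-linear dissolution data, is easy to reproduce on toy data. Everything below (the functional form linking pH and composition to the log dissolution rate, the network size) is an illustrative assumption, not the paper's dataset or architecture.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Toy stand-in: log dissolution rate depends non-linearly on pH and on a
# composition variable x (e.g. an Al/Si ratio).
rng = np.random.default_rng(0)
pH = rng.uniform(1, 13, 2000)
x = rng.uniform(0, 0.25, 2000)
log_rate = -0.08 * (pH - 7) ** 2 - 6 * x + rng.normal(0, 0.2, 2000)

X = np.column_stack([pH, x])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_rate, random_state=0)

linear = LinearRegression()
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32),
                                 max_iter=2000, random_state=0))

for name, model in [("linear", linear), ("ANN", ann)]:
    print(name, "R2:", model.fit(X_tr, y_tr).score(X_te, y_te))
```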
Schwarz, Frank; Hegewald, Andrea; Becker, Jürgen
2014-01-01
Objectives To address the following focused question: What is the impact of implant–abutment configuration and the positioning of the machined collar/microgap on crestal bone level changes? Material and methods Electronic databases of the PubMed and the Web of Knowledge were searched for animal and human studies reporting on histological/radiological crestal bone level changes (CBL) at nonsubmerged one-/two-piece implants (placed in healed ridges) exhibiting different abutment configurations, positioning of the machined collar/microgap (between 1992 and November 2012: n = 318 titles). Quality assessment of selected full-text articles was performed according to the ARRIVE and CONSORT statement guidelines. Results A total of 13 publications (risk of bias: high) were eligible for the review. The weighted mean difference (WMD) (95% CI) between machined collars placed either above or below the bone crest amounted to 0.835 mm favoring an epicrestal positioning of the rough/smooth border (P < 0.001) (P-value for heterogeneity: 0.885, I2: 0.000% = no heterogeneity). WMD (95% CI) between microgaps placed either at or below the bone crest amounted to −0.479 mm favoring a subcrestal position of the implant neck (P < 0.001) (P-value for heterogeneity: 0.333, I2: 12.404% = low heterogeneity). Only two studies compared different implant–abutment configurations. Due to a high heterogeneity, a meta-analysis was not feasible. Conclusions While the positioning of the machined neck and microgap may limit crestal bone level changes at nonsubmerged implants, the impact of the implant–abutment connection lacks documentation. PMID:23782338
NASA Technical Reports Server (NTRS)
Wang, Wenlong; Mandra, Salvatore; Katzgraber, Helmut G.
2016-01-01
In this paper, we propose a patch planting method for creating arbitrarily large spin glass instances with known ground states. The scaling of the computational complexity of these instances with various block numbers and sizes is investigated and compared with random instances using population annealing Monte Carlo and the quantum annealing DW2X machine. The method can be useful for benchmarking tests for future generation quantum annealing machines, classical and quantum mechanical optimization algorithms.
Spectral methods in machine learning and new strategies for very large datasets
Belabbas, Mohamed-Ali; Wolfe, Patrick J.
2009-01-01
Spectral methods are of fundamental importance in statistics and machine learning, because they underlie algorithms from classical principal components analysis to more recent approaches that exploit manifold structure. In most cases, the core technical problem can be reduced to computing a low-rank approximation to a positive-definite kernel. For the growing number of applications dealing with very large or high-dimensional datasets, however, the optimal approximation afforded by an exact spectral decomposition is too costly, because its complexity scales as the cube of either the number of training examples or their dimensionality. Motivated by such applications, we present here 2 new algorithms for the approximation of positive-semidefinite kernels, together with error bounds that improve on results in the literature. We approach this problem by seeking to determine, in an efficient manner, the most informative subset of our data relative to the kernel approximation task at hand. This leads to two new strategies based on the Nyström method that are directly applicable to massive datasets. The first of these—based on sampling—leads to a randomized algorithm whereupon the kernel induces a probability distribution on its set of partitions, whereas the latter approach—based on sorting—provides for the selection of a partition in a deterministic way. We detail their numerical implementation and provide simulation results for a variety of representative problems in statistical data analysis, each of which demonstrates the improved performance of our approach relative to existing methods. PMID:19129490
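The sampling-based variant can be sketched in a few lines of NumPy: choose m landmark points, then reconstruct the kernel matrix as K ≈ C W⁺ Cᵀ. Uniform random sampling is used below for simplicity; the paper's point is precisely a smarter, kernel-induced choice of that subset.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom(kernel, X, m, seed=0):
    """Nystrom approximation of the n x n PSD kernel matrix from m
    uniformly sampled landmark points: K ~ C pinv(W) C^T."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = kernel(X, X[idx])            # n x m
    W = kernel(X[idx], X[idx])       # m x m
    return C @ np.linalg.pinv(W) @ C.T

X = np.random.default_rng(1).normal(size=(500, 5))
K = rbf(X, X)
K_hat = nystrom(rbf, X, m=50)
print(np.linalg.norm(K - K_hat) / np.linalg.norm(K))  # relative error
```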
NASA Astrophysics Data System (ADS)
Hadi Sutrisno, Himawan; Kiswanto, Gandjar; Istiyanto, Jos
2017-06-01
Rough machining is aimed at shaping a workpiece toward its final form. Because it removes the bulk of the material, this process takes up a large proportion of the total machining time. For certain models, rough machining has limitations, especially on surfaces such as turbine blades and impellers. CBV evaluation is one of the concepts used to detect areas admissible in the machining process. While previous research detected the CBV area using a pair of normal vectors, in this research the authors simplified the process by detecting the CBV area with a slicing line for each point cloud formed. The method consists of three steps: 1. triangulation of the CAD design model, 2. generation of CC points from the point cloud, 3. evaluation of each point cloud position (under CBV and outside CBV) with the slicing line method. The result of this evaluation can be used as a tool for orientation set-up at each CC point position of feasible areas in rough machining.
Determinants of wood dust exposure in the Danish furniture industry.
Mikkelsen, Anders B; Schlunssen, Vivi; Sigsgaard, Torben; Schaumburg, Inger
2002-11-01
This paper investigates the relation between wood dust exposure in the furniture industry and occupational hygiene variables. During the winter of 1997-98, 54 factories were visited and 2362 personal, passive inhalable dust samples were obtained; the geometric mean was 0.95 mg/m³ and the geometric standard deviation was 2.08. In a first measuring round, 1685 dust concentrations were obtained. For some of the workers, repeated measurements were carried out 1 week (351) and 2 weeks (326) after the first measurement. Hygiene variables such as job, exhaust ventilation and cleaning procedures were documented. A multivariate analysis based on mixed effects models was used, with hygiene variables as fixed effects and worker, machine, department and factory as random effects. A modified stepwise strategy of model building was adopted, taking into account the hierarchically structured variables and making possible the exclusion of non-influential random as well as fixed effects. For woodworking, the following determinants of exposure increase the dust concentration: manual and automatic sanding, and the use of compressed air with fully automatic and semi-automatic machines and for cleaning of work pieces. Decreased dust exposure resulted from the use of compressed air with manual machines, working at fully automatic or semi-automatic machines, functioning exhaust ventilation, work on the night shift, daily cleaning of rooms, cleaning of work pieces with a brush, vacuum cleaning of machines, supplementary fresh air intake, and a safety representative elected within the last 2 yr. For handling and assembling, increased exposure resulted from work at automatic machines and the presence of wood dust on the workpieces. Work on the evening shift, supplementary fresh air intake, work in a chair factory and special cleaning staff produced decreased exposure to wood dust. The implications of the results for the prevention of wood dust exposure are discussed.
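A reduced version of such a mixed effects model, with one fixed effect and a single random worker intercept instead of the paper's nested worker/machine/department/factory structure, can be fitted with statsmodels; the data below are synthetic stand-ins.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in: log dust concentration with a fixed ventilation
# effect and worker-to-worker random variation.
rng = np.random.default_rng(0)
n, n_workers = 600, 60
worker = rng.integers(0, n_workers, n)
vent = rng.integers(0, 2, n)                          # functioning exhaust?
log_dust = (np.log(0.95) - 0.4 * vent
            + rng.normal(0, 0.3, n_workers)[worker]   # random intercepts
            + rng.normal(0, 0.5, n))                  # residual noise

df = pd.DataFrame({"log_dust": log_dust, "vent": vent, "worker": worker})
fit = smf.mixedlm("log_dust ~ vent", df, groups=df["worker"]).fit()
print(fit.summary())
```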
Dimitriadis, Stavros I; Liparas, Dimitris
2018-06-01
Neuroinformatics is a fascinating research field that applies computational models and analytical tools to high-dimensional experimental neuroscience data for a better understanding of how the brain functions or dysfunctions in brain diseases. Neuroinformaticians work at the intersection of neuroscience and informatics, supporting the integration of various sub-disciplines (behavioural neuroscience, genetics, cognitive psychology, etc.) working on brain research. Neuroinformaticians are the pathway of information exchange between informaticians and clinicians for a better understanding of the outcome of computational models and the clinical interpretation of the analysis. Machine learning is one of the most significant computational developments of the last decade, giving tools to neuroinformaticians and, ultimately, to radiologists and clinicians for automatic and early diagnosis and prognosis of brain diseases. The random forest (RF) algorithm has been successfully applied to high-dimensional neuroimaging data for feature reduction, and has also been applied to classify the clinical label of a subject using single or multi-modal neuroimaging datasets. Our aim was to review the studies where RF was applied to correctly predict Alzheimer's disease (AD) and the conversion from mild cognitive impairment (MCI), and to review its robustness to overfitting, outliers and handling of non-linear data. Finally, we describe our RF-based model that earned the 1st position in an international challenge for automated prediction of MCI from MRI data.
Kirkpatrick, Andrew W; McKee, Ian; McKee, Jessica L; Ma, Irene; McBeth, Paul B; Roberts, Derek J; Wurster, Charles L; Parfitt, Robbie; Ball, Chad G; Oberg, Scott; Sevcik, William; Hamilton, Douglas R
2016-05-01
Remote-telementored ultrasound involves novice examiners being remotely guided by experts using information technologies. However, requiring a novice to perform ultrasound is a cognitively demanding task, exacerbated by unfamiliarity with ultrasound-machine controls. We incorporated a randomized evaluation of remote control of the ultrasound functionality (knobology) within a study in which the images generated by distant naive examiners were viewed on an ultrasound graphic user interface (GUI) displayed on laptop computers by mentors in different cities. Fire-fighters in Edmonton (101) were remotely mentored from Calgary (n = 65), Nanaimo (n = 19), and Memphis (n = 17) to examine an ultrasound phantom randomized to contain free fluid or not. Remote mentors (2 surgeons, 1 internist, and 1 ED physician) were randomly assigned to use GUI knobology control during mentoring (GUIK+/GUIK-). Remote-telementored ultrasound was feasible in all cases. Overall accuracy for fluid detection was 97% (confidence interval = 91 to 99%) with 3 false negatives (FNs). Positive/negative likelihood ratios were infinity/0.0625. One FN occurred with GUIK+ and two with GUIK-. There were no statistical differences in test performance between the two groups (GUIK+ and GUIK-). Ultrasound-naive first responders can be remotely mentored with high accuracy, although providing basic remote control of the knobology did not affect outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.
Marucci-Wellman, Helen R; Corns, Helen L; Lehto, Mark R
2017-01-01
Injury narratives are now available in real time and include useful information for injury surveillance and prevention. However, manual classification of the cause or events leading to injury found in large batches of narratives, such as workers compensation claims databases, can be prohibitive. In this study we compare the utility of four machine learning algorithms (Naïve Bayes single-word and bi-gram models, Support Vector Machine and Logistic Regression) for classifying narratives into Bureau of Labor Statistics Occupational Injury and Illness event-leading-to-injury classifications for a large workers compensation database. These algorithms are known to do well classifying narrative text and are fairly easy to implement with off-the-shelf software packages such as Python. We propose human-machine learning ensemble approaches which maximize the power and accuracy of the algorithms for machine-assigned codes and allow for strategic filtering of rare, emerging or ambiguous narratives for manual review. We compare human-machine approaches based on filtering on the prediction strength of the classifier vs. agreement between algorithms. Regularized Logistic Regression (LR) was the best-performing algorithm alone. Using this algorithm and filtering out the bottom 30% of predictions for manual review resulted in high accuracy (overall sensitivity/positive predictive value of 0.89) of the final machine-human coded dataset. The best pairings of algorithms included Naïve Bayes with Support Vector Machine, whereby the triple ensemble of the Naïve Bayes single-word model, the Naïve Bayes bi-gram model and SVM had very high performance (0.93 overall sensitivity/positive predictive value) across both large and small categories, leaving 41% of the narratives for manual review. Integrating LR into this ensemble mix improved performance only slightly. For large administrative datasets we propose incorporation of methods based on human-machine pairings such as we have done here, utilizing readily available off-the-shelf machine learning techniques and leaving only a fraction of narratives that require manual review. Human-machine ensemble methods are likely to improve performance over total manual coding. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
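The filtering strategy, machine-coding only the predictions above a strength cutoff and routing the weakest 30% to manual review, can be sketched as follows; the classifier, synthetic data and three-class setup are illustrative, not the workers compensation corpus.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=40, n_informative=10,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
strength = clf.predict_proba(X_te).max(axis=1)   # prediction strength

# Auto-code the strongest 70% of narratives; send the rest to humans.
cutoff = np.quantile(strength, 0.30)
auto = strength >= cutoff
acc = (clf.predict(X_te)[auto] == y_te[auto]).mean()
print(f"auto-coded: {auto.mean():.0%}, accuracy on auto-coded: {acc:.3f}")
```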
Applying Workspace Limitations in a Velocity-Controlled Robotic Mechanism
NASA Technical Reports Server (NTRS)
Abdallah, Muhammad E. (Inventor); Hargrave, Brian (Inventor); Platt, Robert J., Jr. (Inventor)
2014-01-01
A robotic system includes a robotic mechanism responsive to velocity control signals, and a permissible workspace defined by a convex-polygon boundary. A host machine determines a position of a reference point on the mechanism with respect to the boundary, and includes an algorithm for enforcing the boundary by automatically shaping the velocity control signals as a function of the position, thereby providing smooth and unperturbed operation of the mechanism along the edges and corners of the boundary. The algorithm is suited for application with higher speeds and/or external forces. A host machine includes an algorithm for enforcing the boundary by shaping the velocity control signals as a function of the reference point position, and a hardware module for executing the algorithm. A method for enforcing the convex-polygon boundary is also provided that shapes a velocity control signal via a host machine as a function of the reference point position.
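One way to realize such boundary enforcement, assuming the convex polygon is expressed as half-planes and the shaping removes only the blocked outward velocity component near each active edge, is sketched below; this is an interpretation of the idea, not the patented algorithm.

```python
import numpy as np

# Convex workspace as half-planes n_i . x <= b_i (here, the unit square).
N = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
b = np.array([1., 0., 1., 0.])

def shape_velocity(pos, v_cmd, margin=0.1):
    """Scale away the outward velocity component as the reference point
    nears each edge, leaving tangential motion untouched so sliding
    along edges and corners stays smooth."""
    v = v_cmd.astype(float).copy()
    for n_i, b_i in zip(N, b):
        dist = b_i - n_i @ pos          # distance to this edge
        outward = n_i @ v               # velocity component toward edge
        if outward > 0.0:
            scale = np.clip(dist / margin, 0.0, 1.0)
            v -= (1.0 - scale) * outward * n_i   # remove blocked part only
    return v

# Near the right edge: x-motion is attenuated, y-motion passes through.
print(shape_velocity(np.array([0.97, 0.5]), np.array([1.0, 1.0])))
```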
Programmable Pulse-Position-Modulation Encoder
NASA Technical Reports Server (NTRS)
Zhu, David; Farr, William
2006-01-01
A programmable pulse-position-modulation (PPM) encoder has been designed for use in testing an optical communication link. The encoder includes a programmable state machine and an electronic code book that can be updated to accommodate different PPM coding schemes. The encoder includes a field-programmable gate array (FPGA) that is programmed to step through the stored state machine and code book and that drives a custom high-speed serializer circuit board that is capable of generating subnanosecond pulses. The stored state machine and code book can be updated by means of a simple text interface through the serial port of a personal computer.
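A toy software model of PPM encoding with a swappable code book: each symbol maps, via the code book, to the slot that carries the single pulse in its frame. The 16-slot frame and identity code book are assumptions for illustration; the real encoder emits sub-nanosecond pulses through an FPGA-driven serializer.

```python
# Sketch of PPM encoding driven by a programmable code book.  The frame
# size (M = 16 slots) and the identity code book are illustrative; a real
# code book could remap symbols to implement a different PPM scheme.
M = 16
codebook = {symbol: symbol for symbol in range(M)}

def ppm_encode(symbols, m=M, book=codebook):
    """Yield one frame per symbol: m slots with a single pulse ('1')
    placed in the slot selected by the code book."""
    for s in symbols:
        frame = [0] * m
        frame[book[s]] = 1
        yield frame

for frame in ppm_encode([3, 0, 15]):
    print(frame)
```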
The Riddle of the Smart Machines
ERIC Educational Resources Information Center
Howell, Dusti D.
2010-01-01
Hundreds of graduate students were introduced to the fields of instructional design and educational technology with the riddle of the smart machines, yet over the years no one has answered it correctly. After revealing the surprising answer to this riddle, both the negative and positive impacts of smart machines are analyzed. An example of this is…
Pseudo-random tool paths for CNC sub-aperture polishing and other applications.
Dunn, Christina R; Walker, David D
2008-11-10
In this paper we first contrast classical and CNC polishing techniques in regard to the repetitiveness of the machine motions. We then present a pseudo-random tool path for use with CNC sub-aperture polishing techniques and report polishing results from equivalent random and raster tool-paths. The random tool-path used - the unicursal random tool-path - employs a random seed to generate a pattern which never crosses itself. Because of this property, this tool-path is directly compatible with dwell time maps for corrective polishing. The tool-path can be used to polish any continuous area of any boundary shape, including surfaces with interior perforations.
Development of machine learning models for diagnosis of glaucoma.
Kim, Seong Jae; Cho, Kyong Jin; Oh, Sejong
2017-01-01
The study aimed to develop machine learning models with strong prediction power and interpretability for the diagnosis of glaucoma, based on retinal nerve fiber layer (RNFL) thickness and visual field (VF). We collected various candidate features from examinations of RNFL thickness and VF, and developed synthesized features from the original ones. We then selected the best features for classification (diagnosis) through feature evaluation. We used 100 cases of data as a test dataset and 399 cases as a training and validation dataset. To develop the glaucoma prediction model, we considered four machine learning algorithms: C5.0, random forest (RF), support vector machine (SVM), and k-nearest neighbor (KNN). We repeatedly composed learning models using the training dataset and evaluated them using the validation dataset, finally selecting the learning model with the highest validation accuracy. We analyzed the quality of the models using several measures. The random forest model shows the best performance, while the C5.0, SVM, and KNN models show similar accuracy. In the random forest model, the classification accuracy is 0.98, sensitivity is 0.983, specificity is 0.975, and AUC is 0.979. The developed prediction models show high accuracy, sensitivity, specificity, and AUC in classifying between glaucomatous and healthy eyes, and can be used to predict glaucoma for unseen examination records. Clinicians may reference the prediction results to make better decisions. Multiple learning models may be combined to increase prediction accuracy. The C5.0 model includes decision rules for prediction and can be used to explain the reasons for specific predictions.
Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes.
Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning
2015-08-27
This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications.
Analytical N beam position monitor method
NASA Astrophysics Data System (ADS)
Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.
2017-11-01
Measurement and correction of focusing errors is of great importance for the performance and machine protection of circular accelerators. Furthermore, the LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β-function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
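For context, the classical three-BPM beta-from-phase estimate that the N-BPM method generalizes fits in a few lines; the formula below is the standard one from the optics-measurement literature, and the numbers are purely illustrative:

```python
from math import tan

def cot(x: float) -> float:
    return 1.0 / tan(x)

def beta_from_phase(beta1_model, phi12, phi13, phi12_model, phi13_model):
    """Beta at BPM 1 from measured (phi) and model (phi_model) phase
    advances, in radians, to the next two BPMs."""
    return (beta1_model * (cot(phi12) - cot(phi13))
            / (cot(phi12_model) - cot(phi13_model)))

# Illustrative numbers: a small measured deviation from the model phases.
print(beta_from_phase(beta1_model=100.0,
                      phi12=0.760, phi13=2.193,
                      phi12_model=0.754, phi13_model=2.199))  # ~98.9
```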
Recent advances in environmental data mining
NASA Astrophysics Data System (ADS)
Leuenberger, Michael; Kanevski, Mikhail
2016-04-01
Due to the large amount and complexity of data available nowadays in geo- and environmental sciences, we face the need to develop and incorporate more robust and efficient methods for their analysis, modelling and visualization. An important part of these developments deals with the elaboration and application of a contemporary and coherent methodology following the process from data collection to the justification and communication of the results. Recent fundamental progress in machine learning (ML) can considerably contribute to the development of the emerging field of environmental data science. The present research highlights and investigates the different issues that can occur when dealing with environmental data mining using cutting-edge machine learning algorithms. In particular, the main attention is paid to the description of the self-consistent methodology and two efficient algorithms - Random Forest (RF, Breiman, 2001) and Extreme Learning Machines (ELM, Huang et al., 2006) - which have recently gained great popularity. Despite the fact that they are based on two different concepts, i.e. decision trees vs artificial neural networks, both deliver promising results for complex, high-dimensional and non-linear data modelling. In addition, the study discusses several important issues of data-driven modelling, including feature selection and uncertainties. The approach considered is accompanied by simulated and real data case studies from renewable resources assessment and natural hazards tasks. In conclusion, the current challenges and future developments in statistical environmental data learning are discussed. References - Breiman, L., 2001. Random Forests. Machine Learning 45 (1), 5-32. - Huang, G.-B., Zhu, Q.-Y., Siew, C.-K., 2006. Extreme learning machine: theory and applications. Neurocomputing 70 (1-3), 489-501. - Kanevski, M., Pozdnoukhov, A., Timonin, V., 2009. Machine Learning for Spatial Environmental Data. EPFL Press, Lausanne, Switzerland, p. 392. - Leuenberger, M., Kanevski, M., 2015. Extreme Learning Machines for spatial environmental data. Computers and Geosciences 85, 64-73.
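Of the two algorithms highlighted, the Extreme Learning Machine is compact enough to sketch directly: a random, untrained hidden layer followed by a single least-squares solve for the output weights (synthetic data for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 3))              # e.g. coordinates + covariate
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # synthetic target

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (never trained)
b = rng.normal(size=n_hidden)                # random biases
H = np.tanh(X @ W + b)                       # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None) # output weights: one linear solve

y_hat = np.tanh(X @ W + b) @ beta
print("train RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```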
Tedesco-Silva, Helio; Mello Offerni, Juliano Chrystian; Ayres Carneiro, Vanessa; Ivani de Paula, Mayara; Neto, Elias David; Brambate Carvalhinho Lemos, Francine; Requião Moura, Lúcio Roberto; Pacheco E Silva Filho, Alvaro; de Morais Cunha, Mirian de Fátima; Francisco da Silva, Erica; Miorin, Luiz Antonio; Demetrio, Daniela Priscila; Luconi, Paulo Sérgio; da Silva Luconi, Waldere Tania; Bobbio, Savina Adriana; Kuschnaroff, Liz Milstein; Noronha, Irene Lourdes; Braga, Sibele Lessa; Barsante, Renata Cristina; Mendes Moreira, João Cezar; Fernandes-Charpiot, Ida Maria Maximina; Abbud-Filho, Mario; Modelli de Andrade, Luis Gustavo; Dalsoglio Garcia, Paula; Tanajura Santamaria Saber, Luciana; Fernandes Laurindo, Alan; Chocair, Pedro Renato; Cuvello Neto, Américo Lourenço; Zanocco, Juliana Aparecida; Duboc de Almeida Soares Filho, Antonio Jose; Ferreira Aguiar, Wilson; Medina Pestana, Jose
2017-05-01
This study compared the use of static cold storage versus continuous hypothermic machine perfusion in a cohort of kidney transplant recipients at high risk for delayed graft function (DGF). In this national, multicenter, and controlled trial, 80 pairs of kidneys recovered from brain-dead deceased donors were randomized to cold storage or machine perfusion, transplanted, and followed up for 12 months. The primary endpoint was the incidence of DGF. Secondary endpoints included the duration of DGF, hospital stay, primary nonfunction, estimated glomerular filtration rate, acute rejection, and allograft and patient survival. Mean cold ischemia time was high but not different between the 2 groups (25.6 ± 6.6 hours vs 25.05 ± 6.3 hours, P = 0.937). The incidence of DGF was lower in the machine perfusion group compared with the cold storage group (45% vs 61%, P = 0.031). Machine perfusion was independently associated with a reduced risk of DGF (odds ratio, 0.49; 95% confidence interval, 0.26-0.95). Mean estimated glomerular filtration rate tended to be higher in the machine perfusion group at day 28 (40.6 ± 19.9 mL/min per 1.73 m² vs 49.0 ± 26.9 mL/min per 1.73 m²; P = 0.262) and 1 year (48.3 ± 19.8 mL/min per 1.73 m² vs 54.4 ± 28.6 mL/min per 1.73 m²; P = 0.201). No differences in the incidence of acute rejection, primary nonfunction (0% vs 2.5%), graft loss (7.5% vs 10%), or death (8.8% vs 6.3%) were observed. In this cohort of recipients of deceased donor kidneys with high mean cold ischemia time and high incidence of DGF, the use of continuous machine perfusion was associated with a reduced risk of DGF compared with the traditional cold storage preservation method.
Jeffrey T. Walton
2008-01-01
Three machine learning subpixel estimation methods (Cubist, Random Forests, and support vector regression) were applied to estimate urban cover. Urban forest canopy cover and impervious surface cover were estimated from Landsat-7 ETM+ imagery using a higher resolution cover map resampled to 30 m as training and reference data. Three different band combinations (...
Chen, Xiaomei; Longstaff, Andrew; Fletcher, Simon; Myers, Alan
2014-04-01
This paper presents and evaluates an active dual-sensor autofocusing system that combines an optical vision sensor and a tactile probe for autofocusing on arrays of small holes on freeform surfaces. The system has been tested on a two-axis test rig and then integrated onto a three-axis computer numerical control (CNC) milling machine, where the aim is to rapidly and controllably measure the hole position errors while the part is still on the machine. The principle of operation is for the tactile probe to locate the nominal positions of holes, and the optical vision sensor follows to focus and capture the images of the holes. The images are then processed to provide hole position measurement. In this paper, the autofocusing deviations are analyzed. First, the deviations caused by the geometric errors of the axes on which the dual-sensor unit is deployed are estimated to be 11 μm when deployed on the test rig and 7 μm on the CNC machine tool. Subsequently, the autofocusing deviations caused by the interaction of the tactile probe, surface, and small hole are mathematically analyzed and evaluated. The deviations are a result of the tactile probe radius, the curvatures at the positions where small holes are drilled on the freeform surface, and the effect of the position error of the hole on focusing. An example case study is provided for the measurement of a pattern of small holes on an elliptical cylinder on the two machines. The absolute sum of the autofocusing deviations is 118 μm on the test rig and 144 μm on the machine tool. This is much less than the 500 μm depth of field of the optical microscope. Therefore, the method is capable of capturing a group of clear images of the small holes on this workpiece on either machine.
Clustering Single-Cell Expression Data Using Random Forest Graphs.
Pouyan, Maziyar Baran; Nourani, Mehrdad
2017-07-01
Complex tissues such as brain and bone marrow are made up of multiple cell types. As the study of biological tissue structure progresses, the role of cell-type-specific research becomes increasingly important. Novel sequencing technology such as single-cell cytometry provides researchers access to valuable biological data. Applying machine-learning techniques to these high-throughput datasets provides deep insights into the cellular landscape of the tissue of which those cells are a part. In this paper, we propose the use of random-forest-based single-cell profiling, a new machine-learning-based technique, to profile different cell types of intricate tissues using single-cell cytometry data. Our technique utilizes random forests to capture cell marker dependences and model the cellular populations using the cell network concept. This cellular network helps us discover which cell types are present in the tissue. Our experimental results on public-domain datasets indicate promising performance and accuracy of our technique in extracting cell populations of complex tissues.
The secondary supernova machine: Gravitational compression, stored Coulomb energy, and SNII displays
NASA Astrophysics Data System (ADS)
Clayton, Donald D.; Meyer, Bradley S.
2016-04-01
Radioactive power for several delayed optical displays of core-collapse supernovae is commonly described as having been provided by decays of ⁵⁶Ni nuclei. This review analyses the provenance of that energy more deeply: the form in which that energy is stored; what mechanical work causes its storage; what conservation laws demand that it be stored; and why its release is fortuitously delayed for about 10⁶ s into a greatly expanded supernova envelope. We call the unifying picture of those energy transfers the secondary supernova machine owing to its machine-like properties; namely, mechanical work forces storage of large increases of nuclear Coulomb energy, a positive energy component within new nuclei synthesized by the secondary machine. That positive-energy increase occurs despite the fusion decreasing negative total energy within nuclei. The excess of the Coulomb energy can later be radiated, accounting for the intense radioactivity in supernovae. Detailed familiarity with this machine is the focus of this review. The stored positive-energy component created by the machine will not be reduced until roughly 10⁶ s later by radioactive emissions (EC and β⁺) owing to the slowness of weak decays. The delayed energy provided by the secondary supernova machine is a few × 10⁴⁹ erg, much smaller than the one percent of the 10⁵³ erg collapse that causes the prompt ejection of matter; however, that relatively small stored energy is vital for activation of the late displays. The conceptual basis of the secondary supernova machine provides a new framework for understanding the energy source for late SNII displays. We demonstrate the nuclear dynamics with nuclear network abundance calculations, with a model of sudden compression and reexpansion of the nuclear gas, and with nuclear energy decompositions of a nuclear-mass law. These tools identify excess Coulomb energy, a positive-energy component of the total negative nuclear energy, as the late activation energy. If the value of fundamental charge e were smaller, SNII would not be so profoundly radioactive. Excess Coulomb energy has been carried within nuclei radially for roughly 10⁹ km before being radiated into greatly expanded supernova remnants. The Coulomb force claims heretofore unacknowledged significance for supernova physics.
Sensorless Control of Permanent Magnet Machine for NASA Flywheel Technology Development
NASA Technical Reports Server (NTRS)
Kenny, Barbara H.; Kascak, Peter E.
2002-01-01
This paper describes the position sensorless algorithms presently used in the motor control for the NASA "in-house" development work of the flywheel energy storage system. At zero and low speeds a signal injection technique, the self-sensing method, is used to determine rotor position. At higher speeds, an open loop estimate of the back EMF of the machine is made to determine the rotor position. At start up, the rotor is set to a known position by commanding dc into one of the phase windings. Experimental results up to 52,000 rpm are presented.
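A hedged sketch of the high-speed, open-loop back-EMF idea (illustrative parameters; the actual NASA controller is not reproduced here): estimate the flux by integrating v − R·i in the stationary α/β frame, then take the flux angle as the rotor position. A real implementation must handle integrator drift; this demo sidesteps it with a known initial flux:

```python
import numpy as np

def rotor_angle_from_back_emf(v_ab, i_ab, R, dt, flux0):
    """Estimate the electrical rotor angle from alpha/beta voltages and currents."""
    emf = v_ab - R * i_ab                       # back-EMF estimate
    flux = flux0 + np.cumsum(emf, axis=0) * dt  # open-loop integration
    return np.arctan2(flux[:, 1], flux[:, 0])   # angle of the flux vector

# Synthetic machine spinning at 500 Hz electrical, zero stator current:
dt, w = 1e-6, 2 * np.pi * 500
t = np.arange(0, 0.004, dt)
theta = w * t
v = w * np.column_stack([-np.sin(theta), np.cos(theta)])  # d(flux)/dt
est = rotor_angle_from_back_emf(v, np.zeros_like(v), R=0.1, dt=dt,
                                flux0=np.array([1.0, 0.0]))  # known initial flux
print(np.abs(np.unwrap(est) - theta).max())  # small integration error
```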
Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies
Theis, Fabian J.
2017-01-01
Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear, especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits from only the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss the consequences of inappropriate distribution assumptions and the reasons for the different behaviors of the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia.
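The abstract's procedures are implemented in the authors' R package sambia; as a rough NumPy/scikit-learn illustration of the underlying inverse-probability idea (not the authors' exact algorithms), one can resample a stratified sample with weights 1/π before fitting:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + rng.normal(size=1000) > 1.5).astype(int)  # rare outcome

# Two-phase design: cases are oversampled, so each unit has a known
# phase-two inclusion probability pi_i.
pi = np.where(y == 1, 1.0, 0.2)
keep = rng.random(1000) < pi
X_s, y_s, pi_s = X[keep], y[keep], pi[keep]

# Resample the biased sample with weights 1/pi to resemble the population,
# then fit the random forest on the resampled data.
w = (1.0 / pi_s) / (1.0 / pi_s).sum()
idx = rng.choice(len(y_s), size=len(y_s), replace=True, p=w)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_s[idx], y_s[idx])
print("predicted prevalence:", clf.predict(X).mean(), "true:", y.mean())
```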
Applications of random forest feature selection for fine-scale genetic population assignment.
Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G
2018-02-01
Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with F_ST ranking for selection of single nucleotide polymorphisms (SNP) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy of at least 90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
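A rough scikit-learn sketch of the panel-selection workflow described above: rank SNPs by random-forest importance, then evaluate panels of increasing size. The genotype data here are synthetic stand-ins; the panel sizes follow the abstract:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(300, 2000)).astype(float)  # genotypes coded 0/1/2
pop = rng.integers(0, 4, size=300)                      # population labels

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, pop)
ranked = np.argsort(rf.feature_importances_)[::-1]      # most informative SNPs first

for panel_size in (50, 100, 200, 400, 700):
    acc = cross_val_score(
        RandomForestClassifier(n_estimators=200, random_state=0),
        X[:, ranked[:panel_size]], pop, cv=5).mean()
    print(panel_size, round(acc, 3))  # self-assignment accuracy per panel size
```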
An M-step preconditioned conjugate gradient method for parallel computation
NASA Technical Reports Server (NTRS)
Adams, L.
1983-01-01
This paper describes a preconditioned conjugate gradient method that can be effectively implemented on both vector machines and parallel arrays to solve sparse symmetric and positive definite systems of linear equations. The implementation on the CYBER 203/205 and on the Finite Element Machine is discussed and results obtained using the method on these machines are given.
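For reference, the standard preconditioned conjugate gradient iteration reads as below; this NumPy sketch uses a simple Jacobi (diagonal) preconditioner as a stand-in for the paper's m-step preconditioner:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=1000):
    """Preconditioned conjugate gradient for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv @ r           # preconditioner application
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv @ r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

n = 100  # 1D Laplacian test system
A = np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1) + np.diag(-np.ones(n - 1), -1)
b = np.ones(n)
M_inv = np.diag(1.0 / np.diag(A))  # Jacobi preconditioner
x = pcg(A, b, M_inv)
print("residual norm:", np.linalg.norm(b - A @ x))
```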
49 CFR 214.507 - Required safety equipment for new on-track roadway maintenance machines.
Code of Federal Regulations, 2013 CFR
2013-10-01
...) A seat for each operator, except as provided in paragraph (b) of this section; (2) A safe and secure position with handholds, handrails, or a secure seat for each roadway worker transported on the machine... windshield wipers are incompatible with the windshield material; (5) A machine braking system capable of...
1978-08-01
[Residue of an engineering drawing and a process table: drill-geometry notes (land width 28% ± .005, optional .0005 in./in. back taper, lips within .002 of true angular position) and Phases I-IV covering cutting, drilling, machining, and nondestructive evaluation (borescope and dye-penetrant requirements).]
Discomfort analysis in computerized numeric control machine operations.
Muthukumar, Krishnamoorthy; Sankaranarayanasamy, Krishnasamy; Ganguli, Anindya Kumar
2012-06-01
The introduction of computerized numeric control (CNC) technology in manufacturing industries has revolutionized the production process, but there are some health and safety problems associated with these machines. The present study aimed to investigate the extent of postural discomfort in CNC machine operators, and the relationship of this discomfort to the display and control panel height, with a view to validating the anthropometric recommendation for the location of the display and control panel in CNC machines. The postural discomforts associated with CNC machines were studied in 122 male operators using Corlett and Bishop's body part discomfort mapping, subject information, and discomfort level at various time intervals from the start to the end of a shift. This information was collected using a questionnaire. Statistical analysis was carried out using ANOVA. Neck discomfort due to the positioning of the machine displays, and shoulder and arm discomfort due to the positioning of controls, were identified as common health issues in the operators of these machines. The study revealed that 45.9% of machine operators reported discomfort in the lower back, 41.8% in the neck, 22.1% in the upper back, 53.3% in the shoulder and arm, and 21.3% in the leg. Discomfort increased with the progress of the day and was highest at the end of a shift; subject age had no effect on the tendency to experience discomfort.
Construction machine control guidance implementation strategy.
DOT National Transportation Integrated Search
2010-07-01
Machine Controlled Guidance (MCG) technology may be used in roadway and bridge construction to improve construction efficiencies, potentially resulting in reduced project costs and accelerated schedules. The technology utilizes a Global Positioning S...
Evaluation of machine learning algorithms for improved risk assessment for Down's syndrome.
Koivu, Aki; Korpimäki, Teemu; Kivelä, Petri; Pahikkala, Tapio; Sairanen, Mikko
2018-05-04
Prenatal screening generates a great amount of data that is used for predicting risk of various disorders. Prenatal risk assessment is based on multiple clinical variables, and overall performance is defined by how well the risk algorithm is optimized for the population in question. This article evaluates machine learning algorithms to improve the performance of first trimester screening for Down syndrome. Machine learning algorithms pose an adaptive alternative to develop better risk assessment models using the existing clinical variables. Two real-world data sets were used to experiment with multiple classification algorithms. Implemented models were tested with a third, real-world, data set and performance was compared to a predicate method, a commercial risk assessment software. The best-performing deep neural network model gave an area under the curve of 0.96 and a detection rate of 78% at a 1% false positive rate with the test data. The support vector machine model gave an area under the curve of 0.95 and a detection rate of 61% at a 1% false positive rate with the same test data. When compared with the predicate method, the best support vector machine model was slightly inferior, but an optimized deep neural network model was able to give higher detection rates with the same false positive rate, or a similar detection rate but with a markedly lower false positive rate. This finding could further improve first trimester screening for Down syndrome, by using existing clinical variables and large training data derived from a specific population. Copyright © 2018 Elsevier Ltd. All rights reserved.
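A detection rate quoted at a fixed 1% false-positive rate can be read off the ROC curve; a scikit-learn sketch with synthetic classifier scores:

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=5000)          # synthetic affected/unaffected labels
scores = y * 1.5 + rng.normal(size=5000)   # synthetic classifier output

fpr, tpr, _ = roc_curve(y, scores)
dr_at_1pct = np.interp(0.01, fpr, tpr)     # detection rate at FPR = 1%
print(f"detection rate at 1% FPR: {dr_at_1pct:.2%}")
```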
Deciphering the Preference and Predicting the Viability of Circular Permutations in Proteins
Liu, Yen-Yi; Wang, Li-Fen; Hwang, Jenn-Kang; Lyu, Ping-Chiang
2012-01-01
Circular permutation (CP) refers to situations in which the termini of a protein are relocated to other positions in the structure. CP occurs naturally and has been artificially created to study protein function, stability and folding. Recently CP is increasingly applied to engineer enzyme structure and function, and to create bifunctional fusion proteins unachievable by tandem fusion. CP is a complicated and expensive technique. An intrinsic difficulty in its application lies in the fact that not every position in a protein is amenable to creating a viable permutant. To examine the preferences of CP and develop CP viability prediction methods, we carried out comprehensive analyses of the sequence, structural, and dynamical properties of known CP sites using a variety of statistics and simulation methods, such as bootstrap aggregating, permutation tests and molecular dynamics simulations. CP particularly favors Gly, Pro, Asp and Asn. Positions preferred by CP lie within coils, loops, turns, and at residues that are exposed to solvent, weakly hydrogen-bonded, environmentally unpacked, or flexible. Disfavored positions include Cys, bulky hydrophobic residues, and residues located within helices or near the protein's core. These results fostered the development of an effective viable CP site prediction system, which combined four machine learning methods: artificial neural networks, the support vector machine, a random forest, and a hierarchical feature integration procedure developed in this work. As assessed by using the dihydrofolate reductase dataset as the independent evaluation dataset, this prediction system achieved an AUC of 0.9. Large-scale predictions have been performed for nine thousand representative protein structures; several new potential applications of CP were thus identified. Many unreported preferences of CP are revealed in this study. The developed system is the best CP viability prediction method currently available. This work will facilitate the application of CP in research and biotechnology.
Classifying clinical notes with pain assessment using machine learning.
Fodeh, Samah Jamal; Finch, Dezon; Bouayad, Lina; Luther, Stephen L; Ling, Han; Kerns, Robert D; Brandt, Cynthia
2017-12-26
Pain is a significant public health problem, affecting millions of people in the USA. Evidence has highlighted that patients with chronic pain often suffer from deficits in pain care quality (PCQ), including pain assessment, treatment, and reassessment. Currently, there is no intelligent and reliable approach to identify PCQ indicators in electronic health records (EHR). Hereby, we used unstructured text narratives in the EHR to derive pain assessment in clinical notes for patients with chronic pain. Our dataset includes patients with documented pain intensity ratings ≥ 4 and initial musculoskeletal diagnoses (MSD) captured by ICD-9-CM codes in fiscal year 2011 and a minimum of 1 year of follow-up (the follow-up period is 3 years maximum), with complete data on key demographic variables. A total of 92 patients with 1058 notes were used. First, we manually annotated qualifiers and descriptors of pain assessment using the annotation schema that we previously developed. Second, we developed a reliable classifier for indicators of pain assessment in clinical notes. Based on our annotation schema, we found variations in documenting the subclasses of pain assessment. In positive notes, providers mostly documented assessment of pain site (67%) and intensity of pain (57%), followed by persistence (32%). In only 27% of positive notes did providers document a presumed etiology for the pain complaint or diagnosis. Documentation of patients' reports of factors that aggravate pain was only present in 11% of positive notes. The random forest classifier achieved the best performance in labeling clinical notes with pain assessment information compared to other classifiers; 94%, 95%, 94%, and 94% were observed in terms of accuracy, PPV, F1-score, and AUC, respectively. Despite the wide spectrum of research that utilizes machine learning in many clinical applications, none has explored using these methods for pain assessment research. In addition, previous studies using large datasets to detect and analyze characteristics of patients with various types of pain have relied exclusively on billing and coded data as the main source of information. This study, in contrast, harnessed unstructured narrative text data from the EHR to detect pain assessment clinical notes. We developed a random forest classifier to identify clinical notes with pain assessment information. Compared to other classifiers, ours achieved the best results in most of the reported metrics.
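A minimal sketch of a bag-of-words random-forest note classifier of the kind described above (the toy notes and labels are invented; the study's annotation schema is far richer):

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

notes = ["pain located in lower back, intensity 6/10, worse with lifting",
         "patient seen for medication refill, no complaints today",
         "chronic knee pain, constant, aggravated by stairs",
         "routine follow-up, vitals stable"]
labels = [1, 0, 1, 0]  # 1 = note documents a pain assessment

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    RandomForestClassifier(n_estimators=300, random_state=0))
clf.fit(notes, labels)
print(clf.predict(["sharp pain in shoulder, intensity 5/10"]))
```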
Williams, R.R.
1980-09-03
The present invention is directed to a method and device for determining the location of a cutting tool with respect to the rotational axis of a spindle-mounted workpiece. A vacuum cup supporting a machinable sacrificial pin is secured to the workpiece at a location where the pin will project along and encompass the rotational axis of the workpiece. The pin is then machined into a cylinder. The position of the surface of the cutting tool contacting the machine cylinder is spaced from the rotational axis of the workpiece a distance equal to the radius of the cylinder.
Data-Driven Learning of Total and Local Energies in Elemental Boron
NASA Astrophysics Data System (ADS)
Deringer, Volker L.; Pickard, Chris J.; Csányi, Gábor
2018-04-01
The allotropes of boron continue to challenge structural elucidation and solid-state theory. Here we use machine learning combined with random structure searching (RSS) algorithms to systematically construct an interatomic potential for boron. Starting from ensembles of randomized atomic configurations, we use alternating single-point quantum-mechanical energy and force computations, Gaussian approximation potential (GAP) fitting, and GAP-driven RSS to iteratively generate a representation of the element's potential-energy surface. Beyond the total energies of the very different boron allotropes, our model readily provides atom-resolved, local energies and thus deepened insight into the frustrated β -rhombohedral boron structure. Our results open the door for the efficient and automated generation of GAPs, and other machine-learning-based interatomic potentials, and suggest their usefulness as a tool for materials discovery.
NASA Astrophysics Data System (ADS)
Nishizuka, N.; Sugiura, K.; Kubo, Y.; Den, M.; Watari, S.; Ishii, M.
2017-02-01
We developed a flare prediction model using machine learning, which is optimized to predict the maximum class of flares occurring in the following 24 hr. Machine learning is used to devise algorithms that can learn from and make decisions on a huge amount of data. We used solar observation data during the period 2010-2015, such as vector magnetograms, ultraviolet (UV) emission, and soft X-ray emission taken by the Solar Dynamics Observatory and the Geostationary Operational Environmental Satellite. We detected active regions (ARs) from the full-disk magnetogram, from which ~60 features were extracted with their time differentials, including magnetic neutral lines, the current helicity, the UV brightening, and the flare history. After standardizing the feature database, we fully shuffled and randomly separated it into two for training and testing. To investigate which algorithm is best for flare prediction, we compared three machine-learning algorithms: the support vector machine, k-nearest neighbors (k-NN), and extremely randomized trees. The prediction score, the true skill statistic, was higher than 0.9 with a fully shuffled data set, which is higher than that for human forecasts. It was found that k-NN has the highest performance among the three algorithms. The ranking of the feature importance showed that previous flare activity is most effective, followed by the length of magnetic neutral lines, the unsigned magnetic flux, the area of UV brightening, and the time differentials of features over 24 hr, all of which are strongly correlated with the flux emergence dynamics in an AR.
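The true skill statistic used as the prediction score above is hit rate minus false-alarm rate, computed from the confusion matrix (the counts below are illustrative):

```python
# TSS = TP/(TP+FN) - FP/(FP+TN)
def true_skill_statistic(tp: int, fn: int, fp: int, tn: int) -> float:
    return tp / (tp + fn) - fp / (fp + tn)

print(true_skill_statistic(tp=90, fn=10, fp=40, tn=960))  # 0.90 - 0.04 = 0.86
```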
Fraccaro, Paolo; Nicolo, Massimo; Bonetto, Monica; Giacomini, Mauro; Weller, Peter; Traverso, Carlo Enrico; Prosperi, Mattia; OSullivan, Dympna
2015-01-27
To investigate machine learning methods, ranging from simpler interpretable techniques to complex (non-linear) "black-box" approaches, for automated diagnosis of Age-related Macular Degeneration (AMD). Data from healthy subjects and patients diagnosed with AMD or other retinal diseases were collected during routine visits via an Electronic Health Record (EHR) system. Patients' attributes included demographics and, for each eye, presence/absence of major AMD-related clinical signs (soft drusen, retinal pigment epithelium defects/pigment mottling, depigmentation area, subretinal haemorrhage, subretinal fluid, macula thickness, macular scar, subretinal fibrosis). Interpretable techniques known as white-box methods, including logistic regression and decision trees, as well as less interpretable techniques known as black-box methods, such as support vector machines (SVM), random forests and AdaBoost, were used to develop models (trained and validated on unseen data) to diagnose AMD. The gold standard was confirmed diagnosis of AMD by physicians. Sensitivity, specificity and area under the receiver operating characteristic curve (AUC) were used to assess performance. The study population included 487 patients (912 eyes). In terms of AUC, random forests, logistic regression and AdaBoost showed a mean performance of 0.92, followed by SVM and decision trees (0.90). All machine learning models identified soft drusen and age as the most discriminating variables in clinicians' decision pathways to diagnose AMD. Both black-box and white-box methods performed well in identifying diagnoses of AMD and their decision pathways. Machine learning models developed through the proposed approach, relying on clinical signs identified by retinal specialists, could be embedded into EHR to provide physicians with real-time (interpretable) support.
Zhao, Jiangsan; Bodner, Gernot; Rewald, Boris
2016-01-01
Phenotyping local crop cultivars is becoming more and more important, as they are an important genetic source for breeding – especially in regard to inherent root system architectures. Machine learning algorithms are promising tools to assist in the analysis of complex data sets; novel approaches are needed to apply them to root phenotyping data of mature plants. A greenhouse experiment was conducted in large, sand-filled columns to differentiate 16 European Pisum sativum cultivars based on 36 manually derived root traits. By combining random forest and support vector machine models, machine learning algorithms were successfully used for unbiased identification of the most distinguishing root traits and subsequent pairwise cultivar differentiation. Up to 86% of pea cultivar pairs could be distinguished based on the top five important root traits (Timp5); Timp5 differed widely between cultivar pairs. Selecting top important root traits (Timp) provided a significantly improved classification compared to using all available traits or randomly selected trait sets. The most frequent Timp of mature pea cultivars was the total surface area of lateral roots originating from tap root segments at 0–5 cm depth. The high classification rate implies that culturing did not lead to a major loss of variability in root system architecture in the studied pea cultivars. Our results illustrate the potential of machine learning approaches for unbiased (root) trait selection and cultivar classification based on rather small, complex phenotypic data sets derived from pot experiments. Powerful statistical approaches are essential to make use of the increasing amount of (root) phenotyping information, integrating the complex trait sets describing crop cultivars.
Machine learning of big data in gaining insight into successful treatment of hypertension.
Koren, Gideon; Nordon, Galia; Radinsky, Kira; Shalev, Varda
2018-06-01
Despite effective medications, rates of uncontrolled hypertension remain high. Treatment protocols are largely based on randomized trials and meta-analyses of these studies. The objective of this study was to test the utility of machine learning of big data in gaining insight into the treatment of hypertension. We applied machine learning techniques, such as decision trees and neural networks, to identify determinants that contribute to the success of hypertension drug treatment in a large set of patients. We also identified concomitant drugs not considered to have antihypertensive activity that may contribute to lowering blood pressure (BP). Higher initial BP predicts lower success rates. Among the medication options and their combinations, treatment with beta blockers appears to be more commonly effective, which is not reflected in contemporary guidelines. Among numerous concomitant drugs taken by hypertensive patients, proton pump inhibitors (PPIs) and HMG Co-A reductase inhibitors (statins) significantly improved the success rate of hypertension treatment. In conclusion, machine learning of big data is a novel method to identify effective antihypertensive therapy and to repurpose medications already on the market for new indications. Our results related to beta blockers, stemming from machine learning of a large and diverse set of big data, in contrast to the much narrower criteria for randomized clinical trials (RCTs), should be corroborated and affirmed by other methods, as they hold potential promise for an old class of drugs which may be presently underutilized. These previously unrecognized effects of PPIs and statins have been very recently identified as effective in lowering BP in preliminary clinical observations, lending credibility to our big data results.
NASA Astrophysics Data System (ADS)
Wang, Dongyi; Vinson, Robert; Holmes, Maxwell; Seibel, Gary; Tao, Yang
2018-04-01
The Atlantic blue crab is among the highest-valued seafood found along the American Eastern Seaboard. Currently, the crab processing industry is highly dependent on manual labor. However, there is great potential for vision-guided intelligent machines to automate the meat picking process. Studies show that the back-fin knuckles are robust features containing information about a crab's size, orientation, and the position of the crab's meat compartments. Our studies also make it clear that detecting the knuckles reliably in images is challenging due to the knuckle's small size, anomalous shape, and similarity to joints in the legs and claws. An accurate and reliable computer vision algorithm was proposed to detect the crab's back-fin knuckles in digital images. Convolutional neural networks (CNNs) can localize rough knuckle positions with 97.67% accuracy, transforming a global detection problem into a local detection problem. Compared to rough localization based on human experience or other machine learning classification methods, the CNN shows the best localization results. Within the rough knuckle region, a k-means clustering method can further extract the exact knuckle positions based on the back-fin knuckle color features. The exact knuckle position can then be used to generate a crab cutline in the XY plane using a template matching method. This is a pioneering research project in crab image analysis and offers advanced machine intelligence for automated crab processing.
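A hedged sketch of the color-based refinement step: within the CNN's rough region of interest, cluster pixel colors with k-means and take the centroid of the cluster nearest a reference knuckle color. The ROI here is synthetic and the reference color is hypothetical:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64, 3)).astype(float)  # synthetic ROI image

pixels = roi.reshape(-1, 3)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)

knuckle_rgb = np.array([200.0, 60.0, 50.0])  # hypothetical reference color
best = np.argmin(np.linalg.norm(km.cluster_centers_ - knuckle_rgb, axis=1))
ys, xs = np.divmod(np.nonzero(km.labels_ == best)[0], roi.shape[1])
print("estimated knuckle position in ROI:", xs.mean(), ys.mean())
```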
Learning About Climate and Atmospheric Models Through Machine Learning
NASA Astrophysics Data System (ADS)
Lucas, D. D.
2017-12-01
From the analysis of ensemble variability to improving simulation performance, machine learning algorithms can play a powerful role in understanding the behavior of atmospheric and climate models. To learn about model behavior, we create training and testing data sets through ensemble techniques that sample different model configurations and values of input parameters, and then use supervised machine learning to map the relationships between the inputs and outputs. Following this procedure, we have used support vector machines, random forests, gradient boosting and other methods to investigate a variety of atmospheric and climate model phenomena. We have used machine learning to predict simulation crashes, estimate the probability density function of climate sensitivity, optimize simulations of the Madden Julian oscillation, assess the impacts of weather and emissions uncertainty on atmospheric dispersion, and quantify the effects of model resolution changes on precipitation. This presentation highlights recent examples of our applications of machine learning to improve the understanding of climate and atmospheric models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
A machine learning approach for predicting methionine oxidation sites.
Aledo, Juan C; Cantón, Francisco R; Veredas, Francisco J
2017-09-29
The oxidation of protein-bound methionine to form methionine sulfoxide has traditionally been regarded as oxidative damage. However, recent evidence supports the view of this reversible reaction as a regulatory post-translational modification. The perception that methionine sulfoxidation may provide a mechanism for the redox regulation of a wide range of cellular processes has stimulated some proteomic studies. However, these experimental approaches are expensive and time-consuming. Therefore, computational methods designed to predict methionine oxidation sites are an attractive alternative. As a first approach to this matter, we have developed models based on random forests, support vector machines and neural networks, aimed at accurate prediction of sites of methionine oxidation. Starting from published proteomic data regarding oxidized methionines, we created a hand-curated dataset formed by 113 unique polypeptides of known structure, containing 975 methionyl residues, 122 of which were oxidation-prone (positive dataset) and 853 oxidation-resistant (negative dataset). We used a machine learning approach to generate predictive models from these datasets. Among the multiple features used in the classification task, some contributed substantially to the performance of the predictive models. Thus, (i) the solvent accessible area of the methionine residue, (ii) the number of residues between the analyzed methionine and the next methionine found towards the N-terminus and (iii) the spatial distance between the sulfur atom of the analyzed methionine and the closest aromatic residue were among the most relevant features. Compared to the other classifiers we evaluated, random forests provided the best performance, with accuracy, sensitivity and specificity of 0.7468±0.0567, 0.6817±0.0982 and 0.7557±0.0721, respectively (mean ± standard deviation). We present the first predictive models aimed at computationally detecting methionine sites that may become oxidized in vivo in response to oxidative signals. These models provide insights into the structural context in which a methionine residue becomes either oxidation-resistant or oxidation-prone. Furthermore, these models should be useful in prioritizing methionyl residues for further studies to determine their potential as regulatory post-translational modification sites.
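A minimal sketch of the evaluation setup: a class-weighted random forest on the 122/853 imbalanced dataset, reporting accuracy, sensitivity, and specificity via cross-validation. The features are synthetic stand-ins for the structural descriptors named above:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(975, 8))  # stand-ins for accessibility, distances, etc.
y = np.r_[np.ones(122, dtype=int), np.zeros(853, dtype=int)]  # prone vs resistant

clf = RandomForestClassifier(n_estimators=500, class_weight="balanced",
                             random_state=0)
pred = cross_val_predict(clf, X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print("accuracy:", accuracy_score(y, pred),
      "sensitivity:", tp / (tp + fn),
      "specificity:", tn / (tn + fp))
```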
Machinability of Minor Wooden Species before and after Modification with Thermo-Vacuum Technology
Sandak, Jakub; Goli, Giacomo; Cetera, Paola; Sandak, Anna; Cavalli, Alberto; Todaro, Luigi
2017-01-01
The influence of the thermal modification process on wood machinability was investigated with four minor species of low economic importance. A set of representative experimental samples was machined to the form of disks with sharp and dull tools. The resulting surface quality was visually evaluated by a team of experts according to the American standard procedure ASTM D-1666-87. The objective quantification of the surface quality was also done by means of a three-dimensional (3D) surface scanner for the whole range of grain orientations. Visual assessment and 3D surface analysis showed a good agreement in terms of conclusions. The best quality of the wood surface was obtained when machining thermally modified samples. The positive effect of the material modification was apparent when cutting deodar cedar, black pine and black poplar in unfavorable conditions (i.e., against the grain). The difference was much smaller for an easy-machinability species such as Italian alder. The use of dull tools resulted in the worst surface quality. Thermal modification has shown a very positive effect when machining with dull tools, leading to a relevant increment of the final surface smoothness.
Uncertainty in Random Forests: What does it mean in a spatial context?
NASA Astrophysics Data System (ADS)
Klump, Jens; Fouedjio, Francky
2017-04-01
Geochemical surveys are an important part of exploration for mineral resources and in environmental studies. The samples and chemical analyses are often laborious and difficult to obtain and therefore come at a high cost. As a consequence, these surveys are characterised by datasets with large numbers of variables but relatively few data points when compared to conventional big data problems. With more remote sensing platforms and sensor networks being deployed, large volumes of auxiliary data of the surveyed areas are becoming available. The use of these auxiliary data has the potential to improve the prediction of chemical element concentrations over the whole study area. Kriging is a well established geostatistical method for the prediction of spatial data but requires significant pre-processing and makes some basic assumptions about the underlying distribution of the data. Some machine learning algorithms, on the other hand, may require less data pre-processing and are non-parametric. In this study we used a dataset provided by Kirkwood et al. [1] to explore the potential use of Random Forest in geochemical mapping. We chose Random Forest because it is a well understood machine learning method and has the advantage that it provides us with a measure of uncertainty. By comparing Random Forest to Kriging we found that both methods produced comparable maps of estimated values for our variables of interest. Kriging outperformed Random Forest for variables of interest with relatively strong spatial correlation. The measure of uncertainty provided by Random Forest seems to be quite different to the measure of uncertainty provided by Kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. In conclusion, our preliminary results show that the model driven approach in geostatistics gives us more reliable estimates for our target variables than Random Forest for variables with relatively strong spatial correlation. However, in cases of weak spatial correlation Random Forest, as a nonparametric method, may give the better results once we have a better understanding of the meaning of its uncertainty measures in a spatial context. References [1] Kirkwood, C., M. Cave, D. Beamish, S. Grebby, and A. Ferreira (2016), A machine learning approach to geochemical mapping, Journal of Geochemical Exploration, 163, 28-40, doi:10.1016/j.gexplo.2016.05.003.
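One common way to obtain the Random Forest uncertainty measure discussed above is the spread of the individual trees' predictions at each location, in contrast to the kriging variance, which grows with distance from the data. A minimal sketch on synthetic spatial data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(500, 2))                 # sample coordinates
y = np.sin(X[:, 0] / 10) + 0.1 * rng.normal(size=500)  # synthetic concentration

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
grid = np.column_stack([np.linspace(0, 100, 50), np.full(50, 50.0)])
per_tree = np.stack([t.predict(grid) for t in rf.estimators_])
print("mean:", per_tree.mean(axis=0)[:3])
print("per-tree std (uncertainty proxy):", per_tree.std(axis=0)[:3])
```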
Internal position and limit sensor for free piston machines
NASA Technical Reports Server (NTRS)
Holliday, Ezekiel S. (Inventor); Wood, James Gary (Inventor)
2012-01-01
A sensor for sensing the position of a reciprocating free piston in a free piston Stirling machine. The sensor has a disk mounted to an end face of the power piston coaxially with its cylinder and reciprocating with the piston. The disk includes a rim around its outer perimeter formed of an electrically conductive material. A coil is wound coaxially with the cylinder, spaced outwardly from the outer perimeter of the disk, and mounted in a fixed position relative to the pressure vessel, preferably on the exterior of the pressure vessel wall.
Does providing nutrition information at vending machines reduce calories per item sold?
Dingman, Deirdre A; Schulz, Mark R; Wyrick, David L; Bibeau, Daniel L; Gupta, Sat N
2015-02-01
In 2010, the United States (US) enacted a restaurant menu labeling law. The law also applied to vending machine companies selling food. Research suggested that providing nutrition information on menus in restaurants might reduce the number of calories purchased. We tested the effect of providing nutrition information and 'healthy' designations to consumers where vending machines were located in college residence halls. We conducted our study at one university in Southeast US (October-November 2012). We randomly assigned 18 vending machines locations (residence halls) to an intervention or control group. For the intervention we posted nutrition information, interpretive signage, and sent a promotional email to residents of the hall. For the control group we did nothing. We tracked sales over 4 weeks before and 4 weeks after we introduced the intervention. Our intervention did not change what the residents bought. We recommend additional research about providing nutrition information where vending machines are located, including testing formats used to present information.
Manipulating Tabu List to Handle Machine Breakdowns in Job Shop Scheduling Problems
NASA Astrophysics Data System (ADS)
Nababan, Erna Budhiarti; Sitompul, Opim Salim
2011-06-01
Machine breakdowns in a production schedule may occur at random, making the well-known hard combinatorial problem of Job Shop Scheduling (JSSP) even more complex. One popular technique used to solve combinatorial problems is Tabu Search, in which moves that are not allowed to be revisited are retained in a tabu list so as to avoid returning to solutions obtained previously. In this paper, we propose an algorithm that employs a second tabu list to keep broken machines, in addition to the tabu list that keeps the moves. The period for which the broken machines are kept on the list is categorized using a fuzzy membership function. Our technique is tested on the benchmark JSSP data available in the OR-Library. From the experiment, we found that our algorithm shows promise in helping a decision maker handle machine breakdowns.
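A hedged sketch of the second-tabu-list idea: broken machines are kept tabu for a tenure chosen by breakdown severity. Crisp thresholds stand in here for the paper's fuzzy membership functions, and all boundaries are hypothetical:

```python
from collections import deque

def tenure_from_downtime(downtime_hours: float) -> int:
    """Map breakdown downtime to a tabu tenure; crisp stand-in for the
    paper's fuzzy severity categories (boundaries hypothetical)."""
    if downtime_hours < 1:      # 'short' breakdown
        return 3
    elif downtime_hours < 4:    # 'medium'
        return 7
    return 15                   # 'long'

move_tabu = deque(maxlen=10)    # classical tabu list of forbidden moves
machine_tabu = {}               # machine id -> remaining tabu tenure

def on_breakdown(machine_id: int, downtime_hours: float) -> None:
    machine_tabu[machine_id] = tenure_from_downtime(downtime_hours)

def next_iteration() -> None:
    """Decrement tenures; machines whose tenure expires become schedulable."""
    for m in list(machine_tabu):
        machine_tabu[m] -= 1
        if machine_tabu[m] == 0:
            del machine_tabu[m]

on_breakdown(2, downtime_hours=2.5)
print(machine_tabu)  # {2: 7} -> machine 2 excluded from moves for 7 iterations
```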
Application of Machine Learning Approaches for Protein-protein Interactions Prediction.
Zhang, Mengying; Su, Qiang; Lu, Yi; Zhao, Manman; Niu, Bing
2017-01-01
Proteomics endeavors to study the structures, functions and interactions of proteins. Information on protein-protein interactions (PPIs) helps to improve our knowledge of the functions and the 3D structures of proteins, so determining PPIs is essential to the study of proteomics. In this review, to survey the application of machine learning to PPI prediction, several machine learning approaches, such as support vector machine (SVM), artificial neural networks (ANNs) and random forest (RF), were selected, and examples of their application to PPIs are listed. SVM and RF are the two most commonly used methods. Nowadays, more researchers predict PPIs by combining two or more methods. This review presents the application of machine learning approaches to PPI prediction. Many examples of success in identification and prediction in the area of PPI have been discussed, and PPI research is still in progress. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
SU-E-T-613: Dosimetric Consequences of Systematic MLC Leaf Positioning Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kathuria, K; Siebers, J
2014-06-01
Purpose: The purpose of this study is to determine the dosimetric consequences of systematic MLC leaf positioning errors for clinical IMRT patient plans so as to establish detection tolerances for quality assurance programs. Materials and Methods: Dosimetric consequences were simulated by extracting MLC delivery instructions from the TPS, altering the file by the specified error, reloading the delivery instructions into the TPS, recomputing dose, and extracting dose-volume metrics for one head-and-neck and one prostate patient. Machine error was simulated by offsetting MLC leaves in Pinnacle in a systematic way. Three different algorithms were followed for these systematic offsets, as follows: a systematic sequential one-leaf offset (one leaf offset in one segment per beam), a systematic uniform one-leaf offset (the same one leaf offset per segment per beam), and a systematic offset of a given number of leaves picked uniformly at random from a given number of segments (5 out of 10 total). Dose to the PTV and normal tissue was simulated. Results: A systematic 5 mm offset of 1 leaf for all delivery segments of all beams resulted in a maximum PTV D98 deviation of 1%. Results showed very low dose error in all reasonably possible machine configurations, rare or otherwise, that could be simulated. Very low error in dose to the PTV and OARs was shown in all possible cases of one leaf per beam per segment being offset (<1%), or of only one leaf per beam being offset (<0.2%). The errors resulting from a high number of adjacent leaves (maximum of 5 out of 60 total leaf-pairs) being simultaneously offset in many (5) of the control points (total 10-18 in all beams) per beam, in both the PTV and the OARs analyzed, were similarly low (<2-3%). Conclusions: The above results show that patient shifts and anatomical changes are the main sources of error in delivered dose, not machine delivery. These two sources of error are "visually complementary" and uncorrelated (albeit not additive in the final error), and error resulting from machine delivery can easily be incorporated into an error model based purely on tumor motion.
NASA Astrophysics Data System (ADS)
Kweon, Hyunkyu; Choi, Sungdae; Kim, Youngsik; Nam, Kiho
Micro UTMs (Universal Testing Machines) are becoming increasingly popular for testing the mechanical properties of MEMS materials, metal thin films, and micro-molecule materials [1-2]. New miniature testing machines that can perform in-process measurement in SEM, TEM, and SPM are also needed. In this paper, a new micro UTM with a precision positioning system that can be used in SEM, TEM, and SPM is proposed. A bimorph-type PZT precision actuator is used in the fine positioning stage, and coarse positioning is implemented by a step motor. The size, load output and displacement output of the bimorph-type UTM are 109×64×22 (mm), about 35 g, and 0.4 mm, respectively, and the displacement output is controlled in block digital form. The results of the analysis and the basic properties of the positioning system and the UTM system are presented. In addition, experimental results of in-process measurement under tensile load in SEM and AFM are shown.
The precision measurement and assembly for miniature parts based on double machine vision systems
NASA Astrophysics Data System (ADS)
Wang, X. D.; Zhang, L. F.; Xin, M. Z.; Qu, Y. Q.; Luo, Y.; Ma, T. M.; Chen, L.
2015-02-01
In the process of assembling miniature parts, structural features on the bottom or side of the parts often need to be aligned and positioned. General assembly equipment integrated with a single vertical, downward-looking machine vision system cannot satisfy this requirement. Precision automatic assembly equipment was therefore developed with two integrated machine vision systems. In this system, a horizontal vision system is employed to measure the position of feature structures in the parts' side view, which cannot be seen by the vertical system. The position measured by the horizontal camera is converted into the vertical vision system's coordinates using calibration information. With careful calibration, part alignment and positioning in the assembly process can be guaranteed. The developed assembly equipment has the characteristics of easy implementation, modularization and high cost performance. The handling of the miniature parts and the assembly procedure are briefly introduced. The calibration procedure is given and the assembly error is analyzed for compensation.
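As a worked example of the coordinate conversion step, a calibrated rotation and offset suffice in the planar case; the matrices below are assumed values for illustration, not the paper's calibration:

```python
# Sketch: map a feature position measured by the horizontal (side) camera
# into the vertical vision system's frame via a calibrated rigid transform.
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])              # calibrated 2D rotation (90 degrees here)
t = np.array([12.5, -3.2])               # calibrated offset in mm (assumed values)

p_horizontal = np.array([1.80, 0.45])    # feature seen by the side camera (mm)
p_vertical = R @ p_horizontal + t        # same feature in the vertical system
```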
Computer-aided diagnosis of lung nodule using gradient tree boosting and Bayesian optimization.
Nishio, Mizuho; Nishizawa, Mitsuo; Sugiyama, Osamu; Kojima, Ryosuke; Yakami, Masahiro; Kuroda, Tomohiro; Togashi, Kaori
2018-01-01
We aimed to evaluate a computer-aided diagnosis (CADx) system for lung nodule classification focussing on (i) usefulness of the conventional CADx system (hand-crafted imaging feature + machine learning algorithm), (ii) comparison between support vector machine (SVM) and gradient tree boosting (XGBoost) as machine learning algorithms, and (iii) effectiveness of parameter optimization using Bayesian optimization and random search. Data on 99 lung nodules (62 lung cancers and 37 benign lung nodules) were included from public databases of CT images. A variant of the local binary pattern was used for calculating a feature vector. SVM or XGBoost was trained using the feature vector and its corresponding label. Tree Parzen Estimator (TPE) was used as Bayesian optimization for parameters of SVM and XGBoost. Random search was done for comparison with TPE. Leave-one-out cross-validation was used for optimizing and evaluating the performance of our CADx system. Performance was evaluated using area under the curve (AUC) of receiver operating characteristic analysis. AUC was calculated 10 times, and its average was obtained. The best averaged AUC of SVM and XGBoost was 0.850 and 0.896, respectively; both were obtained using TPE. XGBoost was generally superior to SVM. Optimal parameters for achieving high AUC were obtained with fewer numbers of trials when using TPE, compared with random search. Bayesian optimization of SVM and XGBoost parameters was more efficient than random search. Based on observer study, AUC values of two board-certified radiologists were 0.898 and 0.822. The results show that diagnostic accuracy of our CADx system was comparable to that of radiologists with respect to classifying lung nodules.
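The TPE step described above can be reproduced in outline with the hyperopt library; the sketch below tunes XGBoost on synthetic stand-in data (the search space and trial budget are assumptions, not the paper's settings):

```python
# Sketch: Bayesian optimization of XGBoost hyperparameters with the
# Tree Parzen Estimator (hyperopt), maximizing cross-validated AUC.
import numpy as np
from hyperopt import fmin, tpe, hp, Trials
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=99, n_features=20, random_state=0)

def objective(params):
    clf = XGBClassifier(
        max_depth=int(params["max_depth"]),
        learning_rate=params["learning_rate"],
        n_estimators=int(params["n_estimators"]),
        eval_metric="logloss",
    )
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    return -auc                      # hyperopt minimizes, so negate the AUC

space = {
    "max_depth": hp.quniform("max_depth", 2, 8, 1),
    "learning_rate": hp.loguniform("learning_rate", np.log(1e-3), np.log(0.3)),
    "n_estimators": hp.quniform("n_estimators", 50, 500, 50),
}

best = fmin(objective, space, algo=tpe.suggest, max_evals=50, trials=Trials())
```

Replacing `tpe.suggest` with hyperopt's random-search suggester gives the random-search baseline the study compares against.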
Bagheri, Hossein; Hooshmand, Tabassom; Aghajani, Farzaneh
2015-09-01
This study aimed to evaluate the effect of different ceramic surface treatments after machine grinding on the biaxial flexural strength (BFS) of machinable dental ceramics with different crystalline phases. Disk-shaped specimens (10 mm in diameter and 1.3 mm in thickness) of machinable ceramic cores (two silica-based and one zirconia-based ceramics) were prepared. Each type of ceramic surface was then randomly treated (n=15) as follows: for the leucite and lithium disilicate-based ceramics, 1) machined finish as control, 2) machined finish and sandblasting with alumina, and 3) machined finish and hydrofluoric acid etching; and for the zirconia, 1) machined finish and post-sintered as control, 2) machined finish, post-sintered, and sandblasting, and 3) machined finish, post-sintered, and Nd:YAG laser irradiation. The BFS was measured in a universal testing machine. Data were analyzed by ANOVA and Tukey's multiple comparisons post-hoc test (α=0.05). The mean BFS of machined-finish-only surfaces for the leucite ceramic was significantly higher than that of sandblasted (P=0.001) and acid-etched surfaces (P=0.005). A significantly lower BFS was found after sandblasting for lithium disilicate compared with the other groups (P<0.05). Sandblasting significantly increased the BFS of the zirconia (P<0.05), but the BFS was significantly decreased after laser irradiation (P<0.05). The BFS of the machinable ceramics was affected by the type of ceramic material and the surface treatment method. Sandblasting with alumina was detrimental to the strength of only the silica-based ceramics. Nd:YAG laser irradiation may lead to substantial strength degradation of zirconia.
Bagheri, Hossein; Aghajani, Farzaneh
2015-01-01
Objectives: This study aimed to evaluate the effect of different ceramic surface treatments after machine grinding on the biaxial flexural strength (BFS) of machinable dental ceramics with different crystalline phases. Materials and Methods: Disk-shaped specimens (10 mm in diameter and 1.3 mm in thickness) of machinable ceramic cores (two silica-based and one zirconia-based ceramics) were prepared. Each type of ceramic surface was then randomly treated (n=15) as follows: for the leucite and lithium disilicate-based ceramics, 1) machined finish as control, 2) machined finish and sandblasting with alumina, and 3) machined finish and hydrofluoric acid etching; and for the zirconia, 1) machined finish and post-sintered as control, 2) machined finish, post-sintered, and sandblasting, and 3) machined finish, post-sintered, and Nd:YAG laser irradiation. The BFS was measured in a universal testing machine. Data were analyzed by ANOVA and Tukey's multiple comparisons post-hoc test (α=0.05). Results: The mean BFS of machined-finish-only surfaces for the leucite ceramic was significantly higher than that of sandblasted (P=0.001) and acid-etched surfaces (P=0.005). A significantly lower BFS was found after sandblasting for lithium disilicate compared with the other groups (P<0.05). Sandblasting significantly increased the BFS of the zirconia (P<0.05), but the BFS was significantly decreased after laser irradiation (P<0.05). Conclusions: The BFS of the machinable ceramics was affected by the type of ceramic material and the surface treatment method. Sandblasting with alumina was detrimental to the strength of only the silica-based ceramics. Nd:YAG laser irradiation may lead to substantial strength degradation of zirconia. PMID:27148372
Teo, Ming; Amis, Terence; Lee, Sharon; Falland, Karina; Lambert, Stephen; Wheatley, John
2011-07-01
Continuous positive airway pressure (CPAP) titration studies are commonly performed using a nasal mask but some patients may prefer a full-face or oronasal mask. There is little evidence regarding the equivalence of different mask interfaces used to initiate treatment. We hypothesized that oronasal breathing when using an oronasal mask increases upper airway collapsibility and that a higher pressure may be required to maintain airway patency. We also assessed patient preferences for the 2 mask interfaces. Prospective, randomized, cross-over design with 2 consecutive CPAP titration nights. Accredited laboratory in a university hospital. Twenty-four treatment-naive subjects with obstructive sleep apnea syndrome and respiratory disturbance index of greater than 15 events per hour. CPAP titration was performed using an auto-titrating machine with randomization to a nasal or oronasal mask, followed by a second titration night using the alternate mask style. There was no significant difference in the mean pressures determined between nasal and oronasal masks, although 43% of subjects had nasal-to-oronasal mask-pressure differences of 2 cm H(2)O or more. Residual respiratory events, arousals, and measured leak were all greater with the oronasal mask. Seventy-nine percent of subjects preferred the nasal mask. Patients with obstructive sleep apnea syndrome can generally switch between nasal and oronasal masks without changing machine pressure, although there are individual differences that may be clinically significant. Measured leak is greater with the oronasal mask. Most patients with obstructive sleep apnea syndrome prefer a nasal mask as the interface for initiation of CPAP. Australian New Zealand Clinical Trials Registry (ANZCTR). ACTRN: ACTRN12611000243910. URL: http://www.ANZCTR.org.au/ACTRN12611000243910.aspx
Al Ajmi, Eiman; Forghani, Behzad; Reinhold, Caroline; Bayat, Maryam; Forghani, Reza
2018-06-01
There is a rich amount of quantitative information in spectral datasets generated from dual-energy CT (DECT). In this study, we compare the performance of texture analysis performed on multi-energy datasets to that of virtual monochromatic images (VMIs) at 65 keV only, using classification of the two most common benign parotid neoplasms as a testing paradigm. Forty-two patients with pathologically proven Warthin tumour (n = 25) or pleomorphic adenoma (n = 17) were evaluated. Texture analysis was performed on VMIs ranging from 40 to 140 keV in 5-keV increments (multi-energy analysis) or 65-keV VMIs only, which is typically considered equivalent to single-energy CT. Random forest (RF) models were constructed for outcome prediction using separate randomly selected training and testing sets or the entire patient set. Using multi-energy texture analysis, tumour classification in the independent testing set had accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of 92%, 86%, 100%, 100%, and 83%, compared to 75%, 57%, 100%, 100%, and 63%, respectively, for single-energy analysis. Multi-energy texture analysis demonstrates superior performance compared to single-energy texture analysis of VMIs at 65 keV for classification of benign parotid tumours. • We present and validate a paradigm for texture analysis of DECT scans. • Multi-energy dataset texture analysis is superior to single-energy dataset texture analysis. • DECT texture analysis has high accuracy for diagnosis of benign parotid tumours. • DECT texture analysis with machine learning can enhance non-invasive diagnostic tumour evaluation.
NASA Technical Reports Server (NTRS)
Rogers, David
1988-01-01
The advent of the Connection Machine profoundly changes the world of supercomputers. The highly nontraditional architecture makes possible the exploration of algorithms that were impractical for standard Von Neumann architectures. Sparse distributed memory (SDM) is an example of such an algorithm. Sparse distributed memory is a particularly simple and elegant formulation for an associative memory. The foundations for sparse distributed memory are described, and some simple examples of using the memory are presented. The relationship of sparse distributed memory to three important computational systems is shown: random-access memory, neural networks, and the cerebellum of the brain. Finally, the implementation of the algorithm for sparse distributed memory on the Connection Machine is discussed.
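A compact sketch of the sparse distributed memory read/write cycle, assuming Kanerva-style hard locations and a Hamming-radius activation rule (the parameters are illustrative; this is not the Connection Machine implementation described above):

```python
# Sketch: sparse distributed memory. A write updates the counters of all hard
# locations within Hamming radius R of the address; a read takes a majority
# vote over the same active set.
import numpy as np

rng = np.random.default_rng(0)
N, M, R = 256, 2000, 112                  # word length, hard locations, radius

hard_addresses = rng.integers(0, 2, size=(M, N))
counters = np.zeros((M, N), dtype=int)

def _active(address):
    dists = np.count_nonzero(hard_addresses != address, axis=1)
    return dists <= R                      # locations within Hamming radius R

def write(address, word):
    counters[_active(address)] += np.where(word == 1, 1, -1)

def read(address):
    sums = counters[_active(address)].sum(axis=0)
    return (sums > 0).astype(int)          # majority vote over active locations

pattern = rng.integers(0, 2, size=N)
write(pattern, pattern)                    # autoassociative store
noisy = pattern.copy(); noisy[:20] ^= 1
recovered = read(noisy)                    # recalls a cleaned-up pattern
```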
Seizure Forecasting and the Preictal State in Canine Epilepsy.
Varatharajah, Yogatheesan; Iyer, Ravishankar K; Berry, Brent M; Worrell, Gregory A; Brinkmann, Benjamin H
2017-02-01
The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state.
SEIZURE FORECASTING AND THE PREICTAL STATE IN CANINE EPILEPSY
Varatharajah, Yogatheesan; Iyer, Ravishankar K.; Berry, Brent M.; Worrell, Gregory A.; Brinkmann, Benjamin H.
2017-01-01
The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state. PMID:27464854
Machine learning with naturally labeled data for identifying abbreviation definitions.
Yeganova, Lana; Comeau, Donald C; Wilbur, W John
2011-06-09
The rapid growth of biomedical literature requires accurate text analysis and text processing tools. Detecting abbreviations and identifying their definitions is an important component of such tools. Most existing approaches for the abbreviation definition identification task employ rule-based methods. While achieving high precision, rule-based methods are limited to the rules defined and fail to capture many uncommon definition patterns. Supervised learning techniques, which offer more flexibility in detecting abbreviation definitions, have also been applied to the problem. However, they require manually labeled training data. In this work, we develop a machine learning algorithm for abbreviation definition identification in text which makes use of what we term naturally labeled data. Positive training examples are naturally occurring potential abbreviation-definition pairs in text. Negative training examples are generated by randomly mixing potential abbreviations with unrelated potential definitions. The machine learner is trained to distinguish between these two sets of examples. Then, the learned feature weights are used to identify the abbreviation full form. This approach does not require manually labeled training data. We evaluate the performance of our algorithm on the Ab3P, BIOADI and Medstract corpora. Our system demonstrated results that compare favourably to the existing Ab3P and BIOADI systems. We achieve an F-measure of 91.36% on Ab3P corpus, and an F-measure of 87.13% on BIOADI corpus which are superior to the results reported by Ab3P and BIOADI systems. Moreover, we outperform these systems in terms of recall, which is one of our goals.
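The 'naturally labeled data' construction can be sketched in a few lines: positives are candidate abbreviation-definition pairs as they occur in text, and negatives pair each abbreviation with a randomly drawn unrelated definition. All pairs below are illustrative stand-ins:

```python
# Sketch: building training data without manual labels by randomly mixing
# abbreviations with unrelated definitions to form the negative set.
import random

random.seed(0)

positives = [
    ("SVM", "support vector machine"),
    ("ANN", "artificial neural network"),
    ("PPI", "protein-protein interaction"),
]

def make_negatives(pairs):
    defs = [d for _, d in pairs]
    negatives = []
    for abbrev, definition in pairs:
        wrong = random.choice([d for d in defs if d != definition])
        negatives.append((abbrev, wrong))   # unrelated definition -> negative
    return negatives

training = [(p, 1) for p in positives] + [(n, 0) for n in make_negatives(positives)]
# A classifier trained to separate the two sets yields feature weights that
# can then score candidate definitions for a new abbreviation.
```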
Fluid Dynamics Appearing during Simulated Microgravity Using Random Positioning Machines
Stern, Philip; Casartelli, Ernesto; Egli, Marcel
2017-01-01
Random Positioning Machines (RPMs) are widely used as tools to simulate microgravity on ground. They consist of two gimbal-mounted frames, which constantly rotate biological samples around two perpendicular axes and thus distribute the Earth's gravity vector in all directions over time. In recent years, the RPM has become increasingly appreciated as a laboratory instrument also in non-space-related research. For instance, it can be applied for the formation of scaffold-free spheroid cell clusters. The kinematic rotation of the RPM, however, does not only distribute the gravity vector in such a way that it averages to zero; it also introduces local forces to the cell culture. These forces can be described by rigid body analysis. Although RPMs are commonly used in laboratories, the fluid motion in the cell culture flasks on the RPM, and its possible effects on cells, had not been examined until today; such aspects have therefore been widely neglected. In this study, we used a numerical approach to describe the fluid dynamic characteristics occurring inside a cell culture flask turning on an operating RPM. The simulations showed that the fluid motion within the cell culture flask never reached or even neared a steady-state condition. The fluid velocity depends on the rotational velocity of the RPM and is in the order of a few centimeters per second. The highest shear stresses are found along the flask walls; depending on the rotational velocity, they can reach up to a few hundred mPa. The shear stresses in the "bulk volume," however, are always smaller, and their magnitude is in the order of 10 mPa. In conclusion, RPMs are highly appreciated as reliable tools in microgravity research. They have even started to become useful instruments in new research fields of mechanobiology. Depending on the experiment, the fluid dynamics on the RPM cannot be neglected and need to be taken into consideration. The results presented in this study elucidate the fluid motion and provide insight into the convection and shear stresses that occur inside a cell culture flask during RPM experiments. PMID:28135286
High-speed ultrafast laser machining with tertiary beam positioning (Conference Presentation)
NASA Astrophysics Data System (ADS)
Yang, Chuan; Zhang, Haibin
2017-03-01
For an industrial laser application, high process throughput and low average cost of ownership are critical to commercial success. Benefiting from high peak power, nonlinear absorption and small achievable spot size, ultrafast lasers offer the advantages of minimal heat-affected zone, excellent taper and sidewall quality, and a small-via capability that exceeds the limits of their predecessors in via drilling for electronic packaging. In the past decade, ultrafast lasers have both grown in power and come down in cost. Recently, for example, disk and fiber technologies have both shown stable operation in the 50 W to 200 W range, mostly at high repetition rates (beyond 500 kHz) that help avoid detrimental nonlinear effects. However, effectively and efficiently scaling throughput with the fast-growing power capability of ultrafast lasers, while keeping the beneficial laser-material interactions, is very challenging, mainly because of the bottleneck imposed by inertia-related acceleration limits and servo gain bandwidth when only stages and galvanometers are used. On the other hand, inertia-free scanning solutions such as acousto-optic and electro-optic deflectors have small scan fields and are therefore not suitable for large-panel processing. Our recent system developments combine stages, galvanometers, and acousto-optic deflectors (AODs) into a coordinated tertiary architecture for high-bandwidth and, at the same time, large-field beam positioning. Synchronized three-level movements allow extremely fast local speed and continuous motion over the whole stage travel range. We present via drilling results from such an ultrafast system with up to 3 MHz pulse-to-pulse random access, enabling high-quality, low-cost ultrafast machining with emerging high-average-power laser sources.
Esophageal Cancer: Associations With (pN+) Lymph Node Metastases.
Rice, Thomas W; Ishwaran, Hemant; Hofstetter, Wayne L; Schipper, Paul H; Kesler, Kenneth A; Law, Simon; Lerut, E M R; Denlinger, Chadrick E; Salo, Jarmo A; Scott, Walter J; Watson, Thomas J; Allen, Mark S; Chen, Long-Qi; Rusch, Valerie W; Cerfolio, Robert J; Luketich, James D; Duranceau, Andre; Darling, Gail E; Pera, Manuel; Apperson-Hansen, Carolyn; Blackstone, Eugene H
2017-01-01
To identify the associations of lymph node metastases (pN+), number of positive nodes, and pN subclassification with cancer, treatment, patient, geographic, and institutional variables, and to recommend extent of lymphadenectomy needed to accurately detect pN+ for esophageal cancer. Limited data and traditional analytic techniques have precluded identifying intricate associations of pN+ with other cancer, treatment, and patient characteristics. Data on 5806 esophagectomy patients from the Worldwide Esophageal Cancer Collaboration were analyzed by Random Forest machine learning techniques. pN+, number of positive nodes, and pN subclassification were associated with increasing depth of cancer invasion (pT), increasing cancer length, decreasing cancer differentiation (G), and more regional lymph nodes resected. Lymphadenectomy necessary to accurately detect pN+ is 60 for shorter, well-differentiated cancers (<2.5 cm) and 20 for longer, poorly differentiated ones. In esophageal cancer, pN+, increasing number of positive nodes, and increasing pN classification are associated with deeper invading, longer, and poorly differentiated cancers. Consequently, if the goal of lymphadenectomy is to accurately define pN+ status of such cancers, few nodes need to be removed. Conversely, superficial, shorter, and well-differentiated cancers require a more extensive lymphadenectomy to accurately define pN+ status.
Hit Dexter: A Machine-Learning Model for the Prediction of Frequent Hitters.
Stork, Conrad; Wagner, Johannes; Friedrich, Nils-Ole; de Bruyn Kops, Christina; Šícho, Martin; Kirchmair, Johannes
2018-03-20
False-positive assay readouts caused by badly behaving compounds (frequent hitters, pan-assay interference compounds (PAINS), aggregators, and others) continue to pose a major challenge to experimental screening. There are only a few in silico methods that allow the prediction of such problematic compounds. We report the development of Hit Dexter, two extremely randomized trees classifiers for the prediction of compounds likely to trigger positive assay readouts either by true promiscuity or by assay interference. The models were trained on a well-prepared dataset extracted from the PubChem Bioassay database, consisting of approximately 311 000 compounds tested for activity on at least 50 proteins. Hit Dexter reached MCC and AUC values of up to 0.67 and 0.96 on an independent test set, respectively. The models are expected to be of high value, in particular to medicinal chemists and biochemists, who can use Hit Dexter to identify compounds for which extra caution should be exercised with positive assay readouts. Hit Dexter is available as a free web service at http://hitdexter.zbh.uni-hamburg.de. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Esophageal Cancer: Associations with pN+
Rice, Thomas W.; Ishwaran, Hemant; Hofstetter, Wayne L.; Schipper, Paul H.; Kesler, Kenneth A.; Law, Simon; Lerut, Toni E.M.R.; Denlinger, Chadrick E.; Salo, Jarmo A.; Scott, Walter J.; Watson, Thomas J.; Allen, Mark S.; Chen, Long-Qi; Rusch, Valerie W.; Cerfolio, Robert J.; Luketich, James D.; Duranceau, Andre; Darling, Gail E.; Pera, Manuel; Apperson-Hansen, Carolyn; Blackstone, Eugene H.
2017-01-01
Objectives 1) To identify the association of positive lymph node metastases (pN+), number of positive nodes, and pN subclassification with cancer, treatment, patient, geographic, and institutional variables, and 2) to recommend extent of lymphadenectomy needed to accurately detect pN+ for esophageal cancer. Summary Background Data Limited data and traditional analytic techniques have precluded identifying intricate associations of pN+ with other cancer, treatment, and patient characteristics. Methods Data on 5,806 esophagectomy patients from the Worldwide Esophageal Cancer Collaboration (WECC) were analyzed by Random Forest machine learning techniques. Results pN+, number of positive nodes, and pN subclassification were associated with increasing depth of cancer invasion (pT), increasing cancer length, decreasing cancer differentiation (G), and more regional lymph nodes resected. Lymphadenectomy necessary to accurately detect pN+ is 60 for shorter, well-differentiated cancers (<2.5 cm) and 20 for longer, poorly differentiated ones. Conclusions In esophageal cancer, pN+, increasing number of positive nodes, and increasing pN classification are associated with deeper invading, longer, and poorly differentiated cancers. Consequently, if the goal of lymphadenectomy is to accurately define pN+ status of such cancers, few nodes need to be removed. Conversely, superficial, shorter, and well-differentiated cancers require a more extensive lymphadenectomy to accurately define pN+ status. PMID:28009736
Computer Aided Process Planning of Machined Metal Parts
1984-09-01
...the manufacturer to accentuate the positive to assist marketing. Machine usage costs and facility loadings are frequently critical. For example... Variant systems currently on the market include Multiplan (TM of OIR, Inc.), CY-Miplan (TM of Computervision), PICAPP (TM of PICAPP, Inc.) and CSD... Multiproduct, Multistage Manufacturing Systems, Journal of Engineering for Industry, ASME, August 1977. Hitomi, K. and I. Ham, Product Mix and Machine Loading...
Ahmad-Sabry, Mohammad H I
2015-04-01
Over 6 weeks, we had 4 incidents of echocardiography machine malfunction: 3 in the operating room, where machines were damaged by intravenous (IV) fluid spilling over the keyboard and burning out its electrical connections, and 1 in the cardiology department, where a machine was damaged by coffee spilled on it. The malfunctions had an economic impact on the hospital (about $20,000), in addition to the unavailability of the ultrasound (US) machine for the cardiac patient after the incident until the end of the case, and for subsequent cases until the machine was repaired. We analyzed the incidents using a simplified approach. The first incident happened when changing an empty IV fluid bag for a full one led to spillage of fluid onto the keyboard; the second was due to the use of a needle to depressurize a medication bottle for a continuous IV drip; and the third was due to disconnection of the IV set from the bottle during transfer of the patient from the operating room to the intensive care unit. The fundamental problem is, of course, that fluid is harmful to the US machine. In addition, the machines sit between the patient bed and the anesthesia machine, with IV poles on each side of the patient bed, which makes the machine vulnerable to fluid spillage. We first considered a machine modification to create a protective cover, but this was hindered by the complexity of the US machine's keyboard, technical and financial challenges, and the time it would take to achieve. Second, we created a protocol: position the machine away from any IV poles, and move it out of the room whenever transferring the patient would expose it to IV fluid. Third, we worked to change behavior: we announced the protocol at our anesthesia conference to make it known to everyone, and we taught residents, fellows, and staff the new protocol. Our simplified approach was effective in preventing fluid spillage over the US machine.
Machine Learning Methods for Production Cases Analysis
NASA Astrophysics Data System (ADS)
Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.
2018-03-01
An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of the internal production network data were used for training and testing the applied models. k-Nearest Neighbors and Random Forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics such as precision, recall and accuracy.
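A minimal sketch of the described comparison, using synthetic stand-in data in place of the internal production network descriptors:

```python
# Sketch: compare k-Nearest Neighbors and Random Forest classifiers and
# report precision, recall, and accuracy on a held-out test split.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=1000, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("Random Forest", RandomForestClassifier(random_state=0))]:
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(name,
          "precision=%.3f" % precision_score(y_te, pred),
          "recall=%.3f" % recall_score(y_te, pred),
          "accuracy=%.3f" % accuracy_score(y_te, pred))
```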
Scheduling Jobs with Variable Job Processing Times on Unrelated Parallel Machines
Zhang, Guang-Qian; Wang, Jian-Jun; Liu, Ya-Jing
2014-01-01
Scheduling problems on m unrelated parallel machines with variable job processing times are considered, where the processing time of a job is a function of its position in a sequence, its starting time, and its resource allocation. The objective is to determine the optimal resource allocation and the optimal schedule to minimize a total cost function that depends on the total completion (waiting) time, the total machine load, the total absolute differences in completion (waiting) times on all machines, and the total resource cost. If the number of machines is a given constant, we propose a polynomial time algorithm to solve the problem. PMID:24982933
2015-01-01
Color is one of the most prominent features of an image and is used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a color space that performs well for skin and face classification, and that can address issues like illumination variations, varying camera characteristics and diversity in skin color tones, has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN, constructed by employing a Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color over seventeen existing color spaces. The Genetic Algorithm heuristic is used to find the optimal color component combination in terms of skin detection accuracy, while Principal Component Analysis projects the optimal Genetic Algorithm solution onto a lower-dimensional space. Pixel-wise skin detection was used to evaluate the performance of the proposed color space. We employed four classifiers, Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron, to generate the human skin color predictive model. The proposed color space was compared with existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that, using the Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and a False Positive Rate of 0.0482, outperforming the existing color spaces in terms of pixel-wise skin detection accuracy. The results also indicate that, among the classifiers used in this study, Random Forest is the most suitable classifier for pixel-wise skin detection applications. PMID:26267377
Design and fabrication of a freeform phase plate for high-order ocular aberration correction
NASA Astrophysics Data System (ADS)
Yi, Allen Y.; Raasch, Thomas W.
2005-11-01
In recent years it has become possible to measure, and in some instances to correct, the high-order aberrations of human eyes. We have investigated the correction of the wavefront error of human eyes using phase plates designed to compensate for that error. The wavefront aberrations of the four eyes of two subjects were experimentally determined, and compensating phase plates were machined with an ultraprecision diamond-turning machine equipped with four independent axes. A slow-tool-servo freeform trajectory was developed for the machine tool path. The machined phase-correction plates were measured and compared with the original design values to validate the process. The position of the phase plate relative to the pupil is discussed. The practical utility of this mode of aberration correction was investigated with visual acuity testing. The results are consistent with the potential benefit of aberration correction but also underscore the critical positioning requirements of this mode of aberration correction.
[Effect of manual cleaning and machine cleaning for dental handpiece].
Zhou, Xiaoli; Huang, Hao; He, Xiaoyan; Chen, Hui; Zhou, Xiaoying
2013-08-01
This study compared the cleaning effect on dental handpieces of manual cleaning versus machine cleaning. Eighty identically contaminated dental handpieces were randomly divided into an experimental group and a control group of 40 pieces each. The experimental group was treated in a fully automatic washing machine, and the control group was cleaned manually. The cleaning was conducted according to the standard operating process, and ATP bioluminescence was then used to test the cleaning results. Average relative light units (RLU) by ATP bioluminescence detection were 9 for the experimental group and 41 for the control group; both were below the RLU value recommended by the instrument manufacturer (RLU < or = 45). There was a significant difference between the two groups (P < 0.05), with the cleaning quality of the experimental group better than that of the control group. It is recommended that the central sterile supply department clean dental handpieces by machine to ensure the cleaning effect and maintain quality.
Learning molecular energies using localized graph kernels.
Ferré, Grégoire; Haut, Terry; Barros, Kipton
2017-03-21
Recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. We benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.
Learning molecular energies using localized graph kernels
NASA Astrophysics Data System (ADS)
Ferré, Grégoire; Haut, Terry; Barros, Kipton
2017-03-01
Recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. We benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.
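A small numeric sketch of a geometric random-walk graph kernel of the kind described above (an illustration, not the GRAPE code): the similarity of two adjacency matrices is computed on their direct-product (Kronecker) graph, summing over walks of all lengths.

```python
# Sketch: geometric random-walk kernel K = 1^T (I - lam * A_x)^(-1) 1 on the
# Kronecker-product graph of two weighted adjacency matrices.
import numpy as np

def random_walk_kernel(A1, A2, lam=0.05):
    Ax = np.kron(A1, A2)                 # adjacency of the direct-product graph
    n = Ax.shape[0]
    # The geometric series over walk lengths converges when lam is smaller
    # than the reciprocal of the spectral radius of Ax.
    inv = np.linalg.inv(np.eye(n) - lam * Ax)
    return inv.sum()

# Two toy 'local atomic environments' as weighted adjacency matrices.
A1 = np.array([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
A2 = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
print(random_walk_kernel(A1, A1), random_walk_kernel(A1, A2))
```

The self-similarity `K(A1, A1)` exceeds the cross-similarity `K(A1, A2)`, which is the property a kernel-based energy model exploits.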
Machine learning methods in chemoinformatics
Mitchell, John B O
2014-01-01
Machine learning algorithms are generally developed in computer science or adjacent disciplines and find their way into chemical modeling by a process of diffusion. Though particular machine learning methods are popular in chemoinformatics and quantitative structure–activity relationships (QSAR), many others exist in the technical literature. This discussion is methods-based and focused on some algorithms that chemoinformatics researchers frequently use. It makes no claim to be exhaustive. We concentrate on methods for supervised learning, predicting the unknown property values of a test set of instances, usually molecules, based on the known values for a training set. Particularly relevant approaches include Artificial Neural Networks, Random Forest, Support Vector Machine, k-Nearest Neighbors and naïve Bayes classifiers. How to cite this article: WIREs Comput Mol Sci 2014, 4:468–481. doi:10.1002/wcms.1183 PMID:25285160
Machine printed text and handwriting identification in noisy document images.
Zheng, Yefeng; Li, Huiping; Doermann, David
2004-03-01
In this paper, we address the problem of identifying text in noisy document images. We focus especially on segmenting and distinguishing between handwriting and machine printed text because: 1) handwriting in a document often indicates corrections, additions, or other supplemental information that should be treated differently from the main content; and 2) the segmentation and recognition techniques required for machine printed and handwritten text are significantly different. A novel aspect of our approach is that we treat noise as a separate class and model noise based on selected features. Trained Fisher classifiers are used to identify machine printed text and handwriting from noise, and we further exploit context to refine the classification. A Markov Random Field (MRF) based approach is used to model the geometrical structure of the printed text, handwriting, and noise to rectify misclassifications. Experimental results show that our approach is robust and can significantly improve page segmentation in noisy document collections.
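The MRF-based rectification can be sketched with iterated conditional modes, trading per-block classifier evidence against agreement with the 4-neighbourhood; the smoothing weight and grid below are illustrative assumptions, not the paper's exact model:

```python
# Sketch: refine per-block class scores (print / handwriting / noise) with
# iterated conditional modes over a Markov Random Field prior.
import numpy as np

def icm_refine(unary, beta=0.8, iters=5):
    """unary: (H, W, K) classifier log-scores per block; returns (H, W) labels."""
    labels = unary.argmax(axis=2)
    H, W, K = unary.shape
    for _ in range(iters):
        for i in range(H):
            for j in range(W):
                best, best_cost = labels[i, j], -np.inf
                for k in range(K):
                    # Count 4-neighbours that already carry label k.
                    agree = sum(labels[i + di, j + dj] == k
                                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                                if 0 <= i + di < H and 0 <= j + dj < W)
                    cost = unary[i, j, k] + beta * agree
                    if cost > best_cost:
                        best, best_cost = k, cost
                labels[i, j] = best
    return labels

rng = np.random.default_rng(0)
labels = icm_refine(rng.normal(size=(8, 8, 3)))   # 3 classes on a toy 8x8 grid
```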
A Comparison of Machine Learning Approaches for Corn Yield Estimation
NASA Astrophysics Data System (ADS)
Kim, N.; Lee, Y. W.
2017-12-01
Machine learning is an efficient empirical method for classification and prediction, and it offers another approach to crop yield estimation. The objective of this study is to estimate corn yield in the Midwestern United States using machine learning approaches, namely the support vector machine (SVM), random forest (RF), and deep neural networks (DNN), and to perform a comprehensive comparison of their results. We constructed the database using satellite images from MODIS, climate data from the PRISM climate group, and GLDAS soil moisture data. In addition, to examine the seasonal sensitivities of corn yields, two period groups were set up: May to September (MJJAS) and July and August (JA). Overall, the DNN showed the highest accuracies in terms of the correlation coefficient for both period groups. The differences between our predictions and USDA yield statistics were about 10-11%.
Bahl, Manisha; Barzilay, Regina; Yedidia, Adam B; Locascio, Nicholas J; Yu, Lili; Lehman, Constance D
2018-03-01
Purpose To develop a machine learning model that allows high-risk breast lesions (HRLs) diagnosed with image-guided needle biopsy that require surgical excision to be distinguished from HRLs that are at low risk for upgrade to cancer at surgery and thus could be surveilled. Materials and Methods Consecutive patients with biopsy-proven HRLs who underwent surgery or at least 2 years of imaging follow-up from June 2006 to April 2015 were identified. A random forest machine learning model was developed to identify HRLs at low risk for upgrade to cancer. Traditional features such as age and HRL histologic results were used in the model, as were text features from the biopsy pathologic report. Results One thousand six HRLs were identified, with a cancer upgrade rate of 11.4% (115 of 1006). A machine learning random forest model was developed with 671 HRLs and tested with an independent set of 335 HRLs. Among the most important traditional features were age and HRL histologic results (eg, atypical ductal hyperplasia). An important text feature from the pathologic reports was "severely atypical." Instead of surgical excision of all HRLs, if those categorized with the model to be at low risk for upgrade were surveilled and the remainder were excised, then 97.4% (37 of 38) of malignancies would have been diagnosed at surgery, and 30.6% (91 of 297) of surgeries of benign lesions could have been avoided. Conclusion This study provides proof of concept that a machine learning model can be applied to predict the risk of upgrade of HRLs to cancer. Use of this model could decrease unnecessary surgery by nearly one-third and could help guide clinical decision making with regard to surveillance versus surgical excision of HRLs. © RSNA, 2017.
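A hedged sketch of combining free-text features from pathologic reports with traditional features before a random forest, using invented toy records rather than the study's data or exact pipeline:

```python
# Sketch: TF-IDF text features from pathology reports concatenated with a
# traditional feature (age), feeding a random forest that scores upgrade risk.
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer

reports = ["atypical ductal hyperplasia, severely atypical",
           "lobular carcinoma in situ, no atypia",
           "papilloma without atypia",
           "severely atypical ductal cells"]
age = np.array([[55.], [48.], [62.], [59.]])       # a 'traditional' feature
upgraded = np.array([1, 0, 0, 1])                  # upgraded to cancer at surgery

text_features = TfidfVectorizer().fit_transform(reports)
X = hstack([text_features, csr_matrix(age)])       # text + traditional features

rf = RandomForestClassifier(random_state=0).fit(X, upgraded)
risk = rf.predict_proba(X)[:, 1]                   # low risk -> candidate for
                                                   # surveillance instead of surgery
```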
Lei, Tailong; Sun, Huiyong; Kang, Yu; Zhu, Feng; Liu, Hui; Zhou, Wenfang; Wang, Zhe; Li, Dan; Li, Youyong; Hou, Tingjun
2017-11-06
Xenobiotic chemicals and their metabolites are mainly excreted from our bodies by the urinary tract through the urine. Chemical-induced urinary tract toxicity is one of the main causes of failure during drug development, and it is a common adverse event for medications, natural supplements, and environmental chemicals. Despite its importance, there are only a few in silico models for assessing urinary tract toxicity for large numbers of compounds with diverse chemical structures. Here, we developed a series of qualitative and quantitative structure-activity relationship (QSAR) models for predicting urinary tract toxicity. In our study, the recursive feature elimination method incorporated with random forests (RFE-RF) was used for dimension reduction, and eight machine learning approaches were then used for QSAR modeling, i.e., relevance vector machine (RVM), support vector machine (SVM), regularized random forest (RRF), C5.0 trees, eXtreme gradient boosting (XGBoost), AdaBoost.M1, SVM boosting (SVMBoost), and RVM boosting (RVMBoost). For building classification models, the synthetic minority oversampling technique was used to handle the imbalanced dataset problem. Among all the machine learning approaches, SVMBoost based on the RBF kernel achieved both the best quantitative (q_ext^2 = 0.845) and qualitative predictions for the test set (MCC of 0.787, AUC of 0.893, sensitivity of 89.6%, specificity of 94.1%, and global accuracy of 90.8%). The application domains were then analyzed, and all of the tested chemicals fall within the application domain coverage. We also examined the structural features of the chemicals with large prediction errors. In brief, both the regression and classification models developed with the SVMBoost approach have reliable prediction capability for assessing chemical-induced urinary tract toxicity.
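The RFE-RF dimension-reduction step named above can be sketched with scikit-learn's RFE wrapped around a random forest (synthetic data; the paper additionally applies SMOTE and the boosted learners listed):

```python
# Sketch: recursive feature elimination driven by random-forest importances,
# reducing a 50-descriptor set to the 10 most informative features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

X, y = make_classification(n_samples=300, n_features=50, n_informative=8,
                           random_state=0)
selector = RFE(estimator=RandomForestClassifier(n_estimators=200, random_state=0),
               n_features_to_select=10, step=5).fit(X, y)
X_reduced = X[:, selector.support_]     # descriptors kept for QSAR modeling
```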
Bitter or not? BitterPredict, a tool for predicting taste from chemical structure.
Dagan-Wiener, Ayana; Nissim, Ido; Ben Abu, Natalie; Borgonovo, Gigliola; Bassoli, Angela; Niv, Masha Y
2017-09-21
Bitter taste is an innately aversive taste modality that is considered to protect animals from consuming toxic compounds. Yet bitterness is not always noxious, and some bitter compounds have beneficial effects on health. Hundreds of bitter compounds have been reported (and are accessible via the BitterDB, http://bitterdb.agri.huji.ac.il/dbbitter.php), but numerous additional bitter molecules are still unknown. The dramatic chemical diversity of bitterants makes bitterness prediction a difficult task. Here we present a machine learning classifier, BitterPredict, which predicts whether a compound is bitter or not based on its chemical structure. BitterDB was used as the positive set, and non-bitter molecules were gathered from the literature to create the negative set. Adaptive Boosting (AdaBoost), a decision-tree-based machine learning algorithm, was applied to molecules represented by physicochemical and ADME/Tox descriptors. BitterPredict correctly classifies over 80% of the compounds in the hold-out test set, and 70-90% of the compounds in three independent external sets and in sensory test validation, providing a quick and reliable tool for classifying large sets of compounds into bitter and non-bitter groups. BitterPredict suggests that about 40% of random molecules, and a large portion of clinical and experimental drugs (66%) and of natural products (77%), are bitter.
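An illustrative AdaBoost-on-decision-trees setup of the kind named above, on synthetic descriptors (the `estimator` keyword follows recent scikit-learn versions; BitterPredict's real inputs are physicochemical and ADME/Tox descriptors):

```python
# Sketch: AdaBoost over shallow decision trees for a bitter / non-bitter
# binary classification on stand-in descriptor vectors.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=30, random_state=0)
clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=3),
                         n_estimators=200, random_state=0).fit(X, y)
is_bitter = clf.predict(X[:5])          # predicted labels for five compounds
```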
MRI texture features as biomarkers to predict MGMT methylation status in glioblastomas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korfiatis, Panagiotis; Kline, Timothy L.; Erickson, Bradley J., E-mail: bje@mayo.edu
Purpose: Imaging biomarker research focuses on discovering relationships between radiological features and histological findings. In glioblastoma patients, methylation of the O6-methylguanine methyltransferase (MGMT) gene promoter is positively correlated with an increased effectiveness of the current standard of care. In this paper, the authors investigate texture features as potential imaging biomarkers for capturing the MGMT methylation status of glioblastoma multiforme (GBM) tumors when combined with supervised classification schemes. Methods: A retrospective study of 155 GBM patients with known MGMT methylation status was conducted. Co-occurrence and run-length texture features were calculated, and both support vector machines (SVMs) and random forest classifiers were used to predict MGMT methylation status. Results: The best classification system (an SVM-based classifier) had a maximum area under the receiver-operating characteristic (ROC) curve of 0.85 (95% CI: 0.78–0.91) using four texture features (correlation, energy, entropy, and local intensity) originating from the T2-weighted images, yielding, at the optimal threshold of the ROC curve, a sensitivity of 0.803 and a specificity of 0.813. Conclusions: Results show that supervised machine learning of MRI texture features can predict MGMT methylation status in preoperative GBM tumors, thus providing a new noninvasive imaging biomarker.
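Co-occurrence texture features like those used above can be computed with scikit-image; this is a generic sketch on a stand-in ROI, not the authors' pipeline (in older scikit-image releases the functions are spelled greycomatrix/greycoprops):

```python
# Sketch: grey-level co-occurrence matrix (GLCM) features -- energy,
# correlation, and entropy -- for a small image region.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
roi = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)  # stand-in tumour ROI

glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                    levels=64, symmetric=True, normed=True)
energy = graycoprops(glcm, "energy").mean()
correlation = graycoprops(glcm, "correlation").mean()
# Entropy is not a built-in property, so compute it from the normalized GLCM.
entropy = -np.sum(glcm * np.log2(glcm + 1e-12), axis=(0, 1)).mean()
# These per-ROI features then feed the SVM / random-forest classifiers.
```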
NASA Astrophysics Data System (ADS)
Huttunen, Jani; Kokkola, Harri; Mielonen, Tero; Esa Juhani Mononen, Mika; Lipponen, Antti; Reunanen, Juha; Vilhelm Lindfors, Anders; Mikkonen, Santtu; Erkki Juhani Lehtinen, Kari; Kouremeti, Natalia; Bais, Alkiviadis; Niska, Harri; Arola, Antti
2016-07-01
In order to have a good estimate of the current forcing by anthropogenic aerosols, knowledge on past aerosol levels is needed. Aerosol optical depth (AOD) is a good measure for aerosol loading. However, dedicated measurements of AOD are only available from the 1990s onward. One option to lengthen the AOD time series beyond the 1990s is to retrieve AOD from surface solar radiation (SSR) measurements taken with pyranometers. In this work, we have evaluated several inversion methods designed for this task. We compared a look-up table method based on radiative transfer modelling, a non-linear regression method and four machine learning methods (Gaussian process, neural network, random forest and support vector machine) with AOD observations carried out with a sun photometer at an Aerosol Robotic Network (AERONET) site in Thessaloniki, Greece. Our results show that most of the machine learning methods produce AOD estimates comparable to the look-up table and non-linear regression methods. All of the applied methods produced AOD values that corresponded well to the AERONET observations with the lowest correlation coefficient value being 0.87 for the random forest method. While many of the methods tended to slightly overestimate low AODs and underestimate high AODs, neural network and support vector machine showed overall better correspondence for the whole AOD range. The differences in producing both ends of the AOD range seem to be caused by differences in the aerosol composition. High AODs were in most cases those with high water vapour content which might affect the aerosol single scattering albedo (SSA) through uptake of water into aerosols. Our study indicates that machine learning methods benefit from the fact that they do not constrain the aerosol SSA in the retrieval, whereas the LUT method assumes a constant value for it. This would also mean that machine learning methods could have potential in reproducing AOD from SSR even though SSA would have changed during the observation period.
SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeMarco, J; McCloskey, S; Low, D
Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor units or control points. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and providing a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site, and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001 mm and 0.066±0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at the time of image acquisition and during treatment.
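The DICOM-RT side of such a cross-comparison can be sketched with pydicom; the trajectory-log parser itself is Varian-specific binary and is assumed to exist elsewhere, and the file name below is hypothetical:

```python
# Sketch: extract planned MLC positions per control point from a DICOM-RT
# plan, for comparison against logged actual positions via an RMS error.
import numpy as np
import pydicom

plan = pydicom.dcmread("rtplan.dcm")               # hypothetical file name
planned = []
for beam in plan.BeamSequence:
    for cp in beam.ControlPointSequence:
        for dev in getattr(cp, "BeamLimitingDevicePositionSequence", []):
            if dev.RTBeamLimitingDeviceType.startswith("MLC"):
                planned.append(np.asarray(dev.LeafJawPositions, dtype=float))

def rms_error(planned_cp, actual_cp):
    """'actual_cp' would come from the parsed trajectory log at the
    matching control point."""
    return float(np.sqrt(np.mean((planned_cp - actual_cp) ** 2)))
```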
2016-01-01
Background As more and more researchers are turning to big data for new opportunities of biomedical discoveries, machine learning models, as the backbone of big data analysis, are mentioned more often in biomedical journals. However, owing to the inherent complexity of machine learning methods, they are prone to misuse. Because of the flexibility in specifying machine learning models, the results are often insufficiently reported in research articles, hindering reliable assessment of model validity and consistent interpretation of model outputs. Objective To attain a set of guidelines on the use of machine learning predictive models within clinical settings to make sure the models are correctly applied and sufficiently reported so that true discoveries can be distinguished from random coincidence. Methods A multidisciplinary panel of machine learning experts, clinicians, and traditional statisticians were interviewed, using an iterative process in accordance with the Delphi method. Results The process produced a set of guidelines that consists of (1) a list of reporting items to be included in a research article and (2) a set of practical sequential steps for developing predictive models. Conclusions A set of guidelines was generated to enable correct application of machine learning models and consistent reporting of model specifications and results in biomedical research. We believe that such guidelines will accelerate the adoption of big data analysis, particularly with machine learning methods, in the biomedical research community. PMID:27986644
Prediction of antiepileptic drug treatment outcomes using machine learning.
Colic, Sinisa; Wither, Robert G; Lang, Min; Zhang, Liang; Eubanks, James H; Bardakjian, Berj L
2017-02-01
Antiepileptic drug (AED) treatments produce inconsistent outcomes, often necessitating patients to go through several drug trials until a successful treatment can be found. This study proposes the use of machine learning techniques to predict epilepsy treatment outcomes of commonly used AEDs. Machine learning algorithms were trained and evaluated using features obtained from intracranial electroencephalogram (iEEG) recordings of the epileptiform discharges observed in a Mecp2-deficient mouse model of Rett syndrome. Previous work has linked the presence of cross-frequency coupling (I_CFC) of the delta (2-5 Hz) rhythm with the fast ripple (400-600 Hz) rhythm in epileptiform discharges. Using I_CFC to label post-treatment outcomes, we compared support vector machines (SVMs) and random forest (RF) machine learning classifiers for providing likelihood scores of successful treatment outcomes. (a) There was heterogeneity in AED treatment outcomes, (b) machine learning techniques could be used to rank the efficacy of AEDs by estimating likelihood scores for successful treatment outcome, (c) I_CFC features yielded the most effective a priori identification of appropriate AED treatment, and (d) both classifiers performed comparably. Machine learning approaches yielded predictions of successful drug treatment outcomes which in turn could reduce the burdens of drug trials and lead to substantial improvements in patient quality of life.
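A minimal sketch of the classifier comparison, with random stand-ins for the iEEG-derived I_CFC features: both models are scored by cross-validated accuracy and queried for likelihood scores via predict_proba.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Hedged sketch: SVM vs random forest likelihood scores for treatment
# success. X stands in for I_CFC features extracted from iEEG epileptiform
# discharges; y labels successful (1) vs unsuccessful (0) outcomes.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

svm = SVC(probability=True, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0)

for name, clf in [("SVM", svm), ("random forest", rf)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    proba = clf.fit(X, y).predict_proba(X[:3])[:, 1]   # likelihood scores
    print(f"{name}: CV accuracy {acc:.2f}, sample likelihoods {np.round(proba, 2)}")
```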
A comparison of free weight squat to Smith machine squat using electromyography.
Schwanbeck, Shane; Chilibeck, Philip D; Binsted, Gordon
2009-12-01
The purpose of this experiment was to determine whether free weight or Smith machine squats were optimal for activating the prime movers of the legs and the stabilizers of the legs and the trunk. Six healthy participants performed 1 set of 8 repetitions (using a weight they could lift 8 times, i.e., 8RM, or 8 repetition maximum) for each of the free weight squat and Smith machine squat in a randomized order with a minimum of 3 days between sessions, while electromyographic (EMG) activity of the tibialis anterior, gastrocnemius, vastus medialis, vastus lateralis, biceps femoris, lumbar erector spinae, and rectus abdominis was simultaneously measured. Electromyographic activity was significantly higher by 34%, 26%, and 49% in the gastrocnemius, biceps femoris, and vastus medialis, respectively, during the free weight squat compared to the Smith machine squat (p < 0.05). There were no significant differences between free weight and Smith machine squat for any of the other muscles; however, the EMG averaged over all muscles during the free weight squat was 43% higher when compared to the Smith machine squat (p < 0.05). The free weight squat may be more beneficial than the Smith machine squat for individuals who are looking to strengthen plantar flexors, knee flexors, and knee extensors.
Prediction of antiepileptic drug treatment outcomes using machine learning
NASA Astrophysics Data System (ADS)
Colic, Sinisa; Wither, Robert G.; Lang, Min; Zhang, Liang; Eubanks, James H.; Bardakjian, Berj L.
2017-02-01
Objective. Antiepileptic drug (AED) treatments produce inconsistent outcomes, often necessitating patients to go through several drug trials until a successful treatment can be found. This study proposes the use of machine learning techniques to predict epilepsy treatment outcomes of commonly used AEDs. Approach. Machine learning algorithms were trained and evaluated using features obtained from intracranial electroencephalogram (iEEG) recordings of the epileptiform discharges observed in a Mecp2-deficient mouse model of Rett syndrome. Previous work has linked the presence of cross-frequency coupling (I_CFC) of the delta (2-5 Hz) rhythm with the fast ripple (400-600 Hz) rhythm in epileptiform discharges. Using I_CFC to label post-treatment outcomes, we compared support vector machines (SVMs) and random forest (RF) machine learning classifiers for providing likelihood scores of successful treatment outcomes. Main results. (a) There was heterogeneity in AED treatment outcomes, (b) machine learning techniques could be used to rank the efficacy of AEDs by estimating likelihood scores for successful treatment outcome, (c) I_CFC features yielded the most effective a priori identification of appropriate AED treatment, and (d) both classifiers performed comparably. Significance. Machine learning approaches yielded predictions of successful drug treatment outcomes which in turn could reduce the burdens of drug trials and lead to substantial improvements in patient quality of life.
Lenhard, Fabian; Sauer, Sebastian; Andersson, Erik; Månsson, Kristoffer Nt; Mataix-Cols, David; Rück, Christian; Serlachius, Eva
2018-03-01
There are no consistent predictors of treatment outcome in paediatric obsessive-compulsive disorder (OCD). One reason for this might be the use of suboptimal statistical methodology. Machine learning is an approach to efficiently analyse complex data. Machine learning has been widely used within other fields, but has rarely been tested in the prediction of paediatric mental health treatment outcomes. The aim was to test four different machine learning methods in the prediction of treatment response in a sample of paediatric OCD patients who had received Internet-delivered cognitive behaviour therapy (ICBT). Participants were 61 adolescents (12-17 years) who enrolled in a randomized controlled trial and received ICBT. All clinical baseline variables were used to predict strictly defined treatment response status three months after ICBT. Four machine learning algorithms were implemented. For comparison, we also employed a traditional logistic regression approach. Multivariate logistic regression could not detect any significant predictors. In contrast, all four machine learning algorithms performed well in the prediction of treatment response, with 75 to 83% accuracy. The results suggest that machine learning algorithms can successfully be applied to predict paediatric OCD treatment outcome. Validation studies and studies in other disorders are warranted. Copyright © 2017 John Wiley & Sons, Ltd.
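A minimal sketch of the comparison described above: traditional logistic regression against several off-the-shelf classifiers on the same baseline features, scored by cross-validated accuracy. The paper's exact four algorithms are not named in the abstract, so common stand-ins are used, and the data are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Hedged sketch: compare logistic regression with four ML algorithms on
# clinical baseline features. X and y are random stand-ins for the
# 61-patient dataset; the algorithm list is an assumption.

rng = np.random.default_rng(0)
X = rng.normal(size=(61, 12))        # baseline clinical variables
y = rng.integers(0, 2, size=61)      # treatment response status

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(random_state=0),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "SVM": SVC(),
    "k-NN": KNeighborsClassifier(),
}
for name, clf in models.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: CV accuracy {acc:.2f}")
```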
Effective Dust Control Systems on Concrete Dowel Drilling Machinery
Echt, Alan S.; Sanderson, Wayne T.; Mead, Kenneth R.; Feng, H. Amy; Farwick, Daniel R.; Farwick, Dawn Ramsey
2016-01-01
Rotary-type percussion dowel drilling machines, which drill horizontal holes in concrete pavement, have been documented to produce respirable crystalline silica concentrations above recommended exposure criteria. This places operators at potential risk of developing health effects from exposure. United States manufacturers of these machines offer optional dust control systems. The effectiveness of the dust control systems in reducing respirable dust concentrations on two types of drilling machines was evaluated under controlled conditions, with the machines operating inside large tent structures in an effort to eliminate secondary exposure sources not related to the dowel-drilling operation. Area air samples were collected at breathing zone height at three locations around each machine. Across equal numbers of sampling rounds, with the control systems randomly set on or off, the control systems were found to significantly reduce respirable dust concentrations from a geometric mean of 54 mg/m³ to 3.0 mg/m³ on one machine and from 57 mg/m³ to 5.3 mg/m³ on the other. This research shows that the dust control systems can dramatically reduce respirable dust concentrations by over 90% under controlled conditions. However, these systems need to be evaluated under actual work conditions to determine their effectiveness in reducing worker exposures to crystalline silica below hazardous levels. PMID:27074062
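As a quick check of the ">90%" claim, the relative reductions implied by the reported geometric means are:

```latex
1 - \frac{3.0}{54} \approx 0.944 \;(94\%), \qquad
1 - \frac{5.3}{57} \approx 0.907 \;(91\%)
```

Both machines therefore exceed a 90% reduction, consistent with the stated figure.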
Underestimating extreme events in power-law behavior due to machine-dependent cutoffs
NASA Astrophysics Data System (ADS)
Radicchi, Filippo
2014-11-01
Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
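A minimal sketch of the cutoff mechanism, under standard inverse-transform sampling of a Pareto law p(x) ∝ x^(−α): a double-precision uniform variate cannot come closer to 1 than about 2⁻⁵³, so the sampled tail is truncated at a deterministic, machine-imposed maximum.

```python
import numpy as np

# Hedged sketch: inverse-transform sampling x = xmin * (1 - u)^(-1/(alpha - 1))
# of a Pareto distribution. In double precision, 1 - u is bounded below by
# roughly 2**-53, so the largest variate the method can ever produce is a
# finite, deterministic value: the machine-dependent cutoff the paper analyzes.

alpha, xmin = 2.5, 1.0
exponent = -1.0 / (alpha - 1.0)

rng = np.random.default_rng(0)
u = rng.random(10_000_000)                     # u in [0, 1)
samples = xmin * (1.0 - u) ** exponent

hard_cap = xmin * (2.0 ** -53) ** exponent     # largest reachable value
print(f"observed max: {samples.max():.3e}")
print(f"machine-imposed cap: {hard_cap:.3e}")  # no sample can ever exceed this
```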
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prowell, Stacy J; Symons, Christopher T
2015-01-01
Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.
Hall effect sensors embedded within two-pole toothless stator assembly
NASA Technical Reports Server (NTRS)
Denk, Joseph (Inventor); Grant, Richard J. (Inventor)
1994-01-01
A two-pole toothless PM machine employs Hall effect sensors to indicate the position of the machine's rotor relative to power windings in the machine's stator. The Hall effect sensors are located in the main magnetic air gap underneath the power windings. The main magnetic air gap is defined by an outer magnetic surface of the rotor and an inner surface of the stator's flux collector ring.
Microcompartments and Protein Machines in Prokaryotes
Saier, Milton H.
2013-01-01
The prokaryotic cell was once thought of as a “bag of enzymes” with little or no intracellular compartmentalization. In this view, most reactions essential for life occurred as a consequence of random molecular collisions involving substrates, cofactors and cytoplasmic enzymes. Our current conception of a prokaryote is far from this view. We now consider a bacterium or an archaeon as a highly structured, non-random collection of functional membrane-embedded and proteinaceous molecular machines, each of which serves a specialized function. In this article we shall present an overview of such microcompartments including (i) the bacterial cytoskeleton and the apparatuses allowing DNA segregation during cell division, (ii) energy transduction apparatuses involving light-driven proton pumping and ion gradient-driven ATP synthesis, (iii) prokaryotic motility and taxis machines that mediate cell movements in response to gradients of chemicals and physical forces, (iv) machines of protein folding, secretion and degradation, (v) metabolasomes carrying out specific chemical reactions, (vi) 24-hour clocks allowing bacteria to coordinate their metabolic activities with the daily solar cycle and (vii) proteinaceous membrane-compartmentalized structures such as sulfur granules and gas vacuoles. Membrane-bounded prokaryotic organelles were considered in a recent JMMB written symposium concerned with membranous compartmentalization in bacteria [Saier and Bogdanov, 2013]. By contrast, in this symposium, we focus on proteinaceous microcompartments. These two symposia, taken together, provide the interested reader with an objective view of the remarkable complexity of what was once thought of as a simple non-compartmentalized cell. PMID:23920489
Quantum Entanglement in Neural Network States
NASA Astrophysics Data System (ADS)
Deng, Dong-Ling; Li, Xiaopeng; Das Sarma, S.
2017-04-01
Machine learning, one of today's most rapidly growing interdisciplinary fields, promises an unprecedented perspective for solving intricate quantum many-body problems. Understanding the physical aspects of the representative artificial neural-network states has recently become highly desirable in the applications of machine-learning techniques to quantum many-body physics. In this paper, we explore the data structures that encode the physical features in the network states by studying the quantum entanglement properties, with a focus on the restricted-Boltzmann-machine (RBM) architecture. We prove that the entanglement entropy of all short-range RBM states satisfies an area law for arbitrary dimensions and bipartition geometry. For long-range RBM states, we show by using an exact construction that such states could exhibit volume-law entanglement, implying a notable capability of RBM in representing quantum states with massive entanglement. Strikingly, the neural-network representation for these states is remarkably efficient, in the sense that the number of nonzero parameters scales only linearly with the system size. We further examine the entanglement properties of generic RBM states by randomly sampling the weight parameters of the RBM. We find that their averaged entanglement entropy obeys volume-law scaling while strongly deviating from the Page entropy of completely random pure states. We show that their entanglement spectrum has no universal part associated with random matrix theory and exhibits Poisson-type level statistics. Using reinforcement learning, we demonstrate that RBM is capable of finding the ground state (with power-law entanglement) of a model Hamiltonian with a long-range interaction. In addition, we show, through a concrete example of the one-dimensional symmetry-protected topological cluster states, that the RBM representation may also be used as a tool to analytically compute the entanglement spectrum. Our results uncover the unparalleled power of artificial neural networks in representing quantum many-body states regardless of how much entanglement they possess, which paves a new way to bring computer-science-based machine-learning techniques to bear on outstanding problems in quantum condensed-matter physics.
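For concreteness, the standard RBM ansatz referenced above writes the (unnormalized) amplitude of a spin configuration s ∈ {−1, +1}^n as ψ(s) = exp(Σᵢ aᵢsᵢ) Πⱼ 2 cosh(bⱼ + Σᵢ Wⱼᵢsᵢ). A minimal sketch, assuming randomly sampled weights as in the paper's generic-state study, that builds the full state vector for a small chain and computes the half-chain entanglement entropy:

```python
import numpy as np
from itertools import product

# Hedged sketch: entanglement entropy of an RBM state on a small spin chain,
# with random weights mirroring the paper's generic-state study (exact
# enumeration, so only feasible for small n).

rng = np.random.default_rng(0)
n_visible, n_hidden = 8, 8
a = rng.normal(scale=0.1, size=n_visible)              # visible biases
b = rng.normal(scale=0.1, size=n_hidden)               # hidden biases
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))  # couplings

def rbm_amplitude(s):
    """Unnormalized RBM amplitude for a spin configuration s in {-1,+1}^n."""
    theta = b + W @ s
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

# Build the full state vector over all 2^n configurations.
configs = np.array(list(product([-1, 1], repeat=n_visible)))
psi = np.array([rbm_amplitude(s) for s in configs])
psi /= np.linalg.norm(psi)

# Half-chain entanglement entropy via the Schmidt decomposition.
half = n_visible // 2
svals = np.linalg.svd(psi.reshape(2**half, 2**(n_visible - half)),
                      compute_uv=False)
p = svals**2
S = -np.sum(p * np.log(p + 1e-300))
print(f"half-chain entanglement entropy: {S:.4f}")
```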
NASA Astrophysics Data System (ADS)
Hoffman, A.; Forest, C. E.; Kemanian, A.
2016-12-01
A significant number of food-insecure nations exist in regions of the world where dust plays a large role in the climate system. While the impacts of common climate variables (e.g. temperature, precipitation, ozone, and carbon dioxide) on crop yields are relatively well understood, the impact of mineral aerosols on yields has not yet been thoroughly investigated. This research aims to develop the data and tools to progress our understanding of mineral aerosol impacts on crop yields. Suspended dust affects crop yields by altering the amount and type of radiation reaching the plant and by modifying local temperature and precipitation, while dust events (i.e. dust storms) affect crop yields by depleting the soil of nutrients or by defoliation via particle abrasion. The impact of dust on yields is modeled statistically because we are uncertain which impacts will dominate the response on the national and regional scales considered in this study. Multiple linear regression is used in a number of large-scale statistical crop modeling studies to estimate yield responses to various climate variables. In alignment with previous work, we develop linear crop models, but build upon this simple method of regression with machine-learning techniques (e.g. random forests) to identify important statistical predictors and isolate how dust affects yields on the scales of interest. To perform this analysis, we develop a crop-climate dataset for maize, soybean, groundnut, sorghum, rice, and wheat for the regions of West Africa, East Africa, South Africa, and the Sahel. Random forest regression models consistently model historic crop yields better than the linear models. In several instances, the random forest models accurately capture the temperature and precipitation threshold behavior in crops. Additionally, improving agricultural technology has caused a well-documented positive trend that dominates time series of global and regional yields. This trend is often removed before regression with traditional crop models, but likely at the cost of removing climate information. Our random forest models consistently discover the positive trend without removing any additional data. The application of random forests as a statistical crop model provides insight into understanding the impact of dust on yields in marginal food producing regions.
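A minimal sketch of the linear-versus-random-forest comparison described above, on a synthetic yield series containing a technology trend, climate predictors, and a dust term (all coefficients invented for illustration); the forest receives the raw series with no detrending, mirroring the abstract's observation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

# Hedged sketch: multiple linear regression vs a random forest on a toy
# yield series with a technology trend, a heat-threshold effect, and a
# dust term. All relationships below are assumptions for illustration.

rng = np.random.default_rng(0)
years = np.arange(1980, 2015)
temp = rng.normal(25, 1.5, size=years.size)
precip = rng.normal(600, 80, size=years.size)
dust = rng.gamma(2.0, 0.5, size=years.size)

yield_t = (0.05 * (years - 1980)                 # technology trend
           - 0.10 * np.maximum(temp - 26, 0)     # heat-threshold damage
           + 0.002 * precip - 0.15 * dust
           + rng.normal(0, 0.1, size=years.size))

X = np.column_stack([years, temp, precip, dust])
lin = LinearRegression().fit(X, yield_t)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, yield_t)

print(f"linear R^2: {lin.score(X, yield_t):.2f}, "
      f"forest R^2: {rf.score(X, yield_t):.2f}")
print("forest feature importances (year, temp, precip, dust):",
      np.round(rf.feature_importances_, 2))
```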
Patzel-Mattern, Katja
2005-01-01
The 20th century is the century of technical artefacts. Through their existence and use they create an artificial reality within which humans have to position themselves. Psychotechnik was an attempt to enable humans to achieve this positioning. It gained importance in Germany after World War I and had its heyday between 1919 and 1926. Drawing on the work of the engineer and Psychotechnik advocate Georg Schlesinger, whose particular interest was disabled soldiers, this essay investigates the understanding of the body and the human being in Psychotechnik as an applied science. It turns out that the greatest achievement of Psychotechnik was to establish a new view of the relation between human being and machine; it thus helped to show that the human-machine interface is a shapable unit. Psychotechnik sees the human body and its physique as the final arbiter in the design of machines. Its main concern is to optimize the relation between human being and machine rather than to standardize human beings according to the construction of machines. After its splendid rise during the Weimar Republic and its rapid decline from the late 1920s, Psychotechnik now attracts scholarly attention as a historical phenomenon. Current discourse focuses on its philosophy-of-science aspects: the unity of body and soul, the understanding of the human-machine interface as a shapable unit, and the human being as the final arbiter of this unit.
Li, Longxiang; Xue, Donglin; Deng, Weijie; Wang, Xu; Bai, Yang; Zhang, Feng; Zhang, Xuejun
2017-11-10
In deterministic computer-controlled optical surfacing, accurate dwell time execution by computer numerical control (CNC) machines is crucial to guaranteeing a high convergence ratio for the optical surface error. It is necessary to consider machine dynamics limitations in numerical dwell time algorithms. In this paper, these constraints on the dwell time distribution are analyzed, and a model of equal extra material removal is established. A positive dwell time algorithm with minimum equal extra material removal is developed. Results of simulations based on deterministic magnetorheological finishing demonstrate the necessity of considering machine dynamics performance and illustrate the validity of the proposed algorithm. Indeed, the algorithm effectively facilitates the determinism of sub-aperture optical surfacing processes.
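The abstract does not reproduce the algorithm itself; as a rough illustration of the underlying problem, the sketch below solves the positivity-constrained deconvolution removal = (removal function) ∗ (dwell time) with SciPy's non-negative least squares, adding a uniform extra-removal offset as a crude stand-in for the paper's equal-extra-material-removal idea. All quantities are toy assumptions.

```python
import numpy as np
from scipy.optimize import nnls

# Hedged sketch: positivity-constrained dwell-time solve for a 1-D surface.
# A[i, j] = material removed at point i per unit dwell at tool position j
# (a Gaussian removal function); h = target removal map. Offsetting the
# target by a uniform extra removal c keeps dwell times non-negative,
# loosely mimicking the "equal extra material removal" constraint.

n = 200
x = np.linspace(0.0, 1.0, n)
sigma = 0.03
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / sigma) ** 2)

h = 0.5 + 0.4 * np.sin(6 * np.pi * x)   # toy surface-error map (arbitrary units)
c = max(0.0, -h.min()) + 0.1            # uniform extra-removal offset (assumption)

t, residual = nnls(A, h + c)            # dwell times constrained to t >= 0
print(f"min dwell = {t.min():.3g}, residual norm = {residual:.3g}")
```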
Vann, Charles S.
1999-01-01
This small, non-contact optical sensor increases the capability and flexibility of computer controlled machines by detecting its relative position to a workpiece in all six degrees of freedom (DOF). At a fraction of the cost, it is over 200 times faster and up to 25 times more accurate than competing 3-DOF sensors. Applications range from flexible manufacturing to a 6-DOF mouse for computers. Until now, highly agile and accurate machines have been limited by their inability to adjust to changes in their tasks. By enabling them to sense all six degrees of position, these machines can now adapt to new and complicated tasks without human intervention or delay--simplifying production, reducing costs, and enhancing the value and capability of flexible manufacturing.
Vann, C.S.
1999-03-16
This small, non-contact optical sensor increases the capability and flexibility of computer controlled machines by detecting its relative position to a workpiece in all six degrees of freedom (DOF). At a fraction of the cost, it is over 200 times faster and up to 25 times more accurate than competing 3-DOF sensors. Applications range from flexible manufacturing to a 6-DOF mouse for computers. Until now, highly agile and accurate machines have been limited by their inability to adjust to changes in their tasks. By enabling them to sense all six degrees of position, these machines can now adapt to new and complicated tasks without human intervention or delay--simplifying production, reducing costs, and enhancing the value and capability of flexible manufacturing. 3 figs.
Diamond turning machine controller implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garrard, K.P.; Taylor, L.W.; Knight, B.F.
The standard controller for a Pneumo ASG 2500 Diamond Turning Machine, an Allen Bradley 8200, has been replaced with a custom high-performance design. This controller consists of four major components. Axis position feedback information is provided by a Zygo Axiom 2/20 laser interferometer with 0.1 micro-inch resolution. Hardware interface logic couples the computer's digital and analog I/O channels to the diamond turning machine's analog motor controllers, the laser interferometer, and other machine status and control information. It also provides front panel switches for operator override of the computer controller and implements the emergency stop sequence. The remaining two components, the control computer hardware and software, are discussed in detail below.
NASA Technical Reports Server (NTRS)
Stoms, R. M.
1984-01-01
Numerically controlled 5-axis machine tool uses transformer and meter to determine and indicate whether tool is in home position, but lacks built-in test mode to check these components. Tester makes possible testing and repair of components at machine rather than replacing them when operation seems suspect.
Precision tool holder with flexure-adjustable, three degrees of freedom for a four-axis lathe
Bono, Matthew J [Pleasanton, CA; Hibbard, Robin L [Livermore, CA
2008-03-04
A precision tool holder for precisely positioning a single-point cutting tool on a four-axis lathe, such that the center of the radius of the tool nose is aligned with the B-axis of the machine tool, so as to facilitate the machining of precision meso-scale components with complex three-dimensional shapes with sub-µm accuracy on a four-axis lathe. The device is designed to fit on a commercial diamond turning machine and can adjust the cutting tool position in three orthogonal directions with sub-micrometer resolution. In particular, the tool holder adjusts the tool position using three flexure-based mechanisms, with two flexure mechanisms adjusting the lateral position of the tool to align the tool with the B-axis, and a third flexure mechanism adjusting the height of the tool. Preferably, the flexures are driven by manual micrometer adjusters. In this manner, this tool holder simplifies the process of setting a tool with sub-µm accuracy, substantially reducing the time required to set the tool.
Machine-learning-based real-bogus system for the HSC-SSP moving object detection pipeline
NASA Astrophysics Data System (ADS)
Lin, Hsing-Wen; Chen, Ying-Tung; Wang, Jen-Hung; Wang, Shiang-Yu; Yoshida, Fumi; Ip, Wing-Huen; Miyazaki, Satoshi; Terai, Tsuyoshi
2018-01-01
Machine-learning techniques are widely applied in many modern optical sky surveys, e.g., Pan-STARRS1, PTF/iPTF, and the Subaru/Hyper Suprime-Cam survey, to reduce human intervention in data verification. In this study, we have established a machine-learning-based real-bogus system to reject false detections in the Subaru/Hyper Suprime-Cam Strategic Survey Program (HSC-SSP) source catalog. Therefore, the HSC-SSP moving object detection pipeline can operate more effectively due to the reduction of false positives. To train the real-bogus system, we use stationary sources as the real training set and "flagged" data as the bogus set. The training set contains 47 features, most of which are photometric measurements and shape moments generated from the HSC image reduction pipeline (hscPipe). Our system can reach a true positive rate (tpr) of ~96% with a false positive rate (fpr) of ~1%, or tpr ~99% at fpr ~5%. Therefore, we conclude that stationary sources are decent real training samples, and using photometry measurements and shape moments can reject false positives effectively.
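A minimal sketch of the operating-point choice behind the reported tpr/fpr figures, with synthetic stand-ins for the 47 hscPipe features: train a classifier, then read the threshold off the ROC curve at the desired false positive rate.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_curve
from sklearn.model_selection import train_test_split

# Hedged sketch of the real-bogus idea: score detections with a classifier,
# then pick a threshold trading true-positive rate (tpr) against
# false-positive rate (fpr), as in the abstract's tpr ~96% at fpr ~1%
# operating point. Features here are synthetic stand-ins.

rng = np.random.default_rng(42)
n = 5000
X_real = rng.normal(loc=0.5, scale=1.0, size=(n, 6))    # "real" sources
X_bogus = rng.normal(loc=-0.5, scale=1.0, size=(n, 6))  # "bogus" sources
X = np.vstack([X_real, X_bogus])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

fpr, tpr, thresholds = roc_curve(y_te, clf.predict_proba(X_te)[:, 1])
i = np.searchsorted(fpr, 0.01)   # first threshold with fpr >= 1%
print(f"at fpr {fpr[i]:.3f}: tpr = {tpr[i]:.3f}, threshold = {thresholds[i]:.3f}")
```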
Refueling machine with relative positioning capability
Challberg, R.C.; Jones, C.R.
1998-12-15
A refueling machine is disclosed having relative positioning capability for refueling a nuclear reactor. The refueling machine includes a pair of articulated arms mounted on a refueling bridge. Each arm supports a respective telescoping mast. Each telescoping mast is designed to flex laterally in response to application of a lateral thrust on the end of the mast. A pendant mounted on the end of the mast carries an air-actuated grapple, television cameras, ultrasonic transducers and waterjet thrusters. The ultrasonic transducers are used to detect the gross position of the grapple relative to the bail of a nuclear fuel assembly in the fuel core. The television cameras acquire an image of the bail which is compared to a pre-stored image in computer memory. The pendant can be rotated until the television image and the pre-stored image match within a predetermined tolerance. Similarly, the waterjet thrusters can be used to apply lateral thrust to the end of the flexible mast to place the grapple in a fine position relative to the bail as a function of the discrepancy between the television and pre-stored images. 11 figs.
Refueling machine with relative positioning capability
Challberg, Roy Clifford; Jones, Cecil Roy
1998-01-01
A refueling machine having relative positioning capability for refueling a nuclear reactor. The refueling machine includes a pair of articulated arms mounted on a refueling bridge. Each arm supports a respective telescoping mast. Each telescoping mast is designed to flex laterally in response to application of a lateral thrust on the end of the mast. A pendant mounted on the end of the mast carries an air-actuated grapple, television cameras, ultrasonic transducers and waterjet thrusters. The ultrasonic transducers are used to detect the gross position of the grapple relative to the bail of a nuclear fuel assembly in the fuel core. The television cameras acquire an image of the bail which is compared to a pre-stored image in computer memory. The pendant can be rotated until the television image and the pre-stored image match within a predetermined tolerance. Similarly, the waterjet thrusters can be used to apply lateral thrust to the end of the flexible mast to place the grapple in a fine position relative to the bail as a function of the discrepancy between the television and pre-stored images.
Dominguez Veiga, Jose Juan; O'Reilly, Martin; Whelan, Darragh; Caulfield, Brian; Ward, Tomas E
2017-08-04
Inertial sensors are one of the most commonly used sources of data for human activity recognition (HAR) and exercise detection (ED) tasks. The time series produced by these sensors are generally analyzed through numerical methods. Machine learning techniques such as random forests or support vector machines are popular in this field for classification efforts, but they need to be supported through the isolation of a potentially large number of additionally crafted features derived from the raw data. This feature preprocessing step can involve nontrivial digital signal processing (DSP) techniques. However, in many cases, the researchers interested in this type of activity recognition problem do not possess the necessary technical background for this feature-set development. The study aimed to present a novel application of established machine vision methods to provide interested researchers with an easier entry path into the HAR and ED fields. This can be achieved by removing the need for deep DSP skills through the use of transfer learning, using a pretrained convolutional neural network (CNN) developed for machine vision purposes for the exercise classification task. The new method should simply require researchers to generate plots of the signals that they would like to build classifiers with, store them as images, and then place them in folders according to their training label before retraining the network. We applied a CNN, an established machine vision technique, to the task of ED. Tensorflow, a high-level framework for machine learning, was used to facilitate infrastructure needs. Simple time series plots generated directly from accelerometer and gyroscope signals are used to retrain an openly available neural network (Inception), originally developed for machine vision tasks. Data from 82 healthy volunteers, performing 5 different exercises while wearing a lumbar-worn inertial measurement unit (IMU), were collected. The ability of the proposed method to automatically classify the exercise being completed was assessed using this dataset. For comparative purposes, classification using the same dataset was also performed using the more conventional approach of feature extraction and classification using random forest classifiers. With the collected dataset and the proposed method, the different exercises could be recognized with 95.89% (3827/3991) accuracy, which is competitive with current state-of-the-art techniques in ED. The high level of accuracy attained with the proposed approach indicates that the waveform morphologies in the time-series plots for each of the exercises are sufficiently distinct among the participants to allow the use of machine vision approaches. The use of high-level machine learning frameworks, coupled with the novel use of machine vision techniques instead of complex manually crafted features, may facilitate access to research in the HAR field for individuals without extensive digital signal processing or machine learning backgrounds. ©Jose Juan Dominguez Veiga, Martin O'Reilly, Darragh Whelan, Brian Caulfield, Tomas E Ward. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 04.08.2017.
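A minimal sketch of the preprocessing step the method relies on, with hypothetical windows and labels standing in for the segmented IMU data: each signal window is rendered as an image and filed under its exercise label, producing the folder layout that standard TensorFlow/Inception retraining scripts expect.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")            # render plots without a display
import matplotlib.pyplot as plt
from pathlib import Path

# Hedged sketch of the signal-to-image step: plot each IMU window, save it
# as an image, and file it under its exercise label. `windows` and `labels`
# are hypothetical stand-ins for segmented accelerometer/gyroscope data.

rng = np.random.default_rng(0)
windows = rng.normal(size=(10, 6, 250))          # 10 windows x 6 channels x 250 samples
labels = rng.choice(["squat", "lunge"], size=10) # hypothetical exercise labels

out = Path("training_images")
for i, (w, label) in enumerate(zip(windows, labels)):
    d = out / label
    d.mkdir(parents=True, exist_ok=True)
    fig, ax = plt.subplots(figsize=(3, 3))
    for channel in w:                 # overlay the sensor channels
        ax.plot(channel, linewidth=0.8)
    ax.axis("off")                    # keep only the waveform morphology
    fig.savefig(d / f"window_{i:04d}.png", dpi=100)
    plt.close(fig)
# The resulting label folders can then be fed to an Inception retraining script.
```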
O'Reilly, Martin; Whelan, Darragh; Caulfield, Brian; Ward, Tomas E
2017-01-01
Background Inertial sensors are one of the most commonly used sources of data for human activity recognition (HAR) and exercise detection (ED) tasks. The time series produced by these sensors are generally analyzed through numerical methods. Machine learning techniques such as random forests or support vector machines are popular in this field for classification efforts, but they need to be supported through the isolation of a potentially large number of additionally crafted features derived from the raw data. This feature preprocessing step can involve nontrivial digital signal processing (DSP) techniques. However, in many cases, the researchers interested in this type of activity recognition problem do not possess the necessary technical background for this feature-set development. Objective The study aimed to present a novel application of established machine vision methods to provide interested researchers with an easier entry path into the HAR and ED fields. This can be achieved by removing the need for deep DSP skills through the use of transfer learning, using a pretrained convolutional neural network (CNN) developed for machine vision purposes for the exercise classification task. The new method should simply require researchers to generate plots of the signals that they would like to build classifiers with, store them as images, and then place them in folders according to their training label before retraining the network. Methods We applied a CNN, an established machine vision technique, to the task of ED. Tensorflow, a high-level framework for machine learning, was used to facilitate infrastructure needs. Simple time series plots generated directly from accelerometer and gyroscope signals are used to retrain an openly available neural network (Inception), originally developed for machine vision tasks. Data from 82 healthy volunteers, performing 5 different exercises while wearing a lumbar-worn inertial measurement unit (IMU), were collected. The ability of the proposed method to automatically classify the exercise being completed was assessed using this dataset. For comparative purposes, classification using the same dataset was also performed using the more conventional approach of feature extraction and classification using random forest classifiers. Results With the collected dataset and the proposed method, the different exercises could be recognized with 95.89% (3827/3991) accuracy, which is competitive with current state-of-the-art techniques in ED. Conclusions The high level of accuracy attained with the proposed approach indicates that the waveform morphologies in the time-series plots for each of the exercises are sufficiently distinct among the participants to allow the use of machine vision approaches. The use of high-level machine learning frameworks, coupled with the novel use of machine vision techniques instead of complex manually crafted features, may facilitate access to research in the HAR field for individuals without extensive digital signal processing or machine learning backgrounds. PMID:28778851
B-mode Ultrasound Versus Color Doppler Twinkling Artifact in Detecting Kidney Stones
Harper, Jonathan D.; Hsi, Ryan S.; Shah, Anup R.; Dighe, Manjiri K.; Carter, Stephen J.; Moshiri, Mariam; Paun, Marla; Lu, Wei; Bailey, Michael R.
2013-01-01
Purpose To compare color Doppler twinkling artifact and B-mode ultrasonography in detecting kidney stones. Patients and Methods Nine patients with recent CT scans prospectively underwent B-mode and twinkling artifact color Doppler ultrasonography on a commercial ultrasound machine. Video segments of the upper pole, interpolar area, and lower pole were created, randomized, and independently reviewed by three radiologists. Receiver operating characteristics were determined. Results There were 32 stones in 18 kidneys with a mean stone size of 8.9±7.5 mm. B-mode ultrasonography had 71% sensitivity, 48% specificity, 52% positive predictive value, and 68% negative predictive value, while twinkling artifact Doppler ultrasonography had 56% sensitivity, 74% specificity, 62% positive predictive value, and 68% negative predictive value. Conclusions When used alone, B-mode is more sensitive, but twinkling artifact is more specific in detecting kidney stones. This information may help users employ twinkling and B-mode to identify stones and help developers improve signal processing to harness the fundamental acoustic differences and ultimately improve stone detection. PMID:23067207
Research on the EDM Technology for Micro-holes at Complex Spatial Locations
NASA Astrophysics Data System (ADS)
Y Liu, J.; Guo, J. M.; Sun, D. J.; Cai, Y. H.; Ding, L. T.; Jiang, H.
2017-12-01
To meet the demands of machining micro-holes at complex spatial locations, several key technical problems are addressed: development of the micro-electrical-discharge-machining (micro-EDM) power supply system, design of the host structure, and the machining process technology. By developing a low-voltage power supply circuit, a high-voltage circuit, a micro- and precision-machining circuit, and a gap detection system, a narrow-pulse, high-frequency six-axis EDM power supply system is developed to meet the demands of micro-hole discharge machining. By combining CAD structural design, CAE simulation analysis, modal testing, ODS (operational deflection shapes) testing, and theoretical analysis, the host structure and key axes of the machine tool are optimized to meet the positioning demands of the micro-holes. A dedicated deionized-water filtration system was developed to keep the machining process stable. The machining equipment and process technology developed in this paper are verified by establishing a micro-hole machining flow and testing it on the real machine tool. The final test results show that the efficient micro-EDM pulse power supply system, machine tool host system, deionized-water filtration system, and machining method developed in this paper meet the demands of machining micro-holes at complex spatial locations.
Motion Estimation Utilizing Range Detection-Enhanced Visual Odometry
NASA Technical Reports Server (NTRS)
Morris, Daniel Dale (Inventor); Chang, Hong (Inventor); Friend, Paul Russell (Inventor); Chen, Qi (Inventor); Graf, Jodi Seaborn (Inventor)
2016-01-01
A motion determination system is disclosed. The system may receive a first and a second camera image from a camera, the first camera image received earlier than the second camera image. The system may identify corresponding features in the first and second camera images. The system may receive range data comprising at least one of a first and a second range data set from a range detection unit, corresponding to the first and second camera images, respectively. The system may determine the first and second positions of the corresponding features using the first and second camera images. The first positions or the second positions may be determined by also using the range data. The system may determine a change in position of the machine based on differences between the first and second positions, and a VO-based velocity of the machine based on the determined change in position.
Machine Learning Predictions of a Multiresolution Climate Model Ensemble
NASA Astrophysics Data System (ADS)
Anderson, Gemma J.; Lucas, Donald D.
2018-05-01
Statistical models of high-resolution climate models are useful for many purposes, including sensitivity and uncertainty analyses, but building them can be computationally prohibitive. We generated a unique multiresolution perturbed parameter ensemble of a global climate model. We use a novel application of a machine learning technique known as random forests to train a statistical model on the ensemble to make high-resolution model predictions of two important quantities: global mean top-of-atmosphere energy flux and precipitation. The random forests leverage cheaper low-resolution simulations, greatly reducing the number of high-resolution simulations required to train the statistical model. We demonstrate that high-resolution predictions of these quantities can be obtained by training on an ensemble that includes only a small number of high-resolution simulations. We also find that global annually averaged precipitation is more sensitive to resolution changes than to any of the model parameters considered.
NASA Astrophysics Data System (ADS)
Wu, Qi
2010-03-01
Demand forecasts play a crucial role in supply chain management. The future demand for a certain product is the basis for the respective replenishment systems. For demand series with small samples, seasonality, nonlinearity, randomness, and fuzziness, existing support vector kernels cannot closely approximate the random curve of the sales time series in the space of square-integrable functions. In this paper, we present a hybrid intelligent system combining a wavelet-kernel support vector machine and particle swarm optimization for demand forecasting. Application to car sales series forecasting shows that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible. A comparison between the proposed method and others is also given, showing that, for the discussed example, this method outperforms hybrid PSOv-SVM and other traditional methods.
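The abstract does not spell out the wavelet kernel; a common choice in the wavelet-SVM literature is the Morlet-type kernel K(x, y) = Πᵢ cos(1.75 dᵢ/a) exp(−dᵢ²/(2a²)) with d = x − y. A minimal sketch plugging that kernel into scikit-learn's SVR as a custom kernel (the particle swarm step that would tune a, C, and ε is omitted).

```python
import numpy as np
from sklearn.svm import SVR

# Hedged sketch: a Morlet-type wavelet kernel (a common choice in the
# literature; the paper's exact kernel is not given in the abstract) used
# as a custom kernel for sklearn's SVR on a toy seasonal demand series.

A_SCALE = 0.5  # wavelet dilation parameter (would be PSO-tuned)

def wavelet_kernel(X, Y, a=A_SCALE):
    """K(x, y) = prod_i cos(1.75*d_i/a) * exp(-d_i^2 / (2 a^2)), d = x - y."""
    D = X[:, None, :] - Y[None, :, :]
    return np.prod(np.cos(1.75 * D / a) * np.exp(-D**2 / (2 * a**2)), axis=2)

# Toy seasonal demand series turned into a lag-feature regression problem.
t = np.arange(60, dtype=float)
demand = (100 + 10 * np.sin(2 * np.pi * t / 12)
          + np.random.default_rng(1).normal(0, 2, 60))
lags = 12
X = np.array([demand[i:i + lags] for i in range(len(demand) - lags)])
y = demand[lags:]

model = SVR(kernel=wavelet_kernel, C=10.0, epsilon=0.5).fit(X[:-6], y[:-6])
print("last-6-month forecasts:", np.round(model.predict(X[-6:]), 1))
```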
Template For Aiming An X-Ray Machine
NASA Technical Reports Server (NTRS)
Morphet, W. J.
1994-01-01
Relatively inexpensive template helps in aligning x-ray machine with phenolic ring to be inspected for flaws. Phenolic ring in original application part of rocket nozzle. Concept also applicable to x-ray inspection of other rings. Template contains alignment holes for adjusting orientation, plus target spot for adjusting lateral position, of laser spotting beam. (Laser spotting beam coincides with the x-ray beam, turned on later, after alignment completed.) Use of template decreases positioning time and error, providing consistent sensitivity for detection of flaws.
NASA Technical Reports Server (NTRS)
Roman, N. G.; Warren, W. H., Jr.
1984-01-01
The machine-readable, character-coded version of the catalog, as it is currently being distributed from the Astronomical Data Center(ADC), is described. The format and data provided in the magnetic tape version differ somewhat from those of the published catalog, which was also produced from a tape prepared at the ADC. The primary catalog data are positions and proper motions (equinox 1950.0) for 14597 stars.
Slide system for machine tools
Douglass, S.S.; Green, W.L.
1980-06-12
The present invention relates to a machine tool which permits the machining of nonaxisymmetric surfaces on a workpiece while rotating the workpiece about a central axis of rotation. The machine tool comprises a conventional two-slide system (X-Y) with one of these slides being provided with a relatively short travel high-speed auxiliary slide which carries the material-removing tool. The auxiliary slide is synchronized with the spindle speed and the position of the other two slides and provides a high-speed reciprocating motion required for the displacement of the cutting tool for generating a nonaxisymmetric surface at a selected location on the workpiece.
Slide system for machine tools
Douglass, Spivey S.; Green, Walter L.
1982-01-01
The present invention relates to a machine tool which permits the machining of nonaxisymmetric surfaces on a workpiece while rotating the workpiece about a central axis of rotation. The machine tool comprises a conventional two-slide system (X-Y) with one of these slides being provided with a relatively short travel high-speed auxiliary slide which carries the material-removing tool. The auxiliary slide is synchronized with the spindle speed and the position of the other two slides and provides a high-speed reciprocating motion required for the displacement of the cutting tool for generating a nonaxisymmetric surface at a selected location on the workpiece.
Air-Bearing Table for Machine Shops
NASA Technical Reports Server (NTRS)
Ambrisco, D.
1986-01-01
Frequent workpiece repositioning made easier. Air-bearing table facilitates movement of heavy workpiece during machining or between repeated operations at different positions. Table assembly consists of workpiece supporting fixture riding on air bearing. Table especially useful for inertia welding, in which ease of mobility is important.
Antibiotic Residues in Milk from Three Popular Kenyan Milk Vending Machines.
Kosgey, Amos; Shitandi, Anakalo; Marion, Jason W
2018-05-01
Milk vending machines (MVMs) are growing in popularity in Kenya and worldwide. Milk vending machines dispense varying quantities of locally sourced, pasteurized milk. The Kenya Dairy Board has a regulatory framework, but surveillance is weak because of several factors. Milk vending machines' milk is not routinely screened for antibiotics, thereby increasing the potential for antibiotic misuse. To investigate, a total of 80 milk samples from four commercial providers (N = 25), street vendors (N = 21), and three MVMs (N = 34) were collected and screened in Eldoret, Kenya. Antibiotic residue surveillance occurred during December 2016 and January 2017 using Idexx SNAP® tests for tetracyclines, sulfamethazine, beta-lactams, and gentamicin. Overall, 24% of MVM samples and 24% of street vendor samples were presumptively positive for at least one antibiotic. No commercial samples were positive. Research into cost-effective screening methods and increased monitoring by food safety agencies are needed to uphold hazard analysis and critical control points for improving antibiotic stewardship throughout the Kenyan private dairy industry.
Monitoring Hitting Load in Tennis Using Inertial Sensors and Machine Learning.
Whiteside, David; Cant, Olivia; Connolly, Molly; Reid, Machar
2017-10-01
Quantifying external workload is fundamental to training prescription in sport. In tennis, global positioning data are imprecise and fail to capture hitting loads. The current gold standard (manual notation) is time intensive and often not possible given players' heavy travel schedules. The aim was to develop an automated stroke-classification system to help quantify hitting load in tennis. Nineteen athletes wore an inertial measurement unit (IMU) on their wrist during 66 video-recorded training sessions. Video footage was manually notated such that the known shot type (serve, rally forehand, slice forehand, forehand volley, rally backhand, slice backhand, backhand volley, smash, or false positive) was associated with the corresponding IMU data for 28,582 shots. Six types of machine-learning models were then constructed to classify true shot type from the IMU signals. Across 10-fold cross-validation, a cubic-kernel support vector machine classified binned shots (overhead, forehand, or backhand) with an accuracy of 97.4%. A second cubic-kernel support vector machine achieved 93.2% accuracy when classifying all 9 shot types. With a view to monitoring external load, the combination of miniature inertial sensors and machine learning offers a practical and automated method of quantifying shot counts and discriminating shot types in elite tennis players.
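A minimal sketch of a cubic-kernel SVM (polynomial kernel of degree 3) evaluated with 10-fold cross-validation, as in the abstract; the per-shot features are random stand-ins for the IMU-derived ones.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hedged sketch: cubic-kernel SVM for binned shot classification
# (overhead / forehand / backhand), scored by 10-fold cross-validation.
# Features and labels below are synthetic stand-ins.

rng = np.random.default_rng(7)
n_shots, n_features = 1000, 24
X = rng.normal(size=(n_shots, n_features))
y = rng.integers(0, 3, size=n_shots)   # 0=overhead, 1=forehand, 2=backhand

clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, C=1.0))
scores = cross_val_score(clf, X, y, cv=10)
print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```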
Machine Learning for Social Services: A Study of Prenatal Case Management in Illinois.
Pan, Ian; Nolan, Laura B; Brown, Rashida R; Khan, Romana; van der Boor, Paul; Harris, Daniel G; Ghani, Rayid
2017-06-01
To evaluate the positive predictive value of machine learning algorithms for early assessment of adverse birth risk among pregnant women as a means of improving the allocation of social services. We used administrative data for 6457 women collected by the Illinois Department of Human Services from July 2014 to May 2015 to develop a machine learning model for adverse birth prediction and improve upon the existing paper-based risk assessment. We compared different models and determined the strongest predictors of adverse birth outcomes using positive predictive value as the metric for selection. Machine learning algorithms performed similarly, outperforming the current paper-based risk assessment by up to 36%; a refined paper-based assessment outperformed the current assessment by up to 22%. We estimate that these improvements will allow 100 to 170 additional high-risk pregnant women screened for program eligibility each year to receive services that would have otherwise been unobtainable. Our analysis exhibits the potential for machine learning to move government agencies toward a more data-informed approach to evaluating risk and providing social services. Overall, such efforts will improve the efficiency of allocating resource-intensive interventions.
Ion beam machining error control and correction for small scale optics.
Xie, Xuhui; Zhou, Lin; Dai, Yifan; Li, Shengyi
2011-09-20
Ion beam figuring (IBF) technology for small-scale optical components is discussed. Since a small removal function can be obtained in IBF, it enables computer-controlled optical surfacing technology to machine precision centimeter- or millimeter-scale optical components deterministically. When using a small ion beam to machine small optical components, some key problems must be seriously considered, such as positioning the small ion beam on the optical surface, the material removal rate, and control of the ion beam scanning pitch on the optical surface. The main reason is that a small beam is more sensitive to these problems than a large ion beam because of its small beam diameter and lower material removal rate. In this paper, we discuss these problems and their influence on machining small optical components in detail. Based on the identification-compensation principle, an iterative machining compensation method is derived for correcting the positioning error of the ion beam, with the material removal rate estimated using a selected optimal scanning pitch. Experiments were performed on ϕ10 mm Zerodur planar and spherical samples, and the final surface errors were both smaller than λ/100 as measured by a Zygo GPI interferometer.
CT fluoroscopy-assisted puncture of thoracic and abdominal masses: a randomized trial.
Kirchner, Johannes; Kickuth, Ralph; Laufer, Ulf; Schilling, Esther Maria; Adams, Stephan; Liermann, Dieter
2002-03-01
We investigated the benefit of real-time guidance of interventional punctures by means of computed tomography fluoroscopy (CTF) compared with conventional sequential-acquisition guidance. In a prospective randomized trial, 75 patients underwent either CTF-guided (group A, n = 50) or sequential CT-guided (group B, n = 25) punctures of thoracic (n = 29) or abdominal (n = 46) masses. CTF was performed on a CT machine (Somatom Plus 4 Power, Siemens Corp., Forchheim, Germany) equipped with the C.A.R.E. Vision application (tube voltage 120 kV, tube current 50 mA, rotation time 0.75 s, slice thickness 10 mm, 8 frames/s). The average procedure time showed a statistically significant difference between the two study groups (group A: 564 s; group B: 795 s; P = 0.0032). The mean total tube current-time product was 7089 mAs for the CTF-guided and 4856 mAs for the sequential image-guided intervention, respectively. Sensitivity was 71%, specificity 100%, positive predictive value 100%, and negative predictive value 60% for the CTF-guided puncture, and 68%, 100%, 100%, and 50%, respectively, for sequential CT. CTF guidance saves procedure time but increases the radiation exposure dose.
Modes of mechanical ventilation for the operating room.
Ball, Lorenzo; Dameri, Maddalena; Pelosi, Paolo
2015-09-01
Most patients undergoing surgical procedures need to be mechanically ventilated, because of the impact of several drugs administered at induction and during maintenance of general anaesthesia on respiratory function. Optimization of intraoperative mechanical ventilation can reduce the incidence of post-operative pulmonary complications and improve the patient's outcome. Preoxygenation at induction of general anaesthesia prolongs the time window for safe intubation, reducing the risk of hypoxia, and outweighs the potential risk of reabsorption atelectasis. Non-invasive positive pressure ventilation delivered through different interfaces should be considered at the induction of anaesthesia in morbidly obese patients. Anaesthesia ventilators are becoming increasingly sophisticated, integrating many functions that were once exclusive to intensive care. Modern anaesthesia machines provide high performance in delivering the desired volumes and pressures accurately and precisely, including assisted ventilation modes. Therefore, physicians should be familiar with the potential and pitfalls of the most commonly used intraoperative ventilation modes: volume-controlled, pressure-controlled, dual-controlled and assisted ventilation. Although there is no clear evidence to support the advantage of any one of these ventilation modes over the others, protective mechanical ventilation with low tidal volume and low levels of positive end-expiratory pressure (PEEP) should be considered in patients undergoing surgery. The target tidal volume should be calculated based on the predicted or ideal body weight rather than on the actual body weight. To optimize ventilation monitoring, anaesthesia machines should include end-inspiratory and end-expiratory pause as well as flow-volume loop curves. The routine administration of high PEEP levels should be avoided, as this may lead to haemodynamic impairment and fluid overload. Higher PEEP might be considered during surgery longer than 3 h, laparoscopy in the Trendelenburg position and in patients with body mass index >35 kg/m². Large randomized trials are warranted to identify subgroups of patients and the type of surgery that can potentially benefit from specific ventilation modes or ventilation settings. Copyright © 2015 Elsevier Ltd. All rights reserved.
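The abstract recommends sizing tidal volume by predicted rather than actual body weight but does not give a formula; assuming the widely used ARDSNet predicted-body-weight (PBW) equations (an assumption, not stated in the abstract):

```latex
\mathrm{PBW}_{\mathrm{male}} = 50 + 0.91\,(h_{\mathrm{cm}} - 152.4), \qquad
\mathrm{PBW}_{\mathrm{female}} = 45.5 + 0.91\,(h_{\mathrm{cm}} - 152.4)
```

For a 175 cm tall man this gives PBW = 50 + 0.91 × 22.6 ≈ 70.6 kg, so a protective tidal volume of 7 mL/kg PBW is roughly 494 mL, independent of the patient's actual weight.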
Exploring a potential energy surface by machine learning for characterizing atomic transport
NASA Astrophysics Data System (ADS)
Kanamori, Kenta; Toyoura, Kazuaki; Honda, Junya; Hattori, Kazuki; Seko, Atsuto; Karasuyama, Masayuki; Shitara, Kazuki; Shiga, Motoki; Kuwabara, Akihide; Takeuchi, Ichiro
2018-03-01
We propose a machine-learning method for evaluating the potential barrier governing atomic transport based on the preferential selection of dominant points for atomic transport. The proposed method generates numerous random samples of the entire potential energy surface (PES) from a probabilistic Gaussian process model of the PES, which enables defining the likelihood of the dominant points. The robustness and efficiency of the method are demonstrated on a dozen model cases for proton diffusion in oxides, in comparison with a conventional nudged elastic band method.
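A minimal sketch of the sampling idea, under stated assumptions: fit a Gaussian process to a few evaluations of a toy one-dimensional PES, draw random posterior samples of the entire surface, and score how often each candidate point is the barrier maximum. This illustrates the likelihood-of-dominant-points principle only, not the paper's full selection strategy.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hedged sketch: random posterior samples of a GP-modeled PES define the
# likelihood that each grid point is the dominant (barrier) point.

def pes(x):
    return np.sin(3 * x) + 0.5 * x**2      # toy 1-D potential (assumption)

rng = np.random.default_rng(0)
X_train = rng.uniform(-2, 2, size=(6, 1))  # sparse expensive evaluations
y_train = pes(X_train.ravel())

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
gp.fit(X_train, y_train)

X_grid = np.linspace(-2, 2, 200).reshape(-1, 1)
samples = gp.sample_y(X_grid, n_samples=500, random_state=1)  # (200, 500)

# Fraction of posterior samples in which each grid point is the PES maximum.
argmax_counts = np.bincount(np.argmax(samples, axis=0), minlength=len(X_grid))
best = np.argmax(argmax_counts)
print(f"most likely barrier location: x = {X_grid[best, 0]:.2f} "
      f"(P = {argmax_counts[best] / samples.shape[1]:.2f})")
```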
Fiber tractography using machine learning.
Neher, Peter F; Côté, Marc-Alexandre; Houde, Jean-Christophe; Descoteaux, Maxime; Maier-Hein, Klaus H
2017-09-01
We present a fiber tractography approach based on a random forest classification and voting process, guiding each step of the streamline progression by directly processing raw diffusion-weighted signal intensities. For comparison to the state-of-the-art, i.e. tractography pipelines that rely on mathematical modeling, we performed a quantitative and qualitative evaluation with multiple phantom and in vivo experiments, including a comparison to the 96 submissions of the ISMRM tractography challenge 2015. The results demonstrate the vast potential of machine learning for fiber tractography. Copyright © 2017 Elsevier Inc. All rights reserved.
Epidermis area detection for immunofluorescence microscopy
NASA Astrophysics Data System (ADS)
Dovganich, Andrey; Krylov, Andrey; Nasonov, Andrey; Makhneva, Natalia
2018-04-01
We propose a novel image segmentation method for immunofluorescence microscopy images of skin tissue for the diagnosis of various skin diseases. The segmentation is based on machine learning algorithms. The feature vector comprises three groups of features: statistical features, Laws' texture energy measures, and local binary patterns. The images are preprocessed for better learning. Different machine learning algorithms were tested, and the best results were obtained with the random forest algorithm. We use the proposed method to detect the epidermis region as part of a pemphigus diagnosis system.
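A minimal sketch of the per-pixel feature/classifier setup, assuming synthetic stand-ins for the microscopy images: local binary patterns plus simple local statistics feed a random forest (Laws' texture energy measures are omitted for brevity).

```python
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

# Hedged sketch: per-pixel LBP and local-statistics features with a random
# forest, mirroring the abstract's feature groups. Image and mask are
# random stand-ins for immunofluorescence data.

rng = np.random.default_rng(0)
img = rng.random((128, 128))                         # stand-in microscopy channel
labels = (rng.random((128, 128)) > 0.5).astype(int)  # stand-in epidermis mask

lbp = local_binary_pattern(img, P=8, R=1.0, method="uniform")
local_mean = uniform_filter(img, size=5)             # crude statistical feature

# Per-pixel feature vector: intensity, LBP code, local mean.
X = np.stack([img.ravel(), lbp.ravel(), local_mean.ravel()], axis=1)
y = labels.ravel()

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(f"training accuracy (toy data): {clf.score(X, y):.3f}")
```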
Positive-unlabeled learning for disease gene identification
Yang, Peng; Li, Xiao-Li; Mei, Jian-Ping; Kwoh, Chee-Keong; Ng, See-Kiong
2012-01-01
Background: Identifying disease genes from the human genome is an important but challenging task in biomedical research. Machine learning methods can be applied to discover new disease genes based on the known ones. Existing machine learning methods typically use the known disease genes as the positive training set P and the unknown genes as the negative training set N (a non-disease gene set does not exist) to build classifiers to identify new disease genes from the unknown genes. However, such classifiers are actually built from a noisy negative set N, as there can be unknown disease genes in N itself. As a result, the classifiers do not perform as well as they could. Results: Instead of treating the unknown genes as negative examples in N, we treat them as an unlabeled set U. We design a novel positive-unlabeled (PU) learning algorithm PUDI (PU learning for disease gene identification) to build a classifier using P and U. We first partition U into four sets, namely, a reliable negative set RN, a likely positive set LP, a likely negative set LN and a weak negative set WN. Weighted support vector machines are then used to build a multi-level classifier based on the four training sets and the positive training set P to identify disease genes. Our experimental results demonstrate that our proposed PUDI algorithm outperformed the existing methods significantly. Conclusion: The proposed PUDI algorithm is able to identify disease genes more accurately by treating the unknown data more appropriately as an unlabeled set U instead of a negative set N. Given that many machine learning problems in biomedical research do involve positive and unlabeled data instead of negative data, it is possible that the machine learning methods for these problems can be further improved by adopting PU learning methods, as we have done here for disease gene identification. Availability and implementation: The executable program and data are available at http://www1.i2r.a-star.edu.sg/∼xlli/PUDI/PUDI.html. Contact: xlli@i2r.a-star.edu.sg or yang0293@e.ntu.edu.sg Supplementary information: Supplementary Data are available at Bioinformatics online. PMID:22923290
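As a rough illustration of the PU idea (a simplified two-step scheme, not the full four-way RN/LP/LN/WN partition of PUDI), the sketch below extracts reliable negatives from U by distance to the positive centroid and then trains a weighted SVM.

```python
import numpy as np
from sklearn.svm import SVC

# Hedged two-step PU-learning sketch: (1) treat unlabeled examples far from
# the positive centroid as reliable negatives (RN); (2) train a weighted SVM
# on P vs RN, down-weighting the remaining ambiguous unlabeled examples.
# All data are synthetic stand-ins for gene feature vectors.

rng = np.random.default_rng(0)
P = rng.normal(loc=1.0, size=(100, 10))                # known disease genes
U = np.vstack([rng.normal(loc=1.0, size=(50, 10)),     # hidden positives in U
               rng.normal(loc=-1.0, size=(300, 10))])  # true negatives in U

# Step 1: distance to the positive centroid as a crude reliability score.
centroid = P.mean(axis=0)
dist = np.linalg.norm(U - centroid, axis=1)
rn_mask = dist > np.median(dist)          # farthest half -> reliable negatives
RN, ambiguous = U[rn_mask], U[~rn_mask]

# Step 2: weighted SVM; ambiguous points enter as low-weight negatives.
X = np.vstack([P, RN, ambiguous])
y = np.concatenate([np.ones(len(P)), np.zeros(len(RN)), np.zeros(len(ambiguous))])
w = np.concatenate([np.full(len(P), 1.0), np.full(len(RN), 1.0),
                    np.full(len(ambiguous), 0.2)])
clf = SVC(kernel="linear", probability=True).fit(X, y, sample_weight=w)

# Rank ambiguous unlabeled genes by predicted disease probability.
scores = clf.predict_proba(ambiguous)[:, 1]
print("top candidate scores:", np.round(np.sort(scores)[::-1][:5], 3))
```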
NASA Technical Reports Server (NTRS)
Warren, Wayne H., Jr.
1990-01-01
The machine readable version of the catalog, as it is currently being distributed from the Astronomical Data Center, is described. The Zodiacal Zone Catalog is a catalog of positions and proper motions for stars in the magnitude range 4 ≤ m_v ≤ 10, lying within 16 deg of the ecliptic and north of declination -30 deg. The catalog contains positions and proper motions, at epoch, for equator and equinox J2000.0, magnitudes and spectral types taken mostly from the Smithsonian Astrophysical Observatory Star Catalog, and reference positions and proper motions for equinox and epoch B1950.0.
Kuo, Ching-Yen; Yu, Liang-Chin; Chen, Hou-Chaung; Chan, Chien-Lung
2018-01-01
The aims of this study were to compare the performance of machine learning methods for the prediction of the medical costs associated with spinal fusion in terms of profit or loss in Taiwan Diagnosis-Related Groups (Tw-DRGs) and to apply these methods to explore the important factors associated with the medical costs of spinal fusion. A data set was obtained from a regional hospital in Taoyuan city in Taiwan, which contained data from 2010 to 2013 on patients of Tw-DRG49702 (posterior and other spinal fusion without complications or comorbidities). Naïve-Bayesian, support vector machines, logistic regression, C4.5 decision tree, and random forest methods were employed for prediction using WEKA 3.8.1. Five hundred thirty-two cases were categorized as belonging to the Tw-DRG49702 group. The mean medical cost was US $4,549.7, and the mean age of the patients was 62.4 years. The mean length of stay was 9.3 days. The length of stay was an important variable in terms of determining medical costs for patients undergoing spinal fusion. The random forest method had the best predictive performance in comparison to the other methods, achieving an accuracy of 84.30%, a sensitivity of 71.4%, a specificity of 92.2%, and an AUC of 0.904. Our study demonstrated that the random forest model can be employed to predict the medical costs of Tw-DRG49702, and could inform hospital strategy in terms of increasing the financial management efficiency of this operation.
Parallel Processing and Scientific Applications
1992-11-30
[Fragmentary snippet recovered from this report, mixing body text and references: lattice QCD calculations on the Connection Machine (SIAM News 24, 1, May 1991); C. F. Baillie and D. A. Johnston on the crumpling of dynamically triangulated random surfaces (one variant on a fixed hypercubic lattice, one triangulated once at the start of the simulation, one randomly re-triangulated); R. Gupta et al., QCD with Dynamical Wilson Fermions II, Phys. Rev. D44, 3272 (1991); R. Gupta and C. F. Baillie, Critical Behavior of the 2D XY Model, Phys. Rev.]
The dynamic analysis of drum roll lathe for machining of rollers
NASA Astrophysics Data System (ADS)
Qiao, Zheng; Wu, Dongxu; Wang, Bo; Li, Guo; Wang, Huiming; Ding, Fei
2014-08-01
An ultra-precision machine tool for machining rollers has been designed and assembled. Because the dynamic characteristics of the machine tool strongly affect the quality of the microstructures on the roller surface, this paper analyzes the dynamic characteristics of the existing machine, including the influence of mounting a large-scale, slender roller in the machine. First, a finite element model of the machine tool is built and simplified; modal analysis based on this model yields the natural frequencies and mode shapes of the first four modes. These results indicate where the low-stiffness subsystems of the machine can be improved and inform the choice of a suitable control-system bandwidth. Finally, transient analysis is conducted in ANSYS to account for the shocks that frequent fast positioning of the Z axis imparts to the feed system and cutting tool. Based on the transient results, the vibration behavior of the machine's key components and its impact on the cutting process are explored.
Yip, T C-F; Ma, A J; Wong, V W-S; Tse, Y-K; Chan, H L-Y; Yuen, P-C; Wong, G L-H
2017-08-01
Non-alcoholic fatty liver disease (NAFLD) affects 20%-40% of the general population in developed countries and is an increasingly important cause of hepatocellular carcinoma. Electronic medical records facilitate large-scale epidemiological studies, but existing NAFLD scores often require clinical and anthropometric parameters that may not be captured in those databases. We aimed to develop and validate a laboratory-parameter-based machine learning model to detect NAFLD in the general population. We randomly divided 922 subjects from a population screening study into training and validation groups; NAFLD was diagnosed by proton-magnetic resonance spectroscopy. On the basis of machine learning from 23 routine clinical and laboratory parameters after elastic net regularization, we evaluated logistic regression, ridge regression, AdaBoost and decision tree models. The areas under the receiver-operating characteristic curve (AUROC) of the models in the validation group were compared. Six predictors, including alanine aminotransferase, high-density lipoprotein cholesterol, triglyceride, haemoglobin A1c, white blood cell count and the presence of hypertension, were selected. The NAFLD ridge score achieved AUROCs of 0.87 (95% CI 0.83-0.90) and 0.88 (0.84-0.91) in the training and validation groups, respectively. Using dual cut-offs of 0.24 and 0.44, the NAFLD ridge score achieved 92% (86%-96%) sensitivity and 90% (86%-93%) specificity with corresponding negative and positive predictive values of 96% (91%-98%) and 69% (59%-78%), and 87% overall accuracy among the 70% of classifiable subjects in the validation group; 30% of subjects remained indeterminate. The NAFLD ridge score is a simple and robust reference comparable to existing NAFLD scores to exclude NAFLD patients in epidemiological studies. © 2017 John Wiley & Sons Ltd.
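A sketch of the dual cut-off logic on top of an L2-penalized ("ridge") logistic model is shown below. The cut-offs 0.24 and 0.44 and the six-predictor layout come from the abstract; the data are synthetic stand-ins.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Six predictors standing in for ALT, HDL-C, TG, HbA1c, WBC, hypertension.
X = rng.normal(size=(922, 6))
y = rng.integers(0, 2, 922)  # placeholder NAFLD labels

# L2 ("ridge") penalized logistic regression.
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)
p = model.predict_proba(X)[:, 1]

# Dual cut-offs: below the lower one rules NAFLD out, above the upper
# one rules it in, and the band in between is left indeterminate.
LOW, HIGH = 0.24, 0.44
label = np.where(p < LOW, "exclude NAFLD",
                 np.where(p > HIGH, "suspect NAFLD", "indeterminate"))
print({k: int((label == k).sum()) for k in np.unique(label)})
```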
Shovel, Louisa; Max, Bryan; Correll, Darin J
2016-01-01
The purpose of this study was to determine whether an instructional card describing proper use of the device, attached to the patient-controlled analgesia (PCA) machine following total hip arthroplasty, would positively affect subjects' understanding of device usage, pain scores, pain medication consumption, and satisfaction. Eighty adults undergoing total hip replacements who had been prescribed PCA were randomized into two study groups. Forty participants received the standard post-operative instruction on PCA device usage at our institution. The other 40 participants received the standard of care and, in addition, were given a typed instructional card immediately post-operatively describing proper PCA device use. This card was attached to the PCA device during their recovery period. On post-operative day one, each patient completed a questionnaire on PCA usage, pain scores and satisfaction scores. The pain scores in the Instructional Card group were significantly lower than in the Control group (p = 0.024). Subjects' understanding of PCA usage was also improved in the Instructional Card group for six of the seven questions asked. The findings from this study strongly support the conclusion that postoperative patient information on proper PCA use by means of an instructional card improves pain control and hence the overall recovery of patients undergoing surgery. In addition, through improved understanding it adds an important safety feature in that patients and potentially their family members and/or friends may refrain from PCA-by-proxy. This article demonstrates that the simple intervention of adding an instructional card to a PCA machine is an effective method to improve patients' knowledge as well as pain control and potentially increase the safety of device use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yahya, Noorazrul, E-mail: noorazrul.yahya@research.uwa.edu.au; Ebert, Martin A.; Bulsara, Max
Purpose: Given the paucity of available data concerning radiotherapy-induced urinary toxicity, it is important to ensure derivation of the most robust models with superior predictive performance. This work explores multiple statistical-learning strategies for prediction of urinary symptoms following external beam radiotherapy of the prostate. Methods: The performance of logistic regression, elastic-net, support-vector machine, random forest, neural network, and multivariate adaptive regression splines (MARS) to predict urinary symptoms was analyzed using data from 754 participants accrued by TROG03.04-RADAR. Predictive features included dose-surface data, comorbidities, and medication-intake. Four symptoms were analyzed: dysuria, haematuria, incontinence, and frequency, each with three definitions (grade ≥ 1, grade ≥ 2 and longitudinal) with event rates between 2.3% and 76.1%. Repeated cross-validations producing matched models were implemented. A synthetic minority oversampling technique was utilized in endpoints with rare events. Parameter optimization was performed on the training data. Area under the receiver operating characteristic curve (AUROC) was used to compare performance, with a sample size to detect differences of ≥0.05 at the 95% confidence level. Results: Logistic regression, elastic-net, random forest, MARS, and support-vector machine were the highest-performing statistical-learning strategies in 3, 3, 3, 2, and 1 endpoints, respectively. Logistic regression, MARS, elastic-net, random forest, neural network, and support-vector machine were the best, or were not significantly worse than the best, in 7, 7, 5, 5, 3, and 1 endpoints. The best-performing statistical model was for dysuria grade ≥ 1 with AUROC ± standard deviation of 0.649 ± 0.074 using MARS. For longitudinal frequency and dysuria grade ≥ 1, all strategies produced AUROC>0.6 while all haematuria endpoints and longitudinal incontinence models produced AUROC<0.6. Conclusions: Logistic regression and MARS were most likely to be the best-performing strategy for the prediction of urinary symptoms, with elastic-net and random forest producing competitive results. The predictive power of the models was modest and endpoint-dependent. New features, including spatial dose maps, may be necessary to achieve better models.
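One way to reproduce the shape of this pipeline with common tools is sketched below, using imbalanced-learn's SMOTE inside a pipeline so that oversampling happens only on training folds, never on held-out data. The models, features, and event rate are assumptions.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(754, 20))                 # dose-surface + clinical features
y = (rng.random(754) < 0.05).astype(int)       # rare toxicity endpoint (~5%)

cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=200))]:
    # SMOTE sits inside the pipeline so oversampling happens only on
    # each training fold, never on the held-out fold.
    pipe = Pipeline([("smote", SMOTE(random_state=0)), ("clf", clf)])
    aucs = cross_val_score(pipe, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: AUROC {aucs.mean():.3f} ± {aucs.std():.3f}")
```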
Effective dust control systems on concrete dowel drilling machinery.
Echt, Alan S; Sanderson, Wayne T; Mead, Kenneth R; Feng, H Amy; Farwick, Daniel R; Farwick, Dawn Ramsey
2016-09-01
Rotary-type percussion dowel drilling machines, which drill horizontal holes in concrete pavement, have been documented to produce respirable crystalline silica concentrations above recommended exposure criteria. This places operators at potential risk for developing health effects from exposure. United States manufacturers of these machines offer optional dust control systems. The effectiveness of the dust control systems to reduce respirable dust concentrations on two types of drilling machines was evaluated under controlled conditions with the machines operating inside large tent structures in an effort to eliminate secondary exposure sources not related to the dowel-drilling operation. Area air samples were collected at breathing zone height at three locations around each machine. Through equal numbers of sampling rounds with the control systems randomly selected to be on or off, the control systems were found to significantly reduce respirable dust concentrations from a geometric mean of 54 mg per cubic meter to 3.0 mg per cubic meter on one machine and 57 mg per cubic meter to 5.3 mg per cubic meter on the other machine. This research shows that the dust control systems can dramatically reduce respirable dust concentrations by over 90% under controlled conditions. However, these systems need to be evaluated under actual work conditions to determine their effectiveness in reducing worker exposures to crystalline silica below hazardous levels.
Monroy-Parada, Doris Xiomara; Ángeles Moya, María; José Bosqued, María; López, Lázaro; Rodríguez-Artalejo, Fernando; Royo-Bordonada, Miguel Ángel
2016-06-09
Policies restricting access to sugary drinks and unhealthy foods in the school environment are associated with healthier consumption patterns. In 2010, Spain approved a Consensus Document on Food at Schools with nutritional criteria to improve the nutritional profile of foods and drinks served at schools. The objective of this study was to describe the frequency of food and drink vending machines at secondary schools in Madrid, the products offered in them, and their nutritional profile. Cross-sectional study of a random sample of 330 secondary schools in Madrid in 2014-2015. The characteristics of the schools and the existence of vending machines were recorded through the internet and by telephone interview. The products offered in a representative sample of 6 vending machines were identified by in situ inspection, and their nutritional composition was taken from their labelling. Finally, the nutritional profile of each product was analyzed with the United Kingdom profile model, which classifies products as healthy or less healthy. The prevalence of vending machines was 17.3%. Among the products offered, 80.5% were less healthy foods and drinks (high in energy, fat or sugar and poor in nutrients) and 10.5% were healthy products. Vending machines are common at secondary schools in Madrid, and most of the products offered in them are still less healthy.
Risk estimation using probability machines.
Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D
2014-03-01
Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from.
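The counterfactual effect-size idea can be sketched as follows: fit a random forest "probability machine", then average the change in predicted probability when the exposure of interest is toggled for every subject. The logistic data-generating model below is a toy assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000
x1 = rng.integers(0, 2, n)                     # binary exposure of interest
x2 = rng.normal(size=n)                        # continuous covariate
logit = -1.0 + 1.2 * x1 + 0.8 * x2
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # logistic data-generating model

X = np.column_stack([x1, x2])
rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20,
                            random_state=0).fit(X, y)

# Counterfactual risk difference: set the exposure to 1 and to 0 for
# everyone, and average the change in predicted probability.
X1, X0 = X.copy(), X.copy()
X1[:, 0], X0[:, 0] = 1, 0
risk_diff = (rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]).mean()
print(f"estimated risk difference: {risk_diff:.3f}")
```

Large leaves (`min_samples_leaf=20`) smooth the forest's probability estimates, which matters more here than raw classification accuracy.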
Defining and Testing the Influence of Servo System Response on Machine Tool Compliance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopkins, D J
2004-03-24
Compliance can be defined as the measured displacement per unit of applied force, e.g., nanometres per newton (nm/N). Compliance is the reciprocal of stiffness: high stiffness means low compliance and vice versa. It is an important factor in machine tool characteristics because it reflects the ability of a machine axis to maintain a desired position as it encounters a force or torque. Static compliance is a measurement made with a constant force applied, e.g., the average depth of cut. Dynamic compliance is a measurement made as a function of frequency, e.g., a fast tool servo (FTS) that applies a varying cutting force or load, interrupted cuts, and external disturbances such as ground vibrations or air-conditioning-induced forces on the machine. Compliance can be defined for both a linear and a rotary axis of a machine tool. However, to properly define compliance for a rotary axis, the axis must allow a commanded angular position; note that this excludes velocity-only axes. In this paper, several factors that affect compliance are discussed, but emphasis is placed on how the machine servo system plays a key role in compliance in the low- to mid-frequency regions. The paper discusses several techniques for measuring compliance and provides examples of results from these measurements.
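For a single-mode toy axis, dynamic compliance is just the magnitude of the frequency response X/F of a mass-spring-damper, which the sketch below evaluates directly. The parameter values are illustrative assumptions, not measurements from the paper.

```python
import numpy as np

# Toy axis model: a single mode with mass m, damping c, stiffness k.
m, c, k = 10.0, 50.0, 2.0e7          # kg, N·s/m, N/m (illustrative values)
f = np.linspace(1, 500, 2000)        # Hz
w = 2 * np.pi * f

# Dynamic compliance X/F of a 1-DOF system: 1 / (k - m w^2 + j c w).
compliance = 1.0 / (k - m * w**2 + 1j * c * w)

print(f"static compliance 1/k = {1/k:.2e} m/N")
i = np.argmax(np.abs(compliance))
print(f"peak compliance {np.abs(compliance)[i]:.2e} m/N near {f[i]:.0f} Hz")
```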
Intelligent image processing for machine safety
NASA Astrophysics Data System (ADS)
Harvey, Dennis N.
1994-10-01
This paper describes the use of intelligent image processing as a machine guarding technology. One or more color, linear-array cameras are positioned to view the critical region(s) around a machine tool or other piece of manufacturing equipment. The image data are processed to provide indicators of conditions dangerous to the equipment via color content, shape content, and motion content. The data from these analyses are then sent to a threat evaluator. The purpose of the evaluator is to determine whether a potentially machine-damaging condition exists, based on the analyses of color, shape, and motion, and on 'knowledge' of the specific environment of the machine. The threat evaluator employs fuzzy logic as a means of dealing with uncertainty in the vision data.
Nakai, Yasushi; Takiguchi, Tetsuya; Matsui, Gakuyo; Yamaoka, Noriko; Takada, Satoshi
2017-10-01
Abnormal prosody is often evident in the voice intonations of individuals with autism spectrum disorders. We compared a machine-learning-based voice analysis with human hearing judgments made by 10 speech therapists for classifying children with autism spectrum disorders (n = 30) and typical development (n = 51). Using stimuli limited to single-word utterances, machine-learning-based voice analysis was superior to speech therapist judgments. There was a significantly higher true-positive than false-negative rate for machine-learning-based voice analysis but not for speech therapists. Results are discussed in terms of some artificiality of clinician judgments based on single-word utterances, and the objectivity machine-learning-based voice analysis adds to judging abnormal prosody.
Advantages of Synthetic Noise and Machine Learning for Analyzing Radioecological Data Sets.
Shuryak, Igor
2017-01-01
The ecological effects of accidental or malicious radioactive contamination are insufficiently understood because of the hazards and difficulties associated with conducting studies in radioactively-polluted areas. Data sets from severely contaminated locations can therefore be small. Moreover, many potentially important factors, such as soil concentrations of toxic chemicals, pH, and temperature, can be correlated with radiation levels and with each other. In such situations, commonly-used statistical techniques like generalized linear models (GLMs) may not be able to provide useful information about how radiation and/or these other variables affect the outcome (e.g. abundance of the studied organisms). Ensemble machine learning methods such as random forests offer powerful alternatives. We propose that analysis of small radioecological data sets by GLMs and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data, and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plant accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA). We show that the proposed techniques were advantageous compared with the methodology used in the original publications where the data sets were presented. Specifically, our approach identified a negative effect of radioactive contamination in data set I, and suggested that in data set II stable chromium could have been a stronger limiting factor for bacterial abundance than the radionuclides 137Cs and 99Tc. This new information, which was extracted from these data sets using the proposed techniques, can potentially enhance the design of radioactive waste bioremediation.
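Technique (1) is easy to sketch: append pure-noise columns, fit a random forest, and use the largest noise importance as a benchmark below which a predictor's importance is indistinguishable from chance. The toy data below are assumptions, not the published data sets.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 60                                     # small, radioecology-sized data set
radiation = rng.lognormal(size=n)
ph = rng.normal(6.5, 0.5, size=n)
abundance = 10 - 1.5 * np.log(radiation) + rng.normal(scale=2, size=n)

X = np.column_stack([radiation, ph])
names = ["radiation", "pH"]

# Technique (1): append pure-noise columns as an importance benchmark.
n_noise = 5
X_aug = np.column_stack([X, rng.normal(size=(n, n_noise))])
names += [f"noise{i}" for i in range(n_noise)]

rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_aug, abundance)
benchmark = rf.feature_importances_[-n_noise:].max()
for name, imp in zip(names, rf.feature_importances_):
    flag = "above noise benchmark" if imp > benchmark else ""
    print(f"{name:10s} {imp:.3f} {flag}")
```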
A Bayesian approach to in silico blood-brain barrier penetration modeling.
Martins, Ines Filipa; Teixeira, Ana L; Pinheiro, Luis; Falcao, Andre O
2012-06-25
The human blood-brain barrier (BBB) is a membrane that protects the central nervous system (CNS) by restricting the passage of solutes. The development of any new drug must take its existence into account, whether for designing new molecules that target components of the CNS or, on the other hand, for finding new substances that should not penetrate the barrier. Several studies in the literature have attempted to predict BBB penetration, so far with limited success and few, if any, applications to real-world drug discovery and development programs. Part of the reason is that only about 2% of small molecules can cross the BBB, and the available data sets are not representative of that reality, being generally biased with an over-representation of molecules that show an ability to permeate the BBB (BBB positives). To circumvent this limitation, the current study aims to devise and use a new approach based on Bayesian statistics, coupled with state-of-the-art machine learning methods, to produce a robust model capable of being applied in real-world drug research scenarios. The data set used, gathered from the literature, totals 1970 curated molecules, one of the largest for similar studies. Random Forests and Support Vector Machines were tested in various configurations against several chemical descriptor set combinations. Models were tested in a 5-fold cross-validation process, and the best one was tested over an independent validation set. The best fitted model produced an overall accuracy of 95%, with a mean square contingency coefficient (ϕ) of 0.74, and showed a capacity of 83% for predicting BBB positives and 96% for determining BBB negatives. This model was adapted into a Web-based tool made available for the whole community at http://b3pp.lasige.di.fc.ul.pt.
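For a 2x2 table, the mean square contingency coefficient ϕ coincides with the Matthews correlation coefficient, so a cross-validated comparison in that metric can be sketched as below. The descriptor matrix and class balance are synthetic stand-ins.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import make_scorer, matthews_corrcoef
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1970, 50))                   # descriptor matrix stand-in
y = (rng.random(1970) < 0.8).astype(int)          # BBB positives over-represented

phi = make_scorer(matthews_corrcoef)              # phi == MCC for 2x2 tables
models = {
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "svm": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring=phi)
    print(f"{name}: phi = {scores.mean():.2f} ± {scores.std():.2f}")
```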
Osteoporosis risk prediction using machine learning and conventional methods.
Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won
2013-01-01
A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared them with a conventional clinical decision tool, the osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) using various predictors associated with low bone density. The learning models were compared with the OST. SVM had a significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to compare the performance of osteoporosis prediction between machine learning and conventional methods using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
Machine vision based quality inspection of flat glass products
NASA Astrophysics Data System (ADS)
Zauner, G.; Schagerl, M.
2014-03-01
This application paper presents a machine vision solution for the quality inspection of flat glass products. A contact image sensor (CIS) is used to generate digital images of the glass surfaces. The presented machine vision based quality inspection at the end of the production line aims to classify five different glass defect types. The defect images are usually characterized by very little 'image structure', i.e. homogeneous regions without distinct image texture. Additionally, these defect images usually consist of only a few pixels. At the same time the appearance of certain defect classes can be very diverse (e.g. water drops). We used simple state-of-the-art image features like histogram-based features (standard deviation, kurtosis, skewness), geometric features (form factor/elongation, eccentricity, Hu moments) and texture features (grey level run length matrix, co-occurrence matrix) to extract defect information. The main contribution of this work lies in the systematic evaluation of various machine learning algorithms to identify appropriate classification approaches for this specific class of images. The following machine learning algorithms were compared: decision tree (J48), random forest, JRip rules, naive Bayes, Support Vector Machine (multi-class), neural network (multilayer perceptron) and k-Nearest Neighbour. We used a representative image database of 2300 defect images and applied cross-validation for evaluation purposes.
14 CFR 382.3 - What do the terms in this rule mean?
Code of Federal Regulations, 2011 CFR
2011-01-01
... and places between which those flights are performed. CPAP machine means a continuous positive airway pressure machine. Department or DOT means the United States Department of Transportation. Direct threat... learning disabilities. The term physical or mental impairment includes, but is not limited to, such...
5 CFR 532.279 - Special wage schedules for printing positions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Opaquer 4 Offset Press Helper 5 Bindery Machine Operator (Helper) 5 Film Assembler-Stripper (Single Flat-Single Color) 5 Platemaker (Single Color) 5 Film Assembler-Stripper (Partial and Composite Flats) 7... Cutter) 8 Bindery Machine Operator (Power Folder) 8 Film Assembler-Stripper (Multiple Flat-Multiple Color...
5 CFR 532.279 - Special wage schedules for printing positions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Opaquer 4 Offset Press Helper 5 Bindery Machine Operator (Helper) 5 Film Assembler-Stripper (Single Flat-Single Color) 5 Platemaker (Single Color) 5 Film Assembler-Stripper (Partial and Composite Flats) 7... Cutter) 8 Bindery Machine Operator (Power Folder) 8 Film Assembler-Stripper (Multiple Flat-Multiple Color...
14 CFR 382.3 - What do the terms in this rule mean?
Code of Federal Regulations, 2012 CFR
2012-01-01
... and places between which those flights are performed. CPAP machine means a continuous positive airway pressure machine. Department or DOT means the United States Department of Transportation. Direct threat... learning disabilities. The term physical or mental impairment includes, but is not limited to, such...
14 CFR 382.3 - What do the terms in this rule mean?
Code of Federal Regulations, 2013 CFR
2013-01-01
... and places between which those flights are performed. CPAP machine means a continuous positive airway pressure machine. Department or DOT means the United States Department of Transportation. Direct threat... learning disabilities. The term physical or mental impairment includes, but is not limited to, such...
NASA Astrophysics Data System (ADS)
Othman, Arsalan A.; Gloaguen, Richard
2017-09-01
Lithological mapping in mountainous regions is often impeded by limited accessibility due to relief. This study aims to evaluate (1) the performance of different supervised classification approaches using remote sensing data and (2) the use of additional information such as geomorphology. We exemplify the methodology in the Bardi-Zard area in NE Iraq, part of the Zagros Fold-Thrust Belt, known for its chromite deposits. We highlight the improvement of remote sensing geological classification by integrating geomorphic features and spatial information in the classification scheme. We performed a Maximum Likelihood (ML) classification besides two Machine Learning Algorithms (MLA), Support Vector Machine (SVM) and Random Forest (RF), to allow the joint use of geomorphic features, Band Ratio (BR), Principal Component Analysis (PCA), spatial information (spatial coordinates) and multispectral data of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) satellite. The RF algorithm showed reliable results and discriminated serpentinite, talus and terrace deposits, red argillites with conglomerates and limestone, limy conglomerates and limestone conglomerates, tuffites interbedded with basic lavas, limestone, and metamorphosed limestone and reddish-green shales. The best overall accuracy (∼80%) was achieved by the Random Forest (RF) algorithm in the majority of the sixteen tested dataset combinations.
NASA Astrophysics Data System (ADS)
Pham, Binh Thai; Prakash, Indra; Tien Bui, Dieu
2018-02-01
A hybrid machine learning approach combining Random Subspace (RSS) and Classification And Regression Trees (CART) is proposed to develop a model, named RSSCART, for spatial prediction of landslides. This model combines the RSS method, an efficient ensemble technique, with CART, a state-of-the-art classifier. The Luc Yen district of Yen Bai province, a prominent landslide-prone area of Viet Nam, was selected for the model development. Performance of the RSSCART model was evaluated through the Receiver Operating Characteristic (ROC) curve, statistical analysis methods, and the Chi Square test. Results were compared with other benchmark landslide models, namely Support Vector Machines (SVM), single CART, Naïve Bayes Trees (NBT), and Logistic Regression (LR). In the development of the model, ten important landslide-affecting factors related to geomorphology, geology, and geo-environment were considered, namely slope angle, elevation, slope aspect, curvature, lithology, distance to faults, distance to rivers, distance to roads, and rainfall. Performance of the RSSCART model (AUC = 0.841) is the best compared with the other popular landslide models, namely SVM (0.835), single CART (0.822), NBT (0.821), and LR (0.723). These results indicate that the RSSCART is a promising method for spatial landslide prediction.
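One readily available approximation of an RSS ensemble of CART trees is scikit-learn's BaggingClassifier with sample bootstrapping disabled and per-estimator feature subsampling; its default base estimator is a CART-style decision tree. The landslide factors below are random placeholders, not the study's data.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Ten conditioning factors (slope, elevation, aspect, lithology, ...).
X = rng.normal(size=(800, 10))
y = rng.integers(0, 2, 800)  # landslide / no-landslide (placeholder)

# Random subspace: each tree sees a random half of the features, with
# sample bootstrapping turned off. BaggingClassifier's default base
# estimator is a CART-style decision tree.
rss_cart = BaggingClassifier(n_estimators=100, max_features=0.5,
                             bootstrap=False, random_state=0)
print(cross_val_score(rss_cart, X, y, cv=5, scoring="roc_auc").mean())
```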
Predicting human liver microsomal stability with machine learning techniques.
Sakiyama, Yojiro; Yuki, Hitomi; Moriya, Takashi; Hattori, Kazunari; Suzuki, Misaki; Shimada, Kaoru; Honma, Teruki
2008-02-01
To ensure a continuing pipeline in pharmaceutical research, lead candidates must possess appropriate metabolic stability in the drug discovery process. In vitro ADMET (absorption, distribution, metabolism, elimination, and toxicity) screening provides useful information regarding the metabolic stability of compounds. However, before the synthesis stage, an efficient process is required in order to deal with the vast quantity of data from large compound libraries and high-throughput screening. Here we derived a relationship between chemical structure and metabolic stability for a data set of in-house compounds by means of various in silico machine learning methods such as random forest, support vector machine (SVM), logistic regression, and recursive partitioning. For model building, 1952 proprietary compounds comprising two classes (stable/unstable) were used with 193 descriptors calculated by the Molecular Operating Environment. The results using test compounds demonstrated that all classifiers yielded satisfactory results (accuracy > 0.8, sensitivity > 0.9, specificity > 0.6, and precision > 0.8). Above all, classification by random forest as well as SVM yielded kappa values of approximately 0.7 in an independent validation set, slightly higher than the other classification tools. These results suggest that nonlinear/ensemble-based classification methods might prove useful in the area of in silico ADME modeling.
Mortality risk score prediction in an elderly population using machine learning.
Rose, Sherri
2013-03-01
Standard practice for prediction often relies on parametric regression methods. Interesting new methods from the machine learning literature have been introduced in epidemiologic studies, such as random forest and neural networks. However, a priori, an investigator will not know which algorithm to select and may wish to try several. Here I apply the super learner, an ensembling machine learning approach that combines multiple algorithms into a single algorithm and returns a prediction function with the best cross-validated mean squared error. Super learning is a generalization of stacking methods. I used super learning in the Study of Physical Performance and Age-Related Changes in Sonomans (SPPARCS) to predict death among 2,066 residents of Sonoma, California, aged 54 years or more during the period 1993-1999. The super learner for predicting death (risk score) improved upon all single algorithms in the collection of algorithms, although its performance was similar to that of several algorithms. Super learner outperformed the worst algorithm (neural networks) by 44% with respect to estimated cross-validated mean squared error and had an R2 value of 0.201. The improvement of super learner over random forest with respect to R2 was approximately 2-fold. Alternatives for risk score prediction include the super learner, which can provide improved performance.
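scikit-learn's StackingClassifier captures the core of this ensembling step, feeding out-of-fold predictions from a library of algorithms to a meta-learner, although the genuine super learner selects combination weights by minimizing cross-validated risk. The candidate library below is an assumption.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2066, n_features=15, random_state=0)

# Candidate library: the meta-learner combines their out-of-fold
# predictions, mimicking the super learner's ensembling step.
library = [
    ("logistic", LogisticRegression(max_iter=1000)),
    ("forest", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("nnet", MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                           random_state=0)),
]
stack = StackingClassifier(estimators=library,
                           final_estimator=LogisticRegression(), cv=5)
print(cross_val_score(stack, X, y, cv=5, scoring="neg_brier_score").mean())
```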
Learning molecular energies using localized graph kernels
Ferré, Grégoire; Haut, Terry Scot; Barros, Kipton Marcos
2017-03-21
We report that recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. Finally, we benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.
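A minimal random-walk kernel between two adjacency matrices can be written via the Kronecker-product fixed-point form; the decay parameter and toy graphs below are assumptions, and GRAPE itself adds localization and other refinements on top of this basic idea.

```python
import numpy as np

def random_walk_kernel(A, B, lam=0.05):
    """Random-walk graph kernel: counts matching walks of all lengths
    in two graphs, weighted by lam**length, via the Kronecker product."""
    W = np.kron(A, B)                       # product-graph adjacency
    n = W.shape[0]
    # sum_k lam^k (A (x) B)^k == (I - lam*W)^{-1}, for lam small enough
    # (lam below 1 / spectral radius of W guarantees convergence).
    K = np.linalg.inv(np.eye(n) - lam * W)
    return K.sum()

# Two toy local environments encoded as adjacency matrices.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)  # path graph
B = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)  # triangle
print(random_walk_kernel(A, A), random_walk_kernel(A, B))
```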
Trajectories of the ribosome as a Brownian nanomachine
Dashti, Ali; Schwander, Peter; Langlois, Robert; ...
2014-11-24
A Brownian machine is a tiny device that, buffeted by the random motions of molecules in its environment, is capable of exploiting these thermal motions for many of the conformational changes in its work cycle. Such machines are now thought to be ubiquitous, with the ribosome, a molecular machine responsible for protein synthesis, increasingly regarded as prototypical. We present a new analytical approach capable of determining the free-energy landscape and the continuous trajectories of molecular machines from a large number of snapshots obtained by cryogenic electron microscopy. We demonstrate this approach in the context of experimental cryogenic electron microscope images of a large ensemble of nontranslating ribosomes purified from yeast cells. The free-energy landscape is seen to contain a closed path of low energy, along which the ribosome exhibits conformational changes known to be associated with the elongation cycle. This approach allows model-free quantitative analysis of the degrees of freedom and the energy landscape underlying continuous conformational changes in nanomachines, including those important for biological function.
Building a profile of subjective well-being for social media users.
Chen, Lushi; Gong, Tao; Kosinski, Michal; Stillwell, David; Davidson, Robert L
2017-01-01
Subjective well-being includes 'affect' and 'satisfaction with life' (SWL). This study proposes a unified approach to construct a profile of subjective well-being based on social media language in Facebook status updates. We apply sentiment analysis to generate users' affect scores, and train a random forest model to predict SWL using affect scores and other language features of the status updates. Results show that: the computer-selected features resemble the key predictors of SWL as identified in early studies; the machine-predicted SWL is moderately correlated with the self-reported SWL (r = 0.36, p < 0.01), indicating that language-based assessment can constitute valid SWL measures; the machine-assessed affect scores resemble those reported in a previous experimental study; and the machine-predicted subjective well-being profile can also reflect other psychological traits like depression (r = 0.24, p < 0.01). This study provides important insights for psychological prediction using multiple, machine-assessed components and longitudinal or dense psychological assessment using social media language.
Can We Train Machine Learning Methods to Outperform the High-dimensional Propensity Score Algorithm?
Karim, Mohammad Ehsanul; Pang, Menglan; Platt, Robert W
2018-03-01
The use of retrospective health care claims datasets is frequently criticized for the lack of complete information on potential confounders. By utilizing patients' health status-related information from claims datasets as surrogates or proxies for mismeasured and unobserved confounders, the high-dimensional propensity score algorithm enables us to reduce bias. Using a previously published cohort study of postmyocardial infarction statin use (1998-2012), we compare the performance of the algorithm with a number of popular machine learning approaches for confounder selection in high-dimensional covariate spaces: random forest, least absolute shrinkage and selection operator, and elastic net. Our results suggest that, when the data analysis is done with epidemiologic principles in mind, machine learning methods perform as well as the high-dimensional propensity score algorithm. Using a plasmode framework that mimicked the empirical data, we also showed that a hybrid of machine learning and high-dimensional propensity score algorithms generally performs slightly better than either in terms of mean squared error, when a bias-based analysis is used.
Precise positioning method for multi-process connecting based on binocular vision
NASA Astrophysics Data System (ADS)
Liu, Wei; Ding, Lichao; Zhao, Kai; Li, Xiao; Wang, Ling; Jia, Zhenyuan
2016-01-01
With the rapid development of aviation and aerospace, the demand for metal-coated parts such as antenna reflectors, eddy-current sensors, and signal transmitters is increasingly urgent. Such parts, with their varied feature dimensions, complex three-dimensional structures, and high geometric accuracy, are generally fabricated by a combination of different manufacturing technologies. However, it is difficult to ensure the machining precision because of the connection error between the different processing methods. Therefore, a precise positioning method based on binocular micro stereo vision is proposed in this paper. Firstly, a novel and efficient camera calibration method for the stereoscopic microscope is presented to address the problems of a narrow field of view, a small depth of focus, and strong nonlinear distortions. Secondly, extraction algorithms for law-defined curves and free-form curves are given, and the spatial position relationship between the micro vision system and the machining system is determined accurately. Thirdly, a precise positioning system based on micro stereo vision is set up and embedded in a CNC machining experiment platform. Finally, a verification experiment of the positioning accuracy is conducted; the results indicate that the average errors of the proposed method in the X and Y directions are 2.250 μm and 1.777 μm, respectively.
Supervised machine learning for analysing spectra of exoplanetary atmospheres
NASA Astrophysics Data System (ADS)
Márquez-Neila, Pablo; Fisher, Chloe; Sznitman, Raphael; Heng, Kevin
2018-06-01
The use of machine learning is becoming ubiquitous in astronomy [1-3], but remains rare in the study of the atmospheres of exoplanets. Given the spectrum of an exoplanetary atmosphere, a multi-parameter space is swept through in real time to find the best-fit model [4-6]. Known as atmospheric retrieval, this technique originates in the Earth and planetary sciences [7]. Such methods are very time-consuming, and by necessity there is a compromise between physical and chemical realism and computational feasibility. Machine learning has previously been used to determine which molecules to include in the model, but the retrieval itself was still performed using standard methods [8]. Here, we report an adaptation of the 'random forest' method of supervised machine learning [9,10], trained on a precomputed grid of atmospheric models, which retrieves full posterior distributions of the abundances of molecules and the cloud opacity. The use of a precomputed grid allows a large part of the computational burden to be shifted offline. We demonstrate our technique on a transmission spectrum of the hot gas-giant exoplanet WASP-12b using a five-parameter model (temperature, a constant cloud opacity and the volume mixing ratios or relative abundances of molecules of water, ammonia and hydrogen cyanide) [11]. We obtain results consistent with the standard nested-sampling retrieval method. We also estimate the sensitivity of the measured spectrum to the model parameters, and we are able to quantify the information content of the spectrum. Our method can be straightforwardly applied using more sophisticated atmospheric models to interpret an ensemble of spectra without having to retrain the random forest.
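The spread of per-tree predictions offers a cheap stand-in for the retrieved posterior. The sketch below trains a forest on a synthetic "precomputed grid" with temperature as the only parameter, a drastic simplification of the five-parameter model used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Precomputed grid: each row is a model transmission spectrum generated
# from one temperature (the other model parameters are omitted here).
n_models, n_bins = 5000, 13
temperature = rng.uniform(500, 3000, n_models)
spectra = (np.outer(temperature, np.linspace(0.5, 1.5, n_bins)) / 3000
           + rng.normal(scale=0.01, size=(n_models, n_bins)))

forest = RandomForestRegressor(n_estimators=400, random_state=0)
forest.fit(spectra, temperature)

# "Retrieval": the spread of per-tree predictions approximates a
# posterior distribution for the observed spectrum's temperature.
observed = spectra[0] + rng.normal(scale=0.01, size=n_bins)
per_tree = np.array([t.predict(observed.reshape(1, -1))[0]
                     for t in forest.estimators_])
print(f"T = {per_tree.mean():.0f} K  (16-84%: "
      f"{np.percentile(per_tree, 16):.0f}-{np.percentile(per_tree, 84):.0f} K)")
```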
Chen, Yang; Luo, Yan; Huang, Wei; Hu, Die; Zheng, Rong-Qin; Cong, Shu-Zhen; Meng, Fan-Kun; Yang, Hong; Lin, Hong-Jun; Sun, Yan; Wang, Xiu-Yan; Wu, Tao; Ren, Jie; Pei, Shu-Fang; Zheng, Ying; He, Yun; Hu, Yu; Yang, Na; Yan, Hongmei
2017-10-01
Hepatic fibrosis is a common middle stage of the pathological processes of chronic liver diseases. Clinical intervention during the early stages of hepatic fibrosis can slow the development of liver cirrhosis and reduce the risk of developing liver cancer. Liver biopsy, the gold standard for viral liver disease management, has drawbacks such as invasiveness and a relatively high sampling error rate. Real-time tissue elastography (RTE), one of the most recently developed technologies, might be a promising imaging technology because it is both noninvasive and provides accurate assessments of hepatic fibrosis. However, determining the stage of liver fibrosis from RTE images in the clinic is a challenging task. In this study, in contrast to the previous liver fibrosis index (LFI) method, which predicts the stage using RTE images and multiple regression analysis, we employed four classical classifiers (Support Vector Machine, Naïve Bayes, Random Forest, and K-Nearest Neighbor) to build a decision-support system to improve hepatitis B stage diagnosis. Eleven RTE image features were obtained from 513 subjects who underwent liver biopsies in this multicenter collaborative research. The experimental results showed that the adopted classifiers significantly outperformed the LFI method and that the Random Forest (RF) classifier provided the highest average accuracy among the four machine algorithms. This result suggests that sophisticated machine-learning methods can be powerful tools for evaluating the stage of hepatic fibrosis and show promise for clinical applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
Integrating human and machine intelligence in galaxy morphology classification tasks
NASA Astrophysics Data System (ADS)
Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl
2018-06-01
Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold, classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
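A stripped-down version of this kind of Bayesian classification aggregation is sketched below: each volunteer carries class-conditional accuracies, and each vote updates a subject's posterior. This illustrates the principle only; it is not the actual SWAP implementation.

```python
def update_posterior(prior, vote, skill):
    """One Bayesian update of P(subject is 'featured') given a binary
    vote from a user with class-conditional accuracies (p1, p0)."""
    p_right_if_1, p_right_if_0 = skill
    if vote == 1:
        like_1, like_0 = p_right_if_1, 1 - p_right_if_0
    else:
        like_1, like_0 = 1 - p_right_if_1, p_right_if_0
    num = like_1 * prior
    return num / (num + like_0 * (1 - prior))

# Three volunteers classify one galaxy; skills are (P(say 1 | truth 1),
# P(say 0 | truth 0)), estimated from gold-standard subjects.
posterior = 0.5                       # uninformative prior
for vote, skill in [(1, (0.9, 0.8)), (1, (0.7, 0.7)), (0, (0.6, 0.9))]:
    posterior = update_posterior(posterior, vote, skill)
print(f"P(featured) = {posterior:.3f}")

# Subjects crossing a confidence threshold would be retired and routed
# to the human or machine pipeline accordingly.
```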
Alghamdi, Manal; Al-Mallah, Mouaz; Keteyian, Steven; Brawner, Clinton; Ehrman, Jonathan; Sakr, Sherif
2017-01-01
Machine learning is becoming a popular and important approach in the field of medical research. In this study, we investigate the relative performance of various machine learning methods such as Decision Tree, Naïve Bayes, Logistic Regression, Logistic Model Tree and Random Forests for predicting incident diabetes using medical records of cardiorespiratory fitness. In addition, we apply different techniques to uncover potential predictors of diabetes. This FIT project study used data from 32,555 patients free of any known coronary artery disease or heart failure who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 5-year follow-up. At the completion of the fifth year, 5,099 of those patients had developed diabetes. The dataset contained 62 attributes classified into four categories: demographic characteristics, disease history, medication use history, and stress test vital signs. We developed an ensembling-based predictive model using 13 attributes that were selected based on their clinical importance, Multiple Linear Regression, and Information Gain Ranking methods. The negative effect of the imbalanced classes on the constructed model was handled by the Synthetic Minority Oversampling Technique (SMOTE). The overall performance of the predictive model was improved by an ensemble machine learning approach using the Vote method with three decision-tree-based classifiers (Naïve Bayes Tree, Random Forest, and Logistic Model Tree), achieving high prediction accuracy (AUC = 0.92). The study shows the potential of ensembling and SMOTE approaches for predicting incident diabetes using cardiorespiratory fitness data.
Jauregi Unanue, Iñigo; Zare Borzeshi, Ehsan; Piccardi, Massimo
2017-12-01
Previous state-of-the-art systems for Drug Name Recognition (DNR) and Clinical Concept Extraction (CCE) have focused on a combination of text "feature engineering" and conventional machine learning algorithms such as conditional random fields and support vector machines. However, developing good features is inherently time-consuming. Conversely, more modern machine learning approaches such as recurrent neural networks (RNNs) have proved capable of automatically learning effective features from either random assignments or automated word "embeddings". Our aims were: (i) to create a highly accurate DNR and CCE system that avoids conventional, time-consuming feature engineering; (ii) to create richer, more specialized word embeddings by using health-domain datasets such as MIMIC-III; and (iii) to evaluate our systems over three contemporary datasets. Two deep learning methods, namely the Bidirectional LSTM and the Bidirectional LSTM-CRF, are evaluated. A CRF model is set as the baseline to compare the deep learning systems to a traditional machine learning approach. The same features are used for all the models. We obtained the best results with the Bidirectional LSTM-CRF model, which outperformed all previously proposed systems. The specialized embeddings helped to cover unusual words in DrugBank and MedLine, but not in the i2b2/VA dataset. We present a state-of-the-art system for DNR and CCE. Automated word embeddings have allowed us to avoid costly feature engineering and achieve higher accuracy. Nevertheless, the embeddings need to be retrained over datasets that are adequate for the domain, in order to adequately cover the domain-specific vocabulary. Copyright © 2017 Elsevier Inc. All rights reserved.
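A minimal PyTorch bidirectional LSTM tagger is sketched below; the CRF layer that gives the full BiLSTM-CRF its sequence-level scoring is omitted, and the vocabulary, tag set, and layer sizes are placeholder assumptions.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Token-level tagger: embeddings -> BiLSTM -> per-token tag scores.
    The full BiLSTM-CRF would replace the per-token softmax loss with a
    CRF layer that scores whole tag sequences."""
    def __init__(self, vocab_size, tagset_size, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.out = nn.Linear(2 * hidden, tagset_size)

    def forward(self, tokens):                  # tokens: (batch, seq)
        h, _ = self.lstm(self.emb(tokens))      # (batch, seq, 2*hidden)
        return self.out(h)                      # (batch, seq, n_tags)

model = BiLSTMTagger(vocab_size=5000, tagset_size=5)  # e.g. BIO drug tags
tokens = torch.randint(0, 5000, (2, 12))              # toy batch
tags = torch.randint(0, 5, (2, 12))
loss = nn.CrossEntropyLoss()(model(tokens).reshape(-1, 5), tags.reshape(-1))
loss.backward()
print(float(loss))
```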
Seismic activity prediction using computational intelligence techniques in northern Pakistan
NASA Astrophysics Data System (ADS)
Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat
2017-10-01
An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology involves an interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed from past earthquakes. The predictive ability of these eight parameters is evaluated in terms of information gain, leading to the selection of six parameters for prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions, with an accuracy of 75% and a positive predictive value of 78% in the context of northern Pakistan.
Zhang, A; Critchley, S; Monsour, P A
2016-12-01
The aim of the present study was to assess the current adoption of cone beam computed tomography (CBCT) and panoramic radiography (PR) machines across Australia. Information regarding registered CBCT and PR machines was obtained from radiation regulators across Australia. The number of X-ray machines was correlated with the population size, the number of dentists, and the gross state product (GSP) per capita, to determine the best-fitting regression model(s). In 2014, there were 232 CBCT and 1681 PR machines registered in Australia. Based on absolute counts, Queensland had the largest number of CBCT and PR machines whereas the Northern Territory had the smallest. However, when based on accessibility in terms of the population size and the number of dentists, the Australian Capital Territory had the most CBCT machines and Western Australia had the most PR machines. The number of X-ray machines correlated strongly with both the population size and the number of dentists, but not with the GSP per capita. In 2014, the ratio of PR to CBCT machines was approximately 7:1. Projected increases in either the population size or the number of dentists could positively affect the adoption of PR and CBCT machines in Australia. © 2016 Australian Dental Association.
Studies of the DIII-D disruption database using Machine Learning algorithms
NASA Astrophysics Data System (ADS)
Rea, Cristina; Granetz, Robert; Meneghini, Orso
2017-10-01
A Random Forests machine learning algorithm, trained on a large database of both disruptive and non-disruptive DIII-D discharges, predicts disruptive behavior in DIII-D with about 90% accuracy. Several algorithms have been tested, and Random Forests was found to perform best for this particular task. Over 40 plasma parameters are included in the database, with data for each of the parameters taken from 500k time slices. We focused on a subset of non-dimensional plasma parameters deemed to be good predictors based on physics considerations. Both binary (disruptive/non-disruptive) and multi-label (label based on the elapsed time before disruption) classification problems are investigated. The Random Forests algorithm provides insight into the available dataset by ranking the relative importance of the input features. It is found that q95 and the Greenwald density fraction (n/nG) are the most relevant parameters for discriminating between DIII-D disruptive and non-disruptive discharges. A comparison with the Gradient Boosted Trees algorithm is shown, and first results from the application of regression algorithms are presented. Work supported by the US Department of Energy under DE-FC02-04ER54698, DE-SC0014264 and DE-FG02-95ER54309.
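A minimal sketch of this kind of importance ranking with scikit-learn; X, y, and feature_names (q95, n/nG, and other time-slice parameters) are assumed placeholders:

```python
# Sketch: train a random forest on labelled time slices and rank features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
order = np.argsort(rf.feature_importances_)[::-1]  # most important first
for i in order[:5]:
    print(feature_names[i], round(float(rf.feature_importances_[i]), 3))
```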
Multi-parameter monitoring of electrical machines using integrated fibre Bragg gratings
NASA Astrophysics Data System (ADS)
Fabian, Matthias; Hind, David; Gerada, Chris; Sun, Tong; Grattan, Kenneth T. V.
2017-04-01
In this paper a sensor system for multi-parameter electrical machine condition monitoring is reported. The proposed FBG-based system allows for the simultaneous monitoring of machine vibration, rotor speed and position, torque, spinning direction, temperature distribution along the stator windings and on the rotor surface, as well as the stator wave frequency. This all-optical sensing solution reduces the component count of conventional sensor systems: all 48 sensing elements are contained within the machine and operated by a single sensing interrogation unit. In this work, the sensing system has been successfully integrated into and tested on a permanent magnet motor prototype.
Vehicle classification in WAMI imagery using deep network
NASA Astrophysics Data System (ADS)
Yi, Meng; Yang, Fan; Blasch, Erik; Sheaff, Carolyn; Liu, Kui; Chen, Genshe; Ling, Haibin
2016-05-01
Humans have always had a keen interest in understanding activities and the surrounding environment for mobility, communication, and survival. Thanks to recent progress in photography and breakthroughs in aviation, we are now able to capture tens of megapixels of ground imagery, namely Wide Area Motion Imagery (WAMI), at multiple frames per second from unmanned aerial vehicles (UAVs). WAMI serves as a great source for many applications, including security, urban planning and route planning. These applications require fast and accurate image understanding, which is time consuming for humans due to the large data volume and city-scale area coverage. Therefore, automatic processing and understanding of WAMI imagery has been gaining attention in both industry and the research community. This paper focuses on an essential step in WAMI imagery analysis, namely vehicle classification: deciding whether a certain image patch contains a vehicle or not. We collect a set of positive and negative sample image patches for training and testing the detector. Positive samples are 64 × 64 image patches centered on annotated vehicles. We generate two sets of negative images. The first set is generated from positive images with some location shift. The second set of negative patches is generated from randomly sampled patches; patches in which a vehicle happens to lie at the center are discarded. Both positive and negative samples are randomly divided into 9000 training images and 3000 testing images. We propose to train a deep convolutional network for classifying these patches. The classifier is based on a pre-trained AlexNet model in the Caffe library, with an adapted loss function for vehicle classification. The performance of our classifier is compared to several traditional image classification methods using Support Vector Machine (SVM) and Histogram of Oriented Gradients (HOG) features. While the SVM+HOG method achieves an accuracy of 91.2%, the accuracy of our deep network-based classifier reaches 97.9%.
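A sketch of this transfer-learning setup; the paper used Caffe, but torchvision is assumed here for brevity, and replacing the classification head is the only change shown:

```python
# Sketch: adapt a pretrained AlexNet to binary vehicle/non-vehicle patches.
import torch.nn as nn
from torchvision import models

net = models.alexnet(weights="IMAGENET1K_V1")  # ImageNet-pretrained backbone
net.classifier[6] = nn.Linear(4096, 2)         # replace 1000-way head with 2-way
# AlexNet expects 224 x 224 inputs, so the 64 x 64 patches must be resized
# (or the architecture adapted) before fine-tuning with a cross-entropy loss.
```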
Self-assembling fluidic machines
NASA Astrophysics Data System (ADS)
Grzybowski, Bartosz A.; Radkowski, Michal; Campbell, Christopher J.; Lee, Jessamine Ng; Whitesides, George M.
2004-03-01
This letter describes the dynamic self-assembly of two-component rotors floating at the interface between liquid and air into simple, reconfigurable mechanical systems ("machines"). The rotors are powered by an external, rotating magnetic field, and their positions within the interface are controlled by (i) repulsive hydrodynamic interactions between them and (ii) localized magnetic fields produced by an array of small electromagnets located below the plane of the interface. The mechanical functions of the machines depend on the spatiotemporal sequence of activation of the electromagnets.
2014-06-01
• Motion capture data used to determine position and orientation of a Soldier's head, turret and the M2 machine gun
• Controlling and acquiring user/weapon…data from the M2 simulation machine gun
• Controlling paintball guns used to fire at the GPK during an experimental run
• Sending and receiving TCP…
• Participant requirements: Mounted, Armor/Cavalry, Combat Engineers, Field Artillery Cannon Crewmember, or MP duty assignment – currently M2 .50 Caliber Machine Gun qualified
NASA Astrophysics Data System (ADS)
Haikal Ahmad, M. A.; Zulafif Rahim, M.; Fauzi, M. F. Mohd; Abdullah, Aslam; Omar, Z.; Ding, Songlin; Ismail, A. E.; Rasidi Ibrahim, M.
2018-01-01
Polycrystalline diamond (PCD) is regarded as among the hardest materials in the world. Electrical discharge machining (EDM) is typically used to machine this material because of its non-contact nature. This investigation compared the EDM performance on PCD of a conventional copper (Cu) electrode with that of a newly proposed graphitization-catalyst electrode of copper nickel (CuNi). A two-level full factorial design of experiments with 4 center points was used to study the main and interaction effects of the machining parameters, namely pulse-on time, pulse-off time, sparking current, and electrode material (a categorical factor). The paper shows an interesting discovery: the newly proposed electrode had a positive impact on machining performance. With the same finishing parameters, CuNi delivered more than 100% better surface roughness (Ra) and material removal rate (MRR) than the ordinary Cu electrode.
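For illustration, a minimal sketch of enumerating such a design in Python; the coded -1/+1 levels and factor names are assumptions, not the study's actual settings:

```python
# Sketch: two-level full factorial design with 4 center points, replicated
# for each level of the categorical electrode-material factor.
from itertools import product

factors = ("pulse_on", "pulse_off", "current")
runs = [dict(zip(factors, levels)) for levels in product((-1, 1), repeat=3)]  # 2^3 corners
runs += [dict.fromkeys(factors, 0)] * 4   # center points (coded level 0)
for electrode in ("Cu", "CuNi"):
    for run in runs:
        print(electrode, run)
```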
Structure and Randomness of Continuous-Time, Discrete-Event Processes
NASA Astrophysics Data System (ADS)
Marzen, Sarah E.; Crutchfield, James P.
2017-10-01
Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods for stochastic processes.
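For reference, the two quantities as they are commonly defined in computational mechanics; this notation is a standard assumption, not reproduced from the paper:

```latex
% Entropy rate: asymptotic per-symbol uncertainty of the word distribution
h_\mu = \lim_{L \to \infty} \frac{H[X_{1:L}]}{L}
% Statistical complexity: Shannon entropy over the causal states \mathcal{S}
C_\mu = H[\mathcal{S}] = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma)\,\log_2 \Pr(\sigma)
```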
NASA Astrophysics Data System (ADS)
Deng, Chengbin; Wu, Changshan
2013-12-01
Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
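A compact numpy sketch of the LSS idea under assumed array shapes: endmember spectra are solved from sample pixels with known abundances, then used to unmix new pixels (unconstrained SMA):

```python
# Sketch: least-squares endmember derivation and unconstrained unmixing.
import numpy as np

# A: (n_samples, n_endmembers) known abundances of the training pixels
# R: (n_samples, n_bands) observed spectra of the same pixels
E, *_ = np.linalg.lstsq(A, R, rcond=None)     # endmember signatures (n_em, n_bands)

# Unconstrained SMA: solve E.T @ a ≈ r for each pixel spectrum r
# pixels: (n_pixels, n_bands) -> abundances: (n_endmembers, n_pixels)
abundances, *_ = np.linalg.lstsq(E.T, pixels.T, rcond=None)
```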
Low, Yee Syuen; Blöcker, Christopher; McPherson, John R; Tang, See Aik; Cheng, Ying Ying; Wong, Joyner Y S; Chua, Clarinda; Lim, Tony K H; Tang, Choong Leong; Chew, Min Hoe; Tan, Patrick; Tan, Iain B; Rozen, Steven G; Cheah, Peh Yean
2017-09-10
Approximately 20% of early-stage (I/II) colorectal cancer (CRC) patients develop metastases despite curative surgery. We aimed to develop a formalin-fixed and paraffin-embedded (FFPE)-based predictor of metastases in early-stage, clinically-defined low-risk, microsatellite-stable (MSS) CRC patients. We considered genome-wide mRNA and miRNA expression and the mutation status of 20 genes assayed in 150 fresh-frozen tumours with known metastasis status. We selected 193 genes for further analysis using NanoString nCounter arrays on corresponding FFPE tumours. Neither mutation status nor miRNA expression improved the estimated prediction. The final predictor, ColoMet19, based on the top 19 genes' mRNA levels and trained with a Random Forest machine-learning strategy, had an estimated positive predictive value (PPV) of 0.66. We tested ColoMet19 on an independent test set of 131 tumours and obtained a population-adjusted PPV of 0.67, indicating that early-stage CRC patients who test positive have a 67% risk of developing metastases, substantially higher than the metastasis risk of 40% for node-positive (Stage III) patients, who are generally treated with chemotherapy. Predicted-positive patients also had poorer metastasis-free survival (hazard ratio [HR] = 1.92, design set; HR = 2.05, test set). Thus, early-stage CRC patients who test positive may be considered for adjuvant therapy after surgery. Copyright © 2017 Elsevier B.V. All rights reserved.
Automatic ball bar for a coordinate measuring machine
Jostlein, H.
1997-07-15
An automatic ball bar for a coordinate measuring machine determines the accuracy of a coordinate measuring machine having at least one servo drive. The apparatus comprises a first and second gauge ball connected by a telescoping rigid member. The rigid member includes a switch such that inward radial movement of the second gauge ball relative to the first gauge ball causes activation of the switch. The first gauge ball is secured in a first magnetic socket assembly in order to maintain the first gauge ball at a fixed location with respect to the coordinate measuring machine. A second magnetic socket assembly secures the second gauge ball to the arm or probe holder of the coordinate measuring machine. The second gauge ball is then directed by the coordinate measuring machine to move radially inward from a point just beyond the length of the ball bar until the switch is activated. Upon switch activation, the position of the coordinate measuring machine is determined and compared to known ball bar length such that the accuracy of the coordinate measuring machine can be determined.
Yorioka, Katsuhiro; Oie, Shigeharu; Hayashi, Koji; Kimoto, Hiroo; Furukawa, Hiroyuki
2016-06-01
Although microbial contamination of ice machines has been reported, no previous study has addressed microbial contamination of ice produced by machines equipped with activated charcoal (AC) filters in hospitals. The aim of this study was to provide clinical data for evaluating AC filters intended to prevent microbial contamination of ice. We compared microbial contamination in ice samples produced by machines with an AC filter (n = 20) and without one (n = 40) in Shunan City Shinnanyo Municipal Hospital. All samples from the ice machine equipped with an AC filter contained 10-116 CFUs/g of glucose-nonfermenting gram-negative bacteria such as Pseudomonas aeruginosa and Chryseobacterium meningosepticum. No microorganisms were detected in samples from ice machines without AC filters. After the AC filter was removed from the ice machine that tested positive for gram-negative bacteria, the ice was resampled (n = 20). Analysis found no contaminants. Ice machines equipped with AC filters thus pose a serious risk of ice contamination. New filter-use guidelines and regulations on bacterial detection limits are necessary to prevent contamination of ice in healthcare facilities.
Random Bits Forest: a Strong Classifier/Regressor for Big Data
NASA Astrophysics Data System (ADS)
Wang, Yi; Li, Yi; Pu, Weilin; Wen, Kathryn; Shugart, Yin Yao; Xiong, Momiao; Jin, Li
2016-07-01
Efficiency, memory consumption, and robustness are common problems with many popular methods for data analysis. As a solution, we present Random Bits Forest (RBF), a classification and regression algorithm that integrates neural networks (for depth), boosting (for width), and random forests (for prediction accuracy). Through a gradient boosting scheme, it first generates and selects ~10,000 small, 3-layer random neural networks. These networks are then fed into a modified random forest algorithm to obtain predictions. Testing with datasets from the UCI (University of California, Irvine) Machine Learning Repository shows that RBF outperforms other popular methods in both accuracy and robustness, especially with large datasets (N > 1000). The algorithm also performed well in testing with an independent data set, a real psoriasis genome-wide association study (GWAS).
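A loose sketch of the core idea under simplifying assumptions: fixed random two-layer projections stand in for the paper's boosted, 3-layer networks, and their outputs feed a standard random forest:

```python
# Sketch: random small networks generate features; a random forest consumes them.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_features, n_nets, hidden = X_train.shape[1], 200, 8
nets = [(rng.normal(size=(n_features, hidden)), rng.normal(size=(hidden, 1)))
        for _ in range(n_nets)]                  # frozen random network weights

def net_features(X):
    # each tiny network emits one scalar "bit-like" feature per sample
    return np.hstack([np.tanh(np.tanh(X @ W1) @ W2) for W1, W2 in nets])

rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(net_features(X_train), y_train)
pred = rf.predict(net_features(X_test))
```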
Machine for preparing phosphors for the fluorimetric determination of uranium
Stevens, R.E.; Wood, W.H.; Goetz, K.G.; Horr, C.A.
1956-01-01
The time saved by use of a machine for preparing many phosphors at one time increases the rate of productivity of the fluorimetric method for determining uranium. The machine prepares 18 phosphors at a time and eliminates the tedious and time-consuming step of preparing them by hand, while improving the precision of the method in some localities. The machine consists of a ring burner over which the platinum dishes, containing uranium and flux, are rotated. By placing the machine in an inclined position the molten flux comes into contact with all surfaces within the dish as the dishes rotate over the flame. Precision is improved because the heating and cooling conditions are the same for each of the 18 phosphors in one run as well as for successive runs.
Laser Doppler position sensor for position and shape measurements of fast rotating objects
NASA Astrophysics Data System (ADS)
Czarske, Jürgen; Pfister, Thorsten; Büttner, Lars
2006-08-01
We report on a novel optical method based on laser Doppler velocimetry for position and shape measurements of moving solid-state surfaces with approximately one micrometer position resolution. 3D shape measurements of a rotating cylinder inside a turning machine, as well as tip clearance measurements at a transonic centrifugal compressor performed during operation at 50,000 rpm and 586 m/s blade tip velocity, are presented. All results are in good agreement with conventional reference probes. The measurement accuracy of the laser Doppler position sensor is investigated as a function of the speckle pattern. Furthermore, it is shown that this sensor offers high temporal resolution and high position resolution simultaneously, and that shading can be reduced compared to triangulation. Consequently, the presented laser Doppler position sensor opens up new perspectives in the field of real-time manufacturing metrology and process control, for example controlling the turning and grinding processes or for future developments of turbomachines.
Grouin, Cyril; Zweigenbaum, Pierre
2013-01-01
In this paper, we present a comparison of two approaches to automatically de-identify medical records written in French: a rule-based system and a machine-learning based system using a conditional random fields (CRF) formalism. Both systems have been designed to process nine identifiers in a corpus of medical records in cardiology. We performed two evaluations: first, on 62 documents in cardiology, and on 10 documents in foetopathology - produced by optical character recognition (OCR) - to evaluate the robustness of our systems. We achieved a 0.843 (rule-based) and 0.883 (machine-learning) exact match overall F-measure in cardiology. While the rule-based system allowed us to achieve good results on nominative (first and last names) and numerical data (dates, phone numbers, and zip codes), the machine-learning approach performed best on more complex categories (postal addresses, hospital names, medical devices, and towns). On the foetopathology corpus, although our systems have not been designed for this corpus and despite OCR character recognition errors, we obtained promising results: a 0.681 (rule-based) and 0.638 (machine-learning) exact-match overall F-measure. This demonstrates that existing tools can be applied to process new documents of lower quality.
An incremental anomaly detection model for virtual machines.
Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu
2017-01-01
The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic, resource-sharing character, which leaves the algorithm with low accuracy and poor scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate the detection time by taking into account the large-scale and highly dynamic features of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in the accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms.
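A small numpy sketch of the weighted-distance step at the heart of this idea; the weight vector and array shapes are illustrative assumptions:

```python
# Sketch: best-matching-unit search using a Weighted Euclidean Distance.
import numpy as np

def best_matching_unit(x, codebook, w):
    """codebook: (n_units, n_features); x, w: (n_features,)."""
    d2 = (w * (codebook - x) ** 2).sum(axis=1)  # weighted squared distances
    return int(np.argmin(d2))                   # index of the winning unit

bmu = best_matching_unit(sample, som_weights, feature_weights)
```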
Operation of micro and molecular machines: a new concept with its origins in interface science.
Ariga, Katsuhiko; Ishihara, Shinsuke; Izawa, Hironori; Xia, Hong; Hill, Jonathan P
2011-03-21
A landmark accomplishment of nanotechnology would be successful fabrication of ultrasmall machines that can work like tweezers, motors, or even computing devices. Now we must consider how operation of micro- and molecular machines might be implemented for a wide range of applications. If these machines function only under limited conditions and/or require specialized apparatus then they are useless for practical applications. Therefore, it is important to carefully consider the access of functionality of the molecular or nanoscale systems by conventional stimuli at the macroscopic level. In this perspective, we will outline the position of micro- and molecular machines in current science and technology. Most of these machines are operated by light irradiation, application of electrical or magnetic fields, chemical reactions, and thermal fluctuations, which cannot always be applied in remote machine operation. We also propose strategies for molecular machine operation using the most conventional of stimuli, that of macroscopic mechanical force, achieved through mechanical operation of molecular machines located at an air-water interface. The crucial roles of the characteristics of an interfacial environment, i.e. connection between macroscopic dimension and nanoscopic function, and contact of media with different dielectric natures, are also described.
Impact of the HEALTHY Study on Vending Machine Offerings in Middle Schools.
Hartstein, Jill; Cullen, Karen W; Virus, Amy; El Ghormli, Laure; Volpe, Stella L; Staten, Myrlene A; Bridgman, Jessica C; Stadler, Diane D; Gillis, Bonnie; McCormick, Sarah B; Mobley, Connie C
2011-01-01
The purpose of this study is to report the impact of the three-year middle school-based HEALTHY study on intervention school vending machine offerings. There were two goals for the vending machines: serve only dessert/snack foods with 200 kilocalories or less per single serving package, and eliminate 100% fruit juice and beverages with added sugar. Six schools in each of seven cities (Houston, TX, San Antonio, TX, Irvine, CA, Portland, OR, Pittsburgh, PA, Philadelphia, PA, and Chapel Hill, NC) were randomized into intervention (n=21 schools) or control (n=21 schools) groups, with three intervention and three control schools per city. All items in vending machine slots were tallied twice in the fall of 2006 for baseline data and twice at the end of the study, in 2009. The percentage of total slots for each food/beverage category was calculated and compared between intervention and control schools at the end of study, using the Pearson chi-square test statistic. At baseline, 15 intervention and 15 control schools had beverage and/or snack vending machines, compared with 11 intervention and 11 control schools at the end of the study. At the end of study, all of the intervention schools with beverage vending machines, but only one out of the nine control schools, met the beverage goal. The snack goal was met by all of the intervention schools and only one of the four control schools with snack vending machines. The HEALTHY study's vending machine beverage and snack goals were successfully achieved in intervention schools, reducing access to less healthy food items outside the school meals program. Although the effect of these changes on student diet, energy balance and growth is unknown, these results suggest that healthier options for snacks can successfully be offered in school vending machines.
A comparison of machine learning and Bayesian modelling for molecular serotyping.
Newton, Richard; Wernisch, Lorenz
2017-08-11
Streptococcus pneumoniae is a human pathogen that is a major cause of infant mortality. Identifying the pneumococcal serotype is an important step in monitoring the impact of vaccines used to protect against disease. Genomic microarrays provide an effective method for molecular serotyping. Previously we developed an empirical Bayesian model for the classification of serotypes from a molecular serotyping array. With only few samples available, a model-driven approach was the only option. In the meanwhile, several thousand samples have been made available to us, providing an opportunity to investigate serotype classification by machine learning methods, which could complement the Bayesian model. We compare the performance of the original Bayesian model with two machine learning algorithms: Gradient Boosting Machines and Random Forests. We present our results as an example of a generic strategy whereby a preliminary probabilistic model is complemented or replaced by a machine learning classifier once enough data are available. Despite the availability of thousands of serotyping arrays, a problem encountered when applying machine learning methods is the lack of training data containing mixtures of serotypes, owing to the large number of possible combinations. Most of the available training data comprises samples with only a single serotype. To overcome the lack of training data we implemented an iterative analysis, creating artificial training data of serotype mixtures by combining raw data from single-serotype arrays. With the enhanced training set the machine learning algorithms outperform the original Bayesian model. However, for serotypes currently lacking sufficient training data the best performing implementation was a combination of the results of the Bayesian model and the Gradient Boosting Machine. As well as being an effective method for classifying biological data, machine learning can also be used as an efficient method for revealing subtle biological insights, which we illustrate with an example.
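A sketch of generating such artificial mixtures; combining signals by an elementwise maximum is an assumed mixing model, not the paper's exact procedure:

```python
# Sketch: build artificial k-serotype mixtures from single-serotype arrays.
import numpy as np
from itertools import combinations

def make_mixtures(arrays, serotypes, k=2):
    """arrays: list of raw signal vectors; serotypes: matching labels."""
    X, Y = [], []
    for idx in combinations(range(len(arrays)), k):
        X.append(np.max([arrays[i] for i in idx], axis=0))  # combined signal
        Y.append({serotypes[i] for i in idx})               # multi-label target
    return np.array(X), Y
```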
Pfeiffenberger, Erik; Chaleil, Raphael A.G.; Moal, Iain H.
2017-01-01
Reliable identification of near-native poses of docked protein–protein complexes is still an unsolved problem. The intrinsic heterogeneity of protein–protein interactions is challenging for traditional biophysical or knowledge-based potentials, and the identification of many false positive binding sites is not unusual. Often, ranking protocols are based on initial clustering of docked poses followed by the application of an energy function to rank each cluster according to its lowest-energy member. Here, we present an approach to cluster ranking based not on a single molecular descriptor (e.g., an energy function) but on a large number of descriptors that are integrated in a machine learning model, whereby an extremely randomized tree classifier is trained on 109 molecular descriptors. The protocol first locally enriches clusters with additional poses; the clusters are then characterized using features describing the distribution of molecular descriptors within the cluster, which are combined into a pairwise cluster comparison model to discriminate near-native from incorrect clusters. The results show that our approach is able to identify clusters containing near-native protein–protein complexes. In addition, we present an analysis of the descriptors with respect to their power to discriminate near-native from incorrect clusters, and of how data transformations and recursive feature elimination can improve the ranking performance. Proteins 2017; 85:528–543. © 2016 Wiley Periodicals, Inc. PMID:27935158
Liang, Ja-Der; Ping, Xiao-Ou; Tseng, Yi-Ju; Huang, Guan-Tarn; Lai, Feipei; Yang, Pei-Ming
2014-12-01
Recurrence of hepatocellular carcinoma (HCC) is an important issue despite effective treatments achieving tumor eradication. Identification of patients at high risk for recurrence may enable more efficacious screening and detection of tumor recurrence. The aim of this study was to develop recurrence predictive models for HCC patients who received radiofrequency ablation (RFA) treatment. From January 2007 to December 2009, 83 newly diagnosed HCC patients receiving RFA as their first treatment were enrolled. Five feature selection methods, including the genetic algorithm (GA), simulated annealing (SA) algorithm, random forests (RF) and hybrid methods (GA+RF and SA+RF), were utilized for selecting an important subset of features from a total of 16 clinical features. These feature selection methods were combined with support vector machines (SVM) to develop predictive models with better performance. Five-fold cross-validation was used to train and test the SVM models. The developed SVM-based predictive models with hybrid feature selection methods and 5-fold cross-validation achieved average sensitivity, specificity, accuracy, positive predictive value, negative predictive value, and area under the ROC curve of 67%, 86%, 82%, 69%, 90%, and 0.69, respectively. The SVM-derived predictive model can flag patients at high risk of recurrence, who should be closely followed up after complete RFA treatment. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Thompson, John W; Bower, Susanne; Tyrer, Stephen P
2008-04-01
A double-blind randomised controlled clinical trial on the effect of transcutaneous spinal electroanalgesia (TSE) on low back pain was carried out in 58 patients attending a Pain Management Unit. Four TSE instruments, two active and two sham, were used, and each patient was assigned randomly to one of these. Low back pain was rated by each patient using a visual analogue scale (VAS) immediately before and immediately after a single 20 min treatment of TSE, and also daily for the week prior to, and the week following, the treatment. No significant difference in mean pain score was detected between the active and sham treated groups immediately after treatment or during the subsequent week. The Hospital Anxiety and Depression scale (HAD) and the General Health Questionnaire (GHQ) were completed by each patient, and there was a positive correlation between the scores achieved on these scales and the mean pain scores in both the active and sham treated groups. A post-trial problem was the discovery that the specifications of the two active TSE machines differed from the manufacturer's specification: the output frequencies deviated by +10% and -17%, while the maximum output voltages were lower by 40% and 20%, respectively. However, additional statistical analysis revealed no significant differences between the results obtained with the two active machines.
NASA Astrophysics Data System (ADS)
Lesniak, J. M.; Hupse, R.; Blanc, R.; Karssemeijer, N.; Székely, G.
2012-08-01
False positive (FP) marks represent an obstacle for effective use of computer-aided detection (CADe) of breast masses in mammography. Typically, the problem can be approached either by developing more discriminative features or by employing different classifier designs. In this paper, the use of support vector machine (SVM) classification for FP reduction in CADe is investigated, presenting a systematic quantitative evaluation against neural networks, k-nearest neighbor classification, linear discriminant analysis and random forests. A large database of 2516 film mammography examinations and 73 input features was used to train the classifiers and evaluate their performance on correctly diagnosed exams as well as false negatives. Further, classifier robustness was investigated using varying training data and feature sets as input. The evaluation was based on the mean exam sensitivity at 0.05-1 FPs on normals on the free-response receiver operating characteristic (FROC) curve, incorporated into a tenfold cross-validation framework. It was found that SVM classification using a Gaussian kernel offered significantly increased detection performance (P = 0.0002) compared to the reference methods. When training data and input features were varied, SVMs showed improved exploitation of large feature sets. It is concluded that with the SVM-based CADe a significant reduction of FPs is possible, outperforming other state-of-the-art approaches for breast mass CADe.
Tan, W Katherine; Hassanpour, Saeed; Heagerty, Patrick J; Rundell, Sean D; Suri, Pradeep; Huhdanpaa, Hannu T; James, Kathryn; Carrell, David S; Langlotz, Curtis P; Organ, Nancy L; Meier, Eric N; Sherman, Karen J; Kallmes, David F; Luetmer, Patrick H; Griffith, Brent; Nerenz, David R; Jarvik, Jeffrey G
2018-03-28
To evaluate a natural language processing (NLP) system built with open-source tools for identification of lumbar spine imaging findings related to low back pain on magnetic resonance and x-ray radiology reports from four health systems. We used a limited data set (de-identified except for dates) sampled from lumbar spine imaging reports of a prospectively assembled cohort of adults. From N = 178,333 reports, we randomly selected N = 871 to form a reference-standard dataset, consisting of N = 413 x-ray reports and N = 458 MR reports. Using standardized criteria, four spine experts annotated the presence of 26 findings, where 71 reports were annotated by all four experts and 800 were each annotated by two experts. We calculated inter-rater agreement and finding prevalence from the annotated data. We randomly split the annotated data into development (80%) and testing (20%) sets. We developed an NLP system from both rule-based and machine-learned models. We validated the system using accuracy metrics such as sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). The multirater annotated dataset achieved inter-rater agreement of Cohen's kappa > 0.60 (substantial agreement) for 25 of 26 findings, with finding prevalence ranging from 3% to 89%. In the testing sample, rule-based and machine-learned predictions had comparable average specificity (0.97 and 0.95, respectively). The machine-learned approach had a higher average sensitivity (0.94, compared to 0.83 for rule-based) and a higher overall AUC (0.98, compared to 0.90 for rule-based). Our NLP system performed well in identifying the 26 lumbar spine findings, as benchmarked by reference-standard annotation by medical experts. Machine-learned models provided substantial gains in sensitivity with a slight loss of specificity, and an overall higher AUC. Copyright © 2018 The Association of University Radiologists. All rights reserved.
Monte-Moreno, Enric
2011-10-01
This work presents a system for simultaneous non-invasive estimation of the blood glucose level (BGL) and the systolic (SBP) and diastolic (DBP) blood pressure, using a photoplethysmograph (PPG) and machine learning techniques. The method is independent of the person whose values are being measured and does not need calibration over time or subjects. The architecture of the system consists of a photoplethysmograph sensor, an activity detection module, a signal processing module that extracts features from the PPG waveform, and a machine learning algorithm that estimates the SBP, DBP and BGL values. The idea underlying the system is that there is a functional relationship between the shape of the PPG waveform and the blood pressure and glucose levels. As described in this paper, we tested this method on 410 individuals without performing any personalized calibration. The results were computed after cross-validation. The machine learning techniques tested were: ridge linear regression, a multilayer perceptron neural network, support vector machines and random forests. The best results were obtained with the random forest technique. In the case of blood pressure, the resulting coefficients of determination for reference vs. prediction were R(SBP)(2)=0.91, R(DBP)(2)=0.89, and R(BGL)(2)=0.90. For the glucose estimation, the distribution of points on a Clarke error grid placed 87.7% of points in zone A, 10.3% in zone B, and 1.9% in zone D. Blood pressure values complied with the grade B protocol of the British Hypertension Society. An effective system for estimating blood glucose and blood pressure from a photoplethysmograph is presented. The main advantage of the system is that for clinical use it complies with the grade B protocol of the British Hypertension Society for blood pressure, and it failed to detect hypoglycemia or hyperglycemia in only 1.9% of cases. Copyright © 2011 Elsevier B.V. All rights reserved.
Dong, Zhixu; Sun, Xingwei; Chen, Changzheng; Sun, Mengnan
2018-04-13
The inconvenient loading and unloading of a long and heavy drill pipe gives rise to difficulty in measuring the contour parameters of the threads at both of its ends. To solve this problem, in this paper we take the SCK230 drill pipe thread-repairing machine tool as a carrier to design and implement a fast, on-machine measuring system based on a laser probe. This system drives a laser displacement sensor to acquire the contour data of a given axial section of the thread by using the servo function of a CNC machine tool. To correct the sensor's measurement errors caused by the measuring-point inclination angle, an inclination error model is built to compensate the data in real time. To better suppress random error interference and preserve the real contour information, a new wavelet threshold function is proposed for wavelet threshold denoising of the data. The discrete data after denoising are segmented according to the geometrical characteristics of the drill pipe thread, and the regression model of the contour data in each section is fitted using the method of weighted total least squares (WTLS). The thread parameters are then calculated in real time to judge the processing quality. Inclination error experiments show that the proposed compensation model is accurate and effective, and that it can improve the data acquisition accuracy of the sensor. Simulation results indicate that the improved threshold function has better continuity and self-adaptability, which guarantees the denoising effect while avoiding the complete elimination of real data corrupted by random errors. Additionally, NC50 thread-testing experiments show that the proposed on-machine measuring system can complete the measurement of a 25 mm thread in 7.8 s, with a measurement accuracy of ±8 μm and a repeatability limit ≤ 4 μm (high repeatability); hence the accuracy and efficiency of measurement are both improved.
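A sketch of the wavelet-denoising step with PyWavelets; a standard soft threshold with the universal threshold rule stands in for the paper's improved threshold function, and the wavelet and level are assumptions:

```python
# Sketch: wavelet threshold denoising of a 1-D contour profile.
import numpy as np
import pywt

def denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate (finest scale)
    t = sigma * np.sqrt(2 * np.log(len(x)))              # universal threshold
    coeffs[1:] = [pywt.threshold(c, t, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)
```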
Luo, Wei; Phung, Dinh; Tran, Truyen; Gupta, Sunil; Rana, Santu; Karmakar, Chandan; Shilton, Alistair; Yearwood, John; Dimitrova, Nevenka; Ho, Tu Bao; Venkatesh, Svetha; Berk, Michael
2016-12-16
As more and more researchers are turning to big data for new opportunities of biomedical discoveries, machine learning models, as the backbone of big data analysis, are mentioned more often in biomedical journals. However, owing to the inherent complexity of machine learning methods, they are prone to misuse. Because of the flexibility in specifying machine learning models, the results are often insufficiently reported in research articles, hindering reliable assessment of model validity and consistent interpretation of model outputs. To attain a set of guidelines on the use of machine learning predictive models within clinical settings to make sure the models are correctly applied and sufficiently reported so that true discoveries can be distinguished from random coincidence. A multidisciplinary panel of machine learning experts, clinicians, and traditional statisticians were interviewed, using an iterative process in accordance with the Delphi method. The process produced a set of guidelines that consists of (1) a list of reporting items to be included in a research article and (2) a set of practical sequential steps for developing predictive models. A set of guidelines was generated to enable correct application of machine learning models and consistent reporting of model specifications and results in biomedical research. We believe that such guidelines will accelerate the adoption of big data analysis, particularly with machine learning methods, in the biomedical research community. ©Wei Luo, Dinh Phung, Truyen Tran, Sunil Gupta, Santu Rana, Chandan Karmakar, Alistair Shilton, John Yearwood, Nevenka Dimitrova, Tu Bao Ho, Svetha Venkatesh, Michael Berk. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 16.12.2016.
Uhlig, Johannes; Uhlig, Annemarie; Kunze, Meike; Beissbarth, Tim; Fischer, Uwe; Lotz, Joachim; Wienbeck, Susanne
2018-05-24
The purpose of this study is to evaluate the diagnostic performance of machine learning techniques for malignancy prediction at breast cone-beam CT (CBCT) and to compare them to human readers. Five machine learning techniques, including random forests, back propagation neural networks (BPN), extreme learning machines, support vector machines, and K-nearest neighbors, were used to train diagnostic models on a clinical breast CBCT dataset with internal validation by repeated 10-fold cross-validation. Two independent blinded human readers with profound experience in breast imaging and breast CBCT analyzed the same CBCT dataset. Diagnostic performance was compared using AUC, sensitivity, and specificity. The clinical dataset comprised 35 patients (American College of Radiology density type C and D breasts) with 81 suspicious breast lesions examined with contrast-enhanced breast CBCT. Forty-five lesions were histopathologically proven to be malignant. Among the machine learning techniques, BPNs provided the best diagnostic performance, with AUC of 0.91, sensitivity of 0.85, and specificity of 0.82. The diagnostic performance of the human readers was AUC of 0.84, sensitivity of 0.89, and specificity of 0.72 for reader 1 and AUC of 0.72, sensitivity of 0.71, and specificity of 0.67 for reader 2. AUC was significantly higher for BPN when compared with both reader 1 (p = 0.01) and reader 2 (p < 0.001). Machine learning techniques provide a high and robust diagnostic performance in the prediction of malignancy in breast lesions identified at CBCT. BPNs showed the best diagnostic performance, surpassing human readers in terms of AUC and specificity.
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1982-01-01
Observed positions, proper motions, estimated photographic magnitudes and colors, and references to identifications in other catalogs are included. Photoelectric data on the UBV system are included for many stars, but no attempt was made to find all existing photometry. The machine-readable catalog is described.
Machine Tool Operator--General, Entry, Suggested Guide for a Training Course.
ERIC Educational Resources Information Center
Roney, Maurice W.; and Others
The purpose of this curriculum guide is to assist the administrator and instructor in planning and developing manpower development and training programs to prepare machine tool operators for entry-level positions. The course outline provides units in: (1) orientation, (2) bench work, (3) shop mathematics, (4) blueprint reading and sketching, (5)…
Diverse applications of advanced man-telerobot interfaces
NASA Technical Reports Server (NTRS)
Mcaffee, Douglas A.
1991-01-01
Advancements in man-machine interfaces and control technologies used in space telerobotics and teleoperators have potential application wherever human operators need to manipulate multi-dimensional spatial relationships. Bilateral six degree-of-freedom position and force cues exchanged between the user and a complex system can broaden and improve the effectiveness of several diverse man-machine interfaces.
NASA Astrophysics Data System (ADS)
Zhong, Xuemin; Liu, Hongqi; Mao, Xinyong; Li, Bin; He, Songping; Peng, Fangyu
2018-05-01
Large multi-axis propeller-measuring machines have two types of geometric error, position-independent geometric errors (PIGEs) and position-dependent geometric errors (PDGEs), which both have significant effects on the volumetric error of the measuring tool relative to the worktable. This paper focuses on modeling, identifying and compensating for the volumetric error of the measuring machine. A volumetric error model in the base coordinate system is established based on screw theory considering all the geometric errors. In order to fully identify all the geometric error parameters, a new method for systematic measurement and identification is proposed. All the PIGEs of adjacent axes and the six PDGEs of the linear axes are identified with a laser tracker using the proposed model. Finally, a volumetric error compensation strategy is presented and an inverse kinematic solution for compensation is proposed. The final measuring and compensation experiments have further verified the efficiency and effectiveness of the measuring and identification method, indicating that the method can be used in volumetric error compensation for large machine tools.
Machine learning prediction for classification of outcomes in local minimisation
NASA Astrophysics Data System (ADS)
Das, Ritankar; Wales, David J.
2017-01-01
Machine learning schemes are employed to predict which local minimum will result from local energy minimisation of random starting configurations for a triatomic cluster. The input data consists of structural information at one or more of the configurations in optimisation sequences that converge to one of four distinct local minima. The ability to make reliable predictions, in terms of the energy or other properties of interest, could save significant computational resources in sampling procedures that involve systematic geometry optimisation. Results are compared for two energy minimisation schemes, and for neural network and quadratic functions of the inputs.
Lenselink, Eelke B; Ten Dijke, Niels; Bongers, Brandon; Papadatos, George; van Vlijmen, Herman W T; Kowalczyk, Wojtek; IJzerman, Adriaan P; van Westen, Gerard J P
2017-08-14
The increase in publicly available bioactivity data in recent years has fueled and catalyzed research in chemogenomics, data mining, and modeling approaches. As a direct result, over the past few years a multitude of different methods have been reported and evaluated, such as target fishing, nearest-neighbor similarity-based methods, and Quantitative Structure Activity Relationship (QSAR)-based protocols. However, such studies are typically conducted on different datasets, using different validation strategies and different metrics. In this study, different methods were compared using one single standardized dataset obtained from ChEMBL, which is made available to the public, using standardized metrics (BEDROC and Matthews Correlation Coefficient). Specifically, the performance of Naïve Bayes, Random Forests, Support Vector Machines, Logistic Regression, and Deep Neural Networks was assessed using QSAR and proteochemometric (PCM) methods. All methods were validated using both a random-split validation and a temporal validation, with the latter being a more realistic benchmark of expected prospective execution. Deep Neural Networks are the top-performing classifiers, highlighting the added value of Deep Neural Networks over other, more conventional methods. Moreover, the best method ('DNN_PCM') performed significantly better, at almost one standard deviation above the mean performance. Furthermore, multi-task and PCM implementations were shown to improve performance over single-task Deep Neural Networks. Conversely, target prediction performed almost two standard deviations below the mean performance. Random Forests, Support Vector Machines, and Logistic Regression performed around the mean performance. Finally, using an ensemble of DNNs, alongside additional tuning, enhanced the relative performance by another 27% (compared with the unoptimized 'DNN_PCM'). Here, a standardized set to test and evaluate different machine learning algorithms in the context of multi-task learning is offered by providing the data and the protocols.
Torii, Manabu; Yin, Lanlan; Nguyen, Thang; Mazumdar, Chand T.; Liu, Hongfang; Hartley, David M.; Nelson, Noele P.
2014-01-01
Purpose: Early detection of infectious disease outbreaks is crucial to protecting the public health of a society. Online news articles provide timely information on disease outbreaks worldwide. In this study, we investigated automated detection of articles relevant to disease outbreaks using machine learning classifiers. In a real-life setting, it is expensive to prepare a training data set for classifiers, which usually consists of manually labeled relevant and irrelevant articles. To mitigate this challenge, we examined the use of randomly sampled unlabeled articles as well as labeled relevant articles. Methods: Naïve Bayes and Support Vector Machine (SVM) classifiers were trained on 149 relevant and 149 or more randomly sampled unlabeled articles. Diverse classifiers were trained by varying the number of sampled unlabeled articles and also the number of word features. The trained classifiers were applied to 15 thousand articles published over 15 days. Top-ranked articles from each classifier were pooled and the resulting set of 1337 articles was reviewed by an expert analyst to evaluate the classifiers. Results: Daily averages of areas under ROC curves (AUCs) over the 15-day evaluation period were 0.841 and 0.836, respectively, for the naïve Bayes and SVM classifiers. We referenced a database of disease outbreak reports to confirm that the evaluation data set resulting from the pooling method indeed covered incidents recorded in the database during the evaluation period. Conclusions: The proposed text classification framework utilizing randomly sampled unlabeled articles can facilitate a cost-effective approach to training machine learning classifiers in a real-life Internet-based biosurveillance project. We plan to examine this framework further using larger data sets and articles in non-English languages. PMID:21134784
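A minimal sketch of this training scheme with scikit-learn; the variable names (relevant_articles, unlabeled_sample, new_articles) are hypothetical lists of article texts:

```python
# Sketch: labeled relevant articles vs. randomly sampled unlabeled articles
# treated as the negative class, with a bag-of-words naïve Bayes classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = relevant_articles + unlabeled_sample
labels = [1] * len(relevant_articles) + [0] * len(unlabeled_sample)

clf = make_pipeline(CountVectorizer(max_features=5000), MultinomialNB())
clf.fit(texts, labels)
scores = clf.predict_proba(new_articles)[:, 1]   # rank incoming articles
```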
Hornbrook, Mark C; Goshen, Ran; Choman, Eran; O'Keeffe-Rosetti, Maureen; Kinar, Yaron; Liles, Elizabeth G; Rust, Kristal C
2017-10-01
Machine learning tools can identify patients with blood counts indicating a greater likelihood of colorectal cancer and warranting colonoscopy referral. Our objective was to validate a machine learning colorectal cancer detection model on a US community-based insured adult population. Eligible colorectal cancer cases (439 females, 461 males) with complete blood counts before diagnosis were identified from Kaiser Permanente Northwest Region's Tumor Registry. Control patients (n = 9108) were randomly selected from KPNW's population of members who had no cancers, had ≥1 blood count, had continuous enrollment from 180 days prior to the blood count through 24 months after the count, and were aged 40-89. For each control, one blood count was randomly selected as the pseudo-colorectal cancer diagnosis date for matching to cases, and assigned a "calendar year" based on the count date. For each calendar year, 18 controls were randomly selected to match the general enrollment's 10-year age groups and lengths of continuous enrollment. Prediction performance was evaluated by area under the curve, specificity, and odds ratios. The area under the receiver operating characteristic curve for detecting colorectal cancer was 0.80 ± 0.01. At 99% specificity, the odds ratio for association of a high-risk detection score with colorectal cancer was 34.7 (95% CI 28.9-40.4). The detection model had the highest accuracy in identifying right-sided colorectal cancers. ColonFlag® identifies individuals with tenfold higher risk of undiagnosed colorectal cancer at curable stages (0/I/II), flags colorectal tumors 180-360 days prior to usual clinical diagnosis, and is more accurate at identifying right-sided (compared to left-sided) colorectal cancers.
Big data integration for regional hydrostratigraphic mapping
NASA Astrophysics Data System (ADS)
Friedel, M. J.
2013-12-01
Numerical models provide a way to evaluate groundwater systems, but determining the hydrostratigraphic units (HSUs) used in devising these models remains subjective, nonunique, and uncertain. A novel geophysical-hydrogeologic data integration scheme is proposed to constrain the estimation of continuous HSUs. First, machine-learning and multivariate statistical techniques are used to simultaneously integrate borehole hydrogeologic (lithology, hydraulic conductivity, aqueous field parameters, dissolved constituents) and geophysical (gamma, spontaneous potential, and resistivity) measurements. Second, airborne electromagnetic measurements are numerically inverted to obtain the subsurface resistivity structure at randomly selected locations. Third, the machine-learning algorithm is trained using the borehole hydrostratigraphic units and inverted airborne resistivity profiles. The trained machine-learning algorithm is then used to estimate HSUs at independent resistivity profile locations. We demonstrate the efficacy of the proposed approach by mapping the hydrostratigraphy of a heterogeneous surficial aquifer in northwestern Nebraska.
Radio Frequency Interference Detection using Machine Learning.
NASA Astrophysics Data System (ADS)
Mosiane, Olorato; Oozeer, Nadeem; Aniyan, Arun; Bassett, Bruce A.
2017-05-01
Radio frequency interference (RFI) has long plagued radio astronomy and could become as bad or worse by the time the Square Kilometre Array (SKA) comes online. RFI can be either internal (generated by instruments) or external, originating from intentional or unintentional man-made radio emission. With the huge amount of data that will be produced by upcoming radio telescopes, an automated approach will be required to detect RFI. In this paper, to automate this process, we present the results of applying machine learning techniques to cross-match RFI in Karoo Array Telescope (KAT-7) data. We found that not all the features selected to characterise RFI are always important. We further investigated three machine learning techniques and conclude that the Random Forest classifier performs best, achieving a 98% Area Under Curve and 91% recall in detecting RFI.
Classifying bent radio galaxies from a mixture of point-like/extended images with Machine Learning.
NASA Astrophysics Data System (ADS)
Bastien, David; Oozeer, Nadeem; Somanah, Radhakrishna
2017-05-01
The hypothesis that bent radio sources are expected to be found in rich, massive galaxy clusters, together with the availability of huge amounts of data from radio surveys, fueled our motivation to use Machine Learning (ML) to identify bent radio sources and use them as tracers for galaxy clusters. Shapelet analysis allowed us to decompose radio images into 256 features that could be fed into the ML algorithm. Additionally, ideas from the field of neuropsychology led us to train the machine to identify bent galaxies at different orientations. From our analysis, we found that the Random Forest algorithm was the most effective, with an accuracy of 92% for classifying point and extended sources and an accuracy of 80% for bent and unbent classification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belley, M; Schmidt, M; Knutson, N
Purpose: Physics second-checks for external beam radiation therapy are performed, in-part, to verify that the machine parameters in the Record-and-Verify (R&V) system that will ultimately be sent to the LINAC exactly match the values initially calculated by the Treatment Planning System (TPS). While performing the second-check, a large portion of the physicists' time is spent navigating and arranging display windows to locate and compare the relevant numerical values (MLC position, collimator rotation, field size, MU, etc.). Here, we describe the development of a software tool that guides the physicist by aggregating and succinctly displaying machine parameter data relevant to the physics second-check process. Methods: A data retrieval software tool was developed using Python to aggregate data and generate a list of machine parameters that are commonly verified during the physics second-check process. This software tool imported values from (i) the TPS RT Plan DICOM file and (ii) the MOSAIQ (R&V) Structured Query Language (SQL) database. The machine parameters aggregated for this study included: MLC positions, X&Y jaw positions, collimator rotation, gantry rotation, MU, dose rate, wedges and accessories, cumulative dose, energy, machine name, couch angle, and more. Results: A GUI interface was developed to generate a side-by-side display of the aggregated machine parameter values for each field, presented to the physicist for direct visual comparison. This software tool was tested for 3D conformal, static IMRT, sliding window IMRT, and VMAT treatment plans. Conclusion: This software tool facilitated the data collection process needed in order for the physicist to conduct a second-check, thus yielding an optimized second-check workflow that was both more user friendly and time-efficient. Utilizing this software tool, the physicist was able to spend less time searching through the TPS PDF plan document and the R&V system and focus the second-check efforts on assessing the patient-specific plan-quality.
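A minimal sketch of the aggregation idea, assuming the pydicom package for reading the RT Plan file; the SQL table and column names below are hypothetical placeholders, not the actual MOSAIQ schema:

```python
import sqlite3  # stand-in for the site's SQL connection to the R&V database

import pydicom

def tps_beam_params(rtplan_path):
    """Read selected machine parameters from the TPS RT Plan DICOM file."""
    ds = pydicom.dcmread(rtplan_path)
    params = {}
    for beam in ds.BeamSequence:
        cp0 = beam.ControlPointSequence[0]
        params[str(beam.BeamName)] = {
            "gantry": float(cp0.GantryAngle),
            "collimator": float(cp0.BeamLimitingDeviceAngle),
        }
    return params

def rv_beam_params(conn, plan_id):
    """Hypothetical query; the real MOSAIQ schema differs."""
    rows = conn.execute(
        "SELECT beam_name, gantry, collimator FROM beams WHERE plan_id = ?",
        (plan_id,))
    return {n: {"gantry": g, "collimator": c} for n, g, c in rows}

def side_by_side(tps, rv):
    """Print a side-by-side comparison, flagging mismatched fields."""
    for name in sorted(set(tps) | set(rv)):
        t, r = tps.get(name), rv.get(name)
        flag = "" if t == r else "  <-- MISMATCH"
        print(f"{name}: TPS={t}  R&V={r}{flag}")
```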
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibragimov, B; Pernus, F; Strojan, P
Purpose: Accurate and efficient delineation of the tumor target and organs-at-risk is essential for the success of radiotherapy. In reality, despite decades of intense research effort, auto-segmentation has not yet become clinical practice. In this study, we present, for the first time, a deep learning-based classification algorithm for autonomous segmentation in head and neck (HaN) treatment planning. Methods: Fifteen HaN datasets of CT, MR and PET images with manual annotation of organs-at-risk (OARs), including the spinal cord, brainstem, optic nerves, chiasm, eyes, mandible, tongue and parotid glands, were collected and saved in a library of plans. We also have ten super-resolution MR images of the tongue area, in which the genioglossus and inferior longitudinalis tongue muscles are defined as organs of interest. We applied the concepts of random forest- and deep learning-based object classification for automated image annotation, with the aim of using machine learning to facilitate the head and neck radiotherapy planning process. In this new segmentation paradigm, random forests were used for landmark-assisted segmentation of the super-resolution MR images. As an alternative to auto-segmentation with random forest-based landmark detection, deep convolutional neural networks were developed for voxel-wise segmentation of OARs in single- and multi-modal images. The network consisted of three pairs of convolution and pooling layers, one ReLU layer and a softmax layer. Results: We present a comprehensive study on using machine learning concepts for auto-segmentation of OARs and tongue muscles for HaN radiotherapy planning. An accuracy of 81.8% in terms of Dice coefficient was achieved for segmentation of the genioglossus and inferior longitudinalis tongue muscles. Preliminary results of OAR segmentation also indicate that deep learning affords unprecedented opportunities to improve the accuracy and robustness of radiotherapy planning. Conclusion: A novel machine learning framework has been developed for image annotation and structure segmentation. Our results indicate the great potential of deep learning in radiotherapy treatment planning.
Cardiovascular Event Prediction by Machine Learning: The Multi-Ethnic Study of Atherosclerosis.
Ambale-Venkatesh, Bharath; Yang, Xiaoying; Wu, Colin O; Liu, Kiang; Hundley, W Gregory; McClelland, Robyn; Gomes, Antoinette S; Folsom, Aaron R; Shea, Steven; Guallar, Eliseo; Bluemke, David A; Lima, João A C
2017-10-13
Machine learning may be useful to characterize cardiovascular risk, predict outcomes, and identify biomarkers in population studies. To test the ability of random survival forests, a machine learning technique, to predict 6 cardiovascular outcomes in comparison to standard cardiovascular risk scores. We included participants from the MESA (Multi-Ethnic Study of Atherosclerosis). Baseline measurements were used to predict cardiovascular outcomes over 12 years of follow-up. MESA was designed to study the progression of subclinical disease to cardiovascular events in participants initially free of cardiovascular disease. All 6814 participants from MESA, aged 45 to 84 years, from 4 ethnicities, and 6 centers across the United States were included. Seven-hundred thirty-five variables from imaging and noninvasive tests, questionnaires, and biomarker panels were obtained. We used the random survival forests technique to identify the top-20 predictors of each outcome. Imaging, electrocardiography, and serum biomarkers featured heavily on the top-20 lists as opposed to traditional cardiovascular risk factors. Age was the most important predictor for all-cause mortality. Fasting glucose levels and carotid ultrasonography measures were important predictors of stroke. Coronary Artery Calcium score was the most important predictor of coronary heart disease and all atherosclerotic cardiovascular disease combined outcomes. Left ventricular structure and function and cardiac troponin-T were among the top predictors for incident heart failure. Creatinine, age, and ankle-brachial index were among the top predictors of atrial fibrillation. TNF-α (tumor necrosis factor-α) and IL (interleukin)-2 soluble receptors and NT-proBNP (N-Terminal Pro-B-Type Natriuretic Peptide) levels were important across all outcomes. The random survival forests technique performed better than established risk scores, with increased prediction accuracy (decreased Brier score by 10%-25%). Machine learning in conjunction with deep phenotyping improves prediction accuracy in cardiovascular event prediction in an initially asymptomatic population. These methods may lead to greater insights on subclinical disease markers without a priori assumptions of causality. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00005487. © 2017 American Heart Association, Inc.
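A small illustration of the core technique, assuming the scikit-survival package; the synthetic predictors and follow-up times below are invented, not MESA data:

```python
import numpy as np
from sksurv.ensemble import RandomSurvivalForest

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 5))               # stand-in for the 735 predictors
time = rng.exponential(scale=10 + 5 * (X[:, 0] > 0))  # follow-up times
event = rng.random(n) < 0.7               # ~70% observed events

# scikit-survival expects a structured array of (event indicator, time).
y = np.empty(n, dtype=[("event", bool), ("time", float)])
y["event"], y["time"] = event, time

rsf = RandomSurvivalForest(n_estimators=100, random_state=0).fit(X, y)
# Permutation-style variable importance would rank the top predictors; here
# we just report the concordance index on the training data for illustration.
print("concordance index:", rsf.score(X, y))
```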
ERIC Educational Resources Information Center
Seitz, Sue; Morris, Dan
In a study on short term memory, 32 educable mentally retarded subjects (mean IQ 62.68, mean mental age 103.78 months) were randomly assigned to each of the four experimental conditions. An automated machine presented the stimuli (32 three-letter words) and the interference items (a list of random numbers read aloud between stimuli presentations).…
Precision mechatronics based on high-precision measuring and positioning systems and machines
NASA Astrophysics Data System (ADS)
Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert
2007-06-01
Precision mechatronics is defined in this paper as the science and engineering of a new generation of high-precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics. Nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with uncertainties as small as possible is discussed. The integration of several optical and tactile nanoprobes makes the 3D-nanopositioning machine suitable for various tasks, such as long-range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free-form surface measurement, as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.
Random ensemble learning for EEG classification.
Hosseini, Mohammad-Parsa; Pompili, Dario; Elisevich, Kost; Soltanian-Zadeh, Hamid
2018-01-01
Real-time detection of seizure activity in epilepsy patients is critical in averting seizure activity and improving patients' quality of life. Accurate evaluation, presurgical assessment, seizure prevention, and emergency alerts all depend on the rapid detection of seizure onset. A new method of feature selection and classification for rapid and precise seizure detection is discussed wherein informative components of electroencephalogram (EEG)-derived data are extracted and an automatic method is presented using infinite independent component analysis (I-ICA) to select independent features. The feature space is divided into subspaces via random selection and multichannel support vector machines (SVMs) are used to classify these subspaces. The result of each classifier is then combined by majority voting to establish the final output. In addition, a random subspace ensemble using a combination of SVM, multilayer perceptron (MLP) neural network and an extended k-nearest neighbors (k-NN), called extended nearest neighbor (ENN), is developed for the EEG and electrocorticography (ECoG) big data problem. To evaluate the solution, a benchmark ECoG of eight patients with temporal and extratemporal epilepsy was implemented in a distributed computing framework as a multitier cloud-computing architecture. Using leave-one-out cross-validation, the accuracy, sensitivity, specificity, and both false positive and false negative ratios of the proposed method were found to be 0.97, 0.98, 0.96, 0.04, and 0.02, respectively. Application of the solution to cases under investigation with ECoG has also been effected to demonstrate its utility. Copyright © 2017 Elsevier B.V. All rights reserved.
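A minimal sketch of the random-subspace SVM ensemble with majority voting, assuming scikit-learn; the synthetic features stand in for the I-ICA components extracted from EEG/ECoG:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 40))                 # 40 extracted features
y = (X[:, :5].sum(axis=1) > 0).astype(int)     # synthetic seizure label

subspaces, models = [], []
for _ in range(11):                            # odd count avoids vote ties
    idx = rng.choice(X.shape[1], size=10, replace=False)
    subspaces.append(idx)
    models.append(SVC().fit(X[:, idx], y))

def predict(x_row):
    """Majority vote over the per-subspace SVMs for one sample."""
    votes = [m.predict(x_row[idx].reshape(1, -1))[0]
             for m, idx in zip(models, subspaces)]
    return int(np.sum(votes) > len(votes) / 2)

print(predict(X[0]))
```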
Automatic alkaloid removal system.
Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd
2014-01-01
This automated alkaloid removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant; scientific studies have shown that its tubers contain a toxic alkaloid constituent, dioscorine. The tubers can only be consumed after the poison is removed. In this experiment, the tubers need to be blended into powder form before being inserted into the machine basket. The user pushes the START button on the machine controller to switch the water pump ON, creating turbulent waves of water in the machine tank. The water stops automatically when the outlet solenoid valve is triggered. The tuber powder is washed for 10 minutes while 1 liter of water contaminated with the toxin mixture flows out. At this point, the controller automatically triggers the inlet solenoid valve and new water flows into the machine tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, after which a positive result is achieved, shown to be significant according to several parameters of biological character: pH, temperature, dissolved oxygen, turbidity, conductivity and fish survival rate or time. These parameters give results near or equal to those of the control water, and the toxin is assumed to be fully removed when the pH of the DH powder wash water is close to that of the control water. The pH of the control water is about 5.3, the water from this experimental process reaches 6.0, and before running the machine the pH of the contaminated water is about 3.8, which is too acidic. This automated machine saves time in removing toxicity from DH compared with the traditional method, while requiring less observation by the user.
Integrating Archaeological Modeling in DoD Cultural Resource Compliance
2012-10-26
Evolving optimised decision rules for intrusion detection using particle swarm paradigm
NASA Astrophysics Data System (ADS)
Sivatha Sindhu, Siva S.; Geetha, S.; Kannan, A.
2012-12-01
The aim of this article is to construct a practical intrusion detection system (IDS) that properly analyses the statistics of network traffic patterns and classifies them as normal or anomalous. The objective of this article is to show that the choice of effective network traffic features and a proficient machine-learning paradigm enhances the detection accuracy of an IDS. In this article, a rule-based approach with a family of six decision tree classifiers, namely the Decision Stump, C4.5, Naïve Bayes Tree, Random Forest, Random Tree and Representative Tree models, is introduced to detect anomalous network patterns. In particular, the proposed swarm optimisation-based approach selects the instances that compose the training set, and the optimised decision tree operating over this training set produces classification rules with improved coverage, classification capability and generalisation ability. Experiments with the Knowledge Discovery and Data mining (KDD) data set, which contains information on traffic patterns during normal and intrusive behaviour, show that the proposed algorithm produces optimised decision rules and outperforms other machine-learning algorithms.
Recognising discourse causality triggers in the biomedical domain.
Mihăilă, Claudiu; Ananiadou, Sophia
2013-12-01
Current domain-specific information extraction systems represent an important resource for biomedical researchers, who need to process vast amounts of knowledge in a short time. Automatic discourse causality recognition can further reduce their workload by suggesting possible causal connections and aiding in the curation of pathway models. We describe here an approach to the automatic identification of discourse causality triggers in the biomedical domain using machine learning. We create several baselines and experiment with and compare various parameter settings for three algorithms, i.e. Conditional Random Fields (CRF), Support Vector Machines (SVM) and Random Forests (RF). We also evaluate the impact of lexical, syntactic, and semantic features on each of the algorithms, showing that semantics improves the performance in all cases. We test our comprehensive feature set on two corpora containing gold standard annotations of causal relations, and demonstrate the need for more gold standard data. The best performance of 79.35% F-score is achieved by CRFs when using all three feature types.
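A toy sketch of trigger recognition as BIO-style sequence labelling with a CRF, assuming the sklearn-crfsuite package; the sentence, labels and features are simplified stand-ins for the paper's full lexical/syntactic/semantic feature set:

```python
import sklearn_crfsuite

def features(sent, i):
    """Simple lexical/contextual features for token i of a sentence."""
    w = sent[i]
    return {
        "word.lower": w.lower(),
        "is_title": w.istitle(),
        "prev": sent[i - 1].lower() if i > 0 else "<s>",
        "next": sent[i + 1].lower() if i < len(sent) - 1 else "</s>",
    }

sents = [["Inhibition", "of", "MEK", "results", "in", "apoptosis"]]
labels = [["O", "O", "O", "B-TRIGGER", "I-TRIGGER", "O"]]

X = [[features(s, i) for i in range(len(s))] for s in sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, labels)
print(crf.predict(X))
```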
NASA Astrophysics Data System (ADS)
Lazri, Mourad; Ameur, Soltane
2018-05-01
A model combining three classifiers, namely Support Vector Machine, Artificial Neural Network and Random Forest (SAR), is designed for improving the classification of convective and stratiform rain. This model (the SAR model) has been trained and then tested on a dataset derived from MSG-SEVIRI (Meteosat Second Generation-Spinning Enhanced Visible and Infrared Imager). Well-classified, mid-classified and misclassified pixels are determined from the combination of the three classifiers. Mid-classified and misclassified pixels, which are considered unreliable, are reclassified using a novel training of the developed scheme in which only the input data corresponding to the pixels in question are used. This whole process is repeated a second time, applied to mid-classified and misclassified pixels separately. Learning and validation of the developed scheme are performed against co-located data observed by ground radar. The developed scheme outperformed the individual classifiers used separately and reached an overall classification accuracy of 97.40%.
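A minimal sketch of the combination logic, assuming scikit-learn; pixels on which the three classifiers disagree play the role of the unreliable pixels that receive a second training pass, and the synthetic features stand in for MSG-SEVIRI inputs:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6))                  # six spectral features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # 1=convective, 0=stratiform

models = [SVC().fit(X, y),
          MLPClassifier(max_iter=500).fit(X, y),
          RandomForestClassifier(random_state=0).fit(X, y)]
preds = np.stack([m.predict(X) for m in models])
unanimous = (preds == preds[0]).all(axis=0)    # well-classified pixels

# Unreliable pixels get a second training pass restricted to their inputs.
X_u, y_u = X[~unanimous], y[~unanimous]
if len(y_u) > 1 and len(set(y_u)) > 1:
    refit = RandomForestClassifier(random_state=0).fit(X_u, y_u)
print(f"unreliable pixels: {(~unanimous).sum()} of {len(y)}")
```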
Meta-RaPS Algorithm for the Aerial Refueling Scheduling Problem
NASA Technical Reports Server (NTRS)
Kaplan, Sezgin; Arin, Arif; Rabadi, Ghaith
2011-01-01
The Aerial Refueling Scheduling Problem (ARSP) can be defined as determining the refueling completion times for each fighter aircraft (job) on multiple tankers (machines). ARSP assumes that jobs have different release times and due dates. The total weighted tardiness is used to evaluate a schedule's quality. Therefore, ARSP can be modeled as a parallel machine scheduling problem with release times and due dates to minimize the total weighted tardiness. Since ARSP is NP-hard, it is more appropriate to develop an approximate or heuristic algorithm to obtain solutions in reasonable computation times. In this paper, a Meta-RaPS-ATC algorithm is implemented to create high-quality solutions. Meta-RaPS (Meta-heuristic for Randomized Priority Search) is a recent and promising metaheuristic that is applied by introducing randomness into a construction heuristic. The Apparent Tardiness Cost (ATC) rule, which is a good rule for scheduling problems with a tardiness objective, is used to construct initial solutions, which are then improved by an exchange operation. Results are presented for generated instances.
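As a worked illustration of the construction step, the sketch below computes ATC priority indices for released jobs; the job data and the look-ahead parameter k are invented, and the Meta-RaPS randomization and improvement phases are omitted:

```python
import math

# (weight w, processing time p, due date d) for each released job
jobs = [(3, 4.0, 12.0), (1, 2.0, 8.0), (2, 5.0, 20.0)]
t = 0.0                                      # current time on the tanker
p_bar = sum(p for _, p, _ in jobs) / len(jobs)
k = 2.0                                      # look-ahead scaling parameter

def atc(w, p, d, t):
    # I_j(t) = (w_j / p_j) * exp(-max(d_j - p_j - t, 0) / (k * p_bar))
    return (w / p) * math.exp(-max(d - p - t, 0.0) / (k * p_bar))

ranked = sorted(jobs, key=lambda j: atc(*j, t), reverse=True)
for j in ranked:                             # Meta-RaPS then randomizes
    print(j, round(atc(*j, t), 4))           # among the top-priority jobs
```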
Teo, Ming; Amis, Terence; Lee, Sharon; Falland, Karina; Lambert, Stephen; Wheatley, John
2011-01-01
Study Objective: Continuous positive airway pressure (CPAP) titration studies are commonly performed using a nasal mask but some patients may prefer a full-face or oronasal mask. There is little evidence regarding the equivalence of different mask interfaces used to initiate treatment. We hypothesized that oronasal breathing when using an oronasal mask increases upper airway collapsibility and that a higher pressure may be required to maintain airway patency. We also assessed patient preferences for the 2 mask interfaces. Design: Prospective, randomized, cross-over design with 2 consecutive CPAP titration nights. Setting: Accredited laboratory in a university hospital. Patients or Participants: Twenty-four treatment-naive subjects with obstructive sleep apnea syndrome and respiratory disturbance index of greater than 15 events per hour. Interventions: CPAP titration was performed using an auto-titrating machine with randomization to a nasal or oronasal mask, followed by a second titration night using the alternate mask style. Measurements and Results: There was no significant difference in the mean pressures determined between nasal and oronasal masks, although 43% of subjects had nasal-to-oronasal mask-pressure differences of 2 cm H2O or more. Residual respiratory events, arousals, and measured leak were all greater with the oronasal mask. Seventy-nine percent of subjects preferred the nasal mask. Conclusions: Patients with obstructive sleep apnea syndrome can generally switch between nasal and oronasal masks without changing machine pressure, although there are individual differences that may be clinically significant. Measured leak is greater with the oronasal mask. Most patients with obstructive sleep apnea syndrome prefer a nasal mask as the interface for initiation of CPAP. Clinical Trial Registration: Australian New Zealand Clinical Trials Registry (ANZCTR). ACTRN: ACTRN12611000243910. URL: http://www.ANZCTR.org.au/ACTRN12611000243910.aspx Citation: Teo M; Amis T; Lee S; Falland K; Lambert S; Wheatley J. Equivalence of nasal and oronasal masks during initial CPAP titration for obstructive sleep apnea syndrome. SLEEP 2011;34(7):951-955. PMID:21731145
What variables are important in predicting bovine viral diarrhea virus? A random forest approach.
Machado, Gustavo; Mendoza, Mariana Recamonde; Corbellini, Luis Gustavo
2015-07-24
Bovine viral diarrhea virus (BVDV) causes one of the most economically important diseases in cattle, and the virus is found worldwide. A better understanding of the disease-associated factors is a crucial step towards the definition of strategies for control and eradication. In this study we trained a random forest (RF) prediction model and performed variable importance analysis to identify factors associated with BVDV occurrence. In addition, we assessed the influence of feature selection on RF performance and evaluated its predictive power relative to other popular classifiers and to logistic regression. We found that the RF classification model resulted in an average error rate of 32.03% for the negative class (negative for BVDV) and 36.78% for the positive class (positive for BVDV). The RF model presented an area under the ROC curve of 0.702. Variable importance analysis revealed that important predictors of BVDV occurrence were: a) who inseminates the animals, b) the number of neighboring farms that have cattle, and c) whether rectal palpation is performed routinely. Our results suggest that the use of machine learning algorithms, especially RF, is a promising methodology for the analysis of cross-sectional studies, presenting satisfactory predictive power and the ability to identify predictors that represent potential risk factors for BVDV investigation. We examined classical predictors and found some new and hard-to-control practices that may lead to the spread of this disease within and among farms, mainly regarding poor or neglected reproduction management, which should be considered for disease control and eradication.
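A small sketch of the RF variable-importance step, assuming scikit-learn; the herd-management feature names and data are invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
features = ["inseminator", "n_neighbor_farms", "routine_rectal_palpation",
            "herd_size", "vaccination"]
X = rng.integers(0, 4, size=(400, len(features)))
y = (X[:, 0] + X[:, 1] + rng.normal(size=400) > 3).astype(int)  # BVDV status

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Impurity-based importances rank candidate risk factors for inspection.
for name, imp in sorted(zip(features, rf.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name}: {imp:.3f}")
```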
Measurement of LHCD antenna position in Aditya tokamak
NASA Astrophysics Data System (ADS)
Ambulkar, K. K.; Sharma, P. K.; Virani, C. G.; Parmar, P. R.; Thakur, A. L.; Kulkarni, S. V.
2010-02-01
To drive plasma current non-inductively in the ADITYA tokamak, a 120 kW pulsed Lower Hybrid Current Drive (LHCD) system at 3.7 GHz has been designed, fabricated and installed on the ADITYA tokamak. In this system, the antenna consists of a grill structure with two rows, each row comprising four sub-waveguides. The coupling of LHCD power to the plasma strongly depends on the plasma density near the mouth of the grill antenna; the grill antenna therefore has to be precisely positioned for efficient coupling. The movement of a mechanical bellow, which contracts or expands up to 50 mm, governs the movement of the antenna. In order to monitor the position of the antenna precisely, the reference position of the antenna with respect to the machine/plasma position has to be accurately determined. Further, a mechanical or electronic system to measure the relative movement of the antenna with respect to the reference position is also needed. Due to poor accessibility inside the ADITYA machine, it is impossible to physically measure the reference position of the grill antenna with respect to the machine wall, taken as the reference position, and hence an alternative method has to be adopted to establish these measurements reliably. In this paper we report the design and development of a mechanism with which the antenna position measurements are made. We also describe a unique method by which the reference position of the antenna with respect to the inner edge of the tokamak wall is measured, which otherwise was impossible due to poor accessibility and physical constraints. The position of the antenna is monitored using an electronic scale, developed and installed on the bellow. Once the reference position is derived, a linear potentiometer attached to the bellow measures the linear distance using a position transmitter. The accuracy of measurement obtained in our setup is within ±0.5%, and the linearity and repeatability are excellent.
The evolution of machining-induced surface of single-crystal FCC copper via nanoindentation
NASA Astrophysics Data System (ADS)
Zhang, Lin; Huang, Hu; Zhao, Hongwei; Ma, Zhichao; Yang, Yihan; Hu, Xiaoli
2013-05-01
The physical properties of a machining-induced new surface depend on the performance of the initial defect surface and the deformed layer in the subsurface of the bulk material. In this paper, three-dimensional molecular dynamics simulations of nanoindentation are performed on the single-point diamond turning surface of single-crystal copper and compared with simulations on pristine single-crystal face-centered cubic copper. The simulation results indicate that the nucleation of dislocations in the nanoindentation test differs between the machining-induced surface and pristine single-crystal copper. The dislocation embryos gradually develop from sites of homogeneous random nucleation around the indenter in the pristine single-crystal specimen, while the dislocation embryos derived from vacancy-related defects are distributed in the damage layer of the subsurface beneath the machining-induced surface. The results show that the machining-induced surface is softer than pristine single-crystal copper. Nanocutting simulations were then performed along different crystal orientations on the same crystal surface. It is shown that the crystal orientation directly influences the dislocation formation and distribution of the machining-induced surface. The crystal orientation of nanocutting is further verified to affect both residual defect generation and its propagation directions, which are important in assessing the change of mechanical properties, such as hardness and Young's modulus, after the nanocutting process.
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
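A minimal sketch of a probability machine, assuming scikit-learn; the logistic data-generating model and its coefficients are illustrative choices echoing the paper's simulation setting:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n = 2000
x1 = rng.integers(0, 2, n)                 # binary exposure of interest
x2 = rng.normal(size=n)
p = 1 / (1 + np.exp(-(-0.5 + 1.0 * x1 + 0.8 * x2)))  # logistic truth
y = rng.random(n) < p

X = np.column_stack([x1, x2])
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# Counterfactual contrast: the same subjects with x1 forced to 1 vs. 0
# yields a nonparametric average risk-difference effect estimate.
X1, X0 = X.copy(), X.copy()
X1[:, 0], X0[:, 0] = 1, 0
risk_diff = (rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]).mean()
print("estimated average risk difference:", round(risk_diff, 3))
```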
The Tera Multithreaded Architecture and Unstructured Meshes
NASA Technical Reports Server (NTRS)
Bokhari, Shahid H.; Mavriplis, Dimitri J.
1998-01-01
The Tera Multithreaded Architecture (MTA) is a new parallel supercomputer currently being installed at the San Diego Supercomputing Center (SDSC). This machine has an architecture quite different from contemporary parallel machines. The computational processor is a custom design, and the machine uses hardware to support very fine-grained multithreading. The main memory is shared, hardware-randomized and flat. These features make the machine highly suited to the execution of unstructured mesh problems, which are difficult to parallelize on other architectures. We report the results of a study carried out during July-August 1998 to evaluate the execution of EUL3D, a code that solves the Euler equations on an unstructured mesh, on the 2-processor Tera MTA at SDSC. Our investigation shows that parallelization of an unstructured code is extremely easy on the Tera. We were able to get an existing parallel code (designed for a shared memory machine) running on the Tera by changing only the compiler directives. Furthermore, a serial version of this code was compiled to run in parallel on the Tera by judicious use of directives to invoke the "full/empty" tag bits of the machine to obtain synchronization. This version achieves 212 and 406 Mflop/s on one and two processors, respectively, and requires no attention to partitioning or placement of data, issues that would be of paramount importance in other parallel architectures.
More About The Farley Three-Dimensional Braider
NASA Technical Reports Server (NTRS)
Farley, Gary L.
1993-01-01
Farley three-dimensional braider, undergoing development, is machine for automatic fabrication of three-dimensional braided structures. Incorporates yarns into structure at arbitrary braid angles to produce complicated shape. Braiding surface includes movable braiding segments containing pivot points, along which yarn carriers travel during braiding process. Yarn carrier travels along sequence of pivot points as braiding segments move. Combined motions position yarns for braiding onto preform. Intended for use in making fiber preforms for fiber/matrix composite parts, such as multiblade propellers. Machine also described in "Farley Three-Dimensional Braiding Machine" (LAR-13911).
High Order Accuracy Methods for Supersonic Reactive Flows
2008-06-25
k = 0, ..., N, where N is the polynomial order used. The positive constant M is chosen such that σ(N) becomes machine zero; typically M ∼ 32. The filter function used in this study is the exponential filter σ(η) = exp(−αη^p), where α = −ln(ε) and ε is machine zero. All numerical experiments were run on a 667 MHz Compaq Alpha machine with 1 GB memory and an Alpha internal floating-point processor.
Robust Operation of Tendon-Driven Robot Fingers Using Force and Position-Based Control Laws
NASA Technical Reports Server (NTRS)
Hargrave, Brian (Inventor); Abdallah, Muhammad E (Inventor); Reiland, Matthew J (Inventor); Diftler, Myron A (Inventor); Strawser, Philip A (Inventor); Platt, Jr., Robert J. (Inventor); Ihrke, Chris A. (Inventor)
2013-01-01
A robotic system includes a tendon-driven finger and a control system. The system controls the finger via a force-based control law when a tension sensor is available, and via a position-based control law when a sensor is not available. Multiple tendons may each have a corresponding sensor. The system selectively injects a compliance value into the position-based control law when only some sensors are available. A control system includes a host machine and a non-transitory computer-readable medium having a control process, which is executed by the host machine to control the finger via the force- or position-based control law. A method for controlling the finger includes determining the availability of a tension sensor(s), and selectively controlling the finger, using the control system, via the force or position-based control law. The position control law allows the control system to resist disturbances while nominally maintaining the initial state of internal tendon tensions.
Cao, Peng; Liu, Xiaoli; Bao, Hang; Yang, Jinzhu; Zhao, Dazhe
2015-01-01
The false-positive reduction (FPR) is a crucial step in the computer aided detection system for the breast. The issues of imbalanced data distribution and the limitation of labeled samples complicate the classification procedure. To overcome these challenges, we propose oversampling and semi-supervised learning methods based on the restricted Boltzmann machines (RBMs) to solve the classification of imbalanced data with a few labeled samples. To evaluate the proposed method, we conducted a comprehensive performance study and compared its results with the commonly used techniques. Experiments on benchmark dataset of DDSM demonstrate the effectiveness of the RBMs based oversampling and semi-supervised learning method in terms of geometric mean (G-mean) for false positive reduction in Breast CAD.
Kim, Eun Young; Magnotta, Vincent A; Liu, Dawei; Johnson, Hans J
2014-09-01
Machine learning (ML)-based segmentation methods are a common technique in the medical image processing field. In spite of numerous research groups that have investigated ML-based segmentation frameworks, there remain unanswered aspects of performance variability for the choice of two key components: the ML algorithm and intensity normalization. This investigation reveals that the choice of those elements plays a major part in determining segmentation accuracy and generalizability. The approach we have used in this study aims to evaluate the relative benefits of the two elements within a subcortical MRI segmentation framework. Experiments were conducted to contrast eight machine-learning algorithm configurations and 11 normalization strategies for our brain MR segmentation framework. For the intensity normalization, a Stable Atlas-based Mapped Prior (STAMP) was utilized to take better account of contrast along the boundaries of structures. Comparing eight machine learning algorithms on down-sampled segmentation MR data, it was clear that a significant improvement was obtained using ensemble-based ML algorithms (i.e., random forest) or ANN algorithms. Further investigation between these two algorithms also revealed that the random forest results provided exceptionally good agreement with manual delineations by experts. Additional experiments showed that the effect of STAMP-based intensity normalization also improved the robustness of segmentation for multicenter data sets. The constructed framework obtained good multicenter reliability and was successfully applied to a large multicenter MR data set (n>3000). Less than 10% of automated segmentations were recommended for minimal expert intervention. These results demonstrate the feasibility of using ML-based segmentation tools for processing large amounts of multicenter MR images. We demonstrated dramatically different result profiles in segmentation accuracy according to the choice of ML algorithm and intensity normalization. Copyright © 2014 Elsevier Inc. All rights reserved.
Burlina, Philippe; Billings, Seth; Joshi, Neil
2017-01-01
Objective To evaluate the use of ultrasound coupled with machine learning (ML) and deep learning (DL) techniques for automated or semi-automated classification of myositis. Methods Eighty subjects comprised of 19 with inclusion body myositis (IBM), 14 with polymyositis (PM), 14 with dermatomyositis (DM), and 33 normal (N) subjects were included in this study, where 3214 muscle ultrasound images of 7 muscles (observed bilaterally) were acquired. We considered three problems of classification including (A) normal vs. affected (DM, PM, IBM); (B) normal vs. IBM patients; and (C) IBM vs. other types of myositis (DM or PM). We studied the use of an automated DL method using deep convolutional neural networks (DL-DCNNs) for diagnostic classification and compared it with a semi-automated conventional ML method based on random forests (ML-RF) and “engineered” features. We used the known clinical diagnosis as the gold standard for evaluating performance of muscle classification. Results The performance of the DL-DCNN method resulted in accuracies ± standard deviation of 76.2% ± 3.1% for problem (A), 86.6% ± 2.4% for (B) and 74.8% ± 3.9% for (C), while the ML-RF method led to accuracies of 72.3% ± 3.3% for problem (A), 84.3% ± 2.3% for (B) and 68.9% ± 2.5% for (C). Conclusions This study demonstrates the application of machine learning methods for automatically or semi-automatically classifying inflammatory muscle disease using muscle ultrasound. Compared to the conventional random forest machine learning method used here, which has the drawback of requiring manual delineation of muscle/fat boundaries, DCNN-based classification by and large improved the accuracies in all classification problems while providing a fully automated approach to classification. PMID:28854220
Hettige, Nuwan C; Nguyen, Thai Binh; Yuan, Chen; Rajakulendran, Thanara; Baddour, Jermeen; Bhagwat, Nikhil; Bani-Fatemi, Ali; Voineskos, Aristotle N; Mallar Chakravarty, M; De Luca, Vincenzo
2017-07-01
Suicide is a major concern for those afflicted by schizophrenia. Identifying patients at the highest risk for future suicide attempts remains a complex problem for psychiatric interventions. Machine learning models allow for the integration of many risk factors in order to build an algorithm that predicts which patients are likely to attempt suicide. Currently, it is unclear how to integrate previously identified risk factors into a clinically relevant predictive tool to estimate the probability that a patient with schizophrenia will attempt suicide. We conducted a cross-sectional assessment on a sample of 345 participants diagnosed with schizophrenia spectrum disorders. Suicide attempters and non-attempters were clearly identified using the Columbia Suicide Severity Rating Scale (C-SSRS) and the Beck Suicide Ideation Scale (BSS). We developed four classification algorithms using regularized regression, random forest, elastic net and support vector machine models, with sociocultural and clinical variables as features to train the models. All classification models performed similarly in identifying suicide attempters and non-attempters. Our regularized logistic regression model demonstrated an accuracy of 67% and an area under the curve (AUC) of 0.71, while the random forest model demonstrated 66% accuracy and an AUC of 0.67. The support vector classifier (SVC) model demonstrated an accuracy of 67% and an AUC of 0.70, and the elastic net model demonstrated an accuracy of 65% and an AUC of 0.71. Machine learning algorithms offer a relatively successful method for incorporating many clinical features to predict individuals at risk for future suicide attempts. Increased performance of these models using clinically relevant variables offers the potential to facilitate early treatment and intervention to prevent future suicide attempts. Copyright © 2017 Elsevier Inc. All rights reserved.
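A sketch of the four-model comparison by cross-validated AUC, assuming scikit-learn; the synthetic features stand in for the sociocultural and clinical variables:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.normal(size=(345, 12))
y = (X[:, 0] - X[:, 3] + rng.normal(size=345) > 0).astype(int)

models = {
    "regularized LR": LogisticRegression(penalty="l2", C=0.5),
    "elastic net": LogisticRegression(penalty="elasticnet", solver="saga",
                                      l1_ratio=0.5, C=0.5, max_iter=5000),
    "random forest": RandomForestClassifier(n_estimators=200),
    "SVC": SVC(probability=True),
}
for name, m in models.items():
    auc = cross_val_score(m, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC={auc:.2f}")
```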
Hasan, Mehedi; Kotov, Alexander; Carcone, April; Dong, Ming; Naar, Sylvie; Hartlieb, Kathryn Brogan
2016-08-01
This study examines the effectiveness of state-of-the-art supervised machine learning methods in conjunction with different feature types for the task of automatic annotation of fragments of clinical text based on codebooks with a large number of categories. We used a collection of motivational interview transcripts consisting of 11,353 utterances, which were manually annotated by two human coders as the gold standard, and experimented with state-of-the-art classifiers, including Naïve Bayes, J48 Decision Tree, Support Vector Machine (SVM), Random Forest (RF), AdaBoost, DiscLDA, Conditional Random Fields (CRF) and Convolutional Neural Network (CNN), in conjunction with lexical, contextual (label of the previous utterance) and semantic (distribution of words in the utterance across the Linguistic Inquiry and Word Count dictionaries) features. We found that, when the number of classes is large, the performance of CNN and CRF is inferior to SVM. When only lexical features were used, interview transcripts were automatically annotated by SVM with the highest classification accuracy among all classifiers: 70.8%, 61% and 53.7% based on the codebooks consisting of 17, 20 and 41 codes, respectively. Using contextual and semantic features, as well as their combination, in addition to lexical ones improved the accuracy of SVM for annotation of utterances in motivational interview transcripts with a codebook consisting of 17 classes to 71.5%, 74.2%, and 75.1%, respectively. Our results demonstrate the potential of using machine learning methods in conjunction with lexical, semantic and contextual features for automatic annotation of clinical interview transcripts with near-human accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
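A toy sketch of coding utterances with an SVM over lexical plus contextual features, assuming scikit-learn; the transcript snippets and codes are invented, and at prediction time the previously predicted label would replace the gold one:

```python
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

utterances = ["why do you want to change", "i want to feel healthier",
              "tell me more", "my family worries about me"]
codes = ["OPEN_Q", "CHANGE_TALK", "OPEN_Q", "CHANGE_TALK"]
prev_codes = ["<start>"] + codes[:-1]        # contextual feature

vec = TfidfVectorizer()
X_lex = vec.fit_transform(utterances)        # lexical features
code_ids = {c: i for i, c in enumerate(sorted(set(prev_codes)))}
X_ctx = csr_matrix(np.eye(len(code_ids))[[code_ids[c] for c in prev_codes]])

X = hstack([X_lex, X_ctx])                   # lexical + contextual
clf = LinearSVC().fit(X, codes)
print(clf.predict(X))
```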
Kalscheur, Matthew M; Kipp, Ryan T; Tattersall, Matthew C; Mei, Chaoqun; Buhr, Kevin A; DeMets, David L; Field, Michael E; Eckhardt, Lee L; Page, C David
2018-01-01
Cardiac resynchronization therapy (CRT) reduces morbidity and mortality in heart failure patients with reduced left ventricular function and intraventricular conduction delay. However, individual outcomes vary significantly. This study sought to use a machine learning algorithm to develop a model to predict outcomes after CRT. Models were developed with machine learning algorithms to predict all-cause mortality or heart failure hospitalization at 12 months post-CRT in the COMPANION trial (Comparison of Medical Therapy, Pacing, and Defibrillation in Heart Failure). The best performing model was developed with the random forest algorithm. The ability of this model to predict all-cause mortality or heart failure hospitalization and all-cause mortality alone was compared with discrimination obtained using a combination of bundle branch block morphology and QRS duration. In the 595 patients with CRT-defibrillator in the COMPANION trial, 105 deaths occurred (median follow-up, 15.7 months). The survival difference across subgroups differentiated by bundle branch block morphology and QRS duration did not reach significance (P=0.08). The random forest model produced quartiles of patients with an 8-fold difference in survival between those with the highest and lowest predicted probability for events (hazard ratio, 7.96; P<0.0001). The model also discriminated the risk of the composite end point of all-cause mortality or heart failure hospitalization better than subgroups based on bundle branch block morphology and QRS duration. In the COMPANION trial, a machine learning algorithm produced a model that predicted clinical outcomes after CRT. Applied before device implant, this model may better differentiate outcomes over current clinical discriminators and improve shared decision-making with patients. © 2018 American Heart Association, Inc.
Data-driven mapping of the potential mountain permafrost distribution.
Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail
2017-07-15
Existing mountain permafrost distribution models generally offer a good overview of the potential extent of this phenomenon at a regional scale. They are, however, not always able to reproduce the high spatial discontinuity of permafrost at the micro-scale (the scale of a specific landform; ten to several hundreds of meters). To overcome this limitation, we tested an alternative modelling approach using three classification algorithms from statistics and machine learning: Logistic regression, Support Vector Machines and Random forests. These supervised learning techniques infer a classification function from labelled training data (pixels of permafrost absence and presence) with the aim of predicting the permafrost occurrence where it is unknown. The research was carried out in a 588 km2 area of the Western Swiss Alps. Permafrost evidence was mapped from ortho-image interpretation (rock glacier inventorying) and field data (mainly geoelectrical and thermal data). The relationship between the selected permafrost evidence and permafrost controlling factors was computed with the mentioned techniques. Classification performances, assessed with AUROC, range from 0.81 for Logistic regression, through 0.85 for Support Vector Machines, to 0.88 for Random forests. The adopted machine learning algorithms have proved to be efficient for permafrost distribution modelling, giving results consistent with the field reality. The high resolution of the input dataset (10 m) allows elaborating maps at the micro-scale with a modelled permafrost spatial distribution less optimistic than classic spatial models. Moreover, the probability output of the adopted algorithms offers a more precise overview of the potential distribution of mountain permafrost than simple indexes of permafrost favorability. These encouraging results also open the way to new possibilities of permafrost data analysis and mapping. Copyright © 2017 Elsevier B.V. All rights reserved.
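A minimal sketch of the modelling and probability-mapping step, assuming scikit-learn; the controlling factors (elevation, slope, radiation) and data are invented stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(6)
X = rng.normal(size=(1000, 3))              # elevation, slope, radiation
y = (1.5 * X[:, 0] - X[:, 1] + rng.normal(size=1000) > 0).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for m in (LogisticRegression(), SVC(probability=True),
          RandomForestClassifier()):
    m.fit(Xtr, ytr)
    auc = roc_auc_score(yte, m.predict_proba(Xte)[:, 1])
    print(type(m).__name__, f"AUROC={auc:.2f}")

# The fitted class probabilities, rather than a hard 0/1 map, give the
# permafrost-favorability surface described in the abstract.
```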
Pereira, Sérgio; Meier, Raphael; McKinley, Richard; Wiest, Roland; Alves, Victor; Silva, Carlos A; Reyes, Mauricio
2018-02-01
Machine learning systems are achieving better performance at the cost of becoming increasingly complex. As a result, however, they become less interpretable, which may cause some distrust by the end-user of the system. This is especially important as these systems are pervasively being introduced to critical domains, such as the medical field. Representation learning techniques are general methods for automatic feature computation. Nevertheless, these techniques are regarded as uninterpretable "black boxes". In this paper, we propose a methodology to enhance the interpretability of automatically extracted machine learning features. The proposed system is composed of a Restricted Boltzmann Machine for unsupervised feature learning and a Random Forest classifier, which are combined to jointly consider existing correlations between imaging data, features, and target variables. We define two levels of interpretation: global and local. The former is devoted to understanding whether the system learned the relevant relations in the data correctly, while the latter is focused on predictions performed at the voxel and patient level. In addition, we propose a novel feature importance strategy that considers both imaging data and target variables, and we demonstrate the ability of the approach to leverage the interpretability of the obtained representation for the task at hand. We evaluated the proposed methodology on brain tumor segmentation and penumbra estimation in ischemic stroke lesions. We show the ability of the proposed methodology to unveil information regarding relationships between imaging modalities and extracted features and their usefulness for the task at hand. In both clinical scenarios, we demonstrate that the proposed methodology enhances the interpretability of automatically learned features, highlighting specific learning patterns that resemble how an expert extracts relevant data from medical images. Copyright © 2017 Elsevier B.V. All rights reserved.
Accuracy of Tracking Forest Machines with GPS
M.W. Veal; S.E. Taylor; T.P. McDonald; D.K. McLemore; M.R. Dunn
2001-01-01
This paper describes the results of a study that measured the accuracy of using GPS to track the movement of forest machines. Two different commercially available GPS receivers (Trimble ProXR and GeoExplorer II) were used to track wheeled skidders under three different canopy conditions at two different vehicle speeds. Dynamic GPS data were compared to position data...
Machine Translation-Assisted Language Learning: Writing for Beginners
ERIC Educational Resources Information Center
Garcia, Ignacio; Pena, Maria Isabel
2011-01-01
The few studies that deal with machine translation (MT) as a language learning tool focus on its use by advanced learners, never by beginners. Yet, freely available MT engines (i.e. Google Translate) and MT-related web initiatives (i.e. Gabble-on.com) position themselves to cater precisely to the needs of learners with a limited command of a…
ERIC Educational Resources Information Center
Kafafian, Haig
Teaching instructions, lesson plans, and exercises are provided for severely physically and/or neurologically handicapped persons learning to use the Cybertype electric writing machine with a tongue-body keyboard. The keyboard, which has eight double-throw toggle switches and a three-position state-selector switch, is designed to be used by…
Apparatus for cutting elastomeric materials
NASA Technical Reports Server (NTRS)
Corbett, A. B.
1974-01-01
A sharp, thin cutting edge is held in the head of a milling machine designed for metalworking. The controls of the machine are used to position the cutting edge in the same plane as the vibrating specimen. The controls are then operated, bringing the blade into contact with the specimen to cut it into the desired shapes and sizes. Cut surfaces appear mirror-smooth; the vibrating mechanism causes no visible striations.
Evaluation of the eZono 4000 with eZGuide for ultrasound-guided procedures.
Gadsden, Jeff; Latmore, Malikah; Levine, Daniel M
2015-05-01
Ultrasound-guided procedures are increasingly common in a variety of acute care settings, such as the operating room, critical care unit and emergency room. However, accurate judgment of needle tip position using traditional ultrasound technology is frequently difficult, and serious injury can result from inadvertently advancing beyond or through the target. Needle navigation is a recent innovation that allows the clinician to visualize the needle position and trajectory in real time as it approaches the target. A novel ultrasound machine has recently been introduced that is portable and designed for procedural guidance. The eZono 4000™ features an innovative needle navigation technology that is simple to use and permits the use of a wide range of commercially available needles, avoiding the inconvenience and cost of proprietary equipment. This article discusses this new ultrasound machine in the context of other currently available ultrasound machines featuring needle navigation.
Floating Ultrasonic Transducer Inspection System and Method for Nondestructive Evaluation
NASA Technical Reports Server (NTRS)
Johnston, Patrick H. (Inventor); Zalameda, Joseph N. (Inventor)
2016-01-01
A method for inspecting a structural sample using ultrasonic energy includes positioning an ultrasonic transducer adjacent to a surface of the sample and then transmitting ultrasonic energy into the sample. Force pulses are applied to the transducer concurrently with transmission of the ultrasonic energy. A host machine processes ultrasonic return pulses from an ultrasonic pulser/receiver to quantify attenuation of the ultrasonic energy within the sample. The host machine detects a defect in the sample using the quantified level of attenuation. The method may include positioning a dry couplant between the ultrasonic transducer and the surface. A system includes an actuator, an ultrasonic transducer, a dry couplant between the transducer and the sample, a scanning device that moves the actuator and transducer, and a measurement system having a pulsed actuator power supply, an ultrasonic pulser/receiver, and a host machine that executes the above method.
NASA Astrophysics Data System (ADS)
Mahapatra, Prasant Kumar; Sethi, Spardha; Kumar, Amod
2015-10-01
In conventional tool positioning techniques, sensors embedded in the motion stages provide accurate tool position information. In this paper, a machine vision based system and an image processing technique for motion measurement of a lathe tool from two-dimensional sequential images, captured using a charge-coupled device camera with a resolution of 250 microns, are described. An algorithm was developed to calculate the observed distance travelled by the tool from the captured images. As expected, error was observed in the distance traversed by the tool as calculated from these images. Optimization of the errors in lathe tool movement, due to the machine vision system, calibration, environmental factors, etc., was carried out using two soft computing techniques, namely artificial immune system (AIS) and particle swarm optimization (PSO). The results show the better capability of AIS over PSO.
NASA Astrophysics Data System (ADS)
Lucian, P.; Gheorghe, S.
2017-08-01
This paper presents a new method, based on the FRISCO formula, for optimizing the choice of the best control system for kinematical feed chains with a great distance between slides, as used in computer numerical controlled machine tools. Such machines are typically, but not exclusively, used for machining large and complex parts (mostly in the aviation industry) or complex casting molds. For such machine tools, the kinematic feed chains are arranged in a dual-parallel drive structure that allows the mobile element to be moved by two kinematical branches and their related control systems. This arrangement provides high speed and high rigidity (a critical requirement for precision machining) during the machining process. A significant issue for such an arrangement is the ability of the two parallel control systems to follow the same trajectory accurately. To address this issue, it is necessary to achieve synchronous motion control for the two kinematical branches, ensuring that the mobile element keeps the correct perpendicular position during its motion on the two slides.
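The FRISCO-based selection criterion itself cannot be reconstructed from the abstract; as one hedged illustration of the synchronization problem it addresses, a commonly used scheme for dual-drive arrangements is cross-coupled control, in which each axis tracks the shared reference while a coupling term penalizes skew between the two slides (the gains below are hypothetical, and this is not the paper's method):

```python
def cross_coupled_step(ref, pos1, pos2, kp=1.0, kc=0.5):
    """One control step for two parallel axes driving one crossbeam.
    Each axis tracks the shared reference; the coupling term reduces
    the skew (position difference) so the beam stays perpendicular.
    Illustrative sketch only; kp and kc are assumed gains."""
    e1, e2 = ref - pos1, ref - pos2
    skew = pos1 - pos2
    u1 = kp * e1 - kc * skew   # hold back the leading axis
    u2 = kp * e2 + kc * skew   # push the lagging axis
    return u1, u2
```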
Humanizing machines: Anthropomorphization of slot machines increases gambling.
Riva, Paolo; Sacchi, Simona; Brambilla, Marco
2015-12-01
Do people gamble more on slot machines if they think that they are playing against humanlike minds rather than mathematical algorithms? Research has shown that people have a strong cognitive tendency to imbue humanlike mental states to nonhuman entities (i.e., anthropomorphism). The present research tested whether anthropomorphizing slot machines would increase gambling. Four studies manipulated slot machine anthropomorphization and found that exposing people to an anthropomorphized description of a slot machine increased gambling behavior and reduced gambling outcomes. Such findings emerged using tasks that focused on gambling behavior (Studies 1 to 3) as well as in experimental paradigms that included gambling outcomes (Studies 2 to 4). We found that gambling outcomes decrease because participants primed with the anthropomorphic slot machine gambled more (Study 4). Furthermore, we found that high-arousal positive emotions (e.g., feeling excited) played a role in the effect of anthropomorphism on gambling behavior (Studies 3 and 4). Our research indicates that the psychological process of gambling-machine anthropomorphism can be advantageous for the gaming industry; however, this may come at great expense for gamblers' (and their families') economic resources and psychological well-being. (c) 2015 APA, all rights reserved.
Swab culture monitoring of automated endoscope reprocessors after high-level disinfection
Lu, Lung-Sheng; Wu, Keng-Liang; Chiu, Yi-Chun; Lin, Ming-Tzung; Hu, Tsung-Hui; Chiu, King-Wah
2012-01-01
AIM: To conduct a bacterial culture study for monitoring decontamination of automated endoscope reprocessors (AERs) after high-level disinfection (HLD). METHODS: From February 2006 to January 2011, the authors conducted randomized consecutive sampling each month for 7 AERs. The authors collected a total of 420 swab cultures, including 300 cultures from 5 gastroscope AERs and 120 cultures from 2 colonoscope AERs. Swab cultures were obtained from the residual water in the AERs after a full reprocessing cycle. Samples were cultured to test for aerobic bacteria, anaerobic bacteria, and Mycobacterium tuberculosis. RESULTS: The positive culture rate was 2.0% (6/300) for gastroscope AERs and 0.8% (1/120) for colonoscope AERs. All the positive cultures, including 6 from gastroscope and 1 from colonoscope AERs, showed monofloral colonization. Of the positive gastroscope AER samples, 50% (3/6) were colonized by aerobic bacteria and 50% (3/6) by fungi. CONCLUSION: A full reprocessing cycle of an AER with HLD is adequate for disinfection of the machine. Swab culture is a useful method for monitoring AER decontamination after each reprocessing cycle. Fungal contamination of AERs after reprocessing should also be kept in mind. PMID:22529696
Forecasting Solar Flares Using Magnetogram-based Predictors and Machine Learning
NASA Astrophysics Data System (ADS)
Florios, Kostas; Kontogiannis, Ioannis; Park, Sung-Hong; Guerra, Jordan A.; Benvenuto, Federico; Bloomfield, D. Shaun; Georgoulis, Manolis K.
2018-02-01
We propose a forecasting approach for solar flares based on data from Solar Cycle 24, taken by the Helioseismic and Magnetic Imager (HMI) on board the Solar Dynamics Observatory (SDO) mission. In particular, we use the Space-weather HMI Active Region Patches (SHARP) product, which provides cut-out magnetograms of solar active regions (ARs) in near-real time (NRT), taken over a five-year interval (2012-2016). Our approach utilizes a set of thirteen predictors, not included in the SHARP metadata, extracted from line-of-sight and vector photospheric magnetograms. We exploit several machine learning (ML) and conventional statistics techniques to predict flares of peak magnitude >M1 and >C1 within a 24 h forecast window. The ML methods used are multi-layer perceptrons (MLP), support vector machines (SVM), and random forests (RF). We conclude that random forests could be the prediction technique of choice for our sample, with the second-best method being multi-layer perceptrons, subject to an entropy objective function. A Monte Carlo simulation showed that the best-performing method gives accuracy ACC = 0.93(0.00), true skill statistic TSS = 0.74(0.02), and Heidke skill score HSS = 0.49(0.01) for >M1 flare prediction with a probability threshold of 15%, and ACC = 0.84(0.00), TSS = 0.60(0.01), and HSS = 0.59(0.01) for >C1 flare prediction with a probability threshold of 35%.
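For readers unfamiliar with the reported skill scores, the sketch below computes ACC, TSS, and HSS from binary forecasts obtained by thresholding predicted flare probabilities, following the standard contingency-table definitions (the 15% threshold mirrors the one reported for >M1 forecasts; the inputs are assumed arrays, not the study's data):

```python
import numpy as np

def skill_scores(y_true, prob, threshold=0.15):
    """ACC, TSS, and HSS for probabilistic forecasts dichotomized at
    a probability threshold. y_true: 0/1 flare occurrence; prob:
    predicted probabilities (both illustrative inputs)."""
    y_pred = (prob >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    acc = (tp + tn) / (tp + fp + fn + tn)
    tss = tp / (tp + fn) - fp / (fp + tn)   # POD minus POFD
    hss = (2.0 * (tp * tn - fn * fp) /
           ((tp + fn) * (fn + tn) + (tp + fp) * (fp + tn)))
    return acc, tss, hss
```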
Pre-operative prediction of surgical morbidity in children: comparison of five statistical models.
Cooper, Jennifer N; Wei, Lai; Fernandez, Soledad A; Minneci, Peter C; Deans, Katherine J
2015-02-01
The accurate prediction of surgical risk is important to patients and physicians. Logistic regression (LR) models are typically used to estimate these risks. However, in the fields of data mining and machine learning, many alternative classification and prediction algorithms have been developed. This study aimed to compare the performance of LR to several data mining algorithms for predicting 30-day surgical morbidity in children. We used the 2012 National Surgical Quality Improvement Program-Pediatric dataset to compare the performance of (1) an LR model that assumed linearity and additivity (simple LR model), (2) an LR model incorporating restricted cubic splines and interactions (flexible LR model), (3) a support vector machine, (4) a random forest, and (5) boosted classification trees for predicting surgical morbidity. The ensemble-based methods showed significantly higher accuracy, sensitivity, specificity, PPV, and NPV than the simple LR model. However, none of the models performed better than the flexible LR model in terms of the aforementioned measures or in model calibration or discrimination. Support vector machines, random forests, and boosted classification trees do not show better performance than LR for predicting pediatric surgical morbidity. After further validation, the flexible LR model derived in this study could be used to assist with clinical decision-making based on patient-specific surgical risks. Copyright © 2014 Elsevier Ltd. All rights reserved.
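As a hedged sketch of this kind of comparison (the NSQIP-Pediatric data are not public here, so a synthetic binary-outcome dataset stands in), a simple LR, a spline-expanded "flexible" LR, and a random forest can be compared by cross-validated AUC in scikit-learn; note that SplineTransformer requires scikit-learn >= 1.0 and only approximates, rather than reproduces, restricted cubic splines:

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer, StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic imbalanced binary outcome standing in for 30-day morbidity.
X, y = make_classification(n_samples=5000, n_features=15,
                           weights=[0.9], random_state=0)

simple_lr = make_pipeline(StandardScaler(),
                          LogisticRegression(max_iter=1000))
flexible_lr = make_pipeline(SplineTransformer(degree=3, n_knots=5),
                            LogisticRegression(max_iter=1000))
rf = RandomForestClassifier(n_estimators=500, random_state=0)

for name, model in [("simple LR", simple_lr),
                    ("flexible LR", flexible_lr),
                    ("random forest", rf)]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {auc:.3f}")
```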
Dimension Reduction With Extreme Learning Machine.
Kasun, Liyanaarachchi Lekamalage Chamara; Yang, Yan; Huang, Guang-Bin; Zhang, Zhengyou
2016-08-01
Data may often contain noise or irrelevant information, which negatively affects the generalization capability of machine learning algorithms. The objective of dimension reduction algorithms, such as principal component analysis (PCA), non-negative matrix factorization (NMF), random projection (RP), and auto-encoders (AE), is to reduce the noise or irrelevant information in the data. The features of PCA (eigenvectors) and linear AE are not able to represent data as parts (e.g. the nose in a face image). On the other hand, NMF and non-linear AE are hampered by slow learning speed, and RP only represents a subspace of the original data. This paper introduces a dimension reduction framework which, to some extent, represents data as parts, has fast learning speed, and learns the between-class scatter subspace. To this end, this paper investigates a linear and non-linear dimension reduction framework referred to as extreme learning machine AE (ELM-AE) and sparse ELM-AE (SELM-AE). In contrast to tied-weight AE, the hidden neurons in ELM-AE and SELM-AE need not be tuned, and their parameters (e.g., input weights in additive neurons) are initialized using orthogonal and sparse random weights, respectively. Experimental results on the USPS handwritten digit recognition, CIFAR-10 object recognition, and NORB object recognition data sets show the efficacy of linear and non-linear ELM-AE and SELM-AE in terms of discriminative capability, sparsity, training time, and normalized mean square error.
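A minimal sketch of the ELM-AE idea, assuming the linear-embedding variant: random orthogonal input weights are fixed rather than tuned, output weights are solved in closed form by regularized least squares, and the embedding projects the data onto the learned basis. The paper's exact formulation differs in details (e.g., the sparse SELM-AE variant), so treat this as illustrative:

```python
import numpy as np

def elm_ae_embed(X, n_hidden, C=1e3, seed=0):
    """ELM autoencoder sketch. Assumes n_hidden <= n_features so the
    QR factorization yields orthonormal input weight columns."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Random orthogonal input weights and random bias (not tuned).
    W = np.linalg.qr(rng.standard_normal((n_features, n_hidden)))[0]
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                      # hidden activations
    # Regularized least squares: beta maps H back to X.
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ X)
    # Low-dimensional embedding: project X onto the learned basis.
    return X @ beta.T

# Example: reduce 64-dimensional data to 10 dimensions.
X = np.random.default_rng(1).standard_normal((500, 64))
X_low = elm_ae_embed(X, n_hidden=10)   # shape (500, 10)
```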
Balachandran, Anoop T; Gandia, Kristine; Jacobs, Kevin A; Streiner, David L; Eltoukhy, Moataz; Signorile, Joseph F
2017-11-01
Power training has been shown to be more effective than conventional resistance training for improving physical function in older adults; however, most trials have used pneumatic machines during training. Considering that the general public typically has access to plate-loaded machines, the effectiveness and safety of power training using plate-loaded machines compared to pneumatic machines is an important consideration. The purpose of this investigation was to compare the effects of high-velocity training using pneumatic machines (Pn) versus standard plate-loaded machines (PL). Independently living older adults, 60 years or older, were randomized into two groups: pneumatic machine (Pn, n=19) and plate-loaded machine (PL, n=17). After 12 weeks of high-velocity training twice per week, groups were analyzed using an intention-to-treat approach. Primary outcomes were lower body power measured using a linear transducer and upper body power measured using a medicine ball throw. Secondary outcomes included lower and upper body muscle strength, the Physical Performance Battery (PPB), the gallon jug test, the timed up-and-go test, and self-reported function using the Patient-Reported Outcomes Measurement Information System (PROMIS) and an online video questionnaire. Outcome assessors were blinded to group membership. Lower body power significantly improved in both groups (Pn: 19%, PL: 31%), with no significant difference between the groups (Cohen's d=0.4, 95% CI (-1.1, 0.3)). Upper body power significantly improved only in the PL group, but showed no significant difference between the groups (Pn: 3%, PL: 6%). For balance, there was a significant difference between the groups favoring the Pn group (d=0.7, 95% CI (0.1, 1.4)); however, there were no statistically significant differences between groups for PPB, gallon jug transfer, muscle strength, timed up-and-go, or self-reported function. No serious adverse events were reported in either of the groups. Pneumatic and plate-loaded machines were both effective in improving lower body power and physical function in older adults. The results suggest that power training can be safely and effectively performed by older adults using either pneumatic or plate-loaded machines. Copyright © 2017 Elsevier Inc. All rights reserved.
10 CFR 431.383 - Enforcement process for electric motors.
Code of Federal Regulations, 2014 CFR
2014-01-01
... general purpose electric motor of equivalent electrical design and enclosure rather than replacing the... equivalent electrical design and enclosure rather than machining and attaching an endshield. ... sample of up to 20 units will then be randomly selected from one or more subdivided groups within the...
Traceability of On-Machine Tool Measurement: A Review.
Mutilba, Unai; Gomez-Acedo, Eneko; Kortaberria, Gorka; Olarra, Aitor; Yagüe-Fabra, Jose A
2017-07-11
Nowadays, errors during the manufacturing process of high-value components are not acceptable in driving industries such as energy and transportation. Sectors such as aerospace, automotive, shipbuilding, nuclear power, large science facilities, and wind power need complex and accurate components that demand close measurement and fast feedback into their manufacturing processes. New measuring technologies are already available in machine tools, including integrated touch probes and fast interface capabilities. They provide the possibility to measure the workpiece in-machine, during or after its manufacture, maintaining the original setup of the workpiece and avoiding interruption of the manufacturing process to transport the workpiece to a measuring position. However, the traceability of the measurement process on a machine tool is not yet ensured, and measurement data are still not reliable enough for process control or product validation. The scientific objective is to determine the uncertainty of a machine tool measurement and, thereby, convert it into a machine-integrated traceable measuring process. For that purpose, an error budget should consider error sources such as the machine tool, the components under measurement, and the interactions between them. This paper reviews all those uncertainty sources, focusing mainly on those related to the machine tool, whether in the process of geometric error assessment of the machine or in the technology employed to probe the measurand.
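As an illustration of the error-budget idea discussed in the review, independent standard uncertainty contributions can be combined in quadrature, GUM-style; the contribution names and values below are hypothetical placeholders, not figures from the paper:

```python
import numpy as np

# Illustrative error budget for an on-machine measurement (values in
# micrometres are assumptions for demonstration only).
budget_um = {
    "machine tool geometric errors": 4.0,
    "probing system": 1.5,
    "thermal drift of the structure": 3.0,
    "workpiece temperature deviation": 2.0,
}

# Combined standard uncertainty: root sum of squares of independent terms.
u_combined = np.sqrt(sum(u ** 2 for u in budget_um.values()))
U_expanded = 2.0 * u_combined   # coverage factor k = 2 (~95% level)
print(f"u_c = {u_combined:.1f} um, U (k=2) = {U_expanded:.1f} um")
```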
A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting
NASA Astrophysics Data System (ADS)
Kim, T.; Joo, K.; Seo, J.; Heo, J. H.
2016-12-01
Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach in time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied for their advantage in discovering relevant features in nonlinear relations among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, an ensemble of decision trees using a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 for Chungju dam in South Korea were used for modeling and forecasting. In order to evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied. The root mean squared error and mean absolute error of the two models were compared.
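A hedged sketch of this comparison, with a synthetic seasonal series standing in for the Chungju inflow data (the SARIMA orders and random forest settings below are illustrative, not the study's):

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.ensemble import RandomForestRegressor

# Synthetic monthly "inflow" with an annual cycle (made-up stand-in).
rng = np.random.default_rng(0)
t = np.arange(360)
y = 50 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, t.size)
train, test = y[:-12], y[-12:]

# Stochastic approach: seasonal ARIMA with assumed orders.
sarima = SARIMAX(train, order=(1, 0, 1),
                 seasonal_order=(1, 0, 1, 12)).fit(disp=False)
pred_sarima = sarima.forecast(steps=12)

# ML approach: random forest on 12 lagged values, iterated multi-step.
def make_lags(series, n_lags=12):
    X = np.column_stack([series[i:len(series) - n_lags + i]
                         for i in range(n_lags)])
    return X, series[n_lags:]

X_tr, y_tr = make_lags(train)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
history = list(train[-12:])
pred_rf = []
for _ in range(12):
    yhat = rf.predict([history[-12:]])[0]
    pred_rf.append(yhat)
    history.append(yhat)   # feed forecasts back in (multi-step ahead)
pred_rf = np.array(pred_rf)

rmse = lambda p: np.sqrt(np.mean((test - p) ** 2))
mae = lambda p: np.mean(np.abs(test - p))
print("SARIMA: RMSE", rmse(pred_sarima), "MAE", mae(pred_sarima))
print("RF    : RMSE", rmse(pred_rf), "MAE", mae(pred_rf))
```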
Lai, Min; Zhang, Xiaodong; Fang, Fengzhou
2017-12-01
Molecular dynamics simulations of nanometric cutting on monocrystalline germanium are conducted to investigate the subsurface deformation during and after nanometric cutting. The continuous random network model of amorphous germanium is established by molecular dynamics simulation, and its characteristic parameters are extracted for comparison with those of the machined deformed layer. The coordination number distribution and radial distribution function (RDF) show that the machined surface presents a similar amorphous state. The anisotropic subsurface deformation is studied by nanometric cutting on the (010), (101), and (111) crystal planes of germanium, respectively. The deformed structures are prone to extend along the <110> slip system, which leads to differences in the shape and thickness of the deformed layer for various directions and crystal planes. On the machined surface, a greater thickness of the subsurface deformed layer induces a greater surface recovery height. In order to obtain the critical thickness limit of the deformed layer on the machined surface of germanium, the optimal cutting direction on each crystal plane is suggested according to the relevance of nanometric cutting to nanoindentation.
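The coordination number analysis mentioned above can be illustrated with a minimal O(N^2) sketch (no periodic boundary conditions; the 3.0 Å cutoff is a placeholder near the first RDF minimum of germanium, not a value from the paper):

```python
import numpy as np

def coordination_numbers(positions, cutoff=3.0):
    """Coordination number of each atom: count of neighbors within a
    cutoff radius. positions: (N, 3) array of atomic coordinates in
    angstroms. Simple all-pairs sketch, no periodic boundaries."""
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :],
                       axis=-1)
    np.fill_diagonal(d, np.inf)        # exclude self-distances
    return (d < cutoff).sum(axis=1)

# Fourfold coordination suggests diamond-cubic order; deviations from
# four mark the amorphous deformed layer, e.g.:
#   counts = coordination_numbers(atom_xyz)
#   histogram = np.bincount(counts)
```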
Importance of polarity change in the electrical discharge machining
NASA Astrophysics Data System (ADS)
Schulze, H.-P.
2017-10-01
Polarity change in electrical discharge machining is still a problem and is often performed completely unmotivated or at random. The polarity must first be designated clearly, i.e., the anodic part must be unambiguously assigned to either the tool or the workpiece; normally, the polarity of the workpiece electrode is named. This paper shows which fundamental causes determine the structural behavior of the cathode and anode, and when it makes sense to change the polarity. The polarity change depends primarily on the materials used as cathode and anode; a distinction must be made between pure metals and complex materials. Secondarily, the polarity change is also affected by the process energy source (PES) and the supply line. The polarity change is mostly driven by the goal of maximizing removal on the workpiece while minimizing removal (wear) on the tool. A second factor that makes a polarity change necessary is the use of electrical discharge machining in combination with other machining methods, such as electrochemical machining (ECM).
Voice based gender classification using machine learning
NASA Astrophysics Data System (ADS)
Raahul, A.; Sapthagiri, R.; Pankaj, K.; Vijayarajan, V.
2017-11-01
Gender identification is one of the major problems in speech analysis today, tracing gender from acoustic data such as pitch, median frequency, etc. Machine learning gives promising results for classification problems across research domains, and several performance metrics exist to evaluate algorithms in a given area. We present a comparative model for evaluating five different machine learning algorithms on eight different metrics for gender classification from acoustic data. The aim is to identify gender with five different algorithms: Linear Discriminant Analysis (LDA), K-Nearest Neighbour (KNN), Classification and Regression Trees (CART), Random Forest (RF), and Support Vector Machine (SVM), on the basis of eight different metrics. The main criterion in evaluating any algorithm is its performance: the misclassification rate must be low in classification problems, which means the accuracy rate must be high. Location and gender of a person have become very crucial in economic markets in the form of AdSense. With this comparative model, we assess the different ML algorithms and find the best fit for gender classification of acoustic data.
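A hedged sketch of the five-algorithm comparison using scikit-learn; synthetic features stand in for the acoustic measures (pitch, median frequency, etc.), and only accuracy is shown among the eight metrics the paper uses:

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic two-class data standing in for acoustic features.
X, y = make_classification(n_samples=3000, n_features=20, random_state=0)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(),
    "CART": DecisionTreeClassifier(random_state=0),   # CART-style tree
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
    "SVM": SVC(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10, scoring="accuracy").mean()
    print(f"{name}: accuracy = {acc:.3f}")
```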