Subcutaneous ICD screening with the Boston Scientific ZOOM programmer versus a 12-lead ECG machine.
Chang, Shu C; Patton, Kristen K; Robinson, Melissa R; Poole, Jeanne E; Prutkin, Jordan M
2018-02-24
The subcutaneous implantable cardioverter-defibrillator (S-ICD) requires preimplant screening to ensure appropriate sensing and reduce the risk of inappropriate shocks. Screening can be performed using either an ICD programmer or a 12-lead electrocardiogram (ECG) machine. It is unclear whether differences in signal filtering and digital sampling change the screening success rate. Subjects were recruited if they had a transvenous single-lead ICD without pacing requirements or were candidates for a new ICD. Screening was performed using both a Boston Scientific ZOOM programmer (Marlborough, MA, USA) and a General Electric MAC 5000 ECG machine (Fairfield, CT, USA). A pass was defined as having at least one lead that fit within the screening template in both supine and sitting positions. A total of 69 subjects were included; 27 sets of ECG leads (7%) had differing screening results between the two machines. Of these sets, 22 (81%) passed using the ECG machine but failed using the programmer, and five (19%) passed using the programmer but failed using the ECG machine (P < 0.001). Four subjects (6%) passed screening using the ECG machine but failed using the programmer. No subject passed screening with the programmer but failed with the ECG machine. There can be occasional disagreement in S-ICD patient screening between an ICD programmer and an ECG machine; in this study, all discordant subjects passed with the ECG machine but failed with the programmer. On a per-lead basis, the ECG machine passed more leads. It is unknown what the inappropriate shock rate would be if an S-ICD were implanted. Clinical judgment should be used in borderline cases. © 2018 Wiley Periodicals, Inc.
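For concreteness, here is a minimal sketch of the pass rule described above, assuming a hypothetical per-lead data layout; the lead names and results are illustrative, not from the study:

```python
# Sketch of the per-subject S-ICD screening pass rule: a subject passes on a
# device if at least one ECG lead fits the screening template in BOTH the
# supine and sitting positions. Data layout is hypothetical.

def subject_passes(lead_results: dict) -> bool:
    """lead_results maps lead name -> {"supine": bool, "sitting": bool}."""
    return any(pos["supine"] and pos["sitting"] for pos in lead_results.values())

# Example: lead III fits in both postures on the ECG machine only.
ecg_machine = {
    "I":   {"supine": True,  "sitting": False},
    "II":  {"supine": False, "sitting": False},
    "III": {"supine": True,  "sitting": True},
}
programmer = {
    "I":   {"supine": True,  "sitting": False},
    "II":  {"supine": False, "sitting": False},
    "III": {"supine": True,  "sitting": False},
}
# A subject like this passes on the ECG machine but fails on the programmer,
# the discordance pattern reported in the study.
print(subject_passes(ecg_machine), subject_passes(programmer))  # True False
```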
Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z
2009-05-01
Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds with specific pharmacodynamic, pharmacokinetic, or toxicological properties based on their structure-derived structural and physicochemical properties. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.
Smart Screening System (S3) In Taconite Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daryoush Allaei; Angus Morison; David Tarnowski
2005-09-01
The conventional screening machines used in processing plants have had undesirably high noise and vibration levels. They have also had unsatisfactorily low screening efficiency, high energy consumption, high maintenance cost, low productivity, and poor worker safety. These conventional vibrating machines have been used in almost every processing plant. Most current material separation technology uses heavy and inefficient electric motors with an unbalanced rotating mass to generate the shaking. In addition to being excessively noisy, inefficient, and high-maintenance, these vibrating machines are often the bottleneck in the entire process. Furthermore, these motors, along with the vibrating machines and supporting structure, shake other machines and structures in the vicinity. The latter increases maintenance costs while reducing worker health and safety. The conventional vibrating fine screens at taconite processing plants have had the same problems as those listed above. This has resulted in lower screening efficiency, higher energy and maintenance costs, lower productivity, and worker-safety concerns. The focus of this work is the design of a high-performance screening machine suitable for taconite processing plants. SmartScreens™ technology uses miniaturized motors, based on smart materials, to generate the shaking. The underlying technologies are Energy Flow Control™ and Vibration Control by Confinement™. These concepts are used to direct energy flow and confine energy efficiently and effectively to the screen function. The SmartScreens™ technology addresses problems related to noise and vibration, screening efficiency, productivity, maintenance cost, and worker safety. Successful development of SmartScreens™ technology will bring drastic changes to the screening and physical separation industry. The final designs for key components of the SmartScreens™ have been developed. The key components include the smart motor and associated electronics, resonators, and supporting structural elements. It is shown that the smart motors have acceptable life and performance. Resonator (or motion amplifier) designs were selected based on the final system requirements and vibration characteristics. All the components for a fully functional prototype have been fabricated. The development program is on schedule. The last semi-annual report described the process of FE model validation and correlation with experimental data in terms of dynamic performance and predicted stresses. It also detailed efforts to make the supporting structure less important to system performance. Finally, an introduction to the dry application concept was presented. Since then, the design refinement phase has been completed. This has resulted in a Smart Screen design that meets performance targets both in the dry condition and with taconite slurry flow using PZT motors. Furthermore, this system was successfully demonstrated for the DOE and partner companies at the Coleraine Mineral Research Laboratory in Coleraine, Minnesota.
Smart Screening System (S3) In Taconite Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daryoush Allaei; Ryan Wartman; David Tarnowski
2006-03-01
The conventional screening machines used in processing plants have had undesirably high noise and vibration levels. They have also had unsatisfactorily low screening efficiency, high energy consumption, high maintenance cost, low productivity, and poor worker safety. These conventional vibrating machines have been used in almost every processing plant. Most current material separation technology uses heavy and inefficient electric motors with an unbalanced rotating mass to generate the shaking. In addition to being excessively noisy, inefficient, and high-maintenance, these vibrating machines are often the bottleneck in the entire process. Furthermore, these motors, along with the vibrating machines and supporting structure, shake other machines and structures in the vicinity. The latter increases maintenance costs while reducing worker health and safety. The conventional vibrating fine screens at taconite processing plants have had the same problems as those listed above. This has resulted in lower screening efficiency, higher energy and maintenance costs, lower productivity, and worker-safety concerns. The focus of this work is the design of a high-performance screening machine suitable for taconite processing plants. SmartScreens™ technology uses miniaturized motors, based on smart materials, to generate the shaking. The underlying technologies are Energy Flow Control™ and Vibration Control by Confinement™. These concepts are used to direct energy flow and confine energy efficiently and effectively to the screen function. The SmartScreens™ technology addresses problems related to noise and vibration, screening efficiency, productivity, maintenance cost, and worker safety. Successful development of SmartScreens™ technology will bring drastic changes to the screening and physical separation industry. The final designs for key components of the SmartScreens™ have been developed. The key components include the smart motor and associated electronics, resonators, and supporting structural elements. It is shown that the smart motors have acceptable life and performance. Resonator (or motion amplifier) designs were selected based on the final system requirements and vibration characteristics. All the components for a fully functional prototype have been fabricated. The development program is on schedule. The last semi-annual report described the completion of the design refinement phase. This phase resulted in a Smart Screen design that meets performance targets both in the dry condition and with taconite slurry flow using PZT motors. This system was successfully demonstrated for the DOE and partner companies at the Coleraine Mineral Research Laboratory in Coleraine, Minnesota. Since then, fabrication of the dry application prototype (incorporating an electromagnetic drive mechanism and a new deblinding concept) has been completed and successfully tested at QRDC's lab.
Virtual screening by a new Clustering-based Weighted Similarity Extreme Learning Machine approach
Kudisthalert, Wasu
2018-01-01
Machine learning techniques are becoming popular in virtual screening tasks. One powerful machine learning algorithm is the Extreme Learning Machine (ELM), which has been applied to many applications and has recently been applied to virtual screening. We propose the Weighted Similarity ELM (WS-ELM), which is based on a single-layer feed-forward neural network used in conjunction with 16 different similarity coefficients as activation functions in the hidden layer. It is known that the performance of conventional ELM is not robust due to random weight selection in the hidden layer. Thus, we propose a Clustering-based WS-ELM (CWS-ELM) that deterministically assigns weights by utilising clustering algorithms, i.e., k-means clustering and support vector clustering. The experiments were conducted on one of the most challenging datasets, the Maximum Unbiased Validation (MUV) dataset, which contains 17 activity classes carefully selected from PubChem. The proposed algorithms were then compared with other machine learning techniques such as support vector machine, random forest, and similarity searching. The results show that CWS-ELM in conjunction with support vector clustering yields the best performance when used together with the Sokal/Sneath(1) coefficient. Furthermore, the ECFP_6 fingerprint gives the best results in our framework compared to the other types of fingerprints, namely ECFP_4, FCFP_4, and FCFP_6. PMID:29652912
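A simplified sketch of the general scheme, assuming Tanimoto as the similarity activation and k-means centroids as the deterministic hidden-node anchors; this illustrates the idea, not the authors' exact formulation:

```python
# Simplified sketch of a similarity-activated ELM with cluster-derived hidden
# nodes, in the spirit of CWS-ELM (not the authors' exact formulation).
import numpy as np
from sklearn.cluster import KMeans

def tanimoto(a, b):
    """Continuous Tanimoto similarity between fingerprint vectors."""
    num = a @ b
    return num / (a @ a + b @ b - num + 1e-12)

def fit_cws_elm(X, y, n_hidden=32, seed=0):
    # Hidden-node "weights" are cluster centroids instead of random vectors.
    anchors = KMeans(n_clusters=n_hidden, n_init=10,
                     random_state=seed).fit(X).cluster_centers_
    H = np.array([[tanimoto(x, a) for a in anchors] for x in X])
    beta = np.linalg.pinv(H) @ y          # least-squares output weights
    return anchors, beta

def predict(X, anchors, beta):
    H = np.array([[tanimoto(x, a) for a in anchors] for x in X])
    return H @ beta

# Toy usage with random binary "fingerprints".
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 64)).astype(float)
y = rng.integers(0, 2, size=200).astype(float)
anchors, beta = fit_cws_elm(X, y)
print(predict(X[:5], anchors, beta))
```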
Tear fluid proteomics multimarkers for diabetic retinopathy screening
2013-01-01
Background The aim of the project was to develop a novel method for diabetic retinopathy screening based on the examination of tear fluid biomarker changes. In order to evaluate the usability of protein biomarkers for pre-screening purposes, several different approaches were used, including machine learning algorithms. Methods All persons involved in the study had diabetes. Diabetic retinopathy (DR) was diagnosed by capturing 7-field fundus images, evaluated by two independent ophthalmologists. 165 eyes were examined (from 119 patients); 55 were diagnosed healthy and 110 images showed signs of DR. Tear samples were taken from all eyes, and state-of-the-art nano-HPLC coupled ESI-MS/MS mass spectrometry protein identification was performed on all samples. The applicability of protein biomarkers was evaluated by six different optimally parameterized machine learning algorithms: Support Vector Machine, Recursive Partitioning, Random Forest, Naive Bayes, Logistic Regression, and K-Nearest Neighbor. Results Of the six machine learning algorithms investigated, Recursive Partitioning proved the most accurate, reaching 74% sensitivity and 48% specificity. Conclusions Protein biomarkers selected and classified with machine learning algorithms alone are at present not recommended for screening purposes because of low specificity and sensitivity values. This tool can potentially be used to improve the results of image processing methods as a complementary tool in automatic or semiautomatic systems. PMID:23919537
Evaluation of machine learning algorithms for improved risk assessment for Down's syndrome.
Koivu, Aki; Korpimäki, Teemu; Kivelä, Petri; Pahikkala, Tapio; Sairanen, Mikko
2018-05-04
Prenatal screening generates a great amount of data that is used for predicting the risk of various disorders. Prenatal risk assessment is based on multiple clinical variables, and overall performance is defined by how well the risk algorithm is optimized for the population in question. This article evaluates machine learning algorithms to improve the performance of first-trimester screening for Down syndrome. Machine learning algorithms pose an adaptive alternative for developing better risk assessment models using the existing clinical variables. Two real-world data sets were used to experiment with multiple classification algorithms. Implemented models were tested with a third real-world data set, and performance was compared to a predicate method, a commercial risk assessment software package. The best-performing deep neural network model gave an area under the curve of 0.96 and a detection rate of 78% at a 1% false-positive rate with the test data. The support vector machine model gave an area under the curve of 0.95 and a detection rate of 61% at a 1% false-positive rate with the same test data. When compared with the predicate method, the best support vector machine model was slightly inferior, but an optimized deep neural network model was able to give higher detection rates at the same false-positive rate, or a similar detection rate but with a markedly lower false-positive rate. This finding could further improve first-trimester screening for Down syndrome by using existing clinical variables and large training data derived from a specific population. Copyright © 2018 Elsevier Ltd. All rights reserved.
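The "detection rate at a 1% false-positive rate" operating point quoted above can be read off a ROC curve; a small sketch with synthetic scores (not the study's data):

```python
# Sketch: detection rate (sensitivity) at a fixed 1% false-positive rate,
# the operating point quoted above, computed from held-out scores.
import numpy as np
from sklearn.metrics import roc_curve

def detection_rate_at_fpr(y_true, scores, target_fpr=0.01):
    fpr, tpr, _ = roc_curve(y_true, scores)
    # Largest achievable TPR among thresholds whose FPR does not exceed target.
    return tpr[fpr <= target_fpr].max()

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 1000)
scores = y * 1.5 + rng.normal(size=1000)   # synthetic, better-than-random scores
print(f"DR @ 1% FPR: {detection_rate_at_fpr(y, scores):.2f}")
```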
The influence of negative training set size on machine learning-based virtual screening.
Kurczab, Rafał; Smusz, Sabina; Bojarski, Andrzej J
2014-01-01
The paper presents a thorough analysis of the influence of the number of negative training examples on the performance of machine learning methods. The impact of this rather neglected aspect of machine learning applications was examined for sets containing a fixed number of positive examples and a varying number of negative examples randomly selected from the ZINC database. An increase in the ratio of positive to negative training instances was found to greatly influence most of the investigated evaluation parameters of ML methods in simulated virtual screening experiments. In a majority of cases, substantial increases in precision and MCC were observed in conjunction with some decreases in hit recall. The analysis of the dynamics of those variations let us recommend an optimal composition of training data. The study was performed on several protein targets, 5 machine learning algorithms (SMO, Naïve Bayes, IBk, J48, and Random Forest) and 2 types of molecular fingerprints (MACCS and CDK FP). The most effective classification was provided by the combination of CDK FP with the SMO or Random Forest algorithms. The Naïve Bayes models appeared to be hardly sensitive to changes in the number of negative instances in the training set. In conclusion, the ratio of positive to negative training instances should be taken into account during the preparation of machine learning experiments, as it might significantly influence the performance of a particular classifier. What is more, the optimization of negative training set size can be applied as a boosting-like approach in machine learning-based virtual screening.
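The experimental design, fixing the positives and growing the negative set while tracking precision, recall, and MCC, can be sketched as follows (synthetic stand-in features, not the ZINC data):

```python
# Sketch of the design described above: fix the positives, vary the number of
# randomly drawn negatives, and track precision/recall/MCC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import matthews_corrcoef, precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
pos = rng.normal(0.6, 1.0, size=(300, 32))          # stand-ins for actives
neg_pool = rng.normal(0.0, 1.0, size=(30000, 32))   # stand-ins for ZINC decoys

for n_neg in (300, 1500, 7500):
    X = np.vstack([pos, neg_pool[:n_neg]])
    y = np.array([1] * len(pos) + [0] * n_neg)
    Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
    pred = RandomForestClassifier(random_state=0).fit(Xtr, ytr).predict(Xte)
    print(n_neg, f"P={precision_score(yte, pred):.2f}",
          f"R={recall_score(yte, pred):.2f}",
          f"MCC={matthews_corrcoef(yte, pred):.2f}")
```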
Fernandez, Michael; Boyd, Peter G; Daff, Thomas D; Aghaji, Mohammad Zein; Woo, Tom K
2014-09-04
In this work, we have developed quantitative structure-property relationship (QSPR) models using advanced machine learning algorithms that can rapidly and accurately recognize high-performing metal organic framework (MOF) materials for CO2 capture. More specifically, QSPR classifiers have been developed that can, in a fraction of a second, identify candidate MOFs with enhanced CO2 adsorption capacity (>1 mmol/g at 0.15 bar and >4 mmol/g at 1 bar). The models were tested on a large set of 292,050 MOFs that were not part of the training set. The QSPR classifier could recover 945 of the top 1,000 MOFs in the test set while flagging only 10% of the whole library for compute-intensive screening. Thus, using the machine learning classifiers as part of a high-throughput screening protocol would result in an order-of-magnitude reduction in compute time and allow intractably large structure libraries and search spaces to be screened.
Method and system for rendering and interacting with an adaptable computing environment
Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM
2012-06-12
An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.
Patterson, Olga V; Forbush, Tyler B; Saini, Sameer D; Moser, Stephanie E; DuVall, Scott L
2015-01-01
In order to measure the level of utilization of colonoscopy procedures, identifying the primary indication for the procedure is required. Colonoscopies may be utilized not only for screening but also for diagnostic or therapeutic purposes. To determine whether a colonoscopy was performed for screening, we created a natural language processing system to identify colonoscopy reports in the electronic medical record system and extract indications for the procedure. A rule-based model and three machine-learning models were created using 2,000 manually annotated clinical notes of patients cared for in the Department of Veterans Affairs. The performance of the models was measured and compared. Analysis of the models on a test set of 1,000 documents indicates that the rule-based system's performance stays fairly constant between the training and testing sets, whereas the machine learning model without feature selection showed a significant decrease in performance. Therefore, a rule-based classification system appears to be more robust than a machine-learning system in cases where no feature selection is performed.
Korkmaz, Selcuk; Zararsiz, Gokmen; Goksuluk, Dincer
2015-01-01
Virtual screening is an important step in the early phase of the drug discovery process. Since there are thousands of compounds, this step should be both fast and effective in order to distinguish drug-like from nondrug-like molecules. Statistical machine learning methods are widely used in drug discovery studies for classification purposes. Here, we aim to develop a new tool that can classify molecules as drug-like or nondrug-like based on various machine learning methods, including discriminant, tree-based, kernel-based, ensemble, and other algorithms. To construct this tool, the performances of twenty-three different machine learning algorithms were first compared by ten different measures; then, the ten best-performing algorithms were selected based on principal component and hierarchical cluster analysis results. Besides classification, this application also has the ability to create heat maps and dendrograms for visual inspection of the molecules through hierarchical cluster analysis. Moreover, users can connect to the PubChem database to download molecular information and to create two-dimensional structures of compounds. This application is freely available through www.biosoft.hacettepe.edu.tr/MLViS/. PMID:25928885
Performance of machine-learning scoring functions in structure-based virtual screening.
Wójcikowski, Maciej; Ballester, Pedro J; Siedlecki, Pawel
2017-04-25
Classical scoring functions have reached a plateau in their performance in virtual screening and binding affinity prediction. Recently, machine-learning scoring functions trained on protein-ligand complexes have shown great promise in small tailored studies. They have also raised controversy, specifically concerning model overfitting and applicability to novel targets. Here we provide a new ready-to-use scoring function (RF-Score-VS) trained on 15,426 active and 893,897 inactive molecules docked to a set of 102 targets. We use the full DUD-E data sets along with three docking tools, five classical and three machine-learning scoring functions for model building and performance assessment. Our results show RF-Score-VS can substantially improve virtual screening performance: the RF-Score-VS top 1% provides a 55.6% hit rate, whereas that of Vina is only 16.2% (for smaller fractions the difference is even more encouraging: RF-Score-VS achieves an 88.6% hit rate in the top 0.1%, versus 27.5% for Vina). In addition, RF-Score-VS provides much better prediction of measured binding affinity than Vina (Pearson correlation of 0.56 and -0.18, respectively). Lastly, we tested RF-Score-VS on an independent test set from the DEKOIS benchmark and observed comparable results. We provide the full data sets to facilitate further research in this area (http://github.com/oddt/rfscorevs) as well as the ready-to-use RF-Score-VS (http://github.com/oddt/rfscorevs_binary).
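The top-k% hit rate used above is a simple enrichment metric; a minimal sketch with synthetic scores:

```python
# Sketch of the top-k% hit-rate metric: rank the screened library by
# predicted score and count the fraction of actives in the top slice.
import numpy as np

def hit_rate_at(scores, labels, fraction=0.01):
    order = np.argsort(scores)[::-1]              # best-scored first
    top = order[: max(1, int(len(scores) * fraction))]
    return labels[top].mean()                     # fraction of actives in slice

rng = np.random.default_rng(0)
labels = (rng.random(100_000) < 0.02).astype(int)      # ~2% actives
scores = labels * 1.0 + rng.normal(size=100_000)       # informative scores
print(f"hit rate @ top 1%: {hit_rate_at(scores, labels, 0.01):.1%}")
```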
SWIFT-Review: a text-mining workbench for systematic review.
Howard, Brian E; Phillips, Jason; Miller, Kyle; Tandon, Arpit; Mav, Deepak; Shah, Mihir R; Holmgren, Stephanie; Pelch, Katherine E; Walker, Vickie; Rooney, Andrew A; Macleod, Malcolm; Shah, Ruchir R; Thayer, Kristina
2016-05-23
There is growing interest in using machine learning approaches to priority-rank studies and reduce the human burden in screening literature when conducting systematic reviews. In addition, identifying addressable questions during the problem formulation phase of a systematic review can be challenging, especially for topics having a large literature base. Here, we assess the performance of the SWIFT-Review priority ranking algorithm for identifying studies relevant to a given research question. We also explore the use of SWIFT-Review during problem formulation to identify, categorize, and visualize research areas that are data rich/data poor within a large literature corpus. Twenty case studies, including 15 public data sets, representing a range of complexity and size, were used to assess the priority ranking performance of SWIFT-Review. For each study, seed sets of manually annotated included and excluded titles and abstracts were used for machine training. The remaining references were then ranked for relevance using an algorithm that considers term frequency and latent Dirichlet allocation (LDA) topic modeling. This ranking was evaluated with respect to (1) the number of studies screened in order to identify 95% of known relevant studies and (2) the "Work Saved over Sampling" (WSS) performance metric. To assess SWIFT-Review for use in problem formulation, PubMed literature search results for 171 chemicals implicated as EDCs were uploaded into SWIFT-Review (264,588 studies) and categorized based on evidence stream and health outcome. Patterns of search results were surveyed and visualized using a variety of interactive graphics. Compared with the reported performance of other tools using the same datasets, the SWIFT-Review ranking procedure obtained the highest scores on 11 of the 15 public datasets. Overall, these results suggest that using machine learning to triage documents for screening has the potential to save, on average, more than 50% of the screening effort ordinarily required when using un-ordered document lists. In addition, the tagging and annotation capabilities of SWIFT-Review can be useful during the activities of scoping and problem formulation. Text-mining and machine learning software such as SWIFT-Review can be valuable tools to reduce the human screening burden and assist in problem formulation.
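The WSS metric mentioned above has a standard definition at 95% recall; a small sketch of computing it from a ranked screening list (synthetic ranking):

```python
# Sketch of "Work Saved over Sampling" at 95% recall: the fraction of
# screening effort saved, relative to random order, when reading down the
# ranked list until 95% of the relevant studies have been found.
import numpy as np

def wss_at_recall(ranked_relevance, recall=0.95):
    """ranked_relevance: 1/0 array in ranked order (most relevant-looking first)."""
    n = len(ranked_relevance)
    n_rel = ranked_relevance.sum()
    hits = np.cumsum(ranked_relevance)
    # Number of documents that must be screened to reach the recall target.
    n_screened = int(np.argmax(hits >= np.ceil(recall * n_rel))) + 1
    return (n - n_screened) / n - (1 - recall)

# 5 relevant studies, mostly ranked near the top of a 20-document list.
ranked = np.array([1, 1, 1, 0, 1, 0, 1] + [0] * 13)
print(f"WSS@95: {wss_at_recall(ranked):.2f}")   # 0.60
```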
Yu, Wei; Clyne, Melinda; Dolan, Siobhan M; Yesupriya, Ajay; Wulf, Anja; Liu, Tiebin; Khoury, Muin J; Gwinn, Marta
2008-04-22
Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge.
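A hedged sketch of z-score-based keyword weighting for literature screening; this uses an ordinary two-proportion z-score between relevant and irrelevant abstracts, which is one plausible reading of the "two-way z score method", not necessarily the paper's exact procedure:

```python
# Hedged sketch: rank keywords by a two-proportion z-score between relevant
# and irrelevant abstracts, then use the top-ranked terms as weighted SVM
# features (illustrative reading, not the paper's exact procedure).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

def keyword_zscores(pos_docs, neg_docs, min_df=1):
    vec = CountVectorizer(binary=True, min_df=min_df)
    X = vec.fit_transform(pos_docs + neg_docs).toarray()
    n_pos, n_neg = len(pos_docs), len(neg_docs)
    p1 = X[:n_pos].mean(axis=0)               # keyword rate in relevant docs
    p2 = X[n_pos:].mean(axis=0)               # keyword rate in irrelevant docs
    p = X.mean(axis=0)                        # pooled keyword rate
    se = np.sqrt(p * (1 - p) * (1 / n_pos + 1 / n_neg)) + 1e-12
    return dict(zip(vec.get_feature_names_out(), (p1 - p2) / se))

z = keyword_zscores(
    ["polymorphism associated with risk", "genotype association study"],
    ["protein expression assay", "cell culture imaging"])
print(sorted(z.items(), key=lambda kv: -kv[1])[:3])
```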
Adapting human-machine interfaces to user performance.
Danziger, Zachary; Fishbach, Alon; Mussa-Ivaldi, Ferdinando A
2008-01-01
The goal of this study was to create and examine machine learning algorithms that adapt in a controlled and cadenced way to foster a harmonious learning environment between the user of a human-machine interface and the controlled device. In this experiment, subjects' high-dimensional finger motions remotely controlled the joint angles of a simulated planar 2-link arm, which was used to hit targets on a computer screen. Subjects were required to move the cursor at the endpoint of the simulated arm.
Screening Electronic Health Record-Related Patient Safety Reports Using Machine Learning.
Marella, William M; Sparnon, Erin; Finley, Edward
2017-03-01
The objective of this study was to develop a semiautomated approach to screening cases that describe hazards associated with the electronic health record (EHR) from a mandatory, population-based patient safety reporting system. Potentially relevant cases were identified through a query of the Pennsylvania Patient Safety Reporting System. A random sample of cases was manually screened for relevance and divided into training, testing, and validation data sets to develop a machine learning model. This model was used to automate screening of the remaining potentially relevant cases. Of the 4 algorithms tested, a naive Bayes kernel performed best, with an area under the receiver operating characteristic curve of 0.927 ± 0.023, accuracy of 0.855 ± 0.033, and F score of 0.877 ± 0.027. The machine learning model and text mining approach described here are useful tools for identifying and analyzing adverse event and near-miss reports. Although reporting systems are beginning to incorporate structured fields on health information technology and the EHR, these methods can identify related events that reporters classify in other ways. These methods can facilitate analysis of legacy safety reports by retrieving health information technology-related and EHR-related events from databases without fields and controlled values focused on this subject, and by distinguishing them from reports in which the EHR is mentioned only in passing. Machine learning and text mining are useful additions to the patient safety toolkit and can be used to semiautomate screening and analysis of unstructured text in safety reports from frontline staff.
Interpreting linear support vector machine models with heat map molecule coloring
2011-01-01
Background Model-based virtual screening plays an important role in the early drug discovery stage. The outcomes of high-throughput screenings are a valuable source for machine learning algorithms to infer such models. Besides strong performance, the interpretability of a machine learning model is a desired property to guide the optimization of a compound in later drug discovery stages. Linear support vector machines have been shown to deliver convincing performance on large-scale data sets. The goal of this study is to present a heat map molecule coloring technique to interpret linear support vector machine models. Based on the weights of a linear model, the visualization approach colors each atom and bond of a compound according to its importance for activity. Results We evaluated our approach on a toxicity data set, a chromosome aberration data set, and the maximum unbiased validation data sets. The experiments show that our method sensibly visualizes structure-property and structure-activity relationships of a linear support vector machine model. The coloring of ligands in the binding pocket of several crystal structures of a maximum unbiased validation data set target indicates that our approach helps to determine the correct ligand orientation in the binding pocket. Additionally, the heat map coloring enables the identification of substructures important for the binding of an inhibitor. Conclusions In combination with heat map coloring, linear support vector machine models can help to guide the modification of a compound in later stages of drug discovery. In particular, substructures identified as important by our method might be a starting point for optimization of a lead compound. The heat map coloring should be considered complementary to structure-based modeling approaches. As such, it helps to achieve a better understanding of the binding mode of an inhibitor. PMID:21439031
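The core idea, distributing a linear model's fingerprint-bit weights over the atoms that generated each bit, can be sketched with RDKit Morgan fingerprints; the SVM weights here are random stand-ins, and the per-atom weight splitting is a simplification of the paper's scheme:

```python
# Hedged sketch of the heat-map idea: distribute the weights of a linear SVM
# over the atoms that generated each fingerprint bit. Uses RDKit Morgan
# fingerprints; the model weights here are random stand-ins.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem

def atom_importances(mol, weights, n_bits=2048, radius=2):
    bit_info = {}
    AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits,
                                          bitInfo=bit_info)
    scores = np.zeros(mol.GetNumAtoms())
    for bit, envs in bit_info.items():
        for center, rad in envs:
            # Atoms covered by this bit's circular environment share its weight.
            if rad == 0:
                atoms = {center}
            else:
                bonds = Chem.FindAtomEnvironmentOfRadiusN(mol, rad, center)
                atoms = {a for b in bonds
                         for a in (mol.GetBondWithIdx(b).GetBeginAtomIdx(),
                                   mol.GetBondWithIdx(b).GetEndAtomIdx())}
            for a in atoms:
                scores[a] += weights[bit] / len(atoms)
    return scores  # color atoms by sign/magnitude of their score

mol = Chem.MolFromSmiles("CCOc1ccccc1C(=O)O")
w = np.random.default_rng(0).normal(size=2048)   # stand-in for SVM weights
print(np.round(atom_importances(mol, w), 2))
```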
Posture and performance: sitting vs. standing for security screening.
Drury, C G; Hsiao, Y L; Joseph, C; Joshi, S; Lapp, J; Pennathur, P R
2008-03-01
A classification of the literature on the effects of workplace posture on performance of different mental tasks showed few consistent patterns. A parallel classification of the complementary effect of performance on postural variables gave similar results. Because of a lack of data for signal detection tasks, an experiment was performed using 12 experienced security operators performing an X-ray baggage-screening task with three different workplace arrangements. The current workplace, sitting on a high chair viewing a screen placed on top of the X-ray machine, was compared to a standing workplace and a conventional desk-sitting workplace. No performance effects of workplace posture were found, although the experiment was able to measure performance effects of learning and body part discomfort effects of workplace posture. There are implications for the classification of posture and performance and for the justification of ergonomics improvements based on performance increases.
NASA Astrophysics Data System (ADS)
Li, Shaoxin; Zhang, Yanjiao; Xu, Junfa; Li, Linfang; Zeng, Qiuyao; Lin, Lin; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Liu, Songhao
2014-09-01
This study presents a noninvasive prostate cancer screening method using serum surface-enhanced Raman scattering (SERS) and support vector machine (SVM) techniques on peripheral blood samples. SERS measurements were performed with silver nanoparticles using serum samples from 93 prostate cancer patients and 68 healthy volunteers. Three types of kernel functions, including linear, polynomial, and Gaussian radial basis function (RBF), were employed to build SVM diagnostic models for classifying the measured SERS spectra. To comparatively evaluate the performance of the SVM classification models, the standard multivariate statistical analysis method of principal component analysis (PCA) was also applied to classify the same datasets. The results show that the RBF-kernel SVM diagnostic model achieved a diagnostic accuracy of 98.1%, superior to the 91.3% obtained with the PCA method. The receiver operating characteristic curves of the diagnostic models further confirm these results. This study demonstrates that label-free serum SERS analysis combined with an SVM diagnostic algorithm has great potential for noninvasive prostate cancer screening.
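A minimal sketch of the kernel comparison described above, run on synthetic "spectra" (the real study used measured SERS data and reports accuracy on held-out samples):

```python
# Sketch: SVM classifiers with different kernels versus a PCA-based baseline
# on labeled spectra (synthetic here; 160 spectra x 600 Raman shifts).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(160, 600))
y = rng.integers(0, 2, 160)
X[y == 1, 250:260] += 0.8          # synthetic class-specific SERS band

for name, clf in {
    "SVM-linear": SVC(kernel="linear"),
    "SVM-poly":   SVC(kernel="poly", degree=3),
    "SVM-RBF":    SVC(kernel="rbf"),
    "PCA+LDA":    make_pipeline(PCA(n_components=10),
                                LinearDiscriminantAnalysis()),
}.items():
    pipe = make_pipeline(StandardScaler(), clf)
    print(name, f"{cross_val_score(pipe, X, y, cv=5).mean():.2f}")
```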
12. BUILDING 621, INTERIOR, GROUND FLOOR, LOOKING NORTHWEST AT SCREENING ...
12. BUILDING 621, INTERIOR, GROUND FLOOR, LOOKING NORTHWEST AT SCREENING MACHINE THAT REMOVES SHELL FRAGMENTS. METALLIC DUST REMOVED BY MAGNETIC SEPARATOR UNDERNEATH SCREEN. SAWDUST IS RETURNED TO SAWDUST HOPPER BY ELEVATOR. HOODS OVER SCREENING MACHINE AT WORKBENCH REMOVE FINE SAWDUST. - Picatinny Arsenal, 600 Area, Test Areas District, State Route 15 near I-80, Dover, Morris County, NJ
Industrial Inspection with Open Eyes: Advance with Machine Vision Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zheng; Ukida, H.; Niel, Kurt
Machine vision systems have evolved significantly with technology advances to tackle the challenges from modern manufacturing industry. A wide range of industrial inspection applications for quality control are benefiting from visual information captured by different types of cameras variously configured in a machine vision system. This chapter screens the state of the art in machine vision technologies in the light of hardware, software tools, and major algorithm advances for industrial inspection. Inspection beyond the visual spectrum offers a significant complement to visual inspection. The combination of multiple technologies makes it possible for inspection to achieve better performance and efficiency in varied applications. The diversity of the applications demonstrates the great potential of machine vision systems for industry.
Type 2 Diabetes Screening Test by Means of a Pulse Oximeter.
Moreno, Enrique Monte; Lujan, Maria Jose Anyo; Rusinol, Montse Torrres; Fernandez, Paqui Juarez; Manrique, Pilar Nunez; Trivino, Cristina Aragon; Miquel, Magda Pedrosa; Rodriguez, Marife Alvarez; Burguillos, M Jose Gonzalez
2017-02-01
In this paper, we propose a method for screening for the presence of type 2 diabetes by means of the signal obtained from a pulse oximeter. The screening system consists of two parts: the first analyzes the signal obtained from the pulse oximeter, and the second is a machine-learning module. A front end extracts a set of features from the pulse oximeter signal; these features are based on physiological considerations. The set of features was the input to a machine-learning algorithm that determined the class of the input sample, i.e., whether the subject had diabetes or not. The machine-learning algorithms were random forests and gradient boosting, with linear discriminant analysis as a benchmark. The system was tested on a database of [Formula: see text] subjects (two samples per subject) collected from five community health centers. The mean receiver operating characteristic area found was [Formula: see text]% (median value [Formula: see text]% and range [Formula: see text]%), with a specificity of [Formula: see text]% at a threshold that gave a sensitivity of [Formula: see text]%. We present a screening method for detecting diabetes that has a performance comparable to the glycated haemoglobin (HbA1c) test, does not require blood extraction, and yields results in less than 5 min.
Bone, Daniel; Bishop, Somer; Black, Matthew P.; Goodwin, Matthew S.; Lord, Catherine; Narayanan, Shrikanth S.
2016-01-01
Background Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely-used ASD screening and diagnostic tools. Methods The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders (DD), split at age 10. Algorithms were created via a robust ML classifier, support vector machine (SVM), while targeting best-estimate clinical diagnosis of ASD vs. non-ASD. Parameter settings were tuned in multiple levels of cross-validation. Results The created algorithms were more effective (higher performing) than current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. Conclusions ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. PMID:27090613
Smart material screening machines using smart materials and controls
NASA Astrophysics Data System (ADS)
Allaei, Daryoush; Corradi, Gary; Waigand, Al
2002-07-01
The objective of this product is to address the specific need for improvements in the efficiency and effectiveness of physical separation technologies in the screening areas. Currently, the mining industry uses approximately 33 billion kW-hr per year of electrical energy for physical separations, costing 1.65 billion dollars at $0.05 per kW-hr. Even though screening and size separations are not the single most energy-intensive process in the mining industry, they are often the major bottleneck in the whole process. Improvements in this area offer tremendous potential in both energy savings and production improvements. Additionally, the vibrating screens used in mining processing plants are the most costly areas from maintenance and worker health and safety points of view. The goal of this product is to reduce energy use in the screening and total processing areas. This goal is accomplished by developing an innovative screening machine based on smart materials and smart actuators, namely a smart screen that uses an advanced sensory system to continuously monitor the screening process and make appropriate adjustments to improve production. The theory behind the development of Smart Screen technology is based on two key technologies, namely smart actuators and smart Energy Flow Control™ (EFC™) strategies, developed initially for military applications. Smart Screen technology controls the flow of vibration energy and confines it to the screen rather than shaking much of the mass that makes up a conventional vibratory screening machine. Consequently, Smart Screens eliminate or downsize many of the structural components associated with conventional vibratory screening machines. As a result, the surface area of the screen increases for a given envelope. This increase in usable screening surface area extends the life of the screens, reduces required maintenance by reducing the frequency of screen change-outs, and improves throughput and productivity.
Cao, Ran; Pu, Xianjie; Du, Xinyu; Yang, Wei; Wang, Jiaona; Guo, Hengyu; Zhao, Shuyu; Yuan, Zuqing; Zhang, Chi; Li, Congju; Wang, Zhong Lin
2018-05-22
Multifunctional electronic textiles (E-textiles) with embedded electric circuits hold great application prospects for future wearable electronics. However, most E-textiles still have critical challenges, including air permeability, satisfactory washability, and mass fabrication. In this work, we fabricate a washable E-textile that addresses all of the concerns and shows its application as a self-powered triboelectric gesture textile for intelligent human-machine interfacing. Utilizing conductive carbon nanotubes (CNTs) and screen-printing technology, this kind of E-textile embraces high conductivity (0.2 kΩ/sq), high air permeability (88.2 mm/s), and can be manufactured on common fabric at large scales. Due to the advantage of the interaction between the CNTs and the fabrics, the electrode shows excellent stability under harsh mechanical deformation and even after being washed. Moreover, based on a single-electrode mode triboelectric nanogenerator and electrode pattern design, our E-textile exhibits highly sensitive touch/gesture sensing performance and has potential applications for human-machine interfacing.
ERIC Educational Resources Information Center
WENDT, PAUL R.; AND OTHERS
A BRANCHING TEACHING-MACHINE PROGRAM WAS DEVELOPED TO TEACH FRESHMEN TO LOCATE MATERIALS WITHOUT THE HELP OF A LIBRARIAN. THE STUDENT WAS SEATED IN FRONT OF A CONSOLE IN A DARKENED, QUIET, AIR-CONDITIONED ROOM. USING A KEYBOARD, THE STUDENT WAS ABLE TO CALL UP ON A SCREEN ANY ONE OF 150 SLIDES. PICTORIAL AND PERFORMANCE FRAMES WERE DEVELOPED TO…
1001 Ways to run AutoDock Vina for virtual screening
NASA Astrophysics Data System (ADS)
Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.
2016-03-01
Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.
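Point (1) above, adding a level of parallelization on a multi-core machine, can be sketched by launching single-core Vina jobs concurrently; the receptor/ligand file names and search-box parameters below are placeholders:

```python
# Sketch: run several single-ligand AutoDock Vina jobs concurrently on one
# multi-core machine, fixing the seed for reproducibility. File names and
# box parameters are placeholders.
import glob
import subprocess
from concurrent.futures import ProcessPoolExecutor

def dock(ligand):
    out = ligand.replace(".pdbqt", "_out.pdbqt")
    subprocess.run([
        "vina", "--receptor", "receptor.pdbqt", "--ligand", ligand,
        "--out", out, "--seed", "42", "--exhaustiveness", "8",
        "--center_x", "0", "--center_y", "0", "--center_z", "0",
        "--size_x", "20", "--size_y", "20", "--size_z", "20",
        "--cpu", "1",                      # one core per job; parallelism below
    ], check=True)
    return out

if __name__ == "__main__":
    ligands = sorted(glob.glob("ligands/*.pdbqt"))
    with ProcessPoolExecutor(max_workers=8) as pool:  # 8 jobs in flight
        for result in pool.map(dock, ligands):
            print("docked:", result)
```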
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Nina; Zhou, Nan; Fridley, David
2012-03-01
This report presents a technical review of international minimum energy performance standards (MEPS), voluntary and mandatory energy efficiency labels, and test procedures for five products being considered for new or revised MEPS in China: copy machines, external power supplies, LED displays, residential gas cooktops, and flat-screen televisions. For each product, it presents an overview of the scope of existing international standards and labeling programs, the energy values and energy performance metrics used, and a description and detailed summary table of the criteria and procedures in the major test standards.
Virtual Environment Training: Auxiliary Machinery Room (AMR) Watchstation Trainer.
ERIC Educational Resources Information Center
Hriber, Dennis C.; And Others
1993-01-01
Describes a project implemented at Newport News Shipbuilding that used Virtual Environment Training to improve the performance of submarine crewmen. Highlights include development of the Auxiliary Machinery Room (AMR) Watchstation Trainer; Digital Video Interactive (DVI); screen layout; test design and evaluation; user reactions; authoring language;…
Optimization of temperature field of tobacco heat shrink machine
NASA Astrophysics Data System (ADS)
Yang, Xudong; Yang, Hai; Sun, Dong; Xu, Mingyang
2018-06-01
In a company's current heat-shrink machine, the film does not shrink compactly and the temperature is uneven, resulting in poor surface quality of the shrunk film. To solve this problem, the temperature field is simulated and optimized using the k-epsilon turbulence model and the MRF model in Fluent. The simulation results show that, after a mesh screen structure is installed at the suction inlet of the centrifugal fan, the suction resistance of the fan increases and the eddy-current intensity caused by the high-speed rotation of the fan is reduced, so that the internal temperature field of the heat-shrink machine becomes more uniform.
Pointright: a system to redirect mouse and keyboard control among multiple machines
Johanson, Bradley E [Palo Alto, CA; Winograd, Terry A [Stanford, CA; Hutchins, Gregory M [Mountain View, CA
2008-09-30
The present invention provides a software system, PointRight, that allows for smooth and effortless control of pointing and input devices among multiple displays. With PointRight, a single free-floating mouse and keyboard can be used to control multiple screens. When the cursor reaches the edge of a screen it seamlessly moves to the adjacent screen and keyboard control is simultaneously redirected to the appropriate machine. Laptops may also redirect their keyboard and pointing device, and multiple pointers are supported simultaneously. The system automatically reconfigures itself as displays go on, go off, or change the machine they display.
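A toy sketch of the screen-topology idea, with an illustrative two-display layout; the real system also handles dynamic reconfiguration and network event forwarding, which are omitted here:

```python
# Toy sketch of the screen-topology idea behind PointRight: when the cursor
# crosses a screen edge, input control is redirected to the machine that owns
# the adjacent display. Topology and names are illustrative.
EDGES = {  # (screen, edge) -> (adjacent screen, entry x in its coordinates)
    ("left-display", "right"): ("right-display", 0),
    ("right-display", "left"): ("left-display", 1919),
}
WIDTH = 1920

def route(screen, x, y):
    """Return (screen, x, y) after applying edge crossings."""
    if x >= WIDTH and (screen, "right") in EDGES:
        screen, x = EDGES[(screen, "right")]
    elif x < 0 and (screen, "left") in EDGES:
        screen, x = EDGES[(screen, "left")]
    return screen, x, y  # keyboard events would now go to `screen`'s host

print(route("left-display", 1920, 540))   # -> ('right-display', 0, 540)
```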
Kagawa, Rina; Kawazoe, Yoshimasa; Ida, Yusuke; Shinohara, Emiko; Tanaka, Katsuya; Imai, Takeshi; Ohe, Kazuhiko
2017-07-01
Phenotyping is an automated technique that can be used to distinguish patients based on electronic health records. Because phenotyping can improve the quality of medical care and advance type 2 diabetes mellitus (T2DM) research, the demand for T2DM phenotyping has been increasing. Some existing phenotyping algorithms are not sufficiently accurate for screening or for identifying clinical research subjects. We propose a practical phenotyping framework using both expert knowledge and a machine learning approach to develop 2 phenotyping algorithms: one for screening and one for identifying research subjects. We employ expert knowledge as rules to exclude obvious control patients and machine learning to increase accuracy for complicated patients. We developed phenotyping algorithms on the basis of our framework and performed binary classification to determine whether a patient has T2DM. To facilitate development of practical phenotyping algorithms, this study introduces new evaluation metrics: area under the precision-sensitivity curve (AUPS) with a high sensitivity and AUPS with a high positive predictive value. The proposed phenotyping algorithms based on our framework show higher performance than baseline algorithms. Our proposed framework can be used to develop 2 types of phenotyping algorithms depending on the tuning approach: one for screening, the other for identifying research subjects. We have developed a novel phenotyping framework that can be easily implemented on the basis of proper evaluation metrics that are in accordance with users' objectives. The phenotyping algorithms based on our framework are useful for the extraction of T2DM patients in retrospective studies.
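A minimal sketch of the hybrid design, a rule that excludes obvious controls followed by a classifier for the rest, scored by the area under the precision-sensitivity curve; the feature names, rule threshold, and synthetic data are illustrative, not from the paper:

```python
# Sketch: a knowledge-based rule first excludes obvious controls, and a
# classifier scores the remaining patients (evaluated in-sample for brevity).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import auc, precision_recall_curve

rng = np.random.default_rng(0)
n = 2000
hba1c = rng.normal(5.8, 1.0, n)                  # illustrative lab feature
other = rng.normal(size=(n, 5))                  # other EHR-derived features
y = ((hba1c > 6.5) | (rng.random(n) < 0.05)).astype(int)

# Rule: patients with consistently normal HbA1c are labeled controls outright
# (score 0) and never reach the classifier.
rule_pass = hba1c > 5.7
X = np.column_stack([hba1c, other])
clf = GradientBoostingClassifier().fit(X[rule_pass], y[rule_pass])
scores = np.zeros(n)
scores[rule_pass] = clf.predict_proba(X[rule_pass])[:, 1]

# Area under the precision-sensitivity (precision-recall) curve, the basis
# of the AUPS metrics proposed in the paper.
prec, rec, _ = precision_recall_curve(y, scores)
print(f"AUPS: {auc(rec, prec):.2f}")
```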
Influence of grid bar shape on field cleaner performance - Screening tests
USDA-ARS?s Scientific Manuscript database
Extractor type cleaners are used on cotton strippers and in the seed cotton cleaning machinery in the ginning process to remove large foreign material such as burrs and sticks. Previous research on the development of extractor type cleaners focused on machine design and operating parameters that max...
ERIC Educational Resources Information Center
Shawsheen Valley Regional Vocational-Technical High School, Billerica, MA.
This manual contains a work sample intended to assess a handicapped student's interest in, and to screen interested students into, a training program in basic machine shop I. (The course is based on the entry level of the drill press operator.) Section 1 describes the assessment, correlates the work performed and worker traits required for…
NASA Astrophysics Data System (ADS)
Yan, X. Y.; Chen, G. X.; Liu, J. W.
2018-03-01
A superhydrophobic copper surface with a micro-nanocomposite structure has been successfully fabricated by employing a silk-screen printing aided electrochemical machining method. First, silk-screen printing was used to form a columnar dot-array mask; the microcolumn array was then fabricated by electrochemical machining (ECM). In this study, drop contact angles were measured and scanning electron microscopy (SEM) was used to characterize the surface of the workpiece. The experimental results show that the micro-nanocomposite structure with a cylindrical array can be successfully fabricated on the metal surface, and the maximum contact angle was 151° when a fluoroalkylsilane ethanol solution was used to modify the machined surface.
Virtual screening of inorganic materials synthesis parameters with deep learning
NASA Astrophysics Data System (ADS)
Kim, Edward; Huang, Kevin; Jegelka, Stefanie; Olivetti, Elsa
2017-12-01
Virtual materials screening approaches have proliferated in the past decade, driven by rapid advances in first-principles computational techniques and machine-learning algorithms. By comparison, computationally driven materials synthesis screening is still in its infancy, and is mired in the challenges of data sparsity and data scarcity: Synthesis routes exist in a sparse, high-dimensional parameter space that is difficult to optimize over directly, and, for some materials of interest, only scarce volumes of literature-reported syntheses are available. In this article, we present a framework for suggesting quantitative synthesis parameters and potential driving factors for synthesis outcomes. We use a variational autoencoder to compress sparse synthesis representations into a lower dimensional space, which is found to improve the performance of machine-learning tasks. To realize this screening framework even in cases where there are few literature data, we devise a novel data augmentation methodology that incorporates literature synthesis data from related materials systems. We apply this variational autoencoder framework to generate potential SrTiO3 synthesis parameter sets, propose driving factors for brookite TiO2 formation, and identify correlations between alkali-ion intercalation and MnO2 polymorph selection.
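A variational autoencoder of the kind described, compressing sparse, high-dimensional synthesis vectors into a low-dimensional latent space, can be sketched in a few lines. The layer sizes, latent dimension, and use of PyTorch below are our assumptions for illustration; the authors' architecture and training details are not reproduced here.

```python
# Minimal VAE sketch for compressing sparse synthesis-parameter vectors
# (layer sizes and framework are illustrative assumptions).
import torch
import torch.nn as nn

class SynthesisVAE(nn.Module):
    def __init__(self, n_features=512, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU())
        self.to_mu = nn.Linear(64, latent_dim)      # latent mean
        self.to_logvar = nn.Linear(64, latent_dim)  # latent log-variance
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, n_features), nn.Sigmoid(),
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction term plus KL divergence to the unit Gaussian prior.
    bce = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

# Toy training loop on random sparse binary "synthesis vectors".
model = SynthesisVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = (torch.rand(256, 512) < 0.05).float()  # sparse toy data
for _ in range(100):
    recon, mu, logvar = model(x)
    loss = vae_loss(recon, x, mu, logvar)
    opt.zero_grad()
    loss.backward()
    opt.step()
# Low-dimensional codes usable as features for downstream ML tasks:
codes = model.to_mu(model.encoder(x)).detach()
print(codes.shape)  # (256, 8)
```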
Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian
2018-10-01
Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with an on-screen alarm resolution by the airport security officer (screener) or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios and a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB. EDSCB increased only their detection of bare explosives. In contrast, screeners with less experience (tenure < 1 year) benefitted substantially from EDSCB in detecting both improvised explosive devices and bare explosives. A comparison of all three conditions showed that automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates on the human-machine system level, which would still be acceptable from an operational point of view. Results indicate that a wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Automatic machine learning based prediction of cardiovascular events in lung cancer screening data
NASA Astrophysics Data System (ADS)
de Vos, Bob D.; de Jong, Pim A.; Wolterink, Jelmer M.; Vliegenthart, Rozemarijn; Wielingen, Geoffrey V. F.; Viergever, Max A.; Išgum, Ivana
2015-03-01
Calcium burden determined in CT images acquired in lung cancer screening is a strong predictor of cardiovascular events (CVEs). This study investigated whether subjects undergoing such screening who are at risk of a CVE can be identified using automatic image analysis and subject characteristics. Moreover, the study examined whether these individuals can be identified using solely image information, or if a combination of image and subject data is needed. A set of 3559 male subjects participating in the Dutch-Belgian lung cancer screening trial was included. Low-dose non-ECG synchronized chest CT images acquired at baseline were analyzed (1834 scanned in the University Medical Center Groningen, 1725 in the University Medical Center Utrecht). Aortic and coronary calcifications were identified using previously developed automatic algorithms. A set of features describing number, volume and size distribution of the detected calcifications was computed. Age of the participants was extracted from image headers. Features describing participants' smoking status, smoking history and past CVEs were obtained. CVEs that occurred within three years after the imaging were used as outcome. Support vector machine classification was performed employing different feature sets: sets of only image features, or a combination of image and subject-related characteristics. Classification based solely on the image features resulted in an area under the ROC curve (Az) of 0.69. A combination of image and subject features resulted in an Az of 0.71. The results demonstrate that subjects undergoing lung cancer screening who are at risk of CVE can be identified using automatic image analysis. Adding subject information slightly improved the performance.
Fang, Xingang; Bagui, Sikha; Bagui, Subhash
2017-08-01
The readily available high throughput screening (HTS) data from the PubChem database provides an opportunity for mining of small molecules in a variety of biological systems using machine learning techniques. From the thousands of available molecular descriptors developed to encode useful chemical information representing the characteristics of molecules, descriptor selection is an essential step in building an optimal quantitative structural-activity relationship (QSAR) model. For the development of a systematic descriptor selection strategy, we need the understanding of the relationship between: (i) the descriptor selection; (ii) the choice of the machine learning model; and (iii) the characteristics of the target bio-molecule. In this work, we employed the Signature descriptor to generate a dataset on the Human kallikrein 5 (hK 5) inhibition confirmatory assay data and compared multiple classification models including logistic regression, support vector machine, random forest and k-nearest neighbor. Under optimal conditions, the logistic regression model provided extremely high overall accuracy (98%) and precision (90%), with good sensitivity (65%) in the cross validation test. In testing the primary HTS screening data with more than 200K molecular structures, the logistic regression model exhibited the capability of eliminating more than 99.9% of the inactive structures. As part of our exploration of the descriptor-model-target relationship, the excellent predictive performance of the combination of the Signature descriptor and the logistic regression model on the assay data of the Human kallikrein 5 (hK 5) target suggested a feasible descriptor/model selection strategy on similar targets. Copyright © 2017 Elsevier Ltd. All rights reserved.
Learning algorithms for human-machine interfaces.
Danziger, Zachary; Fishbach, Alon; Mussa-Ivaldi, Ferdinando A
2009-05-01
The goal of this study is to create and examine machine learning algorithms that adapt in a controlled and cadenced way to foster a harmonious learning environment between the user and the controlled device. To evaluate these algorithms, we have developed a simple experimental framework. Subjects wear an instrumented data glove that records finger motions. The high-dimensional glove signals remotely control the joint angles of a simulated planar two-link arm on a computer screen, which is used to acquire targets. A machine learning algorithm was applied to adaptively change the transformation between finger motion and the simulated robot arm. This algorithm was either LMS gradient descent or the Moore-Penrose (MP) pseudoinverse transformation. Both algorithms modified the glove-to-joint angle map so as to reduce the endpoint errors measured in past performance. The MP group performed worse than the control group (subjects not exposed to any machine learning), while the LMS group outperformed the control subjects. However, the LMS subjects failed to achieve better generalization than the control subjects, and after extensive training converged to the same level of performance as the control subjects. These results highlight the limitations of coadaptive learning using only endpoint error reduction.
Active machine learning-driven experimentation to determine compound effects on protein patterns.
Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F
2016-02-03
High throughput screening determines the effects of many conditions on a given biological target. Currently, to estimate the effects of those conditions on other targets requires either strong modeling assumptions (e.g. similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance.
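The core loop, iteratively choosing small batches of informative experiments and retraining, can be sketched generically. Uncertainty sampling below is a common acquisition rule standing in for the authors' learner; the data, batch size, and round count are synthetic illustrations.

```python
# Generic active-learning loop sketch (uncertainty sampling stands in for
# the paper's learner; data here are synthetic).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2304, n_features=20, random_state=0)
labeled = list(range(10))                      # start with a few "experiments"
pool = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for round_ in range(20):
    model.fit(X[labeled], y[labeled])
    proba = model.predict_proba(X[pool])[:, 1]
    uncertainty = np.abs(proba - 0.5)          # closest to 0.5 = most uncertain
    batch = np.argsort(uncertainty)[:8]        # pick a small batch of experiments
    for j in sorted(batch, reverse=True):
        labeled.append(pool.pop(j))            # "run" them, i.e., reveal labels

print(f"Labeled {len(labeled)} of {len(X)} "
      f"({100 * len(labeled) / len(X):.0f}% of all possible experiments)")
```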
Ji, Xiaonan; Yen, Po-Yin
2015-08-31
Systematic reviews and their implementation in practice provide high quality evidence for clinical practice but are both time and labor intensive due to the large number of articles. Automatic text classification has proven to be instrumental in identifying relevant articles for systematic reviews. Existing approaches use machine learning model training to generate classification algorithms for the article screening process but have limitations. We applied a network approach to assist in the article screening process for systematic reviews using predetermined article relationships (similarity). The article similarity metric is calculated using the MEDLINE elements title (TI), abstract (AB), medical subject heading (MH), author (AU), and publication type (PT). We used an article network to illustrate the concept of article relationships. Using the concept, each article can be modeled as a node in the network and the relationship between 2 articles is modeled as an edge connecting them. The purpose of our study was to use the article relationship to facilitate an interactive article recommendation process. We used 15 completed systematic reviews produced by the Drug Effectiveness Review Project and demonstrated the use of article networks to assist article recommendation. We evaluated the predictive performance of MEDLINE elements and compared our approach with existing machine learning model training approaches. The performance was measured by work saved over sampling at 95% recall (WSS95) and the F-measure (F1). We also used repeated analysis of variance and Hommel's multiple comparison adjustment to demonstrate statistical evidence. We found that although there is no significant difference across elements (except AU), TI and AB have better predictive capability in general. Collaborative elements bring performance improvement in both F1 and WSS95. With our approach, a simple combination of TI+AB+PT could achieve a WSS95 performance of 37%, which is competitive to traditional machine learning model training approaches (23%-41% WSS95). We demonstrated a new approach to assist in labor intensive systematic reviews. Predictive ability of different elements (both single and composited) was explored. Without using model training approaches, we established a generalizable method that can achieve a competitive performance.
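The node-and-edge view of article similarity can be sketched with TF-IDF cosine similarity over title and abstract text. The vectorizer, toy records, and edge threshold below are illustrative assumptions, since the study's metric also weights MeSH, author, and publication-type elements.

```python
# Sketch: build an article network from title+abstract similarity
# (TF-IDF cosine similarity and the edge threshold are illustrative).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = [  # toy records standing in for MEDLINE TI+AB fields
    "machine learning for citation screening in systematic reviews",
    "automated text classification of medical abstracts",
    "effects of statins on cardiovascular outcomes",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(articles)
sim = cosine_similarity(tfidf)

# Each article is a node; an edge connects pairs above a similarity threshold.
edges = [(i, j, sim[i, j])
         for i in range(len(articles))
         for j in range(i + 1, len(articles))
         if sim[i, j] > 0.1]
for i, j, w in edges:
    print(f"edge {i}-{j}: similarity {w:.2f}")
```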
Using information from historical high-throughput screens to predict active compounds.
Riniker, Sereina; Wang, Yuan; Jenkins, Jeremy L; Landrum, Gregory A
2014-07-28
Modern high-throughput screening (HTS) is a well-established approach for hit finding in drug discovery that is routinely employed in the pharmaceutical industry to screen more than a million compounds within a few weeks. However, as the industry shifts to more disease-relevant but more complex phenotypic screens, the focus has moved to piloting smaller but smarter chemically/biologically diverse subsets followed by an expansion around hit compounds. One standard method for doing this is to train a machine-learning (ML) model with the chemical fingerprints of the tested subset of molecules and then select the next compounds based on the predictions of this model. An alternative approach would be to take advantage of the wealth of bioactivity information contained in older (full-deck) screens using so-called HTS fingerprints, where each element of the fingerprint corresponds to the outcome of a particular assay, as input to machine-learning algorithms. We constructed HTS fingerprints using two collections of data: 93 in-house assays and 95 publicly available assays from PubChem. For each source, an additional set of 51 and 46 assays, respectively, was collected for testing. Three different ML methods, random forest (RF), logistic regression (LR), and naïve Bayes (NB), were investigated for both the HTS fingerprint and a chemical fingerprint, Morgan2. RF was found to be best suited for learning from HTS fingerprints yielding area under the receiver operating characteristic curve (AUC) values >0.8 for 78% of the internal assays and enrichment factors at 5% (EF(5%)) >10 for 55% of the assays. The RF(HTS-fp) generally outperformed the LR trained with Morgan2, which was the best ML method for the chemical fingerprint, for the majority of assays. In addition, HTS fingerprints were found to retrieve more diverse chemotypes. Combining the two models through heterogeneous classifier fusion led to a similar or better performance than the best individual model for all assays. Further validation using a pair of in-house assays and data from a confirmatory screen--including a prospective set of around 2000 compounds selected based on our approach--confirmed the good performance. Thus, the combination of machine-learning with HTS fingerprints and chemical fingerprints utilizes information from both domains and presents a very promising approach for hit expansion, leading to more hits. The source code used with the public data is provided.
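Learning from HTS fingerprints uses the same machinery as learning from chemical fingerprints: each assay outcome is one feature. Below is a sketch of random-forest training scored by ROC AUC and a 5% enrichment factor; the data are synthetic stand-ins (the 93 assay-outcome features echo the in-house collection size purely for flavor).

```python
# Sketch: random forest on (synthetic) HTS fingerprints, scored by ROC AUC
# and enrichment factor at 5%.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(5000, 93)).astype(float)  # 93 assay outcomes
w = rng.normal(size=93)
y = ((X @ w + rng.normal(scale=2.0, size=5000))
     > np.quantile(X @ w, 0.95)).astype(int)           # ~5% actives

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
scores = (RandomForestClassifier(n_estimators=200, random_state=0)
          .fit(X_tr, y_tr).predict_proba(X_te)[:, 1])

def enrichment_factor(y_true, y_score, fraction=0.05):
    # Hit rate in the top-ranked fraction relative to the overall hit rate.
    n_top = max(1, int(len(y_true) * fraction))
    top = np.argsort(y_score)[::-1][:n_top]
    return y_true[top].mean() / y_true.mean()

print(f"AUC={roc_auc_score(y_te, scores):.2f}, "
      f"EF(5%)={enrichment_factor(np.asarray(y_te), scores):.1f}")
```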
A Machine Learning Ensemble Classifier for Early Prediction of Diabetic Retinopathy.
S K, Somasundaram; P, Alli
2017-11-09
Diabetic retinopathy (DR), a retinal vascular disease, is the main complication of diabetes and leads to blindness. Regular screening for early DR detection is considered a labor- and resource-intensive task, so automatic, computational detection of DR is an attractive solution. An automatic method is more reliable for determining the presence of an abnormality in fundus images (FI), but the classification process has been performed poorly. Recently, a few research works have been designed to analyze the texture discrimination capacity of FI to distinguish healthy images. However, the feature extraction (FE) process was not performed well due to the high dimensionality. Therefore, to identify retinal features for DR diagnosis and early detection, a machine learning and ensemble classification method, called Machine Learning Bagging Ensemble Classifier (ML-BEC), is designed. The ML-BEC method comprises two stages. The first stage extracts candidate objects from retinal images (RI). The candidate objects, or features, for DR diagnosis include blood vessels, optic nerve, neural tissue, neuroretinal rim, optic disc size, thickness, and variance. These features are initially extracted by applying a machine learning technique called t-distributed Stochastic Neighbor Embedding (t-SNE). t-SNE generates a probability distribution across pairs of high-dimensional images, separating them into similar and dissimilar pairs, and then describes a similar probability distribution across the points in a low-dimensional map, minimizing the Kullback-Leibler divergence between the two distributions with respect to the locations of the points on the map. The second stage applies ensemble classifiers to the extracted features to provide accurate analysis of digital FI using machine learning. In this stage, automatic detection for a DR screening system using a Bagging Ensemble Classifier (BEC) is investigated. Through the voting process in ML-BEC, bagging minimizes the error due to the variance of the base classifier. With publicly available retinal image databases, our classifier is trained with 25% of the RI. Results show that the ensemble classifier can achieve better classification accuracy (CA) than single classification models. Empirical experiments suggest that the machine learning-based ensemble classifier is efficient for further reducing DR classification time (CT).
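The two-stage shape of ML-BEC, non-linear embedding followed by a bagging ensemble, can be sketched as below. Since t-SNE has no out-of-sample transform, this illustrative sketch embeds all samples before splitting, and synthetic vectors stand in for retinal image features; the authors' exact pipeline may differ.

```python
# Illustrative two-stage sketch: t-SNE embedding followed by a bagging
# ensemble (synthetic features stand in for retinal image features).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.manifold import TSNE
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=50, random_state=0)

# Stage 1: non-linear dimensionality reduction. t-SNE lacks a transform()
# for unseen data, so the embedding is computed once on all samples here.
Z = TSNE(n_components=2, random_state=0).fit_transform(X)

# Stage 2: bagging (default base estimator is a decision tree) reduces the
# variance of the base classifier by voting. Train on 25%, as in the paper.
Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, test_size=0.75, random_state=0)
bec = BaggingClassifier(n_estimators=50, random_state=0).fit(Z_tr, y_tr)
print(f"accuracy: {bec.score(Z_te, y_te):.2f}")
```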
NASA Astrophysics Data System (ADS)
Tu, Shu-Ju; Wang, Chih-Wei; Pan, Kuang-Tse; Wu, Yi-Cheng; Wu, Chen-Te
2018-03-01
Lung cancer screening aims to detect small pulmonary nodules and decrease the mortality rate of those affected. However, studies from large-scale clinical trials of lung cancer screening have shown that the false-positive rate is high and positive predictive value is low. To address these problems, a technical approach is greatly needed for accurate malignancy differentiation among these early-detected nodules. We studied the clinical feasibility of an additional protocol of localized thin-section CT for further assessment on recalled patients from lung cancer screening tests. Our approach of localized thin-section CT was integrated with radiomics features extraction and machine learning classification which was supervised by pathological diagnosis. Localized thin-section CT images of 122 nodules were retrospectively reviewed and 374 radiomics features were extracted. In this study, 48 nodules were benign and 74 malignant. There were nine patients with multiple nodules and four with synchronous multiple malignant nodules. Different machine learning classifiers with a stratified ten-fold cross-validation were used and repeated 100 times to evaluate classification accuracy. Of the image features extracted from the thin-section CT images, 238 (64%) were useful in differentiating between benign and malignant nodules. These useful features include CT density (p = 0.002518), sigma (p = 0.002781), uniformity (p = 0.03241), and entropy (p = 0.006685). The highest classification accuracy was 79% by the logistic classifier. The performance metrics of this logistic classification model were 0.80 for the positive predictive value, 0.36 for the false-positive rate, and 0.80 for the area under the receiver operating characteristic curve. Our approach of direct risk classification supervised by the pathological diagnosis with localized thin-section CT and radiomics feature extraction may support clinical physicians in determining truly malignant nodules and therefore reduce problems in lung cancer screening.
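The evaluation protocol, stratified ten-fold cross-validation repeated 100 times around a logistic classifier, maps directly onto scikit-learn primitives. A sketch on synthetic stand-in features (the 122-nodule, 374-feature shape matches the reported cohort only for flavor):

```python
# Sketch: logistic classification of nodules with stratified 10-fold CV
# repeated 100 times (synthetic data shaped like the reported cohort).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(122, 374))        # 374 "radiomics" features
y = np.array([0] * 48 + [1] * 74)      # 48 benign, 74 malignant
X[y == 1, :10] += 1.0                  # inject signal into a few features

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=100, random_state=0)
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(pipe, X, y, cv=cv)
print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```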
A Machine Learning Framework for Plan Payment Risk Adjustment.
Rose, Sherri
2016-12-01
To introduce cross-validation and a nonparametric machine learning framework for plan payment risk adjustment and then assess whether they have the potential to improve risk adjustment. 2011-2012 Truven MarketScan database. We compare the performance of multiple statistical approaches within a broad machine learning framework for estimation of risk adjustment formulas. Total annual expenditure was predicted using age, sex, geography, inpatient diagnoses, and hierarchical condition category variables. The methods included regression, penalized regression, decision trees, neural networks, and an ensemble super learner, all in concert with screening algorithms that reduce the set of variables considered. The performance of these methods was compared based on cross-validated R². Our results indicate that a simplified risk adjustment formula selected via this nonparametric framework maintains much of the efficiency of a traditional larger formula. The ensemble approach also outperformed classical regression and all other algorithms studied. The implementation of cross-validated machine learning techniques provides novel insight into risk adjustment estimation, possibly allowing for a simplified formula, thereby reducing incentives for increased coding intensity as well as the ability of insurers to "game" the system with aggressive diagnostic upcoding. © Health Research and Educational Trust.
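The broad framework, fitting several candidate algorithms, comparing them by cross-validated R², and combining them in an ensemble, can be sketched with scikit-learn. The candidate set and data below are illustrative, not the MarketScan formula.

```python
# Sketch: compare risk-adjustment candidate models by cross-validated R^2
# and combine them in a stacking ensemble (data and candidates illustrative).
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import LinearRegression, LassoCV
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=2000, n_features=30, noise=25.0, random_state=0)

candidates = {
    "regression": LinearRegression(),
    "penalized": LassoCV(),  # also screens out weak variables
    "tree": DecisionTreeRegressor(max_depth=5, random_state=0),
}
for name, model in candidates.items():
    r2 = cross_val_score(model, X, y, cv=10, scoring="r2").mean()
    print(f"{name:10s} CV R^2 = {r2:.3f}")

# A simple stacked ensemble in the spirit of the super learner.
ensemble = StackingRegressor(estimators=list(candidates.items()),
                             final_estimator=LinearRegression())
r2 = cross_val_score(ensemble, X, y, cv=10, scoring="r2").mean()
print(f"{'ensemble':10s} CV R^2 = {r2:.3f}")
```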
A deep learning and novelty detection framework for rapid phenotyping in high-content screening
Sommer, Christoph; Hoefler, Rudolf; Samwer, Matthias; Gerlich, Daniel W.
2017-01-01
Supervised machine learning is a powerful and widely used method for analyzing high-content screening data. Despite its accuracy, efficiency, and versatility, supervised machine learning has drawbacks, most notably its dependence on a priori knowledge of expected phenotypes and time-consuming classifier training. We provide a solution to these limitations with CellCognition Explorer, a generic novelty detection and deep learning framework. Application to several large-scale screening data sets on nuclear and mitotic cell morphologies demonstrates that CellCognition Explorer enables discovery of rare phenotypes without user training, which has broad implications for improved assay development in high-content screening. PMID:28954863
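Novelty detection sidesteps classifier training by modeling only the "normal" phenotype and flagging deviations. A minimal sketch with a one-class SVM, which is an illustrative stand-in for CellCognition Explorer's method, on synthetic cell-feature vectors:

```python
# Sketch: novelty detection for phenotype discovery. Train only on "normal"
# cell-feature vectors; deviations are flagged without any labeled phenotypes.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal_cells = rng.normal(0.0, 1.0, size=(500, 16))  # normal morphology features
screen_cells = np.vstack([
    rng.normal(0.0, 1.0, size=(95, 16)),             # mostly normal...
    rng.normal(4.0, 1.0, size=(5, 16)),              # ...plus a rare phenotype
])

detector = OneClassSVM(nu=0.05, gamma="scale").fit(normal_cells)
flags = detector.predict(screen_cells)               # -1 marks novelties
print(f"flagged {np.sum(flags == -1)} of {len(screen_cells)} cells as novel")
```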
Antibiotic Residues in Milk from Three Popular Kenyan Milk Vending Machines.
Kosgey, Amos; Shitandi, Anakalo; Marion, Jason W
2018-05-01
Milk vending machines (MVMs) are growing in popularity in Kenya and worldwide. Milk vending machines dispense varying quantities of locally sourced, pasteurized milk. The Kenya Dairy Board has a regulatory framework, but surveillance is weak because of several factors. Milk vending machines' milk is not routinely screened for antibiotics, thereby increasing potential for antibiotic misuse. To investigate, a total of 80 milk samples from four commercial providers (N = 25), street vendors (N = 21), and three MVMs (N = 34) were collected and screened in Eldoret, Kenya. Antibiotic residue surveillance occurred during December 2016 and January 2017 using Idexx SNAP® tests for tetracyclines, sulfamethazine, beta-lactams, and gentamicin. Overall, 24% of MVM samples and 24% of street vendor samples were presumably positive for at least one antibiotic. No commercial samples were positive. Research into cost-effective screening methods and increased monitoring by food safety agencies are needed to uphold hazard analysis and critical control point for improving antibiotic stewardship throughout the Kenyan private dairy industry.
Gates, Allison; Johnson, Cydney; Hartling, Lisa
2018-03-12
Machine learning tools can expedite systematic review (SR) processes by semi-automating citation screening. Abstrackr semi-automates citation screening by predicting relevant records. We evaluated its performance for four screening projects. We used a convenience sample of screening projects completed at the Alberta Research Centre for Health Evidence, Edmonton, Canada: three SRs and one descriptive analysis for which we had used SR screening methods. The projects were heterogeneous with respect to search yield (median 9328; range 5243 to 47,385 records; interquartile range (IQR) 15,688 records), topic (Antipsychotics, Bronchiolitis, Diabetes, Child Health SRs), and screening complexity. We uploaded the records to Abstrackr and screened until it made predictions about the relevance of the remaining records. Across three trials for each project, we compared the predictions to human reviewer decisions and calculated the sensitivity, specificity, precision, false negative rate, proportion missed, and workload savings. Abstrackr's sensitivity was > 0.75 for all projects and the mean specificity ranged from 0.69 to 0.90 with the exception of Child Health SRs, for which it was 0.19. The precision (proportion of records correctly predicted as relevant) varied by screening task (median 26.6%; range 14.8 to 64.7%; IQR 29.7%). The median false negative rate (proportion of records incorrectly predicted as irrelevant) was 12.6% (range 3.5 to 21.2%; IQR 12.3%). The workload savings were often large (median 67.2%, range 9.5 to 88.4%; IQR 23.9%). The proportion missed (proportion of records predicted as irrelevant that were included in the final report, out of the total number predicted as irrelevant) was 0.1% for all SRs and 6.4% for the descriptive analysis. This equated to 4.2% (range 0 to 12.2%; IQR 7.8%) of the records in the final reports. Abstrackr's reliability and the workload savings varied by screening task. Workload savings came at the expense of potentially missing relevant records. How this might affect the results and conclusions of SRs needs to be evaluated. Studies evaluating Abstrackr as the second reviewer in a pair would be of interest to determine if concerns for reliability would diminish. Further evaluations of Abstrackr's performance and usability will inform its refinement and practical utility.
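The evaluation metrics used here follow directly from the confusion matrix of tool predictions versus human reviewer decisions. A small sketch; the workload-savings formula is our reading of the standard definition, not necessarily the authors' exact computation.

```python
# Sketch: metrics for semi-automated citation screening from a confusion
# matrix of tool predictions vs. human reviewer decisions.
def screening_metrics(tp, fp, tn, fn):
    n = tp + fp + tn + fn
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "precision": tp / (tp + fp),
        "false_negative_rate": fn / (fn + tp),
        # Records predicted irrelevant need no human screening; counting that
        # share of the yield as saved work is our reading of the definition.
        "workload_savings": (tn + fn) / n,
    }

# Toy numbers for a 10,000-record search yield.
for k, v in screening_metrics(tp=450, fp=1250, tn=8250, fn=50).items():
    print(f"{k:20s} {v:.3f}")
```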
Sakumura, Yuichi; Koyama, Yutaro; Tokutake, Hiroaki; Hida, Toyoaki; Sato, Kazuo; Itoh, Toshio; Akamatsu, Takafumi; Shin, Woosuck
2017-01-01
Monitoring exhaled breath is a very attractive, noninvasive screening technique for early diagnosis of diseases, especially lung cancer. However, the technique provides insufficient accuracy because the exhaled air has many crucial volatile organic compounds (VOCs) at very low concentrations (ppb level). We analyzed the breath exhaled by lung cancer patients and healthy subjects (controls) using gas chromatography/mass spectrometry (GC/MS), and performed a subsequent statistical analysis to diagnose lung cancer based on the combination of multiple lung cancer-related VOCs. We detected 68 VOCs as marker species using GC/MS analysis. We reduced the number of VOCs and used support vector machine (SVM) algorithm to classify the samples. We observed that a combination of five VOCs (CHN, methanol, CH3CN, isoprene, 1-propanol) is sufficient for 89.0% screening accuracy, and hence, it can be used for the design and development of a desktop GC-sensor analysis system for lung cancer. PMID:28165388
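Selecting a small VOC panel and scoring it with an SVM can be sketched as an exhaustive search over feature subsets. The synthetic data, panel size, and scoring below are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch: exhaustive search for the best 5-feature VOC panel with an SVM
# (synthetic concentrations stand in for GC/MS measurements).
from itertools import combinations
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_vocs = 10                                    # kept small so the search is fast
X = rng.lognormal(size=(120, n_vocs))          # ppb-level concentrations
signal = X[:, [1, 3, 4]].sum(axis=1)
y = (signal > np.median(signal)).astype(int)   # toy cancer/control labels

def panel_score(cols):
    pipe = make_pipeline(StandardScaler(), SVC())
    return cross_val_score(pipe, X[:, list(cols)], y, cv=5).mean()

best = max(combinations(range(n_vocs), 5), key=panel_score)
print(f"best VOC panel {best}, screening accuracy {panel_score(best):.1%}")
```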
Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter
2017-06-28
High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
Active Learning Strategies for Phenotypic Profiling of High-Content Screens.
Smith, Kevin; Horvath, Peter
2014-06-01
High-content screening is a powerful method to discover new drugs and carry out basic biological research. Increasingly, high-content screens have come to rely on supervised machine learning (SML) to perform automatic phenotypic classification as an essential step of the analysis. However, this comes at a cost, namely, the labeled examples required to train the predictive model. Classification performance increases with the number of labeled examples, and because labeling examples demands time from an expert, the training process represents a significant time investment. Active learning strategies attempt to overcome this bottleneck by presenting the most relevant examples to the annotator, thereby achieving high accuracy while minimizing the cost of obtaining labeled data. In this article, we investigate the impact of active learning on single-cell-based phenotype recognition, using data from three large-scale RNA interference high-content screens representing diverse phenotypic profiling problems. We consider several combinations of active learning strategies and popular SML methods. Our results show that active learning significantly reduces the time cost and can be used to reveal the same phenotypic targets identified using SML. We also identify combinations of active learning strategies and SML methods which perform better than others on the phenotypic profiling problems we studied. © 2014 Society for Laboratory Automation and Screening.
Telehealth solutions to enable global collaboration in rheumatic heart disease screening.
Lopes, Eduardo Lv; Beaton, Andrea Z; Nascimento, Bruno R; Tompsett, Alison; Dos Santos, Julia Pa; Perlman, Lindsay; Diamantino, Adriana C; Oliveira, Kaciane Kb; Oliveira, Cassio M; Nunes, Maria do Carmo P; Bonisson, Leonardo; Ribeiro, Antônio Lp; Sable, Craig
2018-02-01
Background The global burden of rheumatic heart disease is nearly 33 million people. Telemedicine, using cloud-server technology, provides an ideal solution for sharing images performed by non-physicians with cardiologists who are experts in rheumatic heart disease. Objective We describe our experience in using telemedicine to support a large rheumatic heart disease outreach screening programme in the Brazilian state of Minas Gerais. Methods The Programa de Rastreamento da Valvopatia Reumática (PROVAR) is a prospective cross-sectional study aimed at gathering epidemiological data on the burden of rheumatic heart disease in Minas Gerais and testing of a non-expert, telemedicine-supported model of outreach rheumatic heart disease screening. The primary goal is to enable expert support of remote rheumatic heart disease outreach through cloud-based sharing of echocardiographic images between Minas Gerais and Washington. Secondary goals include (a) developing and sharing online training modules for non-physicians in echocardiography performance and interpretation and (b) utilising a secure web-based system to share clinical and research data. Results PROVAR included 4615 studies that were performed by non-experts at 21 schools and shared via cloud-telemedicine technology. Latent rheumatic heart disease was found in 251 subjects (4.2% of subjects: 3.7% borderline and 0.5% definite disease). Of the studies, 50% were performed on fully functional echocardiography machines and transmitted via Digital Imaging and Communications in Medicine (DICOM) and 50% were performed on handheld echocardiography machines and transferred via a secure Dropbox connection. The average time between study performance date and interpretation was 10 days. There was 100% success in initial image transfer. Less than 1% of studies performed by non-experts could not be interpreted. Discussion A sustainable, low-cost telehealth model, using task-shifting with non-medical personnel in low- and middle-income countries, can improve access to echocardiography for rheumatic heart disease.
Heidari, Morteza; Khuzani, Abolfazl Zargari; Hollingsworth, Alan B; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin
2018-01-30
In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate advantages of applying a machine learning approach embedded with a locally preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes of 250 high risk cases in which cancer was detected in the next subsequent mammography screening and 250 low risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with an LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increased trend of adjusted odds ratios was also detected in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
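Leave-one-case-out cross-validation with a per-fold feature regeneration step can be sketched with a pipeline. Scikit-learn has no LPP implementation, so PCA stands in below purely to illustrate the fold structure; reducing 44 features to 4 inside each fold avoids information leakage.

```python
# Sketch: leave-one-case-out CV with per-fold dimensionality reduction.
# PCA is a stand-in for LPP (not available in scikit-learn).
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=44, random_state=0)

# The reducer is refit inside every LOCO fold, mirroring the paper's
# "regenerate a 4-feature vector per leave-one-case-out process" design.
pipe = make_pipeline(StandardScaler(), PCA(n_components=4),
                     LogisticRegression(max_iter=1000))
acc = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()
print(f"LOCO accuracy: {acc:.3f}")
```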
Evaluation of the Effectiveness of Simulation for M4 Marksmanship Training
2014-02-01
…machine guns and anti-armour weapons. In these simulators, firers aim a modified weapon at a target image on a screen. When the firer pulls the trigger… investigate predictors of live-fire LF6 qualification. Specifically, we examined the utility of LF6 simulator scores and trainee demographic data as…
Quantifying Pollutant Emissions from Office Equipment Phase IReport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.
2006-12-01
Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment with respect to human exposures. The more detailed studies of the next phase of research (Phase II) are meant to characterize changes in emissions with time and may identify factors that can be modified to reduce emissions. These measurements may identify 'win-win' situations in which low energy consumption machines have lower pollutant emissions. This information will be used to compare machines to determine if some are substantially better than their peers with respect to their emissions of pollutants.
Optical alignment of electrodes on electrical discharge machines
NASA Technical Reports Server (NTRS)
Boissevain, A. G.; Nelson, B. W.
1972-01-01
Shadowgraph system projects magnified image on screen so that alignment of small electrodes mounted on electrical discharge machines can be corrected and verified. Technique may be adapted to other machine tool equipment where physical contact cannot be made during inspection and access to tool limits conventional runout checking procedures.
Web-based newborn screening system for metabolic diseases: machine learning versus clinicians.
Chen, Wei-Hsin; Hsieh, Sheau-Ling; Hsu, Kai-Ping; Chen, Han-Ping; Su, Xing-Yu; Tseng, Yi-Ju; Chien, Yin-Hsiu; Hwu, Wuh-Liang; Lai, Feipei
2013-05-23
A hospital information system (HIS) that integrates screening data and interpretation of the data is routinely requested by hospitals and parents. However, the accuracy of disease classification may be low because of the disease characteristics and the analytes used for classification. The objective of this study is to describe a system that enhanced the neonatal screening system of the Newborn Screening Center at the National Taiwan University Hospital. The system was designed and deployed according to a service-oriented architecture (SOA) framework under the Web services .NET environment. The system consists of sample collection, testing, diagnosis, evaluation, treatment, and follow-up services among collaborating hospitals. To improve the accuracy of newborn screening, machine learning and optimal feature selection mechanisms were investigated for screening newborns for inborn errors of metabolism. The framework of the Newborn Screening Hospital Information System (NSHIS) used the embedded Health Level Seven (HL7) standards for data exchanges among heterogeneous platforms integrated by Web services in the C# language. In this study, machine learning classification was used to predict phenylketonuria (PKU), hypermethioninemia, and 3-methylcrotonyl-CoA-carboxylase (3-MCC) deficiency. The classification methods used 347,312 newborn dried blood samples collected at the Center between 2006 and 2011. Of these, 220 newborns had values over the diagnostic cutoffs (positive cases) and 1557 had values that were over the screening cutoffs but did not meet the diagnostic cutoffs (suspected cases). The original 35 analytes and the manifested features were ranked based on F score, then combinations of the top 20 ranked features were selected as input features to support vector machine (SVM) classifiers to obtain optimal feature sets. These feature sets were tested using 5-fold cross-validation and optimal models were generated. The datasets collected in year 2011 were used as predicting cases. The feature selection strategies were implemented and the optimal markers for PKU, hypermethioninemia, and 3-MCC deficiency were obtained. The results of the machine learning approach were compared with the cutoff scheme. The number of the false positive cases were reduced from 21 to 2 for PKU, from 30 to 10 for hypermethioninemia, and 209 to 46 for 3-MCC deficiency. This SOA Web service-based newborn screening system can accelerate screening procedures effectively and efficiently. An SVM learning methodology for PKU, hypermethioninemia, and 3-MCC deficiency metabolic diseases classification, including optimal feature selection strategies, is presented. By adopting the results of this study, the number of suspected cases could be reduced dramatically.
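Ranking analytes by F score and feeding the top-ranked features to an SVM with 5-fold cross-validation maps directly onto scikit-learn primitives. A minimal sketch, using the ANOVA F-test as the F score on synthetic analyte data; the class balance and feature count echo the abstract only for flavor.

```python
# Sketch: F-score feature ranking + SVM with 5-fold CV for newborn screening
# (ANOVA F-test as the F score; synthetic analyte data).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 35 "analytes" per dried-blood sample; positives are rare, as in screening.
X, y = make_classification(n_samples=3000, n_features=35, weights=[0.98],
                           random_state=0)

# Keep the top-20 features by F score, then fit the SVM inside each fold.
pipe = make_pipeline(StandardScaler(),
                     SelectKBest(f_classif, k=20),
                     SVC(class_weight="balanced"))
sens = cross_val_score(pipe, X, y, cv=5, scoring="recall").mean()
print(f"5-fold CV sensitivity: {sens:.2f}")
```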
Chemically intuited, large-scale screening of MOFs by machine learning techniques
NASA Astrophysics Data System (ADS)
Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.
2017-10-01
A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising to apply not only to gas storage in MOFs but in many other material science projects.
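The observation that prediction accuracy increases with sample size can be illustrated with any tabular learner on MOF descriptors. Below, a random forest (an assumption for illustration; the paper evaluates several algorithms) is trained on synthetic descriptor data at increasing training-set sizes.

```python
# Sketch: predicting a MOF gas-storage property from tabular descriptors,
# showing accuracy improving with training-set size (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(4000, 12))   # e.g. pore size, surface area, linker traits
y = X[:, 0] * 2 + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=4000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for n in (100, 500, 3000):
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_tr[:n], y_tr[:n])
    print(f"n={n:4d}  R^2 = {model.score(X_te, y_te):.3f}")
```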
Li, Liwei; Khanna, May; Jo, Inha; Wang, Fang; Ashpole, Nicole M; Hudmon, Andy; Meroueh, Samy O
2011-04-25
We assess the performance of our previously reported structure-based support vector machine target-specific scoring function across 41 targets, 40 among them from the Directory of Useful Decoys (DUD). The area under the curve of receiver operating characteristic plots (ROC-AUC) revealed that scoring with SVM-SP resulted in consistently better enrichment over all target families, outperforming Glide and other scoring functions, most notably among kinases. In addition, SVM-SP performance showed little variation among protein classes, exhibited excellent performance in a test case using a homology model, and in some cases showed high enrichment even with few structures used to train a model. We put SVM-SP to the test by virtual screening 1125 compounds against two kinases, EGFR and CaMKII. Among the top 25 EGFR compounds, three compounds (1-3) inhibited kinase activity in vitro with IC₅₀ of 58, 2, and 10 μM. In cell cultures, compounds 1-3 inhibited non-small cell lung carcinoma (H1299) cancer cell proliferation with similar IC₅₀ values for compound 3. For CaMKII, one compound among the 20 tested inhibited kinase activity in a dose-dependent manner with an IC₅₀ of 48 μM. These results are encouraging given that our in-house library consists of compounds that emerged from virtual screening of other targets with pockets that are different from typical ATP binding sites found in kinases. In light of the importance of kinases in chemical biology, these findings could have implications in future efforts to identify chemical probes of kinases within the human kinome.
Wear-screening and joint simulation studies vs. materials selection and prosthesis design.
Clarke, I C
1982-01-01
Satisfactory friction and wear performance of orthopaedic biomaterials is an essential criterion for both hemiarthroplasty and total joint replacements. This report will chart the clinical historical experience of candidate biomaterials with their wear resistance and compare/contrast these data to experimental test predictions. The latter review will encompass publications dealing with both joint simulators and the more basic friction and wear screening devices. Special consideration will be given to the adequacy of the test protocol, the design of the experimental machines, and the accuracy of the measurement techniques. The discussion will then center on clinical reality vs. experimental adequacy and summarize current developments.
Machine Learning for Social Services: A Study of Prenatal Case Management in Illinois.
Pan, Ian; Nolan, Laura B; Brown, Rashida R; Khan, Romana; van der Boor, Paul; Harris, Daniel G; Ghani, Rayid
2017-06-01
To evaluate the positive predictive value of machine learning algorithms for early assessment of adverse birth risk among pregnant women, as a means of improving the allocation of social services. We used administrative data for 6457 women collected by the Illinois Department of Human Services from July 2014 to May 2015 to develop a machine learning model for adverse birth prediction and improve upon the existing paper-based risk assessment. We compared different models and determined the strongest predictors of adverse birth outcomes, using positive predictive value as the metric for selection. Machine learning algorithms performed similarly, outperforming the current paper-based risk assessment by up to 36%; a refined paper-based assessment outperformed the current assessment by up to 22%. We estimate that these improvements will allow 100 to 170 additional high-risk pregnant women screened for program eligibility each year to receive services that would otherwise have been unobtainable. Our analysis exhibits the potential for machine learning to move government agencies toward a more data-informed approach to evaluating risk and providing social services. Overall, such efforts will improve the efficiency of allocating resource-intensive interventions.
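A small sketch of model comparison by positive predictive value (precision), the selection metric named above; the data, models, and class balance below are invented:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

# Stand-in administrative records; 6457 rows mirrors the cohort size only.
X, y = make_classification(n_samples=6457, n_features=40, weights=[0.8, 0.2],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(random_state=0))]:
    y_pred = model.fit(X_tr, y_tr).predict(X_te)
    print(name, f"PPV = {precision_score(y_te, y_pred):.3f}")
```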
Muthu Rama Krishnan, M; Shah, Pratik; Chakraborty, Chandan; Ray, Ajoy K
2012-04-01
The objective of this paper is to provide an improved technique that can assist oncopathologists in the correct screening of oral precancerous conditions, especially oral submucous fibrosis (OSF), with significant accuracy on the basis of collagen fibres in the sub-epithelial connective tissue. The proposed scheme is composed of collagen fibre segmentation, textural feature extraction and selection, screening performance enhancement under Gaussian transformation, and finally classification. In this study, collagen fibres are segmented on R, G, B color channels using a back-propagation neural network from 60 normal and 59 OSF histological images, followed by histogram specification to reduce the stain intensity variation. Textural features of the collagen area are then extracted using fractal approaches, viz., differential box counting and the Brownian motion curve. Feature selection is done using the Kullback-Leibler (KL) divergence criterion, and the screening performance is evaluated based on various statistical tests to confirm Gaussian nature. Here, the screening performance is enhanced under Gaussian transformation of the non-Gaussian features using a hybrid distribution. Moreover, the routine screening is designed based on two statistical classifiers, viz., Bayesian classification and support vector machines (SVM), to classify normal and OSF. It is observed that SVM with a linear kernel function provides better classification accuracy (91.64%) as compared to the Bayesian classifier. The addition of fractal features of collagen under Gaussian transformation improves the Bayesian classifier's performance from 80.69% to 90.75%. The results are studied and discussed.
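A hedged sketch of the KL-divergence ranking followed by a linear SVM, mirroring the selection and classification steps above; the "texture features" are random stand-ins, and the histogram-based KL estimate is one of several reasonable choices:

```python
import numpy as np
from scipy.stats import entropy
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(119, 12))          # 60 normal + 59 OSF images, 12 fake features
X[:60, :4] += 1.0                       # make a few features informative
y = np.r_[np.zeros(60), np.ones(59)]

def kl_score(feature, labels, bins=16):
    """KL divergence between class-conditional histograms of one feature."""
    lo, hi = feature.min(), feature.max()
    p, _ = np.histogram(feature[labels == 0], bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(feature[labels == 1], bins=bins, range=(lo, hi), density=True)
    return entropy(p + 1e-9, q + 1e-9)  # smoothed to avoid division by zero

ranking = np.argsort([-kl_score(X[:, j], y) for j in range(X.shape[1])])
X_top = X[:, ranking[:4]]               # keep the most divergent features

acc = cross_val_score(SVC(kernel="linear"), X_top, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.3f}")
```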
Lyles, Courtney Rees; Godbehere, Andrew; Le, Gem; El Ghaoui, Laurent; Sarkar, Urmimala
2016-06-10
It is difficult to synthesize the vast amount of textual data available from social media websites. Capturing real-world discussions via social media could provide insights into individuals' opinions and the decision-making process. We conducted a sequential mixed methods study to determine the utility of sparse machine learning techniques in summarizing Twitter dialogues. We chose a narrowly defined topic for this approach: cervical cancer discussions over a 6-month time period surrounding a change in Pap smear screening guidelines. We applied statistical methodologies, known as sparse machine learning algorithms, to summarize Twitter messages about cervical cancer before and after the 2012 change in Pap smear screening guidelines by the US Preventive Services Task Force (USPSTF). All messages containing the search terms "cervical cancer," "Pap smear," and "Pap test" were analyzed during: (1) January 1-March 13, 2012, and (2) March 14-June 30, 2012. Topic modeling was used to discern the most common topics from each time period and determine the singular value criterion for each topic. The results were then qualitatively coded from the top 10 relevant topics to determine the efficiency of the clustering method in grouping distinct ideas, and how the discussion differed before vs. after the change in guidelines. This machine learning method was effective in grouping the relevant discussion topics about cervical cancer during the respective time periods (~20% overall irrelevant content in both time periods). Qualitative analysis determined that a significant portion of the top discussion topics in the second time period directly reflected the USPSTF guideline change (eg, "New Screening Guidelines for Cervical Cancer"), and many topics in both time periods addressed basic screening promotion and education (eg, "It is Cervical Cancer Awareness Month! Click the link to see where you can receive a free or low cost Pap test."). It was demonstrated that machine learning tools can be useful in cervical cancer prevention and screening discussions on Twitter. This method allowed us to show that significant information about cervical cancer screening is publicly available on social media sites. Moreover, we observed a direct impact of the guideline change within the Twitter messages.
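A toy sketch of the sparse topic-modeling step: factorize a TF-IDF matrix of tweets with NMF and inspect the top terms per topic. The tweets below are invented examples, and NMF is a stand-in for the authors' specific sparse algorithms:

```python
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

tweets = [
    "New screening guidelines for cervical cancer released today",
    "It is Cervical Cancer Awareness Month, get a free Pap test",
    "USPSTF changes Pap smear screening interval",
    "Where can I get a low cost Pap smear near me",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(tweets)

nmf = NMF(n_components=2, init="nndsvd", random_state=0)  # sparse-friendly init
nmf.fit(X)

terms = tfidf.get_feature_names_out()
for k, comp in enumerate(nmf.components_):
    top = [terms[i] for i in comp.argsort()[-4:][::-1]]   # strongest terms per topic
    print(f"topic {k}: {', '.join(top)}")
```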
Machines and Human Beings in the Movies
ERIC Educational Resources Information Center
van der Laan, J. M.
2006-01-01
Over the years, many movies have presented on-screen a struggle between machines and human beings. Typically, the machines have come to rule and threaten the existence of humanity. They must be conquered to ensure the survival of and to secure the freedom of the human race. Although these movies appear to expose the dangers of an autonomous and…
Applications of color machine vision in the agricultural and food industries
NASA Astrophysics Data System (ADS)
Zhang, Min; Ludas, Laszlo I.; Morgan, Mark T.; Krutz, Gary W.; Precetti, Cyrille J.
1999-01-01
Color is an important factor in the agricultural and food industries. Agricultural or prepared food products are often graded by producers and consumers using color parameters. Color is used to estimate maturity and sort produce for defects, but also to perform genetic screening or make aesthetic judgments. The task of sorting produce against a color scale is very complex, requires special illumination and training, and cannot be performed for long durations without fatigue and loss of accuracy. This paper describes a machine vision system designed to perform color classification in real time. Applications for sorting a variety of agricultural products are included, e.g., seeds, meat, baked goods, plants, and wood. First, the theory of color classification of agricultural and biological materials is introduced. Then, some tools for classifier development are presented. Finally, the implementation of the algorithm on real-time image processing hardware and example applications for industry are described. The paper also presents an image analysis algorithm and a prototype machine vision system developed for industry. This system automatically locates the surface of plants using a digital camera and predicts information such as the size, potential value, and type of the plant. The algorithm is feasible for real-time identification in an industrial environment.
Makinde, O A; Mpofu, K; Vrabic, R; Ramatsetse, B I
2017-01-01
The development of a robotic-driven maintenance solution capable of automatically maintaining a reconfigurable vibrating screen (RVS) machine utilized in dangerous and hazardous underground mining environments has called for the design of a multifunctional robotic end-effector capable of carrying out all the maintenance tasks on the RVS machine. In view of this, the paper presents a bio-inspired approach that unfolds the design of a novel multifunctional robotic end-effector embedded with mechanical and control mechanisms capable of automatically maintaining the RVS machine. To achieve this, therblig and morphological methodologies (which classify the motions as well as the actions required by the robotic end-effector in carrying out RVS machine maintenance tasks), obtained from a detailed analogy with how a human being (i.e., a machine maintenance manager) would carry out different maintenance tasks on the RVS machine, were used to obtain the maintenance objective functions or goals of the multifunctional robotic end-effector as well as the maintenance activity constraints of the RVS machine that must be adhered to by the multifunctional robotic end-effector during machine maintenance. The results of the therblig and morphological analyses of five (5) different maintenance tasks capture and classify one hundred and thirty-four (134) repetitive motions and fifty-four (54) functions required in automating the maintenance tasks of the RVS machine. Based on these findings, a worm-gear mechanism embedded with fingers extruded with hexagonal-shaped heads, capable of carrying out the "gripping and ungrasping" and "loosening and bolting" functions of the robotic end-effector, and an electric cylinder actuator module, capable of carrying out the "unpinning and hammering" functions of the robotic end-effector, were integrated to produce the customized multifunctional robotic end-effector capable of automatically maintaining the RVS machine. The axial forces ([Formula: see text] and [Formula: see text]), normal forces ([Formula: see text]) and total load [Formula: see text] acting on the teeth of the worm-gear module of the multifunctional robotic end-effector during the gripping of worn-out or new RVS machine subsystems, which are 978.547, 1245.06 and 1016.406 N, respectively, were satisfactory. The nominal bending and torsional stresses acting on the shoulder of the socket module of the multifunctional robotic end-effector during the loosening and tightening of bolts, which are 1450.72 and 179.523 MPa, respectively, were satisfactory. The hammering and unpinning forces utilized by the electric cylinder actuator module of the multifunctional robotic end-effector during the unpinning and hammering of screen panel pins out of and into the screen panels were also satisfactory.
Zhang, Bin; He, Xin; Ouyang, Fusheng; Gu, Dongsheng; Dong, Yuhao; Zhang, Lu; Mo, Xiaokai; Huang, Wenhui; Tian, Jie; Zhang, Shuixing
2017-09-10
We aimed to identify optimal machine-learning methods for radiomics-based prediction of local failure and distant failure in advanced nasopharyngeal carcinoma (NPC). We enrolled 110 patients with advanced NPC. A total of 970 radiomic features were extracted from MRI images for each patient. Six feature selection methods and nine classification methods were evaluated in terms of their performance. We applied 10-fold cross-validation as the criterion for feature selection and classification. We repeated each combination 50 times to obtain the mean area under the curve (AUC) and test error. We observed that the combination method Random Forest (RF) + RF (AUC, 0.8464 ± 0.0069; test error, 0.3135 ± 0.0088) had the highest prognostic performance, followed by RF + Adaptive Boosting (AdaBoost) (AUC, 0.8204 ± 0.0095; test error, 0.3384 ± 0.0097), and Sure Independence Screening (SIS) + Linear Support Vector Machines (LSVM) (AUC, 0.7883 ± 0.0096; test error, 0.3985 ± 0.0100). Our radiomics study identified optimal machine-learning methods for the radiomics-based prediction of local failure and distant failure in advanced NPC, which could enhance the applications of radiomics in precision oncology and clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.
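A compact sketch of the combination study above: cross feature selectors with classifiers and compare mean cross-validated AUC. The radiomic features are simulated, and the two selectors shown are common stand-ins rather than the six evaluated in the paper:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# 110 patients x 970 features, mirroring the cohort dimensions only.
X, y = make_classification(n_samples=110, n_features=970, n_informative=15,
                           random_state=0)

selectors = {"ANOVA": SelectKBest(f_classif, k=20),
             "MI": SelectKBest(mutual_info_classif, k=20)}
classifiers = {"RF": RandomForestClassifier(random_state=0),
               "AdaBoost": AdaBoostClassifier(random_state=0),
               "LSVM": LinearSVC(dual=False)}

for s_name, sel in selectors.items():
    for c_name, clf in classifiers.items():
        pipe = make_pipeline(sel, clf)
        auc = cross_val_score(pipe, X, y, cv=10, scoring="roc_auc").mean()
        print(f"{s_name} + {c_name}: mean AUC = {auc:.3f}")
```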
Man-machine communication - A transparent switchboard for computers
NASA Technical Reports Server (NTRS)
Rasmussen, H.
1971-01-01
Device uses pattern of transparent contact touch points that are put on cathode ray tube screen. Touch point system compels more precise and unambiguous communication between man and machine than is possible with any other means, and speeds up operation responses.
NASA Astrophysics Data System (ADS)
Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qian, Wei; Zheng, Bin
2018-03-01
Both conventional and deep machine learning have been used to develop decision-support tools applied in medical imaging informatics. To take advantage of both conventional and deep learning approaches, this study aims to investigate the feasibility of applying a locality preserving projection (LPP) based feature regeneration algorithm to build a new machine learning classifier model to predict short-term breast cancer risk. First, a computer-aided image processing scheme was used to segment and quantify breast fibro-glandular tissue volume. Next, an initial set of 44 image features related to the bilateral mammographic tissue density asymmetry was computed. Then, an LPP-based feature combination method was applied to regenerate a new operational feature vector using a maximal variance approach. Last, a k-nearest neighbor (KNN) machine learning classifier using the LPP-generated feature vectors was developed to predict breast cancer risk. A testing dataset involving negative mammograms acquired from 500 women was used. Among them, 250 were positive and 250 remained negative in the next subsequent mammography screening. Applied to this dataset, the LPP-generated feature vector reduced the number of features from 44 to 4. Using a leave-one-case-out validation method, the area under the ROC curve produced by the KNN classifier significantly increased from 0.62 to 0.68 (p < 0.05) and the odds ratio was 4.60 with a 95% confidence interval of [3.16, 6.70]. The study demonstrated that this new LPP-based feature regeneration approach was able to produce an optimal feature vector and yield improved performance in assisting to predict the risk of women having breast cancer detected in the next subsequent mammography screening.
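A hedged sketch of the feature-regeneration idea: project 44 image features down to 4 and classify with KNN under leave-one-case-out validation. PCA stands in for the paper's locality preserving projection, which scikit-learn does not provide:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# 500 synthetic cases with 44 features, mirroring the dataset dimensions only.
X, y = make_classification(n_samples=500, n_features=44, n_informative=10,
                           random_state=0)

pipe = make_pipeline(PCA(n_components=4), KNeighborsClassifier(n_neighbors=5))
acc = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()  # leave-one-case-out
print(f"leave-one-out accuracy: {acc:.3f}")
```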
NASA Astrophysics Data System (ADS)
Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Qiu, Yuchen; Zheng, Bin
2018-02-01
The objective of this study is to develop and test a new computer-aided detection (CAD) scheme with improved region of interest (ROI) segmentation combined with an image feature extraction framework to improve performance in predicting short-term breast cancer risk. A dataset involving 570 sets of "prior" negative mammography screening cases was retrospectively assembled. In the next sequential "current" screening, 285 cases were positive and 285 cases remained negative. A CAD scheme was applied to all 570 "prior" negative images to stratify cases into high and low risk groups for having cancer detected in the "current" screening. First, a new ROI segmentation algorithm was used to automatically remove useless areas of the mammograms. Second, from the matched bilateral craniocaudal view images, a set of 43 image features related to the frequency characteristics of the ROIs was initially computed from the discrete cosine transform and the spatial domain of the images. Third, a support vector machine based machine learning classifier was used to classify the selected optimal image features to build a CAD-based risk prediction model. The classifier was trained using a leave-one-case-out cross-validation method. Applying this improved CAD scheme to the testing dataset yielded an area under the ROC curve of AUC = 0.70 ± 0.04, significantly higher than extracting features directly from the dataset without the improved ROI segmentation step (AUC = 0.63 ± 0.04). This study demonstrated that the proposed approach could improve accuracy in predicting short-term breast cancer risk, which may play an important role in helping eventually establish an optimal personalized breast cancer screening paradigm.
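A minimal sketch of the frequency-feature step: keep the low-frequency 2D DCT coefficients per ROI and train an SVM with leave-one-case-out validation. The image patches below are random arrays, so accuracy hovers near chance:

```python
import numpy as np
from scipy.fft import dctn
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
patches = rng.normal(size=(100, 64, 64))      # stand-ins for mammographic ROIs
y = rng.integers(0, 2, size=100)

def dct_features(patch, k=7):
    """Keep the k x k lowest-frequency DCT coefficients, flattened."""
    coeffs = dctn(patch, norm="ortho")
    return coeffs[:k, :k].ravel()

X = np.array([dct_features(p) for p in patches])
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-case-out accuracy: {acc:.3f}")   # ~0.5 on random data
```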
Artificial intelligence approaches for rational drug design and discovery.
Duch, Włodzisław; Swaminathan, Karthikeyan; Meller, Jarosław
2007-01-01
Pattern recognition, machine learning and artificial intelligence approaches play an increasingly important role in rational drug design, screening and identification of candidate molecules and studies on quantitative structure-activity relationships (QSAR). In this review, we present an overview of basic concepts and methodology in the fields of machine learning and artificial intelligence (AI). An emphasis is put on methods that enable an intuitive interpretation of the results and facilitate gaining an insight into the structure of the problem at hand. We also discuss representative applications of AI methods to docking, screening and QSAR studies. The growing trend to integrate computational and experimental efforts in that regard and some future developments are discussed. In addition, we comment on a broader role of machine learning and artificial intelligence approaches in biomedical research.
Mookiah, M R K; Rohrmeier, A; Dieckmeyer, M; Mei, K; Kopp, F K; Noel, P B; Kirschke, J S; Baum, T; Subburaj, K
2018-04-01
This study investigated the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. The results showed an acceptable reproducibility of texture features, and these features could discriminate between a healthy and an osteoporotic fracture cohort with an accuracy of 83%. The aim of this study is to investigate the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. We performed texture analysis at the spine in routine MDCT exams and investigated the effect of intravenous contrast medium (IVCM) (n = 7) and slice thickness (n = 7), the long-term reproducibility (n = 9), and the ability to differentiate the healthy/osteoporotic fracture cohort (n = 9 age- and gender-matched pairs). Eight texture features were extracted using the gray level co-occurrence matrix (GLCM). The independent sample t test was used to rank the features of the healthy/fracture cohort, and classification was performed using a support vector machine (SVM). The results revealed significant correlations between texture parameters derived from MDCT scans with and without IVCM (r up to 0.91), slice thicknesses of 1 mm versus 2 and 3 mm (r up to 0.96), and scan-rescan (r up to 0.59). The performance of the SVM classifier was evaluated using 10-fold cross-validation and revealed an average classification accuracy of 83%. Opportunistic osteoporosis screening at the spine using specific texture parameters (energy, entropy, and homogeneity) and SVM can be performed in routine contrast-enhanced MDCT exams.
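A short sketch of the texture pipeline: GLCM features (energy, homogeneity, and others) via scikit-image, then an SVM with 10-fold cross-validation. The images and labels are synthetic placeholders:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(60, 64, 64), dtype=np.uint8)  # fake CT ROIs
y = rng.integers(0, 2, size=60)

def glcm_features(img):
    """Four GLCM texture properties at distance 1, angle 0."""
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, prop)[0, 0]
            for prop in ("energy", "homogeneity", "contrast", "correlation")]

X = np.array([glcm_features(im) for im in images])
acc = cross_val_score(SVC(), X, y, cv=10).mean()
print(f"10-fold accuracy: {acc:.3f}")   # ~chance on random textures
```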
NASA Astrophysics Data System (ADS)
Rai, A.; Minsker, B. S.
2016-12-01
In this work we introduce a novel dataset, GRID (GReen Infrastructure Detection Dataset), and a framework for identifying urban green storm water infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties, and cities typically do not know where GI is located, making study of its impacts or siting of new GI difficult. We use object recognition learning methods (template matching, a sliding window approach, and the Random Hough Forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding window method outperformed the other methods, achieving an average F measure (a combined metric of precision and recall) of 0.78.
Decision support system for diabetic retinopathy using discrete wavelet transform.
Noronha, K; Acharya, U R; Nayak, K P; Kamath, S; Bhandary, S V
2013-03-01
Prolonged duration of diabetes may affect the tiny blood vessels of the retina, causing diabetic retinopathy. Routine eye screening of patients with diabetes helps to detect diabetic retinopathy at an early stage. It is very laborious and time-consuming for doctors to go through many fundus images continuously. Therefore, a decision support system for diabetic retinopathy detection can reduce the burden on ophthalmologists. In this work, we have used the discrete wavelet transform and a support vector machine classifier for automated detection of normal and diabetic retinopathy classes. The wavelet-based decomposition was performed up to the second level, and eight energy features were extracted. Two energy features from the approximation coefficients of the two levels and six energy values from the details in three orientations (horizontal, vertical and diagonal) were evaluated. These features were fed to the support vector machine classifier with various kernel functions (linear, radial basis function, polynomial of orders 2 and 3) to evaluate the highest classification accuracy. We obtained the highest average classification accuracy, sensitivity and specificity of more than 99% with the support vector machine classifier (polynomial kernel of order 3) using three discrete wavelet transform features. We have also proposed an integrated index, called the Diabetic Retinopathy Risk Index, using clinically significant wavelet energy features to identify normal and diabetic retinopathy classes with just one number. We believe that this index can be used as an adjunct tool by doctors during eye screening to cross-check their diagnosis.
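A sketch of the wavelet-energy features using PyWavelets: a 2-level 2D DWT, band energies, and an SVM with a degree-3 polynomial kernel. This version yields 7 energies rather than the paper's 8 (which counted approximation energy at both levels), and the data are random:

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
images = rng.normal(size=(60, 128, 128))   # stand-ins for fundus images
y = rng.integers(0, 2, size=60)

def wavelet_energies(img):
    coeffs = pywt.wavedec2(img, "db4", level=2)
    feats = [np.sum(coeffs[0] ** 2)]                 # level-2 approximation energy
    for details in coeffs[1:]:                       # (cH, cV, cD) per level
        feats.extend(np.sum(band ** 2) for band in details)
    return feats                                     # 1 + 2*3 = 7 energies

X = np.array([wavelet_energies(im) for im in images])
acc = cross_val_score(SVC(kernel="poly", degree=3), X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.3f}")
```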
NASA Technical Reports Server (NTRS)
Waterman, A. W.; Huxford, R. L.; Nelson, W. G.
1976-01-01
Molded high temperature plastic first and second stage rod seal elements were evaluated in seal assemblies to determine performance characteristics. These characteristics were compared with the performance of machined seal elements. The 6.35 cm second stage Chevron seal assembly was tested using molded Chevrons fabricated from five molding materials. Impulse screening tests conducted over a range of 311 K to 478 K revealed thermal setting deficiencies in the aromatic polyimide molding materials. Seal elements fabricated from aromatic copolyester materials structurally failed during impulse cycle calibration. Endurance testing of 3.85 million cycles at 450 K using MIL-H-83283 fluid showed poorer seal performance with the unfilled aromatic polyimide material than had been attained with seals machined from Vespel SP-21 material. The 6.35 cm first stage step-cut compression loaded seal ring fabricated from copolyester injection molding material failed structurally during impulse cycle calibration. Molding of complex shape rod seals was shown to be a potentially controllable technique, but additional molding material property testing is recommended.
Development of techniques to enhance man/machine communication
NASA Technical Reports Server (NTRS)
Targ, R.; Cole, P.; Puthoff, H.
1974-01-01
A four-state random stimulus generator, considered to function as an ESP teaching machine, was used to investigate an approach to facilitating interactions between man and machines. A subject tries to guess which of four states the machine is in. The machine offers the user feedback and reinforcement as to the correctness of his choice. Using this machine, 148 volunteer subjects were screened under various protocols. Several whose learning slope and/or mean score departed significantly from chance expectation were identified. Electroencephalographic (EEG) output was also studied for direct physiological evidence of perception of remote stimuli, not presented to any known sense of the percipient, when a light was flashed in a distant room.
Improving compound-protein interaction prediction by building up highly credible negative samples.
Liu, Hui; Sun, Jianjiang; Guan, Jihong; Zheng, Jie; Zhou, Shuigeng
2015-06-15
Computational prediction of compound-protein interactions (CPIs) is of great importance for drug design and development, as genome-scale experimental validation of CPIs is not only time-consuming but also prohibitively expensive. With the availability of an increasing number of validated interactions, the performance of computational prediction approaches is severely impeded by the lack of reliable negative CPI samples. A systematic method of screening reliable negative samples becomes critical to improving the performance of in silico prediction methods. This article aims at building up a set of highly credible negative samples of CPIs via an in silico screening method. As most existing computational models assume that similar compounds are likely to interact with similar target proteins and achieve remarkable performance, it is rational to identify potential negative samples based on the converse negative proposition: that the proteins dissimilar to every known/predicted target of a compound are not likely to be targeted by the compound, and vice versa. We integrated various resources, including chemical structures, chemical expression profiles and side effects of compounds, amino acid sequences, protein-protein interaction networks and functional annotations of proteins, into a systematic screening framework. We first tested the screened negative samples on six classical classifiers, and all these classifiers achieved remarkably higher performance on our negative samples than on randomly generated negative samples for both human and Caenorhabditis elegans. We then verified the negative samples on three existing prediction models, including the bipartite local model, Gaussian kernel profile and Bayesian matrix factorization, and found that the performances of these models were also significantly improved on the screened negative samples. Moreover, we validated the screened negative samples on a drug bioactivity dataset. Finally, we derived two sets of new interactions by training a support vector machine classifier on the positive interactions annotated in DrugBank and our screened negative interactions. The screened negative samples and the predicted interactions provide the research community with a useful resource for identifying new drug targets and a helpful supplement to the current curated compound-protein databases. Supplementary files are available at: http://admis.fudan.edu.cn/negative-cpi/. © The Author 2015. Published by Oxford University Press.
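A toy sketch of the converse-negative idea: accept a protein as a credible negative for a compound only if it is dissimilar to every known target of that compound. The similarity matrix, compound, and threshold below are invented; real ones would combine the sequence, PPI, and annotation resources listed above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_proteins = 50
sim = rng.random((n_proteins, n_proteins))    # fake protein-protein similarity
sim = (sim + sim.T) / 2                       # make it symmetric

known_targets = {"compound_A": [3, 17, 25]}   # hypothetical known CPIs
threshold = 0.3                               # dissimilarity cutoff (invented)

def credible_negatives(compound):
    """Proteins far from every known target of the compound."""
    targets = known_targets[compound]
    max_sim = sim[:, targets].max(axis=1)     # closeness to nearest known target
    candidates = np.where(max_sim < threshold)[0]
    return [p for p in candidates if p not in targets]

print(credible_negatives("compound_A"))
```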
Micro Dot Patterning on the Light Guide Panel Using Powder Blasting
Jang, Ho Su; Cho, Myeong Woo; Park, Dong Sam
2008-01-01
This study aims to develop a micromachining technology for a light guide panel (LGP) mold, whereby micro dot patterns are formed on a LGP surface by a single injection process instead of existing screen printing processes. The micro powder blasting technique is applied to form micro dot patterns on the LGP mold surface. The optimal conditions for the masking, laminating, exposure, and developing processes to form the micro dot patterns are first experimentally investigated. A LGP mold with masked micro patterns is then machined using the micro powder blasting method and the machinability of the micro dot patterns is verified. A prototype LGP is test-injected using the developed LGP mold, and a shape analysis of the patterns and performance testing of the injected LGP are carried out. As an additional approach, matte finishing, a special surface treatment method, is applied to the mold surface to improve the light diffusion characteristics, uniformity and brightness of the LGP. The results of this study show that the applied powder blasting method can be successfully used to manufacture LGPs with micro patterns by just a single injection using the developed mold and can thereby replace existing screen printing methods. PMID:27879740
Yim, Wen-Wai; Chien, Shu; Kusumoto, Yasuyuki; Date, Susumu; Haga, Jason
2010-01-01
Large-scale in-silico screening is a necessary part of drug discovery and Grid computing is one answer to this demand. A disadvantage of using Grid computing is the heterogeneous computational environments characteristic of a Grid. In our study, we have found that for the molecular docking simulation program DOCK, different clusters within a Grid organization can yield inconsistent results. Because DOCK in-silico virtual screening (VS) is currently used to help select chemical compounds to test with in-vitro experiments, such differences have little effect on the validity of using virtual screening before subsequent steps in the drug discovery process. However, it is difficult to predict whether the accumulation of these discrepancies over sequentially repeated VS experiments will significantly alter the results if VS is used as the primary means for identifying potential drugs. Moreover, such discrepancies may be unacceptable for other applications requiring more stringent thresholds. This highlights the need for establishing a more complete solution to provide the best scientific accuracy when executing an application across Grids. One possible solution to platform heterogeneity in DOCK performance explored in our study involved the use of virtual machines as a layer of abstraction. This study investigated the feasibility and practicality of using virtual machine and recent cloud computing technologies in a biological research application. We examined the differences and variations of DOCK VS variables, across a Grid environment composed of different clusters, with and without virtualization. The uniform computer environment provided by virtual machines eliminated inconsistent DOCK VS results caused by heterogeneous clusters, however, the execution time for the DOCK VS increased. In our particular experiments, overhead costs were found to be an average of 41% and 2% in execution time for two different clusters, while the actual magnitudes of the execution time costs were minimal. Despite the increase in overhead, virtual clusters are an ideal solution for Grid heterogeneity. With greater development of virtual cluster technology in Grid environments, the problem of platform heterogeneity may be eliminated through virtualization, allowing greater usage of VS, and will benefit all Grid applications in general.
NASA Technical Reports Server (NTRS)
1989-01-01
Loredan Biomedical, Inc.'s LIDO, a computerized physical therapy system, was purchased by NASA in 1985 for evaluation as a Space Station Freedom exercise program. In 1986, while involved in an ARC muscle conditioning project, Malcom Bond, Loredan's chairman, designed an advanced software package for NASA which became the basis for LIDOSOFT software used in the commercially available system. The system employs a "proprioceptive" software program which perceives internal body conditions, induces perturbations to muscular effort and evaluates the response. Biofeedback on a screen allows a patient to observe his own performance.
A Mobile Health Application to Predict Postpartum Depression Based on Machine Learning.
Jiménez-Serrano, Santiago; Tortajada, Salvador; García-Gómez, Juan Miguel
2015-07-01
Postpartum depression (PPD) is a disorder that often goes undiagnosed. The development of a screening program requires considerable and careful effort, where evidence-based decisions have to be taken in order to obtain an effective test with a high level of sensitivity and an acceptable specificity that is quick to perform, easy to interpret, culturally sensitive, and cost-effective. The purpose of this article is twofold: first, to develop classification models for detecting the risk of PPD during the first week after childbirth, thus enabling early intervention; and second, to develop a mobile health (m-health) application (app) for the Android® (Google, Mountain View, CA) platform based on the best-performing model, for both mothers who have just given birth and clinicians who want to monitor their patients' tests. A set of predictive models for estimating the risk of PPD was trained using machine learning techniques and data about postpartum women collected from seven Spanish hospitals. An internal evaluation was carried out using a hold-out strategy. A simple flowchart and architecture were followed in designing the graphical user interface of the m-health app. Naive Bayes showed the best balance between sensitivity and specificity as a predictive model for PPD during the first week after delivery. It was integrated into the clinical decision support system for Android mobile apps. This approach can enable the early prediction and detection of PPD because it fulfills the conditions of an effective screening test with a high level of sensitivity and specificity that is quick to perform, easy to interpret, culturally sensitive, and cost-effective.
Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D; Duvenaud, David; Maclaurin, Dougal; Blood-Forsythe, Martin A; Chae, Hyun Sik; Einzinger, Markus; Ha, Dong-Gwang; Wu, Tony; Markopoulos, Georgios; Jeon, Soonok; Kang, Hosuk; Miyazaki, Hiroshi; Numata, Masaki; Kim, Sunghan; Huang, Wenliang; Hong, Seong Ik; Baldo, Marc; Adams, Ryan P; Aspuru-Guzik, Alán
2016-10-01
Virtual screening is becoming a ground-breaking tool for molecular discovery due to the exponential growth of available computer time and constant improvement of simulation and machine learning techniques. We report an integrated organic functional material design process that incorporates theoretical insight, quantum chemistry, cheminformatics, machine learning, industrial expertise, organic synthesis, molecular characterization, device fabrication and optoelectronic testing. After exploring a search space of 1.6 million molecules and screening over 400,000 of them using time-dependent density functional theory, we identified thousands of promising novel organic light-emitting diode molecules across the visible spectrum. Our team collaboratively selected the best candidates from this set. The experimentally determined external quantum efficiencies for these synthesized candidates were as large as 22%.
Man-Machine Integrated Design and Analysis System (MIDAS): Functional Overview
NASA Technical Reports Server (NTRS)
Corker, Kevin; Neukom, Christian
1998-01-01
This series of screen print-outs illustrates the structure and function of the Man-Machine Integrated Design and Analysis System (MIDAS). Views into the use of the system and its editors are featured. The use-case in this set of graphs includes the development of a simulation scenario.
Hanlon, John A.; Gill, Timothy J.
2001-01-01
Machine tools can be accurately measured and positioned on manufacturing machines within very small tolerances by use of an autocollimator on a 3-axis mount on a manufacturing machine, positioned so as to focus on a reference tooling ball or a machine tool; a digital camera connected to the viewing end of the autocollimator; and a marker and measure generator for receiving digital images from the camera, then displaying or measuring distances between the projection reticle and the reference reticle on the monitoring screen and relating the distances to the actual position of the autocollimator relative to the reference tooling ball. The images and measurements are used to set the position of the machine tool, to measure the size and shape of the machine tool tip, and to examine cutting edge wear.
Chang, Anthony C
2012-03-01
The preparticipation screening for athlete participation in sports typically entails a comprehensive medical and family history and a complete physical examination. A 12-lead electrocardiogram (ECG) can increase the likelihood of detecting cardiac diagnoses such as hypertrophic cardiomyopathy, but this diagnostic test as part of the screening process has engendered considerable controversy. The pro position is supported by arguments that international screening protocols support its use, positive diagnosis has multiple benefits, history and physical examination are inadequate, primary prevention is essential, and the cost effectiveness is justified. Although the aforementioned myriad of justifications for routine ECG screening of young athletes can be persuasive, several valid contentions oppose supporting such a policy, namely, that the sudden death incidence is very (too) low, the ECG screening will be too costly, the false-positive rate is too high, resources will be allocated away from other diseases, and manpower is insufficient for its execution. Clinicians, including pediatric cardiologists, have an understandable proclivity for avoiding this prodigious national endeavor. The controversy, however, should not be focused on whether an inexpensive, noninvasive test such as an ECG should be mandated but should instead be directed at just how these tests for young athletes can be performed in the clinical imbroglio of these disease states (with variable genetic penetrance and phenotypic expression) with concomitant fiscal accountability and logistical expediency in this era of economic restraint. This monumental endeavor in any city or region requires two crucial elements well known to business scholars: implementation and execution. The eventual solution for the screening ECG dilemma requires a truly innovative and systematic approach that will liberate us from inadequate conventional solutions. Artificial intelligence, specifically the processes termed "machine learning" and "neural networking," involves complex algorithms that allow computers to improve the decision-making process based on repeated input of empirical data (e.g., databases and ECGs). These elements all can be improved with a national database, evidence-based medicine, and in the near future, innovation that entails a Kurzweilian artificial intelligence infrastructure with machine learning and neural networking that will construct the ultimate clinical decision-making algorithm.
Advice Taking from Humans and Machines: An fMRI and Effective Connectivity Study.
Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank
2016-01-01
With new technological advances, advice can come from different sources such as machines or humans, but how individuals respond to such advice and the neural correlates involved need to be better understood. We combined functional MRI and multivariate Granger causality analysis with an X-ray luggage-screening task to investigate the neural basis and corresponding effective connectivity involved with advice utilization from agents framed as experts. Participants were asked to accept or reject good or bad advice from a human or machine agent with low reliability (high false alarm rate). We showed that unreliable advice decreased performance overall and participants interacting with the human agent had a greater depreciation of advice utilization during bad advice compared to the machine agent. These differences in advice utilization can be perceivably due to reevaluation of expectations arising from association of dispositional credibility for each agent. We demonstrated that differences in advice utilization engaged brain regions that may be associated with evaluation of personal characteristics and traits (precuneus, posterior cingulate cortex, temporoparietal junction) and interoception (posterior insula). We found that the right posterior insula and left precuneus were the drivers of the advice utilization network that were reciprocally connected to each other and also projected to all other regions. Our behavioral and neuroimaging results have significant implications for society because of progressions in technology and increased interactions with machines.
Complex extreme learning machine applications in terahertz pulsed signals feature sets.
Yin, X-X; Hadjiloucas, S; Zhang, Y
2014-11-01
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a more difficult multi-class classification problem involving spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
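A minimal real-valued extreme learning machine sketch: a random, untrained hidden layer with output weights solved in closed form. The paper's classifier is complex-valued over amplitude and phase; this simplification shows only the ELM mechanics on invented data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 40))                 # stand-in spectral feature vectors
y = np.eye(6)[rng.integers(0, 6, size=300)]    # one-hot labels for 6 powders

n_hidden = 100
W = rng.normal(size=(X.shape[1], n_hidden))    # random input weights, never trained
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                         # hidden-layer activations

beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # closed-form output weights

pred = np.argmax(H @ beta, axis=1)
print("training accuracy:", (pred == y.argmax(axis=1)).mean())
```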
Machine learning plus optical flow: a simple and sensitive method to detect cardioactive drugs
NASA Astrophysics Data System (ADS)
Lee, Eugene K.; Kurokawa, Yosuke K.; Tu, Robin; George, Steven C.; Khine, Michelle
2015-07-01
Current preclinical screening methods do not adequately detect cardiotoxicity. Using human induced pluripotent stem cell-derived cardiomyocytes (iPS-CMs), more physiologically relevant preclinical or patient-specific screening to detect potential cardiotoxic effects of drug candidates may be possible. However, one of the persistent challenges for developing a high-throughput drug screening platform using iPS-CMs is the need for a simple and reliable method to measure key electrophysiological and contractile parameters. To address this need, we have developed a platform that combines machine learning paired with brightfield optical flow as a simple and robust tool that can automate the detection of cardiomyocyte drug effects. Using three cardioactive drugs with different mechanisms, including those with primarily electrophysiological effects, we demonstrate the general applicability of this screening method to detect subtle changes in cardiomyocyte contraction. Requiring only brightfield images of cardiomyocyte contractions, we detect changes in cardiomyocyte contraction comparable to, and even superior to, fluorescence readouts. This automated method serves as a widely applicable screening tool to characterize the effects of drugs on cardiomyocyte function.
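A hedged sketch of the optical-flow readout, assuming OpenCV (cv2) is installed: the mean Farneback flow magnitude between consecutive brightfield frames gives a contraction waveform. The frames below are random noise, so the printed signal carries no real contraction events:

```python
import cv2
import numpy as np

rng = np.random.default_rng(0)
frames = (rng.random((20, 128, 128)) * 255).astype(np.uint8)  # fake grayscale video

signal = []
for prev, curr in zip(frames, frames[1:]):
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    magnitude = np.linalg.norm(flow, axis=2)      # per-pixel motion magnitude
    signal.append(magnitude.mean())               # one point per frame pair

print(np.round(signal, 3))  # peaks would mark contractions in real microscope video
```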
Hu, Ben; Kuang, Zheng-Kun; Feng, Shi-Yu; Wang, Dong; He, Song-Bing; Kong, De-Xin
2016-11-17
The crystallized ligands in the Protein Data Bank (PDB) can be treated as the inverse shapes of the active sites of the corresponding proteins. Therefore, the shape similarity between a molecule and PDB ligands indicates how likely the molecule is to bind those targets. In this paper, we propose a shape similarity profile that can be used as a molecular descriptor for ligand-based virtual screening. First, through three-dimensional (3D) structural clustering, 300 diverse ligands were extracted from the druggable protein-ligand database sc-PDB. Then, each of the molecules under scrutiny was flexibly superimposed onto the 300 ligands. Superimpositions were scored by shape overlap and property similarity, producing a 300-dimensional similarity array termed the "Three-Dimensional Biologically Relevant Spectrum (BRS-3D)". Finally, quantitative or discriminant models were developed with the 300-dimensional descriptor using machine learning methods (support vector machines). The effectiveness of this approach was evaluated using 42 benchmark data sets from the G protein-coupled receptor (GPCR) ligand library and the GPCR decoy database (GLL/GDD). We compared the performance of BRS-3D with other 2D and 3D state-of-the-art molecular descriptors. The results showed that models built with BRS-3D performed best for most GLL/GDD data sets. We also applied BRS-3D to histone deacetylase 1 inhibitor screening and GPCR subtype selectivity prediction. The advantages and disadvantages of this approach are discussed.
Sun, Huiyong; Pan, Peichen; Tian, Sheng; Xu, Lei; Kong, Xiaotian; Li, Youyong; Dan Li; Hou, Tingjun
2016-01-01
The MIEC-SVM approach, which combines molecular interaction energy components (MIEC) derived from free energy decomposition and support vector machine (SVM), has been found effective in capturing the energetic patterns of protein-peptide recognition. However, the performance of this approach in identifying small molecule inhibitors of drug targets has not been well assessed and validated by experiments. Therefore, by combining different model construction protocols, the issues related to developing the best MIEC-SVM models were first discussed for three kinase targets (ABL, ALK, and BRAF). For the investigated targets, the optimized MIEC-SVM models performed much better than the models based on the default SVM parameters and Autodock on the tested datasets. Then, the proposed strategy was used to screen the Specs database for potential inhibitors of the ALK kinase. The experimental results showed that the optimized MIEC-SVM model, which identified 7 actives with IC50 < 10 μM out of 50 purchased compounds (a hit rate of 14%, with 4 at the nM level), performed much better than Autodock (3 actives with IC50 < 10 μM out of 50 purchased compounds, a hit rate of 6%, with 2 at the nM level), suggesting that the proposed strategy is a powerful tool in structure-based virtual screening. PMID:27102549
Joutsijoki, Henry; Haponen, Markus; Rasku, Jyrki; Aalto-Setälä, Katriina; Juhola, Martti
2016-01-01
The focus of this research is the automated identification of the quality of human induced pluripotent stem cell (iPSC) colony images. iPS cell technology is a contemporary method by which the patient's cells are reprogrammed back to stem cells and differentiated into any desired cell type. iPS cell technology will in future be used for patient-specific drug screening, disease modeling, and tissue repair, for instance. However, there are technical challenges before iPS cell technology can be used in practice, and one of them is the quality control of growing iPSC colonies, which is currently done manually but is an unfeasible solution for large-scale cultures. The monitoring problem reduces to an image analysis and classification problem. In this paper, we tackle this problem using machine learning methods such as multiclass Support Vector Machines and several baseline methods, together with Scale-Invariant Feature Transform (SIFT) based features. We perform over 80 test arrangements and do a thorough parameter value search. The best classification accuracy (62.4%) was obtained using a k-NN classifier, showing improved accuracy compared to earlier studies.
The applications of machine learning algorithms in the modeling of estrogen-like chemicals.
Liu, Huanxiang; Yao, Xiaojun; Gramatica, Paola
2009-06-01
Increasing concern is being shown by the scientific community, government regulators, and the public about endocrine-disrupting chemicals that, in the environment, are adversely affecting human and wildlife health through a variety of mechanisms, mainly estrogen receptor-mediated mechanisms of toxicity. Because of the large number of such chemicals in the environment, there is a great need for an effective means of rapidly assessing endocrine-disrupting activity in the toxicology assessment process. When faced with the challenging task of screening large libraries of molecules for biological activity, the benefits of computational predictive models based on quantitative structure-activity relationships to identify possible estrogens become immediately obvious. Recently, in order to improve the accuracy of prediction, some machine learning techniques were introduced to build more effective predictive models. In this review we will focus our attention on some recent advances in the use of these methods in modeling estrogen-like chemicals. The advantages and disadvantages of the machine learning algorithms used in solving this problem, the importance of the validation and performance assessment of the built models as well as their applicability domains will be discussed.
Cheminformatics in Drug Discovery, an Industrial Perspective.
Chen, Hongming; Kogej, Thierry; Engkvist, Ola
2018-05-18
Cheminformatics has established itself as a core discipline within large-scale drug discovery operations. It would be impossible to handle the amount of data generated today in a small-molecule drug discovery project without persons skilled in cheminformatics. In addition, due to the increased emphasis on "Big Data", machine learning, and artificial intelligence, not only in society in general but also in drug discovery, it is expected that the cheminformatics field will be even more important in the future. Traditional areas like virtual screening, library design, and high-throughput screening analysis are highlighted in this review. Applying machine learning in drug discovery is an area that has become very important. Applications of machine learning in early drug discovery have been extended from predicting ADME properties and target activity to tasks like de novo molecular design and prediction of chemical reactions. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Jamal, Salma; Scaria, Vinod
2013-11-19
Leishmaniasis is a neglected tropical disease which affects approximately 12 million individuals worldwide and is caused by the parasite Leishmania. The current drugs used in the treatment of leishmaniasis are highly toxic and have seen widespread emergence of drug-resistant strains, which necessitates the development of new therapeutic options. The available high-throughput screen data have made it possible to generate computational predictive models which can assess the active scaffolds in a chemical library ahead of evaluating their ADME/toxicity properties in biological trials. In the present study, we used publicly available high-throughput screen datasets of chemical moieties which have been adjudged to target the pyruvate kinase enzyme of L. mexicana (LmPK). A machine learning approach was used to create computational models capable of predicting the biological activity of novel antileishmanial compounds. Further, we evaluated the molecules using a substructure-based approach to identify the common substructures contributing to their activity. We generated computational models based on machine learning methods and evaluated their performance based on various statistical figures of merit. The random forest approach was determined to be the most sensitive, with better accuracy as well as ROC performance. We further added a substructure-based analysis of the molecules to identify potentially enriched substructures in the active dataset. We believe that the models developed in the present study would lead to a reduction in the cost and length of clinical studies, so that newer drugs would appear faster in the market, providing better healthcare options to patients.
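A minimal sketch of the modeling step, assuming binary fingerprint vectors X and HTS activity labels y, reporting the figures of merit the abstract highlights (sensitivity, accuracy, ROC AUC):

```python
# Train a random forest on HTS outcomes and evaluate it on a held-out split.
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, recall_score, roc_auc_score
from sklearn.model_selection import train_test_split

def evaluate_rf(X, y):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    rf = RandomForestClassifier(n_estimators=500, class_weight="balanced",
                                random_state=0).fit(X_tr, y_tr)
    prob = rf.predict_proba(X_te)[:, 1]      # probability of being active
    pred = rf.predict(X_te)
    return {"sensitivity": recall_score(y_te, pred),
            "accuracy": accuracy_score(y_te, pred),
            "roc_auc": roc_auc_score(y_te, prob)}
```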
Investigating the Link Between Radiologists' Gaze, Diagnostic Decision, and Image Content
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tourassi, Georgia; Voisin, Sophie; Paquit, Vincent C
2013-01-01
Objective: To investigate machine learning for linking image content, human perception, cognition, and error in the diagnostic interpretation of mammograms. Methods: Gaze data and diagnostic decisions were collected from six radiologists who reviewed 20 screening mammograms while wearing a head-mounted eye-tracker. Texture analysis was performed in mammographic regions that attracted the radiologists' attention and in all abnormal regions. Machine learning algorithms were investigated to develop predictive models that link: (i) image content with gaze, (ii) image content and gaze with cognition, and (iii) image content, gaze, and cognition with diagnostic error. Both group-based and individualized models were explored. Results: By pooling the data from all radiologists, machine learning produced highly accurate predictive models linking image content, gaze, cognition, and error. Merging radiologists' gaze metrics and cognitive opinions with computer-extracted image features identified 59% of the radiologists' diagnostic errors while confirming 96.2% of their correct diagnoses. The radiologists' individual errors could be adequately predicted by modeling the behavior of their peers. However, personalized tuning appears to be beneficial in many cases to capture individual behavior more accurately. Conclusions: Machine learning algorithms combining image features with radiologists' gaze data and diagnostic decisions can be effectively developed to recognize cognitive and perceptual errors associated with the diagnostic interpretation of mammograms.
Collaborative filtering on a family of biological targets.
Erhan, Dumitru; L'heureux, Pierre-Jean; Yue, Shi Yi; Bengio, Yoshua
2006-01-01
Building a QSAR model of a new biological target for which few screening data are available is a statistical challenge. However, the new target may be part of a bigger family for which we have more screening data. Collaborative filtering or, more generally, multi-task learning, is a machine learning approach that improves the generalization performance of an algorithm by using information from related tasks as an inductive bias. We use collaborative filtering techniques for building predictive models that link multiple targets to multiple examples: the more commonalities between the targets, the better the multi-target model that can be built. We show an example of a multi-target neural network that can use family information to produce a predictive model of an undersampled target, and we evaluate JRank, a kernel-based method designed for collaborative filtering. We show the performance of both methods on compound prioritization for an HTS campaign and examine the underlying shared representation between targets. JRank outperformed the neural network in both the single- and multi-target models.
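The multi-task idea can be sketched as follows (PyTorch; the architecture and the masking of unassayed compound-target pairs are illustrative assumptions, not the paper's exact network):

```python
# A shared hidden layer learns a representation common to the target family,
# with one output head per target; unassayed pairs are masked out of the loss
# so an undersampled target still benefits from its relatives.
import torch
import torch.nn as nn

class MultiTargetNet(nn.Module):
    def __init__(self, n_features, n_targets, hidden=128):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.heads = nn.Linear(hidden, n_targets)   # one activity per target

    def forward(self, x):
        return self.heads(self.shared(x))

def masked_mse(pred, y, mask):
    # mask[i, t] = 1 where compound i was actually assayed against target t
    return ((pred - y) ** 2 * mask).sum() / mask.sum()
```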
Kim, Taegu; Hong, Jungsik; Kang, Pilsung
2017-01-01
Accurate box office forecasting models are developed by considering competition and word-of-mouth (WOM) effects in addition to screening-related information. Nationality, genre, ratings, and distributors of motion pictures running concurrently with the target motion picture are used to describe the competition, whereas the numbers of informative, positive, and negative mentions posted on social network services (SNS) are used to gauge the atmosphere spread by WOM. Among these candidate variables, only significant variables are selected by genetic algorithm (GA), based on which machine learning algorithms are trained to build forecasting models. The forecasts are combined to improve forecasting performance. Experimental results on the Korean film market show that the forecasting accuracy in early screening periods can be significantly improved by considering competition. In addition, WOM has a stronger influence on total box office forecasting. Considering both competition and WOM improves forecasting performance to a larger extent than when only one of them is considered. PMID:28819355
Buhling, Kai J; Lezon, S; Eulenburg, C; Schmalfeldt, B
2017-05-01
The purpose of this study was to systematically analyze the effect of transvaginal ultrasonography in an asymptomatic female population as an annual screening procedure with regard to mortality data. Studies were evaluated descriptively on their strengths and weaknesses considering the methods and results. We evaluated 632 international studies by selecting only randomized controlled trials (RCTs). Three RCTs concerning transvaginal ultrasonography were found, performed in Japan, the USA, and Great Britain. Currently, no clear recommendation for the screening for ovarian cancer in an asymptomatic population can be given based on these three studies. The authors could not show a change in mortality using transvaginal ultrasonography for annual screening. An annual palpation does not offer a beneficial effect. The development of new ultrasound machines with higher image resolution in combination with a well-standardized algorithm for ovarian cancer in upcoming years might provide an improvement regarding mortality. The current studies do not show a benefit in screening an asymptomatic population annually with transvaginal ultrasonography, but the most recent publication showed a trend toward lower mortality in patients who underwent screening after 7-14 years of follow-up. Nevertheless, all three heterogeneous RCTs had weaknesses in their methods and therefore they neither contradict the general recommendation for screening in an asymptomatic population nor do they support it.
Recursive feature selection with significant variables of support vectors.
Tsai, Chen-An; Huang, Chien-Hsun; Chang, Ching-Wei; Chen, Chun-Houh
2012-01-01
The development of DNA microarrays allows researchers to screen thousands of genes simultaneously and helps determine high- and low-expression-level genes in normal and disease tissues. Selecting relevant genes for cancer classification is an important issue. Most gene selection methods use univariate ranking criteria and arbitrarily choose a threshold for selecting genes; however, the parameter setting may not be compatible with the chosen classification algorithm. In this paper, we propose a new gene selection method (SVM-t) based on t-statistics embedded in a support vector machine. We compared its performance to two similar SVM-based methods: SVM recursive feature elimination (SVMRFE) and recursive support vector machine (RSVM). The three methods were compared based on extensive simulation experiments and analyses of two published microarray datasets. In the simulation experiments, we found that the proposed method is more robust in selecting informative genes than SVMRFE and RSVM and capable of attaining good classification performance when the variations of informative and noninformative genes are different. In the analysis of the two microarray datasets, the proposed method yields better performance in identifying fewer genes with good prediction accuracy, compared to SVMRFE and RSVM.
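The SVMRFE baseline can be sketched with scikit-learn's RFE wrapper around a linear SVM (the SVM-t statistic itself is not reimplemented here); genes are eliminated recursively by the magnitude of the SVM weight vector:

```python
# Recursive feature elimination with a linear SVM: at each step, the genes
# with the smallest absolute weights in the fitted hyperplane are dropped.
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

def svm_rfe(X, y, n_genes=50):
    selector = RFE(SVC(kernel="linear", C=1.0),
                   n_features_to_select=n_genes,
                   step=0.1)                 # remove 10% of genes per round
    selector.fit(X, y)
    return selector.support_                # boolean mask of selected genes
```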
An EEG-based machine learning method to screen alcohol use disorder.
Mumtaz, Wajid; Vuong, Pham Lam; Xia, Likun; Malik, Aamir Saeed; Rashid, Rusdi Bin Abd
2017-04-01
Screening alcohol use disorder (AUD) patients has been challenging due to the subjectivity involved in the process. Hence, robust and objective methods are needed to automate the screening of AUD patients. In this paper, a machine learning method is proposed that utilizes resting-state electroencephalography (EEG)-derived features as input data to classify AUD patients and healthy controls and to perform automatic screening of AUD patients. In this context, the EEG data were recorded during 5 min of eyes-closed and 5 min of eyes-open conditions. For this purpose, 30 AUD patients and 15 age-matched healthy controls were recruited. After preprocessing the EEG data, EEG features such as inter-hemispheric coherences and spectral power for the EEG delta, theta, alpha, beta, and gamma bands were computed over 19 scalp locations. The most discriminant features were selected with a rank-based feature selection method that assigns a weight value to each feature according to a criterion, i.e., the receiver operating characteristic curve: a feature with a large weight was considered more relevant to the target labels than a feature with a small weight. A reduced set of the most discriminant features was thus identified and further utilized in the classification of AUD patients and healthy controls. As a result, the inter-hemispheric coherences between brain regions were found to differ significantly between the study groups and provided high classification efficiency (accuracy = 80.8%, sensitivity = 82.5%, specificity = 80%, F-measure = 0.78). In addition, the power computed in the different EEG bands was found significant and provided an overall classification efficiency of accuracy = 86.6%, sensitivity = 95%, specificity = 82.5%, and F-measure = 0.88. Further, the integration of these EEG features resulted in even higher performance (accuracy = 89.3%, sensitivity = 88.5%, specificity = 91%, and F-measure = 0.90). Based on these results, it is concluded that EEG data (the integration of theta, beta, and gamma power and inter-hemispheric coherence) could be utilized as objective markers to screen AUD patients and healthy controls.
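A minimal sketch of the rank-based selection described here: score each EEG feature by its single-feature ROC AUC against the class labels and keep the top k (the value of k is an assumption):

```python
# Direction-agnostic single-feature AUC ranking: an AUC of 0.2 is as
# discriminant as 0.8, so scores below 0.5 are folded onto the other side.
import numpy as np
from sklearn.metrics import roc_auc_score

def rank_features_by_auc(X, y, k=20):
    weights = []
    for j in range(X.shape[1]):
        auc = roc_auc_score(y, X[:, j])     # feature value used as a raw score
        weights.append(max(auc, 1.0 - auc))
    order = np.argsort(weights)[::-1]
    return order[:k]                        # indices of the k best features
```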
Prediction of Chemical Function: Model Development and ...
The United States Environmental Protection Agency's Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for the risk-based prioritization of chemicals for further evaluation. The functional role (e.g., solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration in which it is present, and therefore impacts exposure potential. However, critical chemical use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning-based models for classifying chemicals in terms of their likely functional roles in products based on structure were developed. This effort required collection, curation, and harmonization of publicly available data sources of chemical functional use information from government and industry bodies. Physicochemical and structure descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi
Mu, Lin
2018-01-01
This work introduces a number of algebraic topology approaches, including multi-component persistent homology, multi-level persistent homology, and electrostatic persistence for the representation, characterization, and description of small molecules and biomolecular complexes. In contrast to the conventional persistent homology, multi-component persistent homology retains critical chemical and biological information during the topological simplification of biomolecular geometric complexity. Multi-level persistent homology enables a tailored topological description of inter- and/or intra-molecular interactions of interest. Electrostatic persistence incorporates partial charge information into topological invariants. These topological methods are paired with Wasserstein distance to characterize similarities between molecules and are further integrated with a variety of machine learning algorithms, including k-nearest neighbors, ensemble of trees, and deep convolutional neural networks, to manifest their descriptive and predictive powers for protein-ligand binding analysis and virtual screening of small molecules. Extensive numerical experiments involving 4,414 protein-ligand complexes from the PDBBind database and 128,374 ligand-target and decoy-target pairs in the DUD database are performed to test respectively the scoring power and the discriminatory power of the proposed topological learning strategies. It is demonstrated that the present topological learning outperforms other existing methods in protein-ligand binding affinity prediction and ligand-decoy discrimination. PMID:29309403
Image processing and machine learning in the morphological analysis of blood cells.
Rodellar, J; Alférez, S; Acevedo, A; Molina, A; Merino, A
2018-05-01
This review focuses on how image processing and machine learning can be useful for the morphological characterization and automatic recognition of cell images captured from peripheral blood smears. The basics of the 3 core elements (segmentation, quantitative features, and classification) are outlined, and recent literature is discussed. Although red blood cells are a significant part of this context, this study focuses on malignant lymphoid cells and blast cells. There is no doubt that these technologies may help the cytologist to perform efficient, objective, and fast morphological analysis of blood cells. They may also help in the interpretation of some morphological features and may serve as learning and survey tools. Although research is still needed, it is important to define screening strategies to exploit the potential of image-based automatic recognition systems integrated in the daily routine of laboratories along with other analysis methodologies. © 2018 John Wiley & Sons Ltd.
Cell classification using big data analytics plus time stretch imaging (Conference Presentation)
NASA Astrophysics Data System (ADS)
Jalali, Bahram; Chen, Claire L.; Mahjoubfar, Ata
2016-09-01
We show that blood cells can be classified with high accuracy and high throughput by combining machine learning with time stretch quantitative phase imaging. Our diagnostic system captures quantitative phase images in a flow microscope at millions of frames per second and extracts multiple biophysical features from individual cells, including morphological characteristics, light absorption and scattering parameters, and protein concentration. These parameters form a hyperdimensional feature space in which supervised learning and cell classification is performed. We show binary classification of T-cells against colon cancer cells, as well as classification of algae cell strains with high and low lipid content. The label-free screening averts the negative impact of staining reagents on cellular viability or cell signaling. The combination of time stretch machine vision and learning offers unprecedented cell analysis capabilities for cancer diagnostics, drug development, and liquid biopsy for personalized genomics.
He, Qiwei; Veldkamp, Bernard P; Glas, Cees A W; de Vries, Theo
2017-03-01
Patients' narratives about traumatic experiences and symptoms are useful in clinical screening and diagnostic procedures. In this study, we presented an automated assessment system to screen patients for posttraumatic stress disorder via a natural language processing and text-mining approach. Four machine-learning algorithms (decision tree, naive Bayes, support vector machine, and an alternative classification approach called the product score model) were used in combination with n-gram representation models to identify patterns between verbal features in self-narratives and psychiatric diagnoses. With our sample, the product score model with unigrams attained the highest prediction accuracy when compared with practitioners' diagnoses. The addition of multigrams contributed most to balancing the metrics of sensitivity and specificity. This article also demonstrates that text mining is a promising approach for analyzing patients' self-expression behavior, thus helping clinicians identify potential patients from an early stage.
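One of the compared pipelines can be sketched as follows (unigram-plus-bigram counts feeding a naive Bayes classifier; the product score model itself is not shown):

```python
# n-gram bag-of-words text screening: count uni- and bigrams in each
# self-narrative and fit a multinomial naive Bayes classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

def train_text_screener(narratives, diagnoses):
    # narratives: list of self-narrative strings; diagnoses: 0/1 labels
    model = make_pipeline(
        CountVectorizer(ngram_range=(1, 2), min_df=2),  # unigrams + bigrams
        MultinomialNB(),
    )
    return model.fit(narratives, diagnoses)
```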
Self-propulsion and interactions of catalytic particles in a chemically active medium.
Banigan, Edward J; Marko, John F
2016-01-01
Enzymatic "machines," such as catalytic rods or colloids, can self-propel and interact by generating gradients of their substrates. We theoretically investigate the behaviors of such machines in a chemically active environment where their catalytic substrates are continuously synthesized and destroyed, as occurs in living cells. We show how the kinetic properties of the medium modulate self-propulsion and pairwise interactions between machines, with the latter controlled by a tunable characteristic interaction range analogous to the Debye screening length in an electrolytic solution. Finally, we discuss the effective force arising between interacting machines and possible biological applications, such as partitioning of bacterial plasmids.
Podlewska, Sabina; Czarnecki, Wojciech M; Kafel, Rafał; Bojarski, Andrzej J
2017-02-27
The growing computational abilities of various tools that are applied in the broadly understood field of computer-aided drug design have led to the extreme popularity of virtual screening in the search for new biologically active compounds. Most often, the source of such molecules consists of commercially available compound databases, but they can also be searched for within libraries of structures generated in silico from existing ligands. Various computational combinatorial approaches are based solely on the chemical structure of compounds, using different types of substitutions for the formation of new molecules. In this study, the starting point for combinatorial library generation was the fingerprint referring to the optimal substructural composition in terms of the activity toward a considered target, which was obtained using a machine learning-based optimization procedure. The systematic enumeration of all possible connections between preferred substructures resulted in the formation of target-focused libraries of new potential ligands. The compounds were initially assessed by machine learning methods using a hashed fingerprint to represent molecules; the distribution of their physicochemical properties was also investigated, as well as their synthetic accessibility. The examination of various fingerprints and machine learning algorithms indicated that the Klekota-Roth fingerprint and support vector machine were an optimal combination for such experiments. This study was performed for 8 protein targets, and the obtained compound sets and their characterization are publicly available at http://skandal.if-pan.krakow.pl/comb_lib/ .
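The fingerprint-plus-SVM screening step can be sketched as below; note that RDKit does not ship the Klekota-Roth fingerprint used in the study, so a Morgan fingerprint stands in purely for illustration:

```python
# Featurize SMILES as hashed bit-vector fingerprints, train an SVM on known
# actives/inactives, then score an enumerated combinatorial library.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.svm import SVC

def featurize(smiles_list, radius=2, n_bits=2048):
    fps = []
    for smi in smiles_list:
        mol = Chem.MolFromSmiles(smi)
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)
        fps.append(np.array(fp))
    return np.array(fps)

# clf = SVC(probability=True).fit(featurize(train_smiles), train_labels)
# scores = clf.predict_proba(featurize(library_smiles))[:, 1]  # triage ranking
```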
Li, Yubo; Wang, Lei; Ju, Liang; Deng, Haoyue; Zhang, Zhenzhu; Hou, Zhiguo; Xie, Jiabin; Wang, Yuming; Zhang, Yanjun
2016-04-01
Current studies that evaluate toxicity based on metabolomics have primarily focused on the screening of biomarkers while largely neglecting further verification and biomarker applications. For this reason, we used drug-induced hepatotoxicity as an example to establish a systematic strategy for screening specific biomarkers and applied these biomarkers to evaluate whether drugs have potential hepatotoxicity. Carbon tetrachloride (5 ml/kg), acetaminophen (1500 mg/kg), and atorvastatin (5 mg/kg) were used to establish rat hepatotoxicity models. Fifteen common biomarkers were screened by multivariate statistical analysis and integration analysis of the metabolomics data. The receiver operating characteristic curve was used to evaluate the sensitivity and specificity of the biomarkers, and we obtained 10 specific biomarker candidates with an area under the curve greater than 0.7. Then, a support vector machine model was established by extracting the specific biomarker candidate data from hepatotoxic and nonhepatotoxic drugs; the accuracy of the model was 94.90% (92.86% sensitivity and 92.59% specificity), and the results demonstrated that those ten biomarkers are specific. Six drugs were used to predict hepatotoxicity with the support vector machine model; the prediction results were consistent with the biochemical and histopathological results, demonstrating that the model was reliable. Thus, this support vector machine model can be applied to discriminate between the hepatotoxic and nonhepatotoxic effects of drugs. This approach not only presents a new strategy for screening specific biomarkers with greater diagnostic significance but also provides a new evaluation pattern for hepatotoxicity, and it will be a highly useful tool in toxicity estimation and disease diagnoses. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Lin, Chern-Sheng; Ho, Chien-Wa; Chang, Kai-Chieh; Hung, San-Shan; Shei, Hung-Jung; Yeh, Mau-Shiun
2006-06-01
This study describes the design and combination of an eye-controlled and a head-controlled human-machine interface system. The system is a highly effective human-machine interface that detects head movement from the changing positions and number of light sources on the head. When the user browses a computer screen through the head-mounted display, the system captures images of the user's eyes with CCD cameras, which can also measure the angle and position of the light sources. In the eye-tracking system, a computer program locates the center point of each pupil in the images and records information on movement traces and pupil diameters. In the head-gesture measurement system, the user wears a double-source eyeglass frame, and the system captures images of the user's head with a CCD camera placed in front of the user. The computer program locates the center point of the head and transfers it to screen coordinates, so that the user can control the cursor by head motions. We combine the eye-controlled and head-controlled human-machine interfaces for virtual reality applications.
NASA Astrophysics Data System (ADS)
Gur, David; Zheng, Bin; Lederman, Dror; Dhurjaty, Sreeram; Sumkin, Jules; Zuley, Margarita
2010-02-01
A new resonance-frequency based electronic impedance spectroscopy (REIS) system with multiple probes, including one central probe and six external probes designed to contact the breast skin in a circle with a radius of 60 millimeters around the central ("nipple") probe, has been assembled and installed in our breast imaging facility. We are conducting a prospective clinical study to test the performance of this REIS system in identifying younger women (< 50 years old) at higher risk of having or developing breast cancer. In this preliminary analysis, we selected a subset of 100 examinations. Among these, 50 examinations were recommended for a biopsy due to detection of a highly suspicious breast lesion and 50 were determined negative during mammography screening. The REIS output signal sweeps used to compute the initial features included both amplitude and phase information, representing differences between corresponding (matched) EIS signal values acquired from the left and right breasts. A genetic algorithm was applied to reduce the feature set and optimize a support vector machine (SVM) to classify the REIS examinations into "biopsy recommended" and "non-biopsy recommended" groups. Using the leave-one-case-out testing method, the classification performance as measured by the area under the receiver operating characteristic (ROC) curve was 0.816 ± 0.042. This pilot analysis suggests that the new multi-probe REIS system could potentially be used as a risk stratification tool to identify pre-screened young women who are at higher risk of having or developing breast cancer.
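A simplified sketch of GA-driven feature selection wrapped around an SVM (population size, rates, and plain cross-validation in place of leave-one-case-out are assumptions for brevity):

```python
# Evolve boolean feature masks; fitness is the cross-validated AUC of an SVM
# trained on the selected REIS features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def ga_select(X, y, n_gen=30, pop_size=20, p_mut=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    pop = rng.random((pop_size, n_feat)) < 0.5          # random feature masks

    def fitness(mask):
        if not mask.any():
            return 0.0
        return cross_val_score(SVC(), X[:, mask], y,
                               cv=5, scoring="roc_auc").mean()

    for _ in range(n_gen):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]   # keep best half
        kids = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)                    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n_feat) < p_mut              # bit-flip mutation
            kids.append(child)
        pop = np.vstack([parents, kids])

    return pop[np.argmax([fitness(m) for m in pop])]         # best feature mask
```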
Towards automatic pulmonary nodule management in lung cancer screening with deep learning
NASA Astrophysics Data System (ADS)
Ciompi, Francesco; Chung, Kaman; van Riel, Sarah J.; Setio, Arnaud Arindra Adiyoso; Gerke, Paul K.; Jacobs, Colin; Th. Scholten, Ernst; Schaefer-Prokop, Cornelia; Wille, Mathilde M. W.; Marchianò, Alfonso; Pastorino, Ugo; Prokop, Mathias; van Ginneken, Bram
2017-04-01
The introduction of lung cancer screening programs will produce an unprecedented amount of chest CT scans in the near future, which radiologists will have to read in order to decide on a patient follow-up strategy. According to the current guidelines, the workup of screen-detected nodules strongly relies on nodule size and nodule type. In this paper, we present a deep learning system based on multi-stream multi-scale convolutional networks, which automatically classifies all nodule types relevant for nodule workup. The system processes raw CT data containing a nodule without the need for any additional information such as nodule segmentation or nodule size and learns a representation of 3D data by analyzing an arbitrary number of 2D views of a given nodule. The deep learning system was trained with data from the Italian MILD screening trial and validated on an independent set of data from the Danish DLCST screening trial. We analyze the advantage of processing nodules at multiple scales with a multi-stream convolutional network architecture, and we show that the proposed deep learning system achieves performance at classifying nodule type that surpasses that of classical machine learning approaches and is within the inter-observer variability among four experienced human observers. PMID:28422152
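A minimal PyTorch sketch of the multi-stream, multi-scale idea (stream count, layer sizes, and the number of nodule types are illustrative, not the trained system):

```python
# Each stream sees a 2D nodule patch extracted at a different scale; the
# per-stream features are concatenated and fused for nodule-type classification.
import torch
import torch.nn as nn

class Stream(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 24, 5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(24, 48, 3), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x):                  # x: (B, 1, H, W) patch at one scale
        return self.net(x).flatten(1)      # (B, 48)

class MultiStreamNet(nn.Module):
    def __init__(self, n_scales=3, n_types=6):
        super().__init__()
        self.streams = nn.ModuleList(Stream() for _ in range(n_scales))
        self.classifier = nn.Linear(48 * n_scales, n_types)

    def forward(self, patches):            # list of per-scale tensors
        feats = [s(p) for s, p in zip(self.streams, patches)]
        return self.classifier(torch.cat(feats, dim=1))
```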
Cohen, Leeber; Mangers, Kristie; Grobman, William A; Platt, Lawrence D
2009-12-01
The purpose of this study was to determine the frequency with which 3 standard screening views of the fetal heart (4-chamber, left ventricular outflow tract [LVOT], and right ventricular outflow tract [RVOT]) can be obtained satisfactorily with the spatiotemporal image correlation (STIC) technique. A prospective study of 111 patients undergoing anatomic surveys at 18 to 22 weeks was performed. Two ultrasound machines with fetal cardiac settings were used. The best volume set that could be obtained from each patient during a 45-minute examination was graded by 2 sonologists with regard to whether the 4-chamber, LVOT, and RVOT images were satisfactory for screening. All 3 views were judged satisfactory for screening in most patients: 1 sonologist graded the views as satisfactory in 70% of the patients, whereas the other found the views to be satisfactory in 83%. The position of the placenta did not alter the probability of achieving a satisfactory view, but a fetus in the spine anterior position was associated with a significantly lower probability that the views were regarded as satisfactory for screening (odds ratio, 0.28; 95% confidence interval, 0.09-0.70; P < .05). This study suggests that STIC may assist with screening for cardiac anomalies at 18 to 22 weeks' gestation.
NASA Astrophysics Data System (ADS)
Kelouaz, Moussa; Ouazir, Youcef; Hadjout, Larbi; Mezani, Smail; Lubin, Thiery; Berger, Kévin; Lévêque, Jean
2018-05-01
In this paper a new superconducting inductor topology intended for a synchronous machine is presented. The studied machine has a standard 3-phase armature and a new kind of 2-pole inductor (claw-pole structure) excited by two coaxial superconducting coils. The air-gap spatial variation of the radial flux density is obtained by inserting a superconducting bulk, which deviates the magnetic field produced by the coils. The complex geometry of this inductor usually requires 3D finite element modeling (FEM) for its analysis. However, to avoid the long computational time inherent in 3D FEM, we propose an alternative model that uses a 3D meshed reluctance network. The results obtained with the developed model are compared to 3D FEM computations as well as to measurements carried out on a laboratory prototype. Finally, a 3D FEM study of the shielding properties of the superconducting screen demonstrates the suitability of using a diamagnetic-like model of the superconducting screen.
The effects of gray scale image processing on digital mammography interpretation performance.
Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita
2005-05-01
To determine the effects of three image-processing algorithms on the diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography. A total of 201 cases consisting of nonprocessed soft-copy versions of digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations, namely the manufacturer's default (applied and laser-printed to film by each of the manufacturers), MUSICA, and PLAHE, which were presented in soft-copy display. There were three radiologists per presentation. The area under the receiver operating characteristic curve for GE digital mass cases was worse than screen-film for all digital presentations. The area under the receiver operating characteristic curve for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than screen-film for all digital presentations. Specificity for Fischer digital calcification cases was worse than screen-film for images processed with the default and PLAHE algorithms. Specificity for Trex digital calcification cases was worse than screen-film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation based on machine and lesion type.
NASA Astrophysics Data System (ADS)
Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram
2016-04-01
Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
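The combination idea reduces to learning on a joint representation; a hedged sketch (the paper's exact combiner is not specified here, so a random forest stands in):

```python
# Append the image-derived CAD score to the clinical feature vector and train
# a single classifier on the joint representation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def fit_combined(cad_scores, clinical, labels):
    # cad_scores: (n,) CXR scores; clinical: (n, 12) record features
    X = np.column_stack([cad_scores, clinical])
    return RandomForestClassifier(n_estimators=300, random_state=0).fit(X, labels)
```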
NASA Astrophysics Data System (ADS)
Gmuender, T.
2017-02-01
Different chemical photo-reactive emulsions are used in screen printing for stencil production. Depending on the bandwidth, optical power, and depth of field of the optical system, the reaction/exposure speed varies. In this paper, the emulsions are first categorized and validated. A mathematical model is then developed and adapted, based on heuristic experience, to estimate the exposure speed under the influence of digitally modulated ultraviolet (UV) light. The main intention is to use the technical specifications in the emulsion data sheet (intended wavelength, exposure time, distance to the stencil, electrical power, stencil configuration), originally written down with an uncertainty factor for end users operating large projector arc lamps and photo films. These five parameters are the inputs of a mathematical formula whose output is the exposure speed of the Computer to Screen (CTS) machine, calculated for each emulsion/stencil setup. The importance of this work lies in the possibility of rating the performance and capacity of an exposure system used in screen printing from just a few boundary parameters, instead of processing a long test series for each emulsion/stencil configuration.
Detection of Pathological Voice Using Cepstrum Vectors: A Deep Learning Approach.
Fang, Shih-Hau; Tsao, Yu; Hsiao, Min-Jing; Chen, Ji-Ying; Lai, Ying-Hui; Lin, Feng-Chuan; Wang, Chi-Te
2018-03-19
Computerized detection of voice disorders has attracted considerable academic and clinical interest in the hope of providing an effective screening method for voice diseases before endoscopic confirmation. This study proposes a deep-learning-based approach to detect pathological voice and examines its performance and utility compared with other automatic classification algorithms. This study retrospectively collected 60 normal voice samples and 402 pathological voice samples of 8 common clinical voice disorders in a voice clinic of a tertiary teaching hospital. We extracted Mel frequency cepstral coefficients from 3-second samples of a sustained vowel. The performances of three machine learning algorithms, namely, deep neural network (DNN), support vector machine, and Gaussian mixture model, were evaluated based on a fivefold cross-validation. Collective cases from the voice disorder database of MEEI (Massachusetts Eye and Ear Infirmary) were used to verify the performance of the classification mechanisms. The experimental results demonstrated that the DNN outperforms the Gaussian mixture model and support vector machine. Its accuracy in detecting voice pathologies reached 94.26% and 90.52% in male and female subjects, based on three representative Mel frequency cepstral coefficient features. When applied to the MEEI database for validation, the DNN also achieved a higher accuracy (99.32%) than the other two classification algorithms. By stacking several layers of neurons with optimized weights, the proposed DNN algorithm can fully utilize the acoustic features and efficiently differentiate between normal and pathological voice samples. Based on this pilot study, future research may proceed to explore more applications of DNN from laboratory and clinical perspectives. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
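The front end and classifier can be sketched as follows (librosa for MFCC extraction; a small dense network stands in for the paper's DNN, and the layer sizes are assumptions):

```python
# Extract MFCCs from a 3-second sustained-vowel recording and average over
# frames to obtain a fixed-length feature vector per sample.
import librosa
import numpy as np
from sklearn.neural_network import MLPClassifier

def mfcc_features(wav_path, sr=16000, n_mfcc=13):
    y, _ = librosa.load(wav_path, sr=sr, duration=3.0)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)               # frame-averaged MFCC vector

# X = np.array([mfcc_features(p) for p in paths]); y = 0/1 pathology labels
# clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(X, y)
```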
Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi
2016-12-01
Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied to build the prediction models to deal with the imbalanced class dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflected the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between any pair of approaches. Small changes occurred in the AUCs after applying undersampling to build the models. We developed a virtual health check-up that predicted the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability. Undersampling did not remarkably improve predictive power. Copyright © 2016 Elsevier Inc. All rights reserved.
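The imbalance handling can be sketched as random undersampling of the majority class before fitting (scikit-learn's GradientBoostingClassifier stands in for the paper's GBDT):

```python
# Balance the classes by subsampling the negatives to match the positives,
# then fit a gradient-boosting model on the balanced subset.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def undersample_fit(X, y, seed=0):
    rng = np.random.default_rng(seed)
    pos = np.where(y == 1)[0]                         # hyperuricemia cases
    neg = rng.choice(np.where(y == 0)[0], size=len(pos), replace=False)
    idx = np.concatenate([pos, neg])
    return GradientBoostingClassifier().fit(X[idx], y[idx])
```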
The Visual Uncertainty Paradigm for Controlling Screen-Space Information in Visualization
ERIC Educational Resources Information Center
Dasgupta, Aritra
2012-01-01
The information visualization pipeline serves as a lossy communication channel for presentation of data on a screen-space of limited resolution. The lossy communication is not just a machine-only phenomenon due to information loss caused by translation of data, but also a reflection of the degree to which the human user can comprehend visual…
Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul Sf; Zhu, Tingshao
2015-01-01
Traditional offline assessment of suicide probability is time consuming and has difficulty convincing at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has an advantage in its efficiency and potential to reach out to hidden individuals, yet little research has been focused on this specific field. The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying high suicide possibility microblog users in China through profile and linguistic features extracted from Internet-based data. Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores in the participant sample, were labeled as high-risk individuals, respectively. Profile and linguistic features were fed into the two machine learning algorithms (SLR and RF) to train models that aim to identify high-risk individuals in general suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross validation, in which both the training set and test set were generated under the stratified random sampling rule from the whole sample. Three classic performance metrics (Precision, Recall, F1 measure) and a specifically defined metric, "Screening Efficiency", were adopted to evaluate model effectiveness. Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening Efficiency of most models varied from 1/4 to 1/2. Precision of the models was generally below 30%. Individuals in China with high suicide probability are recognizable by profile and text-based information from microblogs. Although there is still much room to improve the performance of the classification models, this study may shed light on preliminary screening of at-risk individuals via machine learning algorithms, which can work side-by-side with expert scrutiny to increase efficiency in large-scale surveillance of suicide probability from online social media. PMID:26543921
Tile-based Level of Detail for the Parallel Age
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niski, K; Cohen, J D
Today's PCs incorporate multiple CPUs and GPUs and are easily arranged in clusters for high-performance, interactive graphics. We present an approach to parallelizing rendering with level of detail, based on hierarchical screen-space tiles. Adapt tiles, render tiles, and machine tiles are associated with CPUs, GPUs, and PCs, respectively, to efficiently parallelize the workload with good resource utilization. Adaptive tile sizes provide load balancing, while our level of detail system allows total and independent management of the load on CPUs and GPUs. We demonstrate our approach on parallel configurations consisting of both single PCs and a cluster of PCs.
Automated Inference of Chemical Discriminants of Biological Activity.
Raschka, Sebastian; Scott, Anne M; Huertas, Mar; Li, Weiming; Kuhn, Leslie A
2018-01-01
Ligand-based virtual screening has become a standard technique for the efficient discovery of bioactive small molecules. Following assays to determine the activity of compounds selected by virtual screening, or other approaches in which dozens to thousands of molecules have been tested, machine learning techniques make it straightforward to discover the patterns of chemical groups that correlate with the desired biological activity. Defining the chemical features that generate activity can be used to guide the selection of molecules for subsequent rounds of screening and assaying, as well as help design new, more active molecules for organic synthesis. The quantitative structure-activity relationship machine learning protocols we describe here, using decision trees, random forests, and sequential feature selection, take as input the chemical structure of a single, known active small molecule (e.g., an inhibitor, agonist, or substrate) for comparison with the structure of each tested molecule. Knowledge of the atomic structure of the protein target and its interactions with the active compound are not required. These protocols can be modified and applied to any data set that consists of a series of measured structural, chemical, or other features for each tested molecule, along with the experimentally measured value of the response variable you would like to predict or optimize for your project, for instance, inhibitory activity in a biological assay or the ΔG of binding. To illustrate the use of different machine learning algorithms, we step through the analysis of a dataset of inhibitor candidates from virtual screening that were tested recently for their ability to inhibit GPCR-mediated signaling in a vertebrate.
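One such protocol can be sketched with scikit-learn: sequential forward selection of chemical descriptors wrapped around a random forest (descriptor counts and the scoring choice are placeholders, not the chapter's exact settings):

```python
# Greedily grow a descriptor subset, keeping at each step the descriptor that
# most improves the cross-validated score of the wrapped random forest.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector

def select_descriptors(X, y, n_keep=10):
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    sfs = SequentialFeatureSelector(rf, n_features_to_select=n_keep,
                                    direction="forward", cv=5)
    sfs.fit(X, y)
    return sfs.get_support()    # mask of descriptors that track activity
```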
Graph wavelet alignment kernels for drug virtual screening.
Smalter, Aaron; Huan, Jun; Lushington, Gerald
2009-06-01
In this paper, we introduce a novel statistical modeling technique for target property prediction, with applications to virtual screening and drug design. In our method, we use graphs to model chemical structures and apply a wavelet analysis of graphs to summarize features capturing graph local topology. We design a novel graph kernel function that utilizes the topology features to build predictive models for chemicals via a support vector machine classifier. We call the new graph kernel a graph wavelet-alignment kernel. We have evaluated the efficacy of the wavelet-alignment kernel using a set of chemical structure-activity prediction benchmarks. Our results indicate that the use of the kernel function yields performance profiles comparable to, and sometimes exceeding, those of the existing state-of-the-art chemical classification approaches. In addition, our results also show that the use of wavelet functions significantly decreases the computational costs for graph kernel computation, with a more than tenfold speedup.
Kernelized rank learning for personalized drug recommendation.
He, Xiao; Folkman, Lukas; Borgwardt, Karsten
2018-03-08
Large-scale screenings of cancer cell lines with detailed molecular profiles against libraries of pharmacological compounds are currently being performed in order to gain a better understanding of the genetic component of drug response and to enhance our ability to recommend therapies given a patient's molecular profile. These comprehensive screens differ from the clinical setting in which (1) medical records only contain the response of a patient to very few drugs, (2) drugs are recommended by doctors based on their expert judgment, and (3) selecting the most promising therapy is often more important than accurately predicting the sensitivity to all potential drugs. Current regression models for drug sensitivity prediction fail to account for these three properties. We present a machine learning approach, named Kernelized Rank Learning (KRL), that ranks drugs based on their predicted effect per cell line (patient), circumventing the difficult problem of precisely predicting the sensitivity to the given drug. Our approach outperforms several state-of-the-art predictors in drug recommendation, particularly if the training dataset is sparse, and generalizes to patient data. Our work phrases personalized drug recommendation as a new type of machine learning problem with translational potential to the clinic. The Python implementation of KRL and scripts for running our experiments are available at https://github.com/BorgwardtLab/Kernelized-Rank-Learning. xiao.he@bsse.ethz.ch, lukas.folkman@bsse.ethz.ch. Supplementary data are available at Bioinformatics online.
Role of Open Source Tools and Resources in Virtual Screening for Drug Discovery.
Karthikeyan, Muthukumarasamy; Vyas, Renu
2015-01-01
Advancements in chemoinformatics research, in parallel with the availability of high-performance computing platforms, have made the handling of large-scale, multi-dimensional scientific data for high-throughput drug discovery easier. In this study we have explored publicly available molecular databases with the help of open-source based, integrated in-house molecular informatics tools for virtual screening. The virtual screening literature of the past decade has been extensively investigated and thoroughly analyzed to reveal interesting patterns with respect to the drug, target, scaffold, and disease space. The review also focuses on the integrated chemoinformatics tools that are capable of harvesting chemical data from textual literature information and transforming it into truly computable chemical structures, identifying unique fragments and scaffolds from a class of compounds, automatically generating focused virtual libraries, computing molecular descriptors for structure-activity relationship studies, and applying conventional filters used in lead discovery along with in-house developed exhaustive PTC (Pharmacophore, Toxicophores and Chemophores) filters and machine learning tools for the design of potential disease-specific inhibitors. A case study on kinase inhibitors is provided as an example.
NASA Astrophysics Data System (ADS)
Li, Xiayue; Curtis, Farren S.; Rose, Timothy; Schober, Christoph; Vazquez-Mayagoitia, Alvaro; Reuter, Karsten; Oberhofer, Harald; Marom, Noa
2018-06-01
We present Genarris, a Python package that performs configuration space screening for molecular crystals of rigid molecules by random sampling with physical constraints. For fast energy evaluations, Genarris employs a Harris approximation, whereby the total density of a molecular crystal is constructed via superposition of single molecule densities. Dispersion-inclusive density functional theory is then used for the Harris density without performing a self-consistency cycle. Genarris uses machine learning for clustering, based on a relative coordinate descriptor developed specifically for molecular crystals, which is shown to be robust in identifying packing motif similarity. In addition to random structure generation, Genarris offers three workflows based on different sequences of successive clustering and selection steps: the "Rigorous" workflow is an exhaustive exploration of the potential energy landscape, the "Energy" workflow produces a set of low energy structures, and the "Diverse" workflow produces a maximally diverse set of structures. The latter is recommended for generating initial populations for genetic algorithms. Here, the implementation of Genarris is reported and its application is demonstrated for three test cases.
Mercury ion thruster research, 1978
NASA Technical Reports Server (NTRS)
Wilbur, P. J.
1978-01-01
The effects of 8 cm thruster main and neutralizer cathode operating conditions on cathode orifice plate temperatures were studied. The effects of cathode operating conditions on insert temperature profiles and keeper voltages are presented for three different types of inserts. The bulk of the emission current is generally observed to come from the downstream end of the insert rather than from the cathode orifice plate. Results of a test in which the screen grid plasma sheath of a thruster was probed as the beam current was varied are shown. Grid performance obtained with a grid machined from glass ceramic is discussed. The effects of copper and nitrogen impurities on the sputtering rates of thruster materials are measured experimentally and a model describing the rate of nitrogen chemisorption on materials in either the beam or the discharge chamber is presented. The results of optimization of a radial field thruster design are presented. Performance of this device is shown to be comparable to that of a divergent field thruster and efficient operation with the screen grid biased to floating potential, where its susceptibility to sputter erosion damage is reduced, is demonstrated.
Phan, Thanh Vân; Seoud, Lama; Chakor, Hadi; Cheriet, Farida
2016-01-01
Age-related macular degeneration (AMD) is a disease which causes visual impairment and irreversible blindness in the elderly. In this paper, an automatic classification method for AMD is proposed to perform robust and reproducible assessments in a telemedicine context. First, a study was carried out to highlight the most relevant features for AMD characterization based on texture, color, and visual context in fundus images. A support vector machine and a random forest were used to classify images according to the different AMD stages following the AREDS protocol and to evaluate the features' relevance. Experiments were conducted on a database of 279 fundus images coming from a telemedicine platform. The results demonstrate that local binary patterns in multiresolution are the most relevant for AMD classification, regardless of the classifier used. Depending on the classification task, our method achieves promising performances with areas under the ROC curve between 0.739 and 0.874 for screening and between 0.469 and 0.685 for grading. Moreover, the proposed automatic AMD classification system is robust with respect to image quality. PMID:27190636
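A minimal sketch of the feature type the study found most relevant, multiresolution local binary patterns fed to a random forest, follows; the images, labels, and radius/histogram settings are synthetic placeholders rather than the paper's configuration.

```python
# Sketch: multiresolution local binary patterns (LBP) as features for a
# random forest classifier. Images here are synthetic; with real fundus
# images the same pipeline applies.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def lbp_histogram(img, radii=(1, 2, 3)):
    feats = []
    for r in radii:                      # multiresolution: several radii
        P = 8 * r
        lbp = local_binary_pattern(img, P=P, R=r, method="uniform")
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
        feats.append(hist)
    return np.concatenate(feats)

imgs = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
y = rng.integers(0, 2, size=40)          # stand-in AMD stage labels
X = np.array([lbp_histogram(im) for im in imgs])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```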
INFIBRA: machine vision inspection of acrylic fiber production
NASA Astrophysics Data System (ADS)
Davies, Roger; Correia, Bento A. B.; Contreiras, Jose; Carvalho, Fernando D.
1998-10-01
This paper describes the implementation of INFIBRA, a machine vision system for the inspection of acrylic fiber production lines. The system was developed by INETI under a contract from Fisipe, Fibras Sinteticas de Portugal, S.A. At Fisipe there are ten production lines in continuous operation, each approximately 40 m in length. A team of operators used to perform periodic manual visual inspection of each line in conditions of high ambient temperature and humidity. It is not surprising that failures in the manual inspection process occurred with some frequency, with consequences that ranged from reduced fiber quality to production stoppages. The INFIBRA system architecture is a specialization of a generic, modular machine vision architecture based on a network of Personal Computers (PCs), each equipped with a low cost frame grabber. Each production line has a dedicated PC that performs automatic inspection, using specially designed metrology algorithms, via four video cameras located at key positions on the line. The cameras are mounted inside custom-built, hermetically sealed water-cooled housings to protect them from the unfriendly environment. The ten PCs, one for each production line, communicate with a central PC via a standard Ethernet connection. The operator controls all aspects of the inspection process, from configuration through to handling alarms, via a simple graphical interface on the central PC. At any time the operator can also view on the central PC's screen the live image from any one of the 40 cameras employed by the system.
From CBCL to DSM: A Comparison of Two Methods to Screen for DSM-IV Diagnoses Using CBCL Data
ERIC Educational Resources Information Center
Krol, Nicole P. C. M.; De Bruyn, Eric E. J.; Coolen, Jolanda C.; van Aarle, Edward J. M.
2006-01-01
The screening efficiency of 2 methods to convert Child Behavior Checklist (CBCL) assessment data into Diagnostic and Statistical Manual of Mental Disorders (4th ed. [DSM-IV]; American Psychiatric Association, 1994) diagnoses was compared. The Machine-Aided Diagnosis (MAD) method converts CBCL input data directly into DSM-IV symptom criteria. The…
Melo, Carlos Fernando Odir Rodrigues; Navarro, Luiz Claudio; de Oliveira, Diogo Noin; Guerreiro, Tatiane Melina; Lima, Estela de Oliveira; Delafiori, Jeany; Dabaja, Mohamed Ziad; Ribeiro, Marta da Silva; de Menezes, Maico; Rodrigues, Rafael Gustavo Martins; Morishita, Karen Noda; Esteves, Cibele Zanardi; de Amorim, Aline Lopes Lucas; Aoyagui, Caroline Tiemi; Parise, Pierina Lorencini; Milanez, Guilherme Paier; do Nascimento, Gabriela Mansano; Ribas Freitas, André Ricardo; Angerami, Rodrigo; Costa, Fábio Trindade Maranhão; Arns, Clarice Weis; Resende, Mariangela Ribeiro; Amaral, Eliana; Junior, Renato Passini; Ribeiro-do-Valle, Carolina C; Milanez, Helaine; Moretti, Maria Luiza; Proenca-Modena, Jose Luiz; Avila, Sandra; Rocha, Anderson; Catharino, Rodrigo Ramos
2018-01-01
Recent Zika outbreaks in South America, accompanied by unexpectedly severe clinical complications, have brought much interest in fast and reliable screening methods for ZIKV (Zika virus) identification. Reverse-transcriptase polymerase chain reaction (RT-PCR) is currently the method of choice to detect ZIKV in biological samples. This approach, nonetheless, demands a considerable amount of time and resources such as kits and reagents that, in endemic areas, may place a substantial financial burden on affected individuals and health services, steering them away from RT-PCR analysis. This study presents a powerful combination of high-resolution mass spectrometry and a machine-learning prediction model for data analysis to assess the existence of ZIKV infection across a series of patients who bear similar symptomatic conditions but are not necessarily infected with the disease. By feeding mass spectrometric data to the developed decision-making algorithm, we were able to provide a set of features that work as a "fingerprint" for this specific pathophysiological condition, even after the acute phase of infection. Since both mass spectrometry and machine learning approaches are well-established and largely utilized tools within their respective fields, this combination of methods emerges as a distinct alternative for clinical applications, providing a faster and more accurate diagnostic screening with improved cost-effectiveness when compared to existing technologies.
Crowdsourced validation of a machine-learning classification system for autism and ADHD.
Duda, M; Haber, N; Daniels, J; Wall, D P
2017-05-16
Autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD) together affect >10% of the children in the United States, but considerable behavioral overlaps between the two disorders can often complicate differential diagnosis. Currently, there is no screening test designed to differentiate between the two disorders, and with waiting times from initial suspicion to diagnosis upwards of a year, methods to quickly and accurately assess risk for these and other developmental disorders are desperately needed. In a previous study, we found that four machine-learning algorithms were able to accurately (area under the curve (AUC)>0.96) distinguish ASD from ADHD using only a small subset of items from the Social Responsiveness Scale (SRS). Here, we expand upon our prior work by including a novel crowdsourced data set of responses to our predefined top 15 SRS-derived questions from parents of children with ASD (n=248) or ADHD (n=174) to improve our model's capability to generalize to new, 'real-world' data. By mixing these novel survey data with our initial archival sample (n=3417) and performing repeated cross-validation with subsampling, we created a classification algorithm that performs with AUC=0.89±0.01 using only 15 questions.
Computer-automated dementia screening using a touch-tone telephone.
Mundt, J C; Ferber, K L; Rizzo, M; Greist, J H
2001-11-12
This study investigated the sensitivity and specificity of a computer-automated telephone system to evaluate cognitive impairment in elderly callers to identify signs of early dementia. The Clinical Dementia Rating Scale was used to assess 155 subjects aged 56 to 93 years (n = 74, 27, 42, and 12, with a Clinical Dementia Rating Scale score of 0, 0.5, 1, and 2, respectively). These subjects performed a battery of tests administered by an interactive voice response system using standard Touch-Tone telephones. Seventy-four collateral informants also completed an interactive voice response version of the Symptoms of Dementia Screener. Sixteen cognitively impaired subjects were unable to complete the telephone call. Performances on 6 of 8 tasks were significantly influenced by Clinical Dementia Rating Scale status. The mean (SD) call length was 12 minutes 27 seconds (2 minutes 32 seconds). A subsample (n = 116) was analyzed using machine-learning methods, producing a scoring algorithm that combined performances across 4 tasks. Results indicated a potential sensitivity of 82.0% and specificity of 85.5%. The scoring model generalized to a validation subsample (n = 39), producing 85.0% sensitivity and 78.9% specificity. The kappa agreement between predicted and actual group membership was 0.64 (P<.001). Of the 16 subjects unable to complete the call, 11 provided sufficient information to permit us to classify them as impaired. Standard scoring of the interactive voice response-administered Symptoms of Dementia Screener (completed by informants) produced a screening sensitivity of 63.5% and 100% specificity. A lower criterion found a 90.4% sensitivity, without lowering specificity. Computer-automated telephone screening for early dementia using either informant or direct assessment is feasible. Such systems could provide wide-scale, cost-effective screening, education, and referral services to patients and caregivers.
NASA Astrophysics Data System (ADS)
Wigdahl, J.; Agurto, C.; Murray, V.; Barriga, S.; Soliz, P.
2013-03-01
Diabetic retinopathy (DR) affects more than 4.4 million Americans age 40 and over. Automatic screening for DR has shown to be an efficient and cost-effective way to lower the burden on the healthcare system, by triaging diabetic patients and ensuring timely care for those presenting with DR. Several supervised algorithms have been developed to detect pathologies related to DR, but little work has been done in determining the size of the training set that optimizes an algorithm's performance. In this paper we analyze the effect of the training sample size on the performance of a top-down DR screening algorithm for different types of statistical classifiers. Results are based on partial least squares (PLS), support vector machines (SVM), k-nearest neighbor (kNN), and Naïve Bayes classifiers. Our dataset consisted of digital retinal images collected from a total of 745 cases (595 controls, 150 with DR). We varied the number of normal controls in the training set, while keeping the number of DR samples constant, and repeated the procedure 10 times using randomized training sets to avoid bias. Results show increasing performance in terms of area under the ROC curve (AUC) when the number of DR subjects in the training set increased, with similar trends for each of the classifiers. Of these, PLS and k-NN had the highest average AUC. Lower standard deviation and a flattening of the AUC curve gives evidence that there is a limit to the learning ability of the classifiers and an optimal number of cases to train on.
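The experimental design can be sketched as below: hold the positive cases fixed, vary the number of training controls, repeat over randomized subsets, and track test AUC; the data, the k-NN classifier choice, and the subset sizes are synthetic stand-ins, not the study's retinal features.

```python
# Sketch of a training-set-size study: vary the number of training controls
# while holding positives fixed, repeat with randomized subsets, track AUC.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
pos = rng.normal(0.6, 1.0, size=(150, 10))   # stand-in DR cases
neg = rng.normal(0.0, 1.0, size=(595, 10))   # stand-in controls

test_X = np.vstack([pos[100:], neg[400:]])
test_y = np.r_[np.ones(50), np.zeros(195)]

for n_controls in (50, 100, 200, 400):
    aucs = []
    for _ in range(10):                      # 10 randomized training sets
        idx = rng.choice(400, size=n_controls, replace=False)
        X = np.vstack([pos[:100], neg[idx]])
        y = np.r_[np.ones(100), np.zeros(n_controls)]
        clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
        aucs.append(roc_auc_score(test_y, clf.predict_proba(test_X)[:, 1]))
    print(n_controls, "controls -> AUC %.3f ± %.3f" % (np.mean(aucs), np.std(aucs)))
```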
Influence relevance voting: an accurate and interpretable virtual high throughput screening method.
Swamidass, S Joshua; Azencott, Chloé-Agathe; Lin, Ting-Wan; Gramajo, Hugo; Tsai, Shiou-Chuan; Baldi, Pierre
2009-04-01
Given activity training data from high-throughput screening (HTS) experiments, virtual high-throughput screening (vHTS) methods aim to predict in silico the activity of untested chemicals. We present a novel method, the Influence Relevance Voter (IRV), specifically tailored for the vHTS task. The IRV is a low-parameter neural network which refines a k-nearest neighbor classifier by nonlinearly combining the influences of a chemical's neighbors in the training set. Influences are decomposed, also nonlinearly, into a relevance component and a vote component. The IRV is benchmarked using the data and rules of two large, open competitions, and its performance compared to the performance of other participating methods, as well as of an in-house support vector machine (SVM) method. On these benchmark data sets, IRV achieves state-of-the-art results, comparable to the SVM in one case, and significantly better than the SVM in the other, retrieving three times as many actives in the top 1% of its prediction-sorted list. The IRV presents several other important advantages over SVMs and other methods: (1) the output predictions have probabilistic semantics; (2) the underlying inferences are interpretable; (3) the training time is very short, on the order of minutes even for very large data sets; (4) the risk of overfitting is minimal, due to the small number of free parameters; and (5) additional information can easily be incorporated into the IRV architecture. Combined with its performance, these qualities make the IRV particularly well suited for vHTS.
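A schematic toy version of the neighbor-influence idea follows: each test compound is scored by a learned combination of its k nearest training neighbors' similarity-weighted labels. The real IRV learns this combination with a small neural network and decomposes influence into relevance and vote components; the logistic-regression combiner, cosine similarity on random fingerprints, and k=10 below are all simplifying assumptions.

```python
# Toy neighbor-influence classifier, loosely inspired by the IRV idea:
# rank a compound's nearest training neighbors and learn how to weight
# their signed, similarity-scaled influences.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
X_train = rng.random((200, 32)) > 0.7        # toy binary fingerprints
y_train = rng.integers(0, 2, 200)
X_test = rng.random((50, 32)) > 0.7

def influence_features(X, X_ref, y_ref, k=10, exclude_self=False):
    """Signed, similarity-ranked influences of the k nearest neighbors."""
    S = cosine_similarity(X, X_ref)
    if exclude_self:                          # avoid trivial self-matches
        np.fill_diagonal(S, -1.0)
    feats = []
    for row in S:
        nn = np.argsort(row)[::-1][:k]        # k most similar training compounds
        feats.append(row[nn] * (2 * y_ref[nn] - 1))
    return np.array(feats)

F_train = influence_features(X_train, X_train, y_train, exclude_self=True)
F_test = influence_features(X_test, X_train, y_train)
clf = LogisticRegression().fit(F_train, y_train)
print(clf.predict_proba(F_test)[:5, 1])       # probabilistic outputs
```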
Cheng, Tiejun; Li, Qingliang; Wang, Yanli; Bryant, Stephen H
2011-02-28
Aqueous solubility is recognized as a critical parameter in both early- and late-stage drug discovery. Therefore, in silico modeling of solubility has attracted extensive interest in recent years. Most previous studies have been limited to relatively small data sets with limited diversity, which in turn limits the predictability of derived models. In this work, we present a support vector machines model for the binary classification of solubility by taking advantage of the largest known public data set that contains over 46 000 compounds with experimental solubility. Our model was optimized in combination with a reduction and recombination feature selection strategy. The best model demonstrated robust performance in both cross-validation and prediction of two independent test sets, indicating it could be a practical tool to select soluble compounds for screening, purchasing, and synthesizing. Moreover, our work may serve as a reference for the comparative evaluation of solubility classification studies, owing to its use of completely public resources.
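A minimal sketch of this kind of setup follows: binary solubility classification with an RBF SVM, validated by cross-validation and a held-out test set. The descriptor matrix is a random placeholder, and the paper's reduction-and-recombination feature selection is not reproduced.

```python
# Sketch: binary solubility classification (soluble / insoluble) with an SVM,
# validated by cross-validation and then scored on a held-out set. Descriptors
# are random placeholders standing in for computed molecular properties.
import numpy as np
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 50))               # stand-in molecular descriptors
y = (X[:, :5].sum(axis=1) + rng.normal(scale=1.0, size=2000)) > 0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(model, X_tr, y_tr, cv=5).mean())
print("held-out accuracy:", model.fit(X_tr, y_tr).score(X_te, y_te))
```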
NASA Astrophysics Data System (ADS)
Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen
2011-03-01
RNAi-based high-throughput microscopy screens have become an important tool in the biological sciences for decrypting the mostly unknown biological functions of human genes. However, manual analysis is impossible for such screens since the number of image data sets can often be in the hundreds of thousands. Reliable automated tools are thus required to analyse the fluorescence microscopy image data sets, which usually contain two or more reaction channels. The image analysis tool presented here is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels and the investigated cells can appear in different phenotypes. The main issue of the image processing task is an automatic cell segmentation which has to be robust and accurate for all different phenotypes, followed by phenotype classification. The cell segmentation is done in two steps: segmenting the cell nuclei first and then using a classifier-enhanced region growing on the basis of the cell nuclei to segment the cells. The classification of the cells is realized by a support vector machine which has to be trained manually using supervised learning. Furthermore, the tool is brightness invariant, allowing different staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied for an RNAi screen containing three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
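The two-step segmentation can be sketched as below: nuclei are segmented first, then used as seeds to partition the cell foreground. As a simplifying assumption, the paper's classifier-enhanced region growing is replaced by a plain seeded watershed on a synthetic two-cell image.

```python
# Sketch of two-step cell segmentation: segment nuclei first, then grow cell
# regions outward from the nuclei as seeds (here via a seeded watershed).
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed

# Synthetic image: two bright blobs (cells) with brighter cores (nuclei).
yy, xx = np.mgrid[0:100, 0:100]
cells = np.exp(-((yy - 30) ** 2 + (xx - 30) ** 2) / 400) \
      + np.exp(-((yy - 70) ** 2 + (xx - 65) ** 2) / 400)

nuclei = cells > 0.8                        # step 1: nuclei via threshold
seeds, _ = ndi.label(nuclei)                # one seed label per nucleus
cell_mask = cells > threshold_otsu(cells)   # foreground to be partitioned
labels = watershed(-cells, markers=seeds, mask=cell_mask)  # step 2
print("cells found:", labels.max())
```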
NASA Astrophysics Data System (ADS)
Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas
To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated early exudate (one visible sign of diabetic retinopathy) detection could help to reduce blindness among diabetic patients. Traditional automatic exudate detection methods are based on a specific parameter configuration, while the machine learning approaches, which seem more flexible, may be computationally costly. A comparative analysis of traditional and machine learning methods for exudate detection, namely mathematical morphology, fuzzy c-means clustering, naive Bayesian classifier, support vector machine and nearest neighbor classifier, is presented. Detected exudates are validated against expert ophthalmologists' hand-drawn ground truths. The sensitivity, specificity, precision, accuracy and time complexity of each method are also compared.
Das, Dev Kumar; Ghosh, Madhumala; Pal, Mallika; Maiti, Asok K; Chakraborty, Chandan
2013-02-01
The aim of this paper is to address the development of computer-assisted malaria parasite characterization and classification using a machine learning approach based on light microscopic images of peripheral blood smears. In doing this, microscopic image acquisition from stained slides, illumination correction and noise reduction, erythrocyte segmentation, feature extraction, feature selection and finally classification of different stages of malaria (Plasmodium vivax and Plasmodium falciparum) have been investigated. The erythrocytes are segmented using marker-controlled watershed transformation and subsequently a total of ninety-six features describing the shape-size and texture of erythrocytes are extracted with respect to parasitemia-infected versus non-infected cells. Ninety-four features are found to be statistically significant in discriminating the six classes. Here a feature selection-cum-classification scheme has been devised by combining the F-statistic with statistical learning techniques, i.e., Bayesian learning and support vector machine (SVM), in order to provide higher classification accuracy using the best set of discriminating features. Results show that the Bayesian approach provides the highest accuracy, i.e., 84%, for malaria classification by selecting the 19 most significant features, while SVM provides the highest accuracy, i.e., 83.5%, with the 9 most significant features. Finally, the performance of these two classifiers under the feature selection framework has been compared for malaria parasite classification. Copyright © 2012 Elsevier Ltd. All rights reserved.
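A compact sketch of such a feature selection-cum-classification scheme follows: rank features by F-statistic, keep the top k, and compare a Bayesian learner with an SVM under cross-validation. The synthetic features, and the k values of 19 and 9 echoing the abstract, are illustrative assumptions.

```python
# Sketch: F-statistic feature ranking followed by Bayesian vs. SVM
# classification. Synthetic features stand in for the 96 shape-size and
# texture descriptors.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 96))
y = rng.integers(0, 6, size=300)          # six stand-in infection-stage classes
X[:, :10] += y[:, None] * 0.8             # make ten features informative

for name, clf, k in [("Bayes", GaussianNB(), 19), ("SVM", SVC(), 9)]:
    pipe = make_pipeline(SelectKBest(f_classif, k=k), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name} with top-{k} features: {acc:.2f}")
```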
Mechanically Compliant Electronic Materials for Wearable Photovoltaics and Human-Machine Interfaces
NASA Astrophysics Data System (ADS)
O'Connor, Timothy Francis, III
Applications of stretchable electronic materials for human-machine interfaces are described herein. Intrinsically stretchable organic conjugated polymers and stretchable electronic composites were used to develop stretchable organic photovoltaics (OPVs), mechanically robust wearable OPVs, and human-machine interfaces for gesture recognition, American Sign Language translation, haptic control of robots, and touch emulation for virtual reality, augmented reality, and the transmission of touch. The stretchable and wearable OPVs comprise active layers of poly-3-alkylthiophene:phenyl-C61-butyric acid methyl ester (P3AT:PCBM) and transparent conductive electrodes of poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) (PEDOT:PSS); such devices could only be fabricated through a deep understanding of the connection between molecular structure and the co-engineering of electronic performance with mechanical resilience. The talk concludes with the use of composite piezoresistive sensors in two smart glove prototypes. The first integrates stretchable strain sensors comprising a carbon-elastomer composite, a wearable microcontroller, low-energy Bluetooth, and a 6-axis accelerometer/gyroscope to construct a fully functional gesture recognition glove capable of wirelessly translating American Sign Language to text on a cell phone screen. The second creates a system for the haptic control of a 3D-printed robot arm, as well as the transmission of touch and temperature information.
Araki, Tadashi; Ikeda, Nobutaka; Shukla, Devarshi; Jain, Pankaj K; Londhe, Narendra D; Shrivastava, Vimal K; Banchhor, Sumit K; Saba, Luca; Nicolaides, Andrew; Shafique, Shoaib; Laird, John R; Suri, Jasjit S
2016-05-01
Percutaneous coronary interventional procedures need advance planning prior to stenting or an endarterectomy. Cardiologists use intravascular ultrasound (IVUS) for screening, risk assessment and stratification of coronary artery disease (CAD). We hypothesize that plaque components are vulnerable to rupture due to plaque progression. Currently, there are no standard grayscale IVUS tools for risk assessment of plaque rupture. This paper presents a novel strategy for risk stratification based on plaque morphology embedded with principal component analysis (PCA) for plaque feature dimensionality reduction and a dominant feature selection technique. The risk assessment utilizes 56 grayscale coronary features in a machine learning framework while linking information from carotid and coronary plaque burdens due to their common genetic makeup. This system consists of a machine learning paradigm which uses a support vector machine (SVM) combined with PCA for optimal and dominant coronary artery morphological feature extraction. The carotid artery proven intima-media thickness (cIMT) biomarker is adopted as a gold standard during the training phase of the machine learning system. For the performance evaluation, a K-fold cross validation protocol is adopted with 20 trials per fold. For choosing the dominant features out of the 56 grayscale features, a polling strategy of PCA is adopted where the original value of the features is unaltered. Different protocols are designed for establishing the stability and reliability criteria of the coronary risk assessment system (cRAS). Using the PCA-based machine learning paradigm and cross-validation protocol, a classification accuracy of 98.43% (AUC 0.98) with K=10 folds using an SVM radial basis function (RBF) kernel was achieved. A reliability index of 97.32% and a machine learning stability criterion of 5% were met for the cRAS. This is the first computer-aided diagnosis (CADx) system of its kind that is able to demonstrate the ability of coronary risk assessment and stratification while demonstrating a successful design of the machine learning system based on our assumptions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
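The core pipeline can be sketched as follows: standardize the 56 grayscale features, reduce them with PCA, classify with an RBF SVM, and estimate accuracy with 10-fold cross-validation. The synthetic feature matrix, binary stand-in labels, and the choice of 10 principal components are assumptions for illustration.

```python
# Sketch: PCA dimensionality reduction + RBF-kernel SVM, evaluated with
# K=10 fold cross-validation. Labels are synthetic stand-ins for the
# cIMT-derived risk classes used as ground truth during training.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(220, 56))            # 56 stand-in grayscale plaque features
y = (X[:, :8].mean(axis=1) > 0).astype(int)

model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      SVC(kernel="rbf", gamma="scale"))
scores = cross_val_score(model, X, y, cv=10)
print("10-fold accuracy: %.3f ± %.3f" % (scores.mean(), scores.std()))
```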
Research on computer systems benchmarking
NASA Technical Reports Server (NTRS)
Smith, Alan Jay (Principal Investigator)
1996-01-01
This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance. The performance impact of optimization in the context of our methodology for CPU performance characterization was based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the afore-mentioned accomplishments are more specifically summarized in this report, as well as those smaller in magnitude supported by this grant.
DPubChem: a web tool for QSAR modeling and high-throughput virtual screening.
Soufan, Othman; Ba-Alawi, Wail; Magana-Mora, Arturo; Essack, Magbubah; Bajic, Vladimir B
2018-06-14
High-throughput screening (HTS) performs the experimental testing of a large number of chemical compounds aiming to identify those active in the considered assay. Alternatively, faster and cheaper methods of large-scale virtual screening are performed computationally through quantitative structure-activity relationship (QSAR) models. However, the vast amount of available HTS heterogeneous data and the imbalanced ratio of active to inactive compounds in an assay make this a challenging problem. Although different QSAR models have been proposed, they have certain limitations, e.g., high false positive rates, complicated user interfaces, and limited utilization options. Therefore, we developed DPubChem, a novel web tool for deriving QSAR models that implements state-of-the-art machine-learning techniques to enhance the precision of the models and enable efficient analyses of experiments from the PubChem BioAssay database. DPubChem also has a simple interface that provides various options to users. DPubChem predicted active compounds for 300 datasets with an average geometric mean and F1 score of 76.68% and 76.53%, respectively. Furthermore, DPubChem builds interaction networks that highlight novel predicted links between chemical compounds and biological assays. Using such a network, DPubChem successfully suggested a novel drug for the Niemann-Pick type C disease. DPubChem is freely available at www.cbrc.kaust.edu.sa/dpubchem.
Phoenito experiments: combining the strengths of commercial crystallization automation.
Newman, Janet; Pham, Tam M; Peat, Thomas S
2008-11-01
The use of crystallization robots for initial screening in macromolecular crystallization is well established. This paper describes how four general optimization techniques, growth-rate modulation, fine screening, seeding and additive screening, have been adapted for automation in a medium-throughput crystallization service facility. The use of automation for more challenging optimization experiments is discussed, as is a novel way of using both the Mosquito and the Phoenix nano-dispensing robots during the setup of a single crystallization plate. This dual-dispenser technique plays to the strengths of both machines.
A Framework to Guide the Assessment of Human-Machine Systems.
Stowers, Kimberly; Oglesby, James; Sonesh, Shirley; Leyva, Kevin; Iwig, Chelsea; Salas, Eduardo
2017-03-01
We have developed a framework for guiding measurement in human-machine systems. The assessment of safety and performance in human-machine systems often relies on direct measurement, such as tracking reaction time and accidents. However, safety and performance emerge from the combination of several variables. The assessment of precursors to safety and performance is thus an important part of predicting and improving outcomes in human-machine systems. As part of an in-depth literature analysis involving peer-reviewed, empirical articles, we located and classified variables important to human-machine systems, giving a snapshot of the state of science on human-machine system safety and performance. Using this information, we created a framework of safety and performance in human-machine systems. This framework details several inputs and processes that collectively influence safety and performance. Inputs are divided according to human, machine, and environmental inputs. Processes are divided into attitudes, behaviors, and cognitive variables. Each class of inputs influences the processes and, subsequently, outcomes that emerge in human-machine systems. This framework offers a useful starting point for understanding the current state of the science and measuring many of the complex variables relating to safety and performance in human-machine systems. This framework can be applied to the design, development, and implementation of automated machines in spaceflight, military, and health care settings. We present a hypothetical example of how the framework can be used to aid in project success.
Campos, Fernanda Magalhães Freire; Repoles, Laura Cotta; de Araújo, Fernanda Fortes; Peruhype-Magalhães, Vanessa; Xavier, Marcelo Antônio Pascoal; Sabino, Ester Cerdeira; de Freitas Carneiro Proietti, Anna Bárbara; Andrade, Mariléia Chaves; Teixeira-Carvalho, Andréa; Martins-Filho, Olindo Assis; Gontijo, Célia Maria Ferreira
2018-04-01
A relevant issue in Chagas disease serological diagnosis concerns the requirement of several confirmatory methods to elucidate the status of non-negative results from blood bank screening. The development of a single reliable method may potentially contribute to distinguishing true and false positive results. Our aim was to evaluate the performance of the multiplexed flow-cytometry anti-T. cruzi/Leishmania IgG1 serology (FC-TRIPLEX Chagas/Leish IgG1) against three conventional confirmatory criteria (ELISA-EIA, immunofluorescence assay-IIF and EIA/IIF consensus criterion) to define the final status of samples with actual/previous non-negative results during anti-T. cruzi ELISA screening in blood banks. Apart from inconclusive results, the FC-TRIPLEX presented a weak agreement index with EIA, while strong agreement was observed when either IIF or EIA/IIF consensus criteria were applied. Discriminant analysis and Spearman's correlation further corroborate the agreement scores. ROC curve analysis showed that FC-TRIPLEX performance indexes were higher when IIF and EIA/IIF consensus were used as confirmatory criteria. Logistic regression analysis further demonstrated that the probability of FC-TRIPLEX yielding positive results was higher for inconclusive results from IIF and EIA/IIF consensus. Machine learning tools illustrated the high level of categorical agreement between FC-TRIPLEX versus IIF or EIA/IIF consensus. Together, these findings demonstrate the usefulness of FC-TRIPLEX as a tool to elucidate the status of non-negative results in blood bank screening for Chagas disease. Copyright © 2018. Published by Elsevier B.V.
Workstations take over conceptual design
NASA Technical Reports Server (NTRS)
Kidwell, George H.
1987-01-01
Workstations provide sufficient computing memory and speed for early evaluations of aircraft design alternatives to identify those worthy of further study. It is recommended that the programming of such machines permit integrated calculations of the configuration and performance analysis of new concepts, along with the capability of changing up to 100 variables at a time and swiftly viewing the results. Computations can be augmented through links to mainframes and supercomputers. Programming, particularly debugging, is enhanced by the capability of working with one program line at a time and having on-screen error indices available. Workstation networks permit on-line communication among users and with persons and computers outside the facility. Application of these capabilities is illustrated through a description of NASA-Ames design efforts, performed on a MicroVAX network, for an oblique wing for a jet.
NASA Technical Reports Server (NTRS)
Cohen, P. H.
1982-01-01
Metal cutting is a unique deformation process characterized by large strains, exceptionally high strain rates and few constraints to the deformation. These factors, along with the difficulty of directly measuring the shear angle, make chip formation difficult to model and understand. One technique for skirting the difficulty of post mortem chip measurement is to perform a cutting experiment dynamically in a scanning electron microscope. Performing the in-situ experiment with full instrumentation allows the component force measurements, an orientation measurement (on a round single-crystal disk) and a timing signal to be superimposed below the deformation on the TV monitor and recorded for future viewing. This allows the shear angle to be directly measured from the screen along with the other needed information.
Man/Machine Interaction Dynamics And Performance (MMIDAP) capability
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
The creation of an ability to study interaction dynamics between a machine and its human operator can be approached from a myriad of directions. The Man/Machine Interaction Dynamics and Performance (MMIDAP) project seeks to create an ability to study the consequences of machine design alternatives relative to the performance of both machine and operator. The class of machines to which this study is directed includes those that require the intelligent physical exertions of a human operator. While Goddard's Flight Telerobotics program was expected to be a major user, basic engineering design and biomedical applications reach far beyond telerobotics. Ongoing efforts of the GSFC and its university and small business collaborators are outlined to integrate both human performance and musculoskeletal databases with the analysis capabilities necessary to enable the study of dynamic actions, reactions, and performance of coupled machine/operator systems.
Machine characterization based on an abstract high-level language machine
NASA Technical Reports Server (NTRS)
Saavedra-Barrera, Rafael H.; Smith, Alan Jay; Miya, Eugene
1989-01-01
Measurements are presented for a large number of machines ranging from small workstations to supercomputers. The authors combine these measurements into groups of parameters which relate to specific aspects of the machine implementation, and use these groups to provide overall machine characterizations. The authors also define the concept of pershapes, which represent the level of performance of a machine for different types of computation. A metric based on pershapes is introduced that provides a quantitative way of measuring how similar two machines are in terms of their performance distributions. The metric is related to the extent to which pairs of machines have varying relative performance levels depending on which benchmark is used.
NASA Astrophysics Data System (ADS)
Olivares-Amaya, Roberto; Hachmann, Johannes; Amador-Bedolla, Carlos; Daly, Aidan; Jinich, Adrian; Atahan-Evrenk, Sule; Boixo, Sergio; Aspuru-Guzik, Alán
2012-02-01
Organic photovoltaic devices have emerged as competitors to silicon-based solar cells, currently reaching efficiencies of over 9% and offering desirable properties for manufacturing and installation. We study conjugated donor polymers for high-efficiency bulk-heterojunction photovoltaic devices with a molecular library motivated by experimental feasibility. We use quantum mechanics and a distributed computing approach to explore this vast molecular space. We will detail the screening approach starting from the generation of the molecular library, which can be easily extended to other kinds of molecular systems. We will describe the screening method for these materials which ranges from descriptor models, ubiquitous in the drug discovery community, to eventually reaching first principles quantum chemistry methods. We will present results on the statistical analysis, based principally on machine learning, specifically partial least squares and Gaussian processes. Alongside, clustering methods and the use of the hypergeometric distribution reveal moieties important for the donor materials and allow us to quantify structure-property relationships. These efforts enable us to accelerate materials discovery in organic photovoltaics through our collaboration with experimental groups.
NASA Astrophysics Data System (ADS)
Leighs, J. A.; Halling-Brown, M. D.; Patel, M. N.
2018-03-01
The UK currently has a national breast cancer-screening program and images are routinely collected from a number of screening sites, representing a wealth of invaluable data that is currently under-used. Radiologists evaluate screening images manually and recall suspicious cases for further analysis such as biopsy. Histological testing of biopsy samples confirms the malignancy of the tumour, along with other diagnostic and prognostic characteristics such as disease grade. Machine learning is becoming increasingly popular for clinical image classification problems, as it is capable of discovering patterns in data otherwise invisible. This is particularly true when applied to medical imaging features; however clinical datasets are often relatively small. A texture feature extraction toolkit has been developed to mine a wide range of features from medical images such as mammograms. This study analysed a dataset of 1,366 radiologist-marked, biopsy-proven malignant lesions obtained from the OPTIMAM Medical Image Database (OMI-DB). Exploratory data analysis methods were employed to better understand extracted features. Machine learning techniques including Classification and Regression Trees (CART), ensemble methods (e.g. random forests), and logistic regression were applied to the data to predict the disease grade of the analysed lesions. Prediction scores of up to 83% were achieved; sensitivity and specificity of the models trained have been discussed to put the results into a clinical context. The results show promise in the ability to predict prognostic indicators from the texture features extracted and thus enable prioritisation of care for patients at greatest risk.
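A minimal sketch of this kind of evaluation follows: a random forest predicts a binarized grade from texture features, and sensitivity and specificity are read off the confusion matrix to put accuracy into clinical context. The feature matrix and the high/low grade split are synthetic assumptions, not OMI-DB data.

```python
# Sketch: predict a binarized lesion grade from texture features with a
# random forest, then report sensitivity and specificity alongside accuracy.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(1366, 40))            # stand-in texture features per lesion
y = (X[:, :4].sum(axis=1) + rng.normal(size=1366)) > 0   # stand-in high-grade flag

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```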
Low cost automated whole smear microscopy screening system for detection of acid fast bacilli.
Law, Yan Nei; Jian, Hanbin; Lo, Norman W S; Ip, Margaret; Chan, Mia Mei Yuk; Kam, Kai Man; Wu, Xiaohua
2018-01-01
In countries with a high tuberculosis (TB) burden, there is an urgent need for rapid, large-scale screening to detect smear-positive patients. We developed a computer-aided whole smear screening system that focuses in real time, captures images and provides diagnostic grading, for both bright-field and fluorescence microscopy, for detection of acid-fast bacilli (AFB) from respiratory specimens. The goal was to evaluate the performance of the dual-mode screening system in AFB diagnostic algorithms on concentrated smears with auramine O (AO) staining, as well as direct smears with AO and Ziehl-Neelsen (ZN) staining, using mycobacterial culture results as the gold standard. Adult patient sputum samples submitted for M. tuberculosis culture were divided into three batches for staining: direct AO-stained, direct ZN-stained and concentrated AO-stained smears. All slides were graded by an experienced microscopist, in parallel with the automated whole smear screening system. The sensitivity and specificity of a TB diagnostic algorithm using the screening system alone, and in combination with a microscopist, were evaluated. Of 488 direct AO-stained smears, 228 were culture positive. These yielded a sensitivity of 81.6% and specificity of 74.2%. Of 334 direct smears with ZN staining, 142 were culture positive, which gave a sensitivity of 70.4% and specificity of 76.6%. Of 505 concentrated smears with AO staining, 250 were culture positive, giving a sensitivity of 86.4% and specificity of 71.0%. To further improve performance, machine grading was confirmed by manual smear grading when the number of AFBs detected fell within an uncertainty range. These combined results gave a significant improvement in specificity (AO-direct: 85.4%; ZN-direct: 85.4%; AO-concentrated: 92.5%) and a slight improvement in sensitivity while requiring only a limited manual workload. Our system achieved high sensitivity without substantially compromising specificity when compared to culture results. A significant improvement in specificity was obtained when uncertain results were confirmed by manual smear grading. This approach has the potential to substantially reduce the workload of microscopists in high-burden countries.
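The machine-plus-manual rule can be sketched as a simple deferral policy: accept clearly negative or clearly positive automated counts and refer the uncertainty band to a human reader. The band limits below are made-up illustrations, not the study's calibrated thresholds.

```python
# Sketch of a machine + manual grading rule: accept the automated AFB count
# when it is clearly negative or clearly positive, and refer slides whose
# counts fall in an uncertainty band to a human microscopist.
def grade_smear(afb_count, low=2, high=9):
    if afb_count < low:
        return "negative (automated)"
    if afb_count > high:
        return "positive (automated)"
    return "refer to microscopist"

for count in (0, 5, 40):
    print(count, "->", grade_smear(count))
```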
Fused man-machine classification schemes to enhance diagnosis of breast microcalcifications
NASA Astrophysics Data System (ADS)
Andreadis, Ioannis; Chatzistergos, Sevastianos; Spyrou, George; Nikita, Konstantina
2017-11-01
Computer-aided diagnosis (CADx) approaches are developed towards the effective discrimination between benign and malignant clusters of microcalcifications. Different sources of information are exploited, such as features extracted from the image analysis of the region of interest, features related to the location of the cluster inside the breast, age of the patient, and descriptors provided by the radiologists while performing their diagnostic task. A series of different CADx schemes are implemented, each of which uses a different category of features and adopts a variety of machine learning algorithms and alternative image processing techniques. A novel framework is introduced where these independent diagnostic components are properly combined according to features critical to a radiologist in an attempt to identify the most appropriate CADx schemes for the case under consideration. An open-access database (the Digital Database for Screening Mammography (DDSM)) has been elaborated to construct a large dataset with cases of varying subtlety, in order to ensure the development of schemes with high generalization ability, as well as extensive evaluation of their performance. The obtained results indicate that the proposed framework succeeds in improving the diagnostic procedure, as the achieved overall classification performance outperforms all the independent single diagnostic components, as well as the radiologists who assessed the same cases, in terms of accuracy, sensitivity, specificity and area under the curve following receiver operating characteristic analysis.
Yun, Ruijuan; Lin, Chung-Chih; Wu, Shuicai; Huang, Chu-Chung; Lin, Ching-Po; Chao, Yi-Ping
2013-01-01
In this study, we employed diffusion tensor imaging (DTI) to construct the brain structural network and then derived the connection matrices from 96 healthy elderly subjects. Correlation analysis between the graph-theoretic topological properties of the network and the Cognitive Abilities Screening Instrument (CASI) index was performed to extract the significant network characteristics. These characteristics were then integrated to estimate models using various machine-learning algorithms to predict a user's cognitive performance. From the results, the linear regression model and the Gaussian process model presented better ability to predict cognitive performance, with lower mean absolute errors of 5.8120 and 6.25, respectively. Moreover, these extracted topological properties of the brain structural network derived from DTI could also be regarded as bio-signatures for further evaluation of brain degeneration in healthy aging and early diagnosis of mild cognitive impairment (MCI).
Improving brain-machine interface performance by decoding intended future movements
NASA Astrophysics Data System (ADS)
Willett, Francis R.; Suminski, Aaron J.; Fagg, Andrew H.; Hatsopoulos, Nicholas G.
2013-04-01
Objective. A brain-machine interface (BMI) records neural signals in real time from a subject's brain, interprets them as motor commands, and reroutes them to a device such as a robotic arm, so as to restore lost motor function. Our objective here is to improve BMI performance by minimizing the deleterious effects of delay in the BMI control loop. We mitigate the effects of delay by decoding the subject's intended movements a short lead time into the future. Approach. We use the decoded, intended future movements of the subject as the control signal that drives the movement of our BMI. This should allow the user's intended trajectory to be implemented more quickly by the BMI, reducing the amount of delay in the system. In our experiment, a monkey (Macaca mulatta) uses a future prediction BMI to control a simulated arm to hit targets on a screen. Main Results. Results from experiments with BMIs possessing different system delays (100, 200 and 300 ms) show that the monkey can make significantly straighter, faster and smoother movements when the decoder predicts the user's future intent. We also characterize how BMI performance changes as a function of delay, and explore offline how the accuracy of future prediction decoders varies at different time leads. Significance. This study is the first to characterize the effects of control delays in a BMI and to show that decoding the user's future intent can compensate for the negative effect of control delay on BMI performance.
Cui, De-Mi; Yan, Weizhong; Wang, Xiao-Quan; Lu, Lie-Min
2017-10-25
Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed only by experienced experts. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can finish interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening LSPIT's turnaround time, are of great business value and greatly needed. Motivated by this need, in this paper we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist the experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease the experts' interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology's effectiveness using the LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in the quality control of deep foundation construction.
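The screening idea behind CARI can be sketched as below: extract simple signal features from each reflectogram and use a classifier to flag a small subset of suspect piles for expert review. The synthetic reflectograms and feature choices are illustrative, not the paper's actual signal-processing pipeline.

```python
# Sketch: screen reflectograms with simple signal features and an SVM,
# flagging suspect piles for expert review. Defective piles are simulated
# with an extra early reflection on a decaying oscillation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256)

def features(signal):
    spectrum = np.abs(np.fft.rfft(signal))
    return [signal.max(), signal.std(),
            spectrum.argmax(),            # dominant reflection frequency bin
            spectrum[1:].sum()]

def pile(defect):
    sig = np.exp(-5 * t) * np.sin(40 * t)
    if defect:
        sig += 0.5 * np.exp(-30 * (t - 0.3) ** 2)   # spurious early reflection
    return sig + rng.normal(scale=0.05, size=t.size)

y = rng.integers(0, 2, 120)
X = np.array([features(pile(d)) for d in y])
clf = SVC(probability=True).fit(X, y)
suspects = np.where(clf.predict_proba(X)[:, 1] > 0.5)[0]
print(f"{suspects.size} of {y.size} piles flagged for expert review")
```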
Relative Performance of Hardwood Sawing Machines
Philip H. Steele; Michael W. Wade; Steven H. Bullard; Philip A. Araman
1991-01-01
Only limited information has been available to hardwood sawmillers on the performance of their sawing machines. This study analyzes a large database of individual machine studies to provide detailed information on 6 machine types. These machine types were band headrig, circular headrig, band linebar resaw, vertical band splitter resaw, single arbor gang resaw and...
ZebraZoom: an automated program for high-throughput behavioral analysis and categorization
Mirat, Olivier; Sternberg, Jenna R.; Severi, Kristen E.; Wyart, Claire
2013-01-01
The zebrafish larva stands out as an emergent model organism for translational studies involving gene or drug screening thanks to its size, genetics, and permeability. At the larval stage, locomotion occurs in short episodes punctuated by periods of rest. Although phenotyping behavior is a key component of large-scale screens, it has not yet been automated in this model system. We developed ZebraZoom, a program to automatically track larvae and identify maneuvers for many animals performing discrete movements. Our program detects each episodic movement and extracts large-scale statistics on motor patterns to produce a quantification of the locomotor repertoire. We used ZebraZoom to identify motor defects induced by a glycinergic receptor antagonist. The analysis of the blind mutant atoh7 revealed small locomotor defects associated with the mutation. Using multiclass supervised machine learning, ZebraZoom categorized all episodes of movement for each larva into one of three possible maneuvers: slow forward swim, routine turn, and escape. ZebraZoom reached 91% accuracy for categorization of stereotypical maneuvers that four independent experimenters unanimously identified. For all maneuvers in the data set, ZebraZoom agreed with four experimenters in 73.2–82.5% of cases. We modeled the series of maneuvers performed by larvae as Markov chains and observed that larvae often repeated the same maneuvers within a group. When analyzing subsequent maneuvers performed by different larvae, we found that larva–larva interactions occurred as series of escapes. Overall, ZebraZoom reached the level of precision found in manual analysis but accomplished tasks in a high-throughput format necessary for large screens. PMID:23781175
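The Markov-chain analysis of maneuver sequences can be illustrated with a short sketch that estimates a row-stochastic transition matrix from a categorized episode sequence; the example sequence is invented, and heavy diagonal mass would correspond to the reported tendency of larvae to repeat maneuvers.

```python
# Sketch: estimate a transition matrix over the three maneuver classes from
# a sequence of categorized movement episodes.
import numpy as np

maneuvers = ["slow", "turn", "escape"]
seq = ["slow", "slow", "turn", "slow", "escape", "escape", "turn", "slow"]

idx = {m: i for i, m in enumerate(maneuvers)}
counts = np.zeros((3, 3))
for a, b in zip(seq[:-1], seq[1:]):
    counts[idx[a], idx[b]] += 1

P = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic transitions
print(P)  # high diagonal mass indicates repeated maneuvers
```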
Resource Sharing in a Network of Personal Computers.
1982-12-01
magnetic card, or a more secure identifier such as a machine-read fingerprint or voiceprint. Security and Protection 57 (3) (R, key) (5) (RB’ B, key) (B...operations are invoked via messages, a program and its terminal can easily be located on separate machines. In Spice, an interface process called Canvas ...request of a process. In Canvas , a process can only subdivide windows that it already has. On the other hand, the window manager treats the screen as a
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegeneration in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications with decreased costs.
Can laptops be left inside passenger bags if motion imaging is used in X-ray security screening?
Mendes, Marcia; Schwaninger, Adrian; Michel, Stefan
2013-01-01
This paper describes a study where a new X-ray machine for security screening featuring motion imaging (i.e., 5 views of a bag are shown as an image sequence) was evaluated and compared to single view imaging available on conventional X-ray screening systems. More specifically, it was investigated whether with this new technology X-ray screening of passenger bags could be enhanced to such an extent that laptops could be left inside passenger bags, without causing a significant impairment in threat detection performance. An X-ray image interpretation test was created in four different versions, manipulating the factors packing condition (laptop and bag separate vs. laptop in bag) and display condition (single vs. motion imaging). There was a highly significant and large main effect of packing condition. When laptops and bags were screened separately, threat item detection was substantially higher. For display condition, a medium effect was observed. Detection could be slightly enhanced through the application of motion imaging. There was no interaction between display and packing condition, implying that the high negative effect of leaving laptops in passenger bags could not be fully compensated by motion imaging. Additional analyses were carried out to examine effects depending on different threat categories (guns, improvised explosive devices, knives, others), the placement of the threat items (in bag vs. in laptop) and viewpoint (easy vs. difficult view). In summary, although motion imaging provides an enhancement, it is not strong enough to allow leaving laptops in bags for security screening. PMID:24151457
Mobility Lab to Assess Balance and Gait with Synchronized Body-worn Sensors
Mancini, Martina; King, Laurie; Salarian, Arash; Holmstrom, Lars; McNames, James; Horak, Fay B
2014-01-01
This paper is a commentary to introduce how rehabilitation professionals can use a new, body-worn sensor system to obtain objective measures of balance and gait. Current assessments of balance and gait in clinical rehabilitation are largely limited to subjective scales, simple stop-watch measures, or complex, expensive machines that are neither practical nor widely available. Although accelerometers and gyroscopes have been shown to accurately quantify many aspects of gait and balance kinematics, only recently has a comprehensive, portable system become available to clinicians. By measuring body motion during tests that clinicians are already performing, such as the Timed Up and Go test (TUG) and the Clinical Test of Sensory Integration for Balance (CTSIB), the additional time for assessment is minimal. By providing instant analysis of balance and gait and comparing a patient's performance to age-matched control values, therapists receive an objective, sensitive screening profile of balance and gait strategies. This motion screening profile can be used to identify mild abnormalities not obvious with traditional clinical testing, measure small changes due to rehabilitation, and design customized rehabilitation programs for each individual's specific balance and gait deficits. PMID:24955286
Searching Fragment Spaces with feature trees.
Lessel, Uta; Wellenzohn, Bernd; Lilienthal, Markus; Claussen, Holger
2009-02-01
Virtual combinatorial chemistry easily produces billions of compounds, for which conventional virtual screening cannot be performed even with the fastest methods available. An efficient solution for such a scenario is the generation of Fragment Spaces, which encode huge numbers of virtual compounds by their fragments/reagents and rules of how to combine them. Similarity-based searches can be performed in such spaces without ever fully enumerating all virtual products. Here we describe the generation of a huge Fragment Space encoding about 5 × 10^11 compounds based on established in-house synthesis protocols for combinatorial libraries, i.e., we encode practically evaluated combinatorial chemistry protocols in a machine-readable form, rendering them accessible to in silico search methods. We show how searches in this Fragment Space can be integrated as a first step in an overall workflow. This reduces the extremely large number of virtual products by several orders of magnitude, so that the resulting list of molecules becomes manageable for subsequent, more elaborate and time-consuming analysis steps. Results of a case study are presented and discussed, which lead to some general conclusions for an efficient expansion of the chemical space to be screened in pharmaceutical companies.
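The central idea, searching fragments by similarity rather than enumerating all products, can be illustrated with a toy Tanimoto search. The feature-ID sets below are hypothetical stand-ins for the feature-tree descriptors the authors use.

```python
def tanimoto(a: set, b: set) -> float:
    """Tanimoto similarity of two feature-ID sets."""
    return len(a & b) / len(a | b)

query = {1, 4, 7, 9}                       # features of the query molecule
fragments = {"frag_A": {1, 4, 7},
             "frag_B": {2, 5},
             "frag_C": {4, 7, 9, 12}}

# Rank fragments by similarity; only promising fragments are later combined
# into full products, so the product space is never fully enumerated.
ranked = sorted(fragments,
                key=lambda f: tanimoto(query, fragments[f]), reverse=True)
print(ranked)
```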
NASA Astrophysics Data System (ADS)
Pitts, James Daniel
Rotary ultrasonic machining (RUM), a hybrid process combining ultrasonic machining and diamond grinding, was created to increase material removal rates in the fabrication of hard and brittle workpieces. The objective of this research was to experimentally derive empirical equations for predicting multiple machined-surface roughness parameters of helically pocketed, rotary ultrasonic machined Zerodur glass-ceramic workpieces by means of a systematic statistical experimental approach. A Taguchi parametric screening design of experiments was employed to systematically determine the RUM process parameters with the largest effect on mean surface roughness. Next, empirical equations for seven common surface-quality metrics were developed via Box-Behnken response surface experimental trials. Validation trials were conducted, with predicted and experimental surface roughness values in varying levels of agreement. The reductions in cutting force and tool wear associated with RUM, reported by previous researchers, were experimentally verified to extend also to helical pocketing of Zerodur glass-ceramic.
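A Box-Behnken analysis of this kind ends in a least-squares fit of a quadratic response-surface model. A minimal sketch, with hypothetical factor names and Ra values (a real three-factor Box-Behnken design has 15 runs):

```python
import numpy as np

# Hypothetical (spindle speed rpm, feed mm/min, vibration amplitude um)
# settings and measured Ra values; all numbers are illustrative only.
X = np.array([[3000, 100, 10], [4000, 100, 15], [3000, 150, 15],
              [4000, 150, 10], [3500, 125, 12], [3500, 100, 15]], float)
y = np.array([0.42, 0.35, 0.51, 0.44, 0.40, 0.38])

# Quadratic response-surface model: intercept, linear and squared terms.
A = np.column_stack([np.ones(len(X)), X, X**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict Ra for a new parameter combination with the fitted equation.
x_new = np.array([3600, 120, 12], float)
print(coef @ np.concatenate([[1.0], x_new, x_new**2]))
```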
Crowdsourced validation of a machine-learning classification system for autism and ADHD
Duda, M; Haber, N; Daniels, J; Wall, D P
2017-01-01
Autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD) together affect >10% of the children in the United States, but considerable behavioral overlaps between the two disorders can often complicate differential diagnosis. Currently, there is no screening test designed to differentiate between the two disorders, and with waiting times from initial suspicion to diagnosis upwards of a year, methods to quickly and accurately assess risk for these and other developmental disorders are desperately needed. In a previous study, we found that four machine-learning algorithms were able to accurately (area under the curve (AUC)>0.96) distinguish ASD from ADHD using only a small subset of items from the Social Responsiveness Scale (SRS). Here, we expand upon our prior work by including a novel crowdsourced data set of responses to our predefined top 15 SRS-derived questions from parents of children with ASD (n=248) or ADHD (n=174) to improve our model’s capability to generalize to new, ‘real-world’ data. By mixing these novel survey data with our initial archival sample (n=3417) and performing repeated cross-validation with subsampling, we created a classification algorithm that performs with AUC=0.89±0.01 using only 15 questions. PMID:28509905
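A minimal sketch of repeated cross-validation with subsampling on a 15-item feature set; the data, classifier choice, and sample sizes below are synthetic placeholders, not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 15))    # 15 SRS-derived items (synthetic)
y = rng.integers(0, 2, size=600)  # 0 = ADHD, 1 = ASD (synthetic labels)

aucs = []
for rep in range(10):                       # repeated CV with subsampling
    idx = rng.choice(len(X), size=400, replace=False)
    scores = cross_val_score(LogisticRegression(max_iter=1000),
                             X[idx], y[idx], cv=5, scoring="roc_auc")
    aucs.append(scores.mean())
print(f"AUC = {np.mean(aucs):.2f} +/- {np.std(aucs):.2f}")
```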
Nanocomposites for Machining Tools
Loginov, Pavel; Mishnaevsky, Leon; Levashov, Evgeny
2017-01-01
Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials. A promising way to improve the performance characteristics of these materials is to design new nanocomposites based on them. The application of micromechanical modeling during the elaboration of composite materials for machining tools can reduce the financial and time costs for development of new tools, with enhanced performance. This article reviews the main groups of nanocomposites for machining tools and their performance. PMID:29027926
Real-data comparison of data mining methods in prediction of diabetes in iran.
Tapak, Lily; Mahjub, Hossein; Hamidi, Omid; Poorolajal, Jalal
2013-09-01
Diabetes is one of the most common non-communicable diseases in developing countries. Early screening and diagnosis play an important role in effective prevention strategies. This study compared two traditional classification methods (logistic regression and Fisher linear discriminant analysis) and four machine-learning classifiers (neural networks, support vector machines, fuzzy c-means, and random forests) to classify persons with and without diabetes. The data set used in this study included 6,500 subjects from the Iranian national non-communicable diseases risk factors surveillance, obtained through a cross-sectional survey. The sample was based on cluster sampling of the Iranian population, conducted in 2005-2009 to assess the prevalence of major non-communicable disease risk factors. Ten risk factors that are commonly associated with diabetes were selected to compare the performance of the six classifiers in terms of sensitivity, specificity, total accuracy, and area under the receiver operating characteristic (ROC) curve. Support vector machines showed the highest total accuracy (0.986) as well as area under the ROC curve (0.979). This method also showed high specificity (1.000) and sensitivity (0.820). All other methods produced total accuracy of more than 85%, but for all of them the sensitivity values were very low (less than 0.350). The results of this study indicate that, in terms of sensitivity, specificity, and overall classification accuracy, the support vector machine model ranks first among all the classifiers tested in the prediction of diabetes. Therefore, this approach is a promising classifier for predicting diabetes, and it should be further investigated for the prediction of other diseases.
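For reference, the comparison metrics used above follow directly from a confusion matrix and predicted probabilities. The toy labels and scores below are illustrative only.

```python
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0]                    # toy ground truth
y_prob = [0.9, 0.2, 0.7, 0.4, 0.1, 0.35, 0.8, 0.3]   # toy model scores
y_pred = [int(p >= 0.5) for p in y_prob]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("total accuracy:", (tp + tn) / len(y_true))
print("AUC:", roc_auc_score(y_true, y_prob))
```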
Bozkurt, Selen; Bostanci, Asli; Turhan, Murat
2017-08-11
The goal of this study is to evaluate the results of machine learning methods for the classification of OSA severity of patients with suspected sleep-disordered breathing as normal, mild, moderate or severe, based on non-polysomnographic variables: 1) clinical data, 2) symptoms and 3) physical examination. To produce classification models for OSA severity, five different machine learning methods (Bayesian network, Decision Tree, Random Forest, Neural Networks and Logistic Regression) were trained, while relevant variables and their relationships were derived empirically from observed data. Each model was trained and evaluated using 10-fold cross-validation, and classification performance was assessed using the true positive rate (TPR), false positive rate (FPR), positive predictive value (PPV), F measure and area under the receiver operating characteristic curve (ROC-AUC). Results of the 10-fold cross-validated tests with different variable settings promisingly indicated that the OSA severity of suspected OSA patients can be classified using non-polysomnographic features, with a highest true positive rate of 0.71 and a lowest false positive rate of 0.15. Moreover, the test results for different variable settings revealed that the accuracy of the classification models improved significantly when physical examination variables were added to the model. The study results showed that machine learning methods can be used to estimate the probabilities of no, mild, moderate, and severe obstructive sleep apnea; such approaches may improve accurate initial OSA screening and help refer only the suspected moderate or severe OSA patients to sleep laboratories for the expensive tests.
Machine learning algorithms for mode-of-action classification in toxicity assessment.
Zhang, Yile; Wong, Yau Shu; Deng, Jian; Anton, Cristina; Gabos, Stephan; Zhang, Weiping; Huang, Dorothy Yu; Jin, Can
2016-01-01
Real Time Cell Analysis (RTCA) technology is used to monitor cellular changes continuously over the entire exposure period. Combined with different testing concentrations, the profiles have potential for probing the mode of action (MOA) of the tested substances. In this paper, we present machine learning approaches for MOA assessment. Computational tools based on artificial neural networks (ANNs) and support vector machines (SVMs) are developed to analyze the time-concentration response curves (TCRCs) of human cell lines responding to tested chemicals. The techniques are capable of learning from given TCRCs with known MOA information and then making MOA classifications for unknown toxicities. A novel data processing step based on the wavelet transform is introduced to extract important features from the original TCRC data. From the dose response curves, a time interval leading to a higher classification success rate can be selected as input to enhance the performance of the machine learning algorithm. This is particularly helpful when handling cases with limited and imbalanced data. The validation of the proposed method is demonstrated by the supervised learning algorithm applied to the exposure data of the HepG2 cell line to 63 chemicals with 11 concentrations in each test case. Classification success rates in the range of 85 to 95% are obtained using SVM for MOA classification in cases with two to four clusters. The wavelet transform is capable of capturing important features of TCRCs for MOA classification. The proposed SVM scheme incorporating the wavelet transform has great potential for large-scale MOA classification and high-throughput chemical screening.
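A minimal sketch of the wavelet feature-extraction step feeding an SVM, using PyWavelets on synthetic curves; the wavelet family, decomposition level, and all data below are assumptions.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def tcrc_features(curve: np.ndarray) -> np.ndarray:
    """Summarize a response curve by the energy of its wavelet
    coefficients at each decomposition level."""
    coeffs = pywt.wavedec(curve, "db4", level=3)
    return np.array([float(np.sum(c ** 2)) for c in coeffs])

rng = np.random.default_rng(1)
curves = rng.normal(size=(60, 128))       # synthetic stand-ins for TCRCs
labels = rng.integers(0, 2, size=60)      # two hypothetical MOA clusters

X = np.array([tcrc_features(c) for c in curves])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.score(X, labels))               # training accuracy of the sketch
```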
Korotcov, Alexandru; Tkachenko, Valery; Russo, Daniel P; Ekins, Sean
2017-12-04
Machine learning methods have been applied to many data sets in pharmaceutical research for several decades. The relative ease and availability of fingerprint-type molecular descriptors paired with Bayesian methods resulted in the widespread use of this approach for a diverse array of end points relevant to drug discovery. Deep learning is the latest machine learning algorithm attracting attention for many pharmaceutical applications, from docking to virtual screening. Deep learning is based on an artificial neural network with multiple hidden layers and has found considerable traction for many artificial intelligence applications. We have previously suggested the need for a comparison of different machine learning methods with deep learning across an array of varying data sets that is applicable to pharmaceutical research. End points relevant to pharmaceutical research include absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) properties, as well as activity against pathogens and drug discovery data sets. In this study, we have used data sets for solubility, probe-likeness, hERG, KCNQ1, bubonic plague, Chagas, tuberculosis, and malaria to compare different machine learning methods using FCFP6 fingerprints. These data sets represent whole cell screens, individual proteins, physicochemical properties as well as a data set with a complex end point. Our aim was to assess whether deep learning offered any improvement in testing when assessed using an array of metrics including AUC, F1 score, Cohen's kappa, Matthews correlation coefficient and others. Based on ranked normalized scores for the metrics and data sets, Deep Neural Networks (DNNs) ranked higher than SVMs, which in turn ranked higher than all the other machine learning methods. Visualizing these properties for training and test sets using radar-type plots indicates when models are inferior or perhaps overtrained. These results also suggest the need to assess deep learning further using multiple metrics, much larger-scale comparisons, prospective testing, and assessment of different fingerprints and DNN architectures beyond those used here.
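The ranked-normalized-score comparison can be reproduced in a few lines: each metric is ranked across models and the normalized ranks are averaged. The numbers below are illustrative, not the study's results.

```python
import numpy as np

models = ["DNN", "SVM", "RF", "NB"]
scores = np.array([          # rows: models; cols: AUC, F1, kappa, MCC
    [0.90, 0.82, 0.65, 0.66],
    [0.88, 0.80, 0.62, 0.63],
    [0.85, 0.78, 0.58, 0.60],
    [0.80, 0.70, 0.50, 0.52]])

ranks = scores.argsort(axis=0).argsort(axis=0)  # per-metric ranks, 0 = worst
norm = ranks / (len(models) - 1)                # normalize ranks to [0, 1]
for name, s in sorted(zip(models, norm.mean(axis=1)), key=lambda t: -t[1]):
    print(name, round(float(s), 2))
```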
Li, Guo-Bo; Yang, Ling-Ling; Feng, Shan; Zhou, Jian-Ping; Huang, Qi; Xie, Huan-Zhang; Li, Lin-Li; Yang, Sheng-Yong
2011-03-15
Development of glutamate non-competitive antagonists of mGluR1 (metabotropic glutamate receptor subtype 1) has attracted increasing attention in recent years due to their potential therapeutic application in various nervous-system disorders. Since no crystal structure has been reported for mGluR1, ligand-based virtual screening (VS) methods, typically pharmacophore-based VS (PB-VS), are often used for the discovery of mGluR1 antagonists. Nevertheless, PB-VS usually suffers from a lower hit rate and enrichment factor. In this investigation, we established a multistep ligand-based VS approach that is based on a support vector machine (SVM) classification model and a pharmacophore model. Performance evaluation against a large independent test set, M-MDDR, shows that the multistep VS approach significantly increases the hit rate and enrichment factor compared with the individual SVM-based (SB-VS) and pharmacophore-based (PB-VS) methods. The multistep VS approach was then used to screen several large chemical libraries, including PubChem, Specs, and Enamine. Finally, a total of 20 compounds were selected from the top-ranking compounds and advanced to subsequent in vitro and in vivo studies, the results of which will be reported in the near future. Copyright © 2011 Elsevier Ltd. All rights reserved.
Unmanned Tactical Autonomous Control and Collaboration (UTACC) Human-Machine Integration Measures of Performance and Measures of Effectiveness
2017-06-01
The Unmanned Tactical Autonomous Control and Collaboration (UTACC) program seeks to integrate Marines and autonomous machines to address the challenges encountered in…
Xing, Jing; Lu, Wenchao; Liu, Rongfeng; Wang, Yulan; Xie, Yiqian; Zhang, Hao; Shi, Zhe; Jiang, Hao; Liu, Yu-Chih; Chen, Kaixian; Jiang, Hualiang; Luo, Cheng; Zheng, Mingyue
2017-07-24
Bromodomain-containing protein 4 (BRD4) is implicated in the pathogenesis of a number of different cancers, inflammatory diseases and heart failure. Much effort has been dedicated to discovering novel-scaffold BRD4 inhibitors (BRD4is) with different selectivity profiles and potential antiresistance properties. Structure-based drug design (SBDD) and virtual screening (VS) are the most frequently used approaches. Here, we demonstrate a novel, structure-based VS approach that uses machine-learning algorithms trained on prior structure and activity knowledge to predict the likelihood that a compound is a BRD4i based on its binding pattern with BRD4. In addition to positive experimental data, such as X-ray structures of BRD4-ligand complexes and BRD4 inhibitory potencies, negative data such as false positives (FPs) identified from our earlier ligand screening results were incorporated into our knowledge base. We used the resulting data to train a machine-learning model named BRD4LGR to predict the BRD4i-likeness of a compound. BRD4LGR achieved a 20-30% higher AUC-ROC than that of Glide using the same test set. When conducting in vitro experiments against a library of previously untested, commercially available organic compounds, a second round of VS using BRD4LGR generated 15 new BRD4is. Moreover, inverting the machine-learning model provided easy access to structure-activity relationship (SAR) interpretation for hit-to-lead optimization.
NASA Technical Reports Server (NTRS)
1981-01-01
The impact of modern technology on the role, responsibility, authority, and performance of human operators in modern aircraft and ATC systems was examined in terms of principles defined by Paul Fitts. Research into human factors in aircraft operations and the use of human factors engineering for aircraft safety improvements were discussed, and features of the man-machine interface in computerized cockpit warning systems are examined. The design and operational features of computerized avionics displays and HUDs are described, along with results of investigations into pilot decision-making behavior, aircrew procedural compliance, and aircrew judgment training programs. Experiments in vision and visual perception are detailed, as are behavioral studies of crew workload, coordination, and complement. The effectiveness of pilot selection, screening, and training techniques are assessed, as are methods for evaluating pilot performance.
NASA Astrophysics Data System (ADS)
Sharif, Safian; Sadiq, Ibrahim Ogu; Suhaimi, Mohd Azlan; Rahim, Shayfull Zamree Abd
2017-09-01
Pollution-related activities, in addition to the handling cost of conventional cutting fluid application in the metal cutting industry, have generated much concern over time. The desire for a green machining environment that preserves the environment through reduction or elimination of machining-related pollution, reduces oil consumption, and protects machine operators, without compromising an efficient machining process, has led to a search for alternatives to conventional cutting fluids. Among the alternatives of dry machining, cryogenic cooling, high-pressure cooling, and near-dry or minimum quantity lubrication (MQL), MQL has shown remarkable performance in terms of cost, machining output, and the safety of the environment and machine operators. However, MQL under aggressive or very high speed machining poses certain restrictions, as the lubrication media cannot perform efficiently at elevated temperature. To compensate for the shortcomings of the MQL technique, nanoparticles of high thermal conductivity are introduced into cutting fluids for use in the MQL lubrication process. These nanofluids have shown enhanced machining performance and a significant reduction of loads on the environment. The present work evaluates the application and performance of nanofluids in metal cutting through the MQL lubrication technique, highlighting their impacts and prospects as a lubrication strategy for sustainable green manufacturing. Enhanced performance of vegetable-oil-based nanofluids over mineral-oil-based nanofluids has been reported and is thus highlighted.
Using multiscale texture and density features for near-term breast cancer risk analysis
Sun, Wenqing; Tseng, Tzu-Liang (Bill); Qian, Wei; Zhang, Jianying; Saltzstein, Edward C.; Zheng, Bin; Lure, Fleming; Yu, Hui; Zhou, Shi
2015-01-01
Purpose: To help improve efficacy of screening mammography by eventually establishing a new optimal personalized screening paradigm, the authors investigated the potential of using the quantitative multiscale texture and density feature analysis of digital mammograms to predict near-term breast cancer risk. Methods: The authors’ dataset includes digital mammograms acquired from 340 women. Among them, 141 were positive and 199 were negative/benign cases. The negative digital mammograms acquired from the “prior” screening examinations were used in the study. Based on the intensity value distributions, five subregions at different scales were extracted from each mammogram. Five groups of features, including density and texture features, were developed and calculated on every one of the subregions. Sequential forward floating selection was used to search for the effective combinations. Using the selected features, a support vector machine (SVM) was optimized using a tenfold validation method to predict the risk of each woman having image-detectable cancer in the next sequential mammography screening. The area under the receiver operating characteristic curve (AUC) was used as the performance assessment index. Results: From a total number of 765 features computed from multiscale subregions, an optimal feature set of 12 features was selected. Applying this feature set, a SVM classifier yielded performance of AUC = 0.729 ± 0.021. The positive predictive value was 0.657 (92 of 140) and the negative predictive value was 0.755 (151 of 200). Conclusions: The study results demonstrated a moderately high positive association between risk prediction scores generated by the quantitative multiscale mammographic image feature analysis and the actual risk of a woman having an image-detectable breast cancer in the next subsequent examinations. PMID:26127038
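A minimal sketch of sequential forward floating selection wrapped around an SVM, mirroring the pipeline above; it uses mlxtend's implementation on synthetic data, and all sizes and settings below are assumptions.

```python
import numpy as np
from mlxtend.feature_selection import SequentialFeatureSelector as SFS
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 30))    # 30 candidate texture/density features
y = rng.integers(0, 2, size=120)  # synthetic positive/negative labels

# Floating forward selection down to a 12-feature subset, scored by AUC.
sfs = SFS(SVC(kernel="rbf"), k_features=12, forward=True, floating=True,
          scoring="roc_auc", cv=5)
sfs = sfs.fit(X, y)
print(sfs.k_feature_idx_)         # indices of the selected features
```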
Pizarro, Ricardo A; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A; Goldman, Aaron L; Xiao, Ena; Luo, Qian; Berman, Karen F; Callicott, Joseph H; Weinberger, Daniel R; Mattay, Venkata S
2016-01-01
High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI yielding irreproducible results, from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed, by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.
Amin, Morteza Moradi; Kermani, Saeed; Talebi, Ardeshir; Oghli, Mostafa Ghelich
2015-01-01
Acute lymphoblastic leukemia is the most common form of pediatric cancer; it is categorized into three subtypes (L1, L2, and L3) and can be detected through screening of blood and bone marrow smears by pathologists. Because the procedure is time-consuming and tedious, a computer-based system is required for convenient detection of acute lymphoblastic leukemia. Microscopic images were acquired from blood and bone marrow smears of patients with acute lymphoblastic leukemia and of normal cases. After image preprocessing, cell nuclei are segmented by the k-means algorithm. Geometric and statistical features are then extracted from the nuclei, and finally the cells are classified as cancerous or noncancerous by a support vector machine classifier with 10-fold cross-validation. The cells are also classified into their subtypes by a multi-class support vector machine classifier. The classifiers are evaluated by sensitivity, specificity, and accuracy, which are 98%, 95%, and 97%, respectively, for the cancerous/noncancerous classification; for subtype classification, the mean values are 84.3%, 97.3%, and 95.6%, respectively. The results show that the proposed algorithm achieves acceptable performance for the diagnosis of acute lymphoblastic leukemia and its subtypes and can be used as an assistant diagnostic tool for pathologists.
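A minimal sketch of the k-means segmentation step on a synthetic image, keeping the darkest color cluster as candidate nuclei; the number of clusters and the darkness heuristic are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.random((64, 64, 3))           # placeholder RGB smear image
pixels = image.reshape(-1, 3)

km = KMeans(n_clusters=3, n_init=10).fit(pixels)
darkest = km.cluster_centers_.sum(axis=1).argmin()  # lowest total intensity
mask = (km.labels_ == darkest).reshape(64, 64)      # candidate nuclei pixels
print(mask.mean())                         # fraction of pixels flagged
```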
25 CFR 542.13 - What are the minimum internal control standards for gaming machines?
Code of Federal Regulations, 2014 CFR
2014-04-01
.... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...
25 CFR 542.13 - What are the minimum internal control standards for gaming machines?
Code of Federal Regulations, 2012 CFR
2012-04-01
.... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...
25 CFR 542.13 - What are the minimum internal control standards for gaming machines?
Code of Federal Regulations, 2013 CFR
2013-04-01
.... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...
25 CFR 542.13 - What are the minimum internal control standards for gaming machines?
Code of Federal Regulations, 2010 CFR
2010-04-01
.... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...
25 CFR 542.13 - What are the minimum internal control standards for gaming machines?
Code of Federal Regulations, 2011 CFR
2011-04-01
.... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...
Hamon, Véronique; Bourgeas, Raphael; Ducrot, Pierre; Theret, Isabelle; Xuereb, Laura; Basse, Marie Jeanne; Brunel, Jean Michel; Combes, Sebastien; Morelli, Xavier; Roche, Philippe
2014-01-01
Over the last 10 years, protein–protein interactions (PPIs) have shown increasing potential as new therapeutic targets. As a consequence, PPIs are today the most screened target class in high-throughput screening (HTS). The development of broad chemical libraries dedicated to these particular targets is essential; however, the chemical space associated with this ‘high-hanging fruit’ is still under debate. Here, we analyse the properties of 40 non-redundant small molecules present in the 2P2I database (http://2p2idb.cnrs-mrs.fr/) to define a general profile of orthosteric inhibitors and propose an original protocol to filter general screening libraries using a support vector machine (SVM) with 11 standard Dragon molecular descriptors. The filtering protocol has been validated using external datasets from PubChem BioAssay and results from in-house screening campaigns. This external blind validation demonstrated the ability of the SVM model to reduce the size of the filtered chemical library by eliminating up to 96% of the compounds as well as enhancing the proportion of active compounds by up to a factor of 8. We believe that the resulting chemical space identified in this paper will provide the scientific community with a concrete support to search for PPI inhibitors during HTS campaigns. PMID:24196694
Benchmarking Ligand-Based Virtual High-Throughput Screening with the PubChem Database
Butkiewicz, Mariusz; Lowe, Edward W.; Mueller, Ralf; Mendenhall, Jeffrey L.; Teixeira, Pedro L.; Weaver, C. David; Meiler, Jens
2013-01-01
With the rapidly increasing availability of High-Throughput Screening (HTS) data in the public domain, such as the PubChem database, methods for ligand-based computer-aided drug discovery (LB-CADD) have the potential to accelerate and reduce the cost of probe development and drug discovery efforts in academia. We assemble nine data sets from realistic HTS campaigns representing major families of drug target proteins for benchmarking LB-CADD methods. Each data set is public domain through PubChem and carefully collated through confirmation screens validating active compounds. These data sets provide the foundation for benchmarking a new cheminformatics framework BCL::ChemInfo, which is freely available for non-commercial use. Quantitative structure activity relationship (QSAR) models are built using Artificial Neural Networks (ANNs), Support Vector Machines (SVMs), Decision Trees (DTs), and Kohonen networks (KNs). Problem-specific descriptor optimization protocols are assessed including Sequential Feature Forward Selection (SFFS) and various information content measures. Measures of predictive power and confidence are evaluated through cross-validation, and a consensus prediction scheme is tested that combines orthogonal machine learning algorithms into a single predictor. Enrichments ranging from 15 to 101 for a TPR cutoff of 25% are observed. PMID:23299552
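Enrichment figures like those quoted above are typically computed by ranking the library by model score and comparing the hit rate in the best-scored fraction to the baseline rate; the paper's exact definition may differ, and the data below are synthetic.

```python
import numpy as np

def enrichment(scores, labels, frac=0.25):
    """Hit-rate enrichment in the best-scored fraction of the library."""
    order = np.argsort(scores)[::-1]           # screen best-scored first
    top = labels[order][: int(len(labels) * frac)]
    return top.mean() / labels.mean()

rng = np.random.default_rng(0)
labels = (rng.random(10_000) < 0.01).astype(int)  # ~1% actives
scores = rng.random(10_000) + 0.5 * labels        # model favors actives
print(round(enrichment(scores, labels), 1))
```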
NASA Astrophysics Data System (ADS)
Li, S. X.; Zhang, Y. J.; Zeng, Q. Y.; Li, L. F.; Guo, Z. Y.; Liu, Z. M.; Xiong, H. L.; Liu, S. H.
2014-06-01
Cancer is the most common disease threatening human health. The ability to screen individuals with malignant tumours using only a blood sample would be greatly advantageous for early diagnosis and intervention. This study explores the possibility of discriminating between cancer patients and normal subjects with serum surface-enhanced Raman spectroscopy (SERS) and a support vector machine (SVM), using a peripheral blood sample. A total of 130 blood samples were obtained from patients with liver cancer, colonic cancer, esophageal cancer, nasopharyngeal cancer, and gastric cancer, along with 113 blood samples from normal volunteers. Several diagnostic models were built from the serum SERS spectra using SVM and principal component analysis (PCA) techniques. The results show that a diagnostic accuracy of 85.5% is achieved with a PCA algorithm, while a diagnostic accuracy of 95.8% is obtained using radial basis function (RBF) PCA-SVM methods. The results prove that the RBF-kernel PCA-SVM technique is superior to PCA and conventional SVM (C-SVM) algorithms in classifying serum SERS spectra. The study demonstrates that serum SERS, in combination with SVM techniques, has great potential for screening patients with any solid malignant tumour through a peripheral blood sample.
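A minimal sketch of the PCA-SVM (RBF) pipeline on synthetic stand-ins for serum SERS spectra; the component count and spectrum length are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
spectra = rng.normal(size=(243, 1024))    # synthetic stand-ins for spectra
labels = np.array([1] * 130 + [0] * 113)  # 130 cancer, 113 normal samples

model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
print(cross_val_score(model, spectra, labels, cv=5).mean())
```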
Code of Federal Regulations, 2011 CFR
2011-07-01
... performance test of one representative magnet wire coating machine for each group of identical or very similar... you complete the performance test of a representative magnet wire coating machine. The requirements in... operations, you may, with approval, conduct a performance test of a single magnet wire coating machine that...
Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott
2011-07-28
Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements.
DARPA Robotics Challenge (DRC): Using Human-Machine Teamwork to Perform Disaster Response with a Humanoid Robot
2017-02-01
Work performed by the Florida Institute for Human and Machine Cognition (IHMC) from 2012-2016 through three phases of the Defense Advanced Research Projects Agency (DARPA) Robotics Challenge…
NASA Astrophysics Data System (ADS)
Dan, Posa Ioan; Florin, Georgescu Remus; Virgil, Ciobanu; Antonescu, Elisabeta
2011-09-01
The study took place in a pediatrics clinic that performs a wide variety of emergency, ambulatory, and inpatient examinations. The radiology department follows work procedures and a quality assurance system for X-ray examinations. The results show that the tube voltage programmed on the digital-detector machine remains constant, whereas on the screen-film machine the applied tube voltage increases in proportion to the child's physical development, as reflected by trunk thickness.
Method and apparatus for characterizing and enhancing the dynamic performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F
2013-12-17
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include dynamic one axis positional accuracy of the machine tool, dynamic cross-axis stability of the machine tool, and dynamic multi-axis positional accuracy of the machine tool.
Physics-informed machine learning for inorganic scintillator discovery
NASA Astrophysics Data System (ADS)
Pilania, G.; McClellan, K. J.; Stanek, C. R.; Uberuaga, B. P.
2018-06-01
Applications of inorganic scintillators—activated with lanthanide dopants, such as Ce and Eu—are found in diverse fields. As a strict requirement to exhibit scintillation, the 4f ground state (with the electronic configuration of [Xe]4fn 5d0) and 5d1 lowest excited state (with the electronic configuration of [Xe]4fn-1 5d1) levels induced by the activator must lie within the host bandgap. Here we introduce a new machine learning (ML) based search strategy for high-throughput chemical space explorations to discover and design novel inorganic scintillators. Building upon well-known physics-based chemical trends for the host dependent electron binding energies within the 4f and 5d1 energy levels of lanthanide ions and available experimental data, the developed ML model—coupled with knowledge of the vacuum referred valence and conduction band edges computed from first principles—can rapidly and reliably estimate the relative positions of the activator's energy levels relative to the valence and conduction band edges of any given host chemistry. Using perovskite oxides and elpasolite halides as examples, the presented approach has been demonstrated to be able to (i) capture systematic chemical trends across host chemistries and (ii) effectively screen promising compounds in a high-throughput manner. While a number of other application-specific performance requirements need to be considered for a viable scintillator, the scheme developed here can be a practically useful tool to systematically down-select the most promising candidate materials in a first line of screening for a subsequent in-depth investigation.
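The screening rule itself reduces to a band-edge containment test: a candidate host passes if the predicted activator 4f and 5d1 levels both lie inside the host bandgap. A minimal sketch, with placeholder energies rather than actual model outputs:

```python
candidates = {
    # host: (E_4f, E_5d1, E_VBM, E_CBM), all in eV on a common vacuum scale
    "host_A": (-6.1, -3.4, -7.0, -2.8),
    "host_B": (-6.5, -2.9, -6.2, -3.1),
}

def passes_screen(e4f, e5d, vbm, cbm):
    """Both activator levels must fall strictly inside the bandgap."""
    return vbm < e4f < cbm and vbm < e5d < cbm

for host, levels in candidates.items():
    print(host, "pass" if passes_screen(*levels) else "fail")
```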
Foulquier, Nathan; Redou, Pascal; Le Gal, Christophe; Rouvière, Bénédicte; Pers, Jacques-Olivier; Saraux, Alain
2018-05-17
Big data analysis has become a common way to extract information from complex and large datasets in most scientific domains. This approach is now used to study large cohorts of patients in medicine. This work is a review of publications that have used artificial intelligence and advanced machine learning techniques to study pathogenesis-based treatments in primary Sjögren's syndrome (pSS). A systematic literature review retrieved all articles reporting on the use of advanced statistical analysis applied to the study of systemic autoimmune diseases (SADs) over the last decade. An automatic bibliography screening method was developed to perform this task. The program, called BIBOT, was designed to fetch and analyze articles from the PubMed database using a list of keywords and natural language processing approaches. The evolution of trends in statistical approaches, cohort sizes and number of publications over this period was also computed in the process. In all, 44077 abstracts were screened and 1017 publications were analyzed. The mean number of selected articles was 101.0 (S.D. 19.16) per year, but it increased significantly over time (from 74 articles in 2008 to 138 in 2017). Among them only 12 focused on pSS, and none of them emphasized pathogenesis-based treatments. To conclude, medicine is progressively entering the era of big data analysis and artificial intelligence, but these approaches are not yet used to describe pSS-specific pathogenesis-based treatment. Nevertheless, large multicentre studies are investigating this aspect with advanced algorithmic tools on large cohorts of SAD patients.
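An automated PubMed screening step in the spirit of BIBOT can be sketched with Biopython's Entrez utilities; BIBOT itself is not public, and the query string and settings below are assumptions.

```python
from Bio import Entrez

Entrez.email = "you@example.org"  # NCBI requires a contact address

# Search for articles matching a keyword list (query is illustrative).
handle = Entrez.esearch(db="pubmed",
                        term="Sjogren syndrome AND machine learning",
                        retmax=20)
ids = Entrez.read(handle)["IdList"]

# Fetch the matching abstracts as plain text for downstream NLP screening.
handle = Entrez.efetch(db="pubmed", id=",".join(ids),
                       rettype="abstract", retmode="text")
print(handle.read()[:500])
```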
Investigations on high speed machining of EN-353 steel alloy under different machining environments
NASA Astrophysics Data System (ADS)
Venkata Vishnu, A.; Jamaleswara Kumar, P.
2018-03-01
The addition of nanoparticles to conventional cutting fluids enhances their cooling capability; in the present paper an attempt is made by adding nano-sized particles to conventional cutting fluids. Taguchi robust design methodology is employed to study the performance characteristics of different turning parameters, i.e., cutting speed, feed rate, depth of cut and type of tool, under different machining environments: dry machining, machining with lubricant SAE 40, and machining with a mixture of nano-sized boric acid particles and the base fluid SAE 40. A series of turning operations was performed using an L27 (3^13) orthogonal array, considering high cutting speeds and the other machining parameters, to measure hardness. The results are compared among the different machining environments, and it is concluded that there is considerable improvement in machining performance using lubricant SAE 40 and the SAE 40 + boric acid mixture compared with dry machining. The ANOVA suggests that the selected parameters and their interactions are significant, and that cutting speed has the most significant effect on hardness.
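Taguchi analysis of a larger-is-better response such as hardness uses the signal-to-noise ratio S/N = -10 log10(mean(1/y^2)). A minimal sketch with hypothetical hardness replicates per environment:

```python
import numpy as np

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a larger-is-better response, in dB."""
    values = np.asarray(values, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / values ** 2))

# Hypothetical hardness replicates per machining environment.
runs = {"dry": [52, 54, 53],
        "SAE 40": [57, 58, 56],
        "SAE 40 + boric acid": [60, 61, 59]}
for name, reps in runs.items():
    print(f"{name}: {sn_larger_is_better(reps):.2f} dB")
```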
Chimenti, Michael S; Bulfer, Stacie L; Neitz, R Jeffrey; Renslo, Adam R; Jacobson, Matthew P; James, Thomas L; Arkin, Michelle R; Kelly, Mark J S
2015-07-01
The ubiquitous AAA+ ATPase p97 functions as a dynamic molecular machine driving several cellular processes. It is essential in regulating protein homeostasis, and it represents a potential drug target for cancer, particularly when there is a greater reliance on the endoplasmic reticulum-associated protein degradation pathway and ubiquitin-proteasome pathway to degrade an overabundance of secreted proteins. Here, we report a case study for using fragment-based ligand design approaches against this large and dynamic hexamer, which has multiple potential binding sites for small molecules. A screen of a fragment library was conducted by surface plasmon resonance (SPR) and followed up by nuclear magnetic resonance (NMR), two complementary biophysical techniques. Virtual screening was also carried out to examine possible binding sites for the experimental hits and evaluate the potential utility of fragment docking for this target. Out of this effort, 13 fragments were discovered that showed reversible binding with affinities between 140 µM and 1 mM, binding stoichiometries of 1:1 or 2:1, and good ligand efficiencies. Structural data for fragment-protein interactions were obtained with residue-specific [U-²H] ¹³CH₃-methyl-labeling NMR strategies, and these data were compared to poses from docking. The combination of virtual screening, SPR, and NMR enabled us to find and validate a number of interesting fragment hits and allowed us to gain an understanding of the structural nature of fragment binding. © 2015 Society for Laboratory Automation and Screening.
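The ligand efficiencies mentioned above follow from LE = -ΔG/N_heavy with ΔG = RT ln(Kd). A worked example using the weakest reported affinity and a hypothetical heavy-atom count:

```python
import math

R = 0.001987   # gas constant, kcal/(mol K)
T = 298.0      # temperature, K

def ligand_efficiency(kd_molar: float, n_heavy: int) -> float:
    """LE = -dG / N_heavy, with dG = RT ln(Kd) in kcal/mol."""
    dG = R * T * math.log(kd_molar)
    return -dG / n_heavy

# A 140 uM fragment with a hypothetical 13 heavy atoms gives LE ~ 0.40
# kcal/mol per heavy atom, "good" by the usual ~0.3 rule of thumb.
print(round(ligand_efficiency(140e-6, 13), 2))
```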
Advanced human machine interaction for an image interpretation workstation
NASA Astrophysics Data System (ADS)
Maier, S.; Martin, M.; van de Camp, F.; Peinsipp-Byma, E.; Beyerer, J.
2016-05-01
In recent years, many new interaction technologies have been developed that enhance the usability of computer systems and allow for novel types of interaction. The areas of application for these technologies have mostly been in gaming and entertainment. However, in professional environments, there are especially demanding tasks that would greatly benefit from improved human machine interfaces as well as an overall improved user experience. We, therefore, envisioned and built an image-interpretation-workstation of the future, a multi-monitor workplace comprised of four screens. Each screen is dedicated to a complex software product such as a geo-information system to provide geographic context, an image annotation tool, software to generate standardized reports and a tool to aid in the identification of objects. Using self-developed systems for hand tracking, pointing gestures and head pose estimation in addition to touchscreens, face identification, and speech recognition systems we created a novel approach to this complex task. For example, head pose information is used to save the position of the mouse cursor on the currently focused screen and to restore it as soon as the same screen is focused again, while hand gestures allow for intuitive manipulation of 3D objects in mid-air. While the primary focus is on the task of image interpretation, all of the technologies involved provide generic ways of efficiently interacting with a multi-screen setup and could be utilized in other fields as well. In preliminary experiments, we received promising feedback from users in the military and started to tailor the functionality to their needs.
[Results of audiometry screening in adolescent workers].
Hartmann, B
1990-11-01
Results of screening audiometry of male youths aged 16 to 25 (n = 3969) in occupations from metallurgy, the machine-building industry and transport are presented. The proportion of persons with hearing loss of 5 to 10 percent increases from 2.8% of pupils before starting vocational training to 4.5-7.1% of apprentices and 9.7% of skilled workers. The incidence among persons with and without a history of middle ear inflammation differs only at stages of about 20 percent hearing loss. This demonstrates the sensitivity of screening audiometry, although errors remain possible. Adolescents may already show measurable hearing loss related to occupational and non-occupational exposures as well as individual disposition.
Ghaly, J; Smith, A L
1994-06-01
A new era has arrived for the Biomedical Engineering Department at the Royal Women's Hospital in Melbourne. We have developed a system to qualitatively test for intermittent or unconfirmed faults associated with Bear Cub ventilators. Where previous testing has been inadequate, computer logging is now used to interface with the RT200 Timeter Calibration Analyser (TCA) to obtain a real-time display of data, which can be stored and graphed. Using Quick Basic version 4.5, it was possible to establish communication between the TCA and an IBM-compatible computer, such that meaningful displays of machine performance were produced. From the parameters measured it has been possible to obtain data on Peak Pressure, Inspiratory to Expiratory ratio (I:E ratio), Peak Flow and Rate. Monitoring is not limited to these parameters, though these were selected for our particular needs. These parameters are plotted in two ways: 1. compressed average versus time, up to 24 hours on one screen; 2. raw data, 36 minutes displayed on each screen. The compressed data gives an overview which allows easy identification of intermittent faults. The uncompressed data confirms that the averaged signal is a realistic representation of the situation. One of the major benefits of this type of data analysis is that ventilator performance may be monitored over a long period of time without requiring the presence of a service technician. It also allows individual ventilator performance to be graphically compared to other ventilators.
Li, Liwei; Wang, Bo; Meroueh, Samy O
2011-09-26
The community structure-activity resource (CSAR) data sets are used to develop and test a support vector machine-based scoring function in regression mode (SVR). Two scoring functions (SVR-KB and SVR-EP) are derived with the objective of reproducing the trend of the experimental binding affinities provided within the two CSAR data sets. The features used to train SVR-KB are knowledge-based pairwise potentials, while SVR-EP is based on physicochemical properties. SVR-KB and SVR-EP were compared to seven other widely used scoring functions, including Glide, X-score, GoldScore, ChemScore, Vina, Dock, and PMF. Results showed that SVR-KB trained with features obtained from three-dimensional complexes of the PDBbind data set outperformed all other scoring functions, including best performing X-score, by nearly 0.1 using three correlation coefficients, namely Pearson, Spearman, and Kendall. It was interesting that higher performance in rank ordering did not translate into greater enrichment in virtual screening assessed using the 40 targets of the Directory of Useful Decoys (DUD). To remedy this situation, a variant of SVR-KB (SVR-KBD) was developed by following a target-specific tailoring strategy that we had previously employed to derive SVM-SP. SVR-KBD showed a much higher enrichment, outperforming all other scoring functions tested, and was comparable in performance to our previously derived scoring function SVM-SP.
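For reference, the three correlation coefficients used above to score agreement between predicted and experimental affinities can be computed with SciPy; the affinity values below are synthetic.

```python
from scipy.stats import pearsonr, spearmanr, kendalltau

experimental = [6.2, 7.8, 5.1, 9.0, 6.9]   # e.g., pKd values (synthetic)
predicted    = [6.0, 7.5, 5.6, 8.4, 7.1]   # scoring-function outputs

print("Pearson: ", pearsonr(experimental, predicted)[0])
print("Spearman:", spearmanr(experimental, predicted)[0])
print("Kendall: ", kendalltau(experimental, predicted)[0])
```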
Cui, De-Mi; Wang, Xiao-Quan; Lu, Lie-Min
2017-01-01
Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed by experienced experts only. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can complete interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening the LSPIT’s turnaround time, are of great business value and are in great need. Motivated by this need, in this paper, we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist the experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts’ interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology’s effectiveness using the LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction. PMID:29068431
DeepSynergy: predicting anti-cancer drug synergy with Deep Learning
Preuer, Kristina; Lewis, Richard P I; Hochreiter, Sepp; Bender, Andreas; Bulusu, Krishna C; Klambauer, Günter
2018-01-01
Motivation: While drug combination therapies are a well-established concept in cancer treatment, identifying novel synergistic combinations is challenging due to the size of combinatorial space. However, computational approaches have emerged as a time- and cost-efficient way to prioritize combinations to test, based on recently available large-scale combination screening data. Recently, Deep Learning has had an impact in many research areas by achieving new state-of-the-art model performance. However, Deep Learning has not yet been applied to drug synergy prediction, which is the approach we present here, termed DeepSynergy. DeepSynergy uses chemical and genomic information as input information, a normalization strategy to account for input data heterogeneity, and conical layers to model drug synergies. Results: DeepSynergy was compared to other machine learning methods such as Gradient Boosting Machines, Random Forests, Support Vector Machines and Elastic Nets on the largest publicly available synergy dataset with respect to mean squared error. DeepSynergy significantly outperformed the other methods with an improvement of 7.2% over the second best method at the prediction of novel drug combinations within the space of explored drugs and cell lines. At this task, the mean Pearson correlation coefficient between the measured and the predicted values of DeepSynergy was 0.73. Applying DeepSynergy for classification of these novel drug combinations resulted in a high predictive performance of an AUC of 0.90. Furthermore, we found that all compared methods exhibit low predictive performance when extrapolating to unexplored drugs or cell lines, which we suggest is due to limitations in the size and diversity of the dataset. We envision that DeepSynergy could be a valuable tool for selecting novel synergistic drug combinations. Availability and implementation: DeepSynergy is available via www.bioinf.jku.at/software/DeepSynergy. PMID:29253077
Predicting human liver microsomal stability with machine learning techniques.
Sakiyama, Yojiro; Yuki, Hitomi; Moriya, Takashi; Hattori, Kazunari; Suzuki, Misaki; Shimada, Kaoru; Honma, Teruki
2008-02-01
To ensure a continuing pipeline in pharmaceutical research, lead candidates must possess appropriate metabolic stability in the drug discovery process. In vitro ADMET (absorption, distribution, metabolism, elimination, and toxicity) screening provides us with useful information regarding the metabolic stability of compounds. However, before the synthesis stage, an efficient process is required in order to deal with the vast quantity of data from large compound libraries and high-throughput screening. Here we have derived a relationship between the chemical structure and its metabolic stability for a data set of in-house compounds by means of various in silico machine learning such as random forest, support vector machine (SVM), logistic regression, and recursive partitioning. For model building, 1952 proprietary compounds comprising two classes (stable/unstable) were used with 193 descriptors calculated by Molecular Operating Environment. The results using test compounds have demonstrated that all classifiers yielded satisfactory results (accuracy > 0.8, sensitivity > 0.9, specificity > 0.6, and precision > 0.8). Above all, classification by random forest as well as SVM yielded kappa values of approximately 0.7 in an independent validation set, slightly higher than other classification tools. These results suggest that nonlinear/ensemble-based classification methods might prove useful in the area of in silico ADME modeling.
Jaya, T; Dheeba, J; Singh, N Albert
2015-12-01
Diabetic retinopathy is a major cause of vision loss in diabetic patients. Currently, there is a need for making decisions using intelligent computer algorithms when screening a large volume of data. This paper presents an expert decision-making system designed using a fuzzy support vector machine (FSVM) classifier to detect hard exudates in fundus images. The optic discs in the colour fundus images are segmented to avoid false alarms using morphological operations and based on circular Hough transform. To discriminate between the exudates and the non-exudates pixels, colour and texture features are extracted from the images. These features are given as input to the FSVM classifier. The classifier analysed 200 retinal images collected from diabetic retinopathy screening programmes. The tests made on the retinal images show that the proposed detection system has better discriminating power than the conventional support vector machine. With the best combination of FSVM and features sets, the area under the receiver operating characteristic curve reached 0.9606, which corresponds to a sensitivity of 94.1% with a specificity of 90.0%. The results suggest that detecting hard exudates using FSVM contribute to computer-assisted detection of diabetic retinopathy and as a decision support system for ophthalmologists.
Investigation of a less rare-earth permanent-magnet machine with the consequent pole rotor
NASA Astrophysics Data System (ADS)
Bai, Jingang; Liu, Jiaqi; Wang, Mingqiao; Zheng, Ping; Liu, Yong; Gao, Haibo; Xiao, Lijun
2018-05-01
Due to the rising price of rare-earth materials, permanent-magnet (PM) machines in different applications show a trend toward reducing the use of rare-earth materials. Since iron-core poles replace half of the PM poles in the consequent-pole (CP) rotor, the PM machine with a CP rotor is a promising candidate for a less-rare-earth PM machine. Additionally, the investigation of the CP rotor in special electrical machines, such as hybrid-excitation PM machines and bearingless motors, has verified the application feasibility of the CP rotor. Therefore, this paper focuses on the design and performance of PM machines when a traditional PM machine uses the CP rotor. In the CP rotor, all the PMs are of the same polarity and are inserted into the rotor core. Since the fundamental PM flux density depends on the ratio of PM poles to iron-core poles, the combination rule between them is investigated by analytical and finite-element methods. On this basis, to comprehensively analyze and evaluate the PM machine with a CP rotor, four typical schemes, i.e., integer-slot machines with CP and surface-mounted PM (SPM) rotors and fractional-slot machines with CP and SPM rotors, are designed to investigate the performance of the PM machine with a CP rotor, including electromagnetic performance, anti-demagnetization capacity, and cost.
Methods for Effective Virtual Screening and Scaffold-Hopping in Chemical Compounds
2007-04-04
[Report-form residue; the abstract itself is not recoverable from the source record. Surviving fragments indicate that the methods were evaluated on Opterons with 4 GB of memory using the descriptor spaces GF, ECZ3, and ErG, and cite prior work on data fusion and machine learning to enhance the effectiveness of similarity searching (J. Chem. Inf. Model., 46:462-470, 2006).]
Machine learning of molecular properties: Locality and active learning
NASA Astrophysics Data System (ADS)
Gubaev, Konstantin; Podryabinkin, Evgeny V.; Shapeev, Alexander V.
2018-06-01
In recent years, machine learning techniques have shown great potential in various problems from a multitude of disciplines, including materials design and drug discovery. The high computational speed on the one hand, and the accuracy comparable to that of density functional theory on the other, make machine learning algorithms efficient for high-throughput screening through chemical and configurational space. However, the machine learning algorithms available in the literature require large training datasets to reach chemical accuracy and also show large errors for the so-called outliers—the out-of-sample molecules not well represented in the training set. In the present paper, we propose a new machine learning algorithm for predicting molecular properties that addresses these two issues: it is based on a local model of interatomic interactions providing high accuracy when trained on relatively small training sets, and an active learning algorithm for optimally choosing the training set that significantly reduces the errors for the outliers. We compare our model to the other state-of-the-art algorithms from the literature on widely used benchmark tests.
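The paper's selection criterion is specific to its local interatomic model and is not reproduced here; the generic active learning loop it relies on — iteratively adding the pool samples on which the current model is most uncertain — can nonetheless be sketched. In the sketch below, ensemble disagreement of a random forest stands in for the model's uncertainty measure, which is an assumption, not the authors' method.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
X_pool = rng.uniform(-3, 3, size=(2000, 5))   # candidate molecules (features)
y_pool = np.sin(X_pool).sum(axis=1)           # surrogate "property" to learn

labeled = list(range(20))                     # small initial training set
for _ in range(10):
    model = RandomForestRegressor(n_estimators=100, random_state=3)
    model.fit(X_pool[labeled], y_pool[labeled])
    # Ensemble disagreement as an uncertainty proxy: pick the pool point
    # where individual trees disagree most (a likely "outlier").
    per_tree = np.stack([t.predict(X_pool) for t in model.estimators_])
    uncertainty = per_tree.std(axis=0)
    uncertainty[labeled] = -np.inf            # never re-pick labeled points
    labeled.append(int(uncertainty.argmax()))
```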
Hardware support for software controlled fast multiplexing of performance counters
Salapura, Valentina; Wisniewski, Robert W
2013-10-01
Performance counters may be operable to collect one or more counts of one or more selected activities, and registers may be operable to store a set of performance counter configurations. A state machine may be operable to automatically select a register from the registers for reconfiguring the one or more performance counters in response to receiving a first signal. The state machine may be further operable to reconfigure the one or more performance counters based on a configuration specified in the selected register. The state machine yet further may be operable to copy data in selected one or more of the performance counters to a memory location, or to copy data from the memory location to the counters, in response to receiving a second signal. The state machine may be operable to store or restore the counter values and state machine configuration in response to a context switch event.
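The patent describes hardware, but its control flow can be mirrored in a short conceptual model: a register file of counter configurations, a first signal that rotates to the next configuration, and a second signal that copies counter values out to memory. The Python sketch below is purely illustrative; the class, event names, and structure are invented, not taken from the patent.

```python
class CounterMultiplexer:
    """Conceptual software model of the described mechanism (not hardware)."""

    def __init__(self, configurations):
        self.configs = configurations            # register file of configs
        self.current = 0
        self.counters = {e: 0 for e in configurations[0]}
        self.saved = []                          # stand-in for the memory region

    def on_first_signal(self):
        # Select the next register and reconfigure the performance counters.
        self.current = (self.current + 1) % len(self.configs)
        self.counters = {e: 0 for e in self.configs[self.current]}

    def on_second_signal(self):
        # Copy the current counter values out to memory.
        self.saved.append(dict(self.counters))

mux = CounterMultiplexer([("cycles", "cache_misses"), ("branches", "stalls")])
mux.on_first_signal()   # now counting branches and stalls
mux.on_second_signal()  # snapshot copied to "memory"
```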
Machine learning approaches to analysing textual injury surveillance data: a systematic review.
Vallmuur, Kirsten
2015-06-01
To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Systematic review. The electronic databases which were searched included PubMed, Cinahl, Medline, Google Scholar, and Proquest. The bibliography of all relevant articles was examined and associated articles were identified using a snowballing technique. For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods to either predict injury categories or extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalizability, source data quality, complexity of models and integration of content and technical knowledge were discussed. The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, and involvement of computer scientists in the injury prevention field, along with more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see a continued growth and advancement in knowledge of text mining in the injury field. Copyright © 2015 Elsevier Ltd. All rights reserved.
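As a concrete illustration of the Bayesian text-classification approach the review found most common, the sketch below trains a multinomial naive Bayes model on a few invented injury narratives; real surveillance datasets would supply thousands of coded records, and the narratives and categories here are toy placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny invented narratives; real surveillance data hold thousands of records.
narratives = ["fell from ladder while painting ceiling",
              "burned hand on hot press at work",
              "slipped on wet floor in kitchen",
              "caught finger in conveyor belt"]
categories = ["fall", "burn", "fall", "machinery"]

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(narratives, categories)
print(model.predict(["worker fell off scaffolding"]))   # expected: ['fall']
```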
CHARACTERIZATION OF Pro-Beam LOW VOLTAGE ELECTRON BEAM WELDING MACHINE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burgardt, Paul; Pierce, Stanley W.
The purpose of this paper is to present and discuss data related to the performance of a newly acquired low voltage electron beam welding machine. The machine was made by Pro-Beam AG & Co. KGaA of Germany. This machine was recently installed at LANL in building SM-39; a companion machine was installed in the production facility. The PB machine is substantially different from the EBW machines typically used at LANL and, therefore, it is important to understand its characteristics as well as possible. Our basic purpose in this paper is to present basic machine performance data and to compare those with similar results from the existing EBW machines. It is hoped that this data will provide a historical record of this machine's characteristics as well as being helpful for transferring welding processes from the old EBW machines to the PB machine or comparable machines that may be purchased in the future.
NASA Astrophysics Data System (ADS)
Xu, Lili; Luo, Shuqian
2010-11-01
Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role for both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm basically comprises the following stages: candidate detection, aiming at extracting the patterns possibly corresponding to MAs based on the mathematical-morphology black top-hat; feature extraction, to characterize these candidates; and classification, based on a support vector machine (SVM), to validate MAs. Selection of the feature vector and of the SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the discriminating performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM, with a combination of features as input, shows the best discriminating performance.
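A hedged sketch of the candidate-detection stage follows, using scikit-image's black top-hat and simple region features feeding a quadratic polynomial SVM. The threshold, footprint size, and feature choices are illustrative assumptions rather than the authors' parameters.

```python
import numpy as np
from skimage.morphology import black_tophat, disk
from skimage.measure import label, regionprops
from sklearn.svm import SVC

def candidate_features(green_channel):
    """Extract MA candidates via black top-hat, then simple region features."""
    tophat = black_tophat(green_channel, footprint=disk(5))
    mask = tophat > tophat.mean() + 2 * tophat.std()   # illustrative threshold
    feats = [(r.area, r.eccentricity, r.mean_intensity)
             for r in regionprops(label(mask), intensity_image=green_channel)]
    return np.array(feats)

# A quadratic polynomial kernel, as favored by the ROC analysis in the paper.
clf = SVC(kernel="poly", degree=2)
# clf.fit(features, labels) would follow, given annotated candidate regions.
```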
An iterative learning control method with application for CNC machine tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, D.I.; Kim, S.
1996-01-01
A proportional, integral, and derivative (PID) type iterative learning controller is proposed for precise tracking control of industrial robots and computer numerical controller (CNC) machine tools performing repetitive tasks. The convergence of the output error under the proposed learning controller is guaranteed under a certain condition even when the system parameters are not known exactly and unknown external disturbances exist. As the proposed learning controller is repeatedly applied to the industrial robot or the CNC machine tool with a path-dependent repetitive task, the distance difference between the desired path and the actual tracked or machined path, which is one of the most significant factors in the evaluation of control performance, is progressively reduced. The experimental results demonstrate that the proposed learning controller can improve machining accuracy when the CNC machine tool performs repetitive machining tasks.
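The controller's exact formulation is given in the paper and is not reproduced here; a generic PID-type iterative learning update, applied to a toy first-order plant over repeated trials, illustrates the mechanism. The gains, plant, and reference trajectory below are invented for the sketch.

```python
import numpy as np

def ilc_update(u, e, dt, kp=0.5, ki=0.1, kd=0.05):
    """PID-type iterative learning update:
    u_{k+1}(t) = u_k(t) + kp*e_k(t) + ki*integral(e_k) + kd*d/dt e_k(t)."""
    return u + kp * e + ki * np.cumsum(e) * dt + kd * np.gradient(e, dt)

# Toy first-order plant y' = -y + u, tracked over repeated trials.
dt = 0.01
t = np.arange(0, 1, dt)
y_des = np.sin(2 * np.pi * t)
u = np.zeros_like(t)
for trial in range(20):
    y = np.zeros_like(t)
    for i in range(1, len(t)):
        y[i] = y[i - 1] + dt * (-y[i - 1] + u[i - 1])
    e = y_des - y                 # per-trial tracking error
    u = ilc_update(u, e, dt)      # refine the feedforward input
print("final RMS tracking error:", np.sqrt(np.mean(e**2)))
```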
Carpenter, Kristy A; Huang, Xudong
2018-06-07
Virtual Screening (VS) has emerged as an important tool in the drug development process, as it conducts efficient in silico searches over millions of compounds, ultimately increasing yields of potential drug leads. As a subset of Artificial Intelligence (AI), Machine Learning (ML) is a powerful way of conducting VS for drug leads. ML for VS generally involves assembling a filtered training set of compounds, comprised of known actives and inactives. After training, the model is validated and, if sufficiently accurate, used on previously unseen databases to screen for novel compounds with desired drug target binding activity. This study aims to review ML-based methods used for VS and their applications to Alzheimer's disease (AD) drug discovery. To update the current knowledge on ML for VS, we review thorough backgrounds, explanations, and VS applications of the following ML techniques: Naïve Bayes (NB), k-Nearest Neighbors (kNN), Support Vector Machines (SVM), Random Forests (RF), and Artificial Neural Networks (ANN). All techniques have found success in VS, but the future of VS is likely to lean more heavily toward the use of neural networks - and more specifically, Convolutional Neural Networks (CNN), a subset of ANN that utilize convolution. We additionally conceptualize a workflow for conducting ML-based VS for potential therapeutics for AD, a complex neurodegenerative disease with no known cure or means of prevention. This serves both as an example of how to apply the concepts introduced earlier in the review and as a potential workflow for future implementation. Different ML techniques are powerful tools for VS, albeit each with advantages and disadvantages. ML-based VS can be applied to AD drug development. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
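The generic ML-for-VS workflow described above — fingerprint known actives and inactives, train a classifier, then rank an unseen library by predicted activity — can be sketched in a few lines with RDKit and scikit-learn. The SMILES strings and labels below are toy placeholders, not a curated training set.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def fingerprint(smiles, n_bits=1024):
    """Morgan (circular) fingerprint as a fixed-length bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    return np.array(AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits))

# Toy training set of known actives (1) and inactives (0).
train_smiles = ["CCO", "CC(=O)Oc1ccccc1C(=O)O", "c1ccccc1", "CCN(CC)CC"]
labels = [0, 1, 0, 1]

X = np.array([fingerprint(s) for s in train_smiles])
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)

# Screen an unseen "library" and rank by predicted activity probability.
library = ["CC(C)Cc1ccc(cc1)C(C)C(=O)O", "O=C(O)c1ccccc1"]
scores = model.predict_proba([fingerprint(s) for s in library])[:, 1]
print(sorted(zip(library, scores), key=lambda p: -p[1]))
```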
Partitioned learning of deep Boltzmann machines for SNP data.
Hess, Moritz; Lenz, Stefan; Blätte, Tamara J; Bullinger, Lars; Binder, Harald
2017-10-15
Learning the joint distributions of measurements, and in particular identification of an appropriate low-dimensional manifold, has been found to be a powerful ingredient of deep learning approaches. Yet, such approaches have hardly been applied to single nucleotide polymorphism (SNP) data, probably due to the high number of features typically exceeding the number of studied individuals. After a brief overview of how deep Boltzmann machines (DBMs), a deep learning approach, can be adapted to SNP data in principle, we specifically present a way to alleviate the dimensionality problem by partitioned learning. We propose a sparse regression approach to coarsely screen the joint distribution of SNPs, followed by training several DBMs on SNP partitions that were identified by the screening. Aggregate features representing SNP patterns and the corresponding SNPs are extracted from the DBMs by a combination of statistical tests and sparse regression. In simulated case-control data, we show how this can uncover complex SNP patterns and augment results from univariate approaches, while maintaining type 1 error control. Time-to-event endpoints are considered in an application with acute myeloid leukemia patients, where SNP patterns are modeled after a pre-screening based on gene expression data. The proposed approach identified three SNPs that seem to jointly influence survival in a validation dataset. This indicates the added value of jointly investigating SNPs compared to standard univariate analyses and makes partitioned learning of DBMs an interesting complementary approach when analyzing SNP data. A Julia package is provided at 'http://github.com/binderh/BoltzmannMachines.jl'. binderh@imbi.uni-freiburg.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
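The authors' implementation is the Julia package cited above; as a language-agnostic illustration of just the coarse screening step, the sketch below uses sparse (L1-penalized) logistic regression to retain a small subset of SNPs from a p >> n genotype matrix. The data, causal indices, and penalty are simulated assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n, p = 300, 5000                                     # individuals x SNPs (p >> n)
X = rng.integers(0, 3, size=(n, p)).astype(float)    # genotypes coded 0/1/2
y = (X[:, 7] + X[:, 42] + rng.normal(size=n) > 2.5).astype(int)

# Sparse (L1) logistic regression as a coarse screen of the SNP space;
# the surviving SNPs would then be partitioned for DBM training.
screen = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
screen.fit(X, y)
selected = np.flatnonzero(screen.coef_[0])
print("SNPs retained for partitioned training:", selected[:20])
```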
Modelling parallel programs and multiprocessor architectures with AXE
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.
1991-01-01
AXE, An Experimental Environment for Parallel Systems, was designed to model and simulate parallel systems at the process level. It provides an integrated environment for specifying computation models, multiprocessor architectures, data collection, and performance visualization. AXE is being used at NASA-Ames for developing resource management strategies, parallel problem formulation, multiprocessor architectures, and operating system issues related to the High Performance Computing and Communications Program. AXE's simple, structured user interface enables the user to model parallel programs and machines precisely and efficiently. Its quick turn-around time keeps the user interested and productive. AXE models multicomputers. The user may easily modify various architectural parameters including the number of sites, connection topologies, and overhead for operating system activities. Parallel computations in AXE are represented as collections of autonomous computing objects known as players; their use and behavior are described. Performance data of the multiprocessor model can be observed on a color screen, including CPU and message routing bottlenecks and the dynamic status of the software.
Hall, R M; Unsworth, A
1997-08-01
Although the reduction of frictional torques was the driving force behind the design of the Charnley prosthesis, later concerns about wear and subsequent loosening of this and other hip replacements have dominated debate within the bioengineering community. To stimulate discussion on the role of friction in loosening, a review of the frictional characteristics of different prostheses was undertaken. The use of simple laboratory screening-type machines in the frictional assessment of different material combinations is discussed together with experiments performed on single axis simulators using both conventional and experimental prostheses. In particular, recent developments in the use of soft layer components are highlighted. Further, the possible link between excessively high frictional torques and loosening is discussed in the light of current results obtained from explanted prostheses.
Reverse engineering of Wörner-type drilling machine structure.
NASA Astrophysics Data System (ADS)
Wibowo, A.; Belly, I.; llhamsyah, R.; Indrawanto; Yuwana, Y.
2018-03-01
A product design sometimes needs to be modified to suit the conditions of the production facilities and the capabilities of existing resources, without reducing the functional aspects of the product itself. This paper describes the reverse-engineering process applied to the main structure of a Wörner-type drilling machine, to obtain a machine structure design that can be produced with limited resources using simple processes. Structural, functional, and working-mechanism analyses were performed to understand the function and role of each basic component. The drilling machine was dismantled and each basic component measured to obtain sets of geometry and size data. Geometric models of the structural components and of the machine assembly were built to facilitate simulation and machine performance analysis with reference to the ISO standard for drilling machines. A tolerance stack-up analysis was also performed to determine the types and values of geometrical and dimensional tolerances that could affect the ease with which the components can be manufactured and assembled.
Detecting Abnormal Machine Characteristics in Cloud Infrastructures
NASA Technical Reports Server (NTRS)
Bhaduri, Kanishka; Das, Kamalika; Matthews, Bryan L.
2011-01-01
In the cloud computing environment, resources are accessed as services rather than as a product. Monitoring this system for performance is crucial because of the typical pay-per-use packages bought by users for their jobs. With the huge number of machines currently in the cloud system, it is often extremely difficult for system administrators to keep track of all machines using distributed monitoring programs such as Ganglia, which lacks system health assessment and summarization capabilities. To overcome this problem, we propose a technique for automated anomaly detection using machine performance data in the cloud. Our algorithm is entirely distributed and runs locally on each computing machine in the cloud in order to rank the machines by their anomalous behavior for given jobs. There is no need to centralize any of the performance data for the analysis, and at the end of the analysis our algorithm generates error reports, thereby allowing the system administrators to take corrective actions. Experiments performed on real data sets collected for different jobs validate the fact that our algorithm has a low overhead for tracking anomalous machines in a cloud infrastructure.
Legrain, Fleur; Carrete, Jesús; van Roekeghem, Ambroise; Madsen, Georg K H; Mingo, Natalio
2018-01-18
Machine learning (ML) is increasingly becoming a helpful tool in the search for novel functional compounds. Here we use classification via random forests to predict the stability of half-Heusler (HH) compounds, using only experimentally reported compounds as a training set. Cross-validation yields an excellent agreement between the fraction of compounds classified as stable and the actual fraction of truly stable compounds in the ICSD. The ML model is then employed to screen 71,178 different 1:1:1 compositions, yielding 481 likely stable candidates. The predicted stability of HH compounds from three previous high-throughput ab initio studies is critically analyzed from the perspective of the alternative ML approach. The incomplete consistency among the three separate ab initio studies and between them and the ML predictions suggests that additional factors beyond those considered by ab initio phase stability calculations might be determinants of the stability of the compounds. Such factors can include configurational entropies and quasiharmonic contributions.
Choice and maintenance of equipment for electron crystallography.
Mills, Deryck J; Vonck, Janet
2013-01-01
The choice of equipment for an electron crystallography laboratory will ultimately be determined by the available budget; nevertheless, the ideal lab will have two electron microscopes: a dedicated 300 kV cryo-EM with a field emission gun and a smaller LaB6 machine for screening. The high-end machine should be equipped with photographic film or a very large CCD or CMOS camera for 2D crystal data collection; the screening microscope needs a mid-size CCD for rapid evaluation of crystal samples. The microscope room installations should provide adequate space and a special environment that puts no restrictions on the collection of high-resolution data. Equipment for specimen preparation includes a carbon coater, glow discharge unit, light microscope, plunge freezer, and liquid nitrogen containers and storage dewars. When photographic film is to be used, additional requirements are a film desiccator, dark room, optical diffractometer, and a film scanner. Having the electron microscopes and ancillary equipment well maintained and always in optimum condition facilitates the production of high-quality data.
NASA Astrophysics Data System (ADS)
Matras, A.; Kowalczyk, R.
2014-11-01
The results of a machining-accuracy analysis of free-form surface milling simulations (based on machining EN AW-7075 alloy) for different machining strategies (Level Z, Radial, Square, Circular) are presented in this work. The individual milling simulations were performed using Esprit CAD/CAM software. The accuracy of the obtained allowance is defined as the difference between the theoretical surface of the workpiece (the surface designed in the CAD software) and the machined surface after a milling simulation. The difference between the two surfaces describes the roughness value that results from mapping the tool shape onto the machined surface. The accuracy of the remaining allowance directly indicates the surface quality after finish machining. The described methodology of using CAD/CAM software can shorten the design time of a machining process for free-form surface milling on a 5-axis CNC milling machine, since it removes the need to machine the part on a milling machine in order to measure the machining accuracy for the selected strategies and cutting data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demerdash, N.A.; Nehl, T.W.; Nyamusa, T.A.
1985-08-01
Effects of high momentary overloads on the samarium-cobalt and strontium-ferrite permanent magnets and the magnetic field in electronically commutated brushless dc machines, as well as their impact on the associated machine parameters, were studied. The effect of overload on the machine parameters, and subsequently on machine system performance, was also investigated. This was accomplished through the combined use of finite element analysis of the magnetic field in such machines, perturbation of the magnetic energies to determine machine inductances, and dynamic simulation of the performance of brushless dc machines when energized from voltage source inverters. These effects were investigated through application of the above methods to two equivalent 15 hp brushless dc motors, one of which was built with samarium-cobalt magnets, while the other was built with strontium-ferrite magnets. For momentary overloads as high as 4.5 p.u., magnet flux reductions of 29% and 42% of the no-load flux were obtained in the samarium-cobalt and strontium-ferrite machines, respectively. Corresponding reductions in the line-to-line armature inductances of 52% and 46% of the no-load values were reported for the samarium-cobalt and strontium-ferrite cases, respectively. The overload affected the profiles and magnitudes of the armature induced back emfs. Subsequently, the effects of overload on machine parameters were found to have significant impact on the performance of the machine systems, where findings indicate that the samarium-cobalt unit is better suited for higher overload duties than the strontium-ferrite machine.
Develop a solution for protecting and securing enterprise networks from malicious attacks
NASA Astrophysics Data System (ADS)
Kamuru, Harshitha; Nijim, Mais
2014-05-01
In the world of computer and network security, there are myriad ways to launch an attack, which, from the perspective of a network, can usually be defined as "traffic that has malicious intent." A firewall acts as one measure to secure a device from incoming unauthorized data. There are countless computer attacks that no firewall can prevent, such as those executed locally on the machine by a malicious user. From the network's perspective, there are numerous types of attack. All the attacks that degrade the effectiveness of data can be grouped into two types: brute force and precision. The Juniper firewall has the capability to protect against both types of attack. Denial of Service (DoS) attacks are among the most well-known network security threats under brute force attacks, largely due to the high-profile way in which they can affect networks. Over the years, some of the largest, most respected Internet sites have been effectively taken offline by DoS attacks. A DoS attack typically has a singular focus, namely, to cause the services running on a particular host or network to become unavailable. Some DoS attacks exploit vulnerabilities in an operating system and cause it to crash, such as the infamous WinNuke attack. Others flood a network or device with traffic so that no resources remain to handle legitimate traffic. Precision attacks typically involve multiple phases and often require more planning than brute force attacks, all the way from reconnaissance to machine ownership. Before a precision attack is launched, information about the victim needs to be gathered. This information gathering typically takes the form of various types of scans to determine available hosts, networks, and ports. The hosts available on a network can be determined by ping sweeps; the available ports on a machine can be located by port scans. Screens cover a wide variety of attack traffic, as they are configured on a per-zone basis. Depending on the type of screen being configured, there may be additional settings beyond simply blocking the traffic. Attack prevention is also a native function of any firewall. The Juniper firewall handles traffic on a per-flow basis. We can use flows or sessions as a way to determine whether traffic attempting to traverse the firewall is legitimate. We control the state-checking components resident in the Juniper firewall by configuring "flow" settings. These settings allow state checking to be configured for various conditions on the device. Flow settings can be used to protect against TCP hijacking, and generally to ensure that the firewall performs full state processing when desired. We present a case study of an attack on a network and study the detection of the malicious packets on a NetScreen firewall. A new solution for securing enterprise networks is developed here.
Yousefi, Mina; Krzyżak, Adam; Suen, Ching Y
2018-05-01
Digital breast tomosynthesis (DBT) was developed in the field of breast cancer screening as a new tomographic technique to minimize the limitations of conventional digital mammography breast screening methods. A computer-aided detection (CAD) framework for mass detection in DBT has been developed and is described in this paper. The proposed framework operates on a set of two-dimensional (2D) slices. With plane-to-plane analysis on corresponding 2D slices from each DBT, it automatically learns complex patterns of 2D slices through a deep convolutional neural network (DCNN). It then applies multiple instance learning (MIL) with a randomized trees approach to classify DBT images based on extracted information from 2D slices. This CAD framework was developed and evaluated using 5040 2D image slices derived from 87 DBT volumes. The empirical results demonstrate that this proposed CAD framework achieves much better performance than CAD systems that use hand-crafted features and deep cardinality-restricted Boltzmann machines to detect masses in DBTs. Copyright © 2018 Elsevier Ltd. All rights reserved.
Obstructive Sleep Apnea Screening Using a Piezo-Electric Sensor
2017-01-01
In this study, we propose a novel method for obstructive sleep apnea (OSA) detection using a piezo-electric sensor. OSA is a relatively common sleep disorder. However, more than 80% of OSA patients remain undiagnosed. We investigated the feasibility of OSA assessment using a single-channel physiological signal to simplify the OSA screening. We detected both snoring and heartbeat information by using a piezo-electric sensor, and snoring index (SI) and features based on pulse rate variability (PRV) analysis were extracted from the filtered piezo-electric sensor signal. A support vector machine (SVM) was used as a classifier to detect OSA events. The performance of the proposed method was evaluated on 45 patients from mild, moderate, and severe OSA groups. The method achieved a mean sensitivity, specificity, and accuracy of 72.5%, 74.2%, and 71.5%; 85.8%, 80.5%, and 80.0%; and 70.3%, 77.1%, and 71.9% for the mild, moderate, and severe groups, respectively. Finally, these results not only show the feasibility of OSA detection using a piezo-electric sensor, but also illustrate its usefulness for monitoring sleep and diagnosing OSA. PMID:28480645
[Automatic pre-transfusion serology].
Wattar, B; Govaerts, A
1975-12-01
This paper describes an automated apparatus combining Rosenfield's and Lalezari's basic antibody screening and identification techniques. PVP-bromelin and low-ionic-strength acid polybren channels are used; agglutinates are decanted, the remaining cells are hemolyzed, and the optical density is then measured through a colorimeter and recorded on a chart; throughput is 40 samples per hour. This machine was also used for irregular antibody screening and identification. Sensitivity is shown to be equal to that of manual techniques for the detection of ABO, Lewis, and Lutheran as well as K, S, M, Kpb, Xga, U and Vel antibodies. Nevertheless, a much greater sensitivity than manual techniques (titers 3 to 10 times higher) is achieved for the detection of Rh, -k, S, and Fya antibodies. The polybren channel is suitable for anti-Rh, Duffy, I and M (human) detection; the bromelin channel, however, has greater sensitivity for other specificities. Anti-M and anti-N sera from rabbits were shown to be nonspecific when using this machine. Across almost 15,000 sera tested, no antibody (detected by manual techniques) escaped the automated screening. This antibody detection machine was applied to compatibility tests prior to transfusion (21,480 units, intended for transfusion to 5,611 patients, were tested). A third channel, PVP without bromelin, was run in parallel so as not to let any anti-M escape, even a weak one. The sera distributor was slaved to the cells distributor so that the whole procedure was automated. Furthermore, each serum was tested not only against the red cells to be transfused but also against the patient's own red cells and against two selected red-cell panels, so as to ensure irregular antibody detection at the same time. Using this machine, 3 to 4% of the cell samples were rejected, i.e., more than with the usual techniques. All manually detected antibodies were identified, along with some others that showed only weak reactions by classical techniques. Complete results can be obtained within 20 to 30 minutes, which is quite rapid compared to techniques using, for example, antiglobulin tests.
Merritt, Stephanie M; Ilgen, Daniel R
2008-04-01
We provide an empirical demonstration of the importance of attending to human user individual differences in examinations of trust and automation use. Past research has generally supported the notions that machine reliability predicts trust in automation, and trust in turn predicts automation use. However, links between user personality and perceptions of the machine with trust in automation have not been empirically established. On our X-ray screening task, 255 students rated trust and made automation use decisions while visually searching for weapons in X-ray images of luggage. We demonstrate that individual differences affect perceptions of machine characteristics when actual machine characteristics are constant, that perceptions account for 52% of trust variance above the effects of actual characteristics, and that perceptions mediate the effects of actual characteristics on trust. Importantly, we also demonstrate that when administered at different times, the same six trust items reflect two types of trust (dispositional trust and history-based trust) and that these two trust constructs are differentially related to other variables. Interactions were found among user characteristics, machine characteristics, and automation use. Our results suggest that increased specificity in the conceptualization and measurement of trust is required, future researchers should assess user perceptions of machine characteristics in addition to actual machine characteristics, and incorporation of user extraversion and propensity to trust machines can increase prediction of automation use decisions. Potential applications include the design of flexible automation training programs tailored to individuals who differ in systematic ways.
NASA Astrophysics Data System (ADS)
Liu, Xiangquan
Based on the treatment needs of patients with limb movement disorders, the functions of the measurement and control system of a limb rehabilitative training prototype are analyzed, and the design of the system hardware and software is completed. A touch screen, adopted as the host computer and man-machine interface, is responsible for sending commands and displaying training information. A PLC, adopted as the slave computer, receives control commands from the touch screen, collects sensor data, and regulates motor torque and speed through analog output according to the selected training mode, ultimately realizing both active and passive training for limb rehabilitation therapy.
NASA Astrophysics Data System (ADS)
Schierz, Amanda C.; King, Ross D.
Compounds in drug screening-libraries should resemble pharmaceuticals. To operationally test this, we analysed the compounds in terms of known drug-like filters and developed a novel machine learning method to discriminate approved pharmaceuticals from "drug-like" compounds. This method uses both structural features and molecular properties for discrimination. The method has an estimated accuracy of 91% in discriminating between the Maybridge HitFinder library and approved pharmaceuticals, and 99% between the NATDiverse collection (from Analyticon Discovery) and approved pharmaceuticals. These results show that Lipinski's Rule of 5 for oral absorption is not sufficient to describe "drug-likeness" or to serve as the main basis of screening-library design.
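Since the passage contrasts learned discrimination with Lipinski's Rule of 5, the rule itself is easy to state explicitly; the sketch below implements it with RDKit descriptors. This is the standard rule as commonly written, not the authors' classifier.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

def passes_rule_of_five(smiles):
    """Lipinski's Rule of 5: MW <= 500, logP <= 5,
    H-bond donors <= 5, H-bond acceptors <= 10."""
    mol = Chem.MolFromSmiles(smiles)
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Descriptors.NumHDonors(mol) <= 5
            and Descriptors.NumHAcceptors(mol) <= 10)

print(passes_rule_of_five("CC(=O)Oc1ccccc1C(=O)O"))   # aspirin -> True
```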
Held, Elizabeth; Cape, Joshua; Tintle, Nathan
2016-01-01
Machine learning methods continue to show promise in the analysis of data from genetic association studies because of the high number of variables relative to the number of observations. However, few best practices exist for the application of these methods. We extend a recently proposed supervised machine learning approach for predicting disease risk by genotypes to be able to incorporate gene expression data and rare variants. We then apply 2 different versions of the approach (radial and linear support vector machines) to simulated data from Genetic Analysis Workshop 19 and compare performance to logistic regression. Method performance was not radically different across the 3 methods, although the linear support vector machine tended to show small gains in predictive ability relative to a radial support vector machine and logistic regression. Importantly, as the number of genes in the models was increased, even when those genes contained causal rare variants, model predictive ability showed a statistically significant decrease in performance for both the radial support vector machine and logistic regression. The linear support vector machine showed more robust performance to the inclusion of additional genes. Further work is needed to evaluate machine learning approaches on larger samples and to evaluate the relative improvement in model prediction from the incorporation of gene expression data.
Prediction of cell penetrating peptides by support vector machines.
Sanders, William S; Johnston, C Ian; Bridges, Susan M; Burgess, Shane C; Willeford, Kenneth O
2011-07-01
Cell penetrating peptides (CPPs) are those peptides that can transverse cell membranes to enter cells. Once inside the cell, different CPPs can localize to different cellular components and perform different roles. Some generate pore-forming complexes resulting in the destruction of cells while others localize to various organelles. Use of machine learning methods to predict potential new CPPs will enable more rapid screening for applications such as drug delivery. We have investigated the influence of the composition of training datasets on the ability to classify peptides as cell penetrating using support vector machines (SVMs). We identified 111 known CPPs and 34 known non-penetrating peptides from the literature and commercial vendors and used several approaches to build training data sets for the classifiers. Features were calculated from the datasets using a set of basic biochemical properties combined with features from the literature determined to be relevant in the prediction of CPPs. Our results using different training datasets confirm the importance of a balanced training set with approximately equal numbers of positive and negative examples. The SVM-based classifiers have greater classification accuracy than previously reported methods for the prediction of CPPs, and because they use primary biochemical properties of the peptides as features, these classifiers provide insight into the properties needed for cell penetration. To confirm our SVM classifications, a subset of peptides classified as either penetrating or non-penetrating was selected for synthesis and experimental validation. Of the synthesized peptides predicted to be CPPs, 100% were shown to be penetrating.
Method and apparatus for characterizing and enhancing the functional performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F; Smith, Kevin Scott; Assaid, Thomas S; McFarland, Justin T; Tursky, David A; Woody, Bethany; Adams, David
2013-04-30
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include workpiece surface finish, and the ability to generate chips of the desired length.
NASA Astrophysics Data System (ADS)
Zhou, Ming; Wu, Jianyang; Xu, Xiaoyi; Mu, Xin; Dou, Yunping
2018-02-01
In order to obtain improved electrical discharge machining (EDM) performance, we have dedicated more than a decade to correcting one essential EDM defect, the weak stability of the machining, by developing adaptive control systems. The instabilities of machining are mainly caused by complicated disturbances in discharging. To counteract the effects of the disturbances on machining, we theoretically developed three control laws, from a minimum variance (MV) control law to a minimum variance and pole placements coupled (MVPPC) control law, and then to a two-step-ahead prediction (TP) control law. Based on real-time estimation of EDM process model parameters and the measured ratio of arcing pulses, which is also called the gap state, the electrode discharging cycle was directly and adaptively tuned so that stable machining could be achieved. To this end, we not only theoretically provide three proven control laws for a developed EDM adaptive control system, but also practically proved the TP control law to be the best in dealing with machining instability and machining efficiency, though the MVPPC control law provided much better EDM performance than the MV control law. It was also shown that the TP control law provided burn-free machining.
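The MV, MVPPC, and TP laws derive from the paper's estimated process model and are not reproduced here; the sketch below only illustrates the outer feedback idea — adjusting the electrode discharging cycle from the measured ratio of arcing pulses — using an invented proportional rule and invented numbers, not any of the three published laws.

```python
def adjust_discharge_cycle(cycle_us, arc_ratio, target=0.1, gain=50.0,
                           lo=10.0, hi=500.0):
    """Lengthen the electrode discharging cycle when the measured ratio of
    arcing pulses (gap state) exceeds the target; shorten it otherwise."""
    cycle_us += gain * (arc_ratio - target)
    return min(max(cycle_us, lo), hi)

cycle = 100.0
for arc_ratio in [0.05, 0.12, 0.30, 0.18, 0.09]:   # measured per control period
    cycle = adjust_discharge_cycle(cycle, arc_ratio)
    print(f"arc ratio {arc_ratio:.2f} -> discharging cycle {cycle:.1f} us")
```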
49 CFR Appendix A to Part 1511 - Aviation Security Infrastructure Fee
Code of Federal Regulations, 2011 CFR
2011-10-01
... final acceptance testing. This includes such equipment as Metal Detection Devices, Hand Wands, X-ray... such equipment as Metal Detection Devices, Hand Wands, X-ray screening machines, Explosives Trace... as test objects and X-ray radiation surveys, electricity costs and maintenance contract costs...
ERIC Educational Resources Information Center
Seeds, Michael A.; Seeds, Kathryn Anne
1983-01-01
Provided is a complete listing (Applesoft Basic) for a children's spelling program. The listing includes a machine language music utility that plays short tunes and uses the Apple's two hi-res screens for animation. Also included is a program that allows pictures to be drawn and saved to animate other programs. (JN)
Automatic detection of atrial fibrillation in cardiac vibration signals.
Brueser, C; Diesel, J; Zink, M D H; Winter, S; Schauerte, P; Leonhardt, S
2013-01-01
We present a study on the feasibility of the automatic detection of atrial fibrillation (AF) from cardiac vibration signals (ballistocardiograms/BCGs) recorded by unobtrusive bed-mounted sensors. The proposed system is intended as a screening and monitoring tool in home-healthcare applications and not as a replacement for ECG-based methods used in clinical environments. Based on BCG data recorded in a study with 10 AF patients, we evaluate and rank seven popular machine learning algorithms (naive Bayes, linear and quadratic discriminant analysis, support vector machines, random forests as well as bagged and boosted trees) for their performance in separating 30 s long BCG epochs into one of three classes: sinus rhythm, atrial fibrillation, and artifact. For each algorithm, feature subsets of a set of statistical time-frequency-domain and time-domain features were selected based on the mutual information between features and class labels as well as first- and second-order interactions among features. The classifiers were evaluated on a set of 856 epochs by means of 10-fold cross-validation. The best algorithm (random forests) achieved a Matthews correlation coefficient, mean sensitivity, and mean specificity of 0.921, 0.938, and 0.982, respectively.
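As an illustration of the evaluation protocol described (three-class epoch classification under 10-fold cross-validation, scored with the Matthews correlation coefficient), the sketch below runs the best-performing model type, a random forest, on simulated stand-in features; the real BCG features are not public, so the data and dimensions are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(5)
X = rng.normal(size=(856, 12))     # stand-in features for 856 BCG epochs
y = rng.integers(0, 3, size=856)   # 0 = sinus rhythm, 1 = AF, 2 = artifact

clf = RandomForestClassifier(n_estimators=300, random_state=5)
pred = cross_val_predict(clf, X, y, cv=10)   # 10-fold cross-validation
print("Matthews correlation coefficient:", matthews_corrcoef(y, pred))
```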
Method and system for fault accommodation of machines
NASA Technical Reports Server (NTRS)
Goebel, Kai Frank (Inventor); Subbu, Rajesh Venkat (Inventor); Rausch, Randal Thomas (Inventor); Frederick, Dean Kimball (Inventor)
2011-01-01
A method for multi-objective fault accommodation using predictive modeling is disclosed. The method includes using a simulated machine that simulates a faulted actual machine, and using a simulated controller that simulates an actual controller. A multi-objective optimization process is performed, based on specified control settings for the simulated controller and specified operational scenarios for the simulated machine controlled by the simulated controller, to generate a Pareto frontier-based solution space relating performance of the simulated machine to settings of the simulated controller, including adjustment to the operational scenarios to represent a fault condition of the simulated machine. Control settings of the actual controller are adjusted, represented by the simulated controller, for controlling the actual machine, represented by the simulated machine, in response to a fault condition of the actual machine, based on the Pareto frontier-based solution space, to maximize desirable operational conditions and minimize undesirable operational conditions while operating the actual machine in a region of the solution space defined by the Pareto frontier.
Uhlig, Johannes; Uhlig, Annemarie; Kunze, Meike; Beissbarth, Tim; Fischer, Uwe; Lotz, Joachim; Wienbeck, Susanne
2018-05-24
The purpose of this study is to evaluate the diagnostic performance of machine learning techniques for malignancy prediction at breast cone-beam CT (CBCT) and to compare them to human readers. Five machine learning techniques, including random forests, back propagation neural networks (BPN), extreme learning machines, support vector machines, and K-nearest neighbors, were used to train diagnostic models on a clinical breast CBCT dataset with internal validation by repeated 10-fold cross-validation. Two independent blinded human readers with profound experience in breast imaging and breast CBCT analyzed the same CBCT dataset. Diagnostic performance was compared using AUC, sensitivity, and specificity. The clinical dataset comprised 35 patients (American College of Radiology density type C and D breasts) with 81 suspicious breast lesions examined with contrast-enhanced breast CBCT. Forty-five lesions were histopathologically proven to be malignant. Among the machine learning techniques, BPNs provided the best diagnostic performance, with AUC of 0.91, sensitivity of 0.85, and specificity of 0.82. The diagnostic performance of the human readers was AUC of 0.84, sensitivity of 0.89, and specificity of 0.72 for reader 1 and AUC of 0.72, sensitivity of 0.71, and specificity of 0.67 for reader 2. AUC was significantly higher for BPN when compared with both reader 1 (p = 0.01) and reader 2 (p < 0.001). Machine learning techniques provide a high and robust diagnostic performance in the prediction of malignancy in breast lesions identified at CBCT. BPNs showed the best diagnostic performance, surpassing human readers in terms of AUC and specificity.
Thermal Dispersion Within a Porous Medium Near a Solid Wall
NASA Technical Reports Server (NTRS)
Simon, T.; McFadden, G.; Ibrahim, M.
2006-01-01
The regenerator is a key component to Stirling cycle machine efficiency. Typical regenerators are of sintered fine wires or layers of fine-wire screens. Such porous materials are contained within solid-wall casings. Thermal energy exchange between the regenerator and the casing is important to cycle performance because the matrix and casing would not have the same axial temperature profile in an actual machine. Exchange from one to the other may allow shunting of thermal energy, reducing cycle efficiency. In this paper, temperature profiles within the near-wall region of the matrix are measured and thermal energy transport, termed thermal dispersion, is inferred. The data show how the wall affects thermal transport. Transport normal to the mean flow direction is by conduction within the solid and fluid and by advective transport within the matrix. In the near-wall region, both may be interrupted from their normal in-core pattern: solid conduction paths are broken and the scales of advective transport are damped. An equation is presented which describes this change for a wire-screen mesh. The near-wall layer typically acts as an insulating layer, which should be considered in design or analysis. Effective thermal conductivity within the core is uniform. In-core transverse effective thermal conductivity values are compared to direct and indirect measurements reported elsewhere and to 3D numerical simulation results, computed previously and reported elsewhere. The 3-D CFD model is composed of six cylinders in cross flow, staggered in arrangement to match the dimensions and porosity of the matrix used in the experiments. The commercial code FLUENT is used to obtain the flow and thermal fields. The thermal dispersion and effective thermal conductivities for the matrix are computed from the results.
Dry Ribbon for Heated Head Automated Fiber Placement
NASA Technical Reports Server (NTRS)
Hulcher, A. Bruce; Marchello, Joseph M.; Hinkley, Jeffrey A.; Johnston, Norman J.; Lamontia, Mark A.
2000-01-01
Ply-by-ply in situ processes involving automated heated head deposition are being developed for fabrication of high performance, high temperature composite structures from low volatile content polymer matrices. This technology requires (1) dry carbon fiber towpreg, (2) consolidation of towpreg to quality, placement-grade unidirectional ribbon or tape, and (3) rapid, in situ, accurate, ply-by-ply robotic placement and consolidation of this material to fabricate a composite structure. In this study, the physical properties of a candidate thermoplastic ribbon, PIXA/IM7, were evaluated and screened for suitability in robotic placement. Specifically, towpreg was prepared from PIXA powder. Various conditions (temperatures) were used to convert the powder-coated towpreg to ribbons with varying degrees of processability. Ribbon within preset specifications was fabricated at 3 temperatures: 390, 400 and 410 C. Ribbon was also produced out-of-spec by purposely overheating the material to a processing temperature of 450 C. Automated placement equipment at Cincinnati Milacron and NASA Langley was used to fabricate laminates from these experimental ribbons. Ribbons were placed at 405 and 450 C by both sets of equipment. Double cantilever beam and wedge peel tests were used to determine the quality of the laminates and, especially, the interlaminar bond formed during the placement process. Ribbon made under conditions expected to be non-optimal (overheated) resulted in poor placeability and composites with weak interlaminar bond strengths, regardless of placement conditions. Ribbon made under conditions expected to be ideal showed good processability and produced well-consolidated laminates. Results were consistent from machine to machine and demonstrated the importance of ribbon quality in heated-head placement of dry material forms. Preliminary screening criteria for the development and evaluation of ribbon from new matrix materials were validated.
The Early Psychosis Screener (EPS): Quantitative validation against the SIPS using machine learning.
Brodey, B B; Girgis, R R; Favorov, O V; Addington, J; Perkins, D O; Bearden, C E; Woods, S W; Walker, E F; Cornblatt, B A; Brucato, G; Walsh, B; Elkin, K A; Brodey, I S
2018-01-18
Machine learning techniques were used to identify highly informative early psychosis self-report items and to validate an early psychosis screener (EPS) against the Structured Interview for Psychosis-risk Syndromes (SIPS). The Prodromal Questionnaire-Brief Version (PQ-B) and 148 additional items were administered to 229 individuals being screened with the SIPS at 7 North American Prodrome Longitudinal Study sites and at Columbia University. Fifty individuals were found to have SIPS scores of 0, 1, or 2, making them clinically low risk (CLR) controls; 144 were classified as clinically high risk (CHR) (SIPS 3-5) and 35 were found to have first episode psychosis (FEP) (SIPS 6). Spectral clustering analysis, performed on 124 of the items, yielded two cohesive item groups, the first mostly related to psychosis and mania, the second mostly related to depression, anxiety, and social and general work/school functioning. Items within each group were sorted according to their usefulness in distinguishing between CLR and CHR individuals using the Minimum Redundancy Maximum Relevance procedure. A receiver operating characteristic area under the curve (AUC) analysis indicated that maximal differentiation of CLR and CHR participants was achieved with a 26-item solution (AUC=0.899±0.001). The EPS-26 outperformed the PQ-B (AUC=0.834±0.001). For screening purposes, the self-report EPS-26 appeared to differentiate individuals who are either CLR or CHR approximately as well as the clinician-administered SIPS. The EPS-26 may prove useful as a self-report screener and may lead to a decrease in the duration of untreated psychosis. A validation of the EPS-26 against actual conversion is underway. Copyright © 2017 Elsevier B.V. All rights reserved.
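The study's Minimum Redundancy Maximum Relevance step can be illustrated with a small greedy selector: relevance as mutual information between item and label, redundancy as mean correlation with already-chosen items. The data, scoring details, and counts below are stand-ins matching the abstract's dimensions, not the EPS analysis itself.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def mrmr_select(X, y, k):
    """Greedy minimum-Redundancy-Maximum-Relevance item selection."""
    relevance = mutual_info_classif(X, y, discrete_features=True,
                                    random_state=0)
    corr = np.abs(np.corrcoef(X.T))              # item-item correlation
    chosen = [int(np.argmax(relevance))]
    while len(chosen) < k:
        redundancy = corr[:, chosen].mean(axis=1)
        score = relevance - redundancy           # relevance minus redundancy
        score[chosen] = -np.inf                  # never re-pick chosen items
        chosen.append(int(np.argmax(score)))
    return chosen

rng = np.random.default_rng(6)
items = rng.integers(0, 5, size=(229, 124))      # 229 respondents x 124 items
status = rng.integers(0, 2, size=229)            # CLR vs CHR (stand-in labels)
print(mrmr_select(items, status, 10))
```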
Machine learning techniques for energy optimization in mobile embedded systems
NASA Astrophysics Data System (ADS)
Donohoo, Brad Kyoshi
Mobile smartphones and other portable battery operated embedded systems (PDAs, tablets) are pervasive computing devices that have emerged in recent years as essential instruments for communication, business, and social interactions. While performance, capabilities, and design are all important considerations when purchasing a mobile device, a long battery lifetime is one of the most desirable attributes. Battery technology and capacity has improved over the years, but it still cannot keep pace with the power consumption demands of today's mobile devices. This key limiter has led to a strong research emphasis on extending battery lifetime by minimizing energy consumption, primarily using software optimizations. This thesis presents two strategies that attempt to optimize mobile device energy consumption with negligible impact on user perception and quality of service (QoS). The first strategy proposes an application and user interaction aware middleware framework that takes advantage of user idle time between interaction events of the foreground application to optimize CPU and screen backlight energy consumption. The framework dynamically classifies mobile device applications based on their received interaction patterns, then invokes a number of different power management algorithms to adjust processor frequency and screen backlight levels accordingly. The second strategy proposes the usage of machine learning techniques to learn a user's mobile device usage pattern pertaining to spatiotemporal and device contexts, and then predict energy-optimal data and location interface configurations. By learning where and when a mobile device user uses certain power-hungry interfaces (3G, WiFi, and GPS), the techniques, which include variants of linear discriminant analysis, linear logistic regression, non-linear logistic regression, and k-nearest neighbor, are able to dynamically turn off unnecessary interfaces at runtime in order to save energy.
NASA Astrophysics Data System (ADS)
Gaultois, Michael W.; Oliynyk, Anton O.; Mar, Arthur; Sparks, Taylor D.; Mulholland, Gregory J.; Meredig, Bryce
2016-05-01
The experimental search for new thermoelectric materials remains largely confined to a limited set of successful chemical and structural families, such as chalcogenides, skutterudites, and Zintl phases. In principle, computational tools such as density functional theory (DFT) offer the possibility of rationally guiding experimental synthesis efforts toward very different chemistries. However, in practice, predicting thermoelectric properties from first principles remains a challenging endeavor [J. Carrete et al., Phys. Rev. X 4, 011019 (2014)], and experimental researchers generally do not directly use computation to drive their own synthesis efforts. To bridge this practical gap between experimental needs and computational tools, we report an open machine learning-based recommendation engine (
Research and development of Camellia oleifera fruit sheller and sorting machine
NASA Astrophysics Data System (ADS)
Kang, Di; Wang, Yong; Fan, Youhua; Chen, Zejun
2018-01-01
The Camellia oleifera fruit sheller described in this paper was designed on the principle of kneading and extrusion. The machine adopts a rolling classification sieve to screen camellia oleifera fruits of different sizes into the husking device, where the fruits are shelled by the cooperative action of a transport belt and a flexible rubbing washboard. Testing showed that when the moisture content of the fruit was below 55%, the motor vibration frequency was 50 Hz, and the sorting belt was inclined 50-55 degrees from horizontal, the processing capacity exceeded 900 kg/h, the threshing ratio exceeded 97%, the seed breakage ratio was below 5%, and the loss ratio was below 1%. The machine is of great value in actual production and merits wide adoption.
EDM machinability of SiCw/Al composites
NASA Technical Reports Server (NTRS)
Ramulu, M.; Taya, M.
1989-01-01
Machinability of high temperature composites was investigated. Target materials, 15 and 25 vol pct SiC whisker-2124 aluminum composites, were machined by electrodischarge sinker machining and diamond saw. The machined surfaces of these metal matrix composites were examined by SEM and profilometry to determine the surface finish. Microhardness measurements were also performed on the as-machined composites.
Melo, Carlos Fernando Odir Rodrigues; Navarro, Luiz Claudio; de Oliveira, Diogo Noin; Guerreiro, Tatiane Melina; Lima, Estela de Oliveira; Delafiori, Jeany; Dabaja, Mohamed Ziad; Ribeiro, Marta da Silva; de Menezes, Maico; Rodrigues, Rafael Gustavo Martins; Morishita, Karen Noda; Esteves, Cibele Zanardi; de Amorim, Aline Lopes Lucas; Aoyagui, Caroline Tiemi; Parise, Pierina Lorencini; Milanez, Guilherme Paier; do Nascimento, Gabriela Mansano; Ribas Freitas, André Ricardo; Angerami, Rodrigo; Costa, Fábio Trindade Maranhão; Arns, Clarice Weis; Resende, Mariangela Ribeiro; Amaral, Eliana; Junior, Renato Passini; Ribeiro-do-Valle, Carolina C.; Milanez, Helaine; Moretti, Maria Luiza; Proenca-Modena, Jose Luiz; Avila, Sandra; Rocha, Anderson; Catharino, Rodrigo Ramos
2018-01-01
Recent Zika outbreaks in South America, accompanied by unexpectedly severe clinical complications, have brought much interest in fast and reliable screening methods for ZIKV (Zika virus) identification. Reverse-transcriptase polymerase chain reaction (RT-PCR) is currently the method of choice to detect ZIKV in biological samples. This approach, nonetheless, demands a considerable amount of time and resources, such as kits and reagents, which in endemic areas may place a substantial financial burden on affected individuals and health services, steering them away from RT-PCR analysis. This study presents a powerful combination of high-resolution mass spectrometry and a machine-learning prediction model for data analysis to assess the existence of ZIKV infection across a series of patients that bear similar symptomatic conditions, but are not necessarily infected with the disease. By using mass spectrometric data as input to the developed decision-making algorithm, we were able to provide a set of features that work as a “fingerprint” for this specific pathophysiological condition, even after the acute phase of infection. Since both mass spectrometry and machine learning are well-established and widely utilized tools within their respective fields, this combination of methods emerges as a distinct alternative for clinical applications, providing a diagnostic screening that is faster and more accurate, with improved cost-effectiveness when compared to existing technologies. PMID:29696139
Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C
2013-04-30
A mixed parallel scheme that combines message passing interface (MPI) and multithreading was implemented in the AutoDock Vina molecular docking program. The resulting program, named VinaLC, was tested on the petascale high performance computing (HPC) machines at Lawrence Livermore National Laboratory. To exploit the typical cluster-type supercomputers, thousands of docking calculations were dispatched by the master process to run simultaneously on thousands of slave processes, where each docking calculation takes one slave process on one node, and within the node each docking calculation runs via multithreading on multiple CPU cores and shared memory. Input and output of the program and the data handling within the program were carefully designed to deal with large databases and ultimately achieve HPC on a large number of CPU cores. Parallel performance analysis of the VinaLC program shows that the code scales up to more than 15K CPUs with a very low overhead cost of 3.94%. One million flexible compound docking calculations took only 1.4 h to finish on about 15K CPUs. The docking accuracy of VinaLC has been validated against the DUD data set by the re-docking of X-ray ligands and an enrichment study, 64.4% of the top scoring poses have RMSD values under 2.0 Å. The program has been demonstrated to have good enrichment performance on 70% of the targets in the DUD data set. An analysis of the enrichment factors calculated at various percentages of the screening database indicates VinaLC has very good early recovery of actives. Copyright © 2013 Wiley Periodicals, Inc.
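The master/slave dispatch pattern the abstract describes can be sketched compactly with mpi4py: one master rank hands docking tasks to worker ranks as each becomes free. The dock() stub and the task list are placeholders, not VinaLC code, and the per-node multithreading layer is omitted.

```python
# Master/worker task dispatch with mpi4py (run: mpiexec -n 8 python dispatch.py).
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
TASK, STOP = 0, 1  # message tags

def dock(ligand_id):
    """Placeholder for one docking calculation on this node."""
    return (ligand_id, -7.5)  # (ligand id, hypothetical score)

if rank == 0:  # master: hand out work as workers become free
    tasks = list(range(1000))  # assume many more tasks than workers
    status, results = MPI.Status(), []
    for worker in range(1, size):
        comm.send(tasks.pop(), dest=worker, tag=TASK)
    while tasks:
        results.append(comm.recv(source=MPI.ANY_SOURCE, status=status))
        comm.send(tasks.pop(), dest=status.Get_source(), tag=TASK)
    for worker in range(1, size):  # collect stragglers, then shut down
        results.append(comm.recv(source=MPI.ANY_SOURCE))
        comm.send(None, dest=worker, tag=STOP)
    print(f"collected {len(results)} docking results")
else:  # worker: dock until told to stop
    status = MPI.Status()
    while True:
        task = comm.recv(source=0, status=status)
        if status.Get_tag() == STOP:
            break
        comm.send(dock(task), dest=0)
```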
Tool simplifies machining of pipe ends for precision welding
NASA Technical Reports Server (NTRS)
Matus, S. T.
1969-01-01
Single tool prepares a pipe end for precision welding by simultaneously performing internal machining, end facing, and bevel cutting to specification standards. The machining operation requires only one milling adjustment, can be performed quickly, and produces the high quality pipe-end configurations required to ensure precision-welded joints.
Machine Shop. Performance Objectives. Basic Course.
ERIC Educational Resources Information Center
Hilton, Arthur; Lambert, George
Several intermediate performance objectives and corresponding criterion measures are listed for each of 13 terminal objectives for a high school basic machine shop course. The materials were developed for a 36-week course (2 hours daily) designed to enable students to become familiar with the operation of machine shop equipment, to become familiar…
Comparing deep learning models for population screening using chest radiography
NASA Astrophysics Data System (ADS)
Sivaramakrishnan, R.; Antani, Sameer; Candemir, Sema; Xue, Zhiyun; Abuya, Joseph; Kohli, Marc; Alderson, Philip; Thoma, George
2018-02-01
According to the World Health Organization (WHO), tuberculosis (TB) remains the most deadly infectious disease in the world. In the 2015 global annual TB report, 1.5 million TB-related deaths were reported. The conditions worsened in 2016, with 1.7 million reported deaths and more than 10 million people infected with the disease. Analysis of frontal chest X-rays (CXR) is one of the most popular methods for initial TB screening; however, the method is impacted by the lack of experts for screening chest radiographs. Computer-aided diagnosis (CADx) tools have gained significance because they reduce the human burden in screening and diagnosis, particularly in countries that lack substantial radiology services. State-of-the-art CADx software is typically based on machine learning (ML) approaches that use hand-engineered features, demanding expertise in analyzing the input variances and accounting for changes in size, background, angle, and position of the region of interest (ROI) on the underlying medical imagery. More automatic Deep Learning (DL) tools have demonstrated promising results in a wide range of ML applications. Convolutional Neural Networks (CNN), a class of DL models, have gained research prominence in image classification, detection, and localization tasks because they are highly scalable and deliver superior results with end-to-end feature extraction and classification. In this study, we evaluated the performance of CNN-based DL models for population screening using frontal CXRs. The results demonstrate that pre-trained CNNs are a promising feature extraction tool for medical imagery, including the automated diagnosis of TB from chest radiographs, but emphasize the importance of large data sets for the most accurate classification.
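The feature-extraction idea (a pre-trained CNN used as a fixed feature extractor for chest X-rays) can be sketched as follows. PyTorch and ResNet-50 are assumptions here, not necessarily the models evaluated in the study.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Load an ImageNet-pretrained CNN and drop its classification head,
# keeping the globally pooled convolutional features.
backbone = models.resnet50(pretrained=True)
extractor = nn.Sequential(*list(backbone.children())[:-1]).eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def cxr_features(path):
    """Return a 2048-D feature vector for one chest X-ray image."""
    img = Image.open(path).convert("RGB")  # replicate grayscale to 3 channels
    with torch.no_grad():
        feats = extractor(preprocess(img).unsqueeze(0))
    return feats.flatten().numpy()

# These vectors can then feed a conventional classifier (e.g. an SVM)
# trained to separate TB-positive from normal radiographs.
```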
Parameter optimization of electrochemical machining process using black hole algorithm
NASA Astrophysics Data System (ADS)
Singh, Dinesh; Shukla, Rajkamal
2017-12-01
Advanced machining processes are significant as higher accuracy in machined components is required in the manufacturing industries. Parameter optimization of machining processes gives optimum control to achieve the desired goals. In this paper, the electrochemical machining (ECM) process is considered and its performance is evaluated using the black hole algorithm (BHA). BHA is built on the fundamental idea of black hole theory and has few operating parameters to tune. Two performance parameters, material removal rate (MRR) and overcut (OC), are considered separately to obtain optimum machining parameter settings using BHA. The variations of the process parameters with respect to the performance parameters are reported for a better and more effective understanding of the process, using a single objective at a time. The results obtained using BHA are found to be better than those of other metaheuristic algorithms, such as the genetic algorithm (GA), artificial bee colony (ABC) and biogeography-based optimization (BBO), attempted by previous researchers.
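A compact sketch of the core BHA update rules: stars drift toward the best solution (the "black hole"), and any star crossing the event horizon is replaced by a fresh random star. The test objective is a stand-in; in the paper's setting, f would be a response model mapping ECM parameters to MRR or overcut.

```python
import numpy as np

def black_hole_optimize(f, bounds, n_stars=30, iters=200, seed=0):
    """Minimize f over a box using the core Black Hole Algorithm rules."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    stars = rng.uniform(lo, hi, size=(n_stars, len(lo)))
    for _ in range(iters):
        fitness = np.array([f(s) for s in stars])
        bh_idx = int(np.argmin(fitness))
        bh, f_bh = stars[bh_idx].copy(), fitness[bh_idx]
        # move all stars toward the black hole
        stars += rng.uniform(size=stars.shape) * (bh - stars)
        # event-horizon radius: BH fitness relative to total fitness
        radius = abs(f_bh) / (np.abs(fitness).sum() + 1e-12)
        swallowed = np.linalg.norm(stars - bh, axis=1) < radius
        swallowed[bh_idx] = False  # the black hole itself survives
        stars[swallowed] = rng.uniform(lo, hi,
                                       size=(int(swallowed.sum()), len(lo)))
    return bh, f_bh

# Stand-in objective; in the ECM setting f would map (voltage, feed rate,
# electrolyte concentration) to a cost such as -MRR or overcut.
f = lambda x: float(np.sum((x - 3.0) ** 2))
best, val = black_hole_optimize(f, bounds=[(0, 10)] * 3)
print(best.round(3), round(val, 6))
```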
Crowe, Simon F; Mahony, Kate; Jackson, Martin
2004-08-01
The purpose of the current study was to explore whether performance on standardised neuropsychological measures could predict functional ability with automated machines and services among people with an acquired brain injury (ABI). Participants were 45 individuals who met the criteria for mild, moderate or severe ABI and 15 control participants matched on demographic variables including age and education. Each participant was required to complete a battery of neuropsychological tests, as well as to perform three automated service delivery tasks: a transport automated ticketing machine, an automated teller machine (ATM) and an automated telephone service. The results showed a consistently strong relationship between the neuropsychological measures, both as single predictors and in combination, and the level of competency with the automated machines. Automated machines are part of a relatively new phenomenon in service delivery and offer an ecologically valid functional measure of performance that represents a true indication of functional disability.
Performance Analysis of Abrasive Waterjet Machining Process at Low Pressure
NASA Astrophysics Data System (ADS)
Murugan, M.; Gebremariam, MA; Hamedon, Z.; Azhari, A.
2018-03-01
Normally, a commercial waterjet cutting machine can generate water pressure of up to 600 MPa. This range of pressure is used to machine a wide variety of materials; hence, waterjet cutting machines are expensive. Therefore, there is a need to develop a low-cost waterjet machine in order to make the technology more accessible to the masses. Due to its low cost, such a machine may only be able to generate water pressure at a much reduced rate. The present study investigates the performance of the abrasive waterjet machining process at low cutting pressure using a self-developed low-cost waterjet machine. It aims to study the feasibility of machining various materials at low pressure, which can later aid in the further development of an effective low-cost waterjet machine. A total of three different materials were machined at a low pressure of 34 MPa: mild steel, aluminium alloy 6061 and the plastic Delrin®. The traverse rate was varied from 1 to 3 mm/min. The cutting performance at low pressure for the different materials was evaluated in terms of depth of penetration, kerf taper ratio and surface roughness. It was found that all samples could be machined at low cutting pressure with varied qualities. The depth of penetration decreases with an increase in the traverse rate, while the surface roughness and kerf taper ratio increase with an increase in the traverse rate. It can be concluded that a low-cost waterjet machine with a much reduced water pressure can be successfully used for machining certain materials with acceptable qualities.
Bioactivity profiling using high-throughput in vitro assays can reduce the cost and time required for toxicological screening of environmental chemicals and can also reduce the need for animal testing. Several public efforts are aimed at discovering patterns or classifiers in hig...
ERIC Educational Resources Information Center
South Carolina State Dept. of Education, Columbia. Office of Vocational Education.
This module on the knife machine, one in a series dealing with industrial sewing machines, their attachments, and operation, covers one topic: performing special operations on the knife machine (a single needle or multi-needle machine which sews and cuts at the same time). These components are provided: an introduction, directions, an objective,…
NASA Astrophysics Data System (ADS)
Zargari Khuzani, Abolfazl; Danala, Gopichandh; Heidari, Morteza; Du, Yue; Mashhadi, Najmeh; Qiu, Yuchen; Zheng, Bin
2018-02-01
Higher recall rates are a major challenge in mammography screening. Thus, developing a computer-aided diagnosis (CAD) scheme to classify between malignant and benign breast lesions can play an important role in improving the efficacy of mammography screening. The objective of this study is to develop and test a unique image feature fusion framework to improve performance in classifying suspicious mass-like breast lesions depicted on mammograms. The image dataset consists of 302 suspicious masses detected on both craniocaudal and mediolateral-oblique view images; 151 were malignant and 151 were benign. The study consists of the following three image processing and feature analysis steps. First, an adaptive region growing segmentation algorithm was used to automatically segment mass regions. Second, a set of 70 image features related to spatial and frequency characteristics of mass regions was initially computed. Third, a generalized linear regression model (GLM) based machine learning classifier, combined with a bat optimization algorithm, was used to optimally fuse the selected image features based on a predefined assessment performance index. The area under the ROC curve (AUC) was used as the performance assessment index. Applying the CAD scheme to the testing dataset, the AUC was 0.75+/-0.04, which was significantly higher than using a single best feature (AUC=0.69+/-0.05) or the classifier with equally weighted features (AUC=0.73+/-0.05). This study demonstrated that, compared to the conventional equal-weighted approach, an unequal-weighted feature fusion approach has the potential to significantly improve accuracy in classifying between malignant and benign breast masses.
A double-sided linear primary permanent magnet vernier machine.
Du, Yi; Zou, Chunhua; Liu, Xianxing
2015-01-01
The purpose of this paper is to present a new double-sided linear primary permanent magnet (PM) vernier (DSLPPMV) machine, which can offer high thrust force, low detent force, and improved power factor. Both PMs and windings of the proposed machine are on the short translator, while the long stator is designed as a double-sided simple iron core with salient teeth so that it is very robust to transmit high thrust force. The key of this new machine is the introduction of double stator and the elimination of translator yoke, so that the inductance and the volume of the machine can be reduced. Hence, the proposed machine offers improved power factor and thrust force density. The electromagnetic performances of the proposed machine are analyzed including flux, no-load EMF, thrust force density, and inductance. Based on using the finite element analysis, the characteristics and performances of the proposed machine are assessed.
ERIC Educational Resources Information Center
Stadt, Ronald; And Others
This catalog provides performance objectives, tasks, standards, and performance guides associated with current occupational information relating to the job content of machinists, specifically tool grinder operators, production lathe operators, and production screw machine operators. The catalog is comprised of 262 performance objectives, tool and…
Flexible Conformable Clamps for a Machining Cell with Applications to Turbine Blade Machining.
1983-05-01
Interim report by Eiki Kurokawa, The Robotics Institute, Carnegie-Mellon University, Pittsburgh, PA 15213, May 1983.
Hydraulic Fatigue-Testing Machine
NASA Technical Reports Server (NTRS)
Hodo, James D.; Moore, Dennis R.; Morris, Thomas F.; Tiller, Newton G.
1987-01-01
A fatigue-testing machine applies fluctuating tension to a number of specimens at the same time. When a sample breaks, the machine continues to test the remaining specimens. The series of tensile tests needed to determine the fatigue properties of materials is performed more rapidly than in a conventional fatigue-testing machine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mou, J.I.; King, C.
The focus of this study is to develop a sensor fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. Deterministic modeling technique was used to derive models for machine performance assessment and enhancement. Sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.
Solving the Cauchy-Riemann equations on parallel computers
NASA Technical Reports Server (NTRS)
Fatoohi, Raad A.; Grosch, Chester E.
1987-01-01
Discussed is the implementation of a single algorithm on three parallel-vector computers. The algorithm is a relaxation scheme for the solution of the Cauchy-Riemann equations, a set of coupled first-order partial differential equations. The computers were chosen so as to encompass a variety of architectures. They are: the MPP, an SIMD machine with 16K bit-serial processors; the FLEX/32, an MIMD machine with 20 processors; and the CRAY/2, an MIMD machine with four vector processors. The machine architectures are briefly described. The implementation of the algorithm is discussed in relation to these architectures, and measures of the performance on each machine are given. Simple performance models are used to describe the performance. These models highlight the bottlenecks and limiting factors for this algorithm on these architectures. Conclusions are presented.
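For reference, the Cauchy-Riemann equations being relaxed are, in their homogeneous form (the paper's version may carry source terms), the coupled pair

```latex
\frac{\partial u}{\partial x} = \frac{\partial v}{\partial y},
\qquad
\frac{\partial u}{\partial y} = -\frac{\partial v}{\partial x}
```

relating the two unknown fields u and v.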
'Controversy'. Propaganda versus evidence based health promotion: the case of breast screening.
Hann, A
1999-01-01
Breast cancer is a serious problem in the developed world, and the common perception of the risks of developing the disease is communicated to the public via a variety of means. These include leaflets in doctors' surgeries, health promotion campaigns and invitations from well-woman clinics to attend for various forms of screening. The national breast cancer screening programme in the UK has a very high compliance rate (which is vital) and a well-oiled media machine. This article examines the way in which the risks of developing breast cancer are communicated to women of all ages in the UK, and speculates as to the reasons behind the misleading manner in which health promoters offer this information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Ching-Fong; Pokharel, Reeju; Brand, Michael J.
Here, we developed a copper/tungsten (Cu/W) composite for mesoscale Materials Science applications using the novel High-Energy Diffraction Microscopy (HEDM) technique. Argon-atomized copper powder was selected as the starting raw powder and screened to remove the extremely large particle fraction. Tungsten particles were collected by milling and screening the -325 mesh tungsten powder between 500 and 635 mesh sieves. Hot pressing of screened Cu powder was performed at 900 °C in Ar/4 %H 2 atmosphere. XRD and ICP results show that the hot-pressed Cu sample consists of about 5 vol% Cu 2O, which is caused by the presence of oxygen on the surface of the starting Cu powder. Hot pressing the copper powder in a pure hydrogen atmosphere was successful in removing most of the surface oxygen. Our process was also implemented for hot pressing the Cu/W composite. The density of the Cu/W composites hot pressed at 950 °C in pure hydrogen was about 94 % of the theoretical density (TD). The hot-pressed Cu/W composites were further hot isostatic pressed at 1050 °C in argon atmosphere, which results in 99.6 % of the TD with the designed Cu grain size and W particle distribution. Tensile specimens with D-notch were machined using the wire EDM method. Furthermore, the processing and consolidation of these materials will be discussed in detail. The HEDM images are also shown and discussed.
An EEG-based functional connectivity measure for automatic detection of alcohol use disorder.
Mumtaz, Wajid; Saad, Mohamad Naufal B Mohamad; Kamel, Nidal; Ali, Syed Saad Azhar; Malik, Aamir Saeed
2018-01-01
Abnormal alcohol consumption can cause toxicity and can alter the human brain's structure and function, a condition termed alcohol use disorder (AUD). Unfortunately, the conventional screening methods for AUD patients are subjective and manual. Hence, objective methods are needed to perform automatic screening of AUD patients. Electroencephalographic (EEG) data have been utilized to study the differences in brain signals between alcoholics and healthy controls, which could be further developed into an automatic screening tool for alcoholics. In this work, resting-state EEG-derived features were utilized as input data to the proposed feature selection and classification method. The aim was to perform automatic classification of AUD patients and healthy controls. The validation of the proposed method involved real EEG data acquired from 30 AUD patients and 30 age-matched healthy controls. The resting-state EEG-derived features, such as synchronization likelihood (SL), were computed across 19 scalp locations, resulting in 513 features. Furthermore, the features were rank-ordered to select the most discriminant ones, using a rank-based feature selection method with the receiver operating characteristic (ROC) as criterion. Consequently, a reduced set of the most discriminant features was identified and used for classification of AUD patients and healthy controls. In this study, three different classification models were used: Support Vector Machine (SVM), Naïve Bayesian (NB), and Logistic Regression (LR). The study yielded SVM classification accuracy=98%, sensitivity=99.9%, specificity=95%, and f-measure=0.97; LR classification accuracy=91.7%, sensitivity=86.66%, specificity=96.6%, and f-measure=0.90; NB classification accuracy=93.6%, sensitivity=100%, specificity=87.9%, and f-measure=0.95. The SL features could be utilized as objective markers to screen AUD patients and healthy controls. Copyright © 2017 Elsevier B.V. All rights reserved.
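A schematic version of the described pipeline (per-feature ROC ranking followed by SVM classification), with synthetic stand-ins for the synchronization-likelihood features. For brevity, selection here happens outside the cross-validation loop, which a rigorous evaluation would avoid.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical stand-in for 60 subjects x 513 synchronization-likelihood
# features (30 AUD patients labeled 1, 30 healthy controls labeled 0).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 513))
y = np.repeat([1, 0], 30)

# Rank-order features by individual discriminative power (ROC AUC),
# mirroring the rank-based selection criterion described above.
aucs = np.array([roc_auc_score(y, X[:, j]) for j in range(X.shape[1])])
top = np.argsort(np.abs(aucs - 0.5))[::-1][:20]  # 20 most discriminant

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
acc = cross_val_score(clf, X[:, top], y, cv=10).mean()
print(f"10-fold accuracy on reduced feature set: {acc:.2f}")
```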
Tóth, László; Hoffmann, Ildikó; Gosztolya, Gábor; Vincze, Veronika; Szatlóczki, Gréta; Bánréti, Zoltán; Pákáski, Magdolna; Kálmán, János
2018-01-01
Background: Even today the reliable diagnosis of the prodromal stages of Alzheimer’s disease (AD) remains a great challenge. Our research focuses on the earliest detectable indicators of cognitive decline in mild cognitive impairment (MCI). Since the presence of language impairment has been reported even in the mild stage of AD, the aim of this study is to develop a sensitive neuropsychological screening method based on the analysis of spontaneous speech production during the performance of a memory task. In the future, this could form the basis of an Internet-based interactive screening software for the recognition of MCI. Methods: Participants were 38 healthy controls and 48 clinically diagnosed MCI patients. Spontaneous speech was provoked by asking the patients to recall the content of 2 short black-and-white films (one direct, one delayed), and by answering one question. Acoustic parameters (hesitation ratio, speech tempo, length and number of silent and filled pauses, length of utterance) were extracted from the recorded speech signals, first manually (using the Praat software), and then automatically, with an automatic speech recognition (ASR) based tool. First, the extracted parameters were statistically analyzed. Then we applied machine learning algorithms to see whether the MCI and the control group can be discriminated automatically based on the acoustic features. Results: The statistical analysis showed significant differences for most of the acoustic parameters (speech tempo, articulation rate, silent pause, hesitation ratio, length of utterance, pause-per-utterance ratio). The most significant differences between the two groups were found in the speech tempo in the delayed recall task, and in the number of pauses in the question-answering task. The fully automated version of the analysis process, that is, using the ASR-based features in combination with machine learning, was able to separate the two classes with an F1-score of 78.8%. Conclusion: The temporal analysis of spontaneous speech can be exploited in implementing a new, automatic detection-based tool for screening MCI in the community. PMID:29165085
How Not To Drown in Data: A Guide for Biomaterial Engineers.
Vasilevich, Aliaksei S; Carlier, Aurélie; de Boer, Jan; Singh, Shantanu
2017-08-01
High-throughput assays that produce hundreds of measurements per sample are powerful tools for quantifying cell-material interactions. With advances in automation and miniaturization in material fabrication, hundreds of biomaterial samples can be rapidly produced, which can then be characterized using these assays. However, the resulting deluge of data can be overwhelming. To the rescue are computational methods that are well suited to these problems. Machine learning techniques provide a vast array of tools to make predictions about cell-material interactions and to find patterns in cellular responses. Computational simulations allow researchers to pose and test hypotheses and perform experiments in silico. This review describes approaches from these two domains that can be brought to bear on the problem of analyzing biomaterial screening data. Copyright © 2017 Elsevier Ltd. All rights reserved.
Grain Boundary Engineering the Mechanical Properties of Allvac 718Plus(Trademark) Superalloy
NASA Technical Reports Server (NTRS)
Gabb, Timothy P.; Telesman, Jack; Garg, Anita; Lin, Peter; Provenzano, Virgil; Heard, Robert; Miller, Herbert M.
2010-01-01
Grain Boundary Engineering can enhance the population of structurally ordered "low-Σ" Coincidence Site Lattice (CSL) grain boundaries in the microstructure. In some alloys, these "special" grain boundaries have been reported to improve overall resistance to corrosion, oxidation, and creep. Such improvements could be quite beneficial for superalloys, especially in conditions which encourage damage and cracking at grain boundaries. Therefore, the effects of GBE processing on high-temperature mechanical properties of the cast and wrought superalloy Allvac 718Plus (Allvac ATI) were screened. Bar sections were subjected to varied GBE processing, and then consistently heat treated, machined, and tested at 650 °C. Creep, tensile stress relaxation, and dwell fatigue crack growth tests were performed. The influences of GBE processing on microstructure, mechanical properties, and associated failure modes are discussed.
Blood detection in wireless capsule endoscope images based on salient superpixels.
Iakovidis, Dimitris K; Chatzis, Dimitris; Chrysanthopoulos, Panos; Koulaouzidis, Anastasios
2015-08-01
Wireless capsule endoscopy (WCE) enables screening of the gastrointestinal (GI) tract with a miniature, optical endoscope packed within a small swallowable capsule, wirelessly transmitting color images. In this paper we propose a novel method for automatic blood detection in contemporary WCE images. Blood is an alarming indication for the presence of pathologies requiring further treatment. The proposed method is based on a new definition of superpixel saliency. The saliency of superpixels is assessed upon their color, enabling the identification of image regions that are likely to contain blood. The blood patterns are recognized by their color features using a supervised learning machine. Experiments performed on a public dataset using automatically selected first-order statistical features from various color components indicate that the proposed method outperforms state-of-the-art methods.
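A rough sketch of the superpixel-saliency idea, using SLIC superpixels and a color-distance saliency proxy. The red-shift test at the end is a crude stand-in for the paper's supervised color classifier, and the sample image is arbitrary.

```python
import numpy as np
from skimage import color, data
from skimage.segmentation import slic

# Any RGB endoscopy frame would do; a sample image stands in here.
img = data.astronaut()
labels = slic(img, n_segments=300, compactness=10)

# Mean CIELAB color per superpixel; saliency = distance from the
# frame's global mean color (a simple color-based saliency proxy).
lab = color.rgb2lab(img)
ids = np.unique(labels)
means = np.array([lab[labels == i].mean(axis=0) for i in ids])
saliency = np.linalg.norm(means - lab.reshape(-1, 3).mean(axis=0), axis=1)

# Flag superpixels that are both salient and red-shifted (high a*),
# standing in for the supervised blood-color classifier.
candidates = ids[(saliency > np.percentile(saliency, 90)) & (means[:, 1] > 20)]
print(f"{len(candidates)} candidate blood superpixels")
```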
A consideration of the operation of automatic production machines.
Hoshi, Toshiro; Sugimoto, Noboru
2015-01-01
At worksites, various automatic production machines are in use to release workers from muscular labor or labor in detrimental environments. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention, and points out two types of machine operation: operation for which quick performance is required (operation that is not permitted to be delayed), and operation for which composed performance is required (operation that is not permitted to be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics are evaluated as "asymmetric on the time-axis". Here, in order for workers to accept the risk of automatic production machines, it is generally a precondition that harm should be sufficiently small or that avoidance of harm should be easy. In this connection, this paper shows the possibility of facilitating the acceptance of the risk of automatic production machines by enhancing the asymmetry on the time-axis.
Communication Studies of DMP and SMP Machines
NASA Technical Reports Server (NTRS)
Sohn, Andrew; Biswas, Rupak; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
Understanding the interplay between machines and problems is key to obtaining high performance on parallel machines. This paper investigates the interplay between programming paradigms and communication capabilities of parallel machines. In particular, we explicate the communication capabilities of the IBM SP-2 distributed-memory multiprocessor and the SGI PowerCHALLENGEarray symmetric multiprocessor. Two benchmark problems, bitonic sorting and the Fast Fourier Transform, are selected for experiments. Communication-efficient algorithms are developed to exploit the overlapping capabilities of the machines. Programs are written in Message-Passing Interface for portability, and identical codes are used for both machines. Various data sizes and message sizes are used to test the machines' communication capabilities. Experimental results indicate that the communication performance of the multiprocessors is consistent with the size of messages. The SP-2 is sensitive to message size but yields much higher communication overlap because of the communication co-processor. The PowerCHALLENGEarray is not highly sensitive to message size and yields little communication overlap. Bitonic sorting yields lower performance compared to FFT due to a smaller computation-to-communication ratio.
Graph Kernels for Molecular Similarity.
Rupp, Matthias; Schneider, Gisbert
2010-04-12
Molecular similarity measures are important for many cheminformatics applications like ligand-based virtual screening and quantitative structure-property relationships. Graph kernels are formal similarity measures defined directly on graphs, such as the (annotated) molecular structure graph. Graph kernels are positive semi-definite functions, i.e., they correspond to inner products. This property makes them suitable for use with kernel-based machine learning algorithms such as support vector machines and Gaussian processes. We review the major types of kernels between graphs (based on random walks, subgraphs, and optimal assignments, respectively), and discuss their advantages, limitations, and successful applications in cheminformatics. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
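Of the kernel families reviewed, the random-walk kernel is the easiest to sketch: common walks are counted on the direct (tensor) product graph. The geometric variant below handles small unlabeled graphs; real cheminformatics use adds vertex/edge labels and sparse solvers, so treat this as an illustrative toy.

```python
import numpy as np

def random_walk_kernel(A1, A2, lam=0.05):
    """Geometric random-walk kernel for unlabeled graphs:
    k(G1, G2) = 1^T (I - lam * A_x)^{-1} 1, where A_x is the adjacency
    matrix of the direct product graph (converges for small lam)."""
    Ax = np.kron(A1, A2)
    n = Ax.shape[0]
    return float(np.ones(n) @ np.linalg.solve(np.eye(n) - lam * Ax, np.ones(n)))

# Two toy molecular skeletons: a triangle (3-ring) and a 4-cycle.
A_tri = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
A_sq = np.array([[0, 1, 0, 1], [1, 0, 1, 0],
                 [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)

print(random_walk_kernel(A_tri, A_tri))  # self-similarity
print(random_walk_kernel(A_tri, A_sq))   # cross-similarity
```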
Implementation and performance of parallel Prolog interpreter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, S.; Kale, L.V.; Balkrishna, R.
1988-01-01
In this paper, the authors discuss the implementation of a parallel Prolog interpreter on different parallel machines. The implementation is based on the REDUCE-OR process model, which exploits both AND and OR parallelism in logic programs. It is machine independent, as it runs on top of the chare kernel, a machine-independent parallel programming system. The authors also give the performance of the interpreter running a diverse set of benchmark programs on parallel machines, including shared-memory systems (an Alliant FX/8, a Sequent, and a MultiMax) and a non-shared-memory system (an Intel iPSC/32 hypercube), in addition to its performance on a multiprocessor simulation system.
Balaban, M O; Aparicio, J; Zotarelli, M; Sims, C
2008-11-01
The average colors of mangos and apples were measured using machine vision. A method to quantify the perception of nonhomogeneous colors by sensory panelists was developed. Untrained panelists selected three colors out of several reference colors, along with their perceived percentage of the total sample area. Differences between the average colors perceived by panelists and those from the machine vision were reported as DeltaE values (color difference error). The effects on DeltaE of color nonhomogeneity, and of using real samples versus their images in the sensory panels, were evaluated. In general, samples with more nonuniform colors had higher DeltaE values, suggesting that panelists had more difficulty in evaluating more nonhomogeneous colors. There was no significant difference in DeltaE values between the real fruits and their screen images; therefore, images can be used to evaluate color instead of the real samples.
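The reported DeltaE values can be computed as the Euclidean distance between two average colors in CIELAB space. The sketch below assumes the CIE76 formula, which the abstract does not specify, and the RGB values are hypothetical.

```python
import numpy as np
from skimage import color

def delta_e_cie76(rgb1, rgb2):
    """CIE76 color difference between two average RGB colors (0-255):
    Euclidean distance in CIELAB space."""
    lab1 = color.rgb2lab(np.array([[rgb1]], dtype=np.uint8))
    lab2 = color.rgb2lab(np.array([[rgb2]], dtype=np.uint8))
    return float(np.linalg.norm(lab1 - lab2))

# e.g. machine-vision average vs. panel-perceived average of a mango
machine_avg = (205, 160, 60)
panel_avg = (198, 170, 72)
print(f"DeltaE = {delta_e_cie76(machine_avg, panel_avg):.1f}")
```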
Implementing finite state machines in a computer-based teaching system
NASA Astrophysics Data System (ADS)
Hacker, Charles H.; Sitte, Renate
1999-09-01
Finite State Machines (FSM) are models for functions commonly implemented in digital circuits such as timers, remote controls, and vending machines. Teaching FSM is core in the curriculum of many university digital electronics or discrete mathematics subjects. Students often have difficulties grasping the theoretical concepts in the design and analysis of FSM. This prompted the author to develop MS-Windows™-compatible software, WinState, that provides a tutorial-style teaching aid for understanding the mechanisms of FSM. The animated computer screen is ideal for visually conveying the required design and analysis procedures. WinState complements other software for combinatorial logic previously developed by the author, and enhances the existing teaching package by adding sequential logic circuits. WinState enables the construction of a student's own FSM, which can be simulated to test the design for functionality and possible errors.
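As a minimal illustration of the kind of machine such a tutorial tool lets students build and simulate, here is a table-driven FSM for a toy coin-operated turnstile (the example itself is hypothetical, not taken from WinState).

```python
# Table-driven finite state machine: a coin-operated turnstile.
# Moore outputs attach to states; transitions attach to (state, input) pairs.
TRANSITIONS = {
    ("LOCKED", "coin"): "UNLOCKED",
    ("LOCKED", "push"): "LOCKED",
    ("UNLOCKED", "push"): "LOCKED",
    ("UNLOCKED", "coin"): "UNLOCKED",
}
OUTPUT = {"LOCKED": "barred", "UNLOCKED": "free"}

def run_fsm(inputs, state="LOCKED"):
    """Simulate the machine on an input sequence, tracing each step."""
    for symbol in inputs:
        state = TRANSITIONS[(state, symbol)]
        print(f"{symbol:>4} -> {state} ({OUTPUT[state]})")
    return state

run_fsm(["push", "coin", "push", "push"])
```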
Sharma, Amit Kumar; Gangwar, Mayank; Kumar, Dharmendra; Nath, Gopal; Kumar Sinha, Akhoury Sudhir; Tripathi, Yamini Bhushan
2016-01-01
Objective: This study aims to evaluate the antimicrobial activity, and to perform phytochemical studies and thin-layer chromatography analysis, of machine oil, hexane extract of seed oil, and methanol extract of presscake & latex of Jatropha curcas Linn (family Euphorbiaceae). Materials and Methods: J. curcas extracts were subjected to preliminary qualitative phytochemical screening to detect the major phytochemicals, followed by assessment of their reducing power and the phenol and flavonoid content of the different fractions. Thin-layer chromatography was also performed using different solvent systems for the analysis of the number of constituents in the plant extracts. Antimicrobial activity was evaluated by the disc diffusion method, while the minimum inhibitory concentration, minimum bactericidal concentration and minimum fungicidal concentration were calculated by the micro dilution method. Results: The methanolic fractions of latex and cake exhibited marked antifungal and antibacterial activities against Gram-positive and Gram-negative bacteria. Phytochemical analysis revealed the presence of alkaloids, saponins, tannins, terpenoids, steroids, glycosides, phenols and flavonoids. Reducing power showed a dose-dependent increase compared to the quercetin standard. Furthermore, this study recommends the isolation and separation of the bioactive compounds responsible for the antibacterial activity, which could be accomplished using chromatographic methods such as high-performance liquid chromatography (HPLC) and GC-MS. Conclusion: The results of the above study suggest that all parts of the plant possess potent antibacterial activity. Hence, it is important to isolate the active principles for further testing of antimicrobial and other biological efficacy. PMID:27516977
Souillard-Mandar, William; Davis, Randall; Rudin, Cynthia; Au, Rhoda; Libon, David J; Swenson, Rodney; Price, Catherine C; Lamar, Melissa; Penney, Dana L
2016-03-01
The Clock Drawing Test - a simple pencil and paper test - has been used for more than 50 years as a screening tool to differentiate normal individuals from those with cognitive impairment, and has proven useful in helping to diagnose cognitive dysfunction associated with neurological disorders such as Alzheimer's disease, Parkinson's disease, and other dementias and conditions. We have been administering the test using a digitizing ballpoint pen that reports its position with considerable spatial and temporal precision, making available far more detailed data about the subject's performance. Using pen stroke data from these drawings categorized by our software, we designed and computed a large collection of features, then explored the tradeoffs in performance and interpretability in classifiers built using a number of different subsets of these features and a variety of different machine learning techniques. We used traditional machine learning methods to build prediction models that achieve high accuracy. We operationalized widely used manual scoring systems so that we could use them as benchmarks for our models. We worked with clinicians to define guidelines for model interpretability, and constructed sparse linear models and rule lists designed to be as easy to use as scoring systems currently used by clinicians, but more accurate. While our models will require additional testing for validation, they offer the possibility of substantial improvement in detecting cognitive impairment earlier than currently possible, a development with considerable potential impact in practice.
Radiomic modeling of BI-RADS density categories
NASA Astrophysics Data System (ADS)
Wei, Jun; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Zhou, Chuan; Hadjiiski, Lubomir
2017-03-01
Screening mammography is the most effective and low-cost method to date for early cancer detection. Mammographic breast density has been shown to be highly correlated with breast cancer risk. We are developing a radiomic model for BI-RADS density categorization on full-field digital mammography (FFDM) with a supervised machine learning approach. With IRB approval, we retrospectively collected 478 FFDMs from 478 women. As a gold standard, breast density was assessed by an MQSA radiologist based on BI-RADS categories. The raw FFDMs were used for computerized density assessment. The raw FFDM first underwent a log-transform to approximate the x-ray sensitometric response, followed by multiscale processing to enhance the fibroglandular densities and parenchymal patterns. Three ROIs were automatically identified based on the keypoint distribution, where the keypoints were obtained as the extrema in the image Gaussian scale-space. A total of 73 features, including intensity and texture features that describe the density and the parenchymal pattern, were extracted from each breast. Our BI-RADS density estimator was constructed using a random forest classifier. We used a 10-fold cross-validation resampling approach to estimate the errors. With the random forest classifier, the computerized density categories for 412 of the 478 cases agreed with the radiologist's assessment (weighted kappa = 0.93). The machine learning method with radiomic features as predictors demonstrated high accuracy in classifying FFDMs into BI-RADS density categories. Further work is underway to improve our system's performance and to perform independent testing on a large unseen FFDM set.
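A schematic version of the evaluation described here: a random forest over radiomic features, 10-fold cross-validation, and agreement with the radiologist measured by weighted kappa. The data are synthetic, and the quadratic weighting is an assumption (the abstract does not state its weighting scheme).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import cross_val_predict

# Hypothetical stand-in for 478 breasts x 73 radiomic features, with
# radiologist BI-RADS density categories 1-4 as the reference labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(478, 73))
y = rng.integers(1, 5, size=478)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
pred = cross_val_predict(rf, X, y, cv=10)  # 10-fold CV, as in the study

# Quadratic weighting penalizes off-by-two category errors more than
# off-by-one, which suits ordered BI-RADS categories.
kappa = cohen_kappa_score(y, pred, weights="quadratic")
print(f"agreement: {np.mean(pred == y):.2%}, weighted kappa = {kappa:.2f}")
```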
NASA Astrophysics Data System (ADS)
Torrents-Barrena, Jordina; Puig, Domenec; Melendez, Jaime; Valls, Aida
2016-03-01
Breast cancer is one of the most dangerous diseases affecting women in their 40s worldwide; it is estimated that one in eight women will develop a malignant carcinoma during their life. In addition, failure to undergo regular screening is an important contributor to mortality. However, computer-aided diagnosis systems attempt to enhance the quality of mammograms as well as the detection of early signs related to the disease. In this paper we propose a bank of Gabor filters to calculate the mean, standard deviation, skewness and kurtosis features over four evaluation window sizes. An active strategy is then used to select the most relevant pixels. Finally, a supervised classification stage using two-class support vector machines is employed, with an accurate estimation of kernel parameters. To demonstrate the methodology on mammographic image analysis, two main experiments are performed: abnormal/normal breast tissue classification and detection of the different breast cancer types. Moreover, the public screen-film mini-MIAS database is compared with a digitised breast cancer database to evaluate the method's robustness. The area under the receiver operating characteristic curve is used to measure the performance of the method, and both the confusion matrix and accuracy are calculated to assess the results of the proposed algorithm.
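A sketch of the Gabor feature-extraction step: filter the ROI at several frequencies and orientations, then take the four statistics named above from each response magnitude. The filter-bank settings and sample image are assumptions, not the paper's configuration.

```python
import numpy as np
from scipy.stats import kurtosis, skew
from skimage import data
from skimage.filters import gabor

# Any grayscale mammographic ROI would do; a sample image stands in here.
roi = data.camera().astype(float)

features = []
for frequency in (0.05, 0.15, 0.25):  # hypothetical filter bank
    for theta in np.linspace(0, np.pi, 4, endpoint=False):
        real, imag = gabor(roi, frequency=frequency, theta=theta)
        mag = np.hypot(real, imag)
        # the four statistics named in the abstract, per filter response
        features += [mag.mean(), mag.std(), skew(mag.ravel()),
                     kurtosis(mag.ravel())]

print(f"{len(features)} Gabor features extracted")  # 3 x 4 x 4 = 48
```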
Accurate prediction of the refractive index of polymers using first principles and data modeling
NASA Astrophysics Data System (ADS)
Afzal, Mohammad Atif Faiz; Cheng, Chong; Hachmann, Johannes
Organic polymers with a high refractive index (RI) have recently attracted considerable interest due to their potential application in optical and optoelectronic devices. The ability to tailor the molecular structure of polymers is the key to increasing the accessible RI values. Our work concerns the creation of predictive in silico models for the optical properties of organic polymers, the screening of large-scale candidate libraries, and the mining of the resulting data to extract the underlying design principles that govern their performance. This work was set up to guide our experimentalist partners and allow them to target the most promising candidates. Our model is based on the Lorentz-Lorenz equation and thus includes the polarizability and number density values for each candidate. For the former, we performed a detailed benchmark study of different density functionals, basis sets, and the extrapolation scheme towards the polymer limit. For the number density we devised an exceedingly efficient machine learning approach to correlate the polymer structure and the packing fraction in the bulk material. We validated the proposed RI model against the experimentally known RI values of 112 polymers. We could show that the proposed combination of physical and data modeling is both successful and highly economical to characterize a wide range of organic polymers, which is a prerequisite for virtual high-throughput screening.
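For reference, the Lorentz-Lorenz relation that anchors the model links the refractive index n to the number density N and the mean polarizability α (written here in Gaussian units):

```latex
\frac{n^{2} - 1}{n^{2} + 2} = \frac{4\pi}{3}\, N \alpha
```

so a polarizability from electronic-structure calculations and a number density from the machine-learned packing-fraction model suffice to solve for n.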
Baumes, Laurent A
2006-01-01
One of the main problems in high-throughput research for materials is still the design of experiments. At early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and to identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few new papers deal with strategies that guide exploratory studies; mostly, traditional designs, homogeneous coverings, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets on which efficient learning is hard to carry out, and interesting but rare classes usually go unrecognized. Here, a new iterative algorithm is suggested for characterizing the structure of the search space, working independently of the learning process. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" zones of the space to "unsteady" ones, which need more experiments to be modeled well. Evaluating new algorithms on benchmarks is compulsory, given the lack of prior evidence about their efficiency. The method is detailed and thoroughly tested with mathematical functions exhibiting different levels of complexity. The strategy is not only evaluated empirically; the effect of the sampling on subsequent machine learning performance is also quantified. The minimum sample size required for the algorithm to be statistically distinguishable from simple random sampling is investigated.
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for machine tool operation/machine shop I and II. Presented first are a…
A Study on Software-based Sensing Technology for Multiple Object Control in AR Video
Jung, Sungmo; Song, Jae-gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo
2010-01-01
Research on Augmented Reality (AR) has recently received much attention. Alongside it, the Machine-to-Machine (M2M) market has become active, and there are numerous efforts to apply the technology to real life in all sectors of society. To date, the M2M market has applied the existing marker-based AR technology in entertainment, business, and other industries. With the existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. This creates a problem: the relevant markers have to be extracted and printed on screen so that multiple objects can be loaded. However, since the distance between markers is not measured while markers are detected and copied, markers can overlap and the objects then fail to be augmented. To solve this problem, a circle having the longest radius needs to be created from the focal point of a marker to be copied, so that no object is copied within the confines of the circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed, and overlapping-marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms. PMID:22163444
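The keep-out rule the authors describe (no copy inside a circle of maximal radius around a copied marker's focal point) can be sketched roughly as below: a midpoint (Bresenham-style) circle algorithm rasterizes the boundary, and a distance test rejects copies inside it. PPHT-based marker detection is omitted, and all names are illustrative, not the paper's implementation.

```python
def circle_points(cx, cy, r):
    """Midpoint (Bresenham-style) rasterization of a circle boundary."""
    pts, x, y, d = [], r, 0, 1 - r
    while x >= y:
        pts += [(cx + x, cy + y), (cx - x, cy + y), (cx + x, cy - y), (cx - x, cy - y),
                (cx + y, cy + x), (cx - y, cy + x), (cx + y, cy - x), (cx - y, cy - x)]
        y += 1
        if d < 0:
            d += 2 * y + 1
        else:
            x -= 1
            d += 2 * (y - x) + 1
    return pts

def allowed_to_copy(marker_center, new_center, r):
    """Reject a copy whose center falls inside the keep-out circle."""
    dx = new_center[0] - marker_center[0]
    dy = new_center[1] - marker_center[1]
    return dx * dx + dy * dy > r * r
```

The rasterized boundary is useful when the exclusion zone must be drawn or tested per pixel; for center points alone, the plain distance check suffices.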
Randhawa, Vinay; Kumar Singh, Anil; Acharya, Vishal
2015-12-01
Systems-biology inspired identification of drug targets and machine learning-based screening of small molecules which modulate their activity have the potential to revolutionize modern drug discovery by complementing conventional methods. To utilize the effectiveness of such pipelines, we first analyzed the dysregulated gene pairs between control and tumor samples and then implemented an ensemble-based feature selection approach to prioritize targets in oral squamous cell carcinoma (OSCC) for therapeutic exploration. Based on the structural information of known inhibitors of CXCR4, one of the best targets identified in this study, feature selection was implemented for the identification of optimal structural features (molecular descriptors), based on which a classification model was generated. Furthermore, the CXCR4-centered descriptor-based classification model was finally utilized to screen a repository of plant-derived small molecules to obtain potential inhibitors. The application of our methodology may assist in the effective selection of targets that may previously have been overlooked, which in turn will lead to the development of new oral cancer medications. The small molecules identified in this study can be ideal candidates for trials as potential novel anti-oral-cancer agents. Importantly, the distinct steps of this whole study may provide a reference for the analysis of other complex human diseases.
Performance study of a data flow architecture
NASA Technical Reports Server (NTRS)
Adams, George
1985-01-01
Teams of scientists studied data flow concepts, static data flow machine architecture, and the VAL language. Each team mapped its application onto the machine and coded it in VAL. The principal findings of the study were: (1) Five of the seven applications used the full power of the target machine. The galactic simulation and multigrid fluid flow teams found that a significantly smaller version of the machine (16 processing elements) would suffice. (2) A number of machine design parameters including processing element (PE) function unit numbers, array memory size and bandwidth, and routing network capability were found to be crucial for optimal machine performance. (3) The study participants readily acquired VAL programming skills. (4) Participants learned that application-based performance evaluation is a sound method of evaluating new computer architectures, even those that are not fully specified. During the course of the study, participants developed models for using computers to solve numerical problems and for evaluating new architectures. These models form the bases for future evaluation studies.
Currency crisis indication by using ensembles of support vector machine classifiers
NASA Astrophysics Data System (ADS)
Ramli, Nor Azuana; Ismail, Mohd Tahir; Wooi, Hooy Chee
2014-07-01
There are many methods that have been tried in the analysis of currency crises. However, not all methods provide accurate indications. This paper introduces an ensemble of Support Vector Machine classifiers, an approach not previously applied to currency crisis analysis, with the aim of increasing indication accuracy. The proposed ensemble classifiers' performance is measured using percentage accuracy, root mean squared error (RMSE), area under the Receiver Operating Characteristic (ROC) curve, and Type II error. The performance of the ensemble of Support Vector Machine classifiers is compared with that of a single Support Vector Machine classifier, and both classifiers are tested on a dataset from 27 countries with 12 macroeconomic indicators for each country. From our analyses, the results show that the ensemble of Support Vector Machine classifiers outperforms the single Support Vector Machine classifier on the problem of indicating a currency crisis across a range of standard measures for comparing classifier performance.
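A minimal sketch of that comparison, a bagged ensemble of SVMs against a single SVM scored by cross-validated ROC AUC, with synthetic stand-in data in place of the 27-country macroeconomic panel (scikit-learn 1.2+ is assumed for the `estimator` keyword):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Stand-in for the crisis panel: samples of 12 macro indicators, rare crises.
X, y = make_classification(n_samples=324, n_features=12, weights=[0.8, 0.2],
                           random_state=0)

single = SVC(kernel="rbf", probability=True, random_state=0)
ensemble = BaggingClassifier(estimator=single, n_estimators=25, random_state=0)

for name, model in [("single SVM", single), ("SVM ensemble", ensemble)]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean ROC AUC = {auc:.3f}")
```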
There are few toxicity data available on the vast majority of chemicals in commerce. High-throughput screening (HTS) studies, such as those being carried out by the U.S. Environmental Protection Agency (EPA) ToxCast program in partnership with the federal Tox21 research progra...
The IBM PC as an Online Search Machine. Part 5: Searching through Crosstalk.
ERIC Educational Resources Information Center
Kolner, Stuart J.
1985-01-01
This last of a five-part series on using the IBM personal computer for online searching highlights a brief review, search process, making the connection, switching between screens and modes, online transaction, capture buffer controls, coping with options, function keys, script files, processing downloaded information, note to TELEX users, and…
Lee, Eugene K; Tran, David D; Keung, Wendy; Chan, Patrick; Wong, Gabriel; Chan, Camie W; Costa, Kevin D; Li, Ronald A; Khine, Michelle
2017-11-14
Accurately predicting cardioactive effects of new molecular entities for therapeutics remains a daunting challenge. Immense research effort has been focused toward creating new screening platforms that utilize human pluripotent stem cell (hPSC)-derived cardiomyocytes and three-dimensional engineered cardiac tissue constructs to better recapitulate human heart function and drug responses. As these new platforms become increasingly sophisticated and high throughput, the drug screens result in larger multidimensional datasets. Improved automated analysis methods must therefore be developed in parallel to fully comprehend the cellular response across a multidimensional parameter space. Here, we describe the use of machine learning to comprehensively analyze 17 functional parameters derived from force readouts of hPSC-derived ventricular cardiac tissue strips (hvCTS) electrically paced at a range of frequencies and exposed to a library of compounds. The generated metric is effective for determining the cardioactivity of a given drug. Furthermore, we demonstrate a classification model that can automatically predict the mechanistic action of an unknown cardioactive drug. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Bench-scale screening tests for a boiling sodium-potassium alloy solar receiver
NASA Astrophysics Data System (ADS)
Moreno, J. B.; Moss, T. A.
1993-06-01
Bench-scale tests were carried out in support of the design of a second-generation 75-kWt reflux pool-boiler solar receiver. The receiver will be made from Haynes Alloy 230 and will contain the sodium-potassium alloy NaK-78. The bench-scale tests used quartz-lamp-heated boilers to screen candidate boiling stabilization materials and methods at temperatures up to 750 °C. Candidates that provided stable boiling were tested for hot-restart behavior. Poor stability was obtained with single 1/4-inch diameter patches of powdered metal hot-press sintered onto the wetted side of the heat-input area. Laser-drilled and electric-discharge-machined cavities in the heated surface also performed poorly. Small additions of xenon, and heated-surface tilt out of the vertical, dramatically improved poor boiling stability; additions of helium or oxygen did not. The most stable boiling was obtained when the entire heat-input area was covered by a powdered-metal coating. The effect of heated-area size was assessed for one coating: at low incident fluxes, when even this coating performed poorly, increasing the heated-area size markedly improved boiling stability. Good hot-restart behavior was not observed with any candidate, although results were significantly better with added xenon in a boiler shortened from 3 to 2 feet. In addition to the screening tests, flash-radiography imaging of metal-vapor bubbles during boiling was attempted. Contrary to the Cole-Rohsenow correlation, these bubble-size estimates did not vary with pressure; instead they were constant, consistent with the only other alkali-metal measurements, but about 1/2 their size.
Bickelhaupt, Sebastian; Paech, Daniel; Kickingereder, Philipp; Steudle, Franziska; Lederer, Wolfgang; Daniel, Heidi; Götz, Michael; Gählert, Nils; Tichy, Diana; Wiesenfarth, Manuel; Laun, Frederik B; Maier-Hein, Klaus H; Schlemmer, Heinz-Peter; Bonekamp, David
2017-08-01
To assess radiomics as a tool to determine how well lesions found suspicious on breast cancer screening X-ray mammography can be categorized into malignant and benign with unenhanced magnetic resonance (MR) mammography with diffusion-weighted imaging and T2-weighted sequences. From an asymptomatic screening cohort, 50 women with mammographically suspicious findings were examined with contrast-enhanced breast MRI (ceMRI) at 1.5T. Out of this protocol, an unenhanced, abbreviated diffusion-weighted imaging protocol (ueMRI) including T2-weighted (T2w), diffusion-weighted imaging (DWI), and DWI with background suppression (DWIBS) sequences and corresponding apparent diffusion coefficient (ADC) maps was extracted. From ueMRI-derived radiomic features, three Lasso-supervised machine-learning classifiers were constructed and compared with the clinical performance of a highly experienced radiologist: 1) a univariate mean-ADC model, 2) an unconstrained radiomic model, 3) a constrained radiomic model with mandatory inclusion of mean ADC. The unconstrained and constrained radiomic classifiers consisted of 11 parameters each and achieved differentiation of malignant from benign lesions with a .632+ bootstrap receiver operating characteristic (ROC) area under the curve (AUC) of 84.2%/85.1%, compared to 77.4% for mean ADC and 95.9%/95.9% for the experienced radiologist using ceMRI/ueMRI. In this pilot study we identified two ueMRI radiomics classifiers that performed well in the differentiation of malignant from benign lesions and achieved higher performance than the mean ADC parameter alone. Classification performance was lower than the almost perfect performance of a highly experienced breast radiologist. The potential of radiomics to provide a training-independent diagnostic decision tool is indicated. A performance reaching the human expert would be highly desirable, and based on our results it is considered possible when the concept is extended to larger cohorts with further development and validation of the technique. Level of Evidence: 1. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2017;46:604-616. © 2017 International Society for Magnetic Resonance in Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angers, Crystal Plume; Bottema, Ryan; Buckley, Les
Purpose: Treatment unit uptime statistics are typically used to monitor radiation equipment performance. The Ottawa Hospital Cancer Centre has introduced the use of Quality Control (QC) test success as a quality indicator for equipment performance and the overall health of the equipment QC program. Methods: Implemented in 2012, QATrack+ is used to record and monitor over 1100 routine machine QC tests each month for 20 treatment and imaging units (http://qatrackplus.com/). Using an SQL (structured query language) script, automated queries of the QATrack+ database are used to generate program metrics such as the number of QC tests executed and the percentage of tests passing, at tolerance, or at action. These metrics are compared against machine uptime statistics already reported within the program. Results: Program metrics for 2015 show good correlation between the pass rate of QC tests and uptime for a given machine. For the nine conventional linacs, the QC test success rate was consistently greater than 97%. The corresponding uptimes for these units are better than 98%. Machines that consistently show higher failure or tolerance rates in the QC tests have lower uptimes. This points either to poor machine performance requiring corrective action or to problems with the QC program. Conclusions: QATrack+ significantly improves the organization of QC data but can also aid in overall equipment management. Complementing machine uptime statistics with QC test metrics provides a more complete picture of overall machine performance and can be used to identify areas of improvement in the machine service and QC programs.
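The described query is straightforward to reproduce against any test-instance table. The sketch below uses sqlite3 as a stand-in backend; the table and column names are hypothetical, not the real QATrack+ schema or the authors' SQL script.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE qc_test (machine TEXT, status TEXT);  -- hypothetical schema
INSERT INTO qc_test VALUES ('linac1','pass'),('linac1','pass'),
                           ('linac1','tolerance'),('linac2','action');
""")
rows = conn.execute("""
SELECT machine,
       COUNT(*) AS n_tests,
       100.0 * SUM(status = 'pass')      / COUNT(*) AS pct_pass,
       100.0 * SUM(status = 'tolerance') / COUNT(*) AS pct_tolerance,
       100.0 * SUM(status = 'action')    / COUNT(*) AS pct_action
FROM qc_test GROUP BY machine
""").fetchall()
for machine, n, p, t, a in rows:
    print(f"{machine}: {n} tests, {p:.1f}% pass, {t:.1f}% tol, {a:.1f}% action")
```

Comparing these per-machine rates against uptime over the same period gives the combined quality picture the abstract describes.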
Role of Gist and PHOG Features in Computer-Aided Diagnosis of Tuberculosis without Segmentation
Chauhan, Arun; Chauhan, Devesh; Rout, Chittaranjan
2014-01-01
Purpose: Effective diagnosis of tuberculosis (TB) relies on accurate interpretation of radiological patterns found in a chest radiograph (CXR). A lack of skilled radiologists and other resources, especially in developing countries, hinders its efficient diagnosis. Computer-aided diagnosis (CAD) methods provide a second opinion to radiologists on their findings and thereby assist in better diagnosis of cancer and other diseases, including TB. However, existing CAD methods for TB are based on the extraction of textural features from manually or semi-automatically segmented CXRs. These methods are prone to errors and cannot be implemented in X-ray machines for automated classification. Methods: Gabor, Gist, histogram of oriented gradients (HOG), and pyramid histogram of oriented gradients (PHOG) features extracted from the whole image can be implemented into existing X-ray machines to discriminate between TB and non-TB CXRs in an automated manner. Localized features were extracted for the above methods using various parameters, such as frequency range, blocks, and region of interest. The performance of these features was evaluated against textural features. Two digital CXR image datasets (8-bit DA and 14-bit DB) were used for evaluating the performance of these features. Results: Gist (accuracy 94.2% for DA, 86.0% for DB) and PHOG (accuracy 92.3% for DA, 92.0% for DB) features provided better results for both datasets. These features were implemented to develop a MATLAB toolbox, TB-Xpredict, which is freely available for academic use at http://sourceforge.net/projects/tbxpredict/. This toolbox provides both automated training and prediction modules and does not require expertise in image processing for operation. Conclusion: Since the features used in TB-Xpredict do not require segmentation, the toolbox can easily be implemented in X-ray machines. This toolbox can effectively be used for the mass screening of TB in high-burden areas with improved efficiency. PMID:25390291
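A compact sketch of the segmentation-free idea, whole-image HOG features feeding a classifier, assuming scikit-image and scikit-learn. The descriptor parameters are illustrative, not the paper's tuned values (the authors' own toolbox is in MATLAB).

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import SVC

def whole_image_features(image):
    """HOG descriptor of the entire CXR; no lung segmentation required."""
    image = resize(image, (256, 256), anti_aliasing=True)
    return hog(image, orientations=9, pixels_per_cell=(32, 32),
               cells_per_block=(2, 2), block_norm="L2-Hys")

def train_tb_classifier(images, labels):
    """images: list of 2-D grayscale arrays; labels: 1 = TB, 0 = non-TB."""
    X = np.array([whole_image_features(im) for im in images])
    return SVC(kernel="rbf").fit(X, labels)
```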
Practical Framework: Implementing OEE Method in Manufacturing Process Environment
NASA Astrophysics Data System (ADS)
Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.
2016-02-01
A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate at its maximum designed capability and produce quality products. However, for various reasons, machines are usually unable to achieve the desired performance. Since performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine. The reliable results produced by OEE can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; however, the how factor, especially the implementation of OEE in a manufacturing process environment, has not yet been revealed. Thus, this paper presents a practical framework to implement OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring and then improving the performance of their machines.
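For reference, OEE is conventionally computed as the product of availability, performance, and quality rates; a small helper makes the arithmetic concrete. The shift figures below are invented for illustration and are not from the case study.

```python
def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
    """Overall Equipment Effectiveness = Availability x Performance x Quality."""
    run_time = planned_time - downtime
    availability = run_time / planned_time          # uptime fraction
    performance = (ideal_cycle_time * total_count) / run_time  # speed fraction
    quality = good_count / total_count              # yield fraction
    return availability * performance * quality

# Example shift: 480 min planned, 60 min down, 1.0 min ideal cycle time,
# 380 parts made, 370 good -> OEE ~ 0.77 (a commonly cited world-class
# benchmark is around 0.85).
print(f"OEE = {oee(480, 60, 1.0, 380, 370):.2f}")
```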
Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro
2018-05-09
Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.
NASA Astrophysics Data System (ADS)
Cady, Stephen
2009-02-01
Chiasmus is a responsive and dynamically reflective, two-sided volumetric surface that embodies phenomenological issues such as the formation of images, observer and machine perception and the dynamics of the screen as a space of image reception. It consists of a square grid of 64 individually motorized cube elements engineered to move linearly. Each cube is controlled by custom software that analyzes video imagery for luminance values and sends these values to the motor control mechanisms to coordinate the individual movements. The resolution of the sculptural screen from the individual movements allows its volume to dynamically alter, providing novel and unique perspectives of its mobile form to an observer.
Method and apparatus for monitoring machine performance
Smith, Stephen F.; Castleberry, Kimberly N.
1996-01-01
Machine operating conditions can be monitored by analyzing, in either the time or frequency domain, the spectral components of the motor current. Changes in the electric background noise, induced by mechanical variations in the machine, are correlated to changes in the operating parameters of the machine.
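A minimal numpy sketch of the frequency-domain half of this idea: sample the motor current, take its spectrum, and watch for components that appear or grow as the mechanical condition changes. The 60 Hz mains component and the small fault sideband are synthetic stand-ins, not data from the patented system.

```python
import numpy as np

fs = 5000                                   # sampling rate in Hz, illustrative
t = np.arange(0, 2.0, 1 / fs)
# Synthetic motor current: mains at 60 Hz plus a small fault-induced component.
current = np.sin(2 * np.pi * 60 * t) + 0.05 * np.sin(2 * np.pi * 37 * t)
current += 0.01 * np.random.default_rng(0).standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(current)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# Condition changes show up as new or grown peaks relative to a baseline scan.
top = np.argsort(spectrum)[-2:]
print("dominant components (Hz):", np.sort(freqs[top]))
```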
NASA Astrophysics Data System (ADS)
S, Kyriacou; E, Kontoleontos; S, Weissenberger; L, Mangani; E, Casartelli; I, Skouteropoulou; M, Gattringer; A, Gehrer; M, Buchmayr
2014-03-01
An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (the EASY software), a fast solver (block-coupled CFD), and a flexible geometry generation tool. The EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA(PCA)) that can be used in both single-objective (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For an additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. Besides being robust and fast, this method also provides a large gain in computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail, and optimization results for hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
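The surrogate pre-screening step is the heart of a MAEA and fits in a few lines. In this rough sketch a cheap regressor ranks offspring so only the most promising receive the expensive evaluation; a trivial function stands in for the CFD solver, and EASY's PCA-driven operators are omitted.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
expensive_eval = lambda x: np.sum((x - 0.3) ** 2)   # stand-in for the CFD solver

# Archive of exactly evaluated designs seeds the surrogate.
X = rng.random((20, 5))
f = np.array([expensive_eval(x) for x in X])
for gen in range(10):
    offspring = rng.random((60, 5))                 # placeholder variation operator
    surrogate = KNeighborsRegressor(n_neighbors=3).fit(X, f)
    promising = offspring[np.argsort(surrogate.predict(offspring))[:6]]
    f_new = np.array([expensive_eval(x) for x in promising])  # 6 solver calls, not 60
    X, f = np.vstack([X, promising]), np.concatenate([f, f_new])
print("best objective found:", f.min())
```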
Lin, Yin-Yan; Wu, Hau-Tieng; Hsu, Chi-An; Huang, Po-Chiun; Huang, Yuan-Hao; Lo, Yu-Lun
2016-12-07
Physiologically, the thoracic (THO) and abdominal (ABD) movement signals, captured using wearable piezo-electric bands, provide information about various types of apnea, including central sleep apnea (CSA) and obstructive sleep apnea (OSA). However, the use of piezo-electric wearables in detecting sleep apnea events has seldom been explored in the literature. This study explored the possibility of identifying sleep apnea events, including OSA and CSA, by analyzing only one or both of the THO and ABD signals. An adaptive non-harmonic model was introduced to model the THO and ABD signals, which allows us to design features for sleep apnea events. To confirm the suitability of the extracted features, a support vector machine was applied to classify three categories: normal and hypopnea, OSA, and CSA. According to a database of 34 subjects, the overall classification accuracies were on average 75.9%±11.7% and 73.8%±4.4%, respectively, based on cross-validation. When the features determined from the THO and ABD signals were combined, the overall classification accuracy became 81.8%±9.4%. These features were applied in designing a state machine for online apnea event detection. Two event-by-event accuracy indices, S and I, were proposed for evaluating the performance of the state machine. For the same database, the S index was 84.01%±9.06% and the I index was 77.21%±19.01%. The results indicate the considerable potential of applying the proposed algorithm to clinical examinations for both screening and homecare purposes.
Feizi, Alborz; Zhang, Yibo; Greenbaum, Alon; Guziak, Alex; Luong, Michelle; Chan, Raymond Yan Lok; Berg, Brandon; Ozkan, Haydar; Luo, Wei; Wu, Michael; Wu, Yichen; Ozcan, Aydogan
2016-11-01
Monitoring yeast cell viability and concentration is important in brewing, baking and biofuel production. However, existing methods of measuring viability and concentration are relatively bulky, tedious and expensive. Here we demonstrate a compact and cost-effective automatic yeast analysis platform (AYAP), which can rapidly measure cell concentration and viability. AYAP is based on digital in-line holography and on-chip microscopy and rapidly images a large field-of-view of 22.5 mm². This lens-free microscope weighs 70 g and utilizes a partially-coherent illumination source and an opto-electronic image sensor chip. A touch-screen user interface based on a tablet-PC is developed to reconstruct the holographic shadows captured by the image sensor chip and use a support vector machine (SVM) model to automatically classify live and dead cells in a yeast sample stained with methylene blue. In order to quantify its accuracy, we varied the viability and concentration of the cells and compared AYAP's performance with a fluorescence exclusion staining based gold-standard using regression analysis. The results agree very well with this gold-standard method and no significant difference was observed between the two methods within a concentration range of 1.4 × 10⁵ to 1.4 × 10⁶ cells per mL, providing a dynamic range suitable for various applications. This lens-free computational imaging technology that is coupled with machine learning algorithms would be useful for cost-effective and rapid quantification of cell viability and density even in field and resource-poor settings.
Using GPS to evaluate productivity and performance of forest machine systems
Steven E. Taylor; Timothy P. McDonald; Matthew W. Veal; Ton E. Grift
2001-01-01
This paper reviews recent research and operational applications of using GPS as a tool to help monitor the locations, travel patterns, performance, and productivity of forest machines. The accuracy of dynamic GPS data collected on forest machines under different levels of forest canopy is reviewed first. Then, the paper focuses on the use of GPS for monitoring forest...
NASA Astrophysics Data System (ADS)
Bhaumik, Munmun; Maity, Kalipada
Powder mixed electro discharge machining (PMEDM) is a further advancement of conventional electro discharge machining (EDM), in which powder particles are suspended in the dielectric medium to enhance the machining rate as well as the surface finish. Cryogenic treatment is introduced in this process to improve tool life and cutting-tool properties. In the present investigation, the characterization of the cryotreated tempered electrode was performed. An attempt has been made to study the effect of a cryotreated double-tempered electrode on the radial overcut (ROC) when SiC powder is mixed into the kerosene dielectric during electro discharge machining of AISI 304. The process performance has been evaluated by means of ROC, with peak current, pulse-on time, gap voltage, duty cycle and powder concentration considered as process parameters, and machining performed using tungsten carbide electrodes (untreated and double-tempered). A regression analysis was performed to correlate the response with the process parameters. Microstructural analysis was carried out on the machined surfaces. The least radial overcut was observed for conventional EDM as compared to powder mixed EDM. The cryotreated double-tempered electrode reduced the radial overcut significantly compared with the untreated electrode.
Fang, Jiansong; Yang, Ranyao; Gao, Li; Zhou, Dan; Yang, Shengqian; Liu, Ai-Lin; Du, Guan-hua
2013-11-25
Butyrylcholinesterase (BuChE, EC 3.1.1.8) is an important pharmacological target for Alzheimer's disease (AD) treatment. However, the currently available BuChE inhibitor screening assays are expensive, labor-intensive, and compound-dependent. It is necessary to develop robust in silico methods to predict the activities of BuChE inhibitors for lead identification. In this investigation, support vector machine (SVM) models and naive Bayesian models were built to discriminate BuChE inhibitors (BuChEIs) from noninhibitors. Each molecule was initially represented by 1870 structural descriptors (1235 from ADRIANA.Code, 334 from MOE, and 301 from Discovery Studio). Correlation analysis and a stepwise variable selection method were applied to figure out activity-related descriptors for the prediction models. Additionally, structural fingerprint descriptors were added to improve the predictive ability of the models, which was measured by cross-validation, a test set validation with 1001 compounds, and an external test set validation with 317 diverse chemicals. The best two models gave Matthews correlation coefficients of 0.9551 and 0.9550 for the test set and 0.9132 and 0.9221 for the external test set. To demonstrate the practical applicability of the models in virtual screening, we screened an in-house dataset with 3601 compounds, and 30 compounds were selected for further bioactivity assay. The assay results showed that 10 out of 30 compounds exerted significant BuChE inhibitory activities, with IC50 values ranging from 0.32 to 22.22 μM, and three new scaffolds as BuChE inhibitors were identified for the first time. To the best of our knowledge, this is the first report on BuChE inhibitors using machine learning approaches. The models generated from the SVM and naive Bayesian approaches successfully predicted BuChE inhibitors. The study proved the feasibility of a new method for predicting the bioactivities of ligands and discovering novel lead compounds.
Wacker, Soren; Noskov, Sergei Yu
2018-05-01
Drug-induced abnormal heart rhythm known as Torsades de Pointes (TdP) is a potentially lethal ventricular tachycardia found in many patients. Even newly released anti-arrhythmic drugs, like ivabradine with the HCN channel as a primary target, block the hERG potassium current in an overlapping concentration interval. Promiscuous drug block of the hERG channel may perturb the action potential duration (APD) and lead to TdP, especially when combined with polypharmacy and/or electrolyte disturbances. The example of the novel anti-arrhythmic ivabradine illustrates a clinically important and ongoing deficit in drug design and warrants better screening methods. There is an urgent need to develop new approaches for rapid and accurate assessment of how drugs with complex interactions and multiple subcellular targets can predispose to or protect from drug-induced TdP. One unexpected outcome of the compulsory hERG screening implemented in the USA and the European Union is the large datasets of IC50 values for various molecules entering the market. These abundant data now allow the construction of predictive machine-learning (ML) models. Novel ML algorithms and techniques promise accuracy in determining IC50 values of hERG blockade that is comparable to or surpasses that of earlier QSAR or molecular modeling techniques. To test the performance of modern ML techniques, we have developed a computational platform integrating various workflows for quantitative structure-activity relationship (QSAR) models using data from the ChEMBL database. To establish the predictive power of ML-based algorithms, we computed IC50 values for a large dataset of molecules and compared them with automated patch-clamp measurements for a large dataset of hERG-blocking and non-blocking drugs, an industry gold standard in studies of cardiotoxicity. The optimal protocol with high sensitivity and predictive power is based on the novel eXtreme gradient boosting (XGBoost) algorithm. The ML platform with XGBoost displays excellent performance, with a coefficient of determination of up to R² ≈ 0.8 for pIC50 values in evaluation datasets, surpassing other metrics and approaches available in the literature. Ultimately, the ML-based platform developed in our work is a scalable framework with automation potential to interact with other developing technologies in the cardiotoxicity field, including high-throughput electrophysiology measurements delivering large datasets of profiled drugs, rapid synthesis, and drug development via progress in synthetic biology.
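A bare-bones version of the winning setup, gradient-boosted regression of pIC50 values from molecular descriptors scored by the coefficient of determination, assuming the xgboost Python package and synthetic descriptors in place of the curated ChEMBL set:

```python
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = rng.random((2000, 200))                 # stand-in molecular descriptors
pic50 = X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(2000)  # synthetic target

X_tr, X_te, y_tr, y_te = train_test_split(X, pic50, random_state=0)
model = XGBRegressor(n_estimators=500, learning_rate=0.05, max_depth=4)
model.fit(X_tr, y_tr)
print("R^2 on held-out set:", r2_score(y_te, model.predict(X_te)))
```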
Articulated, Performance-Based Instruction Objectives Guide for Machine Shop Technology.
ERIC Educational Resources Information Center
Henderson, William Edward, Jr., Ed.
This articulation guide contains 21 units of instruction for two years of machine shop. The objectives of the program are to provide the student with the basic terminology and fundamental knowledge and skills in machining (year 1) and to teach him/her to set up and operate machine tools and make or repair metal parts, tools, and machines (year 2).…
Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; de Visser, Ewart; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank
2017-10-01
As society becomes more reliant on machines and automation, understanding how people utilize advice is a necessary endeavor. Our objective was to reveal the underlying neural associations during advice utilization from expert human and machine agents with fMRI and multivariate Granger causality analysis. During an X-ray luggage-screening task, participants accepted or rejected good or bad advice from either the human or machine agent framed as experts with manipulated reliability (high miss rate). We showed that the machine-agent group decreased their advice utilization compared to the human-agent group and these differences in behaviors during advice utilization could be accounted for by high expectations of reliable advice and changes in attention allocation due to miss errors. Brain areas involved with the salience and mentalizing networks, as well as sensory processing involved with attention, were recruited during the task and the advice utilization network consisted of attentional modulation of sensory information with the lingual gyrus as the driver during the decision phase and the fusiform gyrus as the driver during the feedback phase. Our findings expand on the existing literature by showing that misses degrade advice utilization, which is represented in a neural network involving salience detection and self-processing with perceptual integration.
Performance of solar refrigerant ejector refrigerating machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Khalidy, N.A.H.
1997-12-31
In this work a detailed analysis for the ideal, theoretical, and experimental performance of a solar refrigerant ejector refrigerating machine is presented. A comparison of five refrigerants to select a desirable one for the system is made. The theoretical analysis showed that refrigerant R-113 is more suitable for use in the system. The influence of the boiler, condenser, and evaporator temperatures on system performance is investigated experimentally in a refrigerant ejector refrigerating machine using R-113 as a working refrigerant.
NASA Astrophysics Data System (ADS)
Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie
2015-08-01
The business failure of numerous companies results in financial crises. The high social costs associated with such crises have led people to search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with the support vector machine. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among different modelling means. We reviewed research on forecasting business risk with the support vector machine, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, the support vector machine, random sampling, and group decision under the general frame in forecasting business risk. Under the APMF-WHAE frame, with the support vector machine as the base predictive model, four specific predictive models were produced: a pure support vector machine, a hybrid support vector machine involving principal components analysis, a support vector machine ensemble involving random sampling and group decision, and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines produced dominant performance compared with the pure support vector machine and the support vector machine ensemble.
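Two of the four variants map directly onto standard scikit-learn building blocks: a PCA-plus-SVM hybrid, and a bagged ensemble of such hybrids with majority (group) decision. The sketch below uses placeholder data, since the paper's financial-ratio features are not reproduced here (scikit-learn 1.2+ assumed):

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=30, random_state=0)

# Hybrid: PCA feature extraction feeding an SVM.
hybrid = make_pipeline(PCA(n_components=10), SVC())
# Ensemble of hybrids: random sampling of cases plus majority (group) decision.
ensemble_of_hybrids = BaggingClassifier(estimator=hybrid, n_estimators=15,
                                        max_samples=0.8, random_state=0)

for name, model in [("hybrid SVM", hybrid),
                    ("ensemble of hybrid SVMs", ensemble_of_hybrids)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```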
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank, R.N.
1990-02-28
The Inspection Shop at Lawrence Livermore Lab recently purchased a Sheffield Apollo RS50 Direct Computer Control Coordinate Measuring Machine. The performance of the machine was specified to conform to the B89 standard, which relies heavily upon using the measuring machine in its intended manner to verify its accuracy (rather than parametric tests). Although it would be possible to use the interactive measurement system to perform these tasks, a more thorough and efficient job can be done by creating Function Library programs for certain tasks which integrate the Hewlett-Packard Basic 5.0 language and calls to proprietary analysis and machine control routines. This combination provides efficient use of the measuring machine with a minimum of keyboard input, plus an analysis of the data with respect to the B89 standard rather than a CMM analysis which would require subsequent interpretation. This paper discusses some characteristics of the Sheffield machine control and analysis software and my use of the H-P Basic language to create automated measurement programs to support the B89 performance evaluation of the CMM. 1 ref.
Machine learning in heart failure: ready for prime time.
Awan, Saqib Ejaz; Sohel, Ferdous; Sanfilippo, Frank Mario; Bennamoun, Mohammed; Dwivedi, Girish
2018-03-01
The aim of this review is to present an up-to-date overview of the application of machine learning methods in heart failure including diagnosis, classification, readmissions and medication adherence. Recent studies have shown that the application of machine learning techniques may have the potential to improve heart failure outcomes and management, including cost savings by improving existing diagnostic and treatment support systems. Recently developed deep learning methods are expected to yield even better performance than traditional machine learning techniques in performing complex tasks by learning the intricate patterns hidden in big medical data. The review summarizes the recent developments in the application of machine and deep learning methods in heart failure management.
Adaptation of existing infrared technologies to unanticipated applications
NASA Astrophysics Data System (ADS)
Peng, Philip
2005-01-01
Radiation thermometry is just one of many applications, both potential and realized, of infrared technology. During the SARS (Severe Acute Respiratory Syndrome) global crisis in 2003, the technology was utilized as a preliminary screening method for infected persons as a defense against a major outbreak, as the primary symptom of this disease is elevated body temperature. ATC promptly developed a product designed specifically for high-volume crowd screening of febrile individuals. For this application, the machine must register the temperature of subjects rapidly and efficiently, with a certain degree of accuracy, and function for extended periods of time. The equipment must be safe to use, easily deployed, and able to function with minimal maintenance. The ATIR-303 model satisfies these and other prerequisite conditions admirably. Studies were performed on the correlation between the maximum temperature registered across an individual's facial features, as measured under the conditions of usage, and the individual's core temperature. The results demonstrated that the ATIR-303 is very suitable for this application. Other applications of infrared technology in various areas, such as medical diagnosis, non-destructive testing, security, and search and rescue, are also areas of interest for ATC. The progress ATC has achieved in these areas is also presented.
Obstructive Sleep Apnea Screening Using a Piezo-Electric Sensor.
Erdenebayar, Urtnasan; Park, Jong Uk; Jeong, Pilsoo; Lee, Kyoung Joung
2017-06-01
In this study, we propose a novel method for obstructive sleep apnea (OSA) detection using a piezo-electric sensor. OSA is a relatively common sleep disorder. However, more than 80% of OSA patients remain undiagnosed. We investigated the feasibility of OSA assessment using a single-channel physiological signal to simplify the OSA screening. We detected both snoring and heartbeat information by using a piezo-electric sensor, and snoring index (SI) and features based on pulse rate variability (PRV) analysis were extracted from the filtered piezo-electric sensor signal. A support vector machine (SVM) was used as a classifier to detect OSA events. The performance of the proposed method was evaluated on 45 patients from mild, moderate, and severe OSA groups. The method achieved a mean sensitivity, specificity, and accuracy of 72.5%, 74.2%, and 71.5%; 85.8%, 80.5%, and 80.0%; and 70.3%, 77.1%, and 71.9% for the mild, moderate, and severe groups, respectively. Finally, these results not only show the feasibility of OSA detection using a piezo-electric sensor, but also illustrate its usefulness for monitoring sleep and diagnosing OSA. © 2017 The Korean Academy of Medical Sciences.
Methods And Systms For Analyzing The Degradation And Failure Of Mechanical Systems
Jarrell, Donald B.; Sisk, Daniel R.; Hatley, Darrel D.; Kirihara, Leslie J.; Peters, Timothy J.
2005-02-08
Methods and systems for identifying, understanding, and predicting the degradation and failure of mechanical systems are disclosed. The methods include measuring and quantifying stressors that are responsible for the activation of degradation mechanisms in the machine component of interest. The intensity of the stressor may be correlated with the rate of physical degradation according to some determinable function such that a derivative relationship exists between the machine performance, degradation, and the underlying stressor. The derivative relationship may be used to make diagnostic and prognostic calculations concerning the performance and projected life of the machine. These calculations may be performed in real time to allow the machine operator to quickly adjust the operational parameters of the machinery in order to help minimize or eliminate the effects of the degradation mechanism, thereby prolonging the life of the machine. Various systems implementing the methods are also disclosed.
NASA Astrophysics Data System (ADS)
Kagawa, Noboru
A Stirling cooler (refrigerator) was proposed in 1862, and the first Stirling cooler was put on the market in 1955. Since then, many Stirling coolers have been developed and marketed as cryocoolers. Recently, Stirling cycle machines for heating and cooling at near-ambient temperatures, between 173 and 400 K, have been recognized as promising candidates for alternative systems that are more compatible with people and the Earth. The ideal Stirling cycle offers the highest thermal efficiency, and the working fluids do not cause the serious environmental problems of ozone depletion and global warming. In this review, the basic thermodynamics of the Stirling cycle are briefly described to quantify the attractive cycle performance. The fundamentals needed to realize actual Stirling coolers and heat pumps are introduced in detail. The current status of Stirling cycle machine technologies is reviewed. Some machines have almost achieved the target performance. Duplex-Stirling-cycle and Vuilleumier-cycle machines and their performance are also introduced.
Machining of Aircraft Titanium with Abrasive-Waterjets for Fatigue Critical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H. T.; Hovanski, Yuri; Dahl, Michael E.
2010-10-04
Laboratory tests were conducted to determine the fatigue performance of AWJ-machined aircraft titanium. Dog-bone specimens machined with AWJs were prepared and tested with and without sanding and dry-grit blasting with Al2O3 as secondary processes. The secondary processes were applied to remove the visual appearance of AWJ-generated striations and to clean up the garnet embedment. The fatigue performance of AWJ-machined specimens was compared with baseline specimens machined with CNC milling. Fatigue test results not only confirmed the findings from the aluminum dog-bone specimens but also showed further enhanced fatigue performance. In addition, titanium is known to be notoriously difficult to cut with contact tools, while AWJs cut it 34% faster than stainless steel. AWJ cutting and dry-grit blasting are shown to be a preferred combination for processing aircraft titanium that is fatigue critical.
Target specific compound identification using a support vector machine.
Plewczynski, Dariusz; von Grotthuss, Marcin; Spieser, Stephane A H; Rychlewski, Leszek; Wyrwicz, Lucjan S; Ginalski, Krzysztof; Koch, Uwe
2007-03-01
In many cases at the beginning of an HTS campaign, some information about active molecules is already available. Often, known active compounds (such as substrate analogues, natural products, inhibitors of a related protein, or ligands published by a pharmaceutical company) are identified in low-throughput validation studies of the biochemical target. In this study we evaluate the effectiveness of a support vector machine trained on such compounds and used to classify a collection with unknown activity. This approach is aimed at reducing the number of compounds to be tested against a given target. Our method predicts the biological activity of chemical compounds based only on atom-pair (AP) two-dimensional topological descriptors. The supervised support vector machine (SVM) method herein is trained on compounds from the MDL Drug Data Report (MDDR) known to be active for a specific protein target. For detailed analysis, five different biological targets were selected, including cyclooxygenase-2, dihydrofolate reductase, thrombin, HIV reverse transcriptase, and antagonists of the estrogen receptor. The accuracy of compound identification was estimated using recall and precision values. The sensitivities for all protein targets exceeded 80%, and the classification performance reached 100% for selected targets. In another application of the method, we addressed the absence of an initial set of active compounds for a selected protein target at the beginning of an HTS campaign. In such a case, virtual high-throughput screening (vHTS) is usually applied, using a flexible docking procedure. However, a vHTS experiment typically contains a large percentage of false positives that must be verified by costly and time-consuming experimental follow-up assays. The subsequent use of our machine learning method was found to improve the speed (since the docking procedure was not required for all compounds in the database) and also the accuracy of the HTS hit lists (the enrichment factor).
Christakis, Panos G; Braga-Mele, Rosa M
2012-02-01
To compare the intraoperative performance and postoperative outcomes of 3 phacoemulsification machines that use different modes. Kensington Eye Institute, Toronto, Ontario, Canada. Comparative case series. This chart and video review comprised consecutive eligible patients who had phacoemulsification by the same surgeon using a Whitestar Signature Ellips-FX (transversal), Infiniti-Ozil-IP (torsional), or Stellaris (longitudinal) machine. The review included 98 patients. Baseline characteristics in the groups were similar; the mean nuclear sclerosis grade was 2.0 ± 0.8. There were no significant intraoperative complications. The torsional machine averaged less phacoemulsification needle time (83 ± 33 seconds) than the transversal (99 ± 40 seconds; P=.21) or longitudinal (110 ± 45 seconds; P=.02) machines; the difference was accentuated in cases with high-grade nuclear sclerosis. The torsional machine had less chatter and better followability than the transversal or longitudinal machines (P<.001). The torsional and longitudinal machines had better anterior chamber stability than the transversal machine (P<.001). Postoperatively, the torsional machine yielded less central corneal edema than the transversal (P<.001) and longitudinal (P=.04) machines, corresponding to a smaller increase in mean corneal thickness (torsional 5%, transversal 10%, longitudinal 12%; P=.04). Also, the torsional machine had better 1-day postoperative visual acuities (P<.001). All 3 phacoemulsification machines were effective with no significant intraoperative complications. The torsional machine outperformed the transversal and longitudinal machines, with a lower mean needle time, less chatter, and improved followability. This corresponded to less corneal edema 1 day postoperatively and better visual acuity. Copyright © 2011 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Continuous performance measurement in flight systems. [sequential control model
NASA Technical Reports Server (NTRS)
Connelly, E. M.; Sloan, N. A.; Zeskind, R. M.
1975-01-01
The desired response of many man-machine control systems can be formulated as a solution to an optimal control synthesis problem where the cost index is given and the resulting optimal trajectories correspond to the desired trajectories of the man-machine system. Optimal control synthesis provides the reference criteria and the significance of error information required for performance measurement. The synthesis procedure described provides a continuous performance measure (CPM) which is independent of the mechanism generating the control action. Therefore, the technique provides a meaningful method for online evaluation of man's control capability in terms of total man-machine performance.
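Hedging on the report's exact notation, which the abstract does not reproduce, the construction can be written compactly: a cost index is fixed, its minimizing trajectory defines the reference, and the continuous measure scores departure from that optimum. The symbols below are illustrative, not the report's own.

```latex
\[
J(u) \;=\; \int_{t_0}^{t_f} g\bigl(x(t),\,u(t),\,t\bigr)\,dt,
\qquad \dot{x} = f(x,u,t),
\]
\[
u^{*} \;=\; \arg\min_{u} J(u),
\qquad
\mathrm{CPM}(t) \;\propto\; J\bigl(u_{\mathrm{operator}}\bigr) \;-\; J\bigl(u^{*}\bigr).
\]
```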
ERIC Educational Resources Information Center
Society of the Plastics Industry, Inc., Washington, DC.
Designed to guide training and curriculum development to prepare machine operators for the national certification exam, this publication identifies the important knowledge required for productive performance by a plastics machine operator. Introductory material discusses the rationale for a national standard, uses of the Body of Knowledge,…
Energy Savings and Persistence from an Energy Services Performance Contract at an Army Base
2011-10-01
control system upgrades, lighting retrofits, vending machine controls, and cooling tower variable frequency drives (VFDs). To accomplish the...controls were installed in the vending machines, and for the 87018 thermal plant, cooling tower VFDs were implemented. To develop baseline models...identify the reasons for improved or deteriorated energy performance of the buildings. For example, periodic submetering of the vending machines
Implementing Machine Learning in the PCWG Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clifton, Andrew; Ding, Yu; Stuart, Peter
The Power Curve Working Group (www.pcwg.org) is an ad-hoc industry-led group to investigate the performance of wind turbines in real-world conditions. As part of ongoing experience-sharing exercises, machine learning has been proposed as a possible way to predict turbine performance. This presentation provides some background information about machine learning and how it might be implemented in the PCWG exercises.
Jakobsen, Markus Due; Sundstrup, Emil; Andersen, Christoffer H; Bandholm, Thomas; Thorborg, Kristian; Zebis, Mette K; Andersen, Lars L
2012-12-01
While elastic resistance training targeting the upper body is effective for strength training, the effect of elastic resistance training on lower-body muscle activity remains questionable. The purpose of this study was to evaluate the EMG-angle relationship of the quadriceps muscle during 10-RM knee extensions performed with elastic tubing and an isotonic strength training machine. 7 women and 9 men aged 28-67 years (mean age 44 and 41 years, respectively) participated. Electromyographic (EMG) activity was recorded in 10 muscles during the concentric and eccentric contraction phases of a knee extension exercise performed with elastic tubing and in a training machine, and normalized to maximal voluntary isometric contraction (MVC) EMG (nEMG). Knee joint angle was measured during the exercises using electronic inclinometers (range of motion 0-90°). When comparing the machine and elastic resistance exercises, there were no significant differences in peak EMG of the rectus femoris (RF), vastus lateralis (VL), or vastus medialis (VM) during the concentric contraction phase. However, during the eccentric phase, peak EMG was significantly higher (p<0.01) in RF and VM when performing knee extensions using the training machine. In VL and VM the EMG-angle pattern differed between the two training modalities (significant angle-by-exercise interaction). When using elastic resistance, the EMG-angle pattern peaked towards full knee extension (0°), whereas the angle at peak EMG occurred closer to the knee flexion position (90°) during the machine exercise. Perceived loading (Borg CR10) was similar during knee extensions performed with elastic tubing (5.7±0.6) and knee extensions performed in the training machine (5.9±0.5). Knee extensions performed with elastic tubing induce similarly high (>70% nEMG) quadriceps muscle activity during the concentric contraction phase, but slightly lower activity during the eccentric contraction phase, compared with knee extensions performed using an isotonic training machine. During the concentric contraction phase, the two conditions displayed reciprocal EMG-angle patterns over the range of motion.
Permutation parity machines for neural cryptography.
Reyes, Oscar Mauricio; Zimmermann, Karl-Heinz
2010-06-01
Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.
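Permutation parity machines are a binary variant of the tree parity machine (TPM), and the key-exchange idea rests on the TPM synchronization step sketched below: both parties update, with a Hebbian rule on bounded integer weights, only when their parity outputs agree over a public random input. This is a generic TPM sketch, not the permutation variant itself, and the sizes are illustrative.

```python
import numpy as np

K, N, L = 3, 10, 3                 # hidden units, inputs per unit, weight bound
rng = np.random.default_rng(0)

def tpm_output(w, x):
    """Hidden-unit signs and overall parity output of a tree parity machine."""
    sigma = np.where(np.sum(w * x, axis=1) >= 0, 1, -1)
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    for k in range(K):
        if sigma[k] == tau:        # only units that matched the output learn
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, (K, N))
wB = rng.integers(-L, L + 1, (K, N))
for step in range(1, 100_000):
    x = rng.choice([-1, 1], (K, N))            # public random input
    sA, tauA = tpm_output(wA, x)
    sB, tauB = tpm_output(wB, x)
    if tauA == tauB:                           # parties update only on agreement
        hebbian_update(wA, x, sA, tauA)
        hebbian_update(wB, x, sB, tauB)
    if np.array_equal(wA, wB):
        print(f"synchronized after {step} inputs; shared key = {wA.ravel()}")
        break
```

The attacks named in the abstract (simple, geometric, majority, genetic) all try to track this update from the public inputs and outputs alone.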
Toward Intelligent Machine Learning Algorithms
1988-05-01
Machine learning is recognized as a tool for improving the performance of many kinds of systems, yet most machine learning systems themselves are not...directed systems, and with the addition of a knowledge store for organizing and maintaining knowledge to assist learning, a learning machine learning (L-ML) algorithm is possible. The necessary components of L-ML systems are presented along with several case descriptions of existing machine learning systems.
Evaluation of an Integrated Multi-Task Machine Learning System with Humans in the Loop
2007-01-01
machine learning components, natural language processing, and optimization...was examined with a test explicitly developed to measure the impact of integrated machine learning when used by a human user in a real-world setting...study revealed that integrated machine learning does produce a positive impact on overall performance. This paper also discusses how specific machine learning components contributed to human-system
Predicting hepatotoxicity using ToxCast in vitro bioactivity and ...
Background: The U.S. EPA ToxCast™ program is screening thousands of environmental chemicals for bioactivity using hundreds of high-throughput in vitro assays to build predictive models of toxicity. We represented chemicals based on bioactivity and chemical structure descriptors, then used supervised machine learning to predict their hepatotoxic effects. Results: A set of 677 chemicals were represented by 711 in vitro bioactivity descriptors (from ToxCast assays), 4,376 chemical structure descriptors (from QikProp, OpenBabel, PADEL, and PubChem), and three hepatotoxicity categories (from animal studies). Hepatotoxicants were defined by rat liver histopathology observed after chronic chemical testing and grouped into hypertrophy (161), injury (101) and proliferative lesions (99). Classifiers were built using six machine learning algorithms: linear discriminant analysis (LDA), Naïve Bayes (NB), support vector classification (SVM), classification and regression trees (CART), k-nearest neighbors (KNN) and an ensemble of classifiers (ENSMB). Classifiers of hepatotoxicity were built using chemical structure, ToxCast bioactivity, and a hybrid representation. Predictive performance was evaluated using 10-fold cross-validation testing and in-loop, filter-based, feature subset selection. Hybrid classifiers had the best balanced accuracy for predicting hypertrophy (0.78±0.08), injury (0.73±0.10) and proliferative lesions (0.72±0.09). Though chemical and bioactivity class
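A minimal sketch of cross-validation with in-loop, filter-based feature subset selection of the kind described, using scikit-learn; the synthetic data, the ANOVA filter and the linear SVM are assumptions chosen for illustration.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Synthetic stand-in for chemicals x (bioactivity + structure) descriptors.
X, y = make_classification(n_samples=677, n_features=500, n_informative=25,
                           random_state=0)

# Putting the filter inside the pipeline keeps feature selection
# "in loop": it is refit on each training fold, avoiding leakage.
model = Pipeline([
    ("filter", SelectKBest(f_classif, k=50)),
    ("svm", SVC(kernel="linear")),
])
scores = cross_val_score(model, X, y, cv=10, scoring="balanced_accuracy")
print(f"balanced accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")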
Prediction of bacterial associations with plants using a supervised machine-learning approach.
Martínez-García, Pedro Manuel; López-Solanilla, Emilia; Ramos, Cayo; Rodríguez-Palenzuela, Pablo
2016-12-01
Recent scenarios of fresh produce contamination by human enteric pathogens have resulted in severe food-borne outbreaks, and a new paradigm has emerged stating that some human-associated bacteria can use plants as secondary hosts. As a consequence, there has been growing concern in the scientific community about these interactions that have not yet been elucidated. Since this is a relatively new area, there is a lack of strategies to address the problem of food-borne illnesses due to the ingestion of fruits and vegetables. In the present study, we performed specific genome annotations to train a supervised machine-learning model that allows for the identification of plant-associated bacteria with a precision of ∼93%. The application of our method to approximately 9500 genomes predicted several unknown interactions between well-known human pathogens and plants, and it also confirmed several cases for which evidence has been reported. We observed that factors involved in adhesion, the deconstruction of the plant cell wall and detoxifying activities were highlighted as the most predictive features. The application of our strategy to sequenced strains that are involved in food poisoning can be used as a primary screening tool to determine the possible causes of contaminations. © 2016 Society for Applied Microbiology and John Wiley & Sons Ltd.
Machine Learning of Parameters for Accurate Semiempirical Quantum Chemical Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dral, Pavlo O.; von Lilienfeld, O. Anatole; Thiel, Walter
2015-05-12
We investigate possible improvements in the accuracy of semiempirical quantum chemistry (SQC) methods through the use of machine learning (ML) models for the parameters. For a given class of compounds, ML techniques require sufficiently large training sets to develop ML models that can be used for adapting SQC parameters to reflect changes in molecular composition and geometry. The ML-SQC approach allows the automatic tuning of SQC parameters for individual molecules, thereby improving the accuracy without deteriorating transferability to molecules with molecular descriptors very different from those in the training set. The performance of this approach is demonstrated for the semiempirical OM2 method using a set of 6095 constitutional isomers C7H10O2, for which accurate ab initio atomization enthalpies are available. The ML-OM2 results show improved average accuracy and a much reduced error range compared with those of standard OM2 results, with mean absolute errors in atomization enthalpies dropping from 6.3 to 1.7 kcal/mol. They are also found to be superior to the results from specific OM2 reparameterizations (rOM2) for the same set of isomers. The ML-SQC approach thus holds promise for fast and reasonably accurate high-throughput screening of materials and molecules.
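A schematic sketch of the ML-SQC idea, learning molecule-specific corrections to a baseline parameter from molecular descriptors; the kernel ridge regressor, the toy descriptors and the baseline value are illustrative assumptions, not the actual OM2 parameterization.

import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
n_mol, n_desc = 200, 16
X = rng.normal(size=(n_mol, n_desc))            # molecular descriptors
p0 = 1.85                                       # baseline SQC parameter (toy value)
delta_true = 0.05 * np.tanh(X[:, 0] - X[:, 1])  # hidden "ideal" correction

# Train an ML model to map descriptors -> parameter correction,
# so each molecule gets its own adapted parameter p0 + delta.
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)
model.fit(X, delta_true + rng.normal(0, 0.005, n_mol))

new_mol = rng.normal(size=(1, n_desc))
p_adapted = p0 + model.predict(new_mol)[0]
print(f"adapted parameter for new molecule: {p_adapted:.4f}")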
Ngo, Trieu-Du; Tran, Thanh-Dao; Le, Minh-Tri; Thai, Khac-Minh
2016-11-01
The human P-glycoprotein (P-gp) efflux pump is of great interest to medicinal chemists because of its important role in multidrug resistance (MDR). Because of the high polyspecificity of this transmembrane protein, and the unavailability of high-resolution X-ray crystal structures for it, ligand-based and structure-based approaches (machine learning, homology modeling, and molecular docking) were combined in this study. In the ligand-based approach, individual two-dimensional quantitative structure-activity relationship models were developed using different machine learning algorithms and subsequently combined into the Ensemble model, which showed good performance on both the diverse training set and the validation sets. The applicability domain and the prediction quality of the developed models were also judged using state-of-the-art methods and tools. In our structure-based approach, the P-gp structure and its binding region were predicted for a docking study to determine possible interactions between the ligands and the receptor. Based on these in silico tools, hit compounds for reversing MDR were discovered from the in-house and DrugBank databases through virtual screening, using the prediction models and molecular docking, in an attempt to restore cancer cell sensitivity to cytotoxic drugs.
Reynès, Christelle; Host, Hélène; Camproux, Anne-Claude; Laconde, Guillaume; Leroux, Florence; Mazars, Anne; Deprez, Benoit; Fahraeus, Robin; Villoutreix, Bruno O; Sperandio, Olivier
2010-03-05
Protein-protein interactions (PPIs) may represent one of the next major classes of therapeutic targets. So far, only a minute fraction of the estimated 650,000 PPIs that comprise the human interactome are known, with a tiny number of complexes being drugged. Such intricate biological systems cannot be cost-efficiently tackled using conventional high-throughput screening methods. Rather, the time has come to design new strategies that maximize the chance of hit identification through a rationalization of the PPI inhibitor chemical space and the design of PPI-focused compound libraries (global or target-specific). Here, we train machine-learning-based models, mainly decision trees, using a dataset of known PPI inhibitors and of regular drugs in order to determine a global physico-chemical profile for putative PPI inhibitors. This statistical analysis unravels two important molecular descriptors for PPI inhibitors, characterizing specific molecular shapes and the presence of a privileged number of aromatic bonds. The best model has been transposed into a computer program, PPI-HitProfiler, that can output from any drug-like compound collection a focused chemical library enriched in putative PPI inhibitors. Our PPI inhibitor profiler was challenged on the experimental screening results of 11 different PPIs, among which the p53/MDM2 interaction was screened within our own CDithem platform; in addition to validating our concept, this led to the identification of 4 novel p53/MDM2 inhibitors. Collectively, our tool shows robust behavior on the 11 experimental datasets, correctly profiling 70% of the experimentally identified hits while removing 52% of the inactive compounds from the initial compound collections. We strongly believe that this new tool can be used as a global PPI inhibitor profiler prior to screening assays, to reduce the size of the compound collections to be experimentally screened while keeping most of the true PPI inhibitors. PPI-HitProfiler is freely available on request from our CDithem platform website, www.CDithem.com.
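A minimal sketch of the kind of decision-tree profiling described, using two stand-in descriptors (a shape index and an aromatic bond count); the synthetic data and the thresholds the tree learns are illustrative, not PPI-HitProfiler's actual rules.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(2)
# Columns: [shape_descriptor, n_aromatic_bonds]; label 1 = PPI inhibitor.
ppi = np.column_stack([rng.normal(0.7, 0.1, 300), rng.normal(18, 3, 300)])
drug = np.column_stack([rng.normal(0.5, 0.1, 300), rng.normal(10, 3, 300)])
X = np.vstack([ppi, drug])
y = np.array([1] * 300 + [0] * 300)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["shape", "aromatic_bonds"]))

# Profile a compound collection: keep putative PPI inhibitors only.
library = np.column_stack([rng.normal(0.6, 0.15, 1000), rng.normal(14, 5, 1000)])
focused = library[tree.predict(library) == 1]
print(f"{len(focused)} of {len(library)} compounds kept for screening")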
Shi, Z; Ma, X H; Qin, C; Jia, J; Jiang, Y Y; Tan, C Y; Chen, Y Z
2012-02-01
Selective multi-target serotonin reuptake inhibitors enhance antidepressant efficacy. Their discovery can be facilitated by multiple methods, including in silico ones. In this study, we developed and tested an in silico method, combinatorial support vector machines (COMBI-SVMs), for virtual screening (VS) of multi-target serotonin reuptake inhibitors of seven target pairs (serotonin transporter paired with noradrenaline transporter, H(3) receptor, 5-HT(1A) receptor, 5-HT(1B) receptor, 5-HT(2C) receptor, melanocortin 4 receptor and neurokinin 1 receptor, respectively) from large compound libraries. COMBI-SVMs trained with 917-1951 individual target inhibitors correctly identified 22-83.3% (majority >31.1%) of the 6-216 dual inhibitors collected from the literature as independent testing sets. COMBI-SVMs showed moderate to good target selectivity, misclassifying as dual inhibitors 2.2-29.8% (majority <15.4%) of the individual target inhibitors of the same target pair and 0.58-7.1% of the inhibitors of the other 6 targets outside the target pair. COMBI-SVMs showed low dual inhibitor false hit rates (0.006-0.056%, 0.042-0.21%, 0.2-4%) in screening 17 million PubChem compounds, 168,000 MDDR compounds, and 7-8181 MDDR compounds similar to the dual inhibitors. Compared with similarity searching, k-NN and PNN methods, COMBI-SVM produced comparable dual inhibitor yields, similar target selectivity, and a lower false hit rate in screening 168,000 MDDR compounds. The annotated classes of many COMBI-SVM-identified MDDR virtual hits correlate with the reported effects of their predicted targets. COMBI-SVM is potentially useful for searching for selective multi-target agents without explicit knowledge of these agents. Copyright © 2011 Elsevier Inc. All rights reserved.
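A sketch of the combinatorial idea: pair two single-target SVMs and flag a compound as a putative dual inhibitor only when both classify it as an inhibitor. The random fingerprints and labels are placeholders; the real COMBI-SVMs were trained on curated inhibitor sets.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X1, y1 = rng.normal(size=(900, 64)), rng.integers(0, 2, 900)  # target-1 training set
X2, y2 = rng.normal(size=(900, 64)), rng.integers(0, 2, 900)  # target-2 training set

svm_t1 = SVC().fit(X1, y1)   # e.g. SERT inhibitor vs non-inhibitor (toy)
svm_t2 = SVC().fit(X2, y2)   # e.g. NET inhibitor vs non-inhibitor (toy)

def predict_dual(fingerprints):
    # A compound is a predicted dual inhibitor only if both
    # single-target SVMs classify it as an inhibitor.
    return (svm_t1.predict(fingerprints) == 1) & (svm_t2.predict(fingerprints) == 1)

library = rng.normal(size=(10000, 64))
hits = library[predict_dual(library)]
print(f"{len(hits)} virtual dual-inhibitor hits from {len(library)} compounds")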
The Lick-Gaertner automatic measuring system
NASA Technical Reports Server (NTRS)
Vasilevskis, S.; Popov, W. A.
1971-01-01
The Lick-Gaertner automatic equipment has been designed mainly for the measurement of stellar proper motions with reference to galaxies, and consists of two main components: the survey machine and the automatic measuring engine. The survey machine is used for initial inspection and selection of objects for subsequent measurement. Two plates, up to 17 x 17 inches each, are surveyed simultaneously by means of projection on a screen. The approximate positions of selected objects are measured by two optical screws: helical lines cut through an aluminum coating on glass cylinders. These approximate coordinates, accurate to the order of 0.03 mm, are transmitted to a card punch by encoders connected to the cylinders.
Molecular graph convolutions: moving beyond fingerprints
NASA Astrophysics Data System (ADS)
Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick
2016-08-01
Molecular "fingerprints" encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement.
Digital imaging biomarkers feed machine learning for melanoma screening.
Gareau, Daniel S; Correa da Rosa, Joel; Yagerman, Sarah; Carucci, John A; Gulati, Nicholas; Hueto, Ferran; DeFazio, Jennifer L; Suárez-Fariñas, Mayte; Marghoob, Ashfaq; Krueger, James G
2017-07-01
We developed an automated approach for generating quantitative image analysis metrics (imaging biomarkers) that are then analysed with a set of 13 machine learning algorithms to generate an overall risk score that is called a Q-score. These methods were applied to a set of 120 "difficult" dermoscopy images of dysplastic nevi and melanomas that were subsequently excised/classified. This approach yielded 98% sensitivity and 36% specificity for melanoma detection, approaching sensitivity/specificity of expert lesion evaluation. Importantly, we found strong spectral dependence of many imaging biomarkers in blue or red colour channels, suggesting the need to optimize spectral evaluation of pigmented lesions. © 2016 The Authors. Experimental Dermatology Published by John Wiley & Sons Ltd.
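One simple way to combine many classifiers into a single risk score, in the spirit of the Q-score, is to average their predicted probabilities; the three-model ensemble and the synthetic biomarker vectors below are assumptions (the paper used 13 algorithms on real imaging biomarkers).

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for imaging-biomarker vectors (label 1 = melanoma).
X, y = make_classification(n_samples=120, n_features=30, random_state=0)
X_train, y_train, X_new = X[:100], y[:100], X[100:]

models = [LogisticRegression(max_iter=1000),
          RandomForestClassifier(random_state=0),
          KNeighborsClassifier()]
# Average each model's melanoma probability into one risk score per lesion.
probs = [m.fit(X_train, y_train).predict_proba(X_new)[:, 1] for m in models]
q_score = np.mean(probs, axis=0)
print(np.round(q_score, 2))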
Amazing structure of respirasome: unveiling the secrets of cell respiration.
Guo, Runyu; Gu, Jinke; Wu, Meng; Yang, Maojun
2016-12-01
The respirasome, a huge molecular machine that carries out cellular respiration, has gained growing attention since its discovery, because respiration is the most indispensable biological process in almost all living creatures. The concept of the respirasome has renewed our understanding of respiratory chain organization, and most recently, the structure of the respirasome solved by Yang's group at Tsinghua University (Gu et al. Nature 537(7622):639-643, 2016) presented for the first time the detailed interactions within this huge molecular machine and provided important information for drug design and screening. The study of cellular respiration, however, has a long history. Here, we briefly trace the winding history of respiratory chain investigation, and then describe the amazing structure of the respirasome.
Early experiences in developing and managing the neuroscience gateway.
Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T
2015-02-01
The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and the associated cyberinfrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and handling data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research with high performance computing at the back end. We also look at the parallel scaling of some publicly available neuronal models and analyze recent usage data of the gateway.
Co-Located Collaborative Learning Video Game with Single Display Groupware
ERIC Educational Resources Information Center
Infante, Cristian; Weitz, Juan; Reyes, Tomas; Nussbaum, Miguel; Gomez, Florencia; Radovic, Darinka
2010-01-01
Role Game is a co-located CSCL video game played by three students sitting at one machine sharing a single screen, each with their own input device. Inspired by video console games, Role Game enables students to learn by doing, acquiring social abilities and mastering subject matter in a context of co-located collaboration. After describing the…
ERIC Educational Resources Information Center
Cohen, Ira L.; Liu, Xudong; Hudson, Melissa; Gillis, Jennifer; Cavalari, Rachel N. S.; Romanczyk, Raymond G.; Karmel, Bernard Z.; Gardner, Judith M.
2017-01-01
The PDD Behavior Inventory (PDDBI) has recently been shown, in a large multisite study, to discriminate well between autism spectrum disorder (ASD) and other groups when its scores were examined using a machine learning tool, Classification and Regression Trees (CART). Discrimination was good for toddlers, preschoolers, and school-age children;…
Cutawl Techniques and Silk Screen; Commercial and Advertising Art--Intermediate: 9185.03.
ERIC Educational Resources Information Center
Dade County Public Schools, Miami, FL.
The course is comprised of two comprehensive courses totaling 135 hours of classwork. Orientation to commercial and advertising art is a necessary prerequisite to entry into the course. The first half of the course introduces the student to the function and operation of the cutawl machine. Through supervised classroom practice, the student…
Hardware assisted hypervisor introspection.
Shi, Jiangyong; Yang, Yuexiang; Tang, Chuan
2016-01-01
In this paper, we introduce hypervisor introspection, an out-of-box way to monitor the execution of hypervisors. Similar to virtual machine introspection, which has been proposed to protect virtual machines in an out-of-box way over the past decade, hypervisor introspection can be used to protect hypervisors, which are the basis of cloud security. Virtual machine introspection tools are usually deployed either in the hypervisor or in privileged virtual machines, which might also be compromised. By utilizing hardware support including nested virtualization, EPT protection and #BP, we are able to monitor all hypercalls belonging to the virtual machines of one hypervisor, including those of the privileged virtual machine, even when the hypervisor is compromised. Furthermore, a hypercall injection method is used to simulate hypercall-based attacks and evaluate the performance of our method. Experiment results show that our method can effectively detect hypercall-based attacks at some performance cost. Lastly, we discuss future approaches to reducing the performance cost and preventing a compromised hypervisor from detecting the existence of our introspector, along with some new scenarios in which to apply our hypervisor introspection system.
NASA Astrophysics Data System (ADS)
Mehmood, Shahid; Shah, Masood; Pasha, Riffat Asim; Sultan, Amir
2017-10-01
The effect of electric discharge machining (EDM) on surface quality, and consequently on the fatigue performance of Al 2024 T6, is investigated. Five levels of discharge current are analyzed, while all other electrical and nonelectrical parameters are kept constant. At each discharge current level, dog-bone specimens are machined by generating a peripheral notch at the center. The fatigue tests are performed on a four-point rotating bending machine at room temperature. For comparison purposes, fatigue tests are also performed on conventionally machined specimens. Linearized S-N curves for 95% failure probability and four different confidence levels (75, 90, 95 and 99%) are plotted for each discharge current level as well as for the conventionally machined specimens. These plots show that the electric discharge machined (EDMed) specimens exhibit inferior fatigue behavior compared with conventionally machined specimens. Moreover, discharge current inversely affects fatigue life, and this influence is most pronounced at lower stresses. The EDMed surfaces are characterized by surface properties that could be responsible for the change in fatigue life, such as surface morphology, surface roughness, white layer thickness, microhardness and residual stresses. It is found that all these surface properties are affected by changing the discharge current level. However, the change in fatigue life with discharge current could not be attributed independently to any single surface property.
Testing of Anesthesia Machines and Defibrillators in Healthcare Institutions.
Gurbeta, Lejla; Dzemic, Zijad; Bego, Tamer; Sejdic, Ervin; Badnjevic, Almir
2017-09-01
To improve the quality of patient treatment by improving the functionality of medical devices in healthcare institutions, and to present the results of safety and performance inspections of patient-relevant output parameters of anesthesia machines and defibrillators as defined by legal metrology. This study covered 130 anesthesia machines and 161 defibrillators used in public and private healthcare institutions over a period of two years. Testing procedures were carried out according to international standards and the legal metrology legislative procedures in Bosnia and Herzegovina. The results show that for 13.84% of tested anesthesia machines and 14.91% of defibrillators, device performance is not in accordance with requirements, and the device should either have its results verified, be removed from use, or be scheduled for corrective maintenance. The research emphasizes the importance of independent safety and performance inspections, and gives recommendations for the frequency of inspection based on the measurements. The results offer implications for the adequacy of preventive and corrective maintenance performed in healthcare institutions. Based on the collected data, the first digital electronic database of anesthesia machines and defibrillators used in healthcare institutions in Bosnia and Herzegovina was created. This database is a useful tool for tracking each device's performance over time.
32 CFR 701.53 - FOIA fee schedule.
Code of Federal Regulations, 2014 CFR
2014-07-01
... human time) and machine time. (1) Human time. Human time is all the time spent by humans performing the...) Machine time. Machine time involves only direct costs of the central processing unit (CPU), input/output... exist to calculate CPU time, no machine costs can be passed on to the requester. When CPU calculations...
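The fee logic in these schedules separates human time from machine (CPU) time; a toy calculation under assumed rates (the dollar figures are placeholders, not the regulation's values) could look like this.

def foia_fee(human_hours, hourly_rate, cpu_seconds, cpu_rate_per_second):
    """Total chargeable fee = human time cost + machine (CPU) time cost.

    If no means exists to calculate CPU time, the machine component
    is zero and only human time is charged.
    """
    human_cost = human_hours * hourly_rate
    machine_cost = cpu_seconds * cpu_rate_per_second
    return human_cost + machine_cost

# Placeholder rates for illustration only:
print(f"fee = ${foia_fee(2.5, 44.00, 180, 0.02):.2f}")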
32 CFR 701.53 - FOIA fee schedule.
Code of Federal Regulations, 2012 CFR
2012-07-01
... human time) and machine time. (1) Human time. Human time is all the time spent by humans performing the...) Machine time. Machine time involves only direct costs of the central processing unit (CPU), input/output... exist to calculate CPU time, no machine costs can be passed on to the requester. When CPU calculations...
32 CFR 518.20 - Collection of fees and fee rates.
Code of Federal Regulations, 2014 CFR
2014-07-01
...; individual time (hereafter referred to as human time), and machine time. (i) Human time. Human time is all the time spent by humans performing the necessary tasks to prepare the job for a machine to execute..., programmer, database administrator, or action officer). (ii) Machine time. Machine time involves only direct...
32 CFR 518.20 - Collection of fees and fee rates.
Code of Federal Regulations, 2012 CFR
2012-07-01
...; individual time (hereafter referred to as human time), and machine time. (i) Human time. Human time is all the time spent by humans performing the necessary tasks to prepare the job for a machine to execute..., programmer, database administrator, or action officer). (ii) Machine time. Machine time involves only direct...
32 CFR 518.20 - Collection of fees and fee rates.
Code of Federal Regulations, 2013 CFR
2013-07-01
...; individual time (hereafter referred to as human time), and machine time. (i) Human time. Human time is all the time spent by humans performing the necessary tasks to prepare the job for a machine to execute..., programmer, database administrator, or action officer). (ii) Machine time. Machine time involves only direct...
32 CFR 701.53 - FOIA fee schedule.
Code of Federal Regulations, 2013 CFR
2013-07-01
... human time) and machine time. (1) Human time. Human time is all the time spent by humans performing the...) Machine time. Machine time involves only direct costs of the central processing unit (CPU), input/output... exist to calculate CPU time, no machine costs can be passed on to the requester. When CPU calculations...
Standard surface grinder for precision machining of thin-wall tubing
NASA Technical Reports Server (NTRS)
Jones, A.; Kotora, J., Jr.; Rein, J.; Smith, S. V.; Strack, D.; Stuckey, D.
1967-01-01
Standard surface grinder performs precision machining of thin-wall stainless steel tubing by electrical discharge grinding. A related adaptation, a traveling wire electrode fixture, is used for machining slots in thin-walled tubing.
Micro-optical fabrication by ultraprecision diamond machining and precision molding
NASA Astrophysics Data System (ADS)
Li, Hui; Li, Likai; Naples, Neil J.; Roblee, Jeffrey W.; Yi, Allen Y.
2017-06-01
Ultraprecision diamond machining and high-volume molding of affordable, high-precision, high-performance optical elements are becoming a viable process in the optical industry for low-cost, high-quality micro-optical component manufacturing. In this process, high-precision micro-optical molds are first fabricated using ultraprecision single-point diamond machining, followed by high-volume production methods such as compression or injection molding. In the last two decades, there have been steady improvements in ultraprecision machine design and performance, particularly with the introduction of both slow tool and fast tool servo. Today optical molds, including freeform surfaces and microlens arrays, are routinely diamond machined to final finish without post-machining polishing. For consumers, compression molding or injection molding provides efficient and high-quality optics at extremely low cost. In this paper, ultraprecision machine design and machining processes such as slow tool and fast tool servo are described first, then both compression molding and injection molding of polymer optics are discussed. To implement precision optical manufacturing by molding, numerical modeling can be included in the future as a critical part of the manufacturing process to ensure high product quality.
Machine characterization and benchmark performance prediction
NASA Technical Reports Server (NTRS)
Saavedra-Barrera, Rafael H.
1988-01-01
From runs of standard benchmarks or benchmark suites, it is not possible to characterize the machine nor to predict the run time of other benchmarks which have not been run. A new approach to benchmarking and machine characterization is reported. The creation and use of a machine analyzer is described, which measures the performance of a given machine on FORTRAN source language constructs. The machine analyzer yields a set of parameters which characterize the machine and spotlight its strong and weak points. Also described is a program analyzer, which analyzes FORTRAN programs and determines the frequency of execution of each of the same set of source language operations. It is then shown that by combining a machine characterization and a program characterization, we are able to predict with good accuracy the run time of a given benchmark on a given machine. Characterizations are provided for the Cray X-MP/48, Cyber 205, IBM 3090/200, Amdahl 5840, Convex C-1, VAX 8600, VAX 11/785, VAX 11/780, SUN 3/50, and IBM RT-PC/125, and for the following benchmark programs or suites: Los Alamos (BMK8A1), Baskett, Linpack, Livermore Loops, Mandelbrot Set, NAS Kernels, Shell Sort, Smith, Whetstone and Sieve of Eratosthenes.
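The prediction scheme reduces to a dot product: the estimated run time is the sum over source-language operations of operation count times the per-operation time measured by the machine analyzer. A toy sketch with made-up numbers:

# Per-operation times from a hypothetical machine characterization (seconds).
machine_params = {"fadd": 9.0e-9, "fmul": 1.1e-8, "branch": 4.0e-9, "load": 6.0e-9}

# Operation counts from a hypothetical program characterization.
program_counts = {"fadd": 4.2e9, "fmul": 3.9e9, "branch": 1.0e9, "load": 7.5e9}

# Predicted run time = sum over operations of count * per-operation time.
predicted_runtime = sum(program_counts[op] * machine_params[op]
                        for op in machine_params)
print(f"predicted run time: {predicted_runtime:.2f} s")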
Parodi, Stefano; Manneschi, Chiara; Verda, Damiano; Ferrari, Enrico; Muselli, Marco
2018-03-01
This study evaluates the performance of a set of machine learning techniques in predicting the prognosis of Hodgkin's lymphoma using clinical factors and gene expression data. Analysed samples from 130 Hodgkin's lymphoma patients included a small set of clinical variables and more than 54,000 gene features. Machine learning classifiers included three black-box algorithms (k-nearest neighbour, Artificial Neural Network, and Support Vector Machine) and two methods based on intelligible rules (Decision Tree and the innovative Logic Learning Machine method). Support Vector Machine clearly outperformed any of the other methods. Among the two rule-based algorithms, Logic Learning Machine performed better and identified a set of simple intelligible rules based on a combination of clinical variables and gene expressions. Decision Tree identified a non-coding gene (XIST) involved in the early phases of X chromosome inactivation that was overexpressed in females and in non-relapsed patients. XIST expression might be responsible for the better prognosis of female Hodgkin's lymphoma patients.
Reverse time migration: A seismic processing application on the connection machine
NASA Technical Reports Server (NTRS)
Fiebrich, Rolf-Dieter
1987-01-01
The implementation of a reverse time migration algorithm on the Connection Machine, a massively parallel computer, is described. Essential architectural features of this machine as well as programming concepts are presented. The data structures and parallel operations for the implementation of the reverse time migration algorithm are described. The algorithm matches the Connection Machine architecture closely and executes at almost the peak performance of this machine.
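Reverse time migration propagates recorded wavefields backwards in time with the same finite-difference stencil used for forward modelling, which suits data-parallel machines because every grid point updates identically. A minimal 1D sketch; the velocity, grid and recorded trace are synthetic placeholders.

import numpy as np

nx, nt, dx, dt, c = 200, 500, 10.0, 1e-3, 3000.0  # grid, steps, velocity
r = (c * dt / dx) ** 2                            # CFL-stable: c*dt/dx = 0.3

# Synthetic "recorded" trace at the surface receiver (x index 0).
t = np.arange(nt) * dt
rec = (1 - 2 * (np.pi * 25 * (t - 0.1)) ** 2) * np.exp(-(np.pi * 25 * (t - 0.1)) ** 2)

u_prev = np.zeros(nx)
u_curr = np.zeros(nx)
for it in reversed(range(nt)):        # step backwards through time
    lap = np.zeros(nx)
    lap[1:-1] = u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]
    u_next = 2 * u_curr - u_prev + r * lap   # second-order wave update
    u_next[0] += rec[it]              # inject recorded data at the receiver
    u_prev, u_curr = u_curr, u_next

print("back-propagated wavefield energy:", float(np.sum(u_curr ** 2)))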
Fatigue Life Variability in Large Aluminum Forgings with Residual Stress
2011-07-01
been conducted. A detailed finite element analysis of the forge/quench/coldwork/machine process was performed in order to predict the bulk residual stresses in a fictitious aluminum bulkhead. The residual... continues to develop the capability for computational simulation of the forge, quench, cold work and machining processes. In order to handle the
Marucci-Wellman, Helen R; Corns, Helen L; Lehto, Mark R
2017-01-01
Injury narratives are now available in real time and include useful information for injury surveillance and prevention. However, manual classification of the cause or events leading to injury found in large batches of narratives, such as workers compensation claims databases, can be prohibitive. In this study we compare the utility of four machine learning algorithms (Naïve Bayes with single-word and bi-gram models, Support Vector Machine and Logistic Regression) for classifying narratives into Bureau of Labor Statistics Occupational Injury and Illness event-leading-to-injury classifications for a large workers compensation database. These algorithms are known to do well classifying narrative text and are fairly easy to implement with off-the-shelf software packages such as Python. We propose human-machine learning ensemble approaches which maximize the power and accuracy of the algorithms for machine-assigned codes and allow for strategic filtering of rare, emerging or ambiguous narratives for manual review. We compare human-machine approaches based on filtering on the prediction strength of the classifier vs. agreement between algorithms. Regularized Logistic Regression (LR) was the best performing algorithm alone. Using this algorithm and filtering out the bottom 30% of predictions for manual review resulted in high accuracy (overall sensitivity/positive predictive value of 0.89) for the final machine-human coded dataset. The best pairings of algorithms included Naïve Bayes with Support Vector Machine, whereby the triple ensemble requiring agreement among the single-word Naïve Bayes, bi-gram Naïve Bayes and SVM models (NB-SW = NB-BI-GRAM = SVM) had very high performance (0.93 overall sensitivity/positive predictive value), with high sensitivity and positive predictive values across both large and small categories, leaving 41% of the narratives for manual review. Integrating LR into this ensemble mix improved performance only slightly. For large administrative datasets we propose incorporation of methods based on human-machine pairings such as we have done here, utilizing readily available off-the-shelf machine learning techniques and resulting in only a fraction of narratives that require manual review. Human-machine ensemble methods are likely to improve performance over total manual coding. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
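A sketch of the agreement-based filtering described: auto-accept a narrative's machine-assigned code only when independent classifiers agree, and route the rest to manual review. The scikit-learn text models mirror the single-word/bi-gram Naïve Bayes and SVM pairing, but the tiny corpus and event codes are illustrative.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

narratives = ["slipped on wet floor and fell", "lifting box strained back",
              "cut finger on box cutter", "fell from ladder onto floor",
              "struck by falling pallet", "twisted ankle stepping off curb"]
codes = ["fall", "overexertion", "cut", "fall", "struck_by", "overexertion"]

# Single-word and bi-gram representations, as in the NB variants compared.
vec_sw = CountVectorizer(ngram_range=(1, 1)).fit(narratives)
vec_bi = CountVectorizer(ngram_range=(1, 2)).fit(narratives)

nb_sw = MultinomialNB().fit(vec_sw.transform(narratives), codes)
nb_bi = MultinomialNB().fit(vec_bi.transform(narratives), codes)
svm = LinearSVC().fit(vec_sw.transform(narratives), codes)

new = ["worker fell off ladder", "deep cut while opening cartons"]
preds = np.array([nb_sw.predict(vec_sw.transform(new)),
                  nb_bi.predict(vec_bi.transform(new)),
                  svm.predict(vec_sw.transform(new))])
for i, doc in enumerate(new):
    votes = preds[:, i]
    if len(set(votes)) == 1:          # all three classifiers agree
        print(f"auto-code {votes[0]!r}: {doc!r}")
    else:                             # disagreement -> manual review
        print(f"manual review: {doc!r}")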
Cold machining of high density tungsten and other materials
NASA Technical Reports Server (NTRS)
Ziegelmeier, P.
1969-01-01
The cold machining process, which uses a sub-zero refrigerated cutting fluid, is used for machining refractory or reactive metals and alloys. Special carbide tools for turning and drilling these alloys further improve cutting performance.
NASA Astrophysics Data System (ADS)
Muralidhara, .; Vasa, Nilesh J.; Singaperumal, M.
2010-02-01
A micro-electro-discharge machine (micro EDM) was developed incorporating a piezo-actuated direct-drive tool feed mechanism for the micromachining of silicon using a copper tool. Both tool and workpiece material are removed during the micro EDM process, which demands a tool wear compensation technique to reach the specified depth of machining on the workpiece. An in-situ axial tool wear and machining depth measurement system was developed to investigate variations of the axial wear ratio with machining depth. Stepwise micromachining experiments on a silicon wafer were performed to investigate the variations in silicon removal and tool wear depths with increasing tool feed. Based on these experimental data, a tool wear compensation method is proposed to reach the desired depth of micromachining on silicon with a copper tool. Micromachining experiments performed with the proposed tool wear compensation method showed a maximum workpiece machining depth variation of 6%.
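The compensation logic is simple arithmetic once the axial wear ratio is measured in situ: to leave the desired depth on the workpiece, the tool must be fed by the target depth plus the wear it will incur on the way. A sketch under the assumption of a constant wear ratio over the step:

def compensated_feed(target_depth_um, wear_ratio):
    """Axial tool feed needed to reach the target machining depth.

    wear_ratio = (axial tool wear) / (workpiece depth machined),
    assumed constant over the step (measured in situ in practice).
    Feed = workpiece depth + tool wear = depth * (1 + wear_ratio).
    """
    return target_depth_um * (1.0 + wear_ratio)

# Example: 50 um target depth with a measured wear ratio of 0.35.
print(f"feed the tool {compensated_feed(50.0, 0.35):.1f} um")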
NASA Astrophysics Data System (ADS)
Haikal Ahmad, M. A.; Zulafif Rahim, M.; Fauzi, M. F. Mohd; Abdullah, Aslam; Omar, Z.; Ding, Songlin; Ismail, A. E.; Rasidi Ibrahim, M.
2018-01-01
Polycrystalline diamond (PCD) is regarded as among the hardest materials in the world. Electrical discharge machining (EDM) is typically used to machine this material because of its non-contact process nature. This investigation was carried out to compare the EDM performance on PCD of a normal copper (Cu) electrode and a newly proposed graphitization-catalyst electrode of copper nickel (CuNi). A two-level full factorial design of experiments with 4 centre points was used to study the main and interaction effects of the machining parameters, namely pulse-on time, pulse-off time, sparking current, and electrode material (a categorical factor). The paper reports an interesting finding: the newly proposed electrode had a positive impact on machining performance. With the same finishing parameters, CuNi delivered more than 100% better Ra and MRR than the ordinary Cu electrode.
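The experimental plan, a two-level full factorial over the three numeric parameters crossed with the categorical electrode factor, plus centre points, can be enumerated directly; the low/high levels below are placeholders, not the study's settings.

from itertools import product

# Placeholder low/high levels for the three numeric machining parameters.
pulse_on = (2, 8)        # us
pulse_off = (4, 16)      # us
current = (1, 4)         # A
electrodes = ("Cu", "CuNi")

runs = [dict(pulse_on=a, pulse_off=b, current=c, electrode=e)
        for a, b, c in product(pulse_on, pulse_off, current)
        for e in electrodes]

# Four centre points per electrode (numeric factors at mid-level).
for e in electrodes:
    runs += 4 * [dict(pulse_on=5, pulse_off=10, current=2.5, electrode=e)]

print(len(runs), "runs")  # 2^3 x 2 = 16 factorial runs + 8 centre points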
Stylianou, Neophytos; Akbarov, Artur; Kontopantelis, Evangelos; Buchan, Iain; Dunn, Ken W
2015-08-01
Predicting mortality from burn injury has traditionally employed logistic regression models. Alternative machine learning methods have been introduced in some areas of clinical prediction as the necessary software and computational facilities have become accessible. Here we compare logistic regression and machine learning predictions of mortality from burns. An established logistic mortality model was compared to machine learning methods (artificial neural network, support vector machine, random forests and naïve Bayes) using a population-based (England & Wales) case-cohort registry. Predictive evaluation used: area under the receiver operating characteristic curve; sensitivity; specificity; positive predictive value; and Youden's index. All methods had comparable discriminatory abilities, with similar sensitivities, specificities and positive predictive values. Although some machine learning methods performed marginally better than logistic regression, the differences were seldom statistically significant and clinically insubstantial. Random forests were marginally better for high positive predictive value and reasonable sensitivity. Neural networks yielded slightly better prediction overall. Logistic regression gives an optimal mix of performance and interpretability. The established logistic regression model of burn mortality performs well against more complex alternatives. Clinical prediction with a small set of strong, stable, independent predictors is unlikely to gain much from machine learning outside specialist research contexts. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
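Youden's index, one of the evaluation measures listed, combines sensitivity and specificity into a single number, J = sensitivity + specificity - 1; a small helper with illustrative confusion counts:

def youden_index(tp, fn, tn, fp):
    """J = sensitivity + specificity - 1 (0 = chance, 1 = perfect)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity + specificity - 1.0

# Example confusion counts (illustrative, not the registry's figures):
print(f"J = {youden_index(tp=90, fn=10, tn=800, fp=100):.2f}")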
Ozcift, Akin; Gulten, Arif
2011-12-01
Improving the accuracy of machine learning algorithms is vital in designing high-performance computer-aided diagnosis (CADx) systems. Research has shown that a base classifier's performance can be enhanced by ensemble classification strategies. In this study, we construct rotation forest (RF) ensemble classifiers of 30 machine learning algorithms to evaluate their classification performance on Parkinson's, diabetes and heart disease datasets from the literature. In the experiments, first the feature dimensionality of the three datasets is reduced using the correlation-based feature selection (CFS) algorithm. Second, the classification performance of the 30 machine learning algorithms is calculated for the three datasets. Third, 30 classifier ensembles are constructed based on the RF algorithm to assess the performance of the respective classifiers on the same disease data. All the experiments are carried out with a leave-one-out validation strategy, and the performance of the 60 algorithms is evaluated using three metrics: classification accuracy (ACC), kappa error (KE) and area under the receiver operating characteristic (ROC) curve (AUC). Base classifiers achieved average accuracies of 72.15%, 77.52% and 84.43% for the diabetes, heart and Parkinson's datasets, respectively, while the RF classifier ensembles produced average accuracies of 74.47%, 80.49% and 87.13% for the respective diseases. RF, a newly proposed classifier ensemble algorithm, might be used to improve the accuracy of miscellaneous machine learning algorithms in designing advanced CADx systems. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Audit of mammography requests in Abakaliki, South-East Nigeria.
Eni, U E; Ekwedigwe, K C; Sunday-Adeoye, I; Daniyan, Abc; Isikhuemen, M E
2017-03-07
Breast cancer is the leading cancer in women in both developed and developing countries. Screening mammography detects breast cancer even before a lump can be palpated, with better prognosis. Despite its importance, the adoption of mammographic screening for breast cancer has been slow, and it remains virtually non-existent in many parts of Sub-Saharan Africa, including Nigeria. For this reason, the indications for mammography have not been well defined in our setting. The aim of this study was to audit our mammography requests, with a view to improving their application in our setting. This is a descriptive study carried out on 69 female patients who had mammography at the National Obstetric Fistula Centre, Abakaliki, from January 2014 to December 2015. Findings on clinical examination were entered in a proforma. Mammography was performed in craniocaudal and mediolateral views using the Lorad M-IV (film-screen) mammography machine. Data were analysed using the Statistical Package for the Social Sciences (SPSS) version 21. All 69 patients were female, with a mean age of 42.1 ± 11 years. The majority of the patients (69.6%) were between 30 and 49 years. The commonest indication for mammography was a breast lump, found in 46 patients (66.7%). Breast pain was present in 36 patients (52.2%). The Breast Imaging Reporting and Data System (BIRADS) categories were BIRADS 0: 20 (28.99%), BIRADS 1: 8 (11.59%), BIRADS 2: 9 (13.04%), BIRADS 3: 4 (5.8%), BIRADS 4: 19 (27.54%) and BIRADS 5: 9 (13.04%). Diagnostic mammography remains the commonest indication for mammography in our setting. Public awareness, poverty reduction and the ready availability of mammography facilities are required to improve screening mammography in our setting.
Processing and consolidation of copper/tungsten
Chen, Ching-Fong; Pokharel, Reeju; Brand, Michael J.; ...
2016-09-27
Here, we developed a copper/tungsten (Cu/W) composite for mesoscale materials science applications using the novel high-energy diffraction microscopy (HEDM) technique. Argon-atomized copper powder was selected as the starting raw powder and screened to remove the extremely large particle fraction. Tungsten particles were collected by milling and screening the -325 mesh tungsten powder between 500 and 635 mesh sieves. Hot pressing of the screened Cu powder was performed at 900 °C in an Ar/4%H2 atmosphere. XRD and ICP results show that the hot-pressed Cu sample consists of about 5 vol% Cu2O, which is caused by the presence of oxygen on the surface of the starting Cu powder. Hot pressing the copper powder in a pure hydrogen atmosphere was successful in removing most of the surface oxygen. Our process was also implemented for hot pressing the Cu/W composite. The density of the Cu/W composites hot pressed at 950 °C in pure hydrogen was about 94% of the theoretical density (TD). The hot-pressed Cu/W composites were further hot isostatic pressed at 1050 °C in an argon atmosphere, which resulted in 99.6% of the TD with the designed Cu grain size and W particle distribution. Tensile specimens with a D-notch were machined using the wire EDM method. The processing and consolidation of these materials are discussed in detail, and the HEDM images are also shown and discussed.
Pham-The, H; Casañola-Martin, G; Diéguez-Santana, K; Nguyen-Hai, N; Ngoc, N T; Vu-Duc, L; Le-Thi-Thu, H
2017-03-01
Histone deacetylases (HDAC) are emerging as promising targets in cancer, neuronal diseases and immune disorders. Computational modelling approaches have been widely applied for the virtual screening and rational design of novel HDAC inhibitors. In this study, different machine learning (ML) techniques were applied to develop models that accurately discriminate HDAC2 inhibitors from non-inhibitors. The obtained models showed encouraging results, with the global accuracy on the external set ranging from 0.83 to 0.90. Various aspects related to the comparison of modelling techniques, applicability domain and descriptor interpretations were discussed. Finally, consensus predictions of these models were used for screening HDAC2 inhibitors from four chemical libraries whose bioactivities against HDAC1, HDAC3, HDAC6 and HDAC8 were already known. According to the results of the virtual screening assays, structures of some hits with pair-isoform-selective activity (between HDAC2 and other HDACs) were revealed. This study illustrates the power of ML-based QSAR approaches for the screening and discovery of potent, isoform-selective HDACIs.
Mancia, Annalaura; Ryan, James C; Van Dolah, Frances M; Kucklick, John R; Rowles, Teresa K; Wells, Randall S; Rosel, Patricia E; Hohn, Aleta A; Schwacke, Lori H
2014-09-01
As top-level predators, common bottlenose dolphins (Tursiops truncatus) are particularly sensitive to chemical and biological contaminants that accumulate and biomagnify in the marine food chain. This work investigates the potential use of microarray technology and gene expression profile analysis to screen common bottlenose dolphins for exposure to environmental contaminants through the immunological and/or endocrine perturbations associated with these agents. A dolphin microarray representing 24,418 unigene sequences was used to analyze blood samples collected from 47 dolphins during capture-release health assessments from five different US coastal locations (Beaufort, NC, Sarasota Bay, FL, Saint Joseph Bay, FL, Sapelo Island, GA and Brunswick, GA). Organohalogen contaminants including pesticides, polychlorinated biphenyl congeners (PCBs) and polybrominated diphenyl ether congeners were determined in blubber biopsy samples from the same animals. A subset of samples (n = 10, males; n = 8, females) with the highest and the lowest measured values of PCBs in their blubber was used as strata to determine the differential gene expression of the exposure extremes through machine learning classification algorithms. A set of genes associated primarily with nuclear and DNA stability, cell division and apoptosis regulation, intra- and extra-cellular traffic, and immune response activation was selected by the algorithm for identifying the two exposure extremes. In order to test the hypothesis that these gene expression patterns reflect PCB exposure, we next investigated the blood transcriptomes of the remaining dolphin samples using machine-learning approaches, including K-nn and Support Vector Machines classifiers. Using the derived gene sets, the algorithms worked very well (100% success rate) at classifying dolphins according to the contaminant load accumulated in their blubber. These results suggest that gene expression profile analysis may provide a valuable means to screen for indicators of chemical exposure. Copyright © 2014 Elsevier Ltd. All rights reserved.
Machine Learning Based Malware Detection
2015-05-18
A Trident Scholar Project Report, No. 440: Machine Learning Based Malware Detection, by Midshipman 1/C Zane A. Markel, USN. ...suitably be projected into realistic performance. This work explores several aspects of machine learning based malware detection. First, we
Screening on oil-decomposing microorganisms and application in organic waste treatment machine.
Lu, Yi-Tong; Chen, Xiao-Bin; Zhou, Pei; Li, Zhen-Hong
2005-01-01
Y3, an oil-decomposing mixture of two bacterial strains (Bacillus sp. and Pseudomonas sp.), was isolated after 50 d of domestication under conditions in which oil was the limiting carbon source. The decomposition rate achieved by Y3 was higher than that of either individual strain, indicating a synergistic effect of the two bacteria. Under the conditions T = 25-40 degrees C, pH = 6-8, HRT (hydraulic retention time) = 36 h and an oil concentration of 0.1%, Y3 yielded the highest decomposition rate of 95.7%. Y3 was also applied in an organic waste treatment machine, with a certain proportion of activated bacteria added to the stuffing. A series of tests including humidity, pH, temperature, C/N ratio and oil percentage of the stuffing were carried out to check the efficacy of oil decomposition. Results showed that the oil content of the stuffing with the inoculum was only half that of the control. Furthermore, the bacteria were also beneficial in maintaining the stability of the machine's operation. Therefore, the bacterial mixture as well as the machines in this study could be very useful for waste treatment.
Scale effects and a method for similarity evaluation in micro electrical discharge machining
NASA Astrophysics Data System (ADS)
Liu, Qingyu; Zhang, Qinhe; Wang, Kan; Zhu, Guang; Fu, Xiuzhuo; Zhang, Jianhua
2016-08-01
Electrical discharge machining (EDM) is a promising non-traditional micro machining technology that offers a vast array of applications in the manufacturing industry. However, scale effects occur when machining at the micro-scale, which can make it difficult to predict and optimize the machining performance of micro EDM. A new concept of "scale effects" in micro EDM is proposed; these scale effects reveal the difference in machining performance between micro EDM and conventional macro EDM. Similarity theory is presented to evaluate the scale effects in micro EDM. Single-factor experiments are conducted and the experimental results are analyzed by discussing the similarity difference and similarity precision. The results show that the output results of scale effects in micro EDM do not change linearly with the discharge parameters. The values of similarity precision of machining time significantly increase when scaling down the capacitance or open-circuit voltage. This indicates that the lower the scale of the discharge parameter, the greater the deviation of the non-geometrical similarity degree from the geometrical similarity degree, meaning that a micro EDM system with lower discharge energy experiences stronger scale effects. The largest similarity difference is 5.34, while the largest similarity precision can be as high as 114.03. It is suggested that similarity precision is more effective than similarity difference in reflecting the scale effects and their fluctuation. Consequently, similarity theory is suitable for evaluating the scale effects in micro EDM. This research offers engineering value for optimizing the machining parameters and improving the machining performance of micro EDM.
A performance study of sparse Cholesky factorization on INTEL iPSC/860
NASA Technical Reports Server (NTRS)
Zubair, M.; Ghose, M.
1992-01-01
The problem of Cholesky factorization of a sparse matrix has been very well investigated on sequential machines. A number of efficient codes exist for factorizing large unstructured sparse matrices. However, there is a lack of such efficient codes on parallel machines in general, and distributed machines in particular. Some of the issues that are critical to the implementation of sparse Cholesky factorization on a distributed memory parallel machine are ordering, partitioning and mapping, load balancing, and ordering of various tasks within a processor. Here, we focus on the effect of various partitioning schemes on the performance of sparse Cholesky factorization on the Intel iPSC/860. Also, a new partitioning heuristic for structured as well as unstructured sparse matrices is proposed, and its performance is compared with other schemes.
DOE-RCT-0003641 Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, Edward; Lesster, Ted
2014-07-30
This program studied novel concepts for an Axial Flux Reluctance Machine to capture energy from marine hydrokinetic sources and compared their attributes to a Radial Flux Reluctance Machine which was designed under a prior Department of Energy program for the same application. Detailed electromagnetic and mechanical analyses were performed to determine the validity of the concept and to provide a direct comparison with the existing conventional Radial Flux Switched Reluctance Machine designed during the Advanced Wave Energy Conversion Project, DE-EE0003641. The alternate design changed the machine topology so that the flux that is switched flows axially rather than radially, and the poles themselves are long radially, as opposed to the radial flux machine that has pole pieces that are long axially. It appeared possible to build an axial flux machine that should be considerably more compact than the radial machine. In an "apples to apples" comparison, the same rules with regard to generating magnetic force and the fundamental limitations of flux density hold, so that at the heart of the machine the same torque equations apply. The differences are in the mechanical configuration that limits or enhances the change of permeance with rotor position, in the amount of permeable iron required to channel the flux via the pole pieces to the air gaps, and in the sizing and complexity of the electrical winding. Accordingly, it was anticipated that the magnetic component weight would be similar but that better use of space would result in a shorter machine with an accompanying reduction in housing and support structure. For the comparison, the pole count was kept the same at 28, though it was also expected that the radial tapering of the slots between pole pieces would permit a higher-pole-count machine, enabling the generation of greater power at a given speed in some future design. The baseline Radial Flux Machine design was established during the previous DOE program; its characteristics were tabulated for use in comparing to the Axial Flux Machine. Three basic conceptual designs for the Axial Flux Machine were considered: (1) a machine with a single coil at the inner diameter of the machine, (2) a machine with a single coil at the outside diameter of the machine, and (3) a machine with a coil around each tooth. Slight variations of these basic configurations were considered during the study. Analysis was performed on these configurations to determine the best candidate design to advance to preliminary design, based on size, weight, performance, cost, and manufacturability. The configuration selected as the most promising was the multi-pole machine with a coil around each tooth. This configuration provided the least complexity with respect to the mechanical configuration and manufacturing, which would yield the highest-reliability and lowest-cost machine of the three options. A preliminary design was performed on this selected configuration. For this first-ever axial design of the multi-rotor configuration, the "apples to apples" comparison was based on using the same length of rotor pole as the axial length of rotor pole in the radial machine and making the mean radius of the rotor in the axial machine the same as the air-gap radius in the radial machine. The tooth-to-slot ratio at the mean radius of the axial machine was the same as the tooth-to-slot ratio of the radial machine.
The comparison between the original radial flux machine and the new axial flux machine indicates that for the same torque, the axial flux machine diameter will be 27% greater, but it will have 30% of the length and 76% of the weight. Based on these results, it is concluded that an axial flux reluctance machine presents a viable option for large generators to be used for the capture of wave energy. In the analysis of Task 4, below, it is pointed out that our selection of dimensional similarity for the "apples to apples" comparison did not produce an optimum axial flux design. There is torque capability to spare, implying we could reduce the magnetic structure, but the winding area, constrained by the pole separation at the inner pole radius, has a higher resistance than desirable, implying we need more room for copper. The recommendation is to proceed via one cycle of optimization and review to correct this imbalance and then proceed to a detailed design phase to produce manufacturing drawings, followed by the construction of a prototype to test the performance of the machine against predicted results.
Performance evaluation of the croissant production line with reparable machines
NASA Astrophysics Data System (ADS)
Tsarouhas, Panagiotis H.
2015-03-01
In this study, analytical probability models were developed for an automated, bufferless serial production system that consists of n machines in series with a common transfer mechanism and control system. Both the time to failure and the time to repair a failure are assumed to follow exponential distributions. Applying those models, the effect of system parameters on system performance in an actual croissant production line was studied. The production line consists of six workstations with different numbers of reparable machines in series. Mathematical models of the croissant production line were developed using a Markov process. The strength of this study lies in the classification of the whole system into states representing failures of different machines. Failure and repair data from the actual production environment were used to estimate reliability and maintainability for each machine, each workstation, and the entire line based on the analytical models. The analysis provides a useful insight into the system's behaviour, helps to find inherent design faults, and suggests optimal modifications to upgrade the system and improve its performance.
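For illustration, a minimal sketch of the underlying modeling idea, not the paper's full n-machine model: a single reparable machine with exponential time to failure and time to repair is a two-state continuous-time Markov chain whose steady state gives the availability. The rates below are illustrative placeholders, not data from the croissant line.

```python
# Steady-state availability of one reparable machine as a 2-state CTMC.
import numpy as np

lam = 0.05   # failure rate (failures per hour), exponential TTF
mu = 0.50    # repair rate (repairs per hour), exponential TTR

# Generator matrix Q over states [up, down]; each row sums to zero.
Q = np.array([[-lam, lam],
              [mu, -mu]])

# Solve pi @ Q = 0 subject to sum(pi) = 1 via an appended normalization row.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("steady-state availability:", pi[0])     # equals mu / (lam + mu)
print("closed form:              ", mu / (lam + mu))
```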
Machining of Aircraft Titanium with Abrasive-Waterjets for Fatigue Critical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H. T.; Hovanski, Yuri; Dahl, Michael E.
2012-02-01
Laboratory tests were conducted to determine the fatigue performance of abrasive-waterjet- (AWJ-) machined aircraft titanium. Dog-bone specimens machined with AWJs were prepared and tested with and without sanding and dry-grit blasting with Al2O3 as secondary processes. The secondary processes were applied to remove the visual appearance of AWJ-generated striations and to clean up the garnet embedment. The fatigue performance of AWJ-machined specimens was compared with baseline specimens machined with CNC milling. Fatigue test results of the titanium specimens not only confirmed our previous findings in aluminum dog-bone specimens but also showed further enhancement of fatigue performance in titanium. In addition, although titanium is known to be difficult to cut, particularly for thick parts, AWJs cut the material 34% faster than stainless steel. AWJ cutting and dry-grit blasting are shown to be a preferred combination for processing aircraft titanium that is fatigue critical.
Performance Evaluation of the UT Automated Road Maintenance Machine
DOT National Transportation Integrated Search
1997-10-01
This final report focuses mainly on evaluating the overall performance of The University of Texas' Automated Road Maintenance Machine (ARMM). It was concluded that the introduction of automated methods to the pavement crack-sealing process will impro...
Tunnel Boring Machine Performance Study. Final Report
DOT National Transportation Integrated Search
1984-06-01
Full face tunnel boring machine "TBM" performance during the excavation of 6 tunnels in sedimentary rock is considered in terms of utilization, penetration rates and cutter wear. The construction records are analyzed and the results are used to inves...
Kazemi, Fatemeh; Najafabadi, Tooraj Abbasian; Araabi, Babak Nadjar
2016-01-01
Acute myelogenous leukemia (AML) is a subtype of acute leukemia, which is characterized by the accumulation of myeloid blasts in the bone marrow. Careful microscopic examination of a stained blood smear or bone marrow aspirate is still the most significant diagnostic methodology for initial AML screening and is considered the first step toward diagnosis. It is time-consuming, and owing to the elusive nature of the signs and symptoms of AML, pathologists may arrive at a wrong diagnosis. Therefore, the need for automation of leukemia detection has arisen. In this paper, an automatic technique for identification and detection of AML and its prevalent subtypes, i.e., M2-M5, is presented. At first, microscopic images are acquired from blood smears of patients with AML and normal cases. After image preprocessing, a color segmentation strategy is applied to segment white blood cells from other blood components, and then discriminative features, i.e., irregularity, nucleus-cytoplasm ratio, Hausdorff dimension, shape, color, and texture features, are extracted from the entire nucleus in whole images containing multiple nuclei. Images are classified into cancerous and noncancerous by a binary support vector machine (SVM) classifier with a 10-fold cross-validation technique. Classifier performance is evaluated by three parameters, i.e., sensitivity, specificity, and accuracy. Cancerous images are also classified into their prevalent subtypes by a multi-SVM classifier. The results show that the proposed algorithm achieves an acceptable performance for diagnosis of AML and its common subtypes. Therefore, it can be used as an assistant diagnostic tool for pathologists.
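A minimal sketch of the classification stage described above, with 10-fold cross-validation as in the paper; random vectors stand in for the extracted morphological features, so the image-processing steps of the pipeline are omitted here.

```python
# Binary SVM with 10-fold cross-validation on placeholder feature vectors.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))      # 12 placeholder morphological features
y = rng.integers(0, 2, size=200)    # 1 = cancerous, 0 = noncancerous

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print("10-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```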
Campanella, Gabriele; Rajanna, Arjun R; Corsale, Lorraine; Schüffler, Peter J; Yagi, Yukako; Fuchs, Thomas J
2018-04-01
Pathology is on the verge of a profound change from an analog and qualitative to a digital and quantitative discipline. This change is mostly driven by the high-throughput scanning of microscope slides in modern pathology departments, reaching tens of thousands of digital slides per month. The resulting vast digital archives form the basis of clinical use in digital pathology and allow large-scale machine learning in computational pathology. One of the most crucial bottlenecks of high-throughput scanning is quality control (QC). Currently, digital slides are screened manually to detect out-of-focus regions, to compensate for the limitations of scanner software. We present a solution to this problem by introducing a benchmark dataset for blur detection, together with an in-depth comparison of state-of-the-art sharpness descriptors and their prediction performance within a random forest framework. Furthermore, we show that convolutional neural networks, such as residual networks, can be used to train blur detectors from scratch. We thoroughly evaluate the accuracy of feature-based and deep-learning-based approaches for sharpness classification (99.74% accuracy) and regression (MSE 0.004) and additionally compare them to domain experts in a comprehensive human perception study. Our pipeline outputs spatial heatmaps that quantify and localize blurred areas on a slide. Finally, we tested the proposed framework in the clinical setting and demonstrate superior performance over the state-of-the-art QC pipeline comprising commercial software and human expert inspection, reducing the error rate from 17% to 4.7%. Copyright © 2017. Published by Elsevier Ltd.
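A minimal sketch of the feature-based branch of such a pipeline, assuming variance of the Laplacian as the sharpness descriptor (a classic choice; whether it is among the paper's descriptors is an assumption) feeding a random forest, with synthetic tiles standing in for scanned-slide regions.

```python
# Sharpness descriptor + random-forest blur classifier on image tiles.
import numpy as np
from scipy.ndimage import laplace, gaussian_filter
from sklearn.ensemble import RandomForestClassifier

def sharpness(tile):
    """Variance of the Laplacian: low values suggest an out-of-focus tile."""
    return laplace(tile.astype(float)).var()

rng = np.random.default_rng(1)
sharp_tiles = [rng.random((64, 64)) for _ in range(100)]
blurry_tiles = [gaussian_filter(t, sigma=2.0) for t in sharp_tiles]

X = np.array([[sharpness(t)] for t in sharp_tiles + blurry_tiles])
y = np.array([0] * 100 + [1] * 100)    # 1 = blurred

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```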
Al-Numair, Nouf S; Lopes, Luis; Syrris, Petros; Monserrat, Lorenzo; Elliott, Perry; Martin, Andrew C R
2016-10-01
High-throughput sequencing platforms are increasingly used to screen patients with genetic disease for pathogenic mutations, but prediction of the effects of mutations remains challenging. Previously we developed SAAPdap (Single Amino Acid Polymorphism Data Analysis Pipeline) and SAAPpred (Single Amino Acid Polymorphism Predictor), which use a combination of rule-based structural measures to predict whether a missense genetic variant is pathogenic. Here we investigate whether the same methodology can be used to develop a differential phenotype predictor, which, once a mutation has been predicted as pathogenic, is able to distinguish between phenotypes, in this case the two major clinical phenotypes (hypertrophic cardiomyopathy, HCM, and dilated cardiomyopathy, DCM) associated with mutations in the beta-myosin heavy chain (MYH7) gene product (Myosin-7). A random forest predictor trained on rule-based structural analyses together with structural clustering data gave a Matthews' correlation coefficient (MCC) of 0.53 (accuracy, 75%). A post hoc removal of machine learning models that performed particularly badly increased the performance (MCC = 0.61, Acc = 79%). This proof of concept suggests that methods used for pathogenicity prediction can be extended for use in differential phenotype prediction. Analyses were implemented in Perl and C and used the Java-based Weka machine learning environment. Please contact the authors for availability. andrew@bioinf.org.uk or andrew.martin@ucl.ac.uk Supplementary data are available at Bioinformatics online. © The Authors 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Morin, Jean-François; Botton, Eléonore; Jacquemard, François; Richard-Gireme, Anouk
2013-01-01
The Fetal Medicine Foundation (FMF) has developed a new algorithm called Prenatal Risk Calculation (PRC) to evaluate Down syndrome screening based on free hCGβ, PAPP-A, and nuchal translucency. The peculiarity of this algorithm is that it uses the degree of extremeness (DoE) instead of the multiple of the median (MoM). Biologists measuring maternal serum markers on Kryptor™ machines (Thermo Fisher Scientific) use the Fast Screen pre I plus software for the prenatal risk calculation. This software integrates the PRC algorithm. Our study evaluates the data of 2,092 patient files, of which 19 show a fetal abnormality. These files were first evaluated with the ViewPoint software based on MoM. The link between DoE and MoM was analyzed and the different calculated risks compared. The study shows that the Fast Screen pre I plus software gives the same risk results as the ViewPoint software but yields significantly fewer false-positive results.
Xia, Meng-lei; Wang, Lan; Yang, Zhi-xia; Chen, Hong-zhang
2016-04-01
This work proposes a new method that applies image processing and a support vector machine (SVM) for the screening of mold strains. Taking Monascus as an example, the morphological characteristics of Monascus colonies were quantified by image processing, and the association between these characteristics and pigment production capability was determined by SVM. On this basis, a highly automated screening strategy was achieved. The accuracy of the proposed strategy is 80.6%, which is comparable with existing methods (81.1% for microplate and 85.4% for flask). Meanwhile, the screening of 500 colonies takes only 20-30 min, the highest rate among all published results. By applying this automated method, 13 strains with high predicted production were obtained, and the best one produced 2.8-fold the pigment (226 U/mL) and 1.9-fold the lovastatin (51 mg/L) of the parent strain. The current study provides an effective and promising method for strain improvement.
Machine learning of molecular electronic properties in chemical compound space
NASA Astrophysics Data System (ADS)
Montavon, Grégoire; Rupp, Matthias; Gobre, Vivekanand; Vazquez-Mayagoitia, Alvaro; Hansen, Katja; Tkatchenko, Alexandre; Müller, Klaus-Robert; Anatole von Lilienfeld, O.
2013-09-01
The combination of modern scientific computing with electronic structure theory can lead to an unprecedented amount of data amenable to intelligent data analysis for the identification of meaningful, novel and predictive structure-property relationships. Such relationships enable high-throughput screening for relevant properties in an exponentially growing pool of virtual compounds that are synthetically accessible. Here, we present a machine learning model, trained on a database of ab initio calculation results for thousands of organic molecules, that simultaneously predicts multiple electronic ground- and excited-state properties. The properties include atomization energy, polarizability, frontier orbital eigenvalues, ionization potential, electron affinity and excitation energies. The machine learning model is based on a deep multi-task artificial neural network, exploiting the underlying correlations between various molecular properties. The input is identical to that of ab initio methods, i.e. nuclear charges and Cartesian coordinates of all atoms. For small organic molecules, the accuracy of such a 'quantum machine' is similar, and sometimes superior, to modern quantum-chemical methods, at negligible computational cost.
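For context, the Coulomb matrix is a standard descriptor in this line of work for encoding exactly this input (nuclear charges Z and Cartesian coordinates R); the sketch below shows the textbook definition, and treating it as this paper's exact representation is an assumption.

```python
# Coulomb-matrix molecular descriptor:
#   M_ii = 0.5 * Z_i ** 2.4
#   M_ij = Z_i * Z_j / |R_i - R_j|   for i != j   (atomic units)
import numpy as np

def coulomb_matrix(Z, R):
    Z = np.asarray(Z, dtype=float)
    R = np.asarray(R, dtype=float)
    n = len(Z)
    M = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                M[i, j] = 0.5 * Z[i] ** 2.4
            else:
                M[i, j] = Z[i] * Z[j] / np.linalg.norm(R[i] - R[j])
    return M

# Water molecule; coordinates in bohr (illustrative geometry).
Z = [8, 1, 1]
R = [[0.0, 0.0, 0.0], [0.0, 1.43, 1.11], [0.0, -1.43, 1.11]]
print(coulomb_matrix(Z, R))
```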
The desktop interface in intelligent tutoring systems
NASA Technical Reports Server (NTRS)
Baudendistel, Stephen; Hua, Grace
1987-01-01
The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.
A state-based approach to trend recognition and failure prediction for the Space Station Freedom
NASA Technical Reports Server (NTRS)
Nelson, Kyle S.; Hadden, George D.
1992-01-01
A state-based reasoning approach to trend recognition and failure prediction for the Attitude Determination and Control System (ADCS) of the Space Station Freedom (SSF) is described. The problem domain is characterized by features (e.g., trends and impending failures) that develop over a variety of time spans, anywhere from several minutes to several years. Our state-based reasoning approach, coupled with intelligent data screening, allows features to be tracked as they develop in a time-dependent manner. That is, each state machine has the ability to encode a time frame for the feature it detects. As features are detected, they are recorded and can be used as input to other state machines, creating a hierarchical feature recognition scheme. Furthermore, each machine can operate independently of the others, allowing simultaneous tracking of features. State-based reasoning was implemented in the trend recognition and prognostic modules of a prototype Space Station Freedom Maintenance and Diagnostic System (SSFMDS) developed at Honeywell's Systems and Research Center.
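A minimal sketch of the state-machine idea (not Honeywell's SSFMDS code): one small machine that watches a data stream and fires a "rising trend" feature after k consecutive increases; its output could in turn feed higher-level machines, as described above.

```python
# One feature-detecting state machine for a telemetry stream.
class RisingTrendMachine:
    def __init__(self, k=3):
        self.k = k            # consecutive increases that define the trend
        self.run = 0          # current run length of increases
        self.last = None      # previous sample

    def step(self, value):
        """Feed one sample; return True while the trend feature is active."""
        rising = self.last is not None and value > self.last
        self.run = self.run + 1 if rising else 0
        self.last = value
        return self.run >= self.k

m = RisingTrendMachine(k=3)
stream = [5.0, 5.1, 5.0, 5.2, 5.3, 5.4, 5.2]
print([m.step(v) for v in stream])   # fires once 3 consecutive rises occur
```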
Development of machine learning models to predict inhibition of 3-dehydroquinate dehydratase.
de Ávila, Maurício Boff; de Azevedo, Walter Filgueira
2018-04-20
In this study, we describe the development of new machine learning models to predict inhibition of the enzyme 3-dehydroquinate dehydratase (DHQD). This enzyme catalyzes the third step of the shikimate pathway, which is responsible for the synthesis of chorismate, a natural precursor of aromatic amino acids. The enzymes of the shikimate pathway are absent in humans, which makes them protein targets for the design of antimicrobial drugs. We focus our study on the crystallographic structures of DHQD in complex with competitive inhibitors, for which experimental inhibition constant data are available. Application of supervised machine learning techniques made it possible to elaborate a robust DHQD-targeted model to predict binding affinity. The combination of high-resolution crystallographic structures and binding information indicates that the prevalence of intermolecular electrostatic interactions between DHQD and competitive inhibitors is of pivotal importance for binding affinity against this enzyme. The present findings can be used to speed up virtual screening studies focused on the DHQD structure. © 2018 John Wiley & Sons A/S.
Montare, Alberto
2013-06-01
The three classical Donders' reaction time (RT) tasks (simple, choice, and discriminative RT) were employed to compare reaction time scores of college students obtained with Montare's simplest-chronoscope (meterstick) methodology to scores obtained with a digital-readout multi-choice reaction timer (machine). Five hypotheses were tested. Simple RT, choice RT, and discriminative RT were all faster when obtained by meterstick than by machine. The meterstick method showed higher reliability than the machine method and was less variable. The meterstick method of the simplest chronoscope may help to alleviate the longstanding problems of low reliability and high variability of reaction time performances, while at the same time producing faster performance on Donders' simple, choice, and discriminative RT tasks than the machine method.
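For readers unfamiliar with the meterstick chronoscope, the sketch below shows the standard ruler-drop conversion from catch distance to reaction time under free fall, t = sqrt(2d/g); whether this matches Montare's exact procedure is an assumption.

```python
# Ruler-drop conversion: catch distance d (free fall) to reaction time.
import math

def catch_distance_to_rt(d_cm, g=9.81):
    d = d_cm / 100.0                  # centimetres to metres
    return math.sqrt(2.0 * d / g)     # seconds, from d = g * t**2 / 2

for d_cm in (10, 20, 30):
    print(f"{d_cm} cm -> {catch_distance_to_rt(d_cm) * 1000:.0f} ms")
```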
None
2018-05-01
A new Idaho National Laboratory supercomputer is helping scientists create more realistic simulations of nuclear fuel. Dubbed "Ice Storm," this 2048-processor machine allows researchers to model and predict the complex physics behind nuclear reactor behavior. And with a new visualization lab, the team can see the results of its simulations on the big screen. For more information about INL research, visit http://www.facebook.com/idahonationallaboratory.
Graphing Calculators in the Secondary Mathematics Classroom. Monograph #21.
ERIC Educational Resources Information Center
Eckert, Paul; And Others
The objective of this presentation is to focus on the use of a hand-held graphics calculator. The specific machine referred to in this monograph is the Casio fx-7000G, chosen because of its low cost, its large viewing screen, its versatility, and its simple operation. Sections include: (1) "Basic Operations with the Casio fx-7000G"; (2) "Graphical…
Utilizing residues from in-woods flail processing
Ronald K. Baughman; Bryce J. Stokes; William F. Watson
1990-01-01
A Barkbuster 1100 tub grinder has been employed to process debris discharged by a Manitowoc VFDD-1642. The machine successfully passed the material through a 7.62 cm screen and discharged the reduced debris into a chip van for transport. Fuel production is directly dependent upon the production of clean chips by the flail/chipper portion of the system and the available...
Apple (LCSI) LOGO vs. MIT (Terrapin/Krell) LOGO: A Comparison for Grades 2 thru 4.
ERIC Educational Resources Information Center
Wappler, Reinhold D.
Two LOGO dialects are compared for appropriateness for use with second, third, and fourth grade students, on the basis of 18 months of experience with teaching the LOGO programming language at this level in a four-machine laboratory setting. Benefits and drawbacks of the dialects are evaluated in the areas of editing, screen modes, debugging,…
Ji, Renjie; Liu, Yonghong; Diao, Ruiqiang; Xu, Chenchen; Li, Xiaopeng; Cai, Baoping; Zhang, Yanzhen
2014-01-01
Engineering ceramics have been widely used in modern industry for their excellent physical and mechanical properties, yet they are difficult to machine owing to their high hardness and brittleness. Electrical discharge machining (EDM) is an appropriate process for machining engineering ceramics provided they are electrically conductive. However, the electrical resistivity of popular engineering ceramics is relatively high, and there has been no research on the relationship between the EDM parameters and the electrical resistivity of engineering ceramics. This paper investigates the effects of the electrical resistivity and EDM parameters, such as tool polarity, pulse interval, and electrode material, on the EDM performance of ZnO/Al2O3 ceramic, in terms of the material removal rate (MRR), electrode wear ratio (EWR), and surface roughness (SR). The results show that the electrical resistivity and the EDM parameters have a great influence on EDM performance. ZnO/Al2O3 ceramic with electrical resistivity up to 3410 Ω·cm can be effectively machined by EDM with a copper electrode, negative tool polarity, and a shorter pulse interval. Under most machining conditions, the MRR increases and the SR decreases as the electrical resistivity decreases. Moreover, the tool polarity and pulse interval each affect the EWR, and the electrical resistivity and electrode material have a combined effect on the EWR. Furthermore, the EDM performance of ZnO/Al2O3 ceramic with electrical resistivity higher than 687 Ω·cm differs markedly from that with electrical resistivity lower than 687 Ω·cm when the electrode material changes. Microstructural analysis of the machined ZnO/Al2O3 ceramic surface shows that the material is removed by melting, evaporation, and thermal spalling, and that material from the working fluid and the graphite electrode can transfer to the workpiece surface during electrical discharge machining of ZnO/Al2O3 ceramic.
Automated EEG-based screening of depression using deep convolutional neural network.
Acharya, U Rajendra; Oh, Shu Lih; Hagiwara, Yuki; Tan, Jen Hong; Adeli, Hojjat; Subha, D P
2018-07-01
In recent years, advanced neurocomputing and machine learning techniques have been used for electroencephalogram (EEG)-based diagnosis of various neurological disorders. In this paper, a novel computer model is presented for EEG-based screening of depression using a deep neural network machine learning approach, known as a Convolutional Neural Network (CNN). The proposed technique does not require a semi-manually selected set of features to be fed into a classifier; it learns automatically and adaptively from the input EEG signals to differentiate EEGs obtained from depressed and normal subjects. The model was tested using EEGs obtained from 15 normal and 15 depressed patients. The algorithm attained accuracies of 93.5% and 96.0% using EEG signals from the left and right hemispheres, respectively. This research found that the EEG signals from the right hemisphere are more distinctive in depression than those from the left hemisphere, a finding consistent with recent research suggesting that depression is associated with a hyperactive right hemisphere. An exciting extension of this work would be diagnosis of different stages and severities of depression and development of a Depression Severity Index (DSI). Copyright © 2018 Elsevier B.V. All rights reserved.
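A minimal sketch of a 1D CNN for this kind of EEG classification task, using TensorFlow/Keras; the layer sizes and input length are illustrative placeholders, not the authors' architecture.

```python
# Small 1D CNN for binary EEG classification on placeholder data.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2000, 1)),           # one EEG channel segment
    tf.keras.layers.Conv1D(16, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # depressed vs normal
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Random stand-in data with the expected shapes; real EEG would go here.
X = np.random.randn(32, 2000, 1).astype("float32")
y = np.random.randint(0, 2, size=(32, 1))
model.fit(X, y, epochs=1, batch_size=8, verbose=0)
print(model.evaluate(X, y, verbose=0))
```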
Kim, Dong Wook; Kim, Hwiyoung; Nam, Woong; Kim, Hyung Jun; Cha, In-Ho
2018-04-23
The aim of this study was to build and validate five types of machine learning models to predict the occurrence of bisphosphonate-related osteonecrosis of the jaw (BRONJ) associated with dental extraction in patients taking bisphosphonates for the management of osteoporosis. A retrospective review of the medical records was conducted to obtain cases and controls for the study. A total of 125 patients, consisting of 41 cases and 84 controls, were selected. Five machine learning prediction algorithms, including a multivariable logistic regression model, decision tree, support vector machine, artificial neural network, and random forest, were implemented. The outputs of these models were compared with each other and also with conventional methods, such as serum CTX level. The area under the receiver operating characteristic (ROC) curve (AUC) was used to compare the results. The performance of the machine learning models was significantly superior to conventional statistical methods and single predictors. The random forest model yielded the best performance (AUC = 0.973), followed by the artificial neural network (AUC = 0.915), support vector machine (AUC = 0.882), logistic regression (AUC = 0.844), decision tree (AUC = 0.821), drug holiday alone (AUC = 0.810), and CTX level alone (AUC = 0.630). Machine learning methods showed superior performance in predicting BRONJ associated with dental extraction compared to conventional statistical methods using drug holiday and serum CTX level. Machine learning can thus be applied in a wide range of clinical studies. Copyright © 2017. Published by Elsevier Inc.
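A minimal sketch of the evaluation approach, comparing two of the named model types by AUC on held-out data; the random features are placeholders for the clinical variables, with the class balance loosely mirroring the 41-case/84-control cohort.

```python
# Comparing classifiers by AUC on a placeholder case/control dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(125, 8))            # 8 placeholder clinical features
y = np.array([1] * 41 + [0] * 84)        # 1 = BRONJ case, 0 = control

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
for name, clf in [("random forest", RandomForestClassifier(random_state=0)),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    clf.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```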
High-Throughput Gene Expression Profiles to Define Drug Similarity and Predict Compound Activity.
De Wolf, Hans; Cougnaud, Laure; Van Hoorde, Kirsten; De Bondt, An; Wegner, Joerg K; Ceulemans, Hugo; Göhlmann, Hinrich
2018-04-01
By adding biological information beyond the chemical properties and desired effect of a compound, uncharted compound areas and connections can be explored. In this study, we add transcriptional information for 31K compounds of Janssen's primary screening deck, using the HT L1000 platform, and assess (a) the transcriptional connection score for generating compound similarities, (b) machine learning algorithms for generating target activity predictions, and (c) the scaffold hopping potential of the resulting hits. We demonstrate that the transcriptional connection score is best computed from the significant genes only and should be interpreted within its confidence interval, for which we provide the statistics. These guidelines help to reduce noise, increase reproducibility, and enable the separation of specific and promiscuous compounds. The added value of machine learning is demonstrated for the NR3C1 and HSP90 targets. Support Vector Machine models yielded balanced accuracy values ≥80% when the expression values from DDIT4 & SERPINE1 and TMEM97 & SPR were used to predict the NR3C1 and HSP90 activity, respectively. Combining both models resulted in 22 new and confirmed HSP90-independent NR3C1 inhibitors, providing two scaffolds (pyrimidine and pyrazolo-pyrimidine) which could potentially be of interest in the treatment of depression, inhibiting the glucocorticoid receptor (NR3C1) while leaving its chaperone, HSP90, unaffected. As such, the initial hit rate increased by a factor of 300, as less, but more specific, chemistry could be screened based on the upfront computed activity predictions.
Msyamboza, Kelias Phiri; Phiri, Twambilire; Sichali, Wesley; Kwenda, Willy; Kachale, Fanny
2016-08-17
Malawi has the highest cervical cancer incidence and mortality in the world, with age-standardized rates (ASR) of 75.9 and 49.8 per 100,000 population, respectively. In response, the Ministry of Health established a cervical cancer screening programme using visual inspection with acetic acid (VIA) and treatment of precancerous lesions with cryotherapy. This paper highlights the roll-out, integration with family planning services and the HIV ART Programme, uptake, and challenges of the VIA and cryotherapy programme. We analyzed programme data and supportive supervision, quarterly, and annual reports from the National Cervical Cancer Control Programme, and evaluated the uptake and challenges of screening services by age, HIV serostatus, and trends over a five-year period (2011-2015). Between 2011 and 2015, the number of cervical cancer screening sites, the number of women screened, and coverage per annum increased from 75 to 130, from 15,331 to 49,301, and from 9.3% to 26.5%, respectively. In this five-year period, a total of 145,015 women were screened. Of these, 7,349 (5.1%) were VIA positive and 6,289 (4.3%) had suspect cancer; overall, 13,638 (9.4%) were found to be VIA positive or to have suspect cancer. Of the 48,588 women with known age screened in 2015, 13,642 (28.1%), 27,275 (56.1%), and 7,671 (15.8%) were aged 29 years or less, 30-45 years, and 46 years or more, respectively. Among 39,101 women with data on HIV serostatus, 21,546 (55.1%) were HIV negative, 6,209 (15.9%) were HIV positive, and 11,346 (29.0%) had unknown status. The VIA positivity rate and the prevalence of suspect cancer were significantly higher in HIV-positive than in HIV-negative women (8.8% vs 5.0%, 6.4% vs 3.0%) and in women aged 30-45 years than in women aged 29 years or less (5.6% vs 2.3%, 2.6% vs 1.2%), respectively (all p < 0.05). The main challenge of the programme was failure to treat VIA-positive women eligible for cryotherapy. Over the five-year period, the programme treated only 1,001 (43.3%) of 2,311 eligible women, and only 266 (31.8%) of the 836 women with a large lesion or suspect cancer who were referred received health care at the referral centre. The reasons for failure to provide cryotherapy treatment were stock-out of gas, a faulty or broken cryotherapy machine (usually connectors or probes), or no cryotherapy machine at all in the whole district. For women with a large lesion or suspect cancer, lack of a loop electrosurgical excision procedure (LEEP) machine or inadequate gynaecologists at the referral centre were the major reasons. Cancer radiotherapy services were not available in Malawi. This study provides data on the VIA positivity rate, the prevalence of suspect cancer, the failure rate of cryotherapy, and the challenges in the provision of cryotherapy and LEEP treatment in Malawi. These data could be used as a baseline for monitoring and evaluation of the Human Papillomavirus (HPV) vaccination programme, which the country introduced in 2013, of the linkage of cervical cancer screening with women on HIV ART, and of the long-term effect of ART and voluntary male medical circumcision on the prevalence and incidence of cervical cancer.
Bae, Sangwon; Chung, Tammy; Ferreira, Denzil; Dey, Anind K; Suffoletto, Brian
2018-08-01
Real-time detection of drinking could improve timely delivery of interventions aimed at reducing alcohol consumption and alcohol-related injury, but existing detection methods are burdensome or impractical. We evaluated whether phone sensor data and machine learning models are useful for detecting alcohol use events, and discuss the implications of these results for just-in-time mobile interventions. Thirty-eight non-treatment-seeking young adult heavy drinkers downloaded the AWARE app (which continuously collected mobile phone sensor data) and reported alcohol consumption (number of drinks, start/end time of the prior day's drinking) for 28 days. We tested various machine learning models using the 20 most informative sensor features to classify time periods as non-drinking, low-risk (1 to 3/4 drinks per occasion for women/men), and high-risk drinking (>4/5 drinks per occasion for women/men). Among the 30 participants in the analyses, 207 non-drinking, 41 low-risk, and 45 high-risk drinking episodes were reported. A Random Forest model using 30-min windows with 1 day of historical data performed best for detecting high-risk drinking, correctly classifying high-risk drinking windows 90.9% of the time. The most informative sensor features were related to time (i.e., day of week, time of day), movement (e.g., change in activities), device usage (e.g., screen duration), and communication (e.g., call duration, typing speed). Preliminary evidence suggests that sensor data captured from the mobile phones of young adults are useful for building accurate models to detect periods of high-risk drinking. Interventions using mobile phone sensor features could trigger delivery of a range of interventions to potentially improve effectiveness. Copyright © 2017 Elsevier Ltd. All rights reserved.
Qureshi, Abid; Tandon, Himani; Kumar, Manoj
2015-11-01
Peptide-based antiviral therapeutics have gradually paved their way into mainstream drug discovery research. Experimental determination of peptides' antiviral activity, as expressed by their IC50 values, involves considerable effort. Therefore, we have developed "AVP-IC50 Pred," a regression-based algorithm to predict antiviral activity in terms of IC50 values (μM). A total of 759 non-redundant peptides from AVPdb and HIPdb were divided into a training/test set of 683 peptides (T(683)) and a validation set of 76 independent peptides (V(76)) for evaluation. We utilized important peptide sequence features such as amino-acid composition, the binary profile of N8-C8 residues, physicochemical properties, and their hybrids. Four different machine learning techniques (MLTs), namely Support Vector Machine, Random Forest, an instance-based classifier, and K-Star, were employed. During 10-fold cross-validation, we achieved maximum Pearson correlation coefficients (PCCs) of 0.66, 0.64, 0.56, and 0.55, respectively, for the above MLTs using the best combination of feature sets. All the predictive models also performed well on the independent validation dataset, achieving maximum PCCs of 0.74, 0.68, 0.59, and 0.57, respectively, on the best combination of feature sets. The AVP-IC50 Pred web server is anticipated to assist researchers working on antiviral therapeutics by enabling them to computationally screen many compounds and focus experimental validation on the most promising set of peptides, thus reducing cost and time efforts. The server is available at http://crdd.osdd.net/servers/ic50avp. © 2015 Wiley Periodicals, Inc.
Barua, Shaibal; Begum, Shahina; Ahmed, Mobyen Uddin
2015-01-01
Machine learning algorithms play an important role in computer science research. Recent advancements in sensor data collection in the clinical sciences lead to complex, heterogeneous data processing and analysis for patient diagnosis and prognosis. Diagnosis and treatment of patients based on manual analysis of these sensor data are difficult and time-consuming. Therefore, the development of knowledge-based systems to support clinicians in decision-making is important. However, experimental work is necessary to compare the performance of different machine learning methods and to help select an appropriate method for the specific characteristics of a data set. This paper compares the classification performance of three popular machine learning methods, i.e., case-based reasoning, neural networks, and support vector machines, for diagnosing the stress of vehicle drivers using finger temperature and heart rate variability. The experimental results show that case-based reasoning outperforms the other two methods in terms of classification accuracy. Case-based reasoning achieved 80% and 86% accuracy in classifying stress using finger temperature and heart rate variability, respectively. In contrast, both the neural network and the support vector machine achieved less than 80% accuracy using both physiological signals.
Analysis and design of asymmetrical reluctance machine
NASA Astrophysics Data System (ADS)
Harianto, Cahya A.
Over the past few decades the induction machine has been chosen for many applications due to its structural simplicity and low manufacturing cost. However, modest torque density and control challenges have motivated researchers to find alternative machines. The permanent magnet synchronous machine has been viewed as one of the alternatives because it features higher torque density for a given loss than the induction machine. However, the assembly and permanent magnet material costs, along with safety under fault conditions, have been concerns for this class of machine. An alternative machine type, namely the asymmetrical reluctance machine, is proposed in this work. Since the proposed machine is of the reluctance type, it possesses desirable features such as a near absence of rotor losses, low assembly cost, low no-load rotational losses, modest torque ripple, and rather benign fault conditions. Through the theoretical analysis performed herein, it is shown that this machine has a higher torque density for a given loss than typical reluctance machines, although not as high as permanent magnet machines. Thus, the asymmetrical reluctance machine is a viable and advantageous alternative where the use of permanent magnet machines is undesirable.
ERIC Educational Resources Information Center
Mercer County Schools, Princeton, WV.
A project was undertaken to identify machine shop occupations requiring workers to use computers, identify the computer skills needed to perform machine shop tasks, and determine which software products are currently being used in machine shop programs. A search of the Dictionary of Occupational Titles revealed that computer skills will become…
Assessment of various supervised learning algorithms using different performance metrics
NASA Astrophysics Data System (ADS)
Susheel Kumar, S. M.; Laxkar, Deepak; Adhikari, Sourav; Vijayarajan, V.
2017-11-01
Our work presents a comparison of supervised machine learning algorithms on a binary classification task. The supervised machine learning algorithms taken into consideration are Support Vector Machine (SVM), Decision Tree (DT), K Nearest Neighbour (KNN), Naïve Bayes (NB), and Random Forest (RF). This paper focuses on comparing the performance of the above-mentioned algorithms on one binary classification task by analysing metrics such as accuracy, F-measure, G-measure, precision, misclassification rate, false positive rate, true positive rate, specificity, and prevalence.
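A minimal sketch computing the listed metrics from a binary confusion matrix; G-measure is taken here as the geometric mean of precision and recall, one common definition, and whether the paper uses that exact definition is an assumption.

```python
# Binary-classification metrics from confusion-matrix counts.
import math

def metrics(tp, fp, tn, fn):
    total = tp + fp + tn + fn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                 # true positive rate
    return {
        "accuracy": (tp + tn) / total,
        "misclassification_rate": (fp + fn) / total,
        "precision": precision,
        "true_positive_rate": recall,
        "false_positive_rate": fp / (fp + tn),
        "specificity": tn / (tn + fp),
        "prevalence": (tp + fn) / total,
        "f_measure": 2 * precision * recall / (precision + recall),
        "g_measure": math.sqrt(precision * recall),   # assumed definition
    }

print(metrics(tp=40, fp=10, tn=45, fn=5))   # illustrative counts
```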
Using human brain activity to guide machine learning.
Fong, Ruth C; Scheirer, Walter J; Cox, David D
2018-03-29
Machine learning is a field of computer science that builds algorithms that learn. In many cases, machine learning algorithms are used to recreate a human ability like adding a caption to a photo, driving a car, or playing a game. While the human brain has long served as a source of inspiration for machine learning, little effort has been made to directly use data collected from working brains as a guide for machine learning algorithms. Here we demonstrate a new paradigm of "neurally-weighted" machine learning, which takes fMRI measurements of human brain activity from subjects viewing images, and infuses these data into the training process of an object recognition learning algorithm to make it more consistent with the human brain. After training, these neurally-weighted classifiers are able to classify images without requiring any additional neural data. We show that our neural-weighting approach can lead to large performance gains when used with traditional machine vision features, as well as to significant improvements with already high-performing convolutional neural network features. The effectiveness of this approach points to a path forward for a new class of hybrid machine learning algorithms which take both inspiration and direct constraints from neuronal data.
NASA Astrophysics Data System (ADS)
Dey, Kaushik; Ghose, A. K.
2011-09-01
Rock excavation is carried out either by drilling and blasting or by using rock-cutting machines such as rippers, bucket wheel excavators, surface miners, and road headers. The economics of mechanised rock excavation by rock-cutting machines largely depends on the achieved production rates. Thus, assessment of performance (productivity) is important prior to deploying a rock-cutting machine. In doing so, several researchers have classified rockmass in different ways and have developed cuttability indices to correlate machine performance directly. However, most of these indices were developed to assess the performance of road headers and tunnel-boring machines, apart from a few that were developed in the earlier days when the ripper was a popular excavating equipment. Presently, around 400 surface miners are in operation around the world, of which 105 are in India. Until now, no rockmass classification system has been available to assess the performance of surface miners; they are deployed largely on a trial-and-error basis or based on performance charts provided by the manufacturer. In this context, it is logical to establish a suitable cuttability index to predict the performance of surface miners. In this paper, the existing cuttability indices are reviewed and a new cuttability index is proposed. A new relationship is also developed to predict the output of surface miners using the proposed cuttability index.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ang; Song, Shuaiwen; Brugel, Eric
To continuously comply with Moore's Law, modern parallel machines have become increasingly complex. Effectively tuning application performance for these machines therefore becomes a daunting task. Moreover, identifying performance bottlenecks at the application and architecture levels, as well as evaluating various optimization strategies, is extremely difficult when numerous correlated factors are entangled. To tackle these challenges, we present a visual analytical model named "X". It is intuitive and sufficiently flexible to track all the typical features of a parallel machine.
Predicting the Performance of Chain Saw Machines Based on Shore Scleroscope Hardness
NASA Astrophysics Data System (ADS)
Tumac, Deniz
2014-03-01
Shore hardness has been used to estimate several physical and mechanical properties of rocks over the last few decades. However, the number of studies correlating Shore hardness with rock cutting performance is quite limited, and rather limited research has been carried out on predicting the performance of chain saw machines. This study differs from previous investigations in that Shore hardness values (SH1, SH2, and the deformation coefficient) are used to determine the field performance of chain saw machines. The measured Shore hardness values are correlated with the physical and mechanical properties of natural stone samples, with cutting parameters (normal force, cutting force, and specific energy) obtained from linear cutting tests in unrelieved cutting mode, and with the areal net cutting rate of chain saw machines. Two previously developed empirical models are improved for predicting the areal net cutting rate of chain saw machines. The first model is based on a revised chain saw penetration index, which uses SH1, machine weight, and useful arm cutting depth as predictors. The second model is based on the power consumed for cutting the stone alone, the arm thickness, and the specific energy as a function of the deformation coefficient. While the cutting force has a strong relationship with Shore hardness values, the normal force has a weak or moderate correlation. Uniaxial compressive strength, Cerchar abrasivity index, and density can also be predicted from Shore hardness values.
Novel diesel exhaust filters for underground mining vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bickel, K.L.; Taubert, T.R.
1995-12-31
The U.S. Bureau of Mines (USBM) pioneered the development of disposable filters for reducing diesel particulate emissions from permissible mining machines. The USBM is now evaluating filter media that can withstand the high exhaust temperatures on nonpermissible machines. The goal of the evaluation is to find an inexpensive medium that can be cleaned or disposed of after use, and will reduce particulate emissions by 50% or more. This report summarizes the results from screening tests of lava rock and woven fiberglass filter media. The lava rock media exhibited low collection efficiencies, but with very low increases in exhaust back pressure. Preliminary results indicate a collection efficiency exceeding 80% for the woven fiber media. Testing of both media is continuing.
Molecular graph convolutions: moving beyond fingerprints
Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick
2016-01-01
Molecular “fingerprints” encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement. PMID:27558503
The influence of maintenance quality of hemodialysis machines on hemodialysis efficiency.
Azar, Ahmad Taher
2009-01-01
Several studies suggest that there is a correlation between the dose of dialysis and machine maintenance. However, in spite of current practice, there are conflicting reports regarding the relationship between the dose of dialysis or patient outcome and machine maintenance. In order to evaluate the impact of hemodialysis machine maintenance on dialysis adequacy (Kt/V) and session performance, data were processed on 134 patients on three-times-per-week dialysis regimens by dividing the patients into four groups and also dividing the hemodialysis machines into four groups according to their year of installation. The equilibrated dialysis dose (eq Kt/V), urea reduction ratio (URR), and overall equipment effectiveness (OEE) were calculated in each group to show the effect of hemodialysis machine efficiency on overall session performance. The average working time per machine per month was 270 hours. The cumulative number of hours according to the year of installation was 26,122 hours for machines installed in 1998; 21,596 hours for machines installed in 1999; 8,362 hours for those installed in 2003; and 2,486 hours for those installed in 2005. The mean time between failures (MTBF) was 1.8, 2.1, 4.2, and 6 months for machines installed in 1999, 1998, 2003, and 2005, respectively. Statistical analysis demonstrated that the dialysis dose eq Kt/V and URR increased as the overall equipment effectiveness (OEE) increased with regular maintenance procedures. Maintenance has become one of the most expedient approaches to guaranteeing high machine dependability, and the efficiency of the dialysis machine is relevant in assuring proper dialysis adequacy.
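For reference, a sketch of the named quantities under their standard definitions (the paper's exact OEE decomposition is an assumption): URR from pre- and post-dialysis urea, MTBF from operating time and failure count, and OEE as the product of availability, performance, and quality.

```python
# Standard definitions of URR, MTBF, and OEE; input values illustrative.
def urr(pre_urea, post_urea):
    """Urea reduction ratio, percent."""
    return (pre_urea - post_urea) / pre_urea * 100.0

def mtbf(operating_hours, n_failures):
    """Mean time between failures, hours."""
    return operating_hours / n_failures

def oee(availability, performance, quality):
    """Overall equipment effectiveness, assumed three-factor form."""
    return availability * performance * quality

print(f"URR  = {urr(150.0, 45.0):.1f} %")
print(f"MTBF = {mtbf(270.0 * 12, 6):.0f} hours")   # ~6 failures per year
print(f"OEE  = {oee(0.90, 0.95, 0.99):.3f}")
```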
The Diagnostics of the External Plasma for the Plasma Rocket
NASA Technical Reports Server (NTRS)
Karr, Gerald R.
1997-01-01
Three regions of plasma temperature/energy are being investigated to fully understand the behavior of the plasma created by the propulsion device and the operation of the RPA (retarding potential analyzer). Each type of plasma has an RPA associated with it, i.e., a thermal RPA, a collimated RPA, and a high-temperature RPA. Through the process of developing the thermal and collimated RPAs, the knowledge and experience have been gained not only to design a high-temperature RPA for the plasma rocket, but also to understand its operation, results, and uncertainty. After completing a literature search, reading published papers, and discussing the operation of the RPA with electric propulsion researchers, I applied the knowledge gained to the development of an RPA for thermal plasma. A thermal RPA was designed to compensate for a large Debye length and weakly ionized plasma, and was then constructed. It consists of an outer stainless steel casing, a phenolic insulator (which outgases slightly), and stainless steel mesh for the voltage screens. From the experience and knowledge gained in the development of the thermal RPA, an RPA for collimated plasma was developed. The collimated RPA was designed and constructed to compensate for a smaller Debye length and much higher ionization than that existing in the thermal plasma; it is 17% of the size of the thermal RPA. A stainless steel casing shields the detector from impinging electrons and ions. An insulating material, epoxy resin, was utilized, which has negligible outgassing. This material can be molded in styrofoam and machined quite nicely, and it is capable of withstanding moderately high temperatures. Attached to this resin insulator are inconel screens connected by silver-plated copper wire to a voltage supply. I performed all the work on the RPAs and the thermal ion source in the University of Alabama in Huntsville (UAH) engineering machine shop.
NASA Astrophysics Data System (ADS)
Robert-Perron, Etienne; Blais, Carl; Pelletier, Sylvain; Thomas, Yannig
2007-06-01
The green machining process is an interesting approach to addressing the mediocre machining behavior of high-performance powder metallurgy (PM) steels. This process appears to be a promising method for extending tool life and reducing machining costs. Recent improvements in binder/lubricant technologies have led to high-green-strength systems that enable green machining. So far, tool wear has been considered negligible when characterizing the machinability of green PM specimens. This inaccurate assumption may lead to the selection of suboptimal cutting conditions. The first part of this study involves the optimization of the machining parameters to minimize the effects of tool wear on machinability in the turning of green PM components. The second part of our work compares the sintered mechanical properties of components machined in the green state with those of others machined after sintering.
Evaluation of Process Performance for Sustainable Hard Machining
NASA Astrophysics Data System (ADS)
Rotella, Giovanna; Umbrello, Domenico; Dillon, Oscar W., Jr.; Jawahir, I. S.
This paper aims to evaluate the sustainability performance of the machining of through-hardened AISI 52100 steel, taking into account the impact of the material removal process in its various aspects. Experiments were performed under dry and cryogenic cutting conditions using chamfered cubic boron nitride (CBN) tool inserts at varying cutting conditions (cutting speed and feed rate). Cutting forces, mechanical power, tool wear, white layer thickness, surface roughness, and residual stresses were investigated in order to evaluate the effects of extreme in-process cooling on the machined surface. The results indicate that cryogenic cooling has the potential to be used for surface integrity enhancement, for improved product life, and for more sustainable functional performance.
29 CFR 570.34 - Occupations that may be performed by minors 14 and 15 years of age.
Code of Federal Regulations, 2011 CFR
2011-07-01
... comparative shopping. (e) Price marking and tagging by hand or machine, assembling orders, packing, and... machines shall mean all fixed or portable machines or tools driven by power and used or designed for...
29 CFR 570.34 - Occupations that may be performed by minors 14 and 15 years of age.
Code of Federal Regulations, 2012 CFR
2012-07-01
... comparative shopping. (e) Price marking and tagging by hand or machine, assembling orders, packing, and... machines shall mean all fixed or portable machines or tools driven by power and used or designed for...
29 CFR 570.34 - Occupations that may be performed by minors 14 and 15 years of age.
Code of Federal Regulations, 2013 CFR
2013-07-01
... comparative shopping. (e) Price marking and tagging by hand or machine, assembling orders, packing, and... machines shall mean all fixed or portable machines or tools driven by power and used or designed for...
29 CFR 570.34 - Occupations that may be performed by minors 14 and 15 years of age.
Code of Federal Regulations, 2014 CFR
2014-07-01
... comparative shopping. (e) Price marking and tagging by hand or machine, assembling orders, packing, and... machines shall mean all fixed or portable machines or tools driven by power and used or designed for...
NASA Astrophysics Data System (ADS)
Jusoh, L. I.; Sulaiman, E.; Bahrim, F. S.; Kumar, R.
2017-08-01
Recent advancements have led to the development of flux switching machines (FSMs) with flux sources within the stator. The advantage of being a single-piece machine with a robust rotor structure makes the FSM an excellent choice for speed applications. There are three categories of FSM, namely the permanent magnet (PM) FSM, the field excitation (FE) FSM, and the hybrid excitation (HE) FSM. The PMFSM and the FEFSM have the PM and the field excitation coil (FEC), respectively, as their key flux sources, while, as the name suggests, the HEFSM combines PM and FECs as flux sources. The PMFSM is a simple and cheap machine with the ability to control variable flux, which makes it suitable for an electric bicycle. Thus, this paper presents a design comparison between an inner-rotor and an outer-rotor single-phase permanent magnet flux switching machine with 8S-10P, designed specifically for an electric bicycle. The performance of this machine was validated using 2D-FEA. In conclusion, the outer rotor produces much higher torque, approximately 54.2% above that of the inner-rotor PMFSM. From the comprehensive analysis of both designs, it can be concluded that the output performance is lower than that of SRM and IPMSM machine designs, but that there is a possibility of increasing the design performance by using a deterministic optimization method.
Efficient machining of ultra precise steel moulds with freeform surfaces
NASA Astrophysics Data System (ADS)
Bulla, B.; Robertson, D. J.; Dambon, O.; Klocke, F.
2013-09-01
Ultra precision diamond turning of hardened steel to produce optical quality surfaces can be realized by applying an ultrasonic assisted process. With this technology, optical moulds typically used for injection moulding can be machined directly from steel without the requirement to overcoat the mould with a diamond-machinable material such as nickel phosphor. This has the advantage of both increasing mould tool lifetime and reducing manufacturing costs by dispensing with the relatively expensive plating process. This publication presents results we have obtained for generating freeform moulds in hardened steel by means of ultrasonic assisted diamond turning with a vibration frequency of 80 kHz. To provide a baseline with which to characterize the system performance, we perform plane cutting experiments on steel alloys of different compositions. The baseline machining results provide information on the surface roughness and on tool wear caused during machining, and we relate these to material composition. Moving on to freeform surfaces, we present a theoretical background to define the machine program parameters for generating freeforms by applying slow slide servo machining techniques. A solution for optimal part generation is introduced which forms the basis for the freeform machining experiments. The entire process chain, from the raw material through to ultra precision machining, is presented, with emphasis on maintaining surface alignment when moving a component from CNC pre-machining to final machining using ultrasonic assisted diamond turning. The freeform moulds are qualified on the basis of surface roughness measurements and a form error map comparing the machined surface with the originally defined surface. These experiments demonstrate the feasibility of efficient freeform machining applying ultrasonic assisted diamond turning of hardened steel.
Lattice-Gas Automata Fluids on Parallel Supercomputers
1993-11-23
… the Kelvin-Helmholtz shear instability, and the Von Karman vortex shedding instability. Performance of the two machines is characterized in terms of both site update rate and … (Performing organization: Phillips Laboratory, Hanscom Field, MA 01731.)
Do you remember proposing marriage to the Pepsi machine? False recollections from a campus walk.
Seamon, John G; Philbin, Morgan M; Harrison, Liza G
2006-10-01
During a campus walk, participants were given familiar or bizarre action statements (e.g., "Check the Pepsi machine for change" vs. "Propose marriage to the Pepsi machine") with instructions either to perform the actions or imagine performing the actions (Group 1) or to watch the experimenter perform the actions or imagine the experimenter performing the actions (Group 2). One day later, some actions were repeated, along with new actions, on a second walk. Two weeks later, the participants took a recognition test for actions presented during the first walk, and they specified whether a recognized action was imagined or performed. Imagining themselves or the experimenter performing familiar or bizarre actions just once led to false recollections of performance for both types of actions. This study extends previous research on imagination inflation by demonstrating that these false performance recollections can occur in a natural, real-life setting following just one imagining.
Performance prediction: A case study using a multi-ring KSR-1 machine
NASA Technical Reports Server (NTRS)
Sun, Xian-He; Zhu, Jianping
1995-01-01
While computers with tens of thousands of processors have successfully delivered high computing power for solving some of the so-called 'grand-challenge' applications, the notion of scalability is becoming an important metric in the evaluation of parallel machine architectures and algorithms. In this study, the prediction of scalability and its application are carefully investigated. A simple formula is presented to show the relation between scalability, single processor computing power, and degradation of parallelism. A case study is conducted on a multi-ring KSR-1 shared virtual memory machine. Experimental and theoretical results show that the influence of topology variation of an architecture is predictable. Therefore, the performance of an algorithm on a sophisticated, hierarchical architecture can be predicted, and the best algorithm-machine combination can be selected for a given application.
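The abstract does not reproduce the formula itself. A plausible form, assuming it follows the isospeed scalability metric associated with this line of work (an assumption, not a quotation from the paper), is:

```latex
% Isospeed scalability from p to p' processors: W is the problem size run
% on p processors, and W' is the problem size required on p' processors to
% maintain the same average unit speed (work per processor per second).
\psi(p, p') = \frac{p' \, W}{p \, W'}
```

Under this reading, degradation of parallelism appears as W' growing faster than the processor count, pushing the scalability below the ideal value of 1.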
40 CFR 60.180 - Applicability and designation of affected facility.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Primary Lead Smelters § 60.180 Applicability and designation of affected facility. (a) The...: sintering machine, sintering machine discharge end, blast furnace, dross reverberatory furnace, electric...
40 CFR 60.180 - Applicability and designation of affected facility.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Primary Lead Smelters § 60.180 Applicability and designation of affected facility. (a) The...: sintering machine, sintering machine discharge end, blast furnace, dross reverberatory furnace, electric...
A comparison of machine learning and Bayesian modelling for molecular serotyping.
Newton, Richard; Wernisch, Lorenz
2017-08-11
Streptococcus pneumoniae is a human pathogen that is a major cause of infant mortality. Identifying the pneumococcal serotype is an important step in monitoring the impact of vaccines used to protect against disease. Genomic microarrays provide an effective method for molecular serotyping. Previously we developed an empirical Bayesian model for the classification of serotypes from a molecular serotyping array. With only a few samples available, a model-driven approach was the only option. In the meantime, several thousand samples have been made available to us, providing an opportunity to investigate serotype classification by machine learning methods, which could complement the Bayesian model. We compare the performance of the original Bayesian model with two machine learning algorithms: Gradient Boosting Machines and Random Forests. We present our results as an example of a generic strategy whereby a preliminary probabilistic model is complemented or replaced by a machine learning classifier once enough data are available. Despite the availability of thousands of serotyping arrays, a problem encountered when applying machine learning methods is the lack of training data containing mixtures of serotypes, owing to the large number of possible combinations. Most of the available training data comprises samples with only a single serotype. To overcome the lack of training data we implemented an iterative analysis, creating artificial training data of serotype mixtures by combining raw data from single-serotype arrays. With the enhanced training set the machine learning algorithms outperform the original Bayesian model. However, for serotypes currently lacking sufficient training data the best performing implementation was a combination of the results of the Bayesian model and the Gradient Boosting Machine. As well as being an effective method for classifying biological data, machine learning can also be used as an efficient method for revealing subtle biological insights, which we illustrate with an example.
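The mixture-augmentation step lends itself to a short illustration. Below is a minimal Python sketch of creating artificial mixture training data from single-serotype arrays; the additive combination rule and the helper name are assumptions for illustration, since the abstract does not specify how the raw signals were combined.

```python
import numpy as np

def make_mixture_samples(single_arrays, labels, n_mixtures, seed=0):
    """Build artificial serotype-mixture training data by combining raw
    signals from randomly chosen pairs of single-serotype arrays.
    Hypothetical helper; the paper's exact combination rule is not given."""
    rng = np.random.default_rng(seed)
    X, y = [], []
    for _ in range(n_mixtures):
        i, j = rng.choice(len(single_arrays), size=2, replace=False)
        # Assumed rule: average the probe intensities of the two arrays.
        X.append((single_arrays[i] + single_arrays[j]) / 2.0)
        y.append(frozenset({labels[i], labels[j]}))  # mixture label
    return np.stack(X), y

# Usage with placeholder data: 50 single-serotype arrays of 500 probes each.
arrays = np.random.rand(50, 500)
serotypes = [f"ST{k}" for k in range(50)]
X_mix, y_mix = make_mixture_samples(arrays, serotypes, n_mixtures=200)
```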
Three-dimensionally printed biological machines powered by skeletal muscle.
Cvetkovic, Caroline; Raman, Ritu; Chan, Vincent; Williams, Brian J; Tolish, Madeline; Bajaj, Piyush; Sakar, Mahmut Selman; Asada, H Harry; Saif, M Taher A; Bashir, Rashid
2014-07-15
Combining biological components, such as cells and tissues, with soft robotics can enable the fabrication of biological machines with the ability to sense, process signals, and produce force. An intuitive demonstration of a biological machine is one that can produce motion in response to controllable external signaling. Whereas cardiac cell-driven biological actuators have been demonstrated, the requirements of these machines to respond to stimuli and exhibit controlled movement merit the use of skeletal muscle, the primary generator of actuation in animals, as a contractile power source. Here, we report the development of 3D printed hydrogel "bio-bots" with an asymmetric physical design and powered by the actuation of an engineered mammalian skeletal muscle strip to result in net locomotion of the bio-bot. Geometric design and material properties of the hydrogel bio-bots were optimized using stereolithographic 3D printing, and the effect of collagen I and fibrin extracellular matrix proteins and insulin-like growth factor 1 on the force production of engineered skeletal muscle was characterized. Electrical stimulation triggered contraction of cells in the muscle strip and net locomotion of the bio-bot with a maximum velocity of ∼156 μm s⁻¹, which is over 1.5 body lengths per minute. Modeling and simulation were used to understand both the effect of different design parameters on the bio-bot and the mechanism of motion. This demonstration advances the goal of realizing forward-engineered integrated cellular machines and systems, which can have a myriad of applications in drug screening, programmable tissue engineering, drug delivery, and biomimetic machine design.
Rotordynamic Instability Problems in High-Performance Turbomachinery
NASA Technical Reports Server (NTRS)
1984-01-01
Rotordynamics and the prediction of stability characteristics of high-performance turbomachinery are discussed. Emphasis is placed on resolving problems in the experimental validation of the forces that influence rotordynamics. Programs to predict or measure forces and force coefficients in high-performance turbomachinery are illustrated. Data for designing new machines with enhanced stability characteristics, or for upgrading existing machines, are presented.
Power electromagnetic strike machine for engineering-geological surveys
NASA Astrophysics Data System (ADS)
Usanov, K. M.; Volgin, A. V.; Chetverikov, E. A.; Kargin, V. A.; Moiseev, A. P.; Ivanova, Z. I.
2017-10-01
In dynamic soil sensing and pulsed non-explosive seismic exploration, the most common and effective excitation method is the strike, which can be delivered by pneumatic, hydraulic, or electrical strike-action machines of various structures and parameters. The creation of compact portable strike machines that do not require transportation by mechanized means is therefore important. A promising direction in the development of strike machines is the use of a pulsed electromagnetic actuator, characterized by relatively low energy consumption, relatively high specific performance and efficiency, and direct conversion of electrical energy into the mechanical work of a strike mass moving along a linear trajectory. The results of these studies allowed portable electromagnetic pulse machines, based on linear electromagnetic motors, to be developed for dynamic soil sensing and land-based pulsed seismic exploration of small depths.
Evaluating the Security of Machine Learning Algorithms
2008-05-20
Two far-reaching trends in computing have grown in significance in recent years. First, statistical machine learning has entered the mainstream as a...computing applications. The growing intersection of these trends compels us to investigate how well machine learning performs under adversarial conditions... machine learning has a structure that we can use to build secure learning systems. This thesis makes three high-level contributions. First, we develop a
24 CFR 3280.607 - Plumbing fixtures.
Code of Federal Regulations, 2014 CFR
2014-04-01
... two or more compartments, dishwashers, clothes washing machines, laundry tubs, bath tubs, and not less... for Safety Performance Specifications and Methods of Test for Safety Glazing Materials Used in...) Dishwashing machines. (i) A dishwashing machine shall not be directly connected to any waste piping, but shall...
Ergonomics for enhancing detection of machine abnormalities.
Illankoon, Prasanna; Abeysekera, John; Singh, Sarbjeet
2016-10-17
Detecting abnormal machine conditions is of great importance in an autonomous maintenance environment. Ergonomic aspects can be invaluable when detection of machine abnormalities using human senses is examined. This research outlines the ergonomic issues involved in detecting machine abnormalities and suggests how ergonomics could improve such detection. Cognitive Task Analysis was performed in a plant in Sri Lanka where Total Productive Maintenance is being implemented, to identify the sensory types that would be used to detect machine abnormalities and the relevant ergonomic characteristics. As the outcome of this research, a methodology comprising an Ergonomic Gap Analysis Matrix for machine abnormality detection is presented.
Initial planetary base construction techniques and machine implementation
NASA Technical Reports Server (NTRS)
Crockford, William W.
1987-01-01
Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.
Superconductor Armature Winding for High Performance Electrical Machines
2016-12-05
Contract Number: N00014-14-1-0272. Contract Title: Superconductor armature winding for high performance electrical machines. … an all-superconducting machine. Superconductor armature windings in electrical machines bring many design challenges that need to be addressed … (Cited reference: S. S. Kalsi, 'Superconducting Wind Turbine Generator Employing MgB2 Windings Both on Rotor and Stator', IEEE Trans., Vol. 3, pp. 489-507.)
Improved Saturated Hydraulic Conductivity Pedotransfer Functions Using Machine Learning Methods
NASA Astrophysics Data System (ADS)
Araya, S. N.; Ghezzehei, T. A.
2017-12-01
Saturated hydraulic conductivity (Ks) is one of the fundamental hydraulic properties of soils. Its measurement, however, is cumbersome, and pedotransfer functions (PTFs) are often used to estimate it instead. Despite much progress over the years, generic PTFs that estimate hydraulic conductivity generally do not perform well. We develop significantly improved PTFs by applying state-of-the-art machine learning techniques coupled with high-performance computing on a large database of over 20,000 soils (the USKSAT and Florida Soil Characterization databases). We compared the performance of four machine learning algorithms (k-nearest neighbors, gradient boosted model, support vector machine, and relevance vector machine) and evaluated the relative importance of several soil properties in explaining Ks. An attempt is also made to better account for soil structural properties; we evaluated the importance of variables derived from transformations of soil water retention characteristics and other soil properties. The gradient boosted models gave the best performance, with root mean square errors less than 0.7 and mean errors on the order of 0.01 on a log scale of Ks [cm/h]. The effective particle size, D10, was found to be the single most important predictor. Other important predictors included percent clay, bulk density, organic carbon percent, coefficient of uniformity, and values derived from water retention characteristics. Model performances were consistently better for Ks values greater than 10 cm/h. This study maximizes the extraction of information from a large database to develop generic machine-learning-based PTFs to estimate Ks. The study also evaluates the importance of various soil properties and their transformations in explaining Ks.
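As a concrete illustration of the winning approach, the sketch below fits a gradient boosted model to predict log-scale Ks from a handful of the predictors named above. The data, column meanings, and hyperparameters are placeholders, not the study's configuration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical column order: D10 (mm), clay %, bulk density, organic C %,
# coefficient of uniformity -- stand-ins for the predictors named above.
X = np.random.rand(500, 5)   # placeholder for USKSAT-style soil features
y = np.random.randn(500)     # placeholder for log10(Ks [cm/h]) targets

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
gbm = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05,
                                max_depth=4, random_state=0)
gbm.fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, gbm.predict(X_te)) ** 0.5
print(f"RMSE on log10 Ks: {rmse:.2f}")
print("relative feature importance:", gbm.feature_importances_)
```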
Zhang, Xiaodong; Zeng, Zhen; Liu, Xianlei; Fang, Fengzhou
2015-09-21
Freeform surfaces are promising as the next generation of optics; however, they require high form accuracy for excellent performance. A closed loop of fabrication-measurement-compensation is necessary to improve form accuracy. It is difficult to perform an off-machine measurement during freeform machining because remounting inaccuracy can result in significant form deviations. On the other hand, on-machine measurement may hide the systematic errors of the machine because the measuring device is placed in situ on the machine. This study proposes a new compensation strategy based on the combination of on-machine and off-machine measurement. The freeform surface is measured in off-machine mode with nanometric accuracy, and the on-machine probe achieves an accurate relative position between the workpiece and the machine after remounting. The compensation cutting path is generated according to the calculated relative position and shape errors, avoiding extra manual adjustment or a highly accurate reference-feature fixture. Experimental results verified the effectiveness of the proposed method.
NASA Astrophysics Data System (ADS)
Houborg, Rasmus; McCabe, Matthew F.
2018-01-01
With an increasing volume and dimensionality of Earth observation data, enhanced integration of machine-learning methodologies is needed to effectively analyze and utilize these information-rich datasets. In machine learning, a training dataset is required to establish explicit associations between a suite of explanatory 'predictor' variables and the target property. The specifics of this learning process can significantly influence model validity and portability, with a higher generalization level expected with an increasing number of observable conditions being reflected in the training dataset. Here we propose a hybrid training approach for leaf area index (LAI) estimation, which harnesses synergistic attributes of scattered in-situ measurements and systematically distributed physically based model inversion results to enhance the information content and spatial representativeness of the training data. To do this, a complementary training dataset of independent LAI was derived from a regularized model inversion of RapidEye surface reflectances and subsequently used to guide the development of LAI regression models via Cubist and random forests (RF) decision tree methods. The application of the hybrid training approach to a broad set of Landsat 8 vegetation index (VI) predictor variables resulted in significantly improved LAI prediction accuracies and spatial consistencies, relative to results relying on in-situ measurements alone for model training. In comparing the prediction capacity and portability of the two machine-learning algorithms, a pair of relatively simple multi-variate regression models established by Cubist performed best, with an overall relative mean absolute deviation (rMAD) of ∼11%, determined based on a stringent scene-specific cross-validation approach. In comparison, the portability of RF regression models was less effective (i.e., an overall rMAD of ∼15%), which was attributed partly to model saturation at high LAI in association with inherent extrapolation and transferability limitations. Explanatory VIs formed from bands in the near-infrared (NIR) and shortwave infrared domains (e.g., NDWI) were associated with the highest predictive ability, whereas Cubist models relying entirely on VIs based on NIR and red band combinations (e.g., NDVI) were associated with comparatively high uncertainties (i.e., rMAD ∼ 21%). The most transferable and best performing models were based on combinations of several predictor variables, which included both NDWI- and NDVI-like variables. In this process, prior screening of input VIs based on an assessment of variable relevance served as an effective mechanism for optimizing prediction accuracies from both Cubist and RF. While this study demonstrated benefit in combining data mining operations with physically based constraints via a hybrid training approach, the concept of transferability and portability warrants further investigations in order to realize the full potential of emerging machine-learning techniques for regression purposes.
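A minimal sketch of the regression step follows: a random forest trained on vegetation-index predictors with cross-validated mean absolute error, plus the importance-based screening of input VIs mentioned above. All data are synthetic stand-ins, and Cubist (a rule-based method) is not available in scikit-learn, so only the RF branch is illustrated.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Placeholder predictors standing in for Landsat 8 VIs (e.g., NDVI, NDWI);
# in the study, training LAI combines in-situ data with model-inversion LAI.
X_vi = np.random.rand(1000, 6)     # hypothetical VI matrix (samples x VIs)
y_lai = np.random.rand(1000) * 6   # hypothetical LAI targets in [0, 6]

rf = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(rf, X_vi, y_lai, cv=5,
                         scoring="neg_mean_absolute_error")
print("CV MAE:", -scores.mean())

# Variable-relevance screening, as in the study, can drop weak VIs first:
rf.fit(X_vi, y_lai)
print("VI importances:", rf.feature_importances_)
```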
New design opportunities with OVI
NASA Astrophysics Data System (ADS)
Bleikolm, Anton F.
1998-04-01
Optically Variable Ink (OVI™), chosen for its unique colour-shifting properties, is applied to the currencies of more than 50 countries. A significant colour difference at viewing angles of 90 degrees and 30 degrees, respectively, makes colour copying impossible. New manufacturing techniques for the interference pigment (OVP) provide ever better cost/performance ratios. Screen printing presses newly available on the market guarantee production speeds of 8000 sheets/hour, or 130 meters/minute in the case of web printing, perfectly in line with the traditional equipment for manufacturing currency. Specifically developed ink formulations allow UV curing at high speed, or oxidative drying, to create highly mechanically and chemically resistant colour-shifting prints. The unique colour-shifting characteristics, together with overprinting in intaglio, give design opportunities providing the best protection against colour copying or commercial reprinting. Specific designs of OVP together with high-security ingredients allow the formulation of machine-readable optically variable inks useful for the authentication and sorting of documents.
Younghak Shin; Balasingham, Ilangko
2017-07-01
Colonoscopy is a standard method for polyp screening by highly trained physicians. Polyps missed during colonoscopy are a potential risk factor for colorectal cancer. In this study, we investigate an automatic polyp classification framework. We aim to compare two different approaches: a hand-crafted feature method and a convolutional neural network (CNN) based deep learning method. Combined shape and color features are used for hand-crafted feature extraction, and a support vector machine (SVM) is adopted for classification. For the CNN approach, a deep learning framework with three convolution and pooling layers is used for classification. The proposed framework is evaluated using three public polyp databases. From the experimental results, we show that the CNN based deep learning framework achieves better classification performance than the hand-crafted feature based methods, with over 90% classification accuracy, sensitivity, specificity, and precision.
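A minimal Keras sketch of the CNN branch follows: three convolution-and-pooling stages feeding a small dense classifier. The input size, filter counts, and training settings are illustrative assumptions, not the paper's exact architecture.

```python
from tensorflow.keras import layers, models

# Minimal three-block conv/pool binary classifier (polyp vs. non-polyp).
# Input shape and filter counts are assumptions for illustration.
model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # polyp probability
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```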
NASA Astrophysics Data System (ADS)
Ding, Hao; Cao, Ming; DuPont, Andrew W.; Scott, Larry D.; Guha, Sushovan; Singhal, Shashideep; Younes, Mamoun; Pence, Isaac; Herline, Alan; Schwartz, David; Xu, Hua; Mahadevan-Jansen, Anita; Bi, Xiaohong
2016-03-01
Inflammatory bowel disease (IBD) is an idiopathic disease that is typically characterized by chronic inflammation of the gastrointestinal tract. Recently much effort has been devoted to the development of novel diagnostic tools that can assist physicians for fast, accurate, and automated diagnosis of the disease. Previous research based on Raman spectroscopy has shown promising results in differentiating IBD patients from normal screening cases. In the current study, we examined IBD patients in vivo through a colonoscope-coupled Raman system. Optical diagnosis for IBD discrimination was conducted based on full-range spectra using multivariate statistical methods. Further, we incorporated several feature selection methods in machine learning into the classification model. The diagnostic performance for disease differentiation was significantly improved after feature selection. Our results showed that improved IBD diagnosis can be achieved using Raman spectroscopy in combination with multivariate analysis and feature selection.
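The sketch below illustrates the kind of pipeline described: feature selection over spectral bands followed by an SVM, evaluated with cross-validation. The univariate ANOVA selector, linear kernel, and data are assumptions; the study compared several machine learning feature selection methods against full-range spectra.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X = np.random.rand(120, 1024)     # placeholder in-vivo Raman spectra
y = np.random.randint(0, 2, 120)  # placeholder IBD vs. normal labels

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=50),  # keep 50 most informative bands
    SVC(kernel="linear"),
)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```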
Logical Differential Prediction Bayes Net, improving breast cancer diagnosis for older women.
Nassif, Houssam; Wu, Yirong; Page, David; Burnside, Elizabeth
2012-01-01
Overdiagnosis is a phenomenon in which screening identifies cancer which may not go on to cause symptoms or death. Women over 65 who develop breast cancer bear the heaviest burden of overdiagnosis. This work introduces novel machine learning algorithms to improve the diagnostic accuracy of breast cancer in aging populations. At the same time, we aim at minimizing unnecessary invasive procedures (thus decreasing false positives) and concomitantly addressing overdiagnosis. We develop a novel algorithm, Logical Differential Prediction Bayes Net (LDP-BN), that calculates the risk of breast disease based on mammography findings. LDP-BN uses Inductive Logic Programming (ILP) to learn relational rules, selects older-specific differentially predictive rules, and incorporates them into a Bayes Net, significantly improving its performance. In addition, LDP-BN offers valuable insight into the classification process, revealing novel older-specific rules that link mass presence to invasive disease, and calcification presence with lack of detectable mass to DCIS.
ERIC Educational Resources Information Center
Deutsch, William
1992-01-01
Reviews the history of the development of the field of performance technology. Highlights include early teaching machines, instructional technology, learning theory, programed instruction, the systems approach, needs assessment, branching versus linear program formats, programing languages, and computer-assisted instruction. (LRW)
Learn more about the new source performance standards (NSPS) for surface coating of plastic parts for business machines by reading the rule summary and history and finding the code of federal regulations as well as related rules.
Computer network defense system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urias, Vincent; Stout, William M. S.; Loverro, Caleb
A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protecting the group of the virtual machines from actions performed by the adversary.
Ranganathan, Rajiv
2017-09-11
Impairment of hand and finger function after stroke is common and affects the ability to perform activities of daily living. Even though many of these coordination deficits such as finger individuation have been well characterized, it is critical to understand how stroke survivors learn to explore and reorganize their finger coordination patterns for optimizing rehabilitation. In this study, I examine the use of a body-machine interface to assess how participants explore their movement repertoire, and how this changes with continued practice. Ten participants with chronic stroke wore a data glove and the finger joint angles were mapped on to the position of a cursor on a screen. The task of the participants was to move the cursor back and forth between two specified targets on a screen. Critically, the map between the finger movements and cursor motion was altered so that participants sometimes had to generate coordination patterns that required finger individuation. There were two phases to the experiment - an initial assessment phase on day 1, followed by a learning phase (days 2-5) where participants trained to reorganize their coordination patterns. Participants showed difficulty in performing tasks which had maps that required finger individuation, and the degree to which they explored their movement repertoire was directly related to clinical tests of hand function. However, over four sessions of practice, participants were able to learn to reorganize their finger movement coordination pattern and improve their performance. Moreover, training also resulted in improvements in movement repertoire outside of the context of the specific task during free exploration. Stroke survivors show deficits in movement repertoire in their paretic hand, but facilitating movement exploration during training can increase the movement repertoire. This suggests that exploration may be an important element of rehabilitation to regain optimal function.
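The core of such a body-machine interface is a map from glove joint angles to cursor coordinates. The sketch below uses a linear map, with one variant whose rows weight single finger joints to demand individuation; the dimensions, gains, and joint indices are illustrative assumptions, as the article's actual map construction is not reproduced here.

```python
import numpy as np

n_joints = 14  # assumed number of glove joint-angle channels

# Coupled map: cursor motion driven by broad combinations of joints.
rng = np.random.default_rng(0)
A_coupled = rng.normal(scale=0.1, size=(2, n_joints))

# Individuated map: each cursor axis driven mostly by a single finger joint,
# so the task requires isolating that joint's motion.
A_individuated = np.zeros((2, n_joints))
A_individuated[0, 2] = 1.0   # cursor x from one finger joint (assumed index)
A_individuated[1, 9] = 1.0   # cursor y from a different finger joint

def cursor_position(q, A, q_rest):
    """Map joint angles (degrees) to screen coordinates, relative to a
    rest posture, via the linear map A."""
    return A @ (q - q_rest)

q_rest = np.zeros(n_joints)
q = rng.uniform(0.0, 30.0, n_joints)     # one sampled glove posture
print(cursor_position(q, A_individuated, q_rest))
```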
Yılmaz Isıkhan, Selen; Karabulut, Erdem; Alpar, Celal Reha
2016-01-01
Background/Aim. Evaluating the success of dose prediction based on genetic or clinical data has advanced substantially in recent years. The aim of this study is to predict various clinical dose values from DNA gene expression datasets using data mining techniques. Materials and Methods. Eleven real gene expression datasets containing dose values were included. First, important genes for dose prediction were selected using iterative sure independence screening. Then, the performances of regression trees (RTs), support vector regression (SVR), RT bagging, SVR bagging, and RT boosting were examined. Results. The results demonstrated that a regression-based feature selection method substantially reduced the number of irrelevant genes from the raw datasets. Overall, the best prediction performance in nine of 11 datasets was achieved using SVR; the second most accurate performance was provided by a gradient boosting machine (GBM). Conclusion. Analysis of various dose values based on microarray gene expression data identified common genes found in our study and the referenced studies. According to our findings, SVR and GBM can be good predictors of dose-gene datasets. Another result of the study was to identify a sample size of n = 25 as a cutoff point for RT bagging to outperform a single RT.
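As an illustration of the screening-plus-regression pipeline, the sketch below ranks genes by marginal correlation with the dose (a simple one-pass stand-in for iterative sure independence screening) and then fits SVR on the retained genes. The data, the number of retained genes, and the SVR settings are placeholders.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Placeholder expression matrix (samples x genes) and clinical dose values.
X = np.random.rand(60, 5000)
y = np.random.rand(60) * 10.0

# Marginal-correlation screening: keep genes most correlated with dose.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
keep = np.argsort(corr)[-100:]   # retain the top 100 genes (assumed cutoff)

svr = SVR(kernel="rbf", C=10.0)
print("CV R^2:", cross_val_score(svr, X[:, keep], y, cv=5).mean())
```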
SH2 Ligand Prediction-Guidance for In-Silico Screening.
Li, Shawn S C; Li, Lei
2017-01-01
Systematic identification of binding partners for SH2 domains is important for understanding the biological function of the corresponding SH2 domain-containing proteins. Here, we describe two different web-accessible computer programs, SMALI and DomPep, for predicting binding ligands for SH2 domains. The former was developed using a Scoring Matrix method and the latter based on the Support Vector Machine model.
Sun, Yunan; Zhou, Hui; Zhu, Hongmei; Leung, Siu-wai
2016-01-25
Sirtuin 1 (SIRT1) is a nicotinamide adenine dinucleotide-dependent deacetylase, and its dysregulation can lead to ageing, diabetes, and cancer. From 346 experimentally confirmed SIRT1 inhibitors, an inhibitor structure pattern was generated by inductive logic programming (ILP) with DMax Chemistry Assistant software. The pattern contained amide, amine, and hetero-aromatic five-membered rings, each of which had a hetero-atom and an unsubstituted atom at a distance of 2. According to this pattern, a ligand-based virtual screening of 1 444 880 active compounds from Chinese herbs identified 12 compounds as inhibitors of SIRT1. Three compounds (ZINC08790006, ZINC08792229, and ZINC08792355) had high affinity (-7.3, -7.8, and -8.6 kcal/mol, respectively) for SIRT1 as estimated by molecular docking software AutoDock Vina. This study demonstrated a use of ILP and background knowledge in machine learning to facilitate virtual screening.
Wibirama, Sunu; Nugroho, Hanung A
2017-07-01
Mobile device addiction has been an important research topic in cognitive science, mental health, and human-machine interaction. Previous works observed mobile device addiction by logging mobile device activity. Although immersion has been linked as a significant predictor of video game addiction, investigation of the addiction factors of mobile devices with behavioral measurement has not been done before. In this research, we demonstrate the use of eye tracking to observe the effect of screen size on the experience of immersion. We compared subjective judgment with eye movement analysis. Non-parametric analysis of the immersion scores shows that screen size affects the experience of immersion (p < 0.05). Furthermore, our experimental results suggest that fixational eye movements may be used as an indicator for future investigation of mobile device addiction. Our experimental results are also useful for developing a guideline as well as an intervention strategy to deal with smartphone addiction.
An intelligent identification algorithm for the monoclonal picking instrument
NASA Astrophysics Data System (ADS)
Yan, Hua; Zhang, Rongfu; Yuan, Xujun; Wang, Qun
2017-11-01
Traditional colony selection is mainly performed manually, which suffers from low efficiency and strong subjectivity. It is therefore important to develop an automatic monoclonal-picking instrument, and the critical stage of automatic monoclonal picking and intelligent optimal selection is the identification algorithm. An auto-screening algorithm based on a Support Vector Machine (SVM) is proposed in this paper. It uses supervised learning combined with colony morphological characteristics to classify colonies accurately. From the basic morphological features of a colony, the system computes a series of morphological parameters step by step. Through the establishment of a maximal-margin classifier, and based on an analysis of the growth trend of the colony, the selection of monoclonal colonies is carried out. The experimental results showed that the auto-screening algorithm could screen out the regular colonies from the others, meeting the requirements on the various parameters.
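A minimal sketch of the classification stage: an SVM trained on per-colony morphological parameters. The feature names and values are hypothetical stand-ins; the paper's actual parameter set is derived step by step from colony morphology.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical morphological features per colony: area, circularity,
# mean intensity, edge contrast (names are assumptions, not the paper's).
X_train = np.random.rand(200, 4)
y_train = np.random.randint(0, 2, 200)   # 1 = regular colony, 0 = other

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

# Screening one new colony's feature vector.
colony = np.array([[0.45, 0.92, 0.60, 0.31]])
print("pickable" if clf.predict(colony)[0] == 1 else "reject")
```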
Nagasawa, Shinji; Al-Naamani, Eman; Saeki, Akinori
2018-05-17
Owing to the diverse chemical structures, organic photovoltaic (OPV) applications with a bulk heterojunction framework have greatly evolved over the last two decades, which has produced numerous organic semiconductors exhibiting improved power conversion efficiencies (PCEs). Despite the recent fast progress in materials informatics and data science, data-driven molecular design of OPV materials remains challenging. We report a screening of conjugated molecules for polymer-fullerene OPV applications by supervised learning methods (artificial neural network (ANN) and random forest (RF)). Approximately 1000 experimental parameters including PCE, molecular weight, and electronic properties are manually collected from the literature and subjected to machine learning with digitized chemical structures. Contrary to the low correlation coefficient in ANN, RF yields an acceptable accuracy, which is twice that of random classification. We demonstrate the application of RF screening for the design, synthesis, and characterization of a conjugated polymer, which facilitates a rapid development of optoelectronic materials.
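The RF screening step can be sketched as follows: digitized structures (fingerprint bits) plus a few experimental parameters are regressed against reported PCE. All features and data here are synthetic stand-ins for the roughly 1000 literature entries, and the descriptor choice is an assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Placeholder descriptors for donor polymers: 512 fingerprint bits plus
# three assumed scalar parameters (e.g., Mw and frontier-orbital levels).
fp = np.random.randint(0, 2, size=(1000, 512)).astype(float)
scalars = np.random.rand(1000, 3)
X = np.hstack([fp, scalars])
y = np.random.rand(1000) * 12.0   # placeholder PCE values (%)

rf = RandomForestRegressor(n_estimators=500, random_state=0)
print("CV R^2:", cross_val_score(rf, X, y, cv=5).mean())
```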
Jagannadh, Veerendra Kalyan; Gopakumar, G; Subrahmanyam, Gorthi R K Sai; Gorthi, Sai Siva
2017-05-01
Each year, about 7-8 million deaths occur due to cancer around the world. More than half of cancer-related deaths occur in the less-developed parts of the world. The cancer mortality rate can be reduced with early detection and subsequent treatment of the disease. In this paper, we introduce a cost-effective and label-free approach based on microfluidic microscopy for the identification of cancerous cells. We outline a diagnostic framework and detail an instrumentation layout. We have employed classical computer vision techniques, namely 2D principal component analysis-based cell type representation followed by support vector machine-based classification. Analogous to criminal face recognition systems implemented with the help of surveillance cameras, a signature-based approach for cancerous cell identification using microfluidic microscopy surveillance is demonstrated. Such a platform would facilitate affordable mass screening camps in developing countries and therefore help decrease the cancer mortality rate.
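A minimal sketch of the 2D-PCA-plus-SVM pipeline follows: the projection matrix is learned from the image scatter matrix (the standard 2D-PCA formulation) and the projected features feed an SVM. Image size, component count, and data are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def fit_2dpca(images, n_components):
    """Learn a 2D-PCA projection: eigenvectors of the image scatter
    matrix G = mean of (A - mean)^T (A - mean) over all image matrices."""
    mean = images.mean(axis=0)
    G = sum((im - mean).T @ (im - mean) for im in images) / len(images)
    _, vecs = np.linalg.eigh(G)        # eigh returns ascending eigenvalues
    return vecs[:, -n_components:]     # projection matrix W (top components)

def transform_2dpca(images, W):
    """Project each image matrix and flatten to a feature vector."""
    return np.array([im @ W for im in images]).reshape(len(images), -1)

# Synthetic stand-ins for segmented single-cell images (64x64 grayscale).
imgs = np.random.rand(300, 64, 64)
labels = np.random.randint(0, 2, 300)  # 1 = cancerous, 0 = normal

W = fit_2dpca(imgs, n_components=8)
clf = SVC(kernel="rbf").fit(transform_2dpca(imgs, W), labels)
print(clf.predict(transform_2dpca(imgs[:5], W)))
```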
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
1997-10-01
This king-size copper disk, manufactured at the Space Optics Manufacturing and Technology Center (SOMTC) at the Marshall Space Flight Center (MSFC), is a special mold for making high resolution monitor screens. This master mold will be used to make several other molds, each capable of forming hundreds of screens that have a type of lens called a Fresnel lens. Weighing much less than conventional optics, Fresnel lenses have multiple concentric grooves, each formed to a precise angle, that together create the curvature needed to focus and project images. MSFC leads NASA's space optics manufacturing technology development as a technology leader for diamond turning. The machine used to manufacture this mold is among many one-of-a-kind pieces of equipment of MSFC's SOMTC.
NASA Astrophysics Data System (ADS)
Hsiao, Ming-Chih; Su, Ling-Huey
2018-02-01
This research addresses the problem of scheduling hybrid machine types, in which one type is a two-machine flowshop and the other is a single machine. A job is processed either on the two-machine flowshop or on the single machine. The objective is to determine a production schedule for all jobs so as to minimize the makespan. The problem is NP-hard, since the two-parallel-machines problem was proved to be NP-hard. Simulated annealing (SA) algorithms are developed to solve the problem. A mixed integer program (MIP) is developed and used to evaluate the performance of the two SAs. Computational experiments demonstrate the efficiency of the simulated annealing algorithms; their solution quality is also reported.
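A compact sketch of one such simulated annealing algorithm is shown below: a job-to-machine-type assignment is perturbed one job at a time, flowshop jobs are sequenced by Johnson's rule (optimal for a fixed assignment), and worse moves are accepted with a temperature-controlled probability. The instance data and cooling parameters are illustrative assumptions, not the paper's settings.

```python
import math
import random

random.seed(0)
# Each job j has flowshop times (a_j, b_j) and single-machine time s_j.
jobs = [(random.randint(1, 9), random.randint(1, 9), random.randint(2, 14))
        for _ in range(20)]

def makespan(assign):
    """Overall makespan: flowshop jobs in Johnson's order vs. single machine."""
    fs = sorted((j for j, f in zip(jobs, assign) if f),
                key=lambda j: j[0] if j[0] <= j[1] else 10**9 - j[1])
    t1 = t2 = 0
    for a, b, _ in fs:              # two-machine flowshop recursion
        t1 += a
        t2 = max(t2, t1) + b
    single = sum(j[2] for j, f in zip(jobs, assign) if not f)
    return max(t2, single)          # machines run in parallel

assign = [random.random() < 0.5 for _ in jobs]   # True = flowshop
cur = best = makespan(assign)
T = 50.0
while T > 0.1:
    k = random.randrange(len(jobs))
    assign[k] = not assign[k]                    # move job to other type
    m = makespan(assign)
    if m <= cur or random.random() < math.exp(-(m - cur) / T):
        cur = m
        best = min(best, m)                      # accept the move
    else:
        assign[k] = not assign[k]                # reject: undo the move
    T *= 0.995                                   # geometric cooling
print("best makespan found:", best)
```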
Intelligence-Augmented Rat Cyborgs in Maze Solving.
Yu, Yipeng; Pan, Gang; Gong, Yongyue; Xu, Kedi; Zheng, Nenggan; Hua, Weidong; Zheng, Xiaoxiang; Wu, Zhaohui
2016-01-01
Cyborg intelligence is an emerging kind of intelligence paradigm. It aims to deeply integrate machine intelligence with biological intelligence by connecting machines and living beings via neural interfaces, enhancing strength by combining the biological cognition capability with the machine computational capability. Cyborg intelligence is considered to be a new way to augment living beings with machine intelligence. In this paper, we build rat cyborgs to demonstrate how they can expedite the maze escape task with integration of machine intelligence. We compare the performance of maze solving by computer, by individual rats, and by computer-aided rats (i.e. rat cyborgs). They were asked to find their way from a constant entrance to a constant exit in fourteen diverse mazes. Performance of maze solving was measured by steps, coverage rates, and time spent. The experimental results with six rats and their intelligence-augmented rat cyborgs show that rat cyborgs have the best performance in escaping from mazes. These results provide a proof-of-principle demonstration for cyborg intelligence. In addition, our novel cyborg intelligent system (rat cyborg) has great potential in various applications, such as search and rescue in complex terrains.
Online Sequential Projection Vector Machine with Adaptive Data Mean Update.
Chen, Lin; Jia, Ji-Ting; Zhang, Qiong; Deng, Wan-Yu; Wei, Wei
2016-01-01
We propose a simple online learning algorithm, especially suited to high-dimensional data. The algorithm is referred to as online sequential projection vector machine (OSPVM), which derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters, including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes, can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, which makes it easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms, including the budgeted stochastic gradient descent (BSGD) approach, adaptive multihyperplane machine (AMM), primal estimated subgradient solver (Pegasos), online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD is performed before OSELM). The results obtained demonstrated the superior generalization performance and efficiency of the OSPVM.
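The one-by-one / chunk-by-chunk mode can be illustrated generically with scikit-learn's partial_fit interface. This is not OSPVM itself (whose joint update of projection vectors and network weights is not reproduced here), just the streaming pattern the abstract describes.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Generic chunk-by-chunk online learning on a high-dimensional stream.
clf = SGDClassifier()
classes = np.array([0, 1])          # all labels must be declared up front
for _ in range(100):                # 100 arriving chunks
    X_chunk = np.random.rand(32, 2000)      # 32 samples, 2000 features
    y_chunk = np.random.randint(0, 2, 32)
    clf.partial_fit(X_chunk, y_chunk, classes=classes)  # incremental update
print(clf.score(np.random.rand(200, 2000), np.random.randint(0, 2, 200)))
```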
Statistical machine translation for biomedical text: are we there yet?
Wu, Cuijun; Xia, Fei; Deleger, Louise; Solti, Imre
2011-01-01
In our paper we addressed the research question: "Has machine translation achieved sufficiently high quality to translate PubMed titles for patients?". We analyzed statistical machine translation output for six foreign language - English translation pairs (bi-directionally). We built a high performing in-house system and evaluated its output for each translation pair on large scale both with automated BLEU scores and human judgment. In addition to the in-house system, we also evaluated Google Translate's performance specifically within the biomedical domain. We report high performance for German, French and Spanish -- English bi-directional translation pairs for both Google Translate and our system.
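For reference, sentence-level BLEU of the kind used in such evaluations can be computed with NLTK; the title pair below is invented for illustration, and large-scale evaluation as described above aggregates over a whole test set rather than single sentences.

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Toy example: one reference (tokenized) and one machine-translated candidate.
reference = [["effect", "of", "aspirin", "on", "cardiovascular", "risk"]]
candidate = ["aspirin", "effect", "on", "cardiovascular", "risk"]

# Smoothing avoids zero scores for short sentences with missing n-grams.
score = sentence_bleu(reference, candidate,
                      smoothing_function=SmoothingFunction().method1)
print(f"BLEU: {score:.3f}")
```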
Rajaraman, Sivaramakrishnan; Antani, Sameer K; Poostchi, Mahdieh; Silamut, Kamolrat; Hossain, Md A; Maude, Richard J; Jaeger, Stefan; Thoma, George R
2018-01-01
Malaria is a blood disease caused by the Plasmodium parasites transmitted through the bite of female Anopheles mosquito. Microscopists commonly examine thick and thin blood smears to diagnose disease and compute parasitemia. However, their accuracy depends on smear quality and expertise in classifying and counting parasitized and uninfected cells. Such an examination could be arduous for large-scale diagnoses resulting in poor quality. State-of-the-art image-analysis based computer-aided diagnosis (CADx) methods using machine learning (ML) techniques, applied to microscopic images of the smears using hand-engineered features demand expertise in analyzing morphological, textural, and positional variations of the region of interest (ROI). In contrast, Convolutional Neural Networks (CNN), a class of deep learning (DL) models promise highly scalable and superior results with end-to-end feature extraction and classification. Automated malaria screening using DL techniques could, therefore, serve as an effective diagnostic aid. In this study, we evaluate the performance of pre-trained CNN based DL models as feature extractors toward classifying parasitized and uninfected cells to aid in improved disease screening. We experimentally determine the optimal model layers for feature extraction from the underlying data. Statistical validation of the results demonstrates the use of pre-trained CNNs as a promising tool for feature extraction for this purpose.
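A minimal sketch of the feature-extraction approach: a pre-trained network with its classification head removed yields a fixed-length descriptor per cell patch, and a simple classifier is trained on top. The choice of VGG19, the average pooling, and the logistic-regression head are assumptions; the study evaluates several pre-trained CNNs and layer depths.

```python
import numpy as np
from tensorflow.keras.applications import VGG19
from sklearn.linear_model import LogisticRegression

# Pre-trained CNN as a fixed feature extractor for segmented cell patches.
base = VGG19(weights="imagenet", include_top=False, pooling="avg",
             input_shape=(100, 100, 3))

X_img = np.random.rand(64, 100, 100, 3)  # placeholder cell patches in [0, 1]
y = np.random.randint(0, 2, 64)          # 1 = parasitized, 0 = uninfected

feats = base.predict(X_img, verbose=0)   # (64, 512) pooled deep features
clf = LogisticRegression(max_iter=1000).fit(feats, y)
print("training accuracy:", clf.score(feats, y))
```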
40 CFR 60.720 - Applicability and designation of affected facility.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Industrial Surface Coating: Surface Coating of Plastic Parts for Business Machines § 60.720... in which plastic parts for use in the manufacture of business machines receive prime coats, color...
V-TECS Guide for Machine Shop (Machinist).
ERIC Educational Resources Information Center
Gregory, Margaret R.; Benson, Robert T.
This curriculum guide is intended to train trade and industrial education students in the hands-on aspects of the occupation of machinist. Included in the guide are course outlines that deal with the following topics: following safety procedures; performing mathematical calculations; designing and planning machine work; performing precision…
Liang, Ja-Der; Ping, Xiao-Ou; Tseng, Yi-Ju; Huang, Guan-Tarn; Lai, Feipei; Yang, Pei-Ming
2014-12-01
Recurrence of hepatocellular carcinoma (HCC) is an important issue despite effective treatments with tumor eradication. Identification of patients who are at high risk for recurrence may provide more efficacious screening and detection of tumor recurrence. The aim of this study was to develop recurrence predictive models for HCC patients who received radiofrequency ablation (RFA) treatment. From January 2007 to December 2009, 83 newly diagnosed HCC patients receiving RFA as their first treatment were enrolled. Five feature selection methods, including genetic algorithm (GA), simulated annealing (SA), random forests (RF), and hybrid methods (GA+RF and SA+RF), were utilized to select an important subset from a total of 16 clinical features. These feature selection methods were combined with a support vector machine (SVM) to develop predictive models with better performance. Five-fold cross-validation was used to train and test the SVM models. The developed SVM-based predictive models with hybrid feature selection and 5-fold cross-validation achieved average sensitivity, specificity, accuracy, positive predictive value, negative predictive value, and area under the ROC curve of 67%, 86%, 82%, 69%, 90%, and 0.69, respectively. The SVM-derived predictive model can flag patients at high risk of recurrence, who should be closely followed up after complete RFA treatment. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
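The pipeline can be sketched as below, with a greedy sequential feature selector standing in for the GA/SA/RF-based selectors (a plainly named swap, since scikit-learn has no built-in GA selector), followed by 5-fold cross-validation of the SVM on the selected clinical features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.feature_selection import SequentialFeatureSelector

X = np.random.rand(83, 16)       # 83 patients, 16 clinical features (placeholder)
y = np.random.randint(0, 2, 83)  # recurrence within follow-up (placeholder)

svm = SVC(kernel="rbf", C=1.0)

# Greedy forward selection as a stand-in for the GA/SA/RF selectors.
selector = SequentialFeatureSelector(svm, n_features_to_select=6, cv=5)
selector.fit(X, y)
X_sel = selector.transform(X)

print("selected feature indices:", np.flatnonzero(selector.get_support()))
print("5-fold CV accuracy:", cross_val_score(svm, X_sel, y, cv=5).mean())
```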
Summary of ADTT Website Functionality and Features
NASA Technical Reports Server (NTRS)
Hawke, Veronica; Duong, Trang; Liang, Lawrence; Gage, Peter; Lawrence, Scott (Technical Monitor)
2001-01-01
This report summarizes development of the ADTT web-based design environment by the ELORET team in 2000. The Advanced Design Technology Testbed had been in development for several years, with demonstration applications restricted to aerodynamic analyses of subsonic aircraft. The key changes achieved this year were improvements in Web-based accessibility, evaluation of collaborative visualization, remote invocation of geometry updates and performance analysis, and application to aerospace system analysis. Significant effort was also devoted to post-processing of data, chiefly through comparison of similar data for alternative vehicle concepts. Such comparison is an essential requirement for designers to make informed choices between alternatives. The next section of this report provides more discussion of the goals for ADTT development. Section 3 provides screen shots from a sample session in the ADTT environment, including Login and navigation to the project of interest, data inspection, analysis execution and output evaluation. The following section provides discussion of implementation details and recommendations for future development of the software and information technologies that provide the key functionality of the ADTT system. Section 5 discusses the integration architecture for the system, which links machines running different operating systems and provides unified access to data stored in distributed locations. Security is a significant issue for this system, especially for remote access to NAS machines, so Section 6 discusses several architectural considerations with respect to security. Additional details of some aspects of ADTT development are included in Appendices.
ERIC Educational Resources Information Center
Norman, D. A.; And Others
"Machine controlled adaptive training is a promising concept. In adaptive training the task presented to the trainee varies as a function of how well he performs. In machine controlled training, adaptive logic performs a function analogous to that performed by a skilled operator." This study looks at the ways in which gain-effective time…
Analysis of NREL Cold-Drink Vending Machines for Energy Savings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deru, M.; Torcellini, P.; Bottom, K.
NREL staff, as part of Sustainable NREL, an initiative to improve the overall energy and environmental performance of the lab, decided to control how its vending machines used energy. The cold-drink vending machines across the lab were analyzed for potential energy savings opportunities. This report presents the monitoring and analysis of two energy conservation measures applied to the cold-drink vending machines at NREL.