Sample records for screening machine suitable

  1. Smart Screening System (S3) In Taconite Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daryoush Allaei; Ryan Wartman; David Tarnowski

    2006-03-01

    The conventional screening machines used in processing plants have had undesirably high noise and vibration levels. They have also had unsatisfactorily low screening efficiency, high energy consumption, high maintenance cost, low productivity, and poor worker safety. These conventional vibrating machines have been used in almost every processing plant. Most current material separation technology uses heavy and inefficient electric motors with an unbalanced rotating mass to generate the shaking. In addition to being excessively noisy, inefficient, and high-maintenance, these vibrating machines are often the bottleneck in the entire process. Furthermore, these motors, along with the vibrating machines and supporting structure, shake other machines and structures in the vicinity. The latter increases maintenance costs while reducing worker health and safety. The conventional vibrating fine screens at taconite processing plants have had the same problems as those listed above. This has resulted in lower screening efficiency, higher energy and maintenance costs, lower productivity, and worker safety concerns. The focus of this work is the design of a high-performance screening machine suitable for taconite processing plants. SmartScreens™ technology uses miniaturized motors, based on smart materials, to generate the shaking. The underlying technologies are Energy Flow Control™ and Vibration Control by Confinement™. These concepts are used to direct energy flow and confine energy efficiently and effectively to the screening function. The SmartScreens™ technology addresses problems related to noise and vibration, screening efficiency, productivity, maintenance cost, and worker safety. Successful development of SmartScreens™ technology will bring drastic changes to the screening and physical separation industry. The final designs for key components of the SmartScreens™ system have been developed. The key components include the smart motor and associated electronics, resonators, and supporting structural elements. It is shown that the smart motors have acceptable life and performance. Resonator (or motion amplifier) designs were selected based on the final system requirements and vibration characteristics. All components for a fully functional prototype have been fabricated. The development program is on schedule. The last semi-annual report described the completion of the design refinement phase. This phase resulted in a Smart Screen design that meets performance targets both in the dry condition and with taconite slurry flow using PZT motors. This system was successfully demonstrated for the DOE and partner companies at the Coleraine Mineral Research Laboratory in Coleraine, Minnesota. Since then, the fabrication of the dry application prototype (incorporating an electromagnetic drive mechanism and a new deblinding concept) has been completed and successfully tested at QRDC's lab.

  2. 3D Magnetic field modeling of a new superconducting synchronous machine using reluctance network method

    NASA Astrophysics Data System (ADS)

    Kelouaz, Moussa; Ouazir, Youcef; Hadjout, Larbi; Mezani, Smail; Lubin, Thiery; Berger, Kévin; Lévêque, Jean

    2018-05-01

    In this paper a new superconducting inductor topology intended for synchronous machine is presented. The studied machine has a standard 3-phase armature and a new kind of 2-poles inductor (claw-pole structure) excited by two coaxial superconducting coils. The air-gap spatial variation of the radial flux density is obtained by inserting a superconducting bulk, which deviates the magnetic field due to the coils. The complex geometry of this inductor usually needs 3D finite elements (FEM) for its analysis. However, to avoid a long computational time inherent to 3D FEM, we propose in this work an alternative modeling, which uses a 3D meshed reluctance network. The results obtained with the developed model are compared to 3D FEM computations as well as to measurements carried out on a laboratory prototype. Finally, a 3D FEM study of the shielding properties of the superconducting screen demonstrates the suitability of using a diamagnetic-like model of the superconducting screen.

  3. Smart Screening System (S3) In Taconite Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daryoush Allaei; Angus Morison; David Tarnowski

    2005-09-01

    The conventional screening machines used in processing plants have had undesirably high noise and vibration levels. They have also had unsatisfactorily low screening efficiency, high energy consumption, high maintenance cost, low productivity, and poor worker safety. These conventional vibrating machines have been used in almost every processing plant. Most current material separation technology uses heavy and inefficient electric motors with an unbalanced rotating mass to generate the shaking. In addition to being excessively noisy, inefficient, and high-maintenance, these vibrating machines are often the bottleneck in the entire process. Furthermore, these motors, along with the vibrating machines and supporting structure, shake other machines and structures in the vicinity. The latter increases maintenance costs while reducing worker health and safety. The conventional vibrating fine screens at taconite processing plants have had the same problems as those listed above. This has resulted in lower screening efficiency, higher energy and maintenance costs, lower productivity, and worker safety concerns. The focus of this work is the design of a high-performance screening machine suitable for taconite processing plants. SmartScreens™ technology uses miniaturized motors, based on smart materials, to generate the shaking. The underlying technologies are Energy Flow Control™ and Vibration Control by Confinement™. These concepts are used to direct energy flow and confine energy efficiently and effectively to the screening function. The SmartScreens™ technology addresses problems related to noise and vibration, screening efficiency, productivity, maintenance cost, and worker safety. Successful development of SmartScreens™ technology will bring drastic changes to the screening and physical separation industry. The final designs for key components of the SmartScreens™ system have been developed. The key components include the smart motor and associated electronics, resonators, and supporting structural elements. It is shown that the smart motors have acceptable life and performance. Resonator (or motion amplifier) designs were selected based on the final system requirements and vibration characteristics. All components for a fully functional prototype have been fabricated. The development program is on schedule. The last semi-annual report described the process of FE model validation and correlation with experimental data in terms of dynamic performance and predicted stresses. It also detailed efforts to make the supporting structure less important to system performance. Finally, an introduction to the dry application concept was presented. Since then, the design refinement phase has been completed. This has resulted in a Smart Screen design that meets performance targets both in the dry condition and with taconite slurry flow using PZT motors. Furthermore, this system was successfully demonstrated for the DOE and partner companies at the Coleraine Mineral Research Laboratory in Coleraine, Minnesota.

  4. 1001 Ways to run AutoDock Vina for virtual screening

    NASA Astrophysics Data System (ADS)

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.

  5. 1001 Ways to run AutoDock Vina for virtual screening.

    PubMed

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.
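
    The two preceding records note that an extra level of parallelization on a multi-core machine raises throughput and that the random seed must be captured for reproducibility. Below is a minimal sketch of that idea, assuming a hypothetical `vina` binary on the PATH, a `ligands/` directory of PDBQT files, and a `receptor_box.txt` configuration file; the paths, worker count, and exhaustiveness value are illustrative and not taken from the paper.

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

LIGANDS = sorted(Path("ligands").glob("*.pdbqt"))   # hypothetical ligand library

def dock(ligand: Path) -> int:
    """Dock one ligand with a single-core Vina process and a fixed seed."""
    out = Path("out") / ligand.name
    cmd = [
        "vina",
        "--config", "receptor_box.txt",      # receptor, search box, etc. (assumed file)
        "--ligand", str(ligand),
        "--out", str(out),
        "--cpu", "1",                        # one core per process ...
        "--seed", "42",                      # ... and a recorded seed for repeatability
        "--exhaustiveness", "8",             # larger values raise per-ligand runtime
    ]
    return subprocess.run(cmd, check=False).returncode

if __name__ == "__main__":
    Path("out").mkdir(exist_ok=True)
    # Second level of parallelism: several single-core Vina runs at once.
    with ProcessPoolExecutor(max_workers=8) as pool:
        codes = list(pool.map(dock, LIGANDS))
    print(f"{codes.count(0)}/{len(codes)} ligands docked without error")
```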

  6. Graph Kernels for Molecular Similarity.

    PubMed

    Rupp, Matthias; Schneider, Gisbert

    2010-04-12

    Molecular similarity measures are important for many cheminformatics applications like ligand-based virtual screening and quantitative structure-property relationships. Graph kernels are formal similarity measures defined directly on graphs, such as the (annotated) molecular structure graph. Graph kernels are positive semi-definite functions, i.e., they correspond to inner products. This property makes them suitable for use with kernel-based machine learning algorithms such as support vector machines and Gaussian processes. We review the major types of kernels between graphs (based on random walks, subgraphs, and optimal assignments, respectively), and discuss their advantages, limitations, and successful applications in cheminformatics. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
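
    The record above notes that graph kernels are positive semi-definite similarity functions and can therefore be plugged directly into kernel methods such as support vector machines. The sketch below illustrates only that mechanical point, using a Tanimoto kernel on synthetic binary fingerprints as a stand-in for a real graph kernel on molecular structure graphs; the data and labels are invented.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic binary fingerprints standing in for molecular structure graphs.
fps = rng.integers(0, 2, size=(60, 128)).astype(float)
labels = rng.integers(0, 2, size=60)            # synthetic activity labels

def tanimoto_kernel(A, B):
    """Tanimoto similarity on binary vectors; like graph kernels it is a
    positive semi-definite function, hence a valid SVM kernel."""
    inter = A @ B.T
    a = A.sum(axis=1)[:, None]
    b = B.sum(axis=1)[None, :]
    return inter / (a + b - inter)

K = tanimoto_kernel(fps, fps)                   # precomputed Gram matrix
clf = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```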

  7. Method and system for rendering and interacting with an adaptable computing environment

    DOEpatents

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  8. Subcutaneous ICD screening with the Boston Scientific ZOOM programmer versus a 12-lead ECG machine.

    PubMed

    Chang, Shu C; Patton, Kristen K; Robinson, Melissa R; Poole, Jeanne E; Prutkin, Jordan M

    2018-02-24

    The subcutaneous implantable cardioverter-defibrillator (S-ICD) requires preimplant screening to ensure appropriate sensing and reduce the risk of inappropriate shocks. Screening can be performed using either an ICD programmer or a 12-lead electrocardiogram (ECG) machine. It is unclear whether differences in signal filtering and digital sampling change the screening success rate. Subjects were recruited if they had a transvenous single-lead ICD without pacing requirements or were candidates for a new ICD. Screening was performed using both a Boston Scientific ZOOM programmer (Marlborough, MA, USA) and a General Electric MAC 5000 ECG machine (Fairfield, CT, USA). A pass was defined as having at least one lead that fit within the screening template in both supine and sitting positions. A total of 69 subjects were included, and 27 sets of ECG leads (7%) had differing screening results between the two machines. Of these sets, 22 (81%) passed using the ECG machine but failed using the programmer, and five (19%) passed using the programmer but failed using the ECG machine (P < 0.001). Four subjects (6%) passed screening using the ECG machine but failed using the programmer. No subject passed screening with the programmer but failed with the ECG machine. There can be occasional disagreement in S-ICD patient screening between an ICD programmer and an ECG machine; in this study, all subjects with discordant results passed with the ECG machine but failed using the programmer. On a per-lead basis, the ECG machine passes more subjects. It is unknown what the inappropriate shock rate would be if an S-ICD were implanted. Clinical judgment should be used in borderline cases. © 2018 Wiley Periodicals, Inc.

  9. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    PubMed

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

    Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on their structure-derived structural and physicochemical properties. Increasing attention has been directed at these methods because of their capability in predicting compounds of diverse structures and complex structure-activity relationships without requiring the knowledge of target 3D structure. This article reviews current progresses in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility to improve the performance of machine learning methods in screening large libraries is discussed.

  10. 12. BUILDING 621, INTERIOR, GROUND FLOOR, LOOKING NORTHWEST AT SCREENING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. BUILDING 621, INTERIOR, GROUND FLOOR, LOOKING NORTHWEST AT SCREENING MACHINE THAT REMOVES SHELL FRAGMENTS. METALLIC DUST REMOVED BY MAGNETIC SEPARATOR UNDERNEATH SCREEN. SAWDUST IS RETURNED TO SAWDUST HOPPER BY ELEVATOR. HOODS OVER SCREENING MACHINE AT WORKBENCH REMOVE FINE SAWDUST. - Picatinny Arsenal, 600 Area, Test Areas District, State Route 15 near I-80, Dover, Morris County, NJ

  11. [Automatic pre-transfusion serology].

    PubMed

    Wattar, B; Govaerts, A

    1975-12-01

    This paper describes an automated apparatus combining Rosenfield's and Lalezari's basic antibody screening and identification techniques. PVP-bromelin and low-ionic-strength acid Polybrene channels are used; agglutinates are decanted; the remaining cells are hemolyzed and the optical density is then measured with a colorimeter and recorded on a chart; the throughput is 40 samples an hour. This machine was also used for irregular antibody screening and identification. Sensitivity is shown to be equal to that of manual techniques for the detection of ABO, Lewis and Lutheran as well as K, S, M, Kpb, Xga, U and Vel antibodies. Nevertheless, a much greater sensitivity (titers 3 to 10 times higher) is achieved than with manual techniques for the detection of Rh, k, S and Fya antibodies. The Polybrene channel is suitable for anti-Rh, Duffy, I and M detection; the bromelin channel, however, has greater sensitivity for other specificities. Anti-M and anti-N sera from rabbits were shown to be non-specific when used with this machine. Over almost 15,000 sera tested, no antibody detected by manual techniques escaped the automated screening. This antibody detection machine was applied to compatibility tests prior to transfusion (21,480 units were tested, intended for transfusion to 5,611 patients). A third channel, PVP without bromelin, was run in parallel so as not to miss any anti-M, even a weak one. The serum distributor was slaved to the cell distributor so that the whole procedure was automated. Furthermore, each serum was tested not only against the red cells to be transfused but also against the patient's own red cells and against two selected red cell panels, so as to ensure irregular antibody detection at the same time. Using this machine, 3 to 4% of the cell samples were rejected, i.e. more than with usual techniques. All manually detected antibodies were identified, as well as some others that showed only weak reactions by classical techniques. Complete results can be obtained within 20 to 30 minutes, which is quite rapid compared to techniques using, for example, antiglobulin tests.

  12. Smart material screening machines using smart materials and controls

    NASA Astrophysics Data System (ADS)

    Allaei, Daryoush; Corradi, Gary; Waigand, Al

    2002-07-01

    The objective of this project is to address the specific need for improvements in the efficiency and effectiveness of physical separation technologies in the screening area. Currently, the mining industry uses approximately 33 billion kW-hr of electrical energy per year for physical separations, costing 1.65 billion dollars at 5 cents per kW-hr. Even though screening and size separation are not the single most energy-intensive process in the mining industry, they are often the major bottleneck in the whole process. Improvements in this area offer tremendous potential in both energy savings and production improvements. Additionally, the vibrating screens used in mining processing plants are the most costly areas from the maintenance and worker health and safety points of view. The goal of this project is to reduce energy use in the screening and total processing areas. This goal is accomplished by developing an innovative screening machine based on smart materials and smart actuators, namely a smart screen that uses an advanced sensory system to continuously monitor the screening process and make appropriate adjustments to improve production. The theory behind the development of Smart Screen technology is based on two key technologies, namely smart actuators and smart Energy Flow Control™ (EFC™) strategies, developed initially for military applications. Smart Screen technology controls the flow of vibration energy and confines it to the screen rather than shaking much of the mass that makes up a conventional vibratory screening machine. Consequently, Smart Screens eliminate or downsize many of the structural components associated with conventional vibratory screening machines. As a result, the surface area of the screen increases for a given envelope. This increase in usable screening surface area extends the life of the screens, reduces required maintenance by reducing the frequency of screen change-outs, and improves throughput and productivity.

  13. Automation of Underground Cable Laying Equipment Using PLC and Hmi

    NASA Astrophysics Data System (ADS)

    Mal Kothari, Kesar; Samba, Vishweshwar; Tania, Kinza; Udayakumar, R., Dr; Karthikeyan, Ram, Dr

    2018-04-01

    Underground cable laying is an alternative to overhead cable laying for telecommunication and power transmission lines. It has become very popular in recent times because of some of its advantages over overhead cable laying. This type of cable laying is mostly practiced in developed countries because it is more expensive than overhead cable laying. Underground cable laying is more suitable when land is not available, and it also improves aesthetics. This paper implements automation of a manually operated cable-pulling winch machine using a programmable logic controller (PLC). Winch machines are useful in underground cable laying. The main aim of the project is to replace all the mechanical functions with electrical controls operated through a touch screen (HMI). The idea is that the machine should shift between the parallel and series circuits automatically based on the sensed pressure, instead of the solenoid valve being operated manually. The traditional means of throttling the engine using a lever and wire is replaced with a linear actuator. Proximity, pressure and load sensors are used to provide input to the system. The HMI displays the speed, length and tension of the rope being wound. Ladder logic is used to program the PLC.

  14. High-throughput screening of chemicals as functional ...

    EPA Pesticide Factsheets

    Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer
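
    The record above describes building random-forest QSUR classifiers on structural and physicochemical descriptors and then screening a chemical library at an 80% predicted-probability threshold. A minimal sketch of that workflow is below; the descriptors, labels, and library are purely synthetic stand-ins for the curated functional-use data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic structural/physicochemical descriptors for chemicals whose
# harmonized function category is known (1 = has the function, 0 = does not).
X_train = rng.random((300, 12))
y_train = (X_train[:, 0] > 0.5).astype(int)     # toy label rule

clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

# Screen a descriptor library and keep chemicals predicted to serve the
# function with probability >= 0.8, mirroring the 80% cut-off in the text.
library = rng.random((1000, 12))
proba = clf.predict_proba(library)[:, 1]
candidates = np.flatnonzero(proba >= 0.8)
print(f"{candidates.size} candidate functional substitutes")
```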

  15. Prostate Cancer Probability Prediction By Machine Learning Technique.

    PubMed

    Jović, Srđan; Miljković, Milica; Ivanović, Miljan; Šaranović, Milena; Arsić, Milena

    2017-11-26

    The main goal of the study was to explore the possibility of prostate cancer prediction by machine learning techniques. In order to improve the survival probability of prostate cancer patients it is essential to build suitable prediction models of prostate cancer. If one makes a relevant prediction of prostate cancer it is easy to devise suitable treatment based on the prediction results. Machine learning techniques are the most common techniques for creating predictive models. Therefore, in this study several machine learning techniques were applied and compared. The obtained results were analyzed and discussed. It was concluded that machine learning techniques could be used for relevant prediction of prostate cancer.

  16. Pointright: a system to redirect mouse and keyboard control among multiple machines

    DOEpatents

    Johanson, Bradley E [Palo Alto, CA; Winograd, Terry A [Stanford, CA; Hutchins, Gregory M [Mountain View, CA

    2008-09-30

    The present invention provides a software system, PointRight, that allows for smooth and effortless control of pointing and input devices among multiple displays. With PointRight, a single free-floating mouse and keyboard can be used to control multiple screens. When the cursor reaches the edge of a screen it seamlessly moves to the adjacent screen and keyboard control is simultaneously redirected to the appropriate machine. Laptops may also redirect their keyboard and pointing device, and multiple pointers are supported simultaneously. The system automatically reconfigures itself as displays go on, go off, or change the machine they display.

  17. Preparation of superhydrophobic copper surface by a novel silk-screen printing aided electrochemical machining method

    NASA Astrophysics Data System (ADS)

    Yan, X. Y.; Chen, G. X.; Liu, J. W.

    2018-03-01

    A kind of superhydrophobic copper surface with micro-nanocomposite structure has been successfully fabricated by employing a silk-screen printing aided electrochemical machining method. At first silk-screen printing technology has been used to form a column point array mask, and then the microcolumn array would be fabricated by electrochemical machining (ECM) effect. In this study, the drop contact angles have been studied and scanning electron microscopy (SEM) has been used to study the surface characteristic of the workpiece. The experiment results show that the micro-nanocomposite structure with cylindrical array can be successfully fabricated on the metal surface. And the maximum contact angle is 151° when the fluoroalkylsilane ethanol solution was used to modify the machined surface in this study.

  18. 49 CFR 214.507 - Required safety equipment for new on-track roadway maintenance machines.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... glass, or other material with similar properties, if the machine is designed with a windshield. Each new... wipers or suitable alternatives that provide the machine operator an equivalent level of vision if...

  19. Spectrophores as one-dimensional descriptors calculated from three-dimensional atomic properties: applications ranging from scaffold hopping to multi-target virtual screening.

    PubMed

    Gladysz, Rafaela; Dos Santos, Fabio Mendes; Langenaeker, Wilfried; Thijs, Gert; Augustyns, Koen; De Winter, Hans

    2018-03-07

    Spectrophores are novel descriptors that are calculated from the three-dimensional atomic properties of molecules. In our current implementation, the atomic properties that were used to calculate spectrophores include atomic partial charges, atomic lipophilicity indices, atomic shape deviations and atomic softness properties. This approach can easily be widened to also include additional atomic properties. Our novel methodology finds its roots in the experimental affinity fingerprinting technology developed in the 1990's by Terrapin Technologies. Here we have translated it into a purely virtual approach using artificial affinity cages and a simplified metric to calculate the interaction between these cages and the atomic properties. A typical spectrophore consists of a vector of 48 real numbers. This makes it highly suitable for the calculation of a wide range of similarity measures for use in virtual screening and for the investigation of quantitative structure-activity relationships in combination with advanced statistical approaches such as self-organizing maps, support vector machines and neural networks. In our present report we demonstrate the applicability of our novel methodology for scaffold hopping as well as virtual screening.

  20. A New Type of Tea Baking Machine Based on Pro/E Design

    NASA Astrophysics Data System (ADS)

    Lin, Xin-Ying; Wang, Wei

    2017-11-01

    In this paper, the production process of wulong tea is discussed, mainly the effect of baking on tea quality. The suitable baking temperatures for different teas are introduced. Based on Pro/E, a new type of baking machine suitable for wulong tea was designed. The working principle, mechanical structure and constant-temperature timing intelligent control system of the baking machine are expounded. Finally, the characteristics and innovations of the new baking machine are discussed. The mechanical structure of this baking machine is simpler and more reasonable, and it can reuse the heat of the inlet and outlet air, making it more energy saving and environmentally friendly. The temperature control part adopts fuzzy PID control, which can improve the accuracy and response speed of temperature control and reduce the dependence of the baking operation on skilled experience.
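
    The record above states that the oven's temperature loop uses fuzzy PID control. The sketch below shows only a plain discrete PID loop driving a crude first-order oven model; the fuzzy adjustment of the gains described in the paper is not reproduced, and all constants are illustrative rather than taken from the paper.

```python
# Minimal discrete PID temperature-control loop against a toy oven model.
def simulate(setpoint=110.0, kp=2.0, ki=0.1, kd=1.0, dt=1.0, steps=300):
    temp, integral, prev_err = 25.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        derivative = (err - prev_err) / dt
        # PID output, clamped to the 0-100% heater power range.
        heater = max(0.0, min(100.0, kp * err + ki * integral + kd * derivative))
        # Crude oven: heating proportional to heater power, loss to ambient air.
        temp += dt * (0.05 * heater - 0.02 * (temp - 25.0))
        prev_err = err
    return temp

print("temperature after 5 minutes:", round(simulate(), 1), "degC")
```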

  1. Machine Learning Based Malware Detection

    DTIC Science & Technology

    2015-05-18

    A Trident Scholar project report (No. 440): "Machine Learning Based Malware Detection," by Midshipman 1/C Zane A. Markel, USN. ... suitably be projected into realistic performance. This work explores several aspects of machine learning based malware detection. First, we ...

  2. Application of Electro Chemical Machining for materials used in extreme conditions

    NASA Astrophysics Data System (ADS)

    Pandilov, Z.

    2018-03-01

    Electro-Chemical Machining (ECM) is the generic term for a variety of electrochemical processes. ECM is used to machine workpieces from metals and metal alloys irrespective of their hardness, strength or thermal properties, through anodic dissolution, in the aerospace, automotive, construction, medical equipment, micro-systems and power supply industries. Electro-Chemical Machining is extremely suitable for machining materials used in extreme conditions. A general overview of Electro-Chemical Machining and its application to different materials used in extreme conditions is presented.

  3. Tear fluid proteomics multimarkers for diabetic retinopathy screening

    PubMed Central

    2013-01-01

    Background The aim of the project was to develop a novel method for diabetic retinopathy screening based on the examination of tear fluid biomarker changes. In order to evaluate the usability of protein biomarkers for pre-screening purposes several different approaches were used, including machine learning algorithms. Methods All persons involved in the study had diabetes. Diabetic retinopathy (DR) was diagnosed by capturing 7-field fundus images, evaluated by two independent ophthalmologists. 165 eyes were examined (from 119 patients), 55 were diagnosed healthy and 110 images showed signs of DR. Tear samples were taken from all eyes and state-of-the-art nano-HPLC coupled ESI-MS/MS mass spectrometry protein identification was performed on all samples. Applicability of protein biomarkers was evaluated by six different optimally parameterized machine learning algorithms: Support Vector Machine, Recursive Partitioning, Random Forest, Naive Bayes, Logistic Regression, K-Nearest Neighbor. Results Out of the six investigated machine learning algorithms the result of Recursive Partitioning proved to be the most accurate. The performance of the system realizing the above algorithm reached 74% sensitivity and 48% specificity. Conclusions Protein biomarkers selected and classified with machine learning algorithms alone are at present not recommended for screening purposes because of low specificity and sensitivity values. This tool can be potentially used to improve the results of image processing methods as a complementary tool in automatic or semiautomatic systems. PMID:23919537
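
    The record above compares six optimally parameterized classifiers on tear-fluid protein biomarkers and reports sensitivity and specificity. A sketch of such a comparison with scikit-learn could look like the following; the data and labels are synthetic stand-ins for the proteomics measurements, and a decision tree is used as a stand-in for recursive partitioning.

```python
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((165, 30))            # hypothetical per-eye protein biomarker levels
y = rng.integers(0, 2, size=165)     # 1 = diabetic retinopathy, 0 = healthy (synthetic)

models = {
    "SVM": SVC(),
    "Tree (recursive partitioning)": DecisionTreeClassifier(),
    "Random forest": RandomForestClassifier(),
    "Naive Bayes": GaussianNB(),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "k-NN": KNeighborsClassifier(),
}
for name, model in models.items():
    pred = cross_val_predict(model, X, y, cv=5)
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    print(f"{name}: sensitivity={tp/(tp+fn):.2f} specificity={tn/(tn+fp):.2f}")
```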

  4. Virtual screening by a new Clustering-based Weighted Similarity Extreme Learning Machine approach

    PubMed Central

    Kudisthalert, Wasu

    2018-01-01

    Machine learning techniques are becoming popular in virtual screening tasks. One of the powerful machine learning algorithms is Extreme Learning Machine (ELM) which has been applied to many applications and has recently been applied to virtual screening. We propose the Weighted Similarity ELM (WS-ELM) which is based on a single layer feed-forward neural network in a conjunction of 16 different similarity coefficients as activation function in the hidden layer. It is known that the performance of conventional ELM is not robust due to random weight selection in the hidden layer. Thus, we propose a Clustering-based WS-ELM (CWS-ELM) that deterministically assigns weights by utilising clustering algorithms i.e. k-means clustering and support vector clustering. The experiments were conducted on one of the most challenging datasets–Maximum Unbiased Validation Dataset–which contains 17 activity classes carefully selected from PubChem. The proposed algorithms were then compared with other machine learning techniques such as support vector machine, random forest, and similarity searching. The results show that CWS-ELM in conjunction with support vector clustering yields the best performance when utilised together with Sokal/Sneath(1) coefficient. Furthermore, ECFP_6 fingerprint presents the best results in our framework compared to the other types of fingerprints, namely ECFP_4, FCFP_4, and FCFP_6. PMID:29652912
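
    The record above builds on the Extreme Learning Machine: a single hidden layer with randomly assigned input weights and output weights solved in closed form. A minimal conventional ELM on synthetic fingerprint-like data is sketched below; the paper's WS-ELM and CWS-ELM variants replace the hidden-layer activation with similarity coefficients and assign the hidden weights by clustering, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy binary classification data: rows are fingerprint-like feature vectors.
X = rng.random((200, 64))
y = (X[:, :8].sum(axis=1) > 4).astype(float)      # arbitrary synthetic labels

n_hidden = 50
W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights (never trained)
b = rng.standard_normal(n_hidden)                 # random biases

def hidden(X):
    # Conventional ELM uses a sigmoid here; WS-ELM swaps in similarity
    # coefficients (e.g. Tanimoto) computed against reference compounds.
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

H = hidden(X)
beta = np.linalg.pinv(H) @ y       # output weights via Moore-Penrose pseudo-inverse

scores = hidden(X) @ beta          # continuous scores; threshold at 0.5 for labels
print("training accuracy:", ((scores > 0.5) == y).mean())
```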

  5. A deep learning and novelty detection framework for rapid phenotyping in high-content screening

    PubMed Central

    Sommer, Christoph; Hoefler, Rudolf; Samwer, Matthias; Gerlich, Daniel W.

    2017-01-01

    Supervised machine learning is a powerful and widely used method for analyzing high-content screening data. Despite its accuracy, efficiency, and versatility, supervised machine learning has drawbacks, most notably its dependence on a priori knowledge of expected phenotypes and time-consuming classifier training. We provide a solution to these limitations with CellCognition Explorer, a generic novelty detection and deep learning framework. Application to several large-scale screening data sets on nuclear and mitotic cell morphologies demonstrates that CellCognition Explorer enables discovery of rare phenotypes without user training, which has broad implications for improved assay development in high-content screening. PMID:28954863

  6. Antibiotic Residues in Milk from Three Popular Kenyan Milk Vending Machines.

    PubMed

    Kosgey, Amos; Shitandi, Anakalo; Marion, Jason W

    2018-05-01

    Milk vending machines (MVMs) are growing in popularity in Kenya and worldwide. Milk vending machines dispense varying quantities of locally sourced, pasteurized milk. The Kenya Dairy Board has a regulatory framework, but surveillance is weak because of several factors. Milk vending machines' milk is not routinely screened for antibiotics, thereby increasing potential for antibiotic misuse. To investigate, a total of 80 milk samples from four commercial providers ( N = 25), street vendors ( N = 21), and three MVMs ( N = 34) were collected and screened in Eldoret, Kenya. Antibiotic residue surveillance occurred during December 2016 and January 2017 using Idexx SNAP ® tests for tetracyclines, sulfamethazine, beta-lactams, and gentamicin. Overall, 24% of MVM samples and 24% of street vendor samples were presumably positive for at least one antibiotic. No commercial samples were positive. Research into cost-effective screening methods and increased monitoring by food safety agencies are needed to uphold hazard analysis and critical control point for improving antibiotic stewardship throughout the Kenyan private dairy industry.

  7. Natural speech algorithm applied to baseline interview data can predict which patients will respond to psilocybin for treatment-resistant depression.

    PubMed

    Carrillo, Facundo; Sigman, Mariano; Fernández Slezak, Diego; Ashton, Philip; Fitzgerald, Lily; Stroud, Jack; Nutt, David J; Carhart-Harris, Robin L

    2018-04-01

    Natural speech analytics has seen some improvements over recent years, and this has opened a window for objective and quantitative diagnosis in psychiatry. Here, we used a machine learning algorithm applied to natural speech to ask whether language properties measured before psilocybin treatment for treatment-resistant depression can predict for which patients it will be effective and for which it will not. A baseline autobiographical memory interview was conducted and transcribed. Patients with treatment-resistant depression received 2 doses of psilocybin, 10 mg and 25 mg, 7 days apart. Psychological support was provided before, during and after all dosing sessions. Quantitative speech measures were applied to the interview data from 17 patients and 18 untreated age-matched healthy control subjects. A machine learning algorithm was used to classify between controls and patients and predict treatment response. Speech analytics and machine learning successfully differentiated depressed patients from healthy controls and identified treatment responders from non-responders with a significant accuracy of 85% (75% precision). Automatic natural language analysis was used to predict effective response to treatment with psilocybin, suggesting that these tools offer a highly cost-effective facility for screening individuals for treatment suitability and sensitivity. The sample size was small and replication is required to strengthen inferences on these results. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Comparing and Validating Machine Learning Models for Mycobacterium tuberculosis Drug Discovery.

    PubMed

    Lane, Thomas; Russo, Daniel P; Zorn, Kimberley M; Clark, Alex M; Korotcov, Alexandru; Tkachenko, Valery; Reynolds, Robert C; Perryman, Alexander L; Freundlich, Joel S; Ekins, Sean

    2018-04-26

    Tuberculosis is a global health dilemma. In 2016, the WHO reported 10.4 million incident cases and 1.7 million deaths. The need to develop new treatments for those infected with Mycobacterium tuberculosis (Mtb) has led to many large-scale phenotypic screens and many thousands of new active compounds identified in vitro. However, with limited funding, efforts to discover new active molecules against Mtb need to be more efficient. Several computational machine learning approaches have been shown to have good enrichment and hit rates. We have curated small molecule Mtb data and developed new models with a total of 18,886 molecules with activity cutoffs of 10 μM, 1 μM, and 100 nM. These data sets were used to evaluate different machine learning methods (including deep learning) and metrics and to generate predictions for additional molecules published in 2017. One Mtb model, a combined in vitro and in vivo data Bayesian model at a 100 nM activity cutoff, yielded the following metrics for 5-fold cross validation: accuracy = 0.88, precision = 0.22, recall = 0.91, specificity = 0.88, kappa = 0.31, and MCC = 0.41. We have also curated an evaluation set (n = 153 compounds) published in 2017, and when used to test our model, it showed comparable statistics (accuracy = 0.83, precision = 0.27, recall = 1.00, specificity = 0.81, kappa = 0.36, and MCC = 0.47). We have also compared these models with additional machine learning algorithms, showing that Bayesian machine learning models constructed with literature Mtb data generated by different laboratories were generally equivalent to or outperformed deep neural networks with external test sets. Finally, we have also compared our training and test sets to show they were suitably diverse and different in order to represent useful evaluation sets. Such Mtb machine learning models could help prioritize compounds for testing in vitro and in vivo.

  9. The Role of Automata and Machine Theory in School and College Mathematics Syllabuses.

    ERIC Educational Resources Information Center

    Holcombe, M.

    1981-01-01

    The introduction of certain topics in the theory of machines and languages into school and college mathematics courses in place of the more usual discussion of groups and formal logic is proposed. Examples of machines and languages and their interconnections suitable for such courses are outlined. (MP)

  10. Prosthetic EMG control enhancement through the application of man-machine principles

    NASA Technical Reports Server (NTRS)

    Simcox, W. A.

    1977-01-01

    An area in medicine that appears suitable for the application of man-machine principles is rehabilitation research, particularly when the motor aspects of the body are involved. If one considers the limb, whether functional or not, as the machine, the brain as the controller and the neuromuscular system as the man-machine interface, the human body is reduced to a man-machine system that can benefit from the principles behind such systems. The area of rehabilitation that this paper deals with is that of an arm amputee and his prosthetic device. Reducing this area to its man-machine basics, the problem becomes one of attaining natural multiaxis prosthetic control using electromyographic activity (EMG) as the means of communication between man and prosthesis. In order to use EMG as the communication channel it must be amplified and processed to yield a high-information signal suitable for control. The most common processing scheme employed is termed Mean Value Processing. This technique for extracting the useful EMG signal consists of a differential-to-single-ended conversion of the surface activity, followed by rectification and smoothing.

  11. Study of Tool Wear Mechanisms and Mathematical Modeling of Flank Wear During Machining of Ti Alloy (Ti6Al4V)

    NASA Astrophysics Data System (ADS)

    Chetan; Narasimhulu, A.; Ghosh, S.; Rao, P. V.

    2015-07-01

    The machinability of titanium is poor due to its low thermal conductivity and high chemical affinity. The low thermal conductivity of titanium alloy is undesirable from the cutting tool's perspective, as it causes extensive tool wear. The main task of this work is to predict the various wear mechanisms involved during machining of Ti alloy (Ti6Al4V) and to formulate an analytical mathematical tool wear model for the same. It has been found from various experiments that adhesive and diffusion wear are the dominant wear mechanisms during machining of Ti alloy with a PVD-coated tungsten carbide tool. It is also clear from the experiments that the tool wear increases with increases in cutting parameters such as speed, feed and depth of cut. The wear model was validated by carrying out dry machining of Ti alloy under suitable cutting conditions. It has been found that the wear model is able to predict flank wear suitably under gentle cutting conditions.

  12. Optical alignment of electrodes on electrical discharge machines

    NASA Technical Reports Server (NTRS)

    Boissevain, A. G.; Nelson, B. W.

    1972-01-01

    Shadowgraph system projects magnified image on screen so that alignment of small electrodes mounted on electrical discharge machines can be corrected and verified. Technique may be adapted to other machine tool equipment where physical contact cannot be made during inspection and access to tool limits conventional runout checking procedures.

  13. Web-based newborn screening system for metabolic diseases: machine learning versus clinicians.

    PubMed

    Chen, Wei-Hsin; Hsieh, Sheau-Ling; Hsu, Kai-Ping; Chen, Han-Ping; Su, Xing-Yu; Tseng, Yi-Ju; Chien, Yin-Hsiu; Hwu, Wuh-Liang; Lai, Feipei

    2013-05-23

    A hospital information system (HIS) that integrates screening data and interpretation of the data is routinely requested by hospitals and parents. However, the accuracy of disease classification may be low because of the disease characteristics and the analytes used for classification. The objective of this study is to describe a system that enhanced the neonatal screening system of the Newborn Screening Center at the National Taiwan University Hospital. The system was designed and deployed according to a service-oriented architecture (SOA) framework under the Web services .NET environment. The system consists of sample collection, testing, diagnosis, evaluation, treatment, and follow-up services among collaborating hospitals. To improve the accuracy of newborn screening, machine learning and optimal feature selection mechanisms were investigated for screening newborns for inborn errors of metabolism. The framework of the Newborn Screening Hospital Information System (NSHIS) used the embedded Health Level Seven (HL7) standards for data exchanges among heterogeneous platforms integrated by Web services in the C# language. In this study, machine learning classification was used to predict phenylketonuria (PKU), hypermethioninemia, and 3-methylcrotonyl-CoA-carboxylase (3-MCC) deficiency. The classification methods used 347,312 newborn dried blood samples collected at the Center between 2006 and 2011. Of these, 220 newborns had values over the diagnostic cutoffs (positive cases) and 1557 had values that were over the screening cutoffs but did not meet the diagnostic cutoffs (suspected cases). The original 35 analytes and the manifested features were ranked based on F score, then combinations of the top 20 ranked features were selected as input features to support vector machine (SVM) classifiers to obtain optimal feature sets. These feature sets were tested using 5-fold cross-validation and optimal models were generated. The datasets collected in 2011 were used as predicting cases. The feature selection strategies were implemented and the optimal markers for PKU, hypermethioninemia, and 3-MCC deficiency were obtained. The results of the machine learning approach were compared with the cutoff scheme. The number of false positive cases was reduced from 21 to 2 for PKU, from 30 to 10 for hypermethioninemia, and from 209 to 46 for 3-MCC deficiency. This SOA Web service-based newborn screening system can accelerate screening procedures effectively and efficiently. An SVM learning methodology for PKU, hypermethioninemia, and 3-MCC deficiency metabolic diseases classification, including optimal feature selection strategies, is presented. By adopting the results of this study, the number of suspected cases could be reduced dramatically.

  14. Web-Based Newborn Screening System for Metabolic Diseases: Machine Learning Versus Clinicians

    PubMed Central

    Chen, Wei-Hsin; Hsu, Kai-Ping; Chen, Han-Ping; Su, Xing-Yu; Tseng, Yi-Ju; Chien, Yin-Hsiu; Hwu, Wuh-Liang; Lai, Feipei

    2013-01-01

    Background A hospital information system (HIS) that integrates screening data and interpretation of the data is routinely requested by hospitals and parents. However, the accuracy of disease classification may be low because of the disease characteristics and the analytes used for classification. Objective The objective of this study is to describe a system that enhanced the neonatal screening system of the Newborn Screening Center at the National Taiwan University Hospital. The system was designed and deployed according to a service-oriented architecture (SOA) framework under the Web services .NET environment. The system consists of sample collection, testing, diagnosis, evaluation, treatment, and follow-up services among collaborating hospitals. To improve the accuracy of newborn screening, machine learning and optimal feature selection mechanisms were investigated for screening newborns for inborn errors of metabolism. Methods The framework of the Newborn Screening Hospital Information System (NSHIS) used the embedded Health Level Seven (HL7) standards for data exchanges among heterogeneous platforms integrated by Web services in the C# language. In this study, machine learning classification was used to predict phenylketonuria (PKU), hypermethioninemia, and 3-methylcrotonyl-CoA-carboxylase (3-MCC) deficiency. The classification methods used 347,312 newborn dried blood samples collected at the Center between 2006 and 2011. Of these, 220 newborns had values over the diagnostic cutoffs (positive cases) and 1557 had values that were over the screening cutoffs but did not meet the diagnostic cutoffs (suspected cases). The original 35 analytes and the manifested features were ranked based on F score, then combinations of the top 20 ranked features were selected as input features to support vector machine (SVM) classifiers to obtain optimal feature sets. These feature sets were tested using 5-fold cross-validation and optimal models were generated. The datasets collected in 2011 were used as predicting cases. Results The feature selection strategies were implemented and the optimal markers for PKU, hypermethioninemia, and 3-MCC deficiency were obtained. The results of the machine learning approach were compared with the cutoff scheme. The number of false positive cases was reduced from 21 to 2 for PKU, from 30 to 10 for hypermethioninemia, and from 209 to 46 for 3-MCC deficiency. Conclusions This SOA Web service–based newborn screening system can accelerate screening procedures effectively and efficiently. An SVM learning methodology for PKU, hypermethioninemia, and 3-MCC deficiency metabolic diseases classification, including optimal feature selection strategies, is presented. By adopting the results of this study, the number of suspected cases could be reduced dramatically. PMID:23702487
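
    The two preceding records rank the 35 analytes by F score, keep combinations of the top-ranked features, and feed them to an SVM evaluated with 5-fold cross-validation. A compact sketch of that pipeline is below; it uses scikit-learn's ANOVA F statistic as a stand-in for the paper's F-score ranking, and random numbers in place of the dried-blood-spot analyte data.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical stand-in for the 35 screening analytes (e.g. amino acid and
# acylcarnitine concentrations); labels mark screen-positive newborns.
X = rng.random((500, 35))
y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)

# Rank analytes by ANOVA F score, keep the top 20, then classify with an SVM,
# evaluated by 5-fold cross-validation as described in the records.
model = make_pipeline(SelectKBest(f_classif, k=20), SVC(kernel="rbf", C=1.0))
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```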

  15. Microgravity simulations with human lymphocytes in the free fall machine and in the random positioning machine

    NASA Technical Reports Server (NTRS)

    Schwarzenberg, M.; Pippia, P.; Meloni, M. A.; Cossu, G.; Cogoli-Greuter, M.; Cogoli, A.

    1998-01-01

    The purpose of this paper is to present the results obtained in our laboratory with both instruments, the FFM [free fall machine] and the RPM [random positioning machine], to compare them with the data from earlier experiments with human lymphocytes conducted in the FRC [fast rotating clinostat] and in space. Furthermore, the suitability of the FFM and RPM for research in gravitational cell biology is discussed.

  16. Pocket-sized versus standard ultrasound machines in abdominal imaging.

    PubMed

    Tse, K H; Luk, W H; Lam, M C

    2014-06-01

    The pocket-sized ultrasound machine has emerged as an invaluable tool for quick assessment in emergency and general practice settings. It is suitable for instant and quick assessment in cardiac imaging. However, its applicability in the imaging of other body parts has yet to be established. In this pictorial review, we compared the performance of the pocket-sized ultrasound machine against the standard ultrasound machine in terms of image quality in common abdominal pathology.

  17. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    NASA Astrophysics Data System (ADS)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

    A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising to apply not only to gas storage in MOFs but in many other material science projects.

  18. An experimental study of cutting performances in machining of nimonic super alloy GH2312

    NASA Astrophysics Data System (ADS)

    Du, Jinfu; Wang, Xi; Xu, Min; Mao, Jin; Zhao, Xinglong

    2018-05-01

    Nimonic superalloys are extensively used in the aerospace industry because of their unique properties. As they are quite costly and difficult to machine, the cutting tool wears easily. To address this problem, an experiment was carried out on a numerically controlled automatic slitting lathe to analyze the tool wear conditions and the parts' surface quality of nimonic super alloy GH2132 under different cutters. The selection of a suitable cutter, reasonable cutting data and cutting speed is obtained and some conclusions are drawn. An excellent coated tool, compared with other hard alloy cutters, along with suitable cutting data, will greatly improve production efficiency and product quality; it can fully meet the machining requirements of nimonic super alloy GH2312.

  19. Readability, suitability, and health content assessment of web-based patient education materials on colorectal cancer screening.

    PubMed

    Tian, Chenlu; Champlin, Sara; Mackert, Michael; Lazard, Allison; Agrawal, Deepak

    2014-08-01

    Colorectal cancer (CRC) screening rates in the United States are still below target levels. Web-based patient education materials are used by patients and providers to provide supplemental information on CRC screening. Low literacy levels and patient perceptions are significant barriers to screening. There are few data on the quality of these online materials from a health literacy standpoint or on whether they address patients' perceptions. To evaluate the readability, suitability, and health content of web-based patient education materials on colon cancer screening. Descriptive study. Web-based patient materials. Twelve reputable and popular online patient education materials were evaluated. Readability was measured by using the Flesch-Kincaid Reading Grade Level, and suitability was determined by the Suitability Assessment of Materials, a scale that considers characteristics such as content, graphics, layout/typography, and learning stimulation. Health content was evaluated within the framework of the Health Belief Model, a behavioral model that relates patients' perceptions of susceptibility to disease, severity, and benefits and barriers to their medical decisions. Each material was scored independently by 3 reviewers. Flesch-Kincaid Reading Grade Level score, Suitability Assessment of Materials score, health content score. Readability for 10 of 12 materials surpassed the maximum recommended sixth-grade reading level. Five were at the 10th grade level or above. Only 1 of 12 materials received a superior suitability score; 3 materials received inadequate scores. Health content analysis revealed that only 50% of the resources discussed CRC risk in the general population and <25% specifically addressed patients at high risk, such as African Americans, smokers, patients with diabetes, and obese patients. For perceived barriers to screening, only 8.3% of resources discussed embarrassment, 25% discussed pain with colonoscopy, 25% addressed cost of colonoscopy, and none specifically mentioned the need to get colonoscopy when no symptoms are present. No material discussed the social benefits of screening. Descriptive design. Most online patient education materials for CRC screening are written beyond the recommended sixth-grade reading level, with suboptimal suitability. Health content is lacking in addressing key perceived risks, barriers, and benefits to CRC screening. Developing more appropriate and targeted patient education resources on CRC may improve patient understanding and promote screening. Copyright © 2014 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.

  20. Screening Electronic Health Record-Related Patient Safety Reports Using Machine Learning.

    PubMed

    Marella, William M; Sparnon, Erin; Finley, Edward

    2017-03-01

    The objective of this study was to develop a semiautomated approach to screening cases that describe hazards associated with the electronic health record (EHR) from a mandatory, population-based patient safety reporting system. Potentially relevant cases were identified through a query of the Pennsylvania Patient Safety Reporting System. A random sample of cases was manually screened for relevance and divided into training, testing, and validation data sets to develop a machine learning model. This model was used to automate screening of the remaining potentially relevant cases. Of the 4 algorithms tested, a naive Bayes kernel performed best, with an area under the receiver operating characteristic curve of 0.927 ± 0.023, accuracy of 0.855 ± 0.033, and F score of 0.877 ± 0.027. The machine learning model and text mining approach described here are useful tools for identifying and analyzing adverse event and near-miss reports. Although reporting systems are beginning to incorporate structured fields on health information technology and the EHR, these methods can identify related events that reporters classify in other ways. These methods can facilitate analysis of legacy safety reports by retrieving health information technology-related and EHR-related events from databases without fields and controlled values focused on this subject and distinguishing them from reports in which the EHR is mentioned only in passing. Machine learning and text mining are useful additions to the patient safety toolkit and can be used to semiautomate screening and analysis of unstructured text in safety reports from frontline staff.
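
    The following sketch illustrates the general screening approach described above, a bag-of-words classifier that flags free-text safety reports as EHR-related, using scikit-learn's multinomial naive Bayes rather than the authors' naive Bayes kernel implementation. The example reports, labels, and pipeline choices are placeholders.

```python
# Minimal sketch (not the authors' pipeline): screen free-text safety reports
# for EHR relevance with a bag-of-words naive Bayes classifier.
# The example reports and labels below are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, f1_score

reports = [
    "order entered on wrong patient in the EHR",
    "medication list failed to update after interface downtime",
    "patient fell while walking to the bathroom",
    "wrong-site surgery narrowly avoided during timeout",
]
labels = [1, 1, 0, 0]  # 1 = EHR-related, 0 = not

X_train, X_test, y_train, y_test = train_test_split(
    reports, labels, test_size=0.5, random_state=0, stratify=labels)

vec = TfidfVectorizer(ngram_range=(1, 2))
clf = MultinomialNB()
clf.fit(vec.fit_transform(X_train), y_train)

probs = clf.predict_proba(vec.transform(X_test))[:, 1]
preds = (probs >= 0.5).astype(int)
print("AUC:", roc_auc_score(y_test, probs), "F1:", f1_score(y_test, preds))
```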

  1. MX Siting Investigation. Geotechnical Evaluation Conterminous United States. Volume I. Coarse Screening.

    DTIC Science & Technology

    1977-06-01

    the screening process, and the number of unit siting regions of 5000 nm² contained in each. The highest ranked suitable areas occur in the Basin and Range province. Approximately 70 percent of total suitable area occurs in the Basin and Range, Great Plains, and Central Lowlands physiographic provinces of the western and...

  2. Applying Sparse Machine Learning Methods to Twitter: Analysis of the 2012 Change in Pap Smear Guidelines. A Sequential Mixed-Methods Study.

    PubMed

    Lyles, Courtney Rees; Godbehere, Andrew; Le, Gem; El Ghaoui, Laurent; Sarkar, Urmimala

    2016-06-10

    It is difficult to synthesize the vast amount of textual data available from social media websites. Capturing real-world discussions via social media could provide insights into individuals' opinions and the decision-making process. We conducted a sequential mixed methods study to determine the utility of sparse machine learning techniques in summarizing Twitter dialogues. We chose a narrowly defined topic for this approach: cervical cancer discussions over a 6-month time period surrounding a change in Pap smear screening guidelines. We applied statistical methodologies, known as sparse machine learning algorithms, to summarize Twitter messages about cervical cancer before and after the 2012 change in Pap smear screening guidelines by the US Preventive Services Task Force (USPSTF). All messages containing the search terms "cervical cancer," "Pap smear," and "Pap test" were analyzed during: (1) January 1-March 13, 2012, and (2) March 14-June 30, 2012. Topic modeling was used to discern the most common topics from each time period and determine the singular value criterion for each topic. The results from the top 10 relevant topics were then qualitatively coded to determine the efficiency of the clustering method in grouping distinct ideas, and how the discussion differed before vs. after the change in guidelines. This machine learning method was effective in grouping the relevant discussion topics about cervical cancer during the respective time periods (~20% overall irrelevant content in both time periods). Qualitative analysis determined that a significant portion of the top discussion topics in the second time period directly reflected the USPSTF guideline change (eg, "New Screening Guidelines for Cervical Cancer"), and many topics in both time periods addressed basic screening promotion and education (eg, "It is Cervical Cancer Awareness Month! Click the link to see where you can receive a free or low cost Pap test."). It was demonstrated that machine learning tools can be useful in cervical cancer prevention and screening discussions on Twitter. This method allowed us to show that significant publicly available information about cervical cancer screening exists on social media sites. Moreover, we observed a direct impact of the guideline change within the Twitter messages.
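
    To make the topic-modeling step more tangible, the sketch below applies a generic matrix-factorization topic model (scikit-learn's NMF on TF-IDF features) to a handful of invented tweets and prints the top terms per topic. The study's actual sparse machine learning algorithms differ; this is only an illustration of the workflow.

```python
# Illustrative sketch only: the study used custom sparse machine learning
# algorithms; here scikit-learn's NMF on TF-IDF features stands in as a
# generic sparse topic model. The tweets below are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

tweets = [
    "new screening guidelines for cervical cancer released today",
    "uspstf changes pap smear screening interval to every three years",
    "it is cervical cancer awareness month get a free pap test",
    "click the link for a low cost pap smear near you",
    "ask your doctor how often you need a pap test",
    "hpv vaccination also helps prevent cervical cancer",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(tweets)
nmf = NMF(n_components=3, init="nndsvda", random_state=0)
W = nmf.fit_transform(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(nmf.components_):
    top = [terms[i] for i in topic.argsort()[::-1][:4]]
    print(f"topic {k}: {', '.join(top)}")
```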

  3. Machines and Human Beings in the Movies

    ERIC Educational Resources Information Center

    van der Laan, J. M.

    2006-01-01

    Over the years, many movies have presented on-screen a struggle between machines and human beings. Typically, the machines have come to rule and threaten the existence of humanity. They must be conquered to ensure the survival of and to secure the freedom of the human race. Although these movies appear to expose the dangers of an autonomous and…

  4. Re-designing a mechanism for higher speed: A case history from textile machinery

    NASA Astrophysics Data System (ADS)

    Douglas, S. S.; Rooney, G. T.

    A key issue in the generation of general mechanism design software, the formulation of suitable objective functions, is discussed. There is a consistent drive towards higher speeds in the development of industrial sewing machines. This led to experimental analyses of dynamic performance and to a search for improved design methods. The experimental work highlighted the need for smoothness of motion at high speed and the importance of component inertias and frame structural stiffness. Smoothness is associated with transmission properties and harmonic analysis. These requirements are added to other design requirements of synchronization, mechanism size, and function. Some of the mechanism trains in overedge sewing machines are shown. All these trains are designed by digital optimization. The design software combines analysis of the sewing machine mechanisms, formulation of objectives in numerical terms, and suitable mathematical optimization techniques.

  5. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be absolutely suitable to distinguish masses and microcalcifications from the background tissue using morphological operators and then extract them through machine learning techniques and a clustering algorithm for intensity-based segmentation.

  6. Adaptation of existing infrared technologies to unanticipated applications

    NASA Astrophysics Data System (ADS)

    Peng, Philip

    2005-01-01

    Radiation thermometry is just one of many applications, both potential and realized, of infrared technology. During the SARS (Severe Acute Respiratory Syndrome) global crisis in 2003, the technology was utilized as a preliminary screening method for infected persons as a defense against a major outbreak, as the primary symptom of this disease is elevated body temperature. ATC promptly developed a product designed specifically for mass-volume crowd screening of febrile individuals. For this application, the machine must register the temperature of subjects rapidly and efficiently, with a certain degree of accuracy, and function for extended periods of time. The equipment must be safe to use, easily deployed, and able to function with minimal maintenance. The ATIR-303 model satisfies these and other prerequisite conditions well. Studies were performed on the correlation between the maximum temperature registered among an individual's facial features, as measured under the conditions of use, and the individual's core temperature. The results demonstrated that the ATIR-303 is very suitable for this application. Other applications of infrared technology in various areas, such as medical diagnosis, non-destructive testing, security, and search and rescue, are also areas of interest to ATC. The progress ATC has achieved in these areas is also presented.

  7. A bio-inspired approach for the design of a multifunctional robotic end-effector customized for automated maintenance of a reconfigurable vibrating screen.

    PubMed

    Makinde, O A; Mpofu, K; Vrabic, R; Ramatsetse, B I

    2017-01-01

    The development of a robotic-driven maintenance solution capable of automatically maintaining a reconfigurable vibrating screen (RVS) machine utilized in dangerous and hazardous underground mining environments has called for the design of a multifunctional robotic end-effector capable of carrying out all the maintenance tasks on the RVS machine. In view of this, the paper presents a bio-inspired approach that describes the design of a novel multifunctional robotic end-effector embedded with mechanical and control mechanisms capable of automatically maintaining the RVS machine. To achieve this, therblig and morphological methodologies (which classify the motions and actions required by the robotic end-effector in carrying out RVS machine maintenance tasks), obtained from a detailed analogy of how a human being (i.e., a machine maintenance manager) would carry out different maintenance tasks on the RVS machine, were used to obtain the maintenance objective functions or goals of the multifunctional robotic end-effector as well as the maintenance activity constraints of the RVS machine that must be adhered to by the multifunctional robotic end-effector during machine maintenance. The results of the therblig and morphological analyses of five (5) different maintenance tasks capture and classify one hundred and thirty-four (134) repetitive motions and fifty-four (54) functions required in automating the maintenance tasks of the RVS machine. Based on these findings, a worm-gear mechanism embedded with fingers extruded with hexagonal-shaped heads, capable of carrying out the "gripping and ungrasping" and "loosening and bolting" functions of the robotic end-effector, and an electric cylinder actuator module, capable of carrying out its "unpinning and hammering" functions, were integrated to produce the customized multifunctional robotic end-effector capable of automatically maintaining the RVS machine. The axial forces ([Formula: see text] and [Formula: see text]), normal forces ([Formula: see text]) and total load [Formula: see text] acting on the teeth of the worm-gear module of the multifunctional robotic end-effector during the gripping of worn-out or new RVS machine subsystems, which are 978.547, 1245.06 and 1016.406 N, respectively, were satisfactory. The nominal bending and torsional stresses acting on the shoulder of the socket module of the multifunctional robotic end-effector during the loosening and tightening of bolts, which are 1450.72 and 179.523 MPa, respectively, were satisfactory. The hammering and unpinning forces utilized by the electric cylinder actuator module of the multifunctional robotic end-effector during the unpinning and hammering of screen panel pins out of and into the screen panels were satisfactory.

  8. The influence of negative training set size on machine learning-based virtual screening.

    PubMed

    Kurczab, Rafał; Smusz, Sabina; Bojarski, Andrzej J

    2014-01-01

    The paper presents a thorough analysis of the influence of the number of negative training examples on the performance of machine learning methods. The impact of this rather neglected aspect of machine learning methods application was examined for sets containing a fixed number of positive and a varying number of negative examples randomly selected from the ZINC database. An increase in the ratio of positive to negative training instances was found to greatly influence most of the investigated evaluating parameters of ML methods in simulated virtual screening experiments. In a majority of cases, substantial increases in precision and MCC were observed in conjunction with some decreases in hit recall. The analysis of dynamics of those variations let us recommend an optimal composition of training data. The study was performed on several protein targets, 5 machine learning algorithms (SMO, Naïve Bayes, Ibk, J48 and Random Forest) and 2 types of molecular fingerprints (MACCS and CDK FP). The most effective classification was provided by the combination of CDK FP with SMO or Random Forest algorithms. The Naïve Bayes models appeared to be hardly sensitive to changes in the number of negative instances in the training set. In conclusion, the ratio of positive to negative training instances should be taken into account during the preparation of machine learning experiments, as it might significantly influence the performance of a particular classifier. What is more, the optimization of negative training set size can be applied as a boosting-like approach in machine learning-based virtual screening.
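
    The core experiment, fixing the positive set and varying the number of randomly drawn negatives while tracking precision, recall, and MCC, can be sketched as below. Random bit vectors stand in for MACCS/CDK fingerprints, a random forest stands in for the five algorithms tested, and all numbers are placeholders.

```python
# Sketch of the experimental idea (not the paper's code): keep the positive
# set fixed, vary the number of randomly drawn negatives, and watch precision,
# recall and MCC change. Random bit vectors stand in for real fingerprints.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score, matthews_corrcoef

rng = np.random.default_rng(0)

def make_set(n, p_on):
    return (rng.random((n, 166)) < p_on).astype(int)  # 166-bit "fingerprints"

actives = make_set(200, 0.35)        # fixed positive class
for n_neg in (200, 1000, 5000):      # growing negative class
    decoys = make_set(n_neg, 0.25)
    X = np.vstack([actives, decoys])
    y = np.array([1] * len(actives) + [0] * len(decoys))
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(n_neg, precision_score(y_te, pred), recall_score(y_te, pred),
          matthews_corrcoef(y_te, pred))
```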

  9. The influence of negative training set size on machine learning-based virtual screening

    PubMed Central

    2014-01-01

    Background The paper presents a thorough analysis of the influence of the number of negative training examples on the performance of machine learning methods. Results The impact of this rather neglected aspect of machine learning methods application was examined for sets containing a fixed number of positive and a varying number of negative examples randomly selected from the ZINC database. An increase in the ratio of positive to negative training instances was found to greatly influence most of the investigated evaluating parameters of ML methods in simulated virtual screening experiments. In a majority of cases, substantial increases in precision and MCC were observed in conjunction with some decreases in hit recall. The analysis of dynamics of those variations let us recommend an optimal composition of training data. The study was performed on several protein targets, 5 machine learning algorithms (SMO, Naïve Bayes, Ibk, J48 and Random Forest) and 2 types of molecular fingerprints (MACCS and CDK FP). The most effective classification was provided by the combination of CDK FP with SMO or Random Forest algorithms. The Naïve Bayes models appeared to be hardly sensitive to changes in the number of negative instances in the training set. Conclusions In conclusion, the ratio of positive to negative training instances should be taken into account during the preparation of machine learning experiments, as it might significantly influence the performance of particular classifier. What is more, the optimization of negative training set size can be applied as a boosting-like approach in machine learning-based virtual screening. PMID:24976867

  10. High-throughput, label-free, single-cell, microalgal lipid screening by machine-learning-equipped optofluidic time-stretch quantitative phase microscopy.

    PubMed

    Guo, Baoshan; Lei, Cheng; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Jiang, Yiyue; Tanaka, Yo; Ozeki, Yasuyuki; Goda, Keisuke

    2017-05-01

    The development of reliable, sustainable, and economical sources of alternative fuels to petroleum is required to tackle the global energy crisis. One such alternative is microalgal biofuel, which is expected to play a key role in reducing the detrimental effects of global warming as microalgae absorb atmospheric CO2 via photosynthesis. Unfortunately, conventional analytical methods only provide population-averaged lipid amounts and fail to characterize a diverse population of microalgal cells with single-cell resolution in a non-invasive and interference-free manner. Here high-throughput label-free single-cell screening of lipid-producing microalgal cells with optofluidic time-stretch quantitative phase microscopy was demonstrated. In particular, Euglena gracilis, an attractive microalgal species that produces wax esters (suitable for biodiesel and aviation fuel after refinement) within lipid droplets, was investigated. The optofluidic time-stretch quantitative phase microscope is based on an integration of a hydrodynamic-focusing microfluidic chip, an optical time-stretch quantitative phase microscope, and a digital image processor equipped with machine learning. As a result, it provides both the opacity and phase maps of every single cell at a high throughput of 10,000 cells/s, enabling accurate cell classification without the need for fluorescent staining. Specifically, the dataset was used to characterize heterogeneous populations of E. gracilis cells under two different culture conditions (nitrogen-sufficient and nitrogen-deficient) and achieve the cell classification with an error rate of only 2.15%. The method holds promise as an effective analytical tool for microalgae-based biofuel production. © 2017 International Society for Advancement of Cytometry.

  11. Dry Ribbon for Heated Head Automated Fiber Placement

    NASA Technical Reports Server (NTRS)

    Hulcher, A. Bruce; Marchello, Joseph M.; Hinkley, Jeffrey A.; Johnston, Norman J.; Lamontia, Mark A.

    2000-01-01

    Ply-by-ply in situ processes involving automated heated head deposition are being developed for fabrication of high performance, high temperature composite structures from low volatile content polymer matrices. This technology requires (1) dry carbon fiber towpreg, (2) consolidation of towpreg to quality, placement-grade unidirectional ribbon or tape, and (3) rapid, in situ, accurate, ply-by-ply robotic placement and consolidation of this material to fabricate a composite structure. In this study, the physical properties of a candidate thermoplastic ribbon, PIXA/IM7, were evaluated and screened for suitability in robotic placement. Specifically, towpreg was prepared from PIXA powder. Various conditions (temperatures) were used to convert the powder-coated towpreg to ribbons with varying degrees of processability. Ribbon within preset specifications was fabricated at 3 temperatures: 390, 400 and 410 C. Ribbon was also produced out-of-spec by purposely overheating the material to a processing temperature of 450 C. Automated placement equipment at Cincinnati Milacron and NASA Langley was used to fabricate laminates from these experimental ribbons. Ribbons were placed at 405 and 450 C by both sets of equipment. Double cantilever beam and wedge peel tests were used to determine the quality of the laminates and, especially, the interlaminar bond formed during the placement process. Ribbon made under conditions expected to be non-optimal (overheated) resulted in poor placeability and composites with weak interlaminar bond strengths, regardless of placement conditions. Ribbon made under conditions expected to be ideal showed good processability and produced well-consolidated laminates. Results were consistent from machine to machine and demonstrated the importance of ribbon quality in heated-head placement of dry material forms. Preliminary screening criteria for the development and evaluation of ribbon from new matrix materials were validated.

  12. Rapid and Accurate Machine Learning Recognition of High Performing Metal Organic Frameworks for CO2 Capture.

    PubMed

    Fernandez, Michael; Boyd, Peter G; Daff, Thomas D; Aghaji, Mohammad Zein; Woo, Tom K

    2014-09-04

    In this work, we have developed quantitative structure-property relationship (QSPR) models using advanced machine learning algorithms that can rapidly and accurately recognize high-performing metal organic framework (MOF) materials for CO2 capture. More specifically, QSPR classifiers have been developed that can, in a fraction of a second, identify candidate MOFs with enhanced CO2 adsorption capacity (>1 mmol/g at 0.15 bar and >4 mmol/g at 1 bar). The models were tested on a large set of 292,050 MOFs that were not part of the training set. The QSPR classifier could recover 945 of the top 1000 MOFs in the test set while flagging only 10% of the whole library for compute-intensive screening. Thus, using the machine learning classifiers as part of a high-throughput screening protocol would result in an order of magnitude reduction in compute time and allow intractably large structure libraries and search spaces to be screened.
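
    A minimal sketch of a QSPR-style classifier of this kind is shown below: geometric descriptors are used to flag structures predicted to exceed an uptake threshold, so that only the flagged fraction proceeds to compute-intensive screening. The descriptors, synthetic data, and model choice are assumptions for illustration, not the authors' feature set.

```python
# Hedged sketch of a QSPR-style classifier: geometric descriptors (void
# fraction, surface area, pore size) are used to flag MOFs predicted to exceed
# a CO2 uptake threshold. Descriptor names and data here are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 2000
void_fraction = rng.uniform(0.3, 0.95, n)
surface_area = rng.uniform(200, 6000, n)     # m^2/g
pore_diameter = rng.uniform(3, 25, n)        # angstrom
X = np.column_stack([void_fraction, surface_area, pore_diameter])

# Synthetic "uptake" just to give the classifier a learnable signal.
uptake = (2.5 * void_fraction + 0.0003 * surface_area
          - 0.05 * pore_diameter + rng.normal(0, 0.2, n))
y = (uptake > 1.0).astype(int)   # 1 = promising (>1 mmol/g at 0.15 bar)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```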

  13. Surface structuring of boron doped CVD diamond by micro electrical discharge machining

    NASA Astrophysics Data System (ADS)

    Schubert, A.; Berger, T.; Martin, A.; Hackert-Oschätzchen, M.; Treffkorn, N.; Kühn, R.

    2018-05-01

    Boron doped diamond materials, which are generated by Chemical Vapor Deposition (CVD), offer great potential for application to highly stressed tools, e.g., in cutting or forming processes. As a result of the CVD process, rough surfaces arise that require a finishing treatment, in particular for application in forming tools. Cutting techniques such as milling and grinding are hardly applicable for the finish machining because of the high strength of diamond. Due to its process principle of ablating material by melting and evaporating, Electrical Discharge Machining (EDM) is independent of the hardness, brittleness or toughness of the workpiece material. EDM is a suitable technology for machining and structuring CVD diamond, since boron doped CVD diamond is electrically conductive. In this study the ablation characteristics of boron doped CVD diamond in micro electrical discharge machining are investigated. Experiments were carried out to investigate the influence of different process parameters on the machining result. The impact of tool polarity, voltage and discharge energy on the resulting erosion geometry and the tool wear was analyzed. A variation in path overlapping during the erosion of planar areas leads to different microstructures. The results show that micro EDM is a suitable technology for the finishing of boron doped CVD diamond.

  14. Classification of lung cancer histology by gold nanoparticle sensors

    PubMed Central

    Barash, Orna; Peled, Nir; Tisch, Ulrike; Bunn, Paul A.; Hirsch, Fred R.; Haick, Hossam

    2016-01-01

    We propose a nanomedical device for the classification of lung cancer (LC) histology. The device profiles volatile organic compounds (VOCs) in the headspace of (subtypes of) LC cells, using gold nanoparticle (GNP) sensors that are suitable for detecting LC-specific patterns of VOC profiles, as determined by gas chromatography–mass spectrometry analysis. Analyzing the GNP sensing signals by support vector machine allowed significant discrimination between (i) LC and healthy cells; (ii) small cell LC and non–small cell LC; and between (iii) two subtypes of non–small cell LC: adenocarcinoma and squamous cell carcinoma. The discriminative power of the GNP sensors was then linked with the chemical nature and composition of the headspace VOCs of each LC state. These proof-of-concept findings could totally revolutionize LC screening and diagnosis, and might eventually allow early and differential diagnosis of LC subtypes with detectable or unreachable lung nodules. PMID:22033081

  15. A study on the application of voice interaction in automotive human machine interface experience design

    NASA Astrophysics Data System (ADS)

    Huang, Zhaohui; Huang, Xiemin

    2018-04-01

    This paper first introduces the trend toward integrating multi-channel interaction in automotive HMI (Human Machine Interface), starting from the complex information models faced by existing automotive HMI, and describes various interaction modes. By comparing voice interaction with touch screens, gestures and other interaction modes, the potential and feasibility of voice interaction in automotive HMI experience design are established. The related theories of voice interaction, recognition technologies, human cognitive models of voice and voice design methods are then explored, and the research priority of this paper is proposed: how to design voice interaction that creates more humane task-oriented dialogue scenarios to enhance the interactive experience of automotive HMI. The specific driving scenarios suitable for the use of voice interaction are studied and classified, and usability principles and key elements for automotive HMI voice design are proposed according to the scenario features. Through a user-participatory usability testing experiment, the dialogue processes of voice interaction in automotive HMI are defined. The logics and grammars in voice interaction are classified according to the experimental results, and the mental models in the interaction processes are analyzed. Finally, a voice interaction design method for creating humane task-oriented dialogue scenarios in the driving environment is proposed.

  16. Man-machine communication - A transparent switchboard for computers

    NASA Technical Reports Server (NTRS)

    Rasmussen, H.

    1971-01-01

    Device uses pattern of transparent contact touch points that are put on cathode ray tube screen. Touch point system compels more precise and unambiguous communication between man and machine than is possible with any other means, and speeds up operation responses.

  17. RHE: A JVM Courseware

    ERIC Educational Resources Information Center

    Liu, S.; Tang, J.; Deng, C.; Li, X.-F.; Gaudiot, J.-L.

    2011-01-01

    Java Virtual Machine (JVM) education has become essential in training embedded software engineers as well as virtual machine researchers and practitioners. However, due to the lack of suitable instructional tools, it is difficult for students to obtain any kind of hands-on experience and to attain any deep understanding of JVM design. To address…

  18. Fog Machines, Vapors, and Phase Diagrams

    ERIC Educational Resources Information Center

    Vitz, Ed

    2008-01-01

    A series of demonstrations is described that elucidate the operation of commercial fog machines by using common laboratory equipment and supplies. The formation of fogs, or "mixing clouds", is discussed in terms of the phase diagram for water and other chemical principles. The demonstrations can be adapted for presentation suitable for elementary…

  19. Machine Shop I. Oklahoma Trade and Industrial Education.

    ERIC Educational Resources Information Center

    Dunn, James

    Designed to provide the basic knowledge and hands-on skills necessary to prepare job-ready machinist trainees, these instructional materials focus on the following areas of trade and industrial education: orientation, basic and related technology, hand and bench work, and power saws and drilling machines. Suitable for use in secondary,…

  20. Harmonic reduction of Direct Torque Control of six-phase induction motor.

    PubMed

    Taheri, A

    2016-07-01

    In this paper, a new switching method in Direct Torque Control (DTC) of a six-phase induction machine for the reduction of current harmonics is introduced. Selecting a suitable vector in each sampling period is the usual approach in the ST-DTC drive of a six-phase induction machine. The six-phase induction machine has 64 voltage vectors, divided into four groups. In the proposed DTC method, the suitable voltage vectors are selected from two vector groups. By a suitable selection of two vectors in each sampling period, the harmonic amplitude is decreased further in comparison to that of the ST-DTC drive. The harmonic losses are further reduced and the electromechanical energy is decreased, while the switching loss shows only a small increase. Spectrum analysis of the phase current in the standard and new switching-table DTC of the six-phase induction machine, with determination of the amplitude of each harmonic, is presented in this paper. The proposed method requires a shorter sampling time than the ordinary method. Harmonic analyses of the current at low and high speed show the performance of the presented method. The simplicity of the proposed method and its implementation without any extra hardware are further advantages. The simulation and experimental results show the superiority of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  1. A tubular flux-switching permanent magnet machine

    NASA Astrophysics Data System (ADS)

    Wang, J.; Wang, W.; Clark, R.; Atallah, K.; Howe, D.

    2008-04-01

    The paper describes a novel tubular, three-phase permanent magnet brushless machine, which combines salient features from both switched reluctance and permanent magnet machine technologies. It has no end windings and zero net radial force and offers a high power density and peak force capability, as well as the potential for low manufacturing cost. It is, therefore, eminently suitable for a variety of applications, ranging from free-piston energy converters to active vehicle suspensions.

  2. Evaluation of machine learning algorithms for improved risk assessment for Down's syndrome.

    PubMed

    Koivu, Aki; Korpimäki, Teemu; Kivelä, Petri; Pahikkala, Tapio; Sairanen, Mikko

    2018-05-04

    Prenatal screening generates a great amount of data that is used for predicting risk of various disorders. Prenatal risk assessment is based on multiple clinical variables and overall performance is defined by how well the risk algorithm is optimized for the population in question. This article evaluates machine learning algorithms to improve the performance of first trimester screening for Down syndrome. Machine learning algorithms pose an adaptive alternative to develop better risk assessment models using the existing clinical variables. Two real-world data sets were used to experiment with multiple classification algorithms. Implemented models were tested with a third, real-world, data set and performance was compared to a predicate method, a commercial risk assessment software. The best performing deep neural network model gave an area under the curve of 0.96 and a detection rate of 78% at a 1% false positive rate with the test data. The support vector machine model gave an area under the curve of 0.95 and a detection rate of 61% at a 1% false positive rate with the same test data. When compared with the predicate method, the best support vector machine model was slightly inferior, but an optimized deep neural network model was able to give higher detection rates at the same false positive rate, or a similar detection rate with a markedly lower false positive rate. This finding could further improve first trimester screening for Down syndrome by using existing clinical variables and a large training data set derived from a specific population. Copyright © 2018 Elsevier Ltd. All rights reserved.
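
    The evaluation criterion used above, the detection rate at a fixed 1% false-positive rate, can be read directly off an ROC curve, as the hedged sketch below illustrates with a small neural network on synthetic first-trimester markers. Marker names, distributions, and model settings are invented placeholders.

```python
# Sketch of the evaluation idea only: train a classifier on first-trimester
# markers and read the detection rate at a fixed 1% false-positive rate off
# the ROC curve. Marker names and the synthetic data are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
n_unaff, n_aff = 5000, 50
# columns: maternal age, NT MoM, PAPP-A MoM, free beta-hCG MoM (log scale)
unaff = rng.normal([30, 0.0, 0.0, 0.0], [5, 0.2, 0.25, 0.25], (n_unaff, 4))
aff = rng.normal([34, 0.3, -0.35, 0.3], [5, 0.2, 0.25, 0.25], (n_aff, 4))
X = np.vstack([unaff, aff])
y = np.array([0] * n_unaff + [1] * n_aff)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

fpr, tpr, _ = roc_curve(y_te, clf.predict_proba(X_te)[:, 1])
dr_at_1pct = np.interp(0.01, fpr, tpr)      # detection rate at 1% FPR
print(f"Detection rate at 1% FPR: {dr_at_1pct:.2f}")
```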

  3. [Research on infrared safety protection system for machine tool].

    PubMed

    Zhang, Shuan-Ji; Zhang, Zhi-Ling; Yan, Hui-Ying; Wang, Song-De

    2008-04-01

    In order to ensure personal safety and prevent injury accidents in machine tool operation, an infrared machine tool safety system was designed with an infrared transmitting-receiving module, a memory self-locking relay and a voice recording-playing module. When the operator does not enter the danger area, the system has no response. Once all or part of the operator's body enters the danger area and blocks the infrared beam, the system alarms and outputs a control signal to the machine tool's actuator, and at the same time makes the machine tool stop immediately to prevent equipment damage and personal injury. The system has a modular framework and many advantages, including safety, reliability, general applicability, circuit simplicity, ease of maintenance, low power consumption, low cost, stable operation, easy debugging, vibration resistance and interference resistance. It is suitable for installation and use on different machine tools such as punching machines, plastic injection machines, numerically controlled machines, plate cutting machines, pipe bending machines, hydraulic presses, etc.

  4. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    S, Kyriacou; E, Kontoleontos; S, Weissenberger; L, Mangani; E, Casartelli; I, Skouteropoulou; M, Gattringer; A, Gehrer; M, Buchmayr

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
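
    The surrogate pre-screening idea described above can be sketched as follows: a cheap regression metamodel, trained on designs already evaluated exactly, ranks each generation's offspring so that only the most promising ones receive the expensive evaluation. This is a generic illustration, not the EASY software or its PCA-driven operators; the objective function below merely stands in for a CFD run.

```python
# Conceptual sketch of metamodel-assisted evolutionary screening (not the EASY
# software): a cheap surrogate trained on already-evaluated designs filters
# offspring, and only the most promising ones receive the "expensive"
# evaluation (standing in for a CFD run).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def expensive_eval(x):                  # placeholder for a CFD objective
    return np.sum((x - 0.3) ** 2, axis=-1)

dim, pop_size, n_gen, keep = 6, 20, 15, 5
pop = rng.random((pop_size, dim))
fitness = expensive_eval(pop)
archive_X, archive_y = pop.copy(), fitness.copy()

for _ in range(n_gen):
    surrogate = GaussianProcessRegressor().fit(archive_X, archive_y)
    parents = pop[np.argsort(fitness)[:keep]]
    offspring = np.clip(parents[rng.integers(0, keep, pop_size)]
                        + rng.normal(0, 0.1, (pop_size, dim)), 0, 1)
    # surrogate pre-screening: only predicted-best offspring are evaluated exactly
    promising = offspring[np.argsort(surrogate.predict(offspring))[:keep]]
    new_fit = expensive_eval(promising)
    archive_X = np.vstack([archive_X, promising])
    archive_y = np.concatenate([archive_y, new_fit])
    pop = np.vstack([pop[np.argsort(fitness)[:pop_size - keep]], promising])
    fitness = np.concatenate([np.sort(fitness)[:pop_size - keep], new_fit])

print("best objective found:", fitness.min())
```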

  5. Sleep Apnea Detection Based on Thoracic and Abdominal Movement Signals of Wearable Piezo-Electric Bands.

    PubMed

    Lin, Yin-Yan; Wu, Hau-Tieng; Hsu, Chi-An; Huang, Po-Chiun; Huang, Yuan-Hao; Lo, Yu-Lun

    2016-12-07

    Physiologically, the thoracic (THO) and abdominal (ABD) movement signals, captured using wearable piezo-electric bands, provide information about various types of apnea, including central sleep apnea (CSA) and obstructive sleep apnea (OSA). However, the use of piezo-electric wearables in detecting sleep apnea events has been seldom explored in the literature. This study explored the possibility of identifying sleep apnea events, including OSA and CSA, by solely analyzing one or both of the THO and ABD signals. An adaptive non-harmonic model was introduced to model the THO and ABD signals, which allows us to design features for sleep apnea events. To confirm the suitability of the extracted features, a support vector machine was applied to classify three categories - normal and hypopnea, OSA, and CSA. According to a database of 34 subjects, the overall classification accuracies were on average 75.9%±11.7% and 73.8%±4.4%, respectively, based on cross validation. When the features determined from the THO and ABD signals were combined, the overall classification accuracy became 81.8%±9.4%. These features were then applied to design a state machine for online apnea event detection. Two event-by-event accuracy indices, S and I, were proposed for evaluating the performance of the state machine. For the same database, the S index was 84.01%±9.06%, and the I index was 77.21%±19.01%. The results indicate the considerable potential of applying the proposed algorithm to clinical examinations for both screening and homecare purposes.
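
    As a rough illustration of the classification step, the sketch below feeds per-epoch features derived from the two band signals into a support vector machine that separates the three categories. The features and their distributions are invented stand-ins for those obtained from the adaptive non-harmonic model.

```python
# Simplified illustration (not the authors' adaptive non-harmonic model):
# per-epoch features derived from thoracic/abdominal band signals feed a
# support vector machine that separates normal/hypopnea, OSA and CSA.
# Feature values are synthetic placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def epochs(n, amp, paradox):
    # columns: THO amplitude, ABD amplitude, THO-ABD phase difference (rad)
    return np.column_stack([rng.normal(amp, 0.1, n),
                            rng.normal(amp, 0.1, n),
                            rng.normal(paradox, 0.3, n)])

X = np.vstack([epochs(200, 1.0, 0.0),    # normal / hypopnea
               epochs(200, 0.6, 2.5),    # OSA: paradoxical THO/ABD motion
               epochs(200, 0.1, 0.0)])   # CSA: both signals nearly flat
y = np.repeat([0, 1, 2], 200)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```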

  6. Rapid, portable and cost-effective yeast cell viability and concentration analysis using lensfree on-chip microscopy and machine learning.

    PubMed

    Feizi, Alborz; Zhang, Yibo; Greenbaum, Alon; Guziak, Alex; Luong, Michelle; Chan, Raymond Yan Lok; Berg, Brandon; Ozkan, Haydar; Luo, Wei; Wu, Michael; Wu, Yichen; Ozcan, Aydogan

    2016-11-01

    Monitoring yeast cell viability and concentration is important in brewing, baking and biofuel production. However, existing methods of measuring viability and concentration are relatively bulky, tedious and expensive. Here we demonstrate a compact and cost-effective automatic yeast analysis platform (AYAP), which can rapidly measure cell concentration and viability. AYAP is based on digital in-line holography and on-chip microscopy and rapidly images a large field-of-view of 22.5 mm². This lens-free microscope weighs 70 g and utilizes a partially-coherent illumination source and an opto-electronic image sensor chip. A touch-screen user interface based on a tablet-PC is developed to reconstruct the holographic shadows captured by the image sensor chip and use a support vector machine (SVM) model to automatically classify live and dead cells in a yeast sample stained with methylene blue. In order to quantify its accuracy, we varied the viability and concentration of the cells and compared AYAP's performance with a fluorescence exclusion staining based gold-standard using regression analysis. The results agree very well with this gold-standard method and no significant difference was observed between the two methods within a concentration range of 1.4 × 10⁵ to 1.4 × 10⁶ cells per mL, providing a dynamic range suitable for various applications. This lensfree computational imaging technology that is coupled with machine learning algorithms would be useful for cost-effective and rapid quantification of cell viability and density even in field and resource-poor settings.
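
    The analysis idea, classifying individual cells as live or dead with an SVM, converting counts to a viability estimate, and comparing against a reference method by regression, can be sketched as below. The two image-derived features and all numbers are invented placeholders, not AYAP's actual feature set.

```python
# Minimal sketch of the analysis idea: classify reconstructed cell images as
# live or dead with an SVM, convert counts to viability, and compare against a
# reference method by linear regression. Features and numbers are invented.
import numpy as np
from sklearn.svm import SVC
from scipy import stats

rng = np.random.default_rng(0)
# two toy image features per cell: mean absorbance of methylene blue stain, cell area
live = np.column_stack([rng.normal(0.2, 0.05, 300), rng.normal(20, 3, 300)])
dead = np.column_stack([rng.normal(0.7, 0.05, 300), rng.normal(18, 3, 300)])
X = np.vstack([live, dead])
y = np.array([1] * 300 + [0] * 300)
svm = SVC(kernel="linear").fit(X, y)

# simulated samples with known (reference) viability
ref, est = [], []
for true_v in np.linspace(0.2, 0.95, 8):
    n_live = int(200 * true_v)
    sample = np.vstack([live[rng.integers(0, 300, n_live)],
                        dead[rng.integers(0, 300, 200 - n_live)]])
    est.append(svm.predict(sample).mean())   # fraction predicted live
    ref.append(true_v)

slope, intercept, r, *_ = stats.linregress(ref, est)
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r^2={r**2:.3f}")
```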

  7. Artificial intelligence approaches for rational drug design and discovery.

    PubMed

    Duch, Włodzisław; Swaminathan, Karthikeyan; Meller, Jarosław

    2007-01-01

    Pattern recognition, machine learning and artificial intelligence approaches play an increasingly important role in rational drug design, screening and identification of candidate molecules and studies on quantitative structure-activity relationships (QSAR). In this review, we present an overview of basic concepts and methodology in the fields of machine learning and artificial intelligence (AI). An emphasis is put on methods that enable an intuitive interpretation of the results and facilitate gaining an insight into the structure of the problem at hand. We also discuss representative applications of AI methods to docking, screening and QSAR studies. The growing trend to integrate computational and experimental efforts in that regard and some future developments are discussed. In addition, we comment on a broader role of machine learning and artificial intelligence approaches in biomedical research.

  8. Applying Sparse Machine Learning Methods to Twitter: Analysis of the 2012 Change in Pap Smear Guidelines. A Sequential Mixed-Methods Study

    PubMed Central

    Godbehere, Andrew; Le, Gem; El Ghaoui, Laurent; Sarkar, Urmimala

    2016-01-01

    Background It is difficult to synthesize the vast amount of textual data available from social media websites. Capturing real-world discussions via social media could provide insights into individuals’ opinions and the decision-making process. Objective We conducted a sequential mixed methods study to determine the utility of sparse machine learning techniques in summarizing Twitter dialogues. We chose a narrowly defined topic for this approach: cervical cancer discussions over a 6-month time period surrounding a change in Pap smear screening guidelines. Methods We applied statistical methodologies, known as sparse machine learning algorithms, to summarize Twitter messages about cervical cancer before and after the 2012 change in Pap smear screening guidelines by the US Preventive Services Task Force (USPSTF). All messages containing the search terms “cervical cancer,” “Pap smear,” and “Pap test” were analyzed during: (1) January 1–March 13, 2012, and (2) March 14–June 30, 2012. Topic modeling was used to discern the most common topics from each time period, and determine the singular value criterion for each topic. The results were then qualitatively coded from top 10 relevant topics to determine the efficiency of clustering method in grouping distinct ideas, and how the discussion differed before vs. after the change in guidelines . Results This machine learning method was effective in grouping the relevant discussion topics about cervical cancer during the respective time periods (~20% overall irrelevant content in both time periods). Qualitative analysis determined that a significant portion of the top discussion topics in the second time period directly reflected the USPSTF guideline change (eg, “New Screening Guidelines for Cervical Cancer”), and many topics in both time periods were addressing basic screening promotion and education (eg, “It is Cervical Cancer Awareness Month! Click the link to see where you can receive a free or low cost Pap test.”) Conclusions It was demonstrated that machine learning tools can be useful in cervical cancer prevention and screening discussions on Twitter. This method allowed us to prove that there is publicly available significant information about cervical cancer screening on social media sites. Moreover, we observed a direct impact of the guideline change within the Twitter messages. PMID:27288093

  9. GAPscreener: an automatic tool for screening human genetic association literature in PubMed using the support vector machine technique.

    PubMed

    Yu, Wei; Clyne, Melinda; Dolan, Siobhan M; Yesupriya, Ajay; Wulf, Anja; Liu, Tiebin; Khoury, Muin J; Gwinn, Marta

    2008-04-22

    Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge.
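
    A simplified stand-in for this kind of literature screening is sketched below: a linear SVM over TF-IDF features classifies abstracts as relevant or not, and recall, specificity, and precision are reported. The abstracts and labels are invented, and GAPscreener's two-way z-score keyword weighting is not reproduced here.

```python
# Illustrative sketch, not GAPscreener itself: a linear SVM over TF-IDF
# features screens abstracts for genetic-association relevance, and recall,
# specificity and precision are reported. Abstracts and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.metrics import recall_score, precision_score, confusion_matrix

train_abs = [
    "polymorphism rs123 associated with increased risk of preterm birth",
    "genotype frequencies of MTHFR C677T in cases and controls",
    "randomized trial of antibiotic therapy for preterm labor",
    "review of obstetric outcomes after cervical cerclage",
]
train_y = [1, 1, 0, 0]
test_abs = [
    "association of IL6 variants with spontaneous preterm delivery",
    "cohort study of maternal smoking and birth weight",
]
test_y = [1, 0]

vec = TfidfVectorizer()
clf = LinearSVC().fit(vec.fit_transform(train_abs), train_y)
pred = clf.predict(vec.transform(test_abs))

tn, fp, fn, tp = confusion_matrix(test_y, pred, labels=[0, 1]).ravel()
print("recall:", recall_score(test_y, pred),
      "specificity:", tn / (tn + fp),
      "precision:", precision_score(test_y, pred, zero_division=0))
```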

  10. Modeling and simulation of the fluid flow in wire electrochemical machining with rotating tool (wire ECM)

    NASA Astrophysics Data System (ADS)

    Klocke, F.; Herrig, T.; Zeis, M.; Klink, A.

    2017-10-01

    Combining the working principle of electrochemical machining (ECM) with a universal rotating tool, like a wire, could address many of the challenges of the classical ECM sinking process. Such a wire ECM process would be able to machine 2.5-dimensional geometries such as fir-tree slots in turbine discs flexibly and efficiently. Nowadays, the established manufacturing technologies for slotting turbine discs are broaching and wire electrical discharge machining (wire EDM). Nevertheless, the high surface-integrity requirements of turbine parts demand cost-intensive process development and - in the case of wire EDM - trim cuts to reduce the heat-affected rim zone. Due to its process-specific advantages, ECM is an attractive alternative manufacturing technology and has become increasingly relevant for sinking applications within the last few years. However, ECM also faces high costs for process development and complex electrolyte flow devices. In the past, few studies dealt with the development of a wire ECM process to meet these challenges, and previous concepts of wire ECM were only suitable for micro machining applications. Due to insufficient flushing concepts, the application of the process to machining macro geometries failed. Therefore, this paper presents the modeling and simulation of a new flushing approach for process assessment. The suitability of a rotating structured wire electrode in combination with axial flushing for electrodes with high aspect ratios is investigated and discussed.

  11. Case study of a floor-cleaning robot

    NASA Astrophysics Data System (ADS)

    Branch, Allan C.

    1998-01-01

    Developing the technologies suitable for a high-level robotic application such as cleaning a floor has proved extremely difficult. Developing the robot mobility technology has been a stumbling block, and developing and integrating the application technology with the machine and the mobility technology has also been a difficult stage in this quest; but doing so in a cost-effective and realistic manner suitable for the marketplace, and able to compete with humans and manually operated machines, has been the most difficult of all. This paper describes one of these quests, spanning a 14-year period and resulting in what is hoped will be the world's first commercially manufactured household robot vacuum cleaner.

  12. Development of techniques to enhance man/machine communication

    NASA Technical Reports Server (NTRS)

    Targ, R.; Cole, P.; Puthoff, H.

    1974-01-01

    A four-state random stimulus generator, considered to function as an ESP teaching machine, was used to investigate an approach to facilitating interactions between man and machines. A subject tries to guess which of four states the machine is in. The machine offers the user feedback and reinforcement as to the correctness of his choice. Using this machine, 148 volunteer subjects were screened under various protocols. Several whose learning slope and/or mean score departed significantly from chance expectation were identified. Direct physiological evidence of perception of remote stimuli not presented to any known sense of the percipient, using electroencephalographic (EEG) output when a light was flashed in a distant room, was also studied.

  13. a Contemporary Approach for Evaluation of the best Measurement Capability of a Force Calibration Machine

    NASA Astrophysics Data System (ADS)

    Kumar, Harish

    The present paper discusses the procedure for evaluating the best measurement capability of a force calibration machine. The best measurement capability of a force calibration machine is evaluated by comparison, through precision force transfer standards, with force standard machines. The force transfer standards are calibrated by the force standard machine and then by the force calibration machine following the same procedure. The results are reported and discussed in the paper, with a suitable discussion for a force calibration machine of 200 kN capacity. Different force transfer standards of nominal capacities 20 kN, 50 kN and 200 kN are used. Significant variations are found in the uncertainty of force realization by the force calibration machine according to the proposed method in comparison to the earlier method adopted.
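
    The comparison idea can be illustrated with some hedged arithmetic: the same transfer standard is read on the force standard machine and on the force calibration machine, the relative deviation is computed, and a simple root-sum-of-squares uncertainty budget is formed. All readings and uncertainty components below are invented placeholders, not values from the paper.

```python
# Illustrative arithmetic only (not the paper's evaluation procedure): the
# same force transfer standard is read on the force standard machine (FSM)
# and the force calibration machine (FCM); the relative deviation and a simple
# root-sum-of-squares uncertainty estimate are computed. All numbers are
# invented placeholders.
import math

force_nominal_kN = 200.0
reading_fsm_mV_V = 2.00010     # transfer standard output on the FSM
reading_fcm_mV_V = 2.00034     # same transfer standard on the FCM

relative_deviation = (reading_fcm_mV_V - reading_fsm_mV_V) / reading_fsm_mV_V

u_fsm = 2e-5          # relative standard uncertainty of the FSM (assumed)
u_transfer = 5e-5     # transfer standard: reproducibility, creep, drift (assumed)
u_fcm_repeat = 8e-5   # repeatability observed on the FCM (assumed)

u_combined = math.sqrt(u_fsm**2 + u_transfer**2 + u_fcm_repeat**2)
U_expanded = 2 * u_combined   # coverage factor k = 2

print(f"relative deviation at {force_nominal_kN:.0f} kN: {relative_deviation:.2e}")
print(f"expanded relative uncertainty (k=2): {U_expanded:.2e}")
```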

  14. Design and Analysis of Linear Fault-Tolerant Permanent-Magnet Vernier Machines

    PubMed Central

    Xu, Liang; Liu, Guohai; Du, Yi; Liu, Hu

    2014-01-01

    This paper proposes a new linear fault-tolerant permanent-magnet (PM) vernier (LFTPMV) machine, which can offer high thrust by using the magnetic gear effect. Both PMs and windings of the proposed machine are on short mover, while the long stator is only manufactured from iron. Hence, the proposed machine is very suitable for long stroke system applications. The key of this machine is that the magnetizer splits the two movers with modular and complementary structures. Hence, the proposed machine offers improved symmetrical and sinusoidal back electromotive force waveform and reduced detent force. Furthermore, owing to the complementary structure, the proposed machine possesses favorable fault-tolerant capability, namely, independent phases. In particular, differing from the existing fault-tolerant machines, the proposed machine offers fault tolerance without sacrificing thrust density. This is because neither fault-tolerant teeth nor the flux-barriers are adopted. The electromagnetic characteristics of the proposed machine are analyzed using the time-stepping finite-element method, which verifies the effectiveness of the theoretical analysis. PMID:24982959

  15. Design and analysis of linear fault-tolerant permanent-magnet vernier machines.

    PubMed

    Xu, Liang; Ji, Jinghua; Liu, Guohai; Du, Yi; Liu, Hu

    2014-01-01

    This paper proposes a new linear fault-tolerant permanent-magnet (PM) vernier (LFTPMV) machine, which can offer high thrust by using the magnetic gear effect. Both PMs and windings of the proposed machine are on short mover, while the long stator is only manufactured from iron. Hence, the proposed machine is very suitable for long stroke system applications. The key of this machine is that the magnetizer splits the two movers with modular and complementary structures. Hence, the proposed machine offers improved symmetrical and sinusoidal back electromotive force waveform and reduced detent force. Furthermore, owing to the complementary structure, the proposed machine possesses favorable fault-tolerant capability, namely, independent phases. In particular, differing from the existing fault-tolerant machines, the proposed machine offers fault tolerance without sacrificing thrust density. This is because neither fault-tolerant teeth nor the flux-barriers are adopted. The electromagnetic characteristics of the proposed machine are analyzed using the time-stepping finite-element method, which verifies the effectiveness of the theoretical analysis.

  16. Performance of solar refrigerant ejector refrigerating machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Khalidy, N.A.H.

    1997-12-31

    In this work a detailed analysis for the ideal, theoretical, and experimental performance of a solar refrigerant ejector refrigerating machine is presented. A comparison of five refrigerants to select a desirable one for the system is made. The theoretical analysis showed that refrigerant R-113 is more suitable for use in the system. The influence of the boiler, condenser, and evaporator temperatures on system performance is investigated experimentally in a refrigerant ejector refrigerating machine using R-113 as a working refrigerant.

  17. Using Machine Learning for Behavior-Based Access Control: Scalable Anomaly Detection on TCP Connections and HTTP Requests

    DTIC Science & Technology

    2013-11-01

    machine learning techniques used in BBAC to make predictions about the intent of actors establishing TCP connections and issuing HTTP requests. We discuss pragmatic challenges and solutions we encountered in implementing and evaluating BBAC, discussing (a) the general concepts underlying BBAC, (b) challenges we have encountered in identifying suitable datasets, (c) mitigation strategies to cope...and describe current plans for transitioning BBAC capabilities into the Department of Defense together with lessons learned for the machine learning

  18. Design of efficient molecular organic light-emitting diodes by a high-throughput virtual screening and experimental approach.

    PubMed

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D; Duvenaud, David; Maclaurin, Dougal; Blood-Forsythe, Martin A; Chae, Hyun Sik; Einzinger, Markus; Ha, Dong-Gwang; Wu, Tony; Markopoulos, Georgios; Jeon, Soonok; Kang, Hosuk; Miyazaki, Hiroshi; Numata, Masaki; Kim, Sunghan; Huang, Wenliang; Hong, Seong Ik; Baldo, Marc; Adams, Ryan P; Aspuru-Guzik, Alán

    2016-10-01

    Virtual screening is becoming a ground-breaking tool for molecular discovery due to the exponential growth of available computer time and constant improvement of simulation and machine learning techniques. We report an integrated organic functional material design process that incorporates theoretical insight, quantum chemistry, cheminformatics, machine learning, industrial expertise, organic synthesis, molecular characterization, device fabrication and optoelectronic testing. After exploring a search space of 1.6 million molecules and screening over 400,000 of them using time-dependent density functional theory, we identified thousands of promising novel organic light-emitting diode molecules across the visible spectrum. Our team collaboratively selected the best candidates from this set. The experimentally determined external quantum efficiencies for these synthesized candidates were as large as 22%.

  19. Design of efficient molecular organic light-emitting diodes by a high-throughput virtual screening and experimental approach

    NASA Astrophysics Data System (ADS)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Duvenaud, David; MacLaurin, Dougal; Blood-Forsythe, Martin A.; Chae, Hyun Sik; Einzinger, Markus; Ha, Dong-Gwang; Wu, Tony; Markopoulos, Georgios; Jeon, Soonok; Kang, Hosuk; Miyazaki, Hiroshi; Numata, Masaki; Kim, Sunghan; Huang, Wenliang; Hong, Seong Ik; Baldo, Marc; Adams, Ryan P.; Aspuru-Guzik, Alán

    2016-10-01

    Virtual screening is becoming a ground-breaking tool for molecular discovery due to the exponential growth of available computer time and constant improvement of simulation and machine learning techniques. We report an integrated organic functional material design process that incorporates theoretical insight, quantum chemistry, cheminformatics, machine learning, industrial expertise, organic synthesis, molecular characterization, device fabrication and optoelectronic testing. After exploring a search space of 1.6 million molecules and screening over 400,000 of them using time-dependent density functional theory, we identified thousands of promising novel organic light-emitting diode molecules across the visible spectrum. Our team collaboratively selected the best candidates from this set. The experimentally determined external quantum efficiencies for these synthesized candidates were as large as 22%.

  20. Man-Machine Integrated Design and Analysis System (MIDAS): Functional Overview

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Neukom, Christian

    1998-01-01

    The included series of screen print-outs illustrates the structure and function of the Man-Machine Integrated Design and Analysis System (MIDAS). Views of the system and its editors in use are featured. The use case in this set of graphics includes the development of a simulation scenario.

  1. Machine tool locator

    DOEpatents

    Hanlon, John A.; Gill, Timothy J.

    2001-01-01

    Machine tools can be accurately measured and positioned on manufacturing machines within very small tolerances by using an autocollimator on a 3-axis mount on the manufacturing machine, positioned so as to focus on a reference tooling ball or a machine tool; a digital camera connected to the viewing end of the autocollimator; and a marker-and-measure generator that receives digital images from the camera, displays or measures distances between the projection reticle and the reference reticle on the monitoring screen, and relates those distances to the actual position of the autocollimator relative to the reference tooling ball. The images and measurements are used to set the position of the machine tool, to measure the size and shape of the machine tool tip, and to examine cutting-edge wear.

  2. Ausdruckskraft und Regelmaessigkeit: Was Esperanto fuer automatische Uebersetzung geeignet macht (Expressiveness and Formal Regularity: What Makes Esperanto Suitable for Machine Translation).

    ERIC Educational Resources Information Center

    Schubert, Klaus

    1988-01-01

    Describes DLT, the multilingual machine translation system that uses Esperanto as an intermediate language in which substantial portions of the translation subprocesses are carried out. The criteria for choosing an intermediate language and the reasons for preferring Esperanto over other languages are explained. (Author/DJD)

  3. Milling Machine Operator: Instructor's Guide for an Adult Course.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Secondary Curriculum Development.

    The instructor's guide is for a course expected to help meet the need for trained operators in metalworking. Students successfully completing the course will be qualified for an entry-level job as a milling machine operator. The course is suitable for use in adult education programs of school districts, or in manpower development and training…

  4. Machine learning-based screening of complex molecules for polymer solar cells

    NASA Astrophysics Data System (ADS)

    Jørgensen, Peter Bjørn; Mesta, Murat; Shil, Suranjan; García Lastra, Juan Maria; Jacobsen, Karsten Wedel; Thygesen, Kristian Sommer; Schmidt, Mikkel N.

    2018-06-01

    Polymer solar cells admit numerous potential advantages including low energy payback time and scalable high-speed manufacturing, but the power conversion efficiency is currently lower than for their inorganic counterparts. In a Phenyl-C_61-Butyric-Acid-Methyl-Ester (PCBM)-based blended polymer solar cell, the optical gap of the polymer and the energetic alignment of the lowest unoccupied molecular orbital (LUMO) of the polymer and the PCBM are crucial for the device efficiency. Searching for new and better materials for polymer solar cells is a computationally costly affair using density functional theory (DFT) calculations. In this work, we propose a screening procedure using a simple string representation for a promising class of donor-acceptor polymers in conjunction with a grammar variational autoencoder. The model is trained on a dataset of 3989 monomers obtained from DFT calculations and is able to predict LUMO and the lowest optical transition energy for unseen molecules with mean absolute errors of 43 and 74 meV, respectively, without knowledge of the atomic positions. We demonstrate the merit of the model for generating new molecules with the desired LUMO and optical gap energies which increases the chance of finding suitable polymers by more than a factor of five in comparison to the randomised search used in gathering the training set.
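
    The screening step above can be sketched, at a much-reduced scale, as a supervised property-regression problem. The code below is an illustration only: it assumes character n-gram counts of a monomer string in place of the paper's grammar variational autoencoder latent space, and the monomer strings and LUMO values are random placeholders, not data from the study.

        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_error

        # Hypothetical monomer strings and LUMO energies (eV); placeholders only.
        monomers = ["c1ccsc1-c1ccc(cc1)C#N", "c1ccoc1-c1ccccc1",
                    "c1ccsc1-c1ccncc1", "c1ccsc1-c1ccsc1"] * 25
        rng = np.random.default_rng(0)
        lumo = rng.normal(-3.0, 0.3, size=len(monomers))

        # Character n-grams as a crude string representation (stand-in for a learned latent space).
        vec = CountVectorizer(analyzer="char", ngram_range=(1, 3))
        X = vec.fit_transform(monomers)

        X_tr, X_te, y_tr, y_te = train_test_split(X, lumo, test_size=0.2, random_state=0)
        model = Ridge(alpha=1.0).fit(X_tr, y_tr)
        print("LUMO MAE (eV):", mean_absolute_error(y_te, model.predict(X_te)))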

  5. Chatter active control in a lathe machine using magnetostrictive actuator

    NASA Astrophysics Data System (ADS)

    Nosouhi, R.; Behbahani, S.

    2011-01-01

    This paper analyzes the chatter phenomenon in lathe machines. Chatter is one of the main causes of inaccuracy, reduced machine life, and tool wear in machine tools. The phenomenon limits the depth of cut as a function of the cutting speed, which in turn reduces the material removal rate and machining efficiency. Chatter control is therefore important because it enlarges the stability region and increases the critical depth of cut in machining. To control chatter in lathe machines, a magnetostrictive actuator is used. Magnetostrictive materials are smart materials whose length changes when an external magnetic field is applied, which makes them suitable for control applications. It is assumed that the actuator applies the control force exactly at the point where the machining force acts on the tool. In this paper, the chatter stability lobes are raised by applying a PID controller to the magnetostrictive-actuator-equipped tool in turning.

  6. Machine learning plus optical flow: a simple and sensitive method to detect cardioactive drugs

    NASA Astrophysics Data System (ADS)

    Lee, Eugene K.; Kurokawa, Yosuke K.; Tu, Robin; George, Steven C.; Khine, Michelle

    2015-07-01

    Current preclinical screening methods do not adequately detect cardiotoxicity. Using human induced pluripotent stem cell-derived cardiomyocytes (iPS-CMs), more physiologically relevant preclinical or patient-specific screening to detect potential cardiotoxic effects of drug candidates may be possible. However, one of the persistent challenges for developing a high-throughput drug screening platform using iPS-CMs is the need to develop a simple and reliable method to measure key electrophysiological and contractile parameters. To address this need, we have developed a platform that combines machine learning paired with brightfield optical flow as a simple and robust tool that can automate the detection of cardiomyocyte drug effects. Using three cardioactive drugs of different mechanisms, including those with primarily electrophysiological effects, we demonstrate the general applicability of this screening method to detect subtle changes in cardiomyocyte contraction. Requiring only brightfield images of cardiomyocyte contractions, we detect changes in cardiomyocyte contraction comparable to - and even superior to - fluorescence readouts. This automated method serves as a widely applicable screening tool to characterize the effects of drugs on cardiomyocyte function.
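
    A minimal sketch of the kind of pipeline the abstract describes, assuming OpenCV's Farneback dense optical flow as the motion front end and a support vector machine as the classifier; the brightfield stacks and drug labels below are synthetic placeholders, not the authors' data or code.

        import numpy as np
        import cv2
        from sklearn.svm import SVC

        def contraction_signal(frames):
            """Mean optical-flow magnitude per frame pair as a beating signal."""
            mags = []
            for prev, curr in zip(frames[:-1], frames[1:]):
                flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                                    0.5, 3, 15, 3, 5, 1.2, 0)
                mags.append(np.linalg.norm(flow, axis=2).mean())
            return np.array(mags)

        def beat_features(signal):
            # Simple summary features of the contraction waveform.
            return [signal.mean(), signal.std(), signal.max() - signal.min()]

        # Synthetic 8-bit brightfield stacks (placeholders for real recordings).
        rng = np.random.default_rng(0)
        videos = [rng.integers(0, 255, (30, 64, 64), dtype=np.uint8) for _ in range(20)]
        labels = rng.integers(0, 2, 20)   # 0 = vehicle, 1 = drug-treated (placeholder)

        X = np.array([beat_features(contraction_signal(v)) for v in videos])
        clf = SVC().fit(X, labels)
        print(clf.predict(X[:3]))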

  7. Scoping the polymer genome: A roadmap for rational polymer dielectrics design and beyond

    DOE PAGES

    Mannodi-Kanakkithodi, Arun; Chandrasekaran, Anand; Kim, Chiho; ...

    2017-12-19

    The Materials Genome Initiative (MGI) has heralded a sea change in the philosophy of materials design. In an increasing number of applications, the successful deployment of novel materials has benefited from the use of computational methodologies, data descriptors, and machine learning. Polymers have long suffered from a lack of data on electronic, mechanical, and dielectric properties across large chemical spaces, causing a stagnation in the set of suitable candidates for various applications. Extensive efforts over the last few years have seen the fruitful application of MGI principles toward the accelerated discovery of attractive polymer dielectrics for capacitive energy storage. Here, we review these efforts, highlighting the importance of computational data generation and screening, targeted synthesis and characterization, polymer fingerprinting and machine-learning prediction models, and the creation of an online knowledgebase to guide ongoing and future polymer discovery and design. We lay special emphasis on the fingerprinting of polymers in terms of their genome or constituent atomic and molecular fragments, an idea that pays homage to the pioneers of the human genome project who identified the basic building blocks of the human DNA. As a result, by scoping the polymer genome, we present an essential roadmap for the design of polymer dielectrics, and provide future perspectives and directions for expansions to other polymer subclasses and properties.

  8. Scoping the polymer genome: A roadmap for rational polymer dielectrics design and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mannodi-Kanakkithodi, Arun; Chandrasekaran, Anand; Kim, Chiho

    The Materials Genome Initiative (MGI) has heralded a sea change in the philosophy of materials design. In an increasing number of applications, the successful deployment of novel materials has benefited from the use of computational methodologies, data descriptors, and machine learning. Polymers have long suffered from a lack of data on electronic, mechanical, and dielectric properties across large chemical spaces, causing a stagnation in the set of suitable candidates for various applications. Extensive efforts over the last few years have seen the fruitful application of MGI principles toward the accelerated discovery of attractive polymer dielectrics for capacitive energy storage. Here, we review these efforts, highlighting the importance of computational data generation and screening, targeted synthesis and characterization, polymer fingerprinting and machine-learning prediction models, and the creation of an online knowledgebase to guide ongoing and future polymer discovery and design. We lay special emphasis on the fingerprinting of polymers in terms of their genome or constituent atomic and molecular fragments, an idea that pays homage to the pioneers of the human genome project who identified the basic building blocks of the human DNA. As a result, by scoping the polymer genome, we present an essential roadmap for the design of polymer dielectrics, and provide future perspectives and directions for expansions to other polymer subclasses and properties.

  9. Type 2 Diabetes Screening Test by Means of a Pulse Oximeter.

    PubMed

    Moreno, Enrique Monte; Lujan, Maria Jose Anyo; Rusinol, Montse Torrres; Fernandez, Paqui Juarez; Manrique, Pilar Nunez; Trivino, Cristina Aragon; Miquel, Magda Pedrosa; Rodriguez, Marife Alvarez; Burguillos, M Jose Gonzalez

    2017-02-01

    In this paper, we propose a method for screening for the presence of type 2 diabetes by means of the signal obtained from a pulse oximeter. The screening system consists of two parts: the first analyzes the signal obtained from the pulse oximeter, and the second is a machine-learning module. The front end extracts a set of features from the pulse oximeter signal. These features are based on physiological considerations. The set of features was the input to a machine-learning algorithm that determined the class of the input sample, i.e., whether the subject had diabetes or not. The machine-learning algorithms were random forests, gradient boosting, and linear discriminant analysis as benchmark. The system was tested on a database of [Formula: see text] subjects (two samples per subject) collected from five community health centers. The mean receiver operating characteristic area found was [Formula: see text]% (median value [Formula: see text]% and range [Formula: see text]%), with a specificity = [Formula: see text]% for a threshold that gave a sensitivity = [Formula: see text]%. We present a screening method for detecting diabetes that has a performance comparable to the glycated haemoglobin (HbA1c) test, does not require blood extraction, and yields results in less than 5 min.
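
    A rough sketch of the machine-learning module described above, using the three algorithm families named in the abstract; the feature matrix and labels are random placeholders standing in for the pulse-waveform features and clinical outcomes.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        # Placeholder feature matrix: rows = subjects, columns = pulse-waveform features.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 12))
        y = rng.integers(0, 2, 200)          # 1 = type 2 diabetes (placeholder labels)

        models = {
            "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
            "gradient boosting": GradientBoostingClassifier(random_state=0),
            "LDA (benchmark)": LinearDiscriminantAnalysis(),
        }
        for name, model in models.items():
            auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
            print(f"{name}: mean ROC AUC = {auc.mean():.3f}")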

  10. MLViS: A Web Tool for Machine Learning-Based Virtual Screening in Early-Phase of Drug Discovery and Development

    PubMed Central

    Korkmaz, Selcuk; Zararsiz, Gokmen; Goksuluk, Dincer

    2015-01-01

    Virtual screening is an important step in the early phase of the drug discovery process. Since there are thousands of compounds, this step should be both fast and effective in order to distinguish drug-like from nondrug-like molecules. Statistical machine learning methods are widely used in drug discovery studies for classification purposes. Here, we aim to develop a new tool that can classify molecules as drug-like or nondrug-like using various machine learning methods, including discriminant, tree-based, kernel-based, ensemble and other algorithms. To construct this tool, the performances of twenty-three different machine learning algorithms were first compared using ten different measures; the ten best-performing algorithms were then selected based on principal component and hierarchical cluster analysis results. Besides classification, the application can also create heat maps and dendrograms for visual inspection of the molecules through hierarchical cluster analysis. Moreover, users can connect to the PubChem database to download molecular information and to create two-dimensional structures of compounds. This application is freely available through www.biosoft.hacettepe.edu.tr/MLViS/. PMID:25928885
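
    The compare-then-cluster selection step can be sketched as below. This is an illustrative reconstruction, not the MLViS implementation: only five classifiers and five metrics are used, the data are synthetic, and the PCA plus Ward clustering merely mimics the grouping idea described in the abstract.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_validate
        from sklearn.linear_model import LogisticRegression
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.svm import SVC
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        # Placeholder descriptor matrix standing in for drug-like / nondrug-like molecules.
        X, y = make_classification(n_samples=300, n_features=20, random_state=0)

        classifiers = {
            "logreg": LogisticRegression(max_iter=1000),
            "tree": DecisionTreeClassifier(random_state=0),
            "svm": SVC(),
            "rf": RandomForestClassifier(random_state=0),
            "knn": KNeighborsClassifier(),
        }
        metrics = ("accuracy", "roc_auc", "f1", "precision", "recall")
        perf = []
        for name, clf in classifiers.items():
            cv = cross_validate(clf, X, y, cv=5, scoring=metrics)
            perf.append([cv["test_" + m].mean() for m in metrics])
        perf = np.array(perf)

        # Group algorithms with similar performance profiles and keep one per cluster.
        scores_2d = PCA(n_components=2).fit_transform(perf)
        groups = fcluster(linkage(scores_2d, method="ward"), t=3, criterion="maxclust")
        print(dict(zip(classifiers, groups)))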

  11. Shifts in the suitable habitat available for brown trout (Salmo trutta L.) under short-term climate change scenarios.

    PubMed

    Muñoz-Mas, R; Lopez-Nicolas, A; Martínez-Capel, F; Pulido-Velazquez, M

    2016-02-15

    The impact of climate change on the habitat suitability for large brown trout (Salmo trutta L.) was studied in a segment of the Cabriel River (Iberian Peninsula). The future flow and water temperature patterns were simulated at a daily time step with M5 models' trees (NSE of 0.78 and 0.97 respectively) for two short-term scenarios (2011-2040) under the representative concentration pathways (RCP 4.5 and 8.5). An ensemble of five strongly regularized machine learning techniques (generalized additive models, multilayer perceptron ensembles, random forests, support vector machines and fuzzy rule base systems) was used to model the microhabitat suitability (depth, velocity and substrate) during summertime and to evaluate several flows simulated with River2D©. The simulated flow rate and water temperature were combined with the microhabitat assessment to infer bivariate habitat duration curves (BHDCs) under historical conditions and climate change scenarios using either the weighted usable area (WUA) or the Boolean-based suitable area (SA). The forecasts for both scenarios jointly predicted a significant reduction in the flow rate and an increase in water temperature (mean rate of change of ca. -25% and +4% respectively). The five techniques converged on the modelled suitability and habitat preferences; large brown trout selected relatively high flow velocity, large depth and coarse substrate. However, the model developed with support vector machines presented a significantly trimmed output range (max.: 0.38), and thus its predictions were banned from the WUA-based analyses. The BHDCs based on the WUA and the SA broadly matched, indicating an increase in the number of days with less suitable habitat available (WUA and SA) and/or with higher water temperature (trout will endure impoverished environmental conditions ca. 82% of the days). Finally, our results suggested the potential extirpation of the species from the study site during short time spans. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Cheminformatics in Drug Discovery, an Industrial Perspective.

    PubMed

    Chen, Hongming; Kogej, Thierry; Engkvist, Ola

    2018-05-18

    Cheminformatics has established itself as a core discipline within large scale drug discovery operations. It would be impossible to handle the amount of data generated today in a small molecule drug discovery project without persons skilled in cheminformatics. In addition, due to increased emphasis on "Big Data", machine learning and artificial intelligence, not only in the society in general, but also in drug discovery, it is expected that the cheminformatics field will be even more important in the future. Traditional areas like virtual screening, library design and high-throughput screening analysis are highlighted in this review. Applying machine learning in drug discovery is an area that has become very important. Applications of machine learning in early drug discovery has been extended from predicting ADME properties and target activity to tasks like de novo molecular design and prediction of chemical reactions. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Research on bearing fault diagnosis of large machinery based on mathematical morphology

    NASA Astrophysics Data System (ADS)

    Wang, Yu

    2018-04-01

    To study automatic fault diagnosis of large machinery based on the support vector machine, four common faults of large machinery are considered and a support vector machine is used to classify and identify them. The extracted feature vectors are used as input, and the classifier is trained and applied with a multi-class classification method. The optimal parameters of the support vector machine are found by trial and error and by cross-validation. The support vector machine is then compared with a BP neural network. The results show that the support vector machine trains quickly and achieves high classification accuracy, making it more suitable for fault diagnosis research in large machinery. It can therefore be concluded that the training speed of support vector machines (SVM) is fast and their performance is good.
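
    A minimal sketch of the parameter search and baseline comparison described above, assuming scikit-learn's SVC and an MLP as a stand-in for the BP neural network; the vibration features and fault labels are random placeholders.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import GridSearchCV, cross_val_score

        # Placeholder vibration feature vectors for four machinery fault classes.
        rng = np.random.default_rng(2)
        X = rng.normal(size=(400, 16))
        y = rng.integers(0, 4, 400)

        # Cross-validated grid search over C and gamma (the abstract's parameter search step).
        grid = GridSearchCV(SVC(kernel="rbf"),
                            {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]}, cv=5)
        grid.fit(X, y)
        print("best SVM params:", grid.best_params_)

        # Comparison with a BP-style neural network baseline.
        mlp_acc = cross_val_score(MLPClassifier(hidden_layer_sizes=(32,), max_iter=500), X, y, cv=5)
        print("SVM acc:", grid.best_score_, "MLP acc:", mlp_acc.mean())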

  14. Screening for Psychopathology Versus Selecting for Suitability: Ethical and Legal Considerations

    NASA Technical Reports Server (NTRS)

    Holland, Albert W.; Galarza, Laura; Arvey, Richard; Hysong, Sylvia; Sackett, Paul; Cascio, Wayne

    2000-01-01

    The current system for psychological selection of U.S. astronauts is divided into two phases: The select-out phase and the select-in phase. The select-out phase screens candidates for psychopathology; candidates who do not meet the baseline psychiatric requirements are immediately disqualified. The select-in phase assesses candidates for suitability to fly short- and long-duration missions. Suitability ratings are given for ten factors found to be critical for short and long-duration space missions. There are qualitative differences in the purpose of the two phases (select-in vs. select-out) and in the nature of the information collected in each phase. Furthermore, there are different logistic, ethical, and legal issues related to a medical or psychiatric (select-out) screening versus a suitability (select-in) psychological screening process . The purpose of this presentation is to contrast the ethical and legal environment surrounding the select-out and select-in phases of the psychological selection system. Issues such as data collection, data storage and management, the federal statutory environment, and personnel training will be discussed. Further, a summary of the new standards for psychological testing is presented, along with their implications for astronaut selection.

  15. GAPscreener: An automatic tool for screening human genetic association literature in PubMed using the support vector machine technique

    PubMed Central

    Yu, Wei; Clyne, Melinda; Dolan, Siobhan M; Yesupriya, Ajay; Wulf, Anja; Liu, Tiebin; Khoury, Muin J; Gwinn, Marta

    2008-01-01

    Background Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. Results The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. Conclusion GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge. PMID:18430222
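
    The SVM screening step can be sketched roughly as below, assuming TF-IDF features restricted to a small keyword vocabulary as a crude stand-in for the z-score-weighted keyword features described in the abstract; the abstracts, labels and keyword list are illustrative placeholders only.

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC
        from sklearn.metrics import recall_score, precision_score

        # Placeholder abstracts and curator labels (1 = genetic association study).
        abstracts = ["polymorphism rs123 associated with preterm birth risk",
                     "case-control study of APOE genotype and disease",
                     "review of surgical outcomes in knee replacement",
                     "randomized trial of a new antihypertensive drug"] * 50
        labels = np.array([1, 1, 0, 0] * 50)

        # Restricting the vocabulary to a curated keyword list approximates the
        # weighted keyword features described above (this list is illustrative only).
        keywords = ["polymorphism", "genotype", "association", "allele", "genetic", "risk"]
        X = TfidfVectorizer(vocabulary=keywords).fit_transform(abstracts)

        clf = LinearSVC().fit(X, labels)
        pred = clf.predict(X)
        specificity = recall_score(labels, pred, pos_label=0)
        print("recall:", recall_score(labels, pred),
              "specificity:", specificity,
              "precision:", precision_score(labels, pred))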

  16. Adapting human-machine interfaces to user performance.

    PubMed

    Danziger, Zachary; Fishbach, Alon; Mussa-Ivaldi, Ferdinando A

    2008-01-01

    The goal of this study was to create and examine machine learning algorithms that adapt in a controlled and cadenced way to foster a harmonious learning environment between the user of a human-machine interface and the controlled device. In this experiment, subjects' high-dimensional finger motions remotely controlled the joint angles of a simulated planar 2-link arm, which was used to hit targets on a computer screen. Subjects were required to move the cursor at the endpoint of the simulated arm.

  17. Towards a phenotypic screening strategy for emerging β-lactamases in Gram-negative bacilli.

    PubMed

    Willems, Elise; Verhaegen, Jan; Magerman, Koen; Nys, Sita; Cartuyvels, Reinoud

    2013-02-01

    The purpose of this manuscript was to review recent literature and guidelines regarding phenotypic detection of emerging β-lactamases [extended-spectrum β-lactamases (ESBLs), AmpC β-lactamases and carbapenemases] in Gram-negative bacilli (GNB) in order to formulate recommendations on best practice to screen for them. We conclude that chromogenic ESBL screening agar plates are suitable to screen for ESBL-producing Enterobacteriaceae directly from clinical samples. Furthermore, ceftazidime (CAZ) and ceftriaxone or cefotaxime (CTX) are the indicator antimicrobial agents of choice for ESBL detection in GNB. In non-inducible Enterobacteriaceae, the combined double-disk synergy test (CDDST) with at least CTX and CAZ and additionally cefepime as indicators is the preferred ESBL confirmation assay. The two most suitable ESBL confirmation strategies in AmpC co-producing Enterobacteriaceae are adapted CDDSTs: (i) with addition of 3-aminophenylboronic acid to CTX and CAZ disks; and (ii) with addition of cloxacillin (CLOX) to Mueller-Hinton agar. Reduced cefoxitin susceptibility and decreased susceptibility to cefotetan are regarded as suitable screening tests for plasmid-mediated and derepressed AmpC production. A CLOX-based CDDST with CTX and CAZ as indicators is considered to be the best AmpC confirmation assay. Finally, in Enterobacteriaceae isolates we suggest to screen for carbapenemases with a 0.5 μg/mL meropenem screening breakpoint. For class A carbapenemase confirmation, the home-prepared as well as the commercially available boronic acid-based CDDST can be considered. For metallo-β-lactamase confirmation, ethylene diamine tetra-acetic-acid-based home-prepared assays are recommended. The most suitable method (CDDST or DDST) and indicator antimicrobial agent(s) vary depending on the bacterial genus. Copyright © 2012 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.

  18. 46 CFR 160.115-7 - Design, construction, and performance of winches.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... requirements: (1) Materials. (i) All gears must be machine cut and made of steel, bronze, or other suitable... suitable lock washers, cotter pins, or locks to prevent them from coming adrift. (2) Bearings and gears. (i) Positive means of lubrication must be provided for all bearings. (ii) When worm gears are used, the worm...

  19. Precision mechatronics based on high-precision measuring and positioning systems and machines

    NASA Astrophysics Data System (ADS)

    Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert

    2007-06-01

    Precision mechatronics is defined in the paper as the science and engineering of a new generation of high-precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics. Nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with the smallest possible uncertainties is discussed. The integration of several optical and tactile nanoprobes makes the 3D nanopositioning machine suitable for various tasks, such as long-range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free-form surface measurement, as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.

  20. Reduced toxicity polyester resins and microvascular pre-preg tapes for advanced composites manufacturing

    NASA Astrophysics Data System (ADS)

    Poillucci, Richard

    Advanced composites manufacturing broadly encapsulates topics ranging from matrix chemistries to automated machines that lay up fiber-reinforced materials. Environmental regulations are stimulating research to reduce matrix resin formulation toxicity. At present, composites fabricated with polyester resins expose workers to the risk of contact with and inhalation of styrene monomer, which is a potential carcinogen, neurotoxin, and respiratory irritant. The first primary goal of this thesis is to reduce the toxicity associated with polyester resins by: (1) identification of potential monomers to replace styrene, (2) determination of monomer solubility within the polyester, and (3) investigation of approaches to rapidly screen a large resin composition parameter space. Monomers are identified based on their ability to react with polyester and their toxicity as determined by the Globally Harmonized System (GHS) and a green screen method. Solubilities were determined by the Hoftyzer-Van Krevelen method, the Hansen solubility parameter database, and experimental mixing of monomers. A combinatorial microfluidic mixing device is designed and tested to obtain distinct resin compositions from two input chemistries. The push for safer materials is complemented by a thrust for multifunctional composites. The second primary goal of this thesis is to design and implement the manufacture of sacrificial fiber materials suitable for use in automated fiber placement of microvascular multifunctional composites. Two key advancements are required to achieve this goal: (1) development of a roll-to-roll method to place sacrificial fibers onto carbon fiber pre-preg tape; and (2) demonstration of feasible manufacture of microvascular carbon fiber plates with automated fiber placement. An automated method for placing sacrificial fibers onto carbon fiber tapes is designed and a prototype implemented. Carbon fiber tows with manually placed sacrificial fibers are then run through an automated fiber placement machine, and the successful fabrication of a carbon fiber plate with an integrated microvascular channel is demonstrated.

  1. High-Strength Undiffused Brushless (HSUB) Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, John S; Tolbert, Leon M; Lee, Seong T

    2007-01-01

    This paper introduces a new high-strength undiffused brushless machine that transfers the stationary excitation magnetomotive force to the rotor without any brushes. For a conventional permanent magnet (PM) machine, the air gap flux density cannot be enhanced effectively but can be weakened. In the new machine, both the stationary excitation coil and the PM in the rotor produce an enhanced air gap flux. The PM in the rotor prevents magnetic flux diffusion between the poles and guides the reluctance flux path. The pole flux density in the air gap can be much higher than what the PM alone can produce. A high-strength machine is thus obtained. The air gap flux density can be weakened through the stationary excitation winding. This type of machine is particularly suitable for electric and hybrid-electric vehicle applications. Patents of this new technology are either granted or pending.

  2. Automated Assessment of Patients' Self-Narratives for Posttraumatic Stress Disorder Screening Using Natural Language Processing and Text Mining.

    PubMed

    He, Qiwei; Veldkamp, Bernard P; Glas, Cees A W; de Vries, Theo

    2017-03-01

    Patients' narratives about traumatic experiences and symptoms are useful in clinical screening and diagnostic procedures. In this study, we presented an automated assessment system to screen patients for posttraumatic stress disorder via a natural language processing and text-mining approach. Four machine-learning algorithms (decision tree, naive Bayes, support vector machine, and an alternative classification approach called the product score model) were used in combination with n-gram representation models to identify patterns between verbal features in self-narratives and psychiatric diagnoses. With our sample, the product score model with unigrams attained the highest prediction accuracy when compared with practitioners' diagnoses. The addition of multigrams contributed most to balancing the metrics of sensitivity and specificity. This article also demonstrates that text mining is a promising approach for analyzing patients' self-expression behavior, thus helping clinicians identify potential patients from an early stage.
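
    A minimal sketch of the n-gram text-classification comparison described above, using three of the four algorithms named in the abstract (the product score model is omitted because its formulation is not given here); the narratives and diagnoses are invented placeholders.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.svm import LinearSVC
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        # Placeholder self-narratives and clinician diagnoses (1 = PTSD).
        narratives = ["i keep reliving the crash and cannot sleep",
                      "nightmares and flashbacks since the assault",
                      "work has been busy but i feel fine overall",
                      "i enjoy spending weekends with my family"] * 40
        diagnoses = [1, 1, 0, 0] * 40

        for name, clf in [("naive Bayes", MultinomialNB()),
                          ("linear SVM", LinearSVC()),
                          ("decision tree", DecisionTreeClassifier(random_state=0))]:
            # Unigram plus bigram counts as the n-gram representation.
            pipe = make_pipeline(CountVectorizer(ngram_range=(1, 2)), clf)
            acc = cross_val_score(pipe, narratives, diagnoses, cv=5).mean()
            print(f"{name}: {acc:.3f}")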

  3. Self-propulsion and interactions of catalytic particles in a chemically active medium.

    PubMed

    Banigan, Edward J; Marko, John F

    2016-01-01

    Enzymatic "machines," such as catalytic rods or colloids, can self-propel and interact by generating gradients of their substrates. We theoretically investigate the behaviors of such machines in a chemically active environment where their catalytic substrates are continuously synthesized and destroyed, as occurs in living cells. We show how the kinetic properties of the medium modulate self-propulsion and pairwise interactions between machines, with the latter controlled by a tunable characteristic interaction range analogous to the Debye screening length in an electrolytic solution. Finally, we discuss the effective force arising between interacting machines and possible biological applications, such as partitioning of bacterial plasmids.

  4. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones by statistical methods. In this paper, the advantages and disadvantages of the two methods are compared, and the necessity and feasibility of their integration and fusion are introduced. An approach is then proposed that integrates multi-sensor status monitoring and statistical process control on the basis of artificial intelligence, internet and database techniques. Based on virtual instrument techniques, the author developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and the acoustic emission (AE) signal of the wheel-dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for the status monitoring and analysis of the machining process.
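
    The statistical-process-control half of the proposed integration can be sketched as a Shewhart-style control chart, with an acoustic-emission feature reported alongside each out-of-control point so the two information sources can be inspected together; the limits, drift and AE values below are synthetic placeholders, not MoniSysOnline output.

        import numpy as np

        rng = np.random.default_rng(7)
        reference = rng.normal(10.0, 0.05, 200)        # in-control quality measurements (e.g., mm)
        new_parts = rng.normal(10.0, 0.05, 50)
        new_parts[30:] += 0.2                           # simulated drift after a wheel-dressing problem
        ae_rms = rng.normal(1.0, 0.1, 50)               # matching acoustic-emission RMS per part
        ae_rms[30:] += 0.5

        mean, sigma = reference.mean(), reference.std(ddof=1)
        ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # Shewhart 3-sigma limits

        for i, (q, ae) in enumerate(zip(new_parts, ae_rms)):
            if not lcl <= q <= ucl:
                print(f"part {i}: quality {q:.3f} outside [{lcl:.3f}, {ucl:.3f}], AE RMS {ae:.2f}")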

  5. Vocational Rehabilitation of Young Adults with a Disability of One Arm or Hand.

    ERIC Educational Resources Information Center

    Nijboer, Irene D.; Wevers, Cornelius J.

    1993-01-01

    A work analysis was conducted to determine whether the job of lathe and milling machine operator is suitable for young adults with one arm or hand. The analysis concluded that it is suitable but adjustments of the workplace may be necessary, such as transporting heavy equipment. The importance of labor research to vocational rehabilitation is…

  6. A Systematic Strategy for Screening and Application of Specific Biomarkers in Hepatotoxicity Using Metabolomics Combined With ROC Curves and SVMs.

    PubMed

    Li, Yubo; Wang, Lei; Ju, Liang; Deng, Haoyue; Zhang, Zhenzhu; Hou, Zhiguo; Xie, Jiabin; Wang, Yuming; Zhang, Yanjun

    2016-04-01

    Current studies that evaluate toxicity based on metabolomics have primarily focused on the screening of biomarkers while largely neglecting further verification and biomarker applications. For this reason, we used drug-induced hepatotoxicity as an example to establish a systematic strategy for screening specific biomarkers and applied these biomarkers to evaluate whether drugs have potential hepatotoxicity. Carbon tetrachloride (5 ml/kg), acetaminophen (1500 mg/kg), and atorvastatin (5 mg/kg) were used to establish rat hepatotoxicity models. Fifteen common biomarkers were screened by multivariate statistical analysis and integration analysis of the metabolomics data. The receiver operating characteristic (ROC) curve was used to evaluate the sensitivity and specificity of the biomarkers, yielding 10 specific biomarker candidates with an area under the curve greater than 0.7. A support vector machine model was then established from the specific biomarker candidate data for hepatotoxic and nonhepatotoxic drugs; the accuracy of the model was 94.90% (92.86% sensitivity and 92.59% specificity), demonstrating that those ten biomarkers are specific. Six drugs were used to predict hepatotoxicity with the support vector machine model; the predictions were consistent with the biochemical and histopathological results, demonstrating that the model is reliable. Thus, this support vector machine model can be applied to discriminate between the hepatic and nonhepatic toxicity of drugs. This approach not only presents a new strategy for screening specific biomarkers with greater diagnostic significance but also provides a new evaluation pattern for hepatotoxicity, and it will be a highly useful tool in toxicity estimation and disease diagnoses. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
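
    A rough sketch of the screen-then-classify strategy described above: single-marker ROC AUC filtering followed by a support vector machine on the retained markers, using the abstract's AUC > 0.7 cut-off; the metabolite matrix and labels are random placeholders, not the study's data.

        import numpy as np
        from sklearn.metrics import roc_auc_score
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Placeholder metabolite intensity matrix: rows = rats, columns = candidate biomarkers.
        rng = np.random.default_rng(3)
        X = rng.normal(size=(120, 15))
        y = rng.integers(0, 2, 120)          # 1 = hepatotoxic treatment (placeholder)

        # Keep candidates whose single-marker ROC AUC exceeds 0.7 (in either direction).
        aucs = np.array([roc_auc_score(y, X[:, j]) for j in range(X.shape[1])])
        specific = np.where(np.maximum(aucs, 1 - aucs) > 0.7)[0]
        print("selected biomarker columns:", specific)

        # SVM built on the selected biomarkers to flag potentially hepatotoxic drugs.
        if specific.size:
            acc = cross_val_score(SVC(), X[:, specific], y, cv=5).mean()
            print("cross-validated accuracy:", acc)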

  7. Automation in airport security X-ray screening of cabin baggage: Examining benefits and possible implementations of automated explosives detection.

    PubMed

    Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian

    2018-10-01

    Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with on-screen alarm resolution by the airport security officer (screener), or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios and a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB. EDSCB increased only their detection of bare explosives. In contrast, screeners with less experience (tenure < 1 year) benefitted substantially from EDSCB in detecting both improvised explosive devices and bare explosives. A comparison of all three conditions showed that automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates at the human-machine system level, which would still be acceptable from an operational point of view. The results indicate that wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. SWIFT-Review: a text-mining workbench for systematic review.

    PubMed

    Howard, Brian E; Phillips, Jason; Miller, Kyle; Tandon, Arpit; Mav, Deepak; Shah, Mihir R; Holmgren, Stephanie; Pelch, Katherine E; Walker, Vickie; Rooney, Andrew A; Macleod, Malcolm; Shah, Ruchir R; Thayer, Kristina

    2016-05-23

    There is growing interest in using machine learning approaches to priority rank studies and reduce human burden in screening literature when conducting systematic reviews. In addition, identifying addressable questions during the problem formulation phase of systematic review can be challenging, especially for topics having a large literature base. Here, we assess the performance of the SWIFT-Review priority ranking algorithm for identifying studies relevant to a given research question. We also explore the use of SWIFT-Review during problem formulation to identify, categorize, and visualize research areas that are data rich/data poor within a large literature corpus. Twenty case studies, including 15 public data sets, representing a range of complexity and size, were used to assess the priority ranking performance of SWIFT-Review. For each study, seed sets of manually annotated included and excluded titles and abstracts were used for machine training. The remaining references were then ranked for relevance using an algorithm that considers term frequency and latent Dirichlet allocation (LDA) topic modeling. This ranking was evaluated with respect to (1) the number of studies screened in order to identify 95 % of known relevant studies and (2) the "Work Saved over Sampling" (WSS) performance metric. To assess SWIFT-Review for use in problem formulation, PubMed literature search results for 171 chemicals implicated as EDCs were uploaded into SWIFT-Review (264,588 studies) and categorized based on evidence stream and health outcome. Patterns of search results were surveyed and visualized using a variety of interactive graphics. Compared with the reported performance of other tools using the same datasets, the SWIFT-Review ranking procedure obtained the highest scores on 11 out of 15 of the public datasets. Overall, these results suggest that using machine learning to triage documents for screening has the potential to save, on average, more than 50 % of the screening effort ordinarily required when using un-ordered document lists. In addition, the tagging and annotation capabilities of SWIFT-Review can be useful during the activities of scoping and problem formulation. Text-mining and machine learning software such as SWIFT-Review can be valuable tools to reduce the human screening burden and assist in problem formulation.
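
    The priority-ranking idea can be sketched as below, assuming LDA topic proportions as document features and a logistic-regression ranker; this is not the SWIFT-Review implementation, and the corpus, seed set and the WSS@95 formulation used here (fraction of the corpus left unread minus 5%) are simplified placeholders.

        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.linear_model import LogisticRegression

        # Placeholder corpus: a small seed set is labeled by hand, the rest is ranked.
        docs = ["bisphenol a alters estrogen receptor signalling in rats",
                "phthalate exposure and thyroid hormone levels in children",
                "bridge maintenance scheduling with integer programming",
                "traffic flow prediction using recurrent networks"] * 60
        labels = np.array([1, 1, 0, 0] * 60)        # 1 = relevant to the review question
        seed = np.arange(40)                         # indices already screened manually

        counts = CountVectorizer(stop_words="english").fit_transform(docs)
        topics = LatentDirichletAllocation(n_components=5, random_state=0).fit_transform(counts)

        clf = LogisticRegression(max_iter=1000).fit(topics[seed], labels[seed])
        rest = np.setdiff1d(np.arange(len(docs)), seed)
        order = rest[np.argsort(-clf.predict_proba(topics[rest])[:, 1])]

        # Fraction of the unscreened corpus read before 95% of its relevant studies are found.
        relevant = labels[order].cumsum()
        n_read = np.searchsorted(relevant, 0.95 * labels[rest].sum()) + 1
        wss95 = (len(rest) - n_read) / len(rest) - 0.05
        print("WSS@95 =", round(wss95, 3))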

  9. A novel device for head gesture measurement system in combination with eye-controlled human machine interface

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Ho, Chien-Wa; Chang, Kai-Chieh; Hung, San-Shan; Shei, Hung-Jung; Yeh, Mau-Shiun

    2006-06-01

    This study describes the design and combination of an eye-controlled and a head-controlled human-machine interface system. This system is a highly effective human-machine interface, detecting head movement by changing positions and numbers of light sources on the head. When the users utilize the head-mounted display to browse a computer screen, the system will catch the images of the user's eyes with CCD cameras, which can also measure the angle and position of the light sources. In the eye-tracking system, the program in the computer will locate each center point of the pupils in the images, and record the information on moving traces and pupil diameters. In the head gesture measurement system, the user wears a double-source eyeglass frame, so the system catches images of the user's head by using a CCD camera in front of the user. The computer program will locate the center point of the head, transferring it to the screen coordinates, and then the user can control the cursor by head motions. We combine the eye-controlled and head-controlled human-machine interface system for the virtual reality applications.

  10. Computer-aided design studies of the homopolar linear synchronous motor

    NASA Astrophysics Data System (ADS)

    Dawson, G. E.; Eastham, A. R.; Ong, R.

    1984-09-01

    The linear induction motor (LIM), as an urban transit drive, can provide good grade-climbing capabilities and propulsion/braking performance that is independent of steel wheel-rail adhesion. In view of its 10-12 mm airgap, the LIM is characterized by a low power factor-efficiency product of order 0.4. A synchronous machine offers high efficiency and controllable power factor. An assessment of the linear homopolar configuration of this machine is presented as an alternative to the LIM. Computer-aided design studies using the finite element technique have been conducted to identify a suitable machine design for urban transit propulsion.

  11. Application of dynamic milling in stainless steel processing

    NASA Astrophysics Data System (ADS)

    Shan, Wenju

    2017-09-01

    This paper introduces a method of setting parameters for the NC programming of stainless steel parts using dynamic milling. Stainless steel has high plasticity and toughness, severe work hardening, large cutting forces, high temperatures in the cutting zone and rapid tool wear, which make it a difficult material to machine. Dynamic motion is the newest NC programming technology in the Mastercam software and embodies an advanced machining concept. The tool path generated by dynamic motion technology is smoother, more efficient and more stable during machining. Dynamic motion technology is therefore well suited to cutting hard-to-machine materials.

  12. Interpreting linear support vector machine models with heat map molecule coloring

    PubMed Central

    2011-01-01

    Background Model-based virtual screening plays an important role in the early drug discovery stage. The outcomes of high-throughput screenings are a valuable source for machine learning algorithms to infer such models. Besides a strong performance, the interpretability of a machine learning model is a desired property to guide the optimization of a compound in later drug discovery stages. Linear support vector machines showed to have a convincing performance on large-scale data sets. The goal of this study is to present a heat map molecule coloring technique to interpret linear support vector machine models. Based on the weights of a linear model, the visualization approach colors each atom and bond of a compound according to its importance for activity. Results We evaluated our approach on a toxicity data set, a chromosome aberration data set, and the maximum unbiased validation data sets. The experiments show that our method sensibly visualizes structure-property and structure-activity relationships of a linear support vector machine model. The coloring of ligands in the binding pocket of several crystal structures of a maximum unbiased validation data set target indicates that our approach assists to determine the correct ligand orientation in the binding pocket. Additionally, the heat map coloring enables the identification of substructures important for the binding of an inhibitor. Conclusions In combination with heat map coloring, linear support vector machine models can help to guide the modification of a compound in later stages of drug discovery. Particularly substructures identified as important by our method might be a starting point for optimization of a lead compound. The heat map coloring should be considered as complementary to structure based modeling approaches. As such, it helps to get a better understanding of the binding mode of an inhibitor. PMID:21439031
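
    A heavily simplified sketch of weight-based atom coloring, assuming RDKit is available and that a linear model has already been trained on Morgan fingerprint bits; the weight vector below is a random stand-in for trained coefficients, and only the central atom of each fingerprint environment is credited, rather than the full per-atom and per-bond heat map of the paper.

        import numpy as np
        from rdkit import Chem
        from rdkit.Chem import AllChem

        n_bits = 1024
        weights = np.random.default_rng(4).normal(size=n_bits)   # stand-in for trained SVM weights

        mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")        # aspirin as an example compound
        bit_info = {}
        # bit_info is filled with (atom index, radius) pairs for every on-bit.
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits, bitInfo=bit_info)

        # Credit each on-bit's weight to the central atom of its environment.
        atom_scores = np.zeros(mol.GetNumAtoms())
        for bit, envs in bit_info.items():
            for atom_idx, _radius in envs:
                atom_scores[atom_idx] += weights[bit]

        for atom in mol.GetAtoms():
            print(atom.GetIdx(), atom.GetSymbol(), round(atom_scores[atom.GetIdx()], 2))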

  13. Screen-Printed Washable Electronic Textiles as Self-Powered Touch/Gesture Tribo-Sensors for Intelligent Human-Machine Interaction.

    PubMed

    Cao, Ran; Pu, Xianjie; Du, Xinyu; Yang, Wei; Wang, Jiaona; Guo, Hengyu; Zhao, Shuyu; Yuan, Zuqing; Zhang, Chi; Li, Congju; Wang, Zhong Lin

    2018-05-22

    Multifunctional electronic textiles (E-textiles) with embedded electric circuits hold great application prospects for future wearable electronics. However, most E-textiles still have critical challenges, including air permeability, satisfactory washability, and mass fabrication. In this work, we fabricate a washable E-textile that addresses all of the concerns and shows its application as a self-powered triboelectric gesture textile for intelligent human-machine interfacing. Utilizing conductive carbon nanotubes (CNTs) and screen-printing technology, this kind of E-textile embraces high conductivity (0.2 kΩ/sq), high air permeability (88.2 mm/s), and can be manufactured on common fabric at large scales. Due to the advantage of the interaction between the CNTs and the fabrics, the electrode shows excellent stability under harsh mechanical deformation and even after being washed. Moreover, based on a single-electrode mode triboelectric nanogenerator and electrode pattern design, our E-textile exhibits highly sensitive touch/gesture sensing performance and has potential applications for human-machine interfacing.

  14. High-throughput label-free screening of euglena gracilis with optofluidic time-stretch quantitative phase microscopy

    NASA Astrophysics Data System (ADS)

    Guo, Baoshan; Lei, Cheng; Ito, Takuro; Yaxiaer, Yalikun; Kobayashi, Hirofumi; Jiang, Yiyue; Tanaka, Yo; Ozeki, Yasuyuki; Goda, Keisuke

    2017-02-01

    The development of reliable, sustainable, and economical sources of alternative fuels is an important, but challenging goal for the world. As an alternative to liquid fossil fuels, microalgal biofuel is expected to play a key role in reducing the detrimental effects of global warming since microalgae absorb atmospheric CO2 via photosynthesis. Unfortunately, conventional analytical methods only provide population-averaged lipid contents and fail to characterize a diverse population of microalgal cells with single-cell resolution in a noninvasive and interference-free manner. Here we demonstrate high-throughput label-free single-cell screening of lipid-producing microalgal cells with optofluidic time-stretch quantitative phase microscopy. In particular, we use Euglena gracilis - an attractive microalgal species that produces wax esters (suitable for biodiesel and aviation fuel after refinement) within lipid droplets. Our optofluidic time-stretch quantitative phase microscope is based on an integration of a hydrodynamic-focusing microfluidic chip, an optical time-stretch phase-contrast microscope, and a digital image processor equipped with machine learning. As a result, it provides both the opacity and phase contents of every single cell at a high throughput of 10,000 cells/s. We characterize heterogeneous populations of E. gracilis cells under two different culture conditions to evaluate their lipid production efficiency. Our method holds promise as an effective analytical tool for microalgaebased biofuel production.

  15. Lung boundary detection in pediatric chest x-rays

    NASA Astrophysics Data System (ADS)

    Candemir, Sema; Antani, Sameer; Jaeger, Stefan; Browning, Renee; Thoma, George R.

    2015-03-01

    Tuberculosis (TB) is a major public health problem worldwide, and highly prevalent in developing countries. According to the World Health Organization (WHO), over 95% of TB deaths occur in low- and middle- income countries that often have under-resourced health care systems. In an effort to aid population screening in such resource challenged settings, the U.S. National Library of Medicine has developed a chest X-ray (CXR) screening system that provides a pre-decision on pulmonary abnormalities. When the system is presented with a digital CXR image from the Picture Archive and Communication Systems (PACS) or an imaging source, it automatically identifies the lung regions in the image, extracts image features, and classifies the image as normal or abnormal using trained machine-learning algorithms. The system has been trained on adult CXR images, and this article presents enhancements toward including pediatric CXR images. Our adult lung boundary detection algorithm is model-based. We note the lung shape differences during pediatric developmental stages, and adulthood, and propose building new lung models suitable for pediatric developmental stages. In this study, we quantify changes in lung shape from infancy to adulthood toward enhancing our lung segmentation algorithm. Our initial findings suggest pediatric age groupings of 0 - 23 months, 2 - 10 years, and 11 - 18 years. We present justification for our groupings. We report on the quality of boundary detection algorithm with the pediatric lung models.

  16. Improving virtual screening predictive accuracy of Human kallikrein 5 inhibitors using machine learning models.

    PubMed

    Fang, Xingang; Bagui, Sikha; Bagui, Subhash

    2017-08-01

    The readily available high throughput screening (HTS) data from the PubChem database provides an opportunity for mining of small molecules in a variety of biological systems using machine learning techniques. From the thousands of available molecular descriptors developed to encode useful chemical information representing the characteristics of molecules, descriptor selection is an essential step in building an optimal quantitative structural-activity relationship (QSAR) model. For the development of a systematic descriptor selection strategy, we need the understanding of the relationship between: (i) the descriptor selection; (ii) the choice of the machine learning model; and (iii) the characteristics of the target bio-molecule. In this work, we employed the Signature descriptor to generate a dataset on the Human kallikrein 5 (hK 5) inhibition confirmatory assay data and compared multiple classification models including logistic regression, support vector machine, random forest and k-nearest neighbor. Under optimal conditions, the logistic regression model provided extremely high overall accuracy (98%) and precision (90%), with good sensitivity (65%) in the cross validation test. In testing the primary HTS screening data with more than 200K molecular structures, the logistic regression model exhibited the capability of eliminating more than 99.9% of the inactive structures. As part of our exploration of the descriptor-model-target relationship, the excellent predictive performance of the combination of the Signature descriptor and the logistic regression model on the assay data of the Human kallikrein 5 (hK 5) target suggested a feasible descriptor/model selection strategy on similar targets. Copyright © 2017 Elsevier Ltd. All rights reserved.
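
    The inactive-elimination use of the model can be sketched as a rank-and-cut step: score every untested structure with a fitted logistic regression and keep only the top fraction for confirmatory screening. The binary descriptor matrix below is a random stand-in for Signature descriptors, and the 10% cut-off is illustrative.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        # Placeholder binary descriptor matrix standing in for Signature descriptors;
        # actives are defined by an arbitrary bit pattern so the example is self-contained.
        rng = np.random.default_rng(5)
        X = rng.integers(0, 2, (5000, 300))
        y = (X[:, :5].sum(axis=1) == 5).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0, stratify=y)
        clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)

        # Rank the untested structures and keep only the top-scoring 10%.
        proba = clf.predict_proba(X_te)[:, 1]
        keep = proba >= np.quantile(proba, 0.90)
        print("actives retained:", int(y_te[keep].sum()), "of", int(y_te.sum()),
              "| structures kept:", int(keep.sum()), "of", len(y_te))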

  17. Industrial Inspection with Open Eyes: Advance with Machine Vision Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zheng; Ukida, H.; Niel, Kurt

    Machine vision systems have evolved significantly with technological advances to tackle the challenges of modern manufacturing industry. A wide range of industrial inspection applications for quality control are benefiting from visual information captured by different types of cameras variously configured in a machine vision system. This chapter screens the state of the art in machine vision technologies in the light of hardware, software tools, and major algorithm advances for industrial inspection. Inspection beyond the visual spectrum offers a significant complement to visual inspection. Combining multiple technologies makes it possible for inspection to achieve better performance and efficiency in varied applications. The diversity of the applications demonstrates the great potential of machine vision systems for industry.

  18. Suitability Screening Test for Marine Corps Air Traffic Controllers Phase 2

    DTIC Science & Technology

    2013-06-10

    Walker, Karen M.; Farmer, William L.; Roberts, Rebecca C. Navy Personnel Research, Studies, and Technology, NPRST-TR-13-2, June 2013. Suitability Screening Test for Marine Corps Air Traffic Controllers, Phase II. Reviewed by Tanja Blackstone; approved and released by D. M. Cashbaugh.

  19. Suitability Screening Test for Marine Corps Air Traffic Controllers. Phase 2

    DTIC Science & Technology

    2013-06-01

    Walker, Karen M.; Farmer, William L.; Roberts, Rebecca C. Navy Personnel Research, Studies, and Technology, NPRST-TR-13-2, June 2013. Suitability Screening Test for Marine Corps Air Traffic Controllers, Phase II. Reviewed by Tanja Blackstone; approved and released by D. M. Cashbaugh.

  20. Machine learning techniques applied to the determination of road suitability for the transportation of dangerous substances.

    PubMed

    Matías, J M; Taboada, J; Ordóñez, C; Nieto, P G

    2007-08-17

    This article describes a methodology to model the degree of remedial action required to make short stretches of a roadway suitable for dangerous goods transport (DGT), particularly pollutant substances, using different variables associated with the characteristics of each segment. Thirty-one factors determining the impact of an accident on a particular stretch of road were identified and subdivided into two major groups: accident probability factors and accident severity factors. Given the number of factors determining the state of a particular road segment, the only viable statistical methods for implementing the model were machine learning techniques, such as multilayer perceptron networks (MLPs), classification trees (CARTs) and support vector machines (SVMs). The results produced by these techniques on a test sample were more favourable than those produced by traditional discriminant analysis, irrespective of whether dimensionality reduction techniques were applied. The best results were obtained using SVMs specifically adapted to ordinal data. This technique takes advantage of the ordinal information contained in the data without penalising the computational load. Furthermore, the technique permits the estimation of the utility function that is latent in expert knowledge.
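
    Since the paper's ordinal SVM formulation is not reproduced here, the sketch below uses the common Frank-and-Hall-style reduction of an ordinal target to several binary SVMs as an approximation; the road-segment features, labels and thresholds are invented placeholders.

        import numpy as np
        from sklearn.svm import SVC

        # Placeholder segment features (e.g., curvature, traffic, distance to watercourse)
        # and an ordinal label 0..3 for the degree of remedial action required.
        rng = np.random.default_rng(6)
        X = rng.normal(size=(300, 8))
        y = ((X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 0).astype(int)
             + (X[:, 3] > 0.5).astype(int))

        # One probabilistic SVM per threshold "y > k" (Frank-Hall style ordinal reduction).
        thresholds = [0, 1, 2]
        models = [SVC(probability=True, random_state=0).fit(X, (y > k).astype(int))
                  for k in thresholds]

        def predict_ordinal(Xnew):
            p_gt = np.column_stack([m.predict_proba(Xnew)[:, 1] for m in models])
            # Expected class = sum of P(y > k) over thresholds, rounded to the nearest level.
            return np.clip(np.rint(p_gt.sum(axis=1)), 0, 3).astype(int)

        print(predict_ordinal(X[:10]), y[:10])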

  1. Cosmic logic: a computational model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanchurin, Vitaly, E-mail: vvanchur@d.umn.edu

    We initiate a formal study of logical inferences in context of the measure problem in cosmology or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  2. Cosmic logic: a computational model

    NASA Astrophysics Data System (ADS)

    Vanchurin, Vitaly

    2016-02-01

    We initiate a formal study of logical inferences in context of the measure problem in cosmology or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.
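
    The non-computability result follows the classical halting-problem diagonalization. The toy sketch below (not from the paper) shows why a hypothetical mortal/immortal discriminator, here called is_mortal, cannot exist as a total computable function: feeding the paradoxical program its own source would contradict whatever answer the discriminator gave.

        # Illustrative sketch of the standard diagonalization argument.
        # `is_mortal` is a hypothetical CS-machine-like predicate assumed to
        # decide whether a program halts ("mortal") on a given input.

        def is_mortal(program_source: str, program_input: str) -> bool:
            """Assumed decider: True iff the program halts on the input."""
            raise NotImplementedError("No such total decider can exist.")

        def paradox(source: str) -> None:
            # If the decider says `source` halts on itself, loop forever;
            # otherwise halt immediately. Running paradox on its own source
            # contradicts whatever `is_mortal` answered, so `is_mortal`
            # cannot be a total computable function.
            if is_mortal(source, source):
                while True:
                    pass
            return None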

  3. Hybrid MPI+OpenMP Programming of an Overset CFD Solver and Performance Investigations

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Jin, Haoqiang H.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    This report describes a two-level parallelization of a Computational Fluid Dynamics (CFD) solver with multi-zone overset structured grids. The approach is based on a hybrid MPI+OpenMP programming model suitable for shared-memory machines and clusters of shared-memory machines. Performance investigations of the hybrid application on an SGI Origin2000 (O2K) machine are reported using medium and large scale test problems.

  4. Rheology of cellulose nanofibrils/silver nanowires suspension for the production of transparent and conductive electrodes by screen printing

    NASA Astrophysics Data System (ADS)

    Hoeng, Fanny; Denneulin, Aurore; Reverdy-Bruas, Nadège; Krosnicki, Guillaume; Bras, Julien

    2017-02-01

    With the aim of producing silver nanowire-based electrodes by screen printing, this study evaluates the suitability of cellulose nanofibrils (CNF) as a thickening agent for a high-viscosity silver nanowire screen printing ink. The rheology of the CNF suspension was investigated against screen printing process requirements using both rotational and oscillatory rheology. CNF were found to act as a thickener and stabilizer for the silver nanowire suspension. However, the solid-dominant visco-elastic behavior of the CNF suspension was not suitable for screen printing and led to defects within the printed film. The CNF visco-elastic properties were therefore modified by adding hydroxypropylmethyl cellulose (HPMC) to the suspension. Homogeneous transparent conductive layers were obtained when using CNF-HPMC as a matrix for the silver nanowires. The screen-printed layers were characterized, and a sheet resistance of Rsh = 12 ± 5 Ω/□ and a transmittance of T(500 nm) = 74.8% were achieved without any additional post-treatment of the film.

  5. The Visual Uncertainty Paradigm for Controlling Screen-Space Information in Visualization

    ERIC Educational Resources Information Center

    Dasgupta, Aritra

    2012-01-01

    The information visualization pipeline serves as a lossy communication channel for presentation of data on a screen-space of limited resolution. The lossy communication is not just a machine-only phenomenon due to information loss caused by translation of data, but also a reflection of the degree to which the human user can comprehend visual…

  6. Application of Numerical Simulation for the Analysis of the Processes of Rotary Ultrasonic Drilling

    NASA Astrophysics Data System (ADS)

    Naď, Milan; Čičmancová, Lenka; Hajdu, Štefan

    2016-12-01

    Rotary ultrasonic machining (RUM) is a hybrid process that combines diamond grinding with ultrasonic machining. It is most suitable for machining hard, brittle materials such as ceramics and composites. Due to its excellent machining performance, RUM is very often applied for drilling hard-to-machine materials. In the final phase of drilling, edge deterioration of the drilled hole can occur, which results in a phenomenon called edge chipping. During hole drilling, the thickness of the bottom of the drilled hole changes. Consequently, the bottom of the hole, as a plate structure, passes through a resonance state. This resonance state can be considered one of the important factors leading to edge chipping. The effects of changes in the bottom thickness, as well as the fillet radius between the wall and bottom of the borehole, on the stress-strain states during RUM are analyzed.

  7. High-Strength Undiffused Brushless (HSUB) Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, John S; Lee, Seong T; Tolbert, Leon M

    2008-01-01

    This paper introduces a new high-strength undiffused brushless machine that transfers the stationary excitation magnetomotive force to the rotor without any brushes. For a conventional permanent magnet (PM) machine, the air-gap flux density cannot be enhanced effectively but can be weakened. In the new machine, both the stationary excitation coil and the PM in the rotor produce an enhanced air-gap flux. The PM in the rotor prevents magnetic-flux diffusion between the poles and guides the reluctance flux path. The pole flux density in the air gap can be much higher than what the PM alone can produce. A high-strength machine is thus obtained. The air-gap flux density can be weakened through the stationary excitation winding. This type of machine is particularly suitable for electric and hybrid-electric vehicle applications. Patents of this new technology are either granted or pending.

  8. Vibration Sensor Monitoring of Nickel-Titanium Alloy Turning for Machinability Evaluation.

    PubMed

    Segreto, Tiziana; Caggiano, Alessandra; Karam, Sara; Teti, Roberto

    2017-12-12

    Nickel-Titanium (Ni-Ti) alloys are very difficult-to-machine materials causing notable manufacturing problems due to their unique mechanical properties, including superelasticity, high ductility, and severe strain-hardening. In this framework, the aim of this paper is to assess the machinability of Ni-Ti alloys with reference to turning processes in order to realize a reliable and robust in-process identification of machinability conditions. An on-line sensor monitoring procedure based on the acquisition of vibration signals was implemented during the experimental turning tests. The detected vibration sensorial data were processed through an advanced signal processing method in time-frequency domain based on wavelet packet transform (WPT). The extracted sensorial features were used to construct WPT pattern feature vectors to send as input to suitably configured neural networks (NNs) for cognitive pattern recognition in order to evaluate the correlation between input sensorial information and output machinability conditions.
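
    A minimal sketch, assuming the PyWavelets and scikit-learn packages and using synthetic signals in place of the real vibration sensor data, of how wavelet-packet energy features could be extracted and fed to a small neural network in the spirit of the procedure described above; the feature definition and network size are illustrative choices, not the authors' configuration.

        import numpy as np
        import pywt
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        def wpt_energy_features(signal, wavelet="db4", level=3):
            # Decompose the signal into 2**level wavelet-packet nodes and
            # use the energy of each node as one feature.
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
            nodes = wp.get_level(level, order="natural")
            return np.array([np.sum(np.square(n.data)) for n in nodes])

        # Synthetic stand-in for vibration signals from two machinability classes.
        rng = np.random.default_rng(0)
        signals = rng.normal(size=(200, 1024))
        labels = rng.integers(0, 2, size=200)

        X = np.vstack([wpt_energy_features(s) for s in signals])
        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(X_tr, y_tr)
        print("test accuracy:", clf.score(X_te, y_te))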

  9. Vibration Sensor Monitoring of Nickel-Titanium Alloy Turning for Machinability Evaluation

    PubMed Central

    Segreto, Tiziana; Karam, Sara; Teti, Roberto

    2017-01-01

    Nickel-Titanium (Ni-Ti) alloys are very difficult-to-machine materials causing notable manufacturing problems due to their unique mechanical properties, including superelasticity, high ductility, and severe strain-hardening. In this framework, the aim of this paper is to assess the machinability of Ni-Ti alloys with reference to turning processes in order to realize a reliable and robust in-process identification of machinability conditions. An on-line sensor monitoring procedure based on the acquisition of vibration signals was implemented during the experimental turning tests. The detected vibration sensorial data were processed through an advanced signal processing method in time-frequency domain based on wavelet packet transform (WPT). The extracted sensorial features were used to construct WPT pattern feature vectors to send as input to suitably configured neural networks (NNs) for cognitive pattern recognition in order to evaluate the correlation between input sensorial information and output machinability conditions. PMID:29231864

  10. Automated Inference of Chemical Discriminants of Biological Activity.

    PubMed

    Raschka, Sebastian; Scott, Anne M; Huertas, Mar; Li, Weiming; Kuhn, Leslie A

    2018-01-01

    Ligand-based virtual screening has become a standard technique for the efficient discovery of bioactive small molecules. Following assays to determine the activity of compounds selected by virtual screening, or other approaches in which dozens to thousands of molecules have been tested, machine learning techniques make it straightforward to discover the patterns of chemical groups that correlate with the desired biological activity. Defining the chemical features that generate activity can be used to guide the selection of molecules for subsequent rounds of screening and assaying, as well as help design new, more active molecules for organic synthesis. The quantitative structure-activity relationship machine learning protocols we describe here, using decision trees, random forests, and sequential feature selection, take as input the chemical structure of a single, known active small molecule (e.g., an inhibitor, agonist, or substrate) for comparison with the structure of each tested molecule. Knowledge of the atomic structure of the protein target and its interactions with the active compound is not required. These protocols can be modified and applied to any data set that consists of a series of measured structural, chemical, or other features for each tested molecule, along with the experimentally measured value of the response variable you would like to predict or optimize for your project, for instance, inhibitory activity in a biological assay or ΔG of binding. To illustrate the use of different machine learning algorithms, we step through the analysis of a dataset of inhibitor candidates from virtual screening that were tested recently for their ability to inhibit GPCR-mediated signaling in a vertebrate.
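
    As a hedged illustration of the workflow described above (not the authors' protocol), the scikit-learn sketch below combines a random forest with sequential forward feature selection; the binary chemical-group feature matrix and activity labels are hypothetical placeholders.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.model_selection import cross_val_score

        # Hypothetical data: each row is one tested molecule described by binary
        # chemical-group features (e.g., groups shared with the known active),
        # and y indicates whether the molecule was active in the assay.
        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(300, 64))
        y = rng.integers(0, 2, size=300)

        forest = RandomForestClassifier(n_estimators=200, random_state=0)

        # Sequential forward selection keeps the handful of chemical features
        # that best explain activity.
        selector = SequentialFeatureSelector(forest, n_features_to_select=8,
                                             direction="forward", cv=5)
        selector.fit(X, y)
        print("selected feature indices:", np.flatnonzero(selector.get_support()))

        scores = cross_val_score(forest, selector.transform(X), y, cv=5)
        print("cross-validated accuracy on selected features:", scores.mean())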

  11. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines (version 2.x) is proposed. In contrast to other approaches in the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases assurance that the implemented system meets the user-defined requirements.

  12. Data mining in bioinformatics using Weka.

    PubMed

    Frank, Eibe; Hall, Mark; Trigg, Len; Holmes, Geoffrey; Witten, Ian H

    2004-10-12

    The Weka machine learning workbench provides a general-purpose environment for automatic classification, regression, clustering and feature selection-common data mining problems in bioinformatics research. It contains an extensive collection of machine learning algorithms and data pre-processing methods complemented by graphical user interfaces for data exploration and the experimental comparison of different machine learning techniques on the same problem. Weka can process data given in the form of a single relational table. Its main objectives are to (a) assist users in extracting useful information from data and (b) enable them to easily identify a suitable algorithm for generating an accurate predictive model from it. http://www.cs.waikato.ac.nz/ml/weka.

  13. Applications of Support Vector Machines In Chemo And Bioinformatics

    NASA Astrophysics Data System (ADS)

    Jayaraman, V. K.; Sundararajan, V.

    2010-10-01

    Conventional linear & nonlinear tools for classification, regression & data driven modeling are being replaced on a rapid scale by newer techniques & tools based on artificial intelligence and machine learning. While the linear techniques are not applicable for inherently nonlinear problems, newer methods serve as attractive alternatives for solving real life problems. Support Vector Machine (SVM) classifiers are a set of universal feed-forward network based classification algorithms that have been formulated from statistical learning theory and structural risk minimization principle. SVM regression closely follows the classification methodology. In this work recent applications of SVM in Chemo & Bioinformatics will be described with suitable illustrative examples.
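
    Since the record above notes that SVM regression closely follows the classification methodology, the following sketch (an illustration only, assuming scikit-learn and using synthetic descriptors in place of real chemo/bioinformatics data) shows both an SVM classifier and an SVM regressor built on the same kernel machinery.

        import numpy as np
        from sklearn.svm import SVC, SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 20))                   # hypothetical molecular descriptors
        y_class = (X[:, 0] + X[:, 1] > 0).astype(int)    # e.g., active / inactive label
        y_value = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=200)  # e.g., an assay readout

        # Classification and regression share the same kernel formulation.
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
        reg = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))

        print("classification accuracy:", cross_val_score(clf, X, y_class, cv=5).mean())
        print("regression R^2:", cross_val_score(reg, X, y_value, cv=5).mean())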

  14. A consideration of the operation of automatic production machines.

    PubMed

    Hoshi, Toshiro; Sugimoto, Noboru

    2015-01-01

    At worksites, various automatic production machines are in use to release workers from muscular labor or labor in detrimental environments. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention, and points out two types of machine operation: operation for which quick performance is required (operation that is not permitted to be delayed) and operation for which composed performance is required (operation that is not permitted to be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics can be evaluated as "asymmetric on the time-axis". Here, in order for workers to accept the risk of automatic production machines, it is generally preconditioned that harm should be sufficiently small or that avoidance of harm is easy. In this connection, this paper shows the possibility of facilitating acceptance of the risk of automatic production machines by enhancing this asymmetry on the time-axis.

  15. A consideration of the operation of automatic production machines

    PubMed Central

    HOSHI, Toshiro; SUGIMOTO, Noboru

    2015-01-01

    At worksites, various automatic production machines are in use to release workers from muscular labor or labor in detrimental environments. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention, and points out two types of machine operation: operation for which quick performance is required (operation that is not permitted to be delayed) and operation for which composed performance is required (operation that is not permitted to be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics can be evaluated as "asymmetric on the time-axis". Here, in order for workers to accept the risk of automatic production machines, it is generally preconditioned that harm should be sufficiently small or that avoidance of harm is easy. In this connection, this paper shows the possibility of facilitating acceptance of the risk of automatic production machines by enhancing this asymmetry on the time-axis. PMID:25739898

  16. Impact resistance of materials for guards on cutting machine tools--requirements in future European safety standards.

    PubMed

    Mewes, D; Trapp, R P

    2000-01-01

    Guards on machine tools are meant to protect operators from injuries caused by tools, workpieces, and fragments hurled out of the machine's working zone. This article presents the impact resistance requirements that guards must satisfy according to European safety standards for machine tools. Based upon these standards, the impact resistance of different guard materials was determined using cylindrical steel projectiles. Polycarbonate proves to be a suitable material for vision panels because of its high energy absorption capacity. The impact resistance of 8-mm thick polycarbonate is roughly equal to that of a 3-mm thick steel sheet Fe P01. Its limited ageing stability, however, makes it necessary to protect polycarbonate against cooling lubricants by means of additional panes on both sides.

  17. Overview of a workshop on screening methods for detecting potential (anti-) estrogenic/androgenic chemicals in wildlife

    USGS Publications Warehouse

    Ankley, Gerald T.; Mihaich, Ellen; Stahl, Ralph G.; Tillitt, Donald E.; Colborn, Theo; McMaster, Suzzanne; Miller, Ron; Bantle, John; Campbell, Pamela; Denslow, Nancy; Dickerson, Richard L.; Folmar, Leroy C.; Fry, Michael; Giesy, John P.; Gray, L. Earl; Guiney, Patrick; Hutchinson, Thomas; Kennedy, Sean W.; Kramer, Vincent; LeBlanc, Gerald A.; Mayes, Monte; Nimrod, Alison; Patino, Reynaldo; Peterson, Richard; Purdy, Richard; Ringer, Robert; Thomas, Peter C.; Touart, Les; Van Der Kraak, Glen; Zacharewski, Tim

    1998-01-01

    The U.S. Congress has passed legislation requiring the U.S. Environmental Protection Agency (U.S. EPA) to develop, validate, and implement screening tests for identifying potential endocrine-disrupting chemicals within 3 years. To aid in the identification of methods suitable for this purpose, the U.S. EPA, the Chemical Manufacturers Association, and the World Wildlife Fund sponsored several workshops, including the present one, which dealt with wildlife species. This workshop was convened with 30 international scientists representing multiple disciplines in March 1997 in Kansas City, Missouri, USA. Participants at the meeting identified methods in terms of their ability to indicate (anti-) estrogenic/androgenic effects, particularly in the context of developmental and reproductive processes. Data derived from structure-activity relationship models and in vitro test systems, although useful in certain contexts, cannot at present replace in vivo tests as the sole basis for screening. A consensus was reached that existing mammalian test methods (e.g., with rats or mice) generally are suitable as screens for assessing potential (anti-) estrogenic/ androgenic effects in mammalian wildlife. However, due to factors such as among-class variation in receptor structure and endocrine function, it is uncertain if these mammalian assays would be of broad utility as screens for other classes of vertebrate wildlife. Existing full and partial life-cycle tests with some avian and fish species could successfully identify chemicals causing endocrine disruption; however, these long-term tests are not suitable for routine screening. However, a number of short-term tests with species from these two classes exist that could serve as effective screening tools for chemicals inducing (anti-) estrogenic/androgenic effects. Existing methods suitable for identifying chemicals with these mechanisms of action in reptiles and amphibians are limited, but in the future, tests with species from these classes may prove highly effective as screens. In the case of invertebrate species, too little is known at present about the biological role of estrogens and androgens in reproduction and development to recommend specific assays.

  18. 21 CFR 1270.21 - Determination of donor suitability for human tissue intended for transplantation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... virus, Type 1 (e.g., FDA licensed screening test for anti-HIV-1); (2) Human immunodeficiency virus, Type 2 (e.g., FDA licensed screening test for anti-HIV-2); (3) Hepatitis B (e.g., FDA licensed screening... been tested and found negative using FDA licensed screening tests for HIV-1, HIV-2, hepatitis B, and...

  19. 21 CFR 1270.21 - Determination of donor suitability for human tissue intended for transplantation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... virus, Type 1 (e.g., FDA licensed screening test for anti-HIV-1); (2) Human immunodeficiency virus, Type 2 (e.g., FDA licensed screening test for anti-HIV-2); (3) Hepatitis B (e.g., FDA licensed screening... been tested and found negative using FDA licensed screening tests for HIV-1, HIV-2, hepatitis B, and...

  20. 21 CFR 1270.21 - Determination of donor suitability for human tissue intended for transplantation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... virus, Type 1 (e.g., FDA licensed screening test for anti-HIV-1); (2) Human immunodeficiency virus, Type 2 (e.g., FDA licensed screening test for anti-HIV-2); (3) Hepatitis B (e.g., FDA licensed screening... been tested and found negative using FDA licensed screening tests for HIV-1, HIV-2, hepatitis B, and...

  1. 21 CFR 1270.21 - Determination of donor suitability for human tissue intended for transplantation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... virus, Type 1 (e.g., FDA licensed screening test for anti-HIV-1); (2) Human immunodeficiency virus, Type 2 (e.g., FDA licensed screening test for anti-HIV-2); (3) Hepatitis B (e.g., FDA licensed screening... been tested and found negative using FDA licensed screening tests for HIV-1, HIV-2, hepatitis B, and...

  2. Orbital fatigue tester for use in Skylab experiment T032

    NASA Technical Reports Server (NTRS)

    Sandorff, P. E.

    1973-01-01

    A prototype fatigue test machine is described which is suitable for use by an astronaut in conducting constant-amplitude materials fatigue tests aboard a Skylab or space shuttle vehicle. The machine is comprised of a mechanical tester, which would be passed through a small (7.6-inch square) airlock to be supported in the space environment on an extendible boom, and a control console, which would provide remote control from within the space vehicle.

  3. Development and Evaluation of a Wide-Bed Former for Vegetable Cultivation in Controlled Tractor Traffic

    NASA Astrophysics Data System (ADS)

    Dixit, Anoop; Khurana, Rohinish; Verma, Aseem; Singh, Arshdeep; Manes, G. S.

    2018-05-01

    India is the second largest producer of vegetables in the world. For vegetable cultivation, good seed bed preparation is an important task that involves 6-10 different operations. To address the issue of multiple operations, a prototype tractor-operated wide-bed former was developed and evaluated. The machine comprises a rotary tiller and a bed-forming setup. It forms a bed with a 1000 mm top width, which suits the track width of an average-sized tractor in India. The beds formed are 130 mm high, and the channels formed on both sides of the bed have top and bottom widths of 330 and 40 mm, respectively, at a soil moisture content of 12.5-16% (db). A forward speed of 2.75 km/h was found suitable for proper bed formation. The average fuel consumption of the machine was 5.9 l/h. The average bulk density of soil before and after bed formation was 1.46 and 1.63 g/cc, respectively. The field capacity of the machine was found to be 0.31 ha/h. The machine resulted in 93.8% labour saving and 80.4% saving in the cost of bed preparation as compared to conventional farmer practice. The overall performance of the wide-bed former was found to be satisfactory.

  4. From CBCL to DSM: A Comparison of Two Methods to Screen for DSM-IV Diagnoses Using CBCL Data

    ERIC Educational Resources Information Center

    Krol, Nicole P. C. M.; De Bruyn, Eric E. J.; Coolen, Jolanda C.; van Aarle, Edward J. M.

    2006-01-01

    The screening efficiency of 2 methods to convert Child Behavior Checklist (CBCL) assessment data into Diagnostic and Statistical Manual of Mental Disorders (4th ed. [DSM-IV]; American Psychiatric Association, 1994) diagnoses was compared. The Machine-Aided Diagnosis (MAD) method converts CBCL input data directly into DSM-IV symptom criteria. The…

  5. Virtual screening of inorganic materials synthesis parameters with deep learning

    NASA Astrophysics Data System (ADS)

    Kim, Edward; Huang, Kevin; Jegelka, Stefanie; Olivetti, Elsa

    2017-12-01

    Virtual materials screening approaches have proliferated in the past decade, driven by rapid advances in first-principles computational techniques, and machine-learning algorithms. By comparison, computationally driven materials synthesis screening is still in its infancy, and is mired by the challenges of data sparsity and data scarcity: Synthesis routes exist in a sparse, high-dimensional parameter space that is difficult to optimize over directly, and, for some materials of interest, only scarce volumes of literature-reported syntheses are available. In this article, we present a framework for suggesting quantitative synthesis parameters and potential driving factors for synthesis outcomes. We use a variational autoencoder to compress sparse synthesis representations into a lower dimensional space, which is found to improve the performance of machine-learning tasks. To realize this screening framework even in cases where there are few literature data, we devise a novel data augmentation methodology that incorporates literature synthesis data from related materials systems. We apply this variational autoencoder framework to generate potential SrTiO3 synthesis parameter sets, propose driving factors for brookite TiO2 formation, and identify correlations between alkali-ion intercalation and MnO2 polymorph selection.
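
    As an illustration only (not the authors' code), the following minimal PyTorch sketch shows the basic structure of a variational autoencoder that compresses a sparse synthesis-parameter vector into a low-dimensional latent space; the input dimension, latent size and the SynthesisVAE class name are hypothetical placeholders, and PyTorch is an assumed dependency.

        import torch
        import torch.nn as nn

        class SynthesisVAE(nn.Module):
            # Compresses a sparse synthesis-parameter vector (dimension d_in,
            # a hypothetical placeholder) into a low-dimensional latent code.
            def __init__(self, d_in=256, d_latent=8):
                super().__init__()
                self.encoder = nn.Sequential(nn.Linear(d_in, 64), nn.ReLU())
                self.mu = nn.Linear(64, d_latent)
                self.logvar = nn.Linear(64, d_latent)
                self.decoder = nn.Sequential(nn.Linear(d_latent, 64), nn.ReLU(),
                                             nn.Linear(64, d_in))

            def forward(self, x):
                h = self.encoder(x)
                mu, logvar = self.mu(h), self.logvar(h)
                z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
                return self.decoder(z), mu, logvar

        def vae_loss(x, x_hat, mu, logvar):
            # Reconstruction error plus KL divergence to the unit Gaussian prior.
            recon = nn.functional.mse_loss(x_hat, x, reduction="sum")
            kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
            return recon + kl

        model = SynthesisVAE()
        x = torch.rand(32, 256)            # hypothetical batch of synthesis vectors
        x_hat, mu, logvar = model(x)
        print("loss:", vae_loss(x, x_hat, mu, logvar).item())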

  6. Performance of machine-learning scoring functions in structure-based virtual screening.

    PubMed

    Wójcikowski, Maciej; Ballester, Pedro J; Siedlecki, Pawel

    2017-04-25

    Classical scoring functions have reached a plateau in their performance in virtual screening and binding affinity prediction. Recently, machine-learning scoring functions trained on protein-ligand complexes have shown great promise in small tailored studies. They have also raised controversy, specifically concerning model overfitting and applicability to novel targets. Here we provide a new ready-to-use scoring function (RF-Score-VS) trained on 15 426 active and 893 897 inactive molecules docked to a set of 102 targets. We use the full DUD-E data sets along with three docking tools, five classical and three machine-learning scoring functions for model building and performance assessment. Our results show RF-Score-VS can substantially improve virtual screening performance: RF-Score-VS top 1% provides a 55.6% hit rate, whereas Vina's top 1% provides only 16.2% (for smaller percentages the difference is even more encouraging: RF-Score-VS top 0.1% achieves an 88.6% hit rate versus 27.5% for Vina). In addition, RF-Score-VS provides much better prediction of measured binding affinity than Vina (Pearson correlation of 0.56 and -0.18, respectively). Lastly, we test RF-Score-VS on an independent test set from the DEKOIS benchmark and observe comparable results. We provide full data sets to facilitate further research in this area (http://github.com/oddt/rfscorevs) as well as ready-to-use RF-Score-VS (http://github.com/oddt/rfscorevs_binary).
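
    The top-k% hit rate used to compare the scoring functions above can be computed directly from ranked screening scores. The sketch below (NumPy only, with randomly generated scores and activity labels as placeholders for real docking output) is an illustration of that metric, not part of the RF-Score-VS code.

        import numpy as np

        def top_fraction_hit_rate(scores, is_active, fraction=0.01):
            # Rank molecules by predicted score and report the fraction of true
            # actives among the top `fraction` of the ranked list.
            n_top = max(1, int(len(scores) * fraction))
            order = np.argsort(scores)[::-1]         # best-scored molecules first
            return np.mean(is_active[order[:n_top]])

        # Hypothetical virtual-screening output: one score and one activity
        # label per docked molecule.
        rng = np.random.default_rng(0)
        is_active = rng.random(100_000) < 0.02
        scores = rng.normal(size=100_000) + 2.0 * is_active

        print("top 1%% hit rate: %.3f" % top_fraction_hit_rate(scores, is_active, 0.01))
        print("top 0.1%% hit rate: %.3f" % top_fraction_hit_rate(scores, is_active, 0.001))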

  7. A Machine Learning Application Based in Random Forest for Integrating Mass Spectrometry-Based Metabolomic Data: A Simple Screening Method for Patients With Zika Virus.

    PubMed

    Melo, Carlos Fernando Odir Rodrigues; Navarro, Luiz Claudio; de Oliveira, Diogo Noin; Guerreiro, Tatiane Melina; Lima, Estela de Oliveira; Delafiori, Jeany; Dabaja, Mohamed Ziad; Ribeiro, Marta da Silva; de Menezes, Maico; Rodrigues, Rafael Gustavo Martins; Morishita, Karen Noda; Esteves, Cibele Zanardi; de Amorim, Aline Lopes Lucas; Aoyagui, Caroline Tiemi; Parise, Pierina Lorencini; Milanez, Guilherme Paier; do Nascimento, Gabriela Mansano; Ribas Freitas, André Ricardo; Angerami, Rodrigo; Costa, Fábio Trindade Maranhão; Arns, Clarice Weis; Resende, Mariangela Ribeiro; Amaral, Eliana; Junior, Renato Passini; Ribeiro-do-Valle, Carolina C; Milanez, Helaine; Moretti, Maria Luiza; Proenca-Modena, Jose Luiz; Avila, Sandra; Rocha, Anderson; Catharino, Rodrigo Ramos

    2018-01-01

    Recent Zika outbreaks in South America, accompanied by unexpectedly severe clinical complications, have brought much interest in fast and reliable screening methods for ZIKV (Zika virus) identification. Reverse-transcriptase polymerase chain reaction (RT-PCR) is currently the method of choice to detect ZIKV in biological samples. This approach, nonetheless, demands a considerable amount of time and resources, such as kits and reagents, that in endemic areas may place a substantial financial burden on affected individuals and health services, steering them away from RT-PCR analysis. This study presents a powerful combination of high-resolution mass spectrometry and a machine-learning prediction model for data analysis to assess the existence of ZIKV infection across a series of patients that bear similar symptomatic conditions but are not necessarily infected with the disease. By feeding mass spectrometric data into the developed decision-making algorithm, we were able to provide a set of features that work as a "fingerprint" for this specific pathophysiological condition, even after the acute phase of infection. Since both mass spectrometry and machine learning are well-established and widely utilized tools within their respective fields, this combination of methods emerges as a distinct alternative for clinical applications, providing a faster and more accurate diagnostic screening with improved cost-effectiveness compared to existing technologies.

  8. Analyzing the Utilization of Interferon-Gamma Screening for Tuberculosis at Recruit Training Command, Great Lakes

    DTIC Science & Technology

    2006-05-31

    the project through a business case analysis conducted on this same subject. CAPT Monestersky, Preventive Medicine; LCDR Jacobs, Occupational Medicine... TB and LTBI (Taylor, Nolan, & Blumberg, Interferon-γ screening for TB, 2005). They are based on science founded in the 19th century. The current... identify individuals who may not be suitable for service. For the majority who are suitable, the "P" days allow time to complete necessary business

  9. Readability, Suitability and Health Content Assessment of Cancer Screening Announcements in Municipal Newspapers in Japan.

    PubMed

    Okuhara, Tsuyoshi; Ishikawa, Hirono; Okada, Hiroko; Kiuchi, Takahiro

    2015-01-01

    The objective of this study was to assess the readability, suitability, and health content of cancer screening information in municipal newspapers in Japan. Suitability Assessment of Materials (SAM) and the framework of Health Belief Model (HBM) were used for assessment of municipal newspapers that were published in central Tokyo (23 wards) from January to December 2013. The mean domain SAM scores of content, literacy demand, and layout/typography were considered superior. The SAM scores of interaction with readers, an indication of the models of desirable actions, and elaboration to enhance readers' self-efficacy were low. According to the HBM coding, messages of medical/clinical severity, of social severity, of social benefits, and of barriers of fear were scarce. The articles were generally well written and suitable. However, learning stimulation/motivation was scarce and the HBM constructs were not fully addressed. Articles can be improved to motivate readers to obtain cancer screening by increasing interaction with readers, introducing models of desirable actions and devices to raise readers' self-efficacy, and providing statements of perceived barriers of fear for pain and time constraints, perceived severity, and social benefits and losses.

  10. MEASUREMENTS OF GAMMA-RAY DOSES OF DIFFERENT RADIOISOTOPES BY THE TEST-FILM METHOD (in German)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Domanus, J.; Halski, L.

    The test-film method seems to be most suitable for systematic, periodical measurements of individual doses of ionizing radiation. Persons handling radioisotopes are irradiated with gamma rays of different energies. The energy of gamma radiation lies within much broader limits than is the case with x rays. Therefore it was necessary to check whether the test-film method is suitable for measuring doses of gamma rays of such different energies and to choose the proper combination of film and screen to reach the necessary measuring range. Polish films, Foton Rentgen and Foton Rentgen Super, and films from the German Democratic Republic, Agfa Texo R and Agfa Texo S, were tested. Exposures were made without intensifying screens as well as with lead and fluorescent screens. The investigations showed that for dosimetric purposes the Foton Rentgen Super films are most suitable. However, not one of the film-screen combinations gave satisfactory results for radioisotopes with radiation of different energies. In such a case the test-film method gives only approximate results. If, on the contrary, gamma energies do not differ greatly, the test-film method proves to be quite good. (auth)

  11. Temporary-tattoo for long-term high fidelity biopotential recordings

    NASA Astrophysics Data System (ADS)

    Bareket, Lilach; Inzelberg, Lilah; Rand, David; David-Pur, Moshe; Rabinovich, David; Brandes, Barak; Hanein, Yael

    2016-05-01

    Electromyography is a non-invasive method widely used to map muscle activation. For decades, it was commonly accepted that dry metallic electrodes establish poor electrode-skin contact, making them impractical for skin electromyography applications. Gelled electrodes are therefore the standard in electromyography with their use confined, almost entirely, to laboratory settings. Here we present novel dry electrodes, exhibiting outstanding electromyography recording along with excellent user comfort. The electrodes were realized using screen-printing of carbon ink on a soft support. The conformity of the electrodes helps establish direct contact with the skin, making the use of a gel superfluous. Plasma polymerized 3,4-ethylenedioxythiophene was used to enhance the impedance of the electrodes. Cyclic voltammetry measurements revealed an increase in electrode capacitance by a factor of up to 100 in wet conditions. Impedance measurements show a reduction factor of 10 in electrode impedance on human skin. The suitability of the electrodes for long-term electromyography recordings from the hand and from the face is demonstrated. The presented electrodes are ideally-suited for many applications, such as brain-machine interfacing, muscle diagnostics, post-injury rehabilitation, and gaming.

  12. A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer

    NASA Astrophysics Data System (ADS)

    Luckman, Adrian J.; Allinson, Nigel M.

    1989-03-01

    A low cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a Reduced Instruction Set Computer (RISC) architecture, the ARM (Acorn Risc Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system is comprised of real-time video digitising hardware which interfaces directly to the Archimedes memory, and software to provide an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line with programmable parameters such as sampling rate and gain. Software support includes a work environment for image capture and processing with pixel, neighbourhood and global operators. A friendly user interface is provided with the help of the Archimedes Operating System WIMP (Windows, Icons, Mouse and Pointer) Manager. Windows provide a convenient way of handling images on the screen and program control is directed mostly by pop-up menus.

  13. Designing Semiconductor Heterostructures Using Digitally Accessible Electronic-Structure Data

    NASA Astrophysics Data System (ADS)

    Shapera, Ethan; Schleife, Andre

    Semiconductor sandwich structures, so-called heterojunctions, are at the heart of modern applications with tremendous societal impact: Light-emitting diodes shape the future of lighting and solar cells are promising for renewable energy. However, their computer-based design is hampered by the high cost of electronic structure techniques used to select materials based on alignment of valence and conduction bands and to evaluate excited state properties. We describe, validate, and demonstrate an open source Python framework which rapidly screens existing online databases and user-provided data to find combinations of suitable, previously fabricated materials for optoelectronic applications. The branch point energy aligns valence and conduction bands of different materials, requiring only the bulk density functional theory band structure. We train machine learning algorithms to predict the dielectric constant, electron mobility, and hole mobility with material descriptors available in online databases. Using CdSe and InP as emitting layers for LEDs and CH3NH3PbI3 and nanoparticle PbS as absorbers for solar cells, we demonstrate our broadly applicable, automated method.

  14. Temporary-tattoo for long-term high fidelity biopotential recordings

    PubMed Central

    Bareket, Lilach; Inzelberg, Lilah; Rand, David; David-Pur, Moshe; Rabinovich, David; Brandes, Barak; Hanein, Yael

    2016-01-01

    Electromyography is a non-invasive method widely used to map muscle activation. For decades, it was commonly accepted that dry metallic electrodes establish poor electrode-skin contact, making them impractical for skin electromyography applications. Gelled electrodes are therefore the standard in electromyography with their use confined, almost entirely, to laboratory settings. Here we present novel dry electrodes, exhibiting outstanding electromyography recording along with excellent user comfort. The electrodes were realized using screen-printing of carbon ink on a soft support. The conformity of the electrodes helps establish direct contact with the skin, making the use of a gel superfluous. Plasma polymerized 3,4-ethylenedioxythiophene was used to enhance the impedance of the electrodes. Cyclic voltammetry measurements revealed an increase in electrode capacitance by a factor of up to 100 in wet conditions. Impedance measurements show a reduction factor of 10 in electrode impedance on human skin. The suitability of the electrodes for long-term electromyography recordings from the hand and from the face is demonstrated. The presented electrodes are ideally-suited for many applications, such as brain-machine interfacing, muscle diagnostics, post-injury rehabilitation, and gaming. PMID:27169387

  15. TiO2 gas sensor to detect the propanol at room temperature

    NASA Astrophysics Data System (ADS)

    Gaidan, Ibrahim; Asbia, Salim; Brabazon, Dermot; Ahad, Inam Ul

    2017-10-01

    Titanium dioxide (TiO2) was used as the raw material to create sensing materials for gas sensor applications. The sample was mixed with isopropanol, wet-ball milled for 24 hours, and then dried at 120°C to evaporate the solvent. Twenty grams of the dried powder was then pressed at 2 tons (27.58 MPa) using a pellet die. The pellet was heated at 1250°C in air for 5 hours and then milled for 10 minutes to powder form using a Gy-RO Mill machine. FIB and SEM analysis were used to study the microstructure of the materials. Polyvinyl butyral (5 wt.%) was used as a binder, while ethylene glycol monobutyl ether served as a solvent to make a suitable paste. The paste was screen-printed on top of an alumina substrate that had copper electrodes to form the sensor. The sensor was used to detect propanol at room temperature over two different ranges (500 to 3000 ppm and 2500 to 5000 ppm). It was observed that the response of the device increased proportionally with increasing gas concentration, with good repeatability.

  16. Automated cell analysis tool for a genome-wide RNAi screen with support vector machine based supervised learning

    NASA Astrophysics Data System (ADS)

    Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen

    2011-03-01

    RNAi-based high-throughput microscopy screens have become an important tool in the biological sciences for decrypting the mostly unknown biological functions of human genes. However, manual analysis is impossible for such screens since the number of image data sets can often be in the hundreds of thousands. Reliable automated tools are thus required to analyse the fluorescence microscopy image data sets, which usually contain two or more reaction channels. The image analysis tool presented here is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels and the investigated cells can appear in different phenotypes. The main issues of the image processing task are an automatic cell segmentation, which has to be robust and accurate for all phenotypes, and a subsequent phenotype classification. The cell segmentation is done in two steps by segmenting the cell nuclei first and then using a classifier-enhanced region growing on the basis of the cell nuclei to segment the cells. The classification of the cells is realized by a support vector machine, which has to be trained manually using supervised learning. Furthermore, the tool is brightness invariant, allowing for different staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied to an RNAi screen containing three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
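
    A minimal sketch, assuming scikit-image and scikit-learn and a synthetic two-channel image in place of real screen data, of the two-step idea described above: segment nuclei first, then classify phenotypes from per-cell features with an SVM. The feature choices and function names are illustrative placeholders, not the published tool.

        import numpy as np
        from skimage.filters import threshold_otsu, gaussian
        from skimage.measure import label, regionprops
        from sklearn.svm import SVC

        def segment_nuclei(nuclei_channel):
            # Step 1: segment cell nuclei by smoothing and Otsu thresholding.
            smooth = gaussian(nuclei_channel, sigma=2)
            mask = smooth > threshold_otsu(smooth)
            return label(mask)

        def per_cell_features(labels, signal_channel):
            # Simple per-cell features: nucleus area and mean signal intensity.
            feats = []
            for region in regionprops(labels, intensity_image=signal_channel):
                feats.append([region.area, region.mean_intensity])
            return np.array(feats)

        # Hypothetical two-channel image and manually annotated phenotype labels.
        rng = np.random.default_rng(0)
        nuclei = rng.random((256, 256))
        signal = rng.random((256, 256))
        cell_labels = segment_nuclei(nuclei)
        X = per_cell_features(cell_labels, signal)
        y = rng.integers(0, 2, size=len(X))      # supervised phenotype annotation

        clf = SVC(kernel="rbf").fit(X, y)        # Step 2: SVM phenotype classifier
        print("training accuracy:", clf.score(X, y))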

  17. Classifying the Indication for Colonoscopy Procedures: A Comparison of NLP Approaches in a Diverse National Healthcare System.

    PubMed

    Patterson, Olga V; Forbush, Tyler B; Saini, Sameer D; Moser, Stephanie E; DuVall, Scott L

    2015-01-01

    In order to measure the level of utilization of colonoscopy procedures, identifying the primary indication for the procedure is required. Colonoscopies may be utilized not only for screening, but also for diagnostic or therapeutic purposes. To determine whether a colonoscopy was performed for screening, we created a natural language processing system to identify colonoscopy reports in the electronic medical record system and extract indications for the procedure. A rule-based model and three machine-learning models were created using 2,000 manually annotated clinical notes of patients cared for in the Department of Veterans Affairs. Performance of the models was measured and compared. Analysis of the models on a test set of 1,000 documents indicates that the rule-based system's performance stays fairly constant when evaluated on the training and testing sets. However, the machine learning model without feature selection showed a significant decrease in performance. Therefore, the rule-based classification system appears to be more robust than a machine-learning system when no feature selection is performed.
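
    To illustrate the contrast drawn above (a sketch only, with invented report snippets and labels, not the VA system), the example below pairs a minimal keyword rule with a bag-of-words machine-learning classifier for the screening-versus-other indication decision.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Hypothetical snippets from colonoscopy reports and indication labels
        # (1 = screening, 0 = diagnostic/therapeutic).
        notes = [
            "average risk patient presenting for screening colonoscopy",
            "colonoscopy for surveillance of prior polyps",
            "evaluation of iron deficiency anemia and melena",
            "routine screening exam, no gastrointestinal symptoms",
        ]
        labels = [1, 0, 0, 1]

        def rule_based_indication(text):
            # Minimal keyword rule: call it screening unless a diagnostic cue appears.
            diagnostic_cues = ("anemia", "melena", "bleeding", "surveillance")
            return 0 if any(cue in text.lower() for cue in diagnostic_cues) else 1

        ml_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
        ml_model.fit(notes, labels)

        test = "screening colonoscopy in patient with family history"
        print("rule-based:", rule_based_indication(test))
        print("machine learning:", ml_model.predict([test])[0])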

  18. 21 CFR 1270.21 - Determination of donor suitability for human tissue intended for transplantation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 2 (e.g., FDA licensed screening test for anti-HIV-2); (3) Hepatitis B (e.g., FDA licensed screening test for HBsAg); and (4) Hepatitis C (e.g., FDA licensed screening test for anti-HCV). (b) In the case... been tested and found negative using FDA licensed screening tests for HIV-1, HIV-2, hepatitis B, and...

  19. Solar and Wind Site Screening Decision Trees

    EPA Pesticide Factsheets

    EPA and NREL created a decision tree to guide state and local governments and other stakeholders through a process for screening sites for their suitability for future redevelopment with solar photovoltaic (PV) energy and wind energy.

  20. Localized thin-section CT with radiomics feature extraction and machine learning to classify early-detected pulmonary nodules from lung cancer screening

    NASA Astrophysics Data System (ADS)

    Tu, Shu-Ju; Wang, Chih-Wei; Pan, Kuang-Tse; Wu, Yi-Cheng; Wu, Chen-Te

    2018-03-01

    Lung cancer screening aims to detect small pulmonary nodules and decrease the mortality rate of those affected. However, studies from large-scale clinical trials of lung cancer screening have shown that the false-positive rate is high and positive predictive value is low. To address these problems, a technical approach is greatly needed for accurate malignancy differentiation among these early-detected nodules. We studied the clinical feasibility of an additional protocol of localized thin-section CT for further assessment of recalled patients from lung cancer screening tests. Our approach of localized thin-section CT was integrated with radiomics feature extraction and machine learning classification supervised by pathological diagnosis. Localized thin-section CT images of 122 nodules were retrospectively reviewed and 374 radiomics features were extracted. In this study, 48 nodules were benign and 74 malignant. There were nine patients with multiple nodules and four with synchronous multiple malignant nodules. Different machine learning classifiers with a stratified ten-fold cross-validation were used and repeated 100 times to evaluate classification accuracy. Of the image features extracted from the thin-section CT images, 238 (64%) were useful in differentiating between benign and malignant nodules. These useful features include CT density (p = 0.002518), sigma (p = 0.002781), uniformity (p = 0.03241), and entropy (p = 0.006685). The highest classification accuracy was 79% by the logistic classifier. The performance metrics of this logistic classification model were 0.80 for the positive predictive value, 0.36 for the false-positive rate, and 0.80 for the area under the receiver operating characteristic curve. Our approach of direct risk classification supervised by the pathological diagnosis with localized thin-section CT and radiomics feature extraction may support clinical physicians in determining truly malignant nodules and therefore reduce problems in lung cancer screening.
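
    A minimal sketch of the evaluation scheme described above (repeated stratified ten-fold cross-validation of a logistic classifier), assuming scikit-learn and substituting a synthetic radiomics feature table for the real one; the number of repeats is reduced here for speed, whereas the study repeated the procedure 100 times.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Synthetic stand-in for the radiomics table: one row per nodule,
        # 374 features, and a benign (0) / malignant (1) label.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(122, 374))
        y = np.concatenate([np.zeros(48, dtype=int), np.ones(74, dtype=int)])

        model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000))
        cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)

        auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
        acc = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
        print("mean ROC AUC:", auc.mean())
        print("mean accuracy:", acc.mean())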

  1. Localized thin-section CT with radiomics feature extraction and machine learning to classify early-detected pulmonary nodules from lung cancer screening.

    PubMed

    Tu, Shu-Ju; Wang, Chih-Wei; Pan, Kuang-Tse; Wu, Yi-Cheng; Wu, Chen-Te

    2018-03-14

    Lung cancer screening aims to detect small pulmonary nodules and decrease the mortality rate of those affected. However, studies from large-scale clinical trials of lung cancer screening have shown that the false-positive rate is high and positive predictive value is low. To address these problems, a technical approach is greatly needed for accurate malignancy differentiation among these early-detected nodules. We studied the clinical feasibility of an additional protocol of localized thin-section CT for further assessment of recalled patients from lung cancer screening tests. Our approach of localized thin-section CT was integrated with radiomics feature extraction and machine learning classification supervised by pathological diagnosis. Localized thin-section CT images of 122 nodules were retrospectively reviewed and 374 radiomics features were extracted. In this study, 48 nodules were benign and 74 malignant. There were nine patients with multiple nodules and four with synchronous multiple malignant nodules. Different machine learning classifiers with a stratified ten-fold cross-validation were used and repeated 100 times to evaluate classification accuracy. Of the image features extracted from the thin-section CT images, 238 (64%) were useful in differentiating between benign and malignant nodules. These useful features include CT density (p = 0.002518), sigma (p = 0.002781), uniformity (p = 0.03241), and entropy (p = 0.006685). The highest classification accuracy was 79% by the logistic classifier. The performance metrics of this logistic classification model were 0.80 for the positive predictive value, 0.36 for the false-positive rate, and 0.80 for the area under the receiver operating characteristic curve. Our approach of direct risk classification supervised by the pathological diagnosis with localized thin-section CT and radiomics feature extraction may support clinical physicians in determining truly malignant nodules and therefore reduce problems in lung cancer screening.

  2. Using Spelling to Screen Bilingual Kindergarteners at Risk for Reading Difficulties

    ERIC Educational Resources Information Center

    Chua, Shi Min; Rickard Liow, Susan J.; Yeong, Stephanie H. M.

    2016-01-01

    For bilingual children, the results of language and literacy screening tools are often hard to interpret. This leads to late referral for specialized assessment or inappropriate interventions. To facilitate the early identification of reading difficulties in English, we developed a method of screening that is theory-driven yet suitable for…

  3. Organic rankine cycle waste heat applications

    DOEpatents

    Brasz, Joost J.; Biederman, Bruce P.

    2007-02-13

    A machine designed as a centrifugal compressor is applied as an organic rankine cycle turbine by operating the machine in reverse. In order to accommodate the higher pressures when operating as a turbine, a suitable refrigerant is chosen such that the pressures and temperatures are maintained within established limits. Such an adaptation of existing, relatively inexpensive equipment to an application that may be otherwise uneconomical, allows for the convenient and economical use of energy that would be otherwise lost by waste heat to the atmosphere.

  4. Towards the Teraflop CFD

    NASA Technical Reports Server (NTRS)

    Schreiber, Robert; Simon, Horst D.

    1992-01-01

    We are surveying current projects in the area of parallel supercomputers. The machines considered here will become commercially available in the 1990 - 1992 time frame. All are suitable for exploring the critical issues in applying parallel processors to large scale scientific computations, in particular CFD calculations. This chapter presents an overview of the surveyed machines, and a detailed analysis of the various architectural and technology approaches taken. Particular emphasis is placed on the feasibility of a Teraflops capability following the paths proposed by various developers.

  5. Recursive computer architecture for VLSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Treleaven, P.C.; Hopkins, R.P.

    1982-01-01

    A general-purpose computer architecture based on the concept of recursion and suitable for VLSI computer systems built from replicated (lego-like) computing elements is presented. The recursive computer architecture is defined by presenting a program organisation, a machine organisation and an experimental machine implementation oriented to VLSI. The experimental implementation is restricted to simple, identical microcomputers, each containing a memory, a processor and a communications capability. This future generation of lego-like computer systems is termed fifth generation computers by the Japanese. 30 references.

  6. On the suitability of the connection machine for direct particle simulation

    NASA Technical Reports Server (NTRS)

    Dagum, Leonard

    1990-01-01

    The algorithmic structure of the vectorizable Stanford particle simulation (SPS) method was examined, and the structure is reformulated in data parallel form. Some of the SPS algorithms can be directly translated to data parallel form, but several of the vectorizable algorithms have no direct data parallel equivalent. This requires the development of new, strictly data parallel algorithms. In particular, a new sorting algorithm is developed to identify collision candidates in the simulation, and a master/slave algorithm is developed to minimize communication cost in large table look-ups. Validation of the method is undertaken through test calculations for thermal relaxation of a gas, shock wave profiles, and shock reflection from a stationary wall. A qualitative measure of the performance of the Connection Machine for direct particle simulation is provided. The massively parallel architecture of the Connection Machine is found quite suitable for this type of calculation. However, there are difficulties in taking full advantage of this architecture because of the lack of a broad-based tradition of data parallel programming. An important outcome of this work has been new data parallel algorithms specifically of use for direct particle simulation but which also expand the data parallel diction.

  7. Biomachining - A new approach for micromachining of metals

    NASA Astrophysics Data System (ADS)

    Vigneshwaran, S. C. Sakthi; Ramakrishnan, R.; Arun Prakash, C.; Sashank, C.

    2018-04-01

    Machining is the process of removing material from a workpiece. Machining can be done by physical, chemical or biological methods. Though physical and chemical methods have been widely used in machining processes, they have their own disadvantages, such as the development of a heat-affected zone and the use of hazardous chemicals. Biomachining is a machining process in which bacteria are used to remove material from metal parts. Chemolithotrophic bacteria such as Acidithiobacillus ferrooxidans have been used in the biomachining of metals such as copper and iron. These bacteria are used because of their ability to catalyze the oxidation of inorganic substances. Biomachining is a suitable process for the micromachining of metals. This paper reviews the biomachining process and the various mechanisms involved. It also describes the parameters/factors to be considered in biomachining and their effect on the metal removal rate.

  8. Design Study for a Free-piston Vuilleumier Cycle Heat Pump

    NASA Astrophysics Data System (ADS)

    Matsue, Junji; Hoshino, Norimasa; Ikumi, Yonezou; Shirai, Hiroyuki

    Conceptual design for a free-piston Vuilleumier cycle heat pump machine was proposed. The machine was designed based upon the numerical results of a dynamic analysis method. The method included the effect of self excitation vibration with dissipation caused by the flow friction of an oscillating working gas flow and solid friction of seals. It was found that the design values of reciprocating masses and spring constants proposed in published papers related to this study were suitable for practical use. The fundamental effects of heat exchanger elements on dynamic behaviors of the machine were clarified. It has been pointed out that some improvements were required for thermodynamic analysis of heat exchangers and working spaces.

  9. Comparative Analysis of Automatic Exudate Detection between Machine Learning and Traditional Approaches

    NASA Astrophysics Data System (ADS)

    Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas

    To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated detection of early exudates (one of the visible signs of diabetic retinopathy) could help reduce blindness in diabetic patients. Traditional automatic exudate detection methods are based on specific parameter configurations, while machine learning approaches, which seem more flexible, may be computationally costly. A comparative analysis of traditional and machine learning methods for exudate detection, namely mathematical morphology, fuzzy c-means clustering, the naive Bayesian classifier, the Support Vector Machine and the Nearest Neighbor classifier, is presented. Detected exudates are validated against expert ophthalmologists' hand-drawn ground truths. The sensitivity, specificity, precision, accuracy and time complexity of each method are also compared.

  10. An imperialist competitive algorithm for virtual machine placement in cloud computing

    NASA Astrophysics Data System (ADS)

    Jamali, Shahram; Malektaji, Sepideh; Analoui, Morteza

    2017-05-01

    Cloud computing, the recently emerged revolution in the IT industry, is empowered by virtualisation technology. In this paradigm, the user's applications run on virtual machines (VMs). The process of selecting proper physical machines to host these virtual machines is called virtual machine placement, and it plays an important role in the resource utilisation and power efficiency of a cloud computing environment. In this paper, we propose an imperialist competitive-based algorithm for the virtual machine placement problem, called ICA-VMPLC. ICA is chosen as the base optimisation algorithm because of its ease of neighbourhood movement, good convergence rate and suitable terminology. The proposed algorithm investigates the search space in a unique manner to efficiently obtain an optimal placement solution that simultaneously minimises power consumption and total resource wastage. Its final solution performance is compared with several existing methods, such as grouping genetic and ant colony-based algorithms, as well as a bin packing heuristic. The simulation results show that the proposed method is superior to the other tested algorithms in terms of power consumption, resource wastage, CPU usage efficiency and memory usage efficiency.
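
    The abstract lists a bin packing heuristic among the baselines; the following hedged first-fit-decreasing sketch shows what such a baseline looks like for the placement problem (the CPU/memory demands and host capacity are invented), and it is not the ICA-VMPLC algorithm itself.

      # First-fit-decreasing baseline for VM placement (illustrative values only).
      vms = [(0.40, 0.30), (0.25, 0.50), (0.60, 0.20), (0.30, 0.30), (0.20, 0.10)]  # (cpu, mem) demands
      host_capacity = (1.0, 1.0)                                                    # per physical machine

      hosts = []  # each entry is the [used_cpu, used_mem] of one physical machine
      for cpu, mem in sorted(vms, key=lambda demand: demand[0] + demand[1], reverse=True):
          for host in hosts:
              if host[0] + cpu <= host_capacity[0] and host[1] + mem <= host_capacity[1]:
                  host[0] += cpu
                  host[1] += mem
                  break
          else:
              hosts.append([cpu, mem])  # no existing host fits: switch on a new machine

      print(len(hosts), "hosts used; per-host load:", hosts)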

  11. A novel assay for monoacylglycerol hydrolysis suitable for high-throughput screening.

    PubMed

    Brengdahl, Johan; Fowler, Christopher J

    2006-12-01

    A simple assay for monoacylglycerol hydrolysis suitable for high-throughput screening is described. The assay uses [(3)H]2-oleoylglycerol as substrate, with the tritium label in the glycerol part of the molecule and the use of phenyl sepharose gel to separate the hydrolyzed product ([(3)H]glycerol) from substrate. Using cytosolic fractions derived from rat cerebella as a source of hydrolytic activity, the assay gives the appropriate pH profile and sensitivity to inhibition with compounds known to inhibit hydrolysis of this substrate. The assay could also be adapted to a 96-well plate format, using C6 cells as the source of hydrolytic activity. Thus the assay is simple and appropriate for high-throughput screening of inhibitors of monoacylglycerol hydrolysis.

  12. Evaluation of the smoke density chamber as an apparatus for fire toxicity screening tests

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Labossiere, L. A.

    1976-01-01

    The smoke density chamber is perhaps the most widely used apparatus for smoke measurements. Because of its availability, it has been proposed as an apparatus for evaluating fire toxicity. The standard apparatus and procedure were not found suitable for toxicity screening tests using laboratory animals, because not enough materials of interest produced animal mortality or even incapacitation under standard test conditions. With modifications, the chamber offers greater promise as a screening tool, but other tests specifically designed to measure relative toxicity may be more cost-effective. Where one-dimensional heat flux is a requirement, the chamber is the most suitable apparatus available. It should be improved in regard to visibility of animals and ease of cleaning.

  13. Quantifying Pollutant Emissions from Office Equipment Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.

    2006-12-01

    Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment with respect to human exposures. The more detailed studies of the next phase of research (Phase II) are meant to characterize changes in emissions with time and may identify factors that can be modified to reduce emissions. These measurements may identify 'win-win' situations in which low energy consumption machines have lower pollutant emissions. This information will be used to compare machines to determine if some are substantially better than their peers with respect to their emissions of pollutants.

  14. Study on Parallel 2-DOF Rotation Mechanism in Radar

    NASA Astrophysics Data System (ADS)

    Jiang, Ming; Hu, Xuelong; Liu, Lei; Yu, Yunfei

    The spherical parallel machine has become an academic and industrial focus in recent years due to its simple and economical manufacture and its structural compactness, which is especially suitable for applications where the spatial attitude changes. This paper reviews the present state of its research and development at home and abroad. The newer machine (RGRR-II) can rotate around the z axis within 360° and around the y1 axis from -90° to +90°. It has advantages such as fewer moving parts (only 3 parts), a larger ratio of workspace to machine size, zero mechanical coupling and no singularities. Constructing a rotation machine with the spherical parallel 2-DOF rotation joint (RGRR-II) can realize hemispherical movement with no dead point and extend the range. A control card (PA8000NT Series CNC) installed in the computer runs the corresponding software that realizes the radar movement control. The machine meets the needs of airborne and satellite radars, which require a larger detection range, lighter weight and a more compact structure.

  15. Phoenito experiments: combining the strengths of commercial crystallization automation.

    PubMed

    Newman, Janet; Pham, Tam M; Peat, Thomas S

    2008-11-01

    The use of crystallization robots for initial screening in macromolecular crystallization is well established. This paper describes how four general optimization techniques, growth-rate modulation, fine screening, seeding and additive screening, have been adapted for automation in a medium-throughput crystallization service facility. The use of automation for more challenging optimization experiments is discussed, as is a novel way of using both the Mosquito and the Phoenix nano-dispensing robots during the setup of a single crystallization plate. This dual-dispenser technique plays to the strengths of both machines.

  16. Phoenito experiments: combining the strengths of commercial crystallization automation

    PubMed Central

    Newman, Janet; Pham, Tam M.; Peat, Thomas S.

    2008-01-01

    The use of crystallization robots for initial screening in macromolecular crystallization is well established. This paper describes how four general optimization techniques, growth-rate modulation, fine screening, seeding and additive screening, have been adapted for automation in a medium-throughput crystallization service facility. The use of automation for more challenging optimization experiments is discussed, as is a novel way of using both the Mosquito and the Phoenix nano-dispensing robots during the setup of a single crystallization plate. This dual-dispenser technique plays to the strengths of both machines. PMID:18997323

  17. An automated ranking platform for machine learning regression models for meat spoilage prediction using multi-spectral imaging and metabolic profiling.

    PubMed

    Estelles-Lopez, Lucia; Ropodi, Athina; Pavlidis, Dimitris; Fotopoulou, Jenny; Gkousari, Christina; Peyrodie, Audrey; Panagou, Efstathios; Nychas, George-John; Mohareb, Fady

    2017-09-01

    Over the past decade, analytical approaches based on vibrational spectroscopy, hyperspectral/multispectral imaging and biomimetic sensors started gaining popularity as rapid and efficient methods for assessing food quality, safety and authentication, as a sensible alternative to the expensive and time-consuming conventional microbiological techniques. Due to the multi-dimensional nature of the data generated by such analyses, the output needs to be coupled with a suitable statistical approach or machine learning algorithm before the results can be interpreted. Choosing the optimum pattern recognition or machine learning approach for a given analytical platform is often challenging and involves a comparative analysis between various algorithms in order to achieve the best possible prediction accuracy. In this work, "MeatReg", a web-based application, is presented that automates the procedure of identifying the best machine learning method for data from several analytical techniques, in order to predict the counts of the microorganisms responsible for meat spoilage regardless of the packaging system applied. In particular, up to 7 regression methods were applied: ordinary least squares regression, stepwise linear regression, partial least squares regression, principal component regression, support vector regression, random forest and k-nearest neighbours. "MeatReg" was tested with minced beef samples stored under aerobic and modified atmosphere packaging and analysed with an electronic nose, HPLC, FT-IR, GC-MS and a multispectral imaging instrument. Populations of total viable count, lactic acid bacteria, pseudomonads, Enterobacteriaceae and B. thermosphacta were predicted. As a result, recommendations were obtained on which analytical platforms are suitable to predict each type of bacteria and which machine learning methods to use in each case. The developed system is accessible via the link: www.sorfml.com. Copyright © 2017 Elsevier Ltd. All rights reserved.
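
    A hedged sketch of this style of regression benchmark, comparing a few of the listed methods by cross-validated RMSE; the synthetic data stand in for the spectral features and bacterial counts, and this is not the MeatReg implementation.

      from sklearn.datasets import make_regression
      from sklearn.model_selection import cross_val_score
      from sklearn.linear_model import LinearRegression
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.svm import SVR
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.neighbors import KNeighborsRegressor

      # Synthetic stand-in for spectral features vs. (log) bacterial counts.
      X, y = make_regression(n_samples=200, n_features=50, noise=5.0, random_state=0)

      models = {
          "OLS": LinearRegression(),
          "PLS": PLSRegression(n_components=10),
          "SVR": SVR(kernel="rbf", C=10.0),
          "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
          "k-NN": KNeighborsRegressor(n_neighbors=5),
      }
      for name, model in models.items():
          rmse = -cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error").mean()
          print(f"{name:14s} cross-validated RMSE = {rmse:.2f}")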

  18. Noninvasive prostate cancer screening based on serum surface-enhanced Raman spectroscopy and support vector machine

    NASA Astrophysics Data System (ADS)

    Li, Shaoxin; Zhang, Yanjiao; Xu, Junfa; Li, Linfang; Zeng, Qiuyao; Lin, Lin; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Liu, Songhao

    2014-09-01

    This study presents a noninvasive prostate cancer screening method using serum surface-enhanced Raman scattering (SERS) and support vector machine (SVM) techniques on peripheral blood samples. SERS measurements were performed with silver nanoparticles on serum samples from 93 prostate cancer patients and 68 healthy volunteers. Three types of kernel functions, namely linear, polynomial and Gaussian radial basis function (RBF), were employed to build SVM diagnostic models for classifying the measured SERS spectra. To comparably evaluate the performance of the SVM classification models, the standard multivariate statistical method of principal component analysis (PCA) was also applied to classify the same datasets. The results show that the RBF-kernel SVM diagnostic model achieves a diagnostic accuracy of 98.1%, which is superior to the 91.3% obtained with the PCA method. The receiver operating characteristic curves of the diagnostic models further confirm these results. This study demonstrates that label-free serum SERS analysis combined with an SVM diagnostic algorithm has great potential for noninvasive prostate cancer screening.
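
    An illustrative scikit-learn sketch contrasting an RBF-kernel SVM with a PCA-based pipeline, mirroring the comparison described; the synthetic matrix stands in for the SERS spectra, and the PCA pipeline ends in a plain linear classifier as an assumed stand-in for the multivariate statistical analysis.

      from sklearn.datasets import make_classification
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in for SERS spectra (rows = serum samples, columns = Raman shifts).
      X, y = make_classification(n_samples=161, n_features=600, n_informative=20, random_state=0)

      rbf_svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      pca_linear = make_pipeline(StandardScaler(), PCA(n_components=10), LogisticRegression(max_iter=1000))

      for name, model in [("RBF-kernel SVM", rbf_svm), ("PCA + linear model", pca_linear)]:
          accuracy = cross_val_score(model, X, y, cv=5).mean()
          print(f"{name:18s} cross-validated accuracy = {accuracy:.3f}")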

  19. The Harvard Clean Energy Project: High-throughput screening of organic photovoltaic materials using cheminformatics, machine learning, and pattern recognition

    NASA Astrophysics Data System (ADS)

    Olivares-Amaya, Roberto; Hachmann, Johannes; Amador-Bedolla, Carlos; Daly, Aidan; Jinich, Adrian; Atahan-Evrenk, Sule; Boixo, Sergio; Aspuru-Guzik, Alán

    2012-02-01

    Organic photovoltaic devices have emerged as competitors to silicon-based solar cells, currently reaching efficiencies of over 9% and offering desirable properties for manufacturing and installation. We study conjugated donor polymers for high-efficiency bulk-heterojunction photovoltaic devices with a molecular library motivated by experimental feasibility. We use quantum mechanics and a distributed computing approach to explore this vast molecular space. We will detail the screening approach starting from the generation of the molecular library, which can be easily extended to other kinds of molecular systems. We will describe the screening method for these materials which ranges from descriptor models, ubiquitous in the drug discovery community, to eventually reaching first principles quantum chemistry methods. We will present results on the statistical analysis, based principally on machine learning, specifically partial least squares and Gaussian processes. Alongside, clustering methods and the use of the hypergeometric distribution reveal moieties important for the donor materials and allow us to quantify structure-property relationships. These efforts enable us to accelerate materials discovery in organic photovoltaics through our collaboration with experimental groups.

  20. Mining hidden data to predict patient prognosis: texture feature extraction and machine learning in mammography

    NASA Astrophysics Data System (ADS)

    Leighs, J. A.; Halling-Brown, M. D.; Patel, M. N.

    2018-03-01

    The UK currently has a national breast cancer-screening program and images are routinely collected from a number of screening sites, representing a wealth of invaluable data that is currently under-used. Radiologists evaluate screening images manually and recall suspicious cases for further analysis such as biopsy. Histological testing of biopsy samples confirms the malignancy of the tumour, along with other diagnostic and prognostic characteristics such as disease grade. Machine learning is becoming increasingly popular for clinical image classification problems, as it is capable of discovering patterns in data otherwise invisible. This is particularly true when applied to medical imaging features; however clinical datasets are often relatively small. A texture feature extraction toolkit has been developed to mine a wide range of features from medical images such as mammograms. This study analysed a dataset of 1,366 radiologist-marked, biopsy-proven malignant lesions obtained from the OPTIMAM Medical Image Database (OMI-DB). Exploratory data analysis methods were employed to better understand extracted features. Machine learning techniques including Classification and Regression Trees (CART), ensemble methods (e.g. random forests), and logistic regression were applied to the data to predict the disease grade of the analysed lesions. Prediction scores of up to 83% were achieved; sensitivity and specificity of the models trained have been discussed to put the results into a clinical context. The results show promise in the ability to predict prognostic indicators from the texture features extracted and thus enable prioritisation of care for patients at greatest risk.

  1. Use of machine learning to improve autism screening and diagnostic instruments: effectiveness, efficiency, and multi-instrument fusion

    PubMed Central

    Bone, Daniel; Bishop, Somer; Black, Matthew P.; Goodwin, Matthew S.; Lord, Catherine; Narayanan, Shrikanth S.

    2016-01-01

    Background Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely-used ASD screening and diagnostic tools. Methods The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders (DD), split at age 10. Algorithms were created via a robust ML classifier, support vector machine (SVM), while targeting best-estimate clinical diagnosis of ASD vs. non-ASD. Parameter settings were tuned in multiple levels of cross-validation. Results The created algorithms were more effective (higher performing) than current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. Conclusions ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. PMID:27090613

  2. Use of machine learning to improve autism screening and diagnostic instruments: effectiveness, efficiency, and multi-instrument fusion.

    PubMed

    Bone, Daniel; Bishop, Somer L; Black, Matthew P; Goodwin, Matthew S; Lord, Catherine; Narayanan, Shrikanth S

    2016-08-01

    Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely used ASD screening and diagnostic tools. The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders, split at age 10. Algorithms were created via a robust ML classifier, support vector machine, while targeting best-estimate clinical diagnosis of ASD versus non-ASD. Parameter settings were tuned in multiple levels of cross-validation. The created algorithms were more effective (higher performing) than the current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight the limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. © 2016 Association for Child and Adolescent Mental Health.
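
    A hedged sketch of SVM parameter tuning inside nested ("multiple levels of") cross-validation, the general procedure both records above mention; the synthetic features are placeholders for the ADI-R/SRS item codes, which are not reproduced here.

      from sklearn.datasets import make_classification
      from sklearn.model_selection import GridSearchCV, cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Synthetic placeholder for item-level instrument codes.
      X, y = make_classification(n_samples=600, n_features=20, weights=[0.3, 0.7], random_state=1)

      # Inner loop tunes C and gamma; the outer loop estimates performance,
      # mirroring the idea of multiple levels of cross-validation.
      inner = GridSearchCV(
          make_pipeline(StandardScaler(), SVC(kernel="rbf")),
          param_grid={"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01, 0.1]},
          cv=5,
      )
      outer_scores = cross_val_score(inner, X, y, cv=5)
      print("nested cross-validation accuracy:", round(outer_scores.mean(), 3))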

  3. Performance of machine-learning scoring functions in structure-based virtual screening

    PubMed Central

    Wójcikowski, Maciej; Ballester, Pedro J.; Siedlecki, Pawel

    2017-01-01

    Classical scoring functions have reached a plateau in their performance in virtual screening and binding affinity prediction. Recently, machine-learning scoring functions trained on protein-ligand complexes have shown great promise in small tailored studies. They have also raised controversy, specifically concerning model overfitting and applicability to novel targets. Here we provide a new ready-to-use scoring function (RF-Score-VS) trained on 15 426 active and 893 897 inactive molecules docked to a set of 102 targets. We use the full DUD-E data sets along with three docking tools, five classical and three machine-learning scoring functions for model building and performance assessment. Our results show that RF-Score-VS can substantially improve virtual screening performance: in the top 1%, RF-Score-VS provides a 55.6% hit rate, whereas Vina provides only 16.2% (for smaller percentages the difference is even more encouraging: in the top 0.1%, RF-Score-VS achieves an 88.6% hit rate versus 27.5% for Vina). In addition, RF-Score-VS provides much better prediction of measured binding affinity than Vina (Pearson correlation of 0.56 versus −0.18, respectively). Lastly, we tested RF-Score-VS on an independent test set from the DEKOIS benchmark and observed comparable results. We provide full data sets to facilitate further research in this area (http://github.com/oddt/rfscorevs) as well as ready-to-use RF-Score-VS (http://github.com/oddt/rfscorevs_binary). PMID:28440302

  4. A general-purpose machine learning framework for predicting properties of inorganic materials

    DOE PAGES

    Ward, Logan; Agrawal, Ankit; Choudhary, Alok; ...

    2016-08-26

    A very active area of materials research is to devise methods that use machine learning to automatically extract predictive models from existing materials data. While prior examples have demonstrated successful models for some applications, many more applications exist where machine learning can make a strong impact. To enable faster development of machine-learning-based models for such applications, we have created a framework capable of being applied to a broad range of materials data. Our method works by using a chemically diverse list of attributes, which we demonstrate are suitable for describing a wide variety of properties, and a novel method for partitioning the data set into groups of similar materials to boost the predictive accuracy. In this manuscript, we demonstrate how this new method can be used to predict diverse properties of crystalline and amorphous materials, such as band gap energy and glass-forming ability.

  5. A general-purpose machine learning framework for predicting properties of inorganic materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Logan; Agrawal, Ankit; Choudhary, Alok

    A very active area of materials research is to devise methods that use machine learning to automatically extract predictive models from existing materials data. While prior examples have demonstrated successful models for some applications, many more applications exist where machine learning can make a strong impact. To enable faster development of machine-learning-based models for such applications, we have created a framework capable of being applied to a broad range of materials data. Our method works by using a chemically diverse list of attributes, which we demonstrate are suitable for describing a wide variety of properties, and a novel method for partitioning the data set into groups of similar materials to boost the predictive accuracy. In this manuscript, we demonstrate how this new method can be used to predict diverse properties of crystalline and amorphous materials, such as band gap energy and glass-forming ability.

  6. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras.

    PubMed

    Quinn, Mark Kenneth; Spinosa, Emanuele; Roberts, David A

    2017-07-25

    Measurements of pressure-sensitive paint (PSP) have been performed using new, non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to establish the relevant imaging characteristics and to show the applicability of such imaging technology to PSP. Camera performance is benchmarked and compared to standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on board small-scale models, such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access.

  7. Learning molecular energies using localized graph kernels.

    PubMed

    Ferré, Grégoire; Haut, Terry; Barros, Kipton

    2017-03-21

    Recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. We benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.

  8. Learning molecular energies using localized graph kernels

    NASA Astrophysics Data System (ADS)

    Ferré, Grégoire; Haut, Terry; Barros, Kipton

    2017-03-01

    Recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. We benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.
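
    A minimal sketch of one standard formulation of a random walk graph kernel (the geometric kernel on the direct product graph), the general idea the GRAPE abstracts build on; the decay parameter and toy adjacency matrices are illustrative, and this is not the GRAPE implementation.

      import numpy as np

      def random_walk_kernel(A1, A2, decay=0.1):
          """Geometric random-walk kernel between two graphs given as adjacency
          matrices: k = 1^T (I - decay * Ax)^(-1) 1 with Ax = kron(A1, A2),
          i.e. a weighted count of matching walks of every length."""
          Ax = np.kron(A1, A2)
          n = Ax.shape[0]
          # decay must stay below 1 / spectral_radius(Ax) for the series to converge.
          series = np.linalg.solve(np.eye(n) - decay * Ax, np.ones(n))
          return series.sum()

      # Two toy "local atomic environments" as unweighted adjacency matrices.
      A1 = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # 3-atom ring
      A2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 3-atom chain

      print(random_walk_kernel(A1, A1), random_walk_kernel(A1, A2))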

  9. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras

    PubMed Central

    Spinosa, Emanuele; Roberts, David A.

    2017-01-01

    Measurements of pressure-sensitive paint (PSP) have been performed using new, non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to establish the relevant imaging characteristics and to show the applicability of such imaging technology to PSP. Camera performance is benchmarked and compared to standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on board small-scale models, such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access. PMID:28757553

  10. Running accuracy analysis of a 3-RRR parallel kinematic machine considering the deformations of the links

    NASA Astrophysics Data System (ADS)

    Wang, Liping; Jiang, Yao; Li, Tiemin

    2014-09-01

    Parallel kinematic machines have drawn considerable attention and have been widely used in some special fields. However, high precision is still one of the challenges when they are used for advanced machine tools. One of the main reasons is that the kinematic chains of parallel kinematic machines are composed of elongated links that can easily suffer deformations, especially at high speeds and under heavy loads. A 3-RRR parallel kinematic machine is taken as a study object for investigating its accuracy with the consideration of the deformations of its links during the motion process. Based on the dynamic model constructed by the Newton-Euler method, all the inertia loads and constraint forces of the links are computed and their deformations are derived. Then the kinematic errors of the machine are derived with the consideration of the deformations of the links. Through further derivation, the accuracy of the machine is given in a simple explicit expression, which will be helpful to increase the calculating speed. The accuracy of this machine when following a selected circle path is simulated. The influences of magnitude of the maximum acceleration and external loads on the running accuracy of the machine are investigated. The results show that the external loads will deteriorate the accuracy of the machine tremendously when their direction coincides with the direction of the worst stiffness of the machine. The proposed method provides a solution for predicting the running accuracy of the parallel kinematic machines and can also be used in their design optimization as well as selection of suitable running parameters.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivares, Stefano

    We investigate the performance of a selective cloning machine based on linear optical elements and Gaussian measurements, which allows one to clone at will one of the two incoming input states. This machine is a complete generalization of a 1→2 cloning scheme demonstrated by Andersen et al. [Phys. Rev. Lett. 94, 240503 (2005)]. The input-output fidelity is studied for a generic Gaussian input state, and the effect of nonunit quantum efficiency is also taken into account. We show that, if the states to be cloned are squeezed states with known squeezing parameter, then the fidelity can be enhanced using a third suitable squeezed state during the final stage of the cloning process. A binary communication protocol based on the selective cloning machine is also discussed.

  12. Heuristic lipophilicity potential for computer-aided rational drug design: optimizations of screening functions and parameters.

    PubMed

    Du, Q; Mezey, P G

    1998-09-01

    In this research we test and compare three possible atom-based screening functions used in the heuristic molecular lipophilicity potential (HMLP). Screening function 1 is a power distance-dependent function, b_i / |R_i - r|^gamma; screening function 2 is an exponential distance-dependent function, b_i exp(-|R_i - r| / d_0); and screening function 3 is a weighted distance-dependent function, sign(b_i) exp(-xi |R_i - r| / |b_i|). For every screening function, the parameters (gamma, d_0, and xi) are optimized using 41 common organic molecules of 4 types of compounds: aliphatic alcohols, aliphatic carboxylic acids, aliphatic amines, and aliphatic alkanes. The results of the calculations show that screening function 3 cannot give chemically reasonable results; however, both the power screening function and the exponential screening function give chemically satisfactory results. There are two notable differences between screening functions 1 and 2. First, the exponential screening function has larger values at short distances than the power screening function, so more influence from the nearest neighbors is involved with screening function 2 than with screening function 1. Second, the power screening function has larger values at long distances than the exponential screening function, so screening function 1 is affected by atoms at long distance more than screening function 2. For screening function 1, the suitable range of the parameter gamma is 1.0 < gamma < 3.0; gamma = 2.3 is recommended, and gamma = 2.0 is the nearest integral value. For screening function 2, the suitable range of the parameter d_0 is 1.5 < d_0 < 3.0, and d_0 = 2.0 is recommended. The HMLP developed in this research provides a potential tool for computer-aided three-dimensional drug design.
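
    A small Python restatement of the three screening functions as written above, assuming placeholder atomic positions, lipophilicity increments b_i and the recommended parameter values; it transcribes only the formulas and is not the authors' HMLP code.

      import numpy as np

      def screening_power(b, R, r, gamma=2.3):
          """Screening function 1: b_i / |R_i - r|**gamma."""
          d = np.linalg.norm(R - r, axis=1)
          return b / d**gamma

      def screening_exponential(b, R, r, d0=2.0):
          """Screening function 2: b_i * exp(-|R_i - r| / d0)."""
          d = np.linalg.norm(R - r, axis=1)
          return b * np.exp(-d / d0)

      def screening_weighted(b, R, r, xi=1.0):
          """Screening function 3: sign(b_i) * exp(-xi * |R_i - r| / |b_i|)."""
          d = np.linalg.norm(R - r, axis=1)
          return np.sign(b) * np.exp(-xi * d / np.abs(b))

      # Placeholder atomic coordinates (angstroms), lipophilicity increments b_i
      # and a probe point r; none of these values come from the paper.
      R = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.5, 0.0]])
      b = np.array([0.5, -0.3, 0.2])
      r = np.array([1.0, 1.0, 1.0])

      print(screening_power(b, R, r).sum(), screening_exponential(b, R, r).sum())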

  13. Optimization of temperature field of tobacco heat shrink machine

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; Yang, Hai; Sun, Dong; Xu, Mingyang

    2018-06-01

    In a company's current heat shrink machine, the film does not shrink compactly and the temperature is uneven, resulting in poor surface quality of the shrunk film. To solve this problem, the temperature field is simulated and optimized using the k-epsilon turbulence model and the MRF model in Fluent. The simulation results show that installing a mesh screen structure at the suction inlet of the centrifugal fan increases the suction resistance of the fan and mitigates the eddy currents caused by its high-speed rotation, so that the internal temperature of the heat shrink machine becomes more continuous.

  14. Automated macromolecular crystallization screening

    DOEpatents

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.

  15. Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg

    2009-06-01

    This report serves as the final technical report and user's manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool by which users can technically screen, assess the storage capacity and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assesses whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool was peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the Workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the Screening Tool in a Phase III SBIR.

  16. Resource Sharing in a Network of Personal Computers.

    DTIC Science & Technology

    1982-12-01

    magnetic card, or a more secure identifier such as a machine-read fingerprint or voiceprint. Security and Protection … operations are invoked via messages, a program and its terminal can easily be located on separate machines. In Spice, an interface process called Canvas … request of a process. In Canvas, a process can only subdivide windows that it already has. On the other hand, the window manager treats the screen as a

  17. A STUDY TO DETERMINE THE EXTENT TO WHICH INSTRUCTION TO UNIVERSITY FRESHMEN IN THE USE OF THE UNIVERSITY LIBRARY CAN BE TURNED OVER TO TEACHING MACHINES. FINAL REPORT.

    ERIC Educational Resources Information Center

    WENDT, PAUL R.; AND OTHERS

    A BRANCHING TEACHING-MACHINE PROGRAM WAS DEVELOPED TO TEACH FRESHMEN TO LOCATE MATERIALS WITHOUT THE HELP OF A LIBRARIAN. THE STUDENT WAS SEATED IN FRONT OF A CONSOLE IN A DARKENED, QUIET, AIR-CONDITIONED ROOM. USING A KEYBOARD, THE STUDENT WAS ABLE TO CALL UP ON A SCREEN ANY ONE OF 150 SLIDES. PICTORIAL AND PERFORMANCE FRAMES WERE DEVELOPED TO…

  18. A neurite quality index and machine vision software for improved quantification of neurodegeneration.

    PubMed

    Romero, Peggy; Miller, Ted; Garakani, Arman

    2009-12-01

    Current methods to assess neurodegeneration in dorsal root ganglion cultures, used as a model for neurodegenerative diseases, are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity to early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications at decreased cost.

  19. Classifying BCI signals from novice users with extreme learning machine

    NASA Astrophysics Data System (ADS)

    Rodríguez-Bermúdez, Germán; Bueno-Crespo, Andrés; José Martinez-Albaladejo, F.

    2017-07-01

    Brain computer interfaces (BCI) allow external devices to be controlled using only the electrical activity of the brain. Several approaches have been proposed to improve such systems, but algorithms are usually tested on standard BCI signals from expert users or from repositories available on the Internet. In this work, an extreme learning machine (ELM) is tested with signals from 5 novice users and compared with standard classification algorithms. Experimental results show that ELM is a suitable method for classifying electroencephalogram signals from novice users.
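
    A hedged NumPy sketch of a basic extreme learning machine (random hidden-layer weights, least-squares output weights) of the kind the abstract refers to; the synthetic features are placeholders for EEG-derived features and the hyperparameters are arbitrary.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)

      # Synthetic placeholder for EEG-derived feature vectors.
      X, y = make_classification(n_samples=500, n_features=16, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      # Single-hidden-layer ELM: random input weights are never trained.
      n_hidden = 100
      W = rng.normal(size=(X.shape[1], n_hidden))
      bias = rng.normal(size=n_hidden)

      def hidden(X):
          return np.tanh(X @ W + bias)

      # Output weights by regularised least squares on the hidden activations.
      H = hidden(X_tr)
      beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden), H.T @ (2 * y_tr - 1))

      prediction = (hidden(X_te) @ beta > 0).astype(int)
      print("ELM test accuracy:", (prediction == y_te).mean())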

  20. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.

  1. Study on the Optimization and Process Modeling of the Rotary Ultrasonic Machining of Zerodur Glass-Ceramic

    NASA Astrophysics Data System (ADS)

    Pitts, James Daniel

    Rotary ultrasonic machining (RUM), a hybrid process combining ultrasonic machining and diamond grinding, was created to increase material removal rates in the fabrication of hard and brittle workpieces. The objective of this research was to experimentally derive empirical equations for predicting multiple machined surface roughness parameters for helically pocketed, rotary ultrasonic machined Zerodur glass-ceramic workpieces by means of a systematic statistical experimental approach. A Taguchi parametric screening design of experiments was employed to systematically determine the RUM process parameters with the largest effect on mean surface roughness. Empirical equations for seven common surface quality metrics were then developed via Box-Behnken response surface experimental trials. Validation trials were conducted, resulting in predicted and experimental surface roughness values in varying levels of agreement. The reductions in cutting force and tool wear associated with RUM, reported by previous researchers, were experimentally verified to also extend to the helical pocketing of Zerodur glass-ceramic.

  2. The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology

    NASA Astrophysics Data System (ADS)

    Messerotti, Mauro

    2010-05-01

    The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application to the high-level development of a set of multi-level concept maps in the framework of Space Meteorology, to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form via e.g. OWL (Web Ontology Language) is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. Therefore they are suitable to be published on the web: the coded knowledge can be exploited for educational purposes by students and the public, as the information can be naturally organized among linked concept maps at progressively increasing levels of complexity. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.

  3. Protein crystal screening and characterization for serial femtosecond nanocrystallography

    PubMed Central

    Darmanin, Connie; Strachan, Jamie; Adda, Christopher G.; Ve, Thomas; Kobe, Bostjan; Abbey, Brian

    2016-01-01

    The recent development of X-ray free electron lasers (XFELs) has spurred the development of serial femtosecond nanocrystallography (SFX) which, for the first time, is enabling structure retrieval from sub-micron protein crystals. Although there are already a growing number of structures published using SFX, the technology is still very new and presents a number of unique challenges as well as opportunities for structural biologists. One of the biggest barriers to the success of SFX experiments is the preparation and selection of suitable protein crystal samples. Here we outline a protocol for preparing and screening for suitable XFEL targets. PMID:27139248

  4. Fabrication and characteristics of experimental radiographic amplifier screens. [image transducers with improved image contrast and resolution

    NASA Technical Reports Server (NTRS)

    Szepesi, Z.

    1978-01-01

    The fabrication process and transfer characteristics for solid state radiographic image transducers (radiographic amplifier screens) are described. These screens are for use in realtime nondestructive evaluation procedures that require large format radiographic images with contrast and resolution capabilities unavailable with conventional fluoroscopic screens. The screens are suitable for in-motion, on-line radiographic inspection by means of closed circuit television. Experimental effort was made to improve image quality and response to low energy (5 kV and up) X-rays.

  5. Dynamism in a Semiconductor Industrial Machine Allocation Problem using a Hybrid of the Bio-inspired and Musical-Harmony Approach

    NASA Astrophysics Data System (ADS)

    Kalsom Yusof, Umi; Nor Akmal Khalid, Mohd

    2015-05-01

    Semiconductor industries need to constantly adjust to the rapid pace of change in the market, and most manufactured products have a very short life cycle. These scenarios imply the need to improve the efficiency of capacity planning, an important aspect of the machine allocation plan known for its complexity. Various studies have been performed to balance productivity and flexibility in the flexible manufacturing system (FMS), and many approaches have been developed by researchers to determine a suitable balance between exploration (global improvement) and exploitation (local improvement). However, not much work has focused on the machine allocation problem while considering the effects of machine breakdowns. This paper develops a model to minimize the effect of machine breakdowns and thus increase productivity. The objectives are to minimize system unbalance and makespan and to increase throughput while satisfying technological constraints such as machine time availability. To examine the effectiveness of the proposed model, experiments on real industrial datasets were performed using an intelligence technique, a hybrid of genetic algorithm and harmony search, and results for throughput, system unbalance and makespan were evaluated. The approach aims to obtain a feasible solution to the domain problem.

  6. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm.

    PubMed

    Heidari, Morteza; Khuzani, Abolfazl Zargari; Hollingsworth, Alan B; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-01-30

    In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate advantages of applying a machine learning approach embedded with a locally preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes of 250 high risk cases in which cancer was detected in the next subsequent mammography screening and 250 low risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increased trend of adjusted odds ratios was also detected in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.

  7. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Hollingsworth, Alan B.; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-02-01

    In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate advantages of applying a machine learning approach embedded with a locally preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes of 250 high risk cases in which cancer was detected in the next subsequent mammography screening and 250 low risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increased trend of adjusted odds ratios was also detected in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
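
    A hedged sketch of the overall scheme the two records above describe: a simple locality preserving projection fitted inside a leave-one-case-out loop before an SVM classifier. The LPP construction here (binary k-NN graph, generalized eigenproblem) follows the textbook formulation, the data are synthetic, and none of it is taken from the study's pipeline.

      import numpy as np
      from scipy.linalg import eigh
      from sklearn.datasets import make_classification
      from sklearn.neighbors import kneighbors_graph
      from sklearn.svm import SVC

      def lpp(X, n_components=4, n_neighbors=10):
          """Locality preserving projection (textbook form): binary k-NN graph,
          then the generalized eigenproblem X^T L X a = lambda X^T D X a,
          keeping the eigenvectors with the smallest eigenvalues."""
          W = kneighbors_graph(X, n_neighbors, mode="connectivity").toarray()
          W = np.maximum(W, W.T)                     # symmetrise the graph
          D = np.diag(W.sum(axis=1))
          L = D - W
          A = X.T @ L @ X
          B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])
          _, vectors = eigh(A, B)                    # eigenvalues in ascending order
          return vectors[:, :n_components]           # (n_features, n_components)

      # Synthetic placeholder for 44 bilateral-asymmetry features per case.
      X, y = make_classification(n_samples=200, n_features=44, n_informative=8, random_state=0)

      # Leave-one-case-out loop with the projection refitted inside every fold.
      correct = 0
      for i in range(len(y)):
          train = np.delete(np.arange(len(y)), i)
          P = lpp(X[train])
          clf = SVC(kernel="rbf").fit(X[train] @ P, y[train])
          correct += int(clf.predict(X[i:i + 1] @ P)[0] == y[i])
      print("LOCO accuracy:", correct / len(y))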

  8. Active machine learning-driven experimentation to determine compound effects on protein patterns.

    PubMed

    Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F

    2016-02-03

    High throughput screening determines the effects of many conditions on a given biological target. Currently, to estimate the effects of those conditions on other targets requires either strong modeling assumptions (e.g. similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance.
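
    A generic uncertainty-sampling loop, included only to illustrate the iterate-query-retrain idea of active learning on synthetic data; the cited work uses its own learner over compound and protein conditions, which is not reproduced here.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

      # Start with a small set of "already performed experiments".
      labeled = list(rng.choice(len(y), size=20, replace=False))
      pool = [i for i in range(len(y)) if i not in labeled]

      model = LogisticRegression(max_iter=1000)
      for _ in range(30):
          model.fit(X[labeled], y[labeled])
          # Query the pool item the current model is least certain about,
          # i.e. choose the next "experiment" to run.
          proba = model.predict_proba(X[pool])[:, 1]
          query = pool.pop(int(np.argmin(np.abs(proba - 0.5))))
          labeled.append(query)

      print("labels acquired:", len(labeled),
            "accuracy on the rest:", round(model.score(X[pool], y[pool]), 3))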

  9. Nanomeasuring and nanopositioning engineering

    NASA Astrophysics Data System (ADS)

    Jäger, G.; Hausotte, T.; Manske, E.; Büchner, H.-J.; Mastylo, R.; Dorozhovets, N.; Hofmann, N.

    2006-11-01

    The paper describes traceable nanometrology based on a nanopositioning machine with integrated nanoprobes. The operation of a high-precision long range three-dimensional nanopositioning and nanomeasuring machine (NPM-Machine) having a resolution of 0.1 nm over the positioning and measuring range of 25 mm x 25 mm x 5 mm is explained. An Abbe offset-free design of three miniature plan mirror interferometers and applying a new concept for compensating systematic errors resulting from mechanical guide systems provide very small uncertainties of measurement. The NPM-Machine has been developed by the Institute of Process Measurement and Sensor Technology of the Technische Universitaet Ilmenau and manufactured by the SIOS Messtechnik GmbH Ilmenau. The machines are operating successfully in several German and foreign research institutes including the Physikalisch-Technische Bundesanstalt (PTB), Germany. The integration of several, optical and tactile probe systems and nanotools makes the NPM-Machine suitable for various tasks, such as large-area scanning probe microscopy, mask and wafer inspection, nanostructuring, biotechnology and genetic engineering as well as measuring mechanical precision workpieces, precision treatment and for engineering new material. Various developed probe systems have been integrated into the NPM-Machine. The measurement results of a focus sensor, metrological AFM, white light sensor, tactile stylus probe and of a 3D-micro-touch-probe are presented. Single beam-, double beam- and triple beam interferometers built in the NPM-Machine for six degrees of freedom measurements are described.

  10. Development of TUA-WELLNESS screening tool for screening risk of mild cognitive impairment among community-dwelling older adults

    PubMed Central

    Vanoh, Divya; Shahar, Suzana; Rosdinom, Razali; Din, Normah Che; Yahya, Hanis Mastura; Omar, Azahadi

    2016-01-01

    Background and aim Screening for cognitive impairment has to be given particular importance because of the rising older adult population. Thus, this study aimed to develop and assess a brief screening tool consisting of ten items that can be self-administered by community-dwelling older adults (TUA-WELLNESS). Methodology A total of 1,993 noninstitutionalized respondents aged 60 years and above were selected for this study. The dependent variable was mild cognitive impairment (MCI) assessed using neuropsychological test batteries. The items for the screening tool comprised a wide range of factors chosen mainly from ordinal logistic regression (OLR) analysis and from past literature. A suitable cut-off point was derived using receiver operating characteristic analysis. Results A total of ten items were included in the screening tool. Of the ten items, eight were found to be significant by ordinal logistic regression, and the remaining two items were retained because they showed strong associations with cognitive impairment in previous studies. The area under the curve (AUC), sensitivity, and specificity for a cut-off of 11 were 0.84, 83.3%, and 73.4%, respectively. Conclusion The TUA-WELLNESS screening tool has been used to screen for major risk factors of MCI among Malaysian older adults. It is suitable only for basic MCI risk screening and should not be used for diagnostic purposes. PMID:27274208
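
    An illustrative sketch of choosing a cut-off from a receiver operating characteristic curve (here via the Youden index) and reporting AUC, sensitivity and specificity; the scores and MCI labels are simulated, and the cut-off rule is an assumption rather than necessarily the one used for TUA-WELLNESS.

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(0)

      # Simulated screening-tool scores and MCI status (illustration only).
      mci = rng.integers(0, 2, 300)
      score = rng.normal(loc=9 + 4 * mci, scale=3, size=300)

      fpr, tpr, thresholds = roc_curve(mci, score)
      best = np.argmax(tpr - fpr)                   # Youden index J = sensitivity + specificity - 1
      print(f"AUC = {roc_auc_score(mci, score):.2f}, "
            f"cut-off = {thresholds[best]:.1f}, "
            f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")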

  11. Comparison of Deep Learning With Multiple Machine Learning Methods and Metrics Using Diverse Drug Discovery Data Sets.

    PubMed

    Korotcov, Alexandru; Tkachenko, Valery; Russo, Daniel P; Ekins, Sean

    2017-12-04

    Machine learning methods have been applied to many data sets in pharmaceutical research for several decades. The relative ease and availability of fingerprint-type molecular descriptors paired with Bayesian methods resulted in the widespread use of this approach for a diverse array of end points relevant to drug discovery. Deep learning is the latest machine learning algorithm attracting attention for many pharmaceutical applications, from docking to virtual screening. Deep learning is based on an artificial neural network with multiple hidden layers and has found considerable traction for many artificial intelligence applications. We have previously suggested the need for a comparison of different machine learning methods with deep learning across an array of varying data sets that is applicable to pharmaceutical research. End points relevant to pharmaceutical research include absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) properties, as well as activity against pathogens and drug discovery data sets. In this study, we have used data sets for solubility, probe-likeness, hERG, KCNQ1, bubonic plague, Chagas, tuberculosis, and malaria to compare different machine learning methods using FCFP6 fingerprints. These data sets represent whole cell screens, individual proteins, physicochemical properties as well as a data set with a complex end point. Our aim was to assess whether deep learning offered any improvement in testing when assessed using an array of metrics including AUC, F1 score, Cohen's kappa, Matthews correlation coefficient and others. Based on ranked normalized scores for the metrics or data sets, Deep Neural Networks (DNN) ranked higher than SVM, which in turn ranked higher than all the other machine learning methods. Visualizing these properties for training and test sets using radar-type plots indicates when models are inferior or perhaps overtrained. These results also suggest the need for assessing deep learning further using multiple metrics with much larger-scale comparisons, prospective testing, and assessment of different fingerprints and DNN architectures beyond those used.

  12. 14 CFR 1214.504 - Screening requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... meaning of this subsection. (iii) All information obtained by medical or Employee Assistance Program... Critical Space System Personnel Reliability Program § 1214.504 Screening requirements. (a) Only those... program. 4 See footnote 1 to § 1214.502(e). (b) Determination of suitability for assignment to mission...

  13. A Boltzmann machine for the organization of intelligent machines

    NASA Technical Reports Server (NTRS)

    Moed, Michael C.; Saridis, George N.

    1989-01-01

    In the present technological society, there is a major need to build machines that would execute intelligent tasks operating in uncertain environments with minimum interaction with a human operator. Although some designers have built smart robots, utilizing heuristic ideas, there is no systematic approach to design such machines in an engineering manner. Recently, cross-disciplinary research from the fields of computers, systems, AI, and information theory has served to set the foundations of the emerging area of the design of intelligent machines. Since 1977 Saridis has been developing an approach, defined as Hierarchical Intelligent Control, designed to organize, coordinate and execute anthropomorphic tasks by a machine with minimum interaction with a human operator. This approach utilizes analytical (probabilistic) models to describe and control the various functions of the intelligent machine structured by the intuitively defined principle of Increasing Precision with Decreasing Intelligence (IPDI) (Saridis 1979). This principle, even though it resembles the managerial structure of organizational systems (Levis 1988), has been derived on an analytic basis by Saridis (1988). The purpose is to derive analytically a Boltzmann machine suitable for optimal connection of nodes in a neural net (Fahlman, Hinton, Sejnowski, 1985). Then this machine will serve to search for the optimal design of the organization level of an intelligent machine. In order to accomplish this, some mathematical theory of the intelligent machines will first be outlined. Then some definitions of the variables associated with the principle, like machine intelligence, machine knowledge, and precision, will be made (Saridis, Valavanis 1988). Then a procedure to establish the Boltzmann machine on an analytic basis will be presented and illustrated by an example in designing the organization level of an Intelligent Machine. A new search technique, the Modified Genetic Algorithm, is presented and proved to converge to the minimum of a cost function. Finally, simulations will show the effectiveness of a variety of search techniques for the intelligent machine.
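
    As a purely generic illustration (not Saridis's actual formulation), the Boltzmann-style stochastic search that underlies such a machine can be sketched as flipping node connections and accepting changes with a temperature-dependent probability:

        import math, random

        def cost(x):                        # hypothetical cost of a connection pattern
            return sum(x) + 3 * (x[0] ^ x[3])

        x = [random.randint(0, 1) for _ in range(6)]
        T = 2.0                             # "temperature" controlling randomness
        for step in range(500):
            i = random.randrange(len(x))
            candidate = x[:]
            candidate[i] ^= 1               # flip one connection
            delta = cost(candidate) - cost(x)
            # Boltzmann acceptance: always accept improvements, sometimes accept worse
            if delta <= 0 or random.random() < math.exp(-delta / T):
                x = candidate
            T *= 0.99                       # cool down
        print("final pattern:", x, "cost:", cost(x))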

  14. Situation analysis for cervical cancer diagnosis and treatment in east, central and southern African countries.

    PubMed Central

    Chirenje, Z. M.; Rusakaniko, S.; Kirumbi, L.; Ngwalle, E. W.; Makuta-Tlebere, P.; Kaggwa, S.; Mpanju-Shumbusho, W.; Makoae, L.

    2001-01-01

    OBJECTIVE: To determine the factors influencing cervical cancer diagnosis and treatment in countries of East, Central and Southern Africa (ECSA). METHODS: Data were collected from randomly selected primary health care centres, district and provincial hospitals, and tertiary hospitals in each participating country. Health care workers were interviewed, using a questionnaire; the facilities for screening, diagnosing, and treating cervical cancer in each institution were recorded, using a previously designed checklist. FINDINGS: Although 95% of institutions at all health care levels in ECSA countries had the basic infrastructure to carry out cervical cytology screening, only a small percentage of women were actually screened. Lack of policy guidelines, infrequent supply of basic materials, and a lack of suitable qualified staff were the most common reasons reported. CONCLUSIONS: This study demonstrates that there is an urgent need for more investment in the diagnosis and treatment of cervical cancer in ECSA countries. In these, and other countries with low resources, suitable screening programmes should be established. PMID:11242819

  15. A cross docking pipeline for improving pose prediction and virtual screening performance

    NASA Astrophysics Data System (ADS)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2018-01-01

    Pose prediction and virtual screening performance of a molecular docking method depend on the choice of protein structures used for docking. Multiple structures for a target protein are often used to take into account the receptor flexibility and problems associated with a single receptor structure. However, the use of multiple receptor structures is computationally expensive when docking a large library of small molecules. Here, we propose a new cross-docking pipeline suitable to dock a large library of molecules while taking advantage of multiple target protein structures. Our method involves the selection of a suitable receptor for each ligand in a screening library utilizing ligand 3D shape similarity with crystallographic ligands. We have prospectively evaluated our method in D3R Grand Challenge 2 and demonstrated that our cross-docking pipeline can achieve similar or better performance than using either single or multiple-receptor structures. Moreover, our method displayed not only decent pose prediction performance but also better virtual screening performance over several other methods.
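
    The per-ligand receptor selection described above can be sketched as picking, for each library ligand, the receptor whose crystallographic ligand is most similar in 3D shape; the similarity function below is only a placeholder for whatever shape comparison is actually used:

        def shape_similarity(lig_a, lig_b):
            # placeholder for a real 3D shape comparison (e.g. a shape-Tanimoto score);
            # here just an overlap of precomputed feature sets
            return len(lig_a & lig_b) / len(lig_a | lig_b)

        def pick_receptor(screening_ligand, receptors):
            """receptors: list of (receptor_id, crystallographic_ligand_features) pairs."""
            return max(receptors,
                       key=lambda rc: shape_similarity(screening_ligand, rc[1]))[0]

        lib_ligand = {"ring", "donor", "acceptor"}                    # hypothetical features
        receptors  = [("xtal_A", {"ring", "donor"}), ("xtal_B", {"acceptor"})]
        print(pick_receptor(lib_ligand, receptors))                   # -> "xtal_A"

    Each library ligand is then docked only into its selected receptor, which keeps the cost close to single-receptor docking.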

  16. Selection and propagation of highly graft-compatible Douglas-fir rootstocks—a case history.

    Treesearch

    Donald L. Copes

    1981-01-01

    Two populations of Douglas-fir trees were screened for graft compatibility. Two-stage testing procedures were used with either high- or low-intensity screening in the first step. Of 303 trees, 16 (5 percent) were found to be 90- to 100-percent graft compatible and suitable for seed orchard use as rootstocks. High-intensity screening in the first stage was more...

  17. Machine-Learning-Assisted Approach for Discovering Novel Inhibitors Targeting Bromodomain-Containing Protein 4.

    PubMed

    Xing, Jing; Lu, Wenchao; Liu, Rongfeng; Wang, Yulan; Xie, Yiqian; Zhang, Hao; Shi, Zhe; Jiang, Hao; Liu, Yu-Chih; Chen, Kaixian; Jiang, Hualiang; Luo, Cheng; Zheng, Mingyue

    2017-07-24

    Bromodomain-containing protein 4 (BRD4) is implicated in the pathogenesis of a number of different cancers, inflammatory diseases and heart failure. Much effort has been dedicated toward discovering novel scaffold BRD4 inhibitors (BRD4is) with different selectivity profiles and potential antiresistance properties. Structure-based drug design (SBDD) and virtual screening (VS) are the most frequently used approaches. Here, we demonstrate a novel, structure-based VS approach that uses machine-learning algorithms trained on prior structure and activity knowledge to predict the likelihood that a compound is a BRD4i based on its binding pattern with BRD4. In addition to positive experimental data, such as X-ray structures of BRD4-ligand complexes and BRD4 inhibitory potencies, negative data such as false positives (FPs) identified from our earlier ligand screening results were incorporated into our knowledge base. We used the resulting data to train a machine-learning model named BRD4LGR to predict the BRD4i-likeness of a compound. BRD4LGR achieved a 20-30% higher AUC-ROC than that of Glide using the same test set. When conducting in vitro experiments against a library of previously untested, commercially available organic compounds, the second round of VS using BRD4LGR generated 15 new BRD4is. Moreover, inverting the machine-learning model provided easy access to structure-activity relationship (SAR) interpretation for hit-to-lead optimization.
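
    The general idea of training on binding-pattern features and then "inverting" the model for SAR interpretation can be sketched with an interpretable linear classifier; the features, labels and algorithm below are stand-ins, not the actual BRD4LGR implementation:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.random((60, 5))            # hypothetical interaction-pattern features
        y = (X[:, 0] + 0.5 * X[:, 3] + 0.2 * rng.random(60) > 0.8).astype(int)  # 1 = BRD4i

        model = LogisticRegression().fit(X, y)
        print("training AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))
        # Large positive weights mark interaction features that drive predicted
        # BRD4i-likeness and can guide hit-to-lead optimization.
        print("feature weights:", model.coef_[0])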

  18. Roll-to-roll suitable short-pulsed laser scribing of organic photovoltaics and close-to-process characterization

    NASA Astrophysics Data System (ADS)

    Kuntze, Thomas; Wollmann, Philipp; Klotzbach, Udo; Fledderus, Henri

    2017-03-01

    The proper long-term operation of organic electronic devices like organic photovoltaics (OPV) depends on their resistance to environmental influences such as permeation of water vapor. Major efforts are spent to encapsulate OPV. The state of the art is sandwich-like encapsulation between two ultra-barrier foils. Sandwich encapsulation faces two major disadvantages: high costs (roughly 1/3 of total costs) and parasitic intrinsic water (sponge effects of the substrate foil). To counter these drawbacks, a promising approach is to use the OPV substrate itself as the barrier by integration of an ultra-barrier coating, followed by alternating deposition and structuring of OPV functional layers. In effect, more functionality is integrated into less material, and the number of production steps is reduced. None of the processing steps may impair the underlying barrier functionality, while all electrical functionalities must be maintained. As the most suitable structuring tool, short and ultrashort pulsed (USP) lasers are used. Laser machining applies to three layers: the bottom electrode made of transparent conductive materials (P1), the organic photovoltaic operative stack (P2) and the top electrode (P3). In this paper, the machining of functional 110…250 nm layers of flexible OPV by USP laser systems is presented. The main focus is on structuring without damaging the underlying ultra-barrier layer. The close-to-process machining quality characterization is performed with the analysis tool "hyperspectral imaging" (HSI), which is cross-checked with the "gold standard" Ca-test. It is shown that both laser machining and quality control are well suited for R2R production of OPV.

  19. A finite state machine read-out chip for integrated surface acoustic wave sensors

    NASA Astrophysics Data System (ADS)

    Rakshit, Sambarta; Iliadis, Agis A.

    2015-01-01

    A finite state machine based integrated sensor circuit suitable for the read-out module of a monolithically integrated SAW sensor on Si is reported. The primary sensor closed loop consists of a voltage controlled oscillator (VCO), a peak detecting comparator, a finite state machine (FSM), and a monolithically integrated SAW sensor device. The output of the system oscillates within a narrow voltage range that correlates with the SAW pass-band response. The period of oscillation is of the order of the SAW phase delay. We use timing information from the FSM to convert SAW phase delay to an on-chip 10 bit digital output operating on the principle of time to digital conversion (TDC). The control inputs of this digital conversion block are generated by a second finite state machine operating under a divided system clock. The average output varies with changes in SAW center frequency, thus tracking mass sensing events in real time. Based on measured VCO gain of 16 MHz/V our system will convert a 10 kHz SAW frequency shift to a corresponding mean voltage shift of 0.7 mV. A corresponding shift in phase delay is converted to a one or two bit shift in the TDC output code. The system can handle alternate SAW center frequencies and group delays simply by adjusting the VCO control and TDC delay control inputs. Because of frequency to voltage and phase to digital conversion, this topology does not require external frequency counter setups and is uniquely suitable for full monolithic integration of autonomous sensor systems and tags.

  20. Lean energy analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Liana, N. A.; Amsyar, N.; Hilmy, I.; Yusof, MD

    2018-01-01

    The industrial sector in Malaysia is one of the main sectors with a high percentage of energy demand compared to other sectors, and this problem may lead to future power shortages and increased production costs for companies. Suitable initiatives should be implemented by the industrial sector to address these issues, such as improving the machining system. In the past, the majority of energy analyses in industry focused on lighting, HVAC and office usage; the current trend is to include the manufacturing process in the energy analysis as well. A study on Lean Energy Analysis of a machining process is presented. Improving the energy efficiency of a lathe machine by adjusting the cutting parameters of the turning process is discussed. The energy consumption of a lathe machine was analyzed in order to identify the effect of the cutting parameters on energy consumption. It was found that the combination of parameters for the third run (spindle speed: 1065 rpm, depth of cut: 1.5 mm, feed rate: 0.3 mm/rev) was the most preferable and ideal for the turning process, as it consumed the least energy.
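
    The underlying comparison reduces to integrating measured machine power over the machining time for each parameter set; the power traces and durations below are invented to illustrate the arithmetic, not the study's measurements:

        def energy_kwh(power_samples_w, dt_s):
            return sum(power_samples_w) * dt_s / 3.6e6     # W*s -> kWh

        runs = {
            "run 1 (lower speed, longer cut)":       ([1500, 1550, 1600] * 40, 1.0),
            "run 3 (1065 rpm, 1.5 mm, 0.3 mm/rev)":  ([1700, 1720, 1710] * 25, 1.0),
        }
        for name, (trace, dt) in runs.items():
            print(name, "->", round(energy_kwh(trace, dt), 4), "kWh")

    A higher instantaneous power can still yield lower total energy if the more aggressive parameters shorten the cutting time, which is one way a parameter set can end up consuming less energy overall.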

  1. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    Manufacturing process environments require reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine. The reliable result produced from OEE can then be used to propose a suitable corrective action. Many published papers mention the purpose and benefits of OEE, covering the what and why factors; however, the how factor has not yet been addressed, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework to implement OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring their machine performance and later improve it.
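
    For reference, the OEE figure itself is the product of availability, performance and quality; a minimal calculation sketch with invented shift figures:

        planned_time = 480          # minutes in the shift
        downtime     = 60           # minutes of stops
        ideal_cycle  = 0.5          # minutes per part
        total_parts  = 700
        good_parts   = 665

        run_time     = planned_time - downtime
        availability = run_time / planned_time
        performance  = (ideal_cycle * total_parts) / run_time
        quality      = good_parts / total_parts
        oee          = availability * performance * quality
        print(f"A={availability:.2f} P={performance:.2f} Q={quality:.2f} OEE={oee:.2%}")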

  2. The development of mixer machine for organic animal feed production: Proposed study

    NASA Astrophysics Data System (ADS)

    Leman, A. M.; Wahab, R. Abdul; Zakaria, Supaat; Feriyanto, Dafit; Nor, M. I. F. Che Mohd; Muzarpar, Syafiq

    2017-09-01

    A mixer machine plays a major role in producing a homogeneous composition of animal feed. Long production times, inhomogeneity and minor agglomeration have been observed with existing mixers. Therefore, this paper proposes a continuous mixer to enhance mixing efficiency with a shorter mixing time, in order to shorten the whole animal feed production process. Calculations of torque, torsion, bending, power and energy consumption will be performed for the mixer machine process. The proposed mixer machine is designed with two layered buckets to provide continuity of the mixing process. Mixing is performed by 4 blades with various arm lengths (50, 100, 150 and 225 mm) rotating clockwise at 60 rpm. This machine is therefore expected to produce a homogeneous animal feed composition, verified by nutrition analysis, within a short mixing time of approximately 5 minutes. The resulting feed will be suitable for various animals, including poultry and aquatic fish, and the mixer will accommodate various organic materials in animal feed production. This paper also highlights areas such as a continuous animal feed supply chain and bio-based animal feed.
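
    A back-of-envelope sketch of the drive-power part of such calculations (power from torque and speed, then energy for a 5-minute batch); the torque value is a placeholder, not a result from the paper:

        import math

        torque_nm = 40.0                           # hypothetical total blade torque
        speed_rpm = 60.0                           # rotation speed given in the abstract
        omega     = 2 * math.pi * speed_rpm / 60   # rad/s
        power_w   = torque_nm * omega              # P = tau * omega
        energy_wh = power_w * (5 / 60)             # 5-minute mixing batch
        print(f"power ~ {power_w:.0f} W, energy per batch ~ {energy_wh:.0f} Wh")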

  3. Machinability evaluation of titanium alloys.

    PubMed

    Kikuchi, Masafumi; Okuno, Osamu

    2004-03-01

    In the present study, the machinability of titanium, Ti-6Al-4V, Ti-6Al-7Nb, and free-cutting brass was evaluated using a milling machine. The metals were slotted with square end mills under four cutting conditions. The cutting force and the rotational speed of the spindle were measured. The cutting forces for Ti-6Al-4V and Ti-6Al-7Nb were higher, and that for brass lower, than that for titanium. The rotational speed of the spindle was barely affected by cutting. The cross sections of the Ti-6Al-4V and Ti-6Al-7Nb chips were more clearly serrated than those of titanium, which is an indication of difficult-to-cut metals. There was no marked difference in the surface roughness of the cut surfaces among the metals. Cutting force and the appearance of the metal chips were found to be useful as indices of machinability and will aid in the development of new alloys for dental CAD/CAM and the selection of suitable machining conditions.

  4. Using information from historical high-throughput screens to predict active compounds.

    PubMed

    Riniker, Sereina; Wang, Yuan; Jenkins, Jeremy L; Landrum, Gregory A

    2014-07-28

    Modern high-throughput screening (HTS) is a well-established approach for hit finding in drug discovery that is routinely employed in the pharmaceutical industry to screen more than a million compounds within a few weeks. However, as the industry shifts to more disease-relevant but more complex phenotypic screens, the focus has moved to piloting smaller but smarter chemically/biologically diverse subsets followed by an expansion around hit compounds. One standard method for doing this is to train a machine-learning (ML) model with the chemical fingerprints of the tested subset of molecules and then select the next compounds based on the predictions of this model. An alternative approach would be to take advantage of the wealth of bioactivity information contained in older (full-deck) screens using so-called HTS fingerprints, where each element of the fingerprint corresponds to the outcome of a particular assay, as input to machine-learning algorithms. We constructed HTS fingerprints using two collections of data: 93 in-house assays and 95 publicly available assays from PubChem. For each source, an additional set of 51 and 46 assays, respectively, was collected for testing. Three different ML methods, random forest (RF), logistic regression (LR), and naïve Bayes (NB), were investigated for both the HTS fingerprint and a chemical fingerprint, Morgan2. RF was found to be best suited for learning from HTS fingerprints yielding area under the receiver operating characteristic curve (AUC) values >0.8 for 78% of the internal assays and enrichment factors at 5% (EF(5%)) >10 for 55% of the assays. The RF(HTS-fp) generally outperformed the LR trained with Morgan2, which was the best ML method for the chemical fingerprint, for the majority of assays. In addition, HTS fingerprints were found to retrieve more diverse chemotypes. Combining the two models through heterogeneous classifier fusion led to a similar or better performance than the best individual model for all assays. Further validation using a pair of in-house assays and data from a confirmatory screen--including a prospective set of around 2000 compounds selected based on our approach--confirmed the good performance. Thus, the combination of machine-learning with HTS fingerprints and chemical fingerprints utilizes information from both domains and presents a very promising approach for hit expansion, leading to more hits. The source code used with the public data is provided.
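
    The modelling step can be sketched as follows, with a toy "assay outcome" matrix standing in for real HTS fingerprints and the model scored by ROC AUC and the enrichment factor at 5% (evaluated on its own training data only to keep the sketch short):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        X = rng.integers(0, 4, size=(500, 93))      # toy 93-assay fingerprint rows
        y = (X[:, :5].sum(axis=1) + rng.normal(0, 2, 500) > 8).astype(int)

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        score = rf.predict_proba(X)[:, 1]

        def enrichment_factor(y_true, y_score, fraction=0.05):
            n_top = max(1, int(len(y_true) * fraction))
            top = np.argsort(y_score)[::-1][:n_top]
            return y_true[top].mean() / y_true.mean()

        print("AUC:", roc_auc_score(y, score))
        print("EF(5%):", enrichment_factor(y, score))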

  5. Development of Type 2 Diabetes Mellitus Phenotyping Framework Using Expert Knowledge and Machine Learning Approach.

    PubMed

    Kagawa, Rina; Kawazoe, Yoshimasa; Ida, Yusuke; Shinohara, Emiko; Tanaka, Katsuya; Imai, Takeshi; Ohe, Kazuhiko

    2017-07-01

    Phenotyping is an automated technique that can be used to distinguish patients based on electronic health records. To improve the quality of medical care and advance type 2 diabetes mellitus (T2DM) research, the demand for T2DM phenotyping has been increasing. Some existing phenotyping algorithms are not sufficiently accurate for screening or identifying clinical research subjects. We propose a practical phenotyping framework using both expert knowledge and a machine learning approach to develop 2 phenotyping algorithms: one is for screening; the other is for identifying research subjects. We employ expert knowledge as rules to exclude obvious control patients and machine learning to increase accuracy for complicated patients. We developed phenotyping algorithms on the basis of our framework and performed binary classification to determine whether a patient has T2DM. To facilitate development of practical phenotyping algorithms, this study introduces new evaluation metrics: area under the precision-sensitivity curve (AUPS) with a high sensitivity and AUPS with a high positive predictive value. The proposed phenotyping algorithms based on our framework show higher performance than baseline algorithms. Our proposed framework can be used to develop 2 types of phenotyping algorithms depending on the tuning approach: one for screening, the other for identifying research subjects. We develop a novel phenotyping framework that can be easily implemented on the basis of proper evaluation metrics, which are in accordance with users' objectives. The phenotyping algorithms based on our framework are useful for extraction of T2DM patients in retrospective studies.
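
    The flavour of the proposed metrics can be illustrated by restricting the area under the precision-sensitivity (recall) curve to a high-sensitivity region; the exact AUPS definition in the paper may differ, and the data below are made up:

        import numpy as np
        from sklearn.metrics import precision_recall_curve

        y_true  = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
        y_score = np.array([0.9, 0.4, 0.8, 0.65, 0.3, 0.5, 0.7, 0.2, 0.35, 0.1])

        precision, recall, _ = precision_recall_curve(y_true, y_score)
        mask = recall >= 0.8                    # keep only the high-sensitivity part
        order = np.argsort(recall[mask])        # integrate left to right
        aups_high_sens = np.trapz(precision[mask][order], recall[mask][order])
        print("partial AUPS (recall >= 0.8):", round(aups_high_sens, 3))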

  6. Evaluation of the MacDonald scabbler for highway use.

    DOT National Transportation Integrated Search

    1975-01-01

    The MacDonald Scabbler is a small, hand held machine suitable for use in cleaning and roughening concrete surfaces, It weighs 308 pounds (140 kg), has 11 cutting heads, and, as a power source, requires a compressor capable of delivering 365 cubic foo...

  7. Finite Element Structural Analysis of a Low Energy Micro Sheet Forming Machine Concept Design

    NASA Astrophysics Data System (ADS)

    Razali, A. R.; Ann, C. T.; Ahmad, A. F.; Shariff, H. M.; Kasim, N. I.; Musa, M. A.

    2017-05-01

    It is forecasted that with the miniaturization of the materials being processed, energy consumption will also be ‘miniaturized’ proportionally. The aim of this research is to design a low-energy micro-sheet-forming machine for thin sheet metal applications. A few concept designs of the machine structure were produced. With the help of FE software, the structure was then subjected to a forming force to observe the deflection in the structure and select the best and simplest design. Comparison studies between mild steel and aluminium alloy 6061 were made with a view to identifying the most suitable material. Based on the analysis, the maximum allowable deflection was set at 2.5 µm, and aluminium alloy 6061 was found to suffice.

  8. Drilling of optical glass with electroplated diamond tools

    NASA Astrophysics Data System (ADS)

    Wang, A. J.; Luan, C. G.; Yu, A. B.

    2010-10-01

    K9 optical glass drilling experiments were carried out. Bright nickel electroplated diamond tools with small slots, heat treated at different temperatures, were fabricated. A scanning electron microscope was used to analyze the wear of the electroplated diamond tools. The material removal rate and grinding ratio were calculated, the machining quality was observed, and the bond coating hardness was measured. The experimental results show that coolant is needed when drilling optical glasses. The heat treatment temperature of the diamond tool influences its wear resistance and the grinding ratio. Two wear types of the electroplated diamond tool were observed: diamond grit wear and bond wear. As machining proceeded, wear of the diamond grits included fracture, blunting and pull-out, and the electroplated bond was gradually worn away. High material removal rates could be obtained by using a diamond tool with a suitable number of slots. The bright nickel coating bond presents the smallest grains and has better mechanical properties. A bright nickel electroplated diamond tool with a slot structure and heat treatment at 200°C was suitable for optical glass drilling.

  9. Long-range nanopositioning and nanomeasuring machine for application to micro- and nanotechnology

    NASA Astrophysics Data System (ADS)

    Jäger, Gerd; Hausotte, Tino; Büchner, Hans-Joachim; Manske, Eberhard; Schmidt, Ingomar; Mastylo, Rostyslav

    2006-03-01

    The paper describes the operation of a high-precision long-range three-dimensional nanopositioning and nanomeasuring machine (NPM-Machine). The NPM-Machine has been developed by the Institute of Process Measurement and Sensor Technology of the Technische Universität Ilmenau. The machine was successfully tested and continually improved over the last few years. The machines are operating successfully in several German and foreign research institutes, including the Physikalisch-Technische Bundesanstalt (PTB). Three plane mirror miniature interferometers are installed in the NPM-Machine, having a resolution of less than 0.1 nm over the entire positioning and measuring range of 25 mm x 25 mm x 5 mm. An Abbe offset-free design of the three miniature plane mirror interferometers and a new concept for compensating systematic errors resulting from the mechanical guide systems provide extraordinary accuracy, with an expanded uncertainty of only 5 - 10 nm. The integration of several optical and tactile probe systems and nanotools makes the NPM-Machine suitable for various tasks, such as large-area scanning probe microscopy, mask and wafer inspection, nanostructuring, biotechnology and genetic engineering, as well as measuring mechanical precision workpieces, precision treatment and engineering new materials. Various developed probe systems have been integrated into the NPM-Machine. The measurement results of a focus sensor, metrological AFM, white light sensor, tactile stylus probe and a 3D-micro-touch-probe are presented. Single-beam, double-beam and triple-beam interferometers built into the NPM-Machine for six-degrees-of-freedom measurements are described.

  10. 25 CFR 542.13 - What are the minimum internal control standards for gaming machines?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    .... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...

  11. 25 CFR 542.13 - What are the minimum internal control standards for gaming machines?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    .... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...

  12. 25 CFR 542.13 - What are the minimum internal control standards for gaming machines?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    .... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...

  13. 25 CFR 542.13 - What are the minimum internal control standards for gaming machines?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...

  14. 25 CFR 542.13 - What are the minimum internal control standards for gaming machines?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    .... (j) Player tracking system. (1) The following standards apply if a player tracking system is utilized... image on the computer screen; (B) Comparing the customer to image on customer's picture ID; or (C...

  15. 49 CFR 230.115 - Feed water tanks.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Feed water tanks. 230.115 Section 230.115... Tenders Steam Locomotive Tanks § 230.115 Feed water tanks. (a) General provisions. Tanks shall be maintained free from leaks, and in safe and suitable condition for service. Suitable screens must be provided...

  16. 2P2IHUNTER: a tool for filtering orthosteric protein–protein interaction modulators via a dedicated support vector machine

    PubMed Central

    Hamon, Véronique; Bourgeas, Raphael; Ducrot, Pierre; Theret, Isabelle; Xuereb, Laura; Basse, Marie Jeanne; Brunel, Jean Michel; Combes, Sebastien; Morelli, Xavier; Roche, Philippe

    2014-01-01

    Over the last 10 years, protein–protein interactions (PPIs) have shown increasing potential as new therapeutic targets. As a consequence, PPIs are today the most screened target class in high-throughput screening (HTS). The development of broad chemical libraries dedicated to these particular targets is essential; however, the chemical space associated with this ‘high-hanging fruit’ is still under debate. Here, we analyse the properties of 40 non-redundant small molecules present in the 2P2I database (http://2p2idb.cnrs-mrs.fr/) to define a general profile of orthosteric inhibitors and propose an original protocol to filter general screening libraries using a support vector machine (SVM) with 11 standard Dragon molecular descriptors. The filtering protocol has been validated using external datasets from PubChem BioAssay and results from in-house screening campaigns. This external blind validation demonstrated the ability of the SVM model to reduce the size of the filtered chemical library by eliminating up to 96% of the compounds as well as enhancing the proportion of active compounds by up to a factor of 8. We believe that the resulting chemical space identified in this paper will provide the scientific community with a concrete support to search for PPI inhibitors during HTS campaigns. PMID:24196694
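
    The filtering step amounts to training a classifier on descriptor vectors of known actives and inactives and keeping only library compounds predicted active; the sketch below uses placeholder features (the 11 Dragon descriptors are not reproduced) and invented labels:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(42)
        X_train = rng.normal(size=(200, 11))            # placeholder descriptor vectors
        y_train = (X_train[:, 0] + X_train[:, 1] > 0.5).astype(int)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_train, y_train)

        library = rng.normal(size=(10000, 11))          # screening library descriptors
        keep = library[clf.predict(library) == 1]
        print(f"kept {len(keep)} of {len(library)} compounds")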

  17. A Study on Human Oriented Autonomous Distributed Manufacturing System —Real-time Scheduling Method Based on Preference of Human Operators

    NASA Astrophysics Data System (ADS)

    Iwamura, Koji; Kuwahara, Shinya; Tanimizu, Yoshitaka; Sugimura, Nobuhiro

    Recently, new distributed architectures of manufacturing systems have been proposed, aiming at realizing more flexible control structures for manufacturing systems. Much research has been carried out on distributed architectures for planning and control of manufacturing systems. However, human operators have not yet been considered as autonomous components of distributed manufacturing systems. A real-time scheduling method is proposed in this research to select suitable combinations of human operators, resources and jobs for the manufacturing processes. The proposed scheduling method consists of the following three steps. In the first step, the human operators select their favorite manufacturing processes, which they will carry out in the next time period, based on their preferences. In the second step, the machine tools and the jobs select suitable combinations for the next machining processes. In the third step, the automated guided vehicles and the jobs select suitable combinations for the next transportation processes. The second and third steps are carried out using the utility-value-based method and the dispatching-rule-based method proposed in previous research. Some case studies have been carried out to verify the effectiveness of the proposed method.

  18. Suitability of digital camcorders for virtual reality image data capture

    NASA Astrophysics Data System (ADS)

    D'Apuzzo, Nicola; Maas, Hans-Gerd

    1998-12-01

    Today's consumer market digital camcorders offer features which make them appear to be quite interesting devices for virtual reality data capture. The paper compares a digital camcorder with an analogue camcorder and a machine vision type CCD camera and discusses the suitability of these three cameras for virtual reality applications. Besides a discussion of the technical features of the cameras, this includes a detailed accuracy test in order to define the range of applications. In combination with the cameras, three different framegrabbers are tested. The geometric accuracy potential of all three cameras turned out to be surprisingly large, and no problems were noticed in the radiometric performance. On the other hand, some disadvantages have to be reported: from the photogrammetrist's point of view, the major disadvantage of most camcorders is the lack of a means to synchronize multiple devices, limiting their suitability for 3-D motion data capture. Moreover, the standard video format contains interlacing, which is also undesirable for all applications dealing with moving objects or moving cameras. Further disadvantages are computer interfaces whose functionality is still suboptimal. While custom-made solutions to these problems are probably rather expensive (and will make potential users turn back to machine vision like equipment), this functionality could probably be included by the manufacturers at almost zero cost.

  19. Benchmarking Ligand-Based Virtual High-Throughput Screening with the PubChem Database

    PubMed Central

    Butkiewicz, Mariusz; Lowe, Edward W.; Mueller, Ralf; Mendenhall, Jeffrey L.; Teixeira, Pedro L.; Weaver, C. David; Meiler, Jens

    2013-01-01

    With the rapidly increasing availability of High-Throughput Screening (HTS) data in the public domain, such as the PubChem database, methods for ligand-based computer-aided drug discovery (LB-CADD) have the potential to accelerate and reduce the cost of probe development and drug discovery efforts in academia. We assemble nine data sets from realistic HTS campaigns representing major families of drug target proteins for benchmarking LB-CADD methods. Each data set is public domain through PubChem and carefully collated through confirmation screens validating active compounds. These data sets provide the foundation for benchmarking a new cheminformatics framework BCL::ChemInfo, which is freely available for non-commercial use. Quantitative structure activity relationship (QSAR) models are built using Artificial Neural Networks (ANNs), Support Vector Machines (SVMs), Decision Trees (DTs), and Kohonen networks (KNs). Problem-specific descriptor optimization protocols are assessed including Sequential Feature Forward Selection (SFFS) and various information content measures. Measures of predictive power and confidence are evaluated through cross-validation, and a consensus prediction scheme is tested that combines orthogonal machine learning algorithms into a single predictor. Enrichments ranging from 15 to 101 for a TPR cutoff of 25% are observed. PMID:23299552

  20. Potential of cancer screening with serum surface-enhanced Raman spectroscopy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Li, S. X.; Zhang, Y. J.; Zeng, Q. Y.; Li, L. F.; Guo, Z. Y.; Liu, Z. M.; Xiong, H. L.; Liu, S. H.

    2014-06-01

    Cancer is the most common disease threatening human health. The ability to screen individuals with malignant tumours using only a blood sample would be greatly advantageous for early diagnosis and intervention. This study explores the possibility of discriminating between cancer patients and normal subjects with serum surface-enhanced Raman spectroscopy (SERS) and a support vector machine (SVM) through a peripheral blood sample. A total of 130 blood samples were obtained from patients with liver cancer, colonic cancer, esophageal cancer, nasopharyngeal cancer and gastric cancer, as well as 113 blood samples from normal volunteers. Several diagnostic models were built with the serum SERS spectra using SVM and principal component analysis (PCA) techniques. The results show that a diagnostic accuracy of 85.5% is acquired with a PCA algorithm, while a diagnostic accuracy of 95.8% is obtained using radial basis function (RBF) kernel PCA-SVM methods. The results prove that an RBF kernel PCA-SVM technique is superior to PCA and conventional SVM (C-SVM) algorithms in classifying serum SERS spectra. The study demonstrates that serum SERS, in combination with SVM techniques, has great potential for screening cancer patients with any solid malignant tumour through a peripheral blood sample.
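
    A PCA-plus-RBF-kernel-SVM classifier of the kind described can be sketched as a simple pipeline; the "spectra" below are synthetic stand-ins, not SERS measurements:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)
        spectra = rng.normal(size=(243, 1024))        # 243 samples x 1024 Raman shifts
        labels  = np.r_[np.ones(130), np.zeros(113)]  # 1 = cancer, 0 = normal (counts from the abstract)
        spectra[labels == 1, 100:110] += 0.5          # synthetic class difference

        model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", C=10, gamma="scale"))
        acc = cross_val_score(model, spectra, labels, cv=5, scoring="accuracy")
        print("cross-validated accuracy: %.3f +/- %.3f" % (acc.mean(), acc.std()))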

  1. Mathematic model analysis of Gaussian beam propagation through an arbitrary thickness random phase screen.

    PubMed

    Tian, Yuzhen; Guo, Jin; Wang, Rui; Wang, Tingfeng

    2011-09-12

    In order to research the statistical properties of Gaussian beam propagation through an arbitrary-thickness random phase screen for adaptive optics and laser communication applications in the laboratory, we establish mathematical models of the statistical quantities involved in the propagation process, based on the Rytov method and the thin phase screen model. Analytic results are developed for an arbitrary-thickness phase screen based on the Kolmogorov power spectrum. The comparison between the arbitrary-thickness phase screen and the thin phase screen shows that our results are more suitable for describing the generalized case, especially the scintillation index.

  2. Decision support framework for Parkinson's disease based on novel handwriting markers.

    PubMed

    Drotár, Peter; Mekyska, Jiří; Rektorová, Irena; Masarová, Lucia; Smékal, Zdeněk; Faundez-Zanuy, Marcos

    2015-05-01

    Parkinson's disease (PD) is a neurodegenerative disorder which impairs motor skills, speech, and other functions such as behavior, mood, and cognitive processes. One of the most typical clinical hallmarks of PD is handwriting deterioration, usually the first manifestation of PD. The aim of this study is twofold: (a) to find a subset of handwriting features suitable for identifying subjects with PD and (b) to build a predictive model to efficiently diagnose PD. We collected handwriting samples from 37 medicated PD patients and 38 age- and sex-matched controls. The handwriting samples were collected during seven tasks such as writing a syllable, word, or sentence. Every sample was used to extract the handwriting measures. In addition to conventional kinematic and spatio-temporal handwriting measures, we also computed novel handwriting measures based on entropy, signal energy, and empirical mode decomposition of the handwriting signals. The selected features were fed to the support vector machine classifier with radial Gaussian kernel for automated diagnosis. The accuracy of the classification of PD was as high as 88.13%, with the highest values of sensitivity and specificity equal to 89.47% and 91.89%, respectively. Handwriting may be a valuable marker as a diagnostic and screening tool.

  3. Testing of polyimide second-stage rod seals for single-state applications in advanced aircraft hydraulic systems

    NASA Technical Reports Server (NTRS)

    Waterman, A. W.

    1977-01-01

    Machined polyimide second-stage rod seals were evaluated to determine their suitability for single-stage applications where full system pressure acts on the upstream side of the seal. The 6.35-cm (2.5-in.) K-section seal was tested in impulse screening tests where peak pressure was increased in 3.448-MPa (500-psi) increments each 20,000 cycles. Seal failure occurred at 37.92 MPa (5,500 psi), indicating a potential for acceptability in a 27.58-MPa (4,000-psi) system. Static pressurization for 600 sec at pressures in excess of 10.34 MPa (1,500 psi) revealed structural inadequacy of the seal cross section to resist fracture and extrusion. Endurance testing showed the seals capable of at least 65,000 1.27-cm (0.5-in.) cycles at 450 K (350 F) without leakage. It was concluded that the second-stage seals were proven to be exceptional in the 1.379-MPa (200-psi) applications for which they were designed, but polyimide material properties are not adequate for use in this design at pressure loading equivalent to that present in single-stage applications.

  4. WaferOptics® mass volume production and reliability

    NASA Astrophysics Data System (ADS)

    Wolterink, E.; Demeyer, K.

    2010-05-01

    The Anteryon WaferOptics® Technology platform contains imaging optics designs, materials and metrologies, combined with wafer-level-based Semicon & MEMS production methods. WaferOptics® first required completely new system engineering. This system closes the loop between application requirement specifications, Anteryon product specification, Monte Carlo Analysis, process windows, process controls and supply reject criteria. Regarding the Anteryon product Integrated Lens Stack (ILS), new design rules, test methods and control systems were assessed, implemented, validated and customer released for mass production. This includes novel reflowable materials, mastering process, replication, bonding, dicing, assembly, metrology, reliability programs and quality assurance systems. Many Design of Experiments runs were performed to assess correlations between optical performance parameters and machine settings of all process steps. Lens metrologies such as FFL, BFL, and MTF were adapted for wafer-level production, and wafer mapping was introduced for yield management. Test methods for screening and validating suitable optical materials were designed. Critical failure modes such as delamination and popcorning were assessed and modeled with FEM. Anteryon successfully managed to integrate the different technologies, starting from single prototypes through to high-yield mass volume production. These parallel efforts resulted in a steep yield increase from 30% to over 90% in an 8-month period.

  5. Standardized assessment of infrared thermographic fever screening system performance

    NASA Astrophysics Data System (ADS)

    Ghassemi, Pejhman; Pfefer, Joshua; Casamento, Jon; Wang, Quanzeng

    2017-03-01

    Thermal modalities represent the only currently viable mass fever screening approach for outbreaks of infectious disease pandemics such as Ebola and SARS. Non-contact infrared thermometers (NCITs) and infrared thermographs (IRTs) have been previously used for mass fever screening in transportation hubs such as airports to reduce the spread of disease. While NCITs remain a more popular choice for fever screening in the field and at fixed locations, there has been increasing evidence in the literature that IRTs can provide greater accuracy in estimating core body temperature if appropriate measurement practices are applied - including the use of technically suitable thermographs. Therefore, the purpose of this study was to develop a battery of evaluation test methods for standardized, objective and quantitative assessment of thermograph performance characteristics critical to assessing suitability for clinical use. These factors include stability, drift, uniformity, minimum resolvable temperature difference, and accuracy. Two commercial IRT models were characterized. An external temperature reference source with high temperature accuracy was utilized as part of the screening thermograph. Results showed that both IRTs are relatively accurate and stable (<1% error of reading with stability of +/-0.05°C). Overall, results of this study may facilitate development of standardized consensus test methods to enable consistent and accurate use of IRTs for fever screening.

  6. Electrochemical Study and Determination of Electroactive Species with Screen-Printed Electrodes

    ERIC Educational Resources Information Center

    Martín-Yerga, Daniel; Costa Rama, Estefanía; Costa García, Agustín

    2016-01-01

    A lab appropriate to introduce voltammetric techniques and basic electrochemical parameters is described in this work. It is suitable to study theoretical concepts of electrochemistry in an applied way for analytical undergraduate courses. Two electroactive species, hexaammineruthenium and dopamine, are used as simple redox systems. Screen-printed…

  7. Sensitivity of neuroprogenitor cells to chemical-induced apoptosis using a multiplexed assay suitable for high-throughput screening*

    EPA Science Inventory

    AbstractHigh-throughput methods are useful for rapidly screening large numbers of chemicals for biological activity, including the perturbation of pathways that may lead to adverse cellular effects. In vitro assays for the key events of neurodevelopment, including apoptosis, may ...

  8. Machine & electrical double control air dryer for vehicle air braking system

    NASA Astrophysics Data System (ADS)

    Zhang, Xuan; Yang, Liu; Wang, Xian Yan; Tan, Xiao Yan; Wang, Wei

    2017-09-01

    As is well known, the compressed air in a vehicle air brake system usually contains moisture. To solve this problem, it is common to use an air dryer to dry the compressed air effectively and completely remove the moisture and oil from the braking system. However, existing air dryers are not suitable for all commercial vehicles. Based on the operational status of new energy vehicles in their initial operating period, the structural design principle of the machine & electrical double control air dryer is expounded in terms of its structure, operating principle and research & development process.

  9. A Comparative Study Using Numerical Methods for Surface X Ray Doses with Conventional and Digital Radiology Equipment in Pediatric Radiology

    NASA Astrophysics Data System (ADS)

    Dan, Posa Ioan; Florin, Georgescu Remus; Virgil, Ciobanu; Antonescu, Elisabeta

    2011-09-01

    The study took place in a pediatrics clinic which performs a wide variety of emergency, ambulatory and hospital examinations. The radiology compartment follows work procedures and a quality assurance system for X-ray examinations. The results show that the tube voltage set by the programmer of the digital detector machine remains constant. For the screen-film detector machine, the applied tube voltage increases in proportion to the physical development of the child, as reflected by trunk thickness.

  10. Learning molecular energies using localized graph kernels

    DOE PAGES

    Ferré, Grégoire; Haut, Terry Scot; Barros, Kipton Marcos

    2017-03-21

    We report that recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. Finally, we benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.
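
    A textbook geometric random-walk graph kernel between two adjacency matrices (k(A1, A2) = 1^T (I - lambda A1 (x) A2)^-1 1 on the direct-product graph) can be sketched as below; this illustrates the idea but is not necessarily the exact kernel used in GRAPE:

        import numpy as np

        def random_walk_kernel(a1, a2, lam=0.05):
            w = np.kron(a1, a2)                  # direct-product graph adjacency
            n = w.shape[0]
            ones = np.ones(n)
            # geometric series over shared walks: 1^T (I - lam*W)^-1 1
            return ones @ np.linalg.solve(np.eye(n) - lam * w, ones)

        a1 = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)   # toy local environments
        a2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
        print(random_walk_kernel(a1, a1), random_walk_kernel(a1, a2))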

  11. Design Enhancement and Performance Examination of External Rotor Switched Flux Permanent Magnet Machine for Downhole Application

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Sulaiman, E.; Soomro, H. A.; Jusoh, L. I.; Bahrim, F. S.; Omar, M. F.

    2017-08-01

    With recent innovations and the employment of high-temperature magnets, the permanent magnet flux switching machine (PMFSM) has turned out to be one of the suitable contenders for offshore drilling, but it is less often intended for downhole use because of the high ambient temperature. This extensive review therefore deals with the design enhancement and performance examination of an external rotor PMFSM for the downhole application. First, the essential design parameters required for the machine configuration are computed numerically. The design enhancement strategy is then implemented through a deterministic technique. Finally, the preliminary and refined performance of the machine is compared; as a result, the output torque is raised from 16.39 Nm to 33.57 Nm while the cogging torque and PM weight are reduced to 1.77 Nm and 0.79 kg, respectively. It is therefore concluded that the proposed enhanced 12-slot/22-pole design with an external rotor is suitable for the downhole application.

  12. Way to nanogrinding technology

    NASA Astrophysics Data System (ADS)

    Miyashita, Masakazu

    1990-11-01

    The precision finishing process for hard and brittle material components, such as single-crystal silicon wafers and magnetic heads, consists of lapping and polishing, which depend too much on skilled labor. This process is based on traditional optical production technology and is entirely different from the automated mass production techniques of automobile production. Instead of traditional lapping and polishing, nanogrinding is proposed as a new stock-removal machining process to generate optical surfaces on brittle materials. With this new technology, a damage-free surface equivalent to that produced by lapping and polishing can be obtained on brittle materials, and free curvatures can also be generated. This technology is based on the motion copying principle, the same as in the case of metal parts machining. The new nanogrinding technology is anticipated to be adopted as a machining technique suitable for automated mass production, because it is expected to provide stable machining at the quality level of the traditional optical production techniques of lapping and polishing.

  13. Evaluation of Iron Loss in Interior Permanent Magnet Synchronous Motor with Consideration of Rotational Field

    NASA Astrophysics Data System (ADS)

    Ma, Lei; Sanada, Masayuki; Morimoto, Shigeo; Takeda, Yoji; Kaido, Chikara; Wakisaka, Takeaki

    Loss evaluation is an important issue in the design of electrical machines. Due to the complicated structure and flux distribution, it is difficult to predict the iron loss in such machines exactly. This paper studies the iron loss in interior permanent magnet synchronous motors based on the finite element method. Iron loss test data for the core material are used in fitting the hysteresis and eddy current loss constants. For motors in practical operation, additional iron losses due to the rotation of the flux density vector and the harmonic flux density distribution cause the calculated data to deviate from the measured data. A revision is made to account for these excess iron losses, which exist under practical operating conditions. The calculation results show good consistency with the experimental ones. The proposed method provides a possible way to predict the iron loss of an electrical machine with good precision, and may be helpful in selecting the core material best suited to a given machine.
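
    The fitting of hysteresis and eddy-current loss constants can be sketched with the common two-term model P = kh*f*B^2 + ke*f^2*B^2 and linear least squares; the data points below are invented, and the paper's exact loss model may differ:

        import numpy as np

        # (frequency Hz, peak flux density T, measured specific loss W/kg) - made up
        data = np.array([[50, 1.0, 1.1], [50, 1.5, 2.6], [100, 1.0, 2.9], [100, 1.5, 6.7]])
        f, B, P = data.T

        A = np.column_stack([f * B**2, f**2 * B**2])   # columns multiply kh and ke
        (kh, ke), *_ = np.linalg.lstsq(A, P, rcond=None)
        print(f"kh = {kh:.4e}, ke = {ke:.4e}")
        print("predicted losses:", A @ np.array([kh, ke]))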

  14. Learning molecular energies using localized graph kernels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferré, Grégoire; Haut, Terry Scot; Barros, Kipton Marcos

    We report that recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. Finally, we benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.

  15. Study on electroplating technology of diamond tools for machining hard and brittle materials

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Chen, Jian Hua; Sun, Li Peng; Wang, Yue

    2016-10-01

    With the development of high-speed cutting, ultra-precision machining and ultrasonic vibration techniques for processing hard and brittle materials, the requirements on cutting tools are becoming higher and higher. As electroplated diamond tools have distinct advantages, such as high adaptability, high durability, long service life and good dimensional stability, these cutting tools are effective and extensively used in grinding hard and brittle materials. In this paper, the coating structure of an electroplated diamond tool is described. The electroplating process flow is presented, and the influence of pretreatment on the machining quality is analyzed. Through experimental research, a reasonable electrolyte formulation, the electroplating process parameters and a suitable sanding method were determined. Meanwhile, a drilling experiment on glass-ceramic shows that the electroplating process can effectively improve the cutting performance of diamond tools. This work lays a good foundation for further improving the quality and efficiency of the machining of hard and brittle materials.

  16. A Fragment-Based Ligand Screen Against Part of a Large Protein Machine: The ND1 Domains of the AAA+ ATPase p97/VCP.

    PubMed

    Chimenti, Michael S; Bulfer, Stacie L; Neitz, R Jeffrey; Renslo, Adam R; Jacobson, Matthew P; James, Thomas L; Arkin, Michelle R; Kelly, Mark J S

    2015-07-01

    The ubiquitous AAA+ ATPase p97 functions as a dynamic molecular machine driving several cellular processes. It is essential in regulating protein homeostasis, and it represents a potential drug target for cancer, particularly when there is a greater reliance on the endoplasmic reticulum-associated protein degradation pathway and ubiquitin-proteasome pathway to degrade an overabundance of secreted proteins. Here, we report a case study for using fragment-based ligand design approaches against this large and dynamic hexamer, which has multiple potential binding sites for small molecules. A screen of a fragment library was conducted by surface plasmon resonance (SPR) and followed up by nuclear magnetic resonance (NMR), two complementary biophysical techniques. Virtual screening was also carried out to examine possible binding sites for the experimental hits and evaluate the potential utility of fragment docking for this target. Out of this effort, 13 fragments were discovered that showed reversible binding with affinities between 140 µM and 1 mM, binding stoichiometries of 1:1 or 2:1, and good ligand efficiencies. Structural data for fragment-protein interactions were obtained with residue-specific [U-(2)H] (13)CH3-methyl-labeling NMR strategies, and these data were compared to poses from docking. The combination of virtual screening, SPR, and NMR enabled us to find and validate a number of interesting fragment hits and allowed us to gain an understanding of the structural nature of fragment binding. © 2015 Society for Laboratory Automation and Screening.
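
    The "good ligand efficiencies" statement rests on simple arithmetic: ligand efficiency is commonly taken as roughly 1.37*pKd divided by the heavy-atom count (kcal/mol per heavy atom near 298 K). A small sketch with a hypothetical 13-heavy-atom fragment binding at 140 uM:

        import math

        def ligand_efficiency(kd_molar, heavy_atoms):
            pkd = -math.log10(kd_molar)
            return 1.37 * pkd / heavy_atoms

        print(round(ligand_efficiency(140e-6, 13), 2))   # ~0.4 kcal/mol per heavy atom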

  17. Automatic machine learning based prediction of cardiovascular events in lung cancer screening data

    NASA Astrophysics Data System (ADS)

    de Vos, Bob D.; de Jong, Pim A.; Wolterink, Jelmer M.; Vliegenthart, Rozemarijn; Wielingen, Geoffrey V. F.; Viergever, Max A.; Išgum, Ivana

    2015-03-01

    Calcium burden determined in CT images acquired in lung cancer screening is a strong predictor of cardiovascular events (CVEs). This study investigated whether subjects undergoing such screening who are at risk of a CVE can be identified using automatic image analysis and subject characteristics. Moreover, the study examined whether these individuals can be identified using solely image information, or if a combination of image and subject data is needed. A set of 3559 male subjects undergoing the Dutch-Belgian lung cancer screening trial was included. Low-dose non-ECG-synchronized chest CT images acquired at baseline were analyzed (1834 scanned at the University Medical Center Groningen, 1725 at the University Medical Center Utrecht). Aortic and coronary calcifications were identified using previously developed automatic algorithms. A set of features describing the number, volume and size distribution of the detected calcifications was computed. The age of the participants was extracted from image headers. Features describing participants' smoking status, smoking history and past CVEs were obtained. CVEs that occurred within three years after the imaging were used as outcome. Support vector machine classification was performed employing different feature sets: sets of only image features, or a combination of image and subject-related characteristics. Classification based solely on the image features resulted in an area under the ROC curve (Az) of 0.69. A combination of image and subject features resulted in an Az of 0.71. The results demonstrate that subjects undergoing lung cancer screening who are at risk of CVE can be identified using automatic image analysis. Adding subject information slightly improved the performance.

  18. Advanced human machine interaction for an image interpretation workstation

    NASA Astrophysics Data System (ADS)

    Maier, S.; Martin, M.; van de Camp, F.; Peinsipp-Byma, E.; Beyerer, J.

    2016-05-01

    In recent years, many new interaction technologies have been developed that enhance the usability of computer systems and allow for novel types of interaction. The areas of application for these technologies have mostly been in gaming and entertainment. However, in professional environments there are especially demanding tasks that would greatly benefit from improved human machine interfaces as well as an overall improved user experience. We therefore envisioned and built an image-interpretation workstation of the future, a multi-monitor workplace comprising four screens. Each screen is dedicated to a complex software product, such as a geo-information system to provide geographic context, an image annotation tool, software to generate standardized reports, and a tool to aid in the identification of objects. Using self-developed systems for hand tracking, pointing gestures and head pose estimation, in addition to touchscreens, face identification and speech recognition systems, we created a novel approach to this complex task. For example, head pose information is used to save the position of the mouse cursor on the currently focused screen and to restore it as soon as the same screen is focused again, while hand gestures allow for intuitive manipulation of 3D objects in mid-air. While the primary focus is on the task of image interpretation, all of the technologies involved provide generic ways of efficiently interacting with a multi-screen setup and could be utilized in other fields as well. In preliminary experiments, we received promising feedback from users in the military and have started to tailor the functionality to their needs.

  19. New Technique of High-Performance Torque Control Developed for Induction Machines

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.

    2003-01-01

    Two forms of high-performance torque control for motor drives have been described in the literature: field orientation control and direct torque control. Field orientation control has been the method of choice for previous NASA electromechanical actuator research efforts with induction motors. Direct torque control has the potential to offer some advantages over field orientation, including ease of implementation and faster response. However, the most common form of direct torque control is not suitable for the high-speed, low-stator-flux-linkage induction machines designed for electromechanical actuators with the presently available sample rates of digital control systems (higher sample rates are required). In addition, this form of direct torque control is not suitable for the addition of a high-frequency carrier signal necessary for the "self-sensing" (sensorless) position estimation technique. This technique enables low- and zero-speed position-sensorless operation of the machine. Sensorless operation is desirable to reduce the number of necessary feedback signals and transducers, thus improving the reliability and reducing the mass and volume of the system. This research was directed at developing an alternative form of direct torque control known as a "deadbeat," or inverse model, solution. This form uses pulse-width modulation of the voltage applied to the machine, thus reducing the necessary sample and switching frequency for the high-speed NASA motor. In addition, the structure of the deadbeat form allows the addition of the high-frequency carrier signal so that low- and zero-speed sensorless operation is possible. The new deadbeat solution is based on using the stator and rotor flux as state variables. This choice of state variables leads to a simple graphical representation of the solution as the intersection of a constant-torque line with a constant stator-flux circle. Previous solutions have been expressed only in complex mathematical terms without a method to clearly visualize the solution. The graphical technique allows a more insightful understanding of the operation of the machine under various conditions.
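
    The geometric construction at the heart of the deadbeat solution can be sketched numerically. With stator and rotor flux as state variables, torque is proportional to the cross product of the rotor- and stator-flux vectors, so for a momentarily fixed rotor flux the constant-torque locus is a straight line parallel to the rotor flux, and the commanded stator flux lies where that line meets the circle of constant stator-flux magnitude. The sketch below uses purely illustrative constants (k_t, flux magnitudes) and is not the NASA implementation.

        import numpy as np

        def deadbeat_flux_command(psi_r, torque_ref, psi_s_mag, k_t):
            """Stator-flux vectors psi_s with |psi_s| = psi_s_mag satisfying
            torque_ref = k_t * cross(psi_r, psi_s): intersect the constant-torque
            line (parallel to psi_r) with the constant stator-flux circle."""
            b = torque_ref / (k_t * np.linalg.norm(psi_r))   # offset of the line from the origin
            if abs(b) > psi_s_mag:
                raise ValueError("torque reference unreachable at this flux magnitude")
            u = psi_r / np.linalg.norm(psi_r)                # unit vector along the line
            n = np.array([-u[1], u[0]])                      # unit normal to the line
            a = np.sqrt(psi_s_mag**2 - b**2)                 # half-chord length along the line
            return b * n + a * u, b * n - a * u              # the two intersection points

        # Illustrative per-unit-style numbers only, not the NASA machine's parameters
        psi_r = np.array([0.8, 0.1])                         # rotor flux estimate
        for psi_s in deadbeat_flux_command(psi_r, torque_ref=5.0, psi_s_mag=1.0, k_t=7.5):
            print(psi_s, "torque check:", 7.5 * (psi_r[0] * psi_s[1] - psi_r[1] * psi_s[0]))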

  20. [Results of audiometry screening in adolescent workers].

    PubMed

    Hartmann, B

    1990-11-01

    Results of screening audiometry in male youths aged 16 to 25 (n = 3969) working in metallurgy, machine-building and transport occupations are presented. The proportion of persons with a hearing loss of 5 to 10 percent increases from 2.8% among pupils before the start of vocational training to 4.5% and 7.1% among apprentices and 9.7% among skilled workers. The incidence in persons with and without a history of middle ear inflammation differs only at stages of about 20 percent hearing loss. This demonstrates the sensitivity of screening audiometry, although sources of error remain. Adolescents may already show measurable hearing loss related to occupational and non-occupational exposures as well as to individual disposition.

  1. Research on screening of suitable forage grasses in coastal saline - alkaline soil

    NASA Astrophysics Data System (ADS)

    Yue, Xiaoyu; Han, Xin; Song, Qianhong; Yang, Xu; Zhou, Qingyun

    2017-11-01

    The screening of salt-tolerant plants can provide suitable species for the afforestation of coastal saline land and help maintain biodiversity and ecological stability. The research examined seven forage grasses: high fescue, bermuda grass, thyme, rye grass, precocious grass, the third leaf, and the red three leaves. Each grass was planted in three different soils: saline-alkali soil, saline-alkali soil with an ecological bag, and non-saline-alkali soil. The effect of the saline-alkali soil on germination time, germination rate and grass growth was analyzed. The effects of the ecological bag on soil salt content and on the germination and growth of the grasses were also analyzed, in order to provide a reference for the widespread and systematic selection of salt-tolerant plants and of grasses suited to the ecological bag.

  2. Mechanization for Optimal Landscape Reclamation

    NASA Astrophysics Data System (ADS)

    Vondráčková, Terezie; Voštová, Věra; Kraus, Michal

    2017-12-01

    Reclamation is a method of ultimate utilization of land adversely affected by mining or other industrial activity. The paper explains the types of reclamation and the term “optimal reclamation”, and describes the technological options of the long-lasting process of mine-dump reclamation, from the removal of overlying rocks through transport and backfilling to the follow-up remodelling of the mine-dump terrain, together with the technological units and equipment for dividing the stripping flow and a stripping-flow solution oriented towards optimal reclamation. We recommend that logistic chains and mining simulation with follow-up reclamation be applied to open-pit mines for the implementation of optimal reclamation. In addition to a database of local heterogeneities of the stripped soil and reclaimed land, the flow of earths should be resolved in a manner allowing the most suitable soil substrate to be created for the restoration of agricultural and forest land on mine dumps. The methodology under development addresses a number of problems, including the geological survey of overlying rocks, the extraction of the stripping, its transport and backfilling in specified locations, and the follow-up deployment of goal-directed reclamation; it will make it possible to reduce the financial resources needed for the complex process chain by utilizing GIS, GPS and DGPS technologies, logistic tools and synergistic effects. When selecting machines for transporting, moving and spreading earths, various aspects must be taken into account, e.g. the kind of earth to be handled by the respective construction machine, the kind of work activities to be performed, the machine’s capacity, the options for controlling the machine’s implement, economic aspects and clients’ requirements. All these points of view must be considered in the decision-making process so that the selected machine is capable of executing the required activity and the use of an unsuitable machine, which would delay the project and increase its costs, is avoided. Reclamation therefore always includes extensive earth-moving work restoring the required relief of the land being reclaimed. Using the earth-moving machine capacity, the kind of soil in the mine dumps, the kind of work activity performed and the machine design, a SW application has been developed that selects the most suitable machine for the respective work technology, with a view to preparing the land intended for reclamation.

  3. Machine Learning

    NASA Astrophysics Data System (ADS)

    Hoffmann, Achim; Mahidadia, Ashesh

    The purpose of this chapter is to present fundamental ideas and techniques of machine learning suitable for the field of this book, i.e., for automated scientific discovery. The chapter focuses on those symbolic machine learning methods which produce results that are suitable to be interpreted and understood by humans. This is particularly important in the context of automated scientific discovery, as the scientific theories to be produced by machines are usually meant to be interpreted by humans. This chapter contains some of the most influential ideas and concepts in machine learning research to give the reader a basic insight into the field. After the introduction in Sect. 1, general ideas of how learning problems can be framed are given in Sect. 2. The section provides useful perspectives to better understand what learning algorithms actually do. Section 3 presents the version space model, which is an early learning algorithm as well as a conceptual framework that provides important insight into the general mechanisms behind most learning algorithms. In Sect. 4, a family of learning algorithms, the AQ family for learning classification rules, is presented. The AQ family belongs to the early approaches in machine learning. Section 5 presents the basic principles of decision tree learners, which belong to the most influential class of inductive learning algorithms today. A more recent group of learning systems is presented in Sect. 6; these learn relational concepts within the framework of logic programming. This is a particularly interesting group of learning systems, since the framework also allows background knowledge to be incorporated, which may assist in generalisation. Section 7 discusses association rules, a technique that comes from the related field of data mining. Section 8 presents the basic idea of the naive Bayesian classifier. While this is a very popular learning technique, the learning result is not well suited for human comprehension, as it is essentially a large collection of probability values. In Sect. 9, we present a generic method for improving the accuracy of a given learner by generating multiple classifiers using variations of the training data. While this works well in most cases, the resulting classifiers have significantly increased complexity and hence tend to destroy the human readability of the learning result that a single learner may produce. Section 10 contains a summary, mentions briefly other techniques not discussed in this chapter and presents an outlook on the potential of machine learning in the future.
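
    As a concrete illustration of the human-readable output that symbolic learners such as decision trees produce, the short sketch below fits a small tree on a standard dataset and prints it as nested rules. This is a generic scikit-learn example, not code from the chapter.

        from sklearn.datasets import load_iris
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Fit a shallow decision tree and print it as nested, human-readable rules.
        iris = load_iris()
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(iris.data, iris.target)
        print(export_text(tree, feature_names=list(iris.feature_names)))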

  4. Quantitative PCR high-resolution melting (qPCR-HRM) curve analysis, a new approach to simultaneously screen point mutations and large rearrangements: application to MLH1 germline mutations in Lynch syndrome.

    PubMed

    Rouleau, Etienne; Lefol, Cédrick; Bourdon, Violaine; Coulet, Florence; Noguchi, Tetsuro; Soubrier, Florent; Bièche, Ivan; Olschwang, Sylviane; Sobol, Hagay; Lidereau, Rosette

    2009-06-01

    Several techniques have been developed to screen mismatch repair (MMR) genes for deleterious mutations. Until now, two different techniques were required to screen for both point mutations and large rearrangements. For the first time, we propose a new approach, called "quantitative PCR (qPCR) high-resolution melting (HRM) curve analysis (qPCR-HRM)," which combines qPCR and HRM to obtain a rapid and cost-effective method suitable for testing a large series of samples. We designed PCR amplicons to scan the MLH1 gene using qPCR-HRM. Seventy-six patients were fully scanned in replicate, including 14 wild-type patients and 62 patients with known mutations (57 point mutations and five rearrangements). To validate the detected mutations, we used sequencing and/or hybridization on a dedicated MLH1 array-comparative genomic hybridization (array-CGH) platform. All point mutations and rearrangements detected by denaturing high-performance liquid chromatography (dHPLC) plus multiplex ligation-dependent probe amplification (MLPA) were successfully detected by qPCR-HRM. Three large rearrangements were characterized with the dedicated MLH1 array-CGH. One variant was detected with qPCR-HRM in a wild-type patient and was located within the reverse primer. One variant was not detected with qPCR-HRM or with dHPLC due to its proximity to a T-stretch. With qPCR-HRM, prescreening for point mutations and large rearrangements is performed in one tube and in one step on a single machine, without the need for any automated sequencer in the prescreening process. Run in replicate, its reagent cost, sensitivity, and specificity are comparable to those of the dHPLC+MLPA techniques. However, qPCR-HRM outperformed the other techniques in terms of rapidity and the amount of data provided.

  5. Design Comparison of Inner and Outer Rotor of Permanent Magnet Flux Switching Machine for Electric Bicycle Application

    NASA Astrophysics Data System (ADS)

    Jusoh, L. I.; Sulaiman, E.; Bahrim, F. S.; Kumar, R.

    2017-08-01

    Recent advancements have led to the development of flux switching machines (FSMs) with flux sources within the stator. The advantage of being a single-piece machine with a robust rotor structure makes the FSM an excellent choice for high-speed applications. There are three categories of FSM, namely the permanent magnet (PM) FSM, the field excitation (FE) FSM, and the hybrid excitation (HE) FSM. The PMFSM and the FEFSM have a PM and a field excitation coil (FEC), respectively, as their main flux sources, while, as the name suggests, the HEFSM combines PMs and FECs as flux sources. The PMFSM is a simple and cheap machine with the ability to control variable flux, which makes it suitable for an electric bicycle. This paper therefore presents a design comparison between an inner rotor and an outer rotor for a single-phase permanent magnet flux switching machine with 8S-10P, designed specifically for an electric bicycle. The performance of the machine was validated using 2D-FEA. In conclusion, the outer-rotor design produces much higher torque than the inner-rotor PMFSM, a difference of approximately 54.2%. The comprehensive analysis of both designs shows that their output performance is lower than that of SRM and IPMSM machines, but also indicates the possibility of increasing the performance through a deterministic optimization method.

  6. Real-Time Deflection Monitoring for Milling of a Thin-Walled Workpiece by Using PVDF Thin-Film Sensors with a Cantilevered Beam as a Case Study

    PubMed Central

    Luo, Ming; Liu, Dongsheng; Luo, Huan

    2016-01-01

    Thin-walled workpieces, such as aero-engine blisks and casings, are usually made of hard-to-cut materials. The wall thickness is very small and the wall deflects easily under dynamic cutting forces during the milling process, leading to inaccurate workpiece dimensions and poor surface integrity. To understand workpiece deflection behavior during machining, a new real-time non-intrusive method for deflection monitoring is presented, and a detailed analysis of workpiece deflection for the different stages of the whole machining process is discussed. A thin-film polyvinylidene fluoride (PVDF) sensor is attached to the non-machined surface of the workpiece to capture the deflection excited by the dynamic cutting force. The relationship between the input deflection and the output voltage of the monitoring system is calibrated by testing. The monitored workpiece deflection results show that the workpiece experiences obvious vibration during the stage when the cutter enters the workpiece, and vibration during the machining process can be easily tracked by monitoring the deflection of the workpiece. During the stage when the cutter exits the workpiece, the workpiece first experiences forced vibration, and free vibration persists until the amplitude reduces to zero after the cutter has exited. Machining results confirmed the suitability of the deflection monitoring system, using PVDF sensors, for machining thin-walled workpieces. PMID:27626424
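
    The calibration step that relates the PVDF output voltage to the workpiece tip deflection can be sketched with a simple least-squares fit, assuming an approximately linear sensor response. The numbers below are invented placeholders, not the paper's calibration data.

        import numpy as np

        # Calibration pairs: known tip deflections (mm) applied to the cantilevered workpiece
        # and the corresponding PVDF monitoring-system output voltages (V). Values are invented.
        deflection_mm = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
        voltage_v     = np.array([0.11, 0.23, 0.33, 0.46, 0.55, 0.68])

        # Least-squares fit of a linear sensitivity model V = k * d + b
        k, b = np.polyfit(deflection_mm, voltage_v, 1)
        print(f"sensitivity k = {k:.3f} V/mm, offset b = {b:.3f} V")

        # During milling, invert the model to turn the monitored voltage trace into deflection.
        monitored_voltage = np.array([0.05, 0.30, 0.62, 0.41])
        print("estimated deflection [mm]:", (monitored_voltage - b) / k)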

  7. The Effects of Different Electrode Types for Obtaining Surface Machining Shape on Shape Memory Alloy Using Electrochemical Machining

    NASA Astrophysics Data System (ADS)

    Choi, S. G.; Kim, S. H.; Choi, W. K.; Moon, G. C.; Lee, E. S.

    2017-06-01

    Shape memory alloy (SMA) is an important material for the medical and aerospace industries due to the shape memory effect, in which a deformed alloy recovers its original state through the application of temperature or stress. Modern applications demand dimensional stability in such parts, and electrochemical machining is one method for meeting these requirements. In some applications, shape memory alloy parts require fine patterns, and the electrochemical machining method is suitable for producing them. For precision electrochemical machining with electrodes of different shapes, the current density must be controlled precisely, and a suitable electrode design is required. Precise square holes can be obtained on the SMA if an insulation layer controls the unnecessary (stray) current between electrode and workpiece. Adjusting this unnecessary current to obtain the desired shape would be a significant contribution to the medical and aerospace industries, since it makes it possible to machine a desired shape into the shape memory alloy by finely controlling the stray current. With a square electrode without an insulation layer, the unnecessary current produces inexact square holes, whereas an electrode insulated only on its sides produces precise square holes. The removal rate also improved with the insulated electrode, because the insulation layer concentrates the applied current in the machining zone.

  8. Predicting human liver microsomal stability with machine learning techniques.

    PubMed

    Sakiyama, Yojiro; Yuki, Hitomi; Moriya, Takashi; Hattori, Kazunari; Suzuki, Misaki; Shimada, Kaoru; Honma, Teruki

    2008-02-01

    To ensure a continuing pipeline in pharmaceutical research, lead candidates must possess appropriate metabolic stability in the drug discovery process. In vitro ADMET (absorption, distribution, metabolism, elimination, and toxicity) screening provides useful information regarding the metabolic stability of compounds. However, before the synthesis stage, an efficient process is required in order to deal with the vast quantity of data from large compound libraries and high-throughput screening. Here we have derived a relationship between chemical structure and metabolic stability for a data set of in-house compounds by means of various in silico machine learning methods such as random forest, support vector machine (SVM), logistic regression, and recursive partitioning. For model building, 1952 proprietary compounds comprising two classes (stable/unstable) were used with 193 descriptors calculated by Molecular Operating Environment. The results using test compounds demonstrated that all classifiers yielded satisfactory results (accuracy > 0.8, sensitivity > 0.9, specificity > 0.6, and precision > 0.8). Above all, classification by random forest as well as SVM yielded kappa values of approximately 0.7 in an independent validation set, slightly higher than the other classification tools. These results suggest that nonlinear/ensemble-based classification methods might prove useful in the area of in silico ADME modeling.
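
    A minimal sketch of this kind of stable/unstable classification with kappa-based validation is shown below. The descriptor matrix and labels are random placeholders with the same shape as the data set described; the code is generic scikit-learn, not the authors' pipeline.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score, cohen_kappa_score

        # Placeholder matrices: 1952 compounds x 193 descriptors, labels stable (1) / unstable (0)
        rng = np.random.default_rng(1)
        X = rng.normal(size=(1952, 193))
        y = rng.integers(0, 2, size=1952)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1, stratify=y)
        for name, clf in [("random forest", RandomForestClassifier(n_estimators=500, random_state=1)),
                          ("SVM", SVC(kernel="rbf", gamma="scale"))]:
            pred = clf.fit(X_tr, y_tr).predict(X_te)
            print(name, "accuracy:", round(accuracy_score(y_te, pred), 3),
                  "kappa:", round(cohen_kappa_score(y_te, pred), 3))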

  9. Detection of Hard Exudates in Colour Fundus Images Using Fuzzy Support Vector Machine-Based Expert System.

    PubMed

    Jaya, T; Dheeba, J; Singh, N Albert

    2015-12-01

    Diabetic retinopathy is a major cause of vision loss in diabetic patients. Currently, there is a need for decision-making using intelligent computer algorithms when screening a large volume of data. This paper presents an expert decision-making system designed using a fuzzy support vector machine (FSVM) classifier to detect hard exudates in fundus images. The optic discs in the colour fundus images are segmented to avoid false alarms, using morphological operations and the circular Hough transform. To discriminate between exudate and non-exudate pixels, colour and texture features are extracted from the images. These features are given as input to the FSVM classifier. The classifier analysed 200 retinal images collected from diabetic retinopathy screening programmes. The tests made on the retinal images show that the proposed detection system has better discriminating power than the conventional support vector machine. With the best combination of FSVM and feature sets, the area under the receiver operating characteristic curve reached 0.9606, which corresponds to a sensitivity of 94.1% with a specificity of 90.0%. The results suggest that detecting hard exudates using the FSVM contributes to computer-assisted detection of diabetic retinopathy and can serve as a decision support system for ophthalmologists.

  10. Methods for Effective Virtual Screening and Scaffold-Hopping in Chemical Compounds

    DTIC Science & Technology

    2007-04-04

    [DTIC report record; documentation-page form fields omitted.] The report contains color images. Recoverable abstract fragments indicate that experiments were run on Opterons with 4 GB of memory, using the descriptor spaces GF, ECZ3, and ErG (described in Section 4 of the report) to evaluate the methods introduced. Cited reference fragment: "... screening: Use of data fusion and machine learning to enhance the effectiveness of similarity searching." J. Chem. Inf. Model., 46:462-470, 2006.

  11. EEG-guided meditation: A personalized approach.

    PubMed

    Fingelkurts, Andrew A; Fingelkurts, Alexander A; Kallio-Tamminen, Tarja

    2015-12-01

    The therapeutic potential of meditation for physical and mental well-being is well documented; however, the possibility of adverse effects warrants further discussion of the suitability of any particular meditation practice for every given participant. This concern highlights the need for a personalized approach to meditation practice, adjusted for the concrete individual. This can be done by using an objective screening procedure that detects the weak and strong cognitive skills in brain function, thus helping to design a tailored meditation training protocol. Quantitative electroencephalography (qEEG) is a suitable tool that allows identification of individual neurophysiological types. Using qEEG screening can aid in developing a meditation training program that maximizes results and minimizes the risk of potential negative effects. This brief theoretical-conceptual review provides a discussion of the problem and presents some illustrative results on the use of qEEG screening for guiding the personalization of meditation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Machine learning of molecular properties: Locality and active learning

    NASA Astrophysics Data System (ADS)

    Gubaev, Konstantin; Podryabinkin, Evgeny V.; Shapeev, Alexander V.

    2018-06-01

    In recent years, machine learning techniques have shown great potential in various problems from a multitude of disciplines, including materials design and drug discovery. The high computational speed on the one hand and accuracy comparable to that of density functional theory on the other make machine learning algorithms efficient for high-throughput screening through chemical and configurational space. However, the machine learning algorithms available in the literature require large training datasets to reach chemical accuracy and also show large errors for the so-called outliers: the out-of-sample molecules not well represented in the training set. In the present paper, we propose a new machine learning algorithm for predicting molecular properties that addresses these two issues: it is based on a local model of interatomic interactions providing high accuracy when trained on relatively small training sets, and an active learning algorithm for optimally choosing the training set that significantly reduces the errors for the outliers. We compare our model to the other state-of-the-art algorithms from the literature on widely used benchmark tests.
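
    The paper's specific selection criterion is not reproduced here, but the general active-learning loop it relies on can be sketched: train on a small labeled set, estimate the model's uncertainty over the unlabeled pool, and add the most uncertain candidate to the training set. The sketch below uses disagreement among the trees of a random forest as a generic uncertainty proxy on synthetic data; all names and data are illustrative assumptions.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(2)
        X_pool = rng.uniform(-1, 1, size=(2000, 8))                 # candidate molecules (descriptors)
        y_pool = np.sin(X_pool).sum(axis=1) + 0.01 * rng.normal(size=2000)  # surrogate property

        labeled = list(rng.choice(2000, size=20, replace=False))    # small initial training set
        for _ in range(10):                                         # active-learning iterations
            model = RandomForestRegressor(n_estimators=100, random_state=0)
            model.fit(X_pool[labeled], y_pool[labeled])
            per_tree = np.stack([t.predict(X_pool) for t in model.estimators_])
            uncertainty = per_tree.std(axis=0)                      # ensemble disagreement
            uncertainty[labeled] = -np.inf                          # never re-select labeled points
            labeled.append(int(uncertainty.argmax()))               # query the most uncertain candidate
        print("training-set size after selection:", len(labeled))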

  13. Scale effects and a method for similarity evaluation in micro electrical discharge machining

    NASA Astrophysics Data System (ADS)

    Liu, Qingyu; Zhang, Qinhe; Wang, Kan; Zhu, Guang; Fu, Xiuzhuo; Zhang, Jianhua

    2016-08-01

    Electrical discharge machining (EDM) is a promising non-traditional micro machining technology that offers a vast array of applications in the manufacturing industry. However, scale effects occur when machining at the micro scale, which can make it difficult to predict and optimize the machining performance of micro EDM. A new concept of "scale effects" in micro EDM is proposed; these scale effects reveal the differences in machining performance between micro EDM and conventional macro EDM. Similarity theory is presented to evaluate the scale effects in micro EDM. Single-factor experiments are conducted and the experimental results are analyzed by discussing the similarity difference and similarity precision. The results show that the outputs affected by scale effects in micro EDM do not change linearly with the discharge parameters. The values of similarity precision of machining time increase significantly when scaling down the capacitance or the open-circuit voltage. This indicates that the lower the scale of the discharge parameter, the greater the deviation of the non-geometrical similarity degree from the geometrical similarity degree, which means that a micro EDM system with lower discharge energy experiences stronger scale effects. The largest similarity difference is 5.34, while the largest similarity precision can be as high as 114.03. It is suggested that similarity precision is more effective than similarity difference in reflecting the scale effects and their fluctuation. Consequently, similarity theory is suitable for evaluating the scale effects in micro EDM. The proposed research offers engineering value for optimizing the machining parameters and improving the machining performance of micro EDM.

  14. A Primer for DoD Reliability, Maintainability, Safety, and Logistics Standards, 1992

    DTIC Science & Technology

    1991-10-01

    [DTIC primer record; OCR-garbled form fields and flowchart text omitted.] Recoverable fragments: environmental stress screens are applied to equipment in the order of assembly level (i.e., assembly, unit and system screens); Screening Strength (SS) is defined as the probability that a screen ... equipment suitability for its intended operational environment; c. verify contractual compliance; each test method is divided into two ...

  15. Development of system decision support tools for behavioral trends monitoring of machinery maintenance in a competitive environment

    NASA Astrophysics Data System (ADS)

    Adeyeri, Michael Kanisuru; Mpofu, Khumbulani

    2017-06-01

    The article centres on the development of a software system for a manufacturing company that produces polyethylene bags using mostly conventional machines, in a competitive world where each business enterprise strives to stand out. The system is meant to assist in gaining market share and in taking maintenance and production decisions, through the dynamism and flexibility embedded in the package, as customer demand varies under the pressure of meeting set goals. The production and machine condition monitoring software (PMCMS) is programmed in C# and designed to support hardware integration, real-time machine condition monitoring based on a condition-based maintenance approach, maintenance decision suggestions, and suitable production strategies as product demand keeps changing in a highly competitive environment. PMCMS works with an embedded device that feeds it with data from the various machines being monitored at the workstation; the data are transmitted via a wireless transceiver, read at the base station and stored in a database. A case study was used in the implementation of the developed system, and the results show that it can monitor machine health effectively: it displays machine health status, suggests repairs for probable faults, and decides strategies for both production and maintenance, and can thus markedly enhance maintenance performance.

  16. Application of Machine Learning to Proteomics Data: Classification and Biomarker Identification in Postgenomics Biology

    PubMed Central

    Swan, Anna Louise; Mobasheri, Ali; Allaway, David; Liddell, Susan

    2013-01-01

    Abstract Mass spectrometry is an analytical technique for the characterization of biological samples and is increasingly used in omics studies because of its targeted, nontargeted, and high throughput abilities. However, due to the large datasets generated, it requires informatics approaches such as machine learning techniques to analyze and interpret relevant data. Machine learning can be applied to MS-derived proteomics data in two ways. First, directly to mass spectral peaks and second, to proteins identified by sequence database searching, although relative protein quantification is required for the latter. Machine learning has been applied to mass spectrometry data from different biological disciplines, particularly for various cancers. The aims of such investigations have been to identify biomarkers and to aid in diagnosis, prognosis, and treatment of specific diseases. This review describes how machine learning has been applied to proteomics tandem mass spectrometry data. This includes how it can be used to identify proteins suitable for use as biomarkers of disease and for classification of samples into disease or treatment groups, which may be applicable for diagnostics. It also includes the challenges faced by such investigations, such as prediction of proteins present, protein quantification, planning for the use of machine learning, and small sample sizes. PMID:24116388

  17. Machine learning and next-generation asteroid surveys

    NASA Astrophysics Data System (ADS)

    Nugent, Carrie R.; Dailey, John; Cutri, Roc M.; Masci, Frank J.; Mainzer, Amy K.

    2017-10-01

    Next-generation surveys such as NEOCam (Mainzer et al., 2016) will sift through tens of millions of point source detections daily to detect and discover asteroids. This requires new, more efficient techniques to distinguish between solar system objects, background stars and galaxies, and artifacts such as cosmic rays, scattered light and diffraction spikes. Supervised machine learning is a set of algorithms that allows computers to learn a classification from a training set and then apply that classification to make predictions on new datasets. It has been employed by a broad range of fields, including computer vision, medical diagnosis, economics, and natural language processing. It has also been applied to astronomical datasets, including transient identification in the Palomar Transient Factory pipeline (Masci et al., 2016) and in the Pan-STARRS1 difference imaging (D. E. Wright et al., 2015). As part of the NEOCam extended phase A work, we apply machine learning techniques to the problem of asteroid detection. Asteroid detection is an ideal application of supervised learning, as there is a wealth of metrics associated with each extracted source, and suitable training sets are easily created. Using the vetted NEOWISE dataset (E. L. Wright et al., 2010; Mainzer et al., 2011) as a proof of concept of this technique, we applied the Python package sklearn. We report on reliability, feature set selection, and the suitability of various algorithms.
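
    A minimal sketch of this kind of real-versus-artifact classification with sklearn is shown below; the per-detection metrics, labels and feature names are placeholders, not the NEOWISE training data.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import classification_report

        # Placeholder data: each extracted source gets a vector of metrics (e.g. signal-to-noise,
        # PSF-fit quality, ellipticity, inter-exposure motion); label 1 = asteroid, 0 = star/artifact.
        rng = np.random.default_rng(3)
        X = rng.normal(size=(50000, 12))
        y = rng.integers(0, 2, size=50000)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=3)
        clf = RandomForestClassifier(n_estimators=300, n_jobs=-1, random_state=3).fit(X_tr, y_tr)
        print(classification_report(y_te, clf.predict(X_te)))       # reliability on held-out detections
        print("top metrics:", np.argsort(clf.feature_importances_)[::-1][:5])  # feature-set selection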

  18. 30 CFR 56.13021 - High-pressure hose connections.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    30 CFR 56.13021 (Mineral Resources, revised as of 2010-07-01), "... and Boilers", § 56.13021 High-pressure hose connections: Except where automatic shutoff valves are used, safety chains or other suitable locking devices shall be used at connections to machines of high-pressure ...

  19. Poetic Machines: From Paper to Pixel

    ERIC Educational Resources Information Center

    Naji, Jeneen

    2013-01-01

    This chapter investigates digital methods of signification in order to examine the impact of the digital medium on poetic expression. Traditional poetry criticism is problematised with reference to its suitability for application to online works in order to develop a comprehensive ePoetry rhetoric that explores not only what is being said, but…

  20. Machinability of some dentin simulating materials.

    PubMed

    Möllersten, L

    1985-01-01

    Machinability in low speed drilling was investigated for pure aluminium, Frasaco teeth, ivory, plexiglass and human dentin. The investigation was performed in order to find a suitable test material for drilling experiments using paralleling instruments. A material simulating human dentin in terms of cuttability at low drilling speeds was sought. Tests were performed using a specially designed apparatus. Holes to a depth of 2 mm were drilled with a twist drill using a constant feeding force. The time required was registered. The machinability of the materials tested was determined by direct comparison of the drilling times. As regards cuttability, first aluminium and then ivory were found to resemble human dentin most closely. By comparing drilling time variances the homogeneity of the materials tested was estimated. Aluminium, Frasaco teeth and plexiglass demonstrated better homogeneity than ivory and human dentin.

  1. Classification of the Regional Ionospheric Disturbance Based on Machine Learning Techniques

    NASA Astrophysics Data System (ADS)

    Terzi, Merve Begum; Arikan, Orhan; Karatay, Secil; Arikan, Feza; Gulyaeva, Tamara

    2016-08-01

    In this study, Total Electron Content (TEC) estimated from GPS receivers is used to model the regional and local variability that differs from global activity, along with solar and geomagnetic indices. For the automated classification of regional disturbances, a classification technique based on Support Vector Machines (SVM), a robust machine learning technique that has found widespread use, is proposed. The performance of the developed classification technique is demonstrated for the midlatitude ionosphere over Anatolia, using TEC estimates generated from GPS data provided by the Turkish National Permanent GPS Network (TNPGN-Active) for the solar maximum year of 2011. By applying the developed classification technique to Global Ionospheric Map (GIM) TEC data provided by the NASA Jet Propulsion Laboratory (JPL), it is shown that SVM can be a suitable learning method for detecting anomalies in TEC variations.

  2. AstroML: "better, faster, cheaper" towards state-of-the-art data mining and machine learning

    NASA Astrophysics Data System (ADS)

    Ivezic, Zeljko; Connolly, Andrew J.; Vanderplas, Jacob

    2015-01-01

    We present AstroML, a Python module for machine learning and data mining built on numpy, scipy, scikit-learn, matplotlib, and astropy, and distributed under an open license. AstroML contains a growing library of statistical and machine learning routines for analyzing astronomical data in Python, loaders for several open astronomical datasets (such as SDSS and other recent major surveys), and a large suite of examples of analyzing and visualizing astronomical datasets. AstroML is especially suitable for introducing undergraduate students to numerical research projects and for graduate students to rapidly undertake cutting-edge research. The long-term goal of astroML is to provide a community repository for fast Python implementations of common tools and routines used for statistical data analysis in astronomy and astrophysics (see http://www.astroml.org).

  3. A machine learning heuristic to identify biologically relevant and minimal biomarker panels from omics data

    PubMed Central

    2015-01-01

    Background Investigations into novel biomarkers using omics techniques generate large amounts of data. Due to their size and numbers of attributes, these data are suitable for analysis with machine learning methods. A key component of typical machine learning pipelines for omics data is feature selection, which is used to reduce the raw high-dimensional data into a tractable number of features. Feature selection needs to balance the objective of using as few features as possible, while maintaining high predictive power. This balance is crucial when the goal of data analysis is the identification of highly accurate but small panels of biomarkers with potential clinical utility. In this paper we propose a heuristic for the selection of very small feature subsets, via an iterative feature elimination process that is guided by rule-based machine learning, called RGIFE (Rule-guided Iterative Feature Elimination). We use this heuristic to identify putative biomarkers of osteoarthritis (OA), articular cartilage degradation and synovial inflammation, using both proteomic and transcriptomic datasets. Results and discussion Our RGIFE heuristic increased the classification accuracies achieved for all datasets when no feature selection is used, and performed well in a comparison with other feature selection methods. Using this method the datasets were reduced to a smaller number of genes or proteins, including those known to be relevant to OA, cartilage degradation and joint inflammation. The results have shown the RGIFE feature reduction method to be suitable for analysing both proteomic and transcriptomics data. Methods that generate large ‘omics’ datasets are increasingly being used in the area of rheumatology. Conclusions Feature reduction methods are advantageous for the analysis of omics data in the field of rheumatology, as the applications of such techniques are likely to result in improvements in diagnosis, treatment and drug discovery. PMID:25923811
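
    RGIFE itself is a rule-based, iterative elimination heuristic; as a simplified stand-in (not the published algorithm), the sketch below uses scikit-learn's generic recursive feature elimination to reduce a synthetic "omics-like" dataset to a small candidate panel and check its cross-validated accuracy.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import RFE
        from sklearn.model_selection import cross_val_score

        # Synthetic "omics-like" data: many attributes, few samples, few informative features.
        X, y = make_classification(n_samples=120, n_features=500, n_informative=10, random_state=4)

        selector = RFE(RandomForestClassifier(n_estimators=200, random_state=4),
                       n_features_to_select=10, step=0.2)           # drop 20% of features per round
        selector.fit(X, y)
        panel = selector.get_support(indices=True)                  # candidate biomarker panel
        print("selected feature indices:", panel)
        print("5-fold CV accuracy on the panel:",
              cross_val_score(RandomForestClassifier(n_estimators=200, random_state=4),
                              X[:, panel], y, cv=5).mean())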

  4. Analysis of Pharmacists' Attitudes toward a Distance Learning Initiative on Health Screening.

    ERIC Educational Resources Information Center

    Whiteman, Jane; And Others

    1994-01-01

    A survey of 436 community pharmacists completing a distance learning (DL) course of continuing education (CE) in health screening, and 117 nonparticipants, found participants more positively disposed toward DL. Most found DL enjoyable and more suitable than other CE methods. More females and fewer males than expected requested and completed the…

  5. Comparison of diverse nanomaterial bioactivity profiles based on high-throughput screening (HTS) in ToxCast™ (FutureToxII)

    EPA Science Inventory

    Most nanomaterials (NMs) in commerce lack hazard data. Efficient NM testing requires suitable toxicity tests for prioritization of NMs to be tested. The EPA’s ToxCast program is screening NM bioactivities and ranking NMs by their bioactivities to inform targeted testing planning....

  6. Improved Success of Sparse Matrix Protein Crystallization Screening with Heterogeneous Nucleating Agents

    PubMed Central

    Thakur, Anil S.; Robin, Gautier; Guncar, Gregor; Saunders, Neil F. W.; Newman, Janet; Martin, Jennifer L.; Kobe, Bostjan

    2007-01-01

    Background Crystallization is a major bottleneck in the process of macromolecular structure determination by X-ray crystallography. Successful crystallization requires the formation of nuclei and their subsequent growth to crystals of suitable size. Crystal growth generally occurs spontaneously in a supersaturated solution as a result of homogenous nucleation. However, in a typical sparse matrix screening experiment, precipitant and protein concentration are not sampled extensively, and supersaturation conditions suitable for nucleation are often missed. Methodology/Principal Findings We tested the effect of nine potential heterogenous nucleating agents on crystallization of ten test proteins in a sparse matrix screen. Several nucleating agents induced crystal formation under conditions where no crystallization occurred in the absence of the nucleating agent. Four nucleating agents: dried seaweed; horse hair; cellulose and hydroxyapatite, had a considerable overall positive effect on crystallization success. This effect was further enhanced when these nucleating agents were used in combination with each other. Conclusions/Significance Our results suggest that the addition of heterogeneous nucleating agents increases the chances of crystal formation when using sparse matrix screens. PMID:17971854

  7. Machine Learning-based Virtual Screening and Its Applications to Alzheimer's Drug Discovery: A Review.

    PubMed

    Carpenter, Kristy A; Huang, Xudong

    2018-06-07

    Virtual Screening (VS) has emerged as an important tool in the drug development process, as it conducts efficient in silico searches over millions of compounds, ultimately increasing yields of potential drug leads. As a subset of Artificial Intelligence (AI), Machine Learning (ML) is a powerful way of conducting VS for drug leads. ML for VS generally involves assembling a filtered training set of compounds comprised of known actives and inactives. After training, the model is validated and, if sufficiently accurate, used on previously unseen databases to screen for novel compounds with the desired drug-target binding activity. The study aims to review ML-based methods used for VS and their applications to Alzheimer's disease (AD) drug discovery. To update the current knowledge on ML for VS, we review thorough backgrounds, explanations, and VS applications of the following ML techniques: Naïve Bayes (NB), k-Nearest Neighbors (kNN), Support Vector Machines (SVM), Random Forests (RF), and Artificial Neural Networks (ANN). All of these techniques have found success in VS, but the future of VS is likely to lean more heavily toward the use of neural networks, and more specifically Convolutional Neural Networks (CNN), a subset of ANN that utilize convolution. We additionally conceptualize a workflow for conducting ML-based VS for potential therapeutics for AD, a complex neurodegenerative disease with no known cure or means of prevention. This serves both as an example of how to apply the concepts introduced earlier in the review and as a potential workflow for future implementation. Different ML techniques are powerful tools for VS, each with its own advantages and disadvantages. ML-based VS can be applied to AD drug development. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
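
    The general ML-for-VS workflow described (train on known actives/inactives, validate, then rank an unseen library) can be sketched with one of the reviewed techniques, Naïve Bayes. Here a Bernoulli Naïve Bayes model is applied to binary fingerprint vectors; the fingerprints and activity labels are random placeholders assumed to have been computed elsewhere.

        import numpy as np
        from sklearn.naive_bayes import BernoulliNB
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        # Placeholder data: X holds binary molecular fingerprints (e.g. 2048-bit ECFP vectors
        # computed elsewhere); y marks known actives (1) versus inactives/decoys (0).
        rng = np.random.default_rng(5)
        X = rng.integers(0, 2, size=(5000, 2048))
        y = rng.integers(0, 2, size=5000)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=5, stratify=y)
        model = BernoulliNB().fit(X_tr, y_tr)
        scores = model.predict_proba(X_te)[:, 1]                    # predicted probability of activity
        print("ranking quality (ROC AUC):", roc_auc_score(y_te, scores))
        top_hits = np.argsort(scores)[::-1][:50]                    # compounds to forward for assay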

  8. Partitioned learning of deep Boltzmann machines for SNP data.

    PubMed

    Hess, Moritz; Lenz, Stefan; Blätte, Tamara J; Bullinger, Lars; Binder, Harald

    2017-10-15

    Learning the joint distributions of measurements, and in particular identification of an appropriate low-dimensional manifold, has been found to be a powerful ingredient of deep learning approaches. Yet, such approaches have hardly been applied to single nucleotide polymorphism (SNP) data, probably due to the high number of features typically exceeding the number of studied individuals. After a brief overview of how deep Boltzmann machines (DBMs), a deep learning approach, can be adapted to SNP data in principle, we specifically present a way to alleviate the dimensionality problem by partitioned learning. We propose a sparse regression approach to coarsely screen the joint distribution of SNPs, followed by training several DBMs on SNP partitions that were identified by the screening. Aggregate features representing SNP patterns and the corresponding SNPs are extracted from the DBMs by a combination of statistical tests and sparse regression. In simulated case-control data, we show how this can uncover complex SNP patterns and augment results from univariate approaches, while maintaining type 1 error control. Time-to-event endpoints are considered in an application with acute myeloid leukemia patients, where SNP patterns are modeled after a pre-screening based on gene expression data. The proposed approach identified three SNPs that seem to jointly influence survival in a validation dataset. This indicates the added value of jointly investigating SNPs compared to standard univariate analyses and makes partitioned learning of DBMs an interesting complementary approach when analyzing SNP data. A Julia package is provided at 'http://github.com/binderh/BoltzmannMachines.jl'. binderh@imbi.uni-freiburg.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  9. Suitability of virtual prototypes to support human factors/ergonomics evaluation during the design.

    PubMed

    Aromaa, Susanna; Väänänen, Kaisa

    2016-09-01

    In recent years, the use of virtual prototyping has increased in product development processes, especially in the assessment of complex systems targeted at end-users. The purpose of this study was to evaluate the suitability of virtual prototyping to support human factors/ergonomics evaluation (HFE) during the design phase. Two different virtual prototypes were used: augmented reality (AR) and virtual environment (VE) prototypes of a maintenance platform of a rock crushing machine. Nineteen designers and other stakeholders were asked to assess the suitability of the prototype for HFE evaluation. Results indicate that the system model characteristics and user interface affect the experienced suitability. The VE system was valued as being more suitable to support the assessment of visibility, reach, and the use of tools than the AR system. The findings of this study can be used as a guidance for the implementing virtual prototypes in the product development process. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Development of a sterilizing in-place application for a production machine using Vaporized Hydrogen Peroxide.

    PubMed

    Mau, T; Hartmann, V; Burmeister, J; Langguth, P; Häusler, H

    2004-01-01

    The use of steam in sterilization processes is limited by the implementation of heat-sensitive components inside the machines to be sterilized. Alternative low-temperature sterilization methods need to be found and their suitability evaluated. Vaporized Hydrogen Peroxide (VHP) technology was adapted for a production machine consisting of highly sensitive pressure sensors and thermo-labile air tube systems. This new kind of "cold" surface sterilization, known from the Barrier Isolator Technology, is based on the controlled release of hydrogen peroxide vapour into sealed enclosures. A mobile VHP generator was used to generate the hydrogen peroxide vapour. The unit was combined with the air conduction system of the production machine. Terminal vacuum pumps were installed to distribute the gas within the production machine and for its elimination. In order to control the sterilization process, different physical process monitors were incorporated. The validation of the process was based on biological indicators (Geobacillus stearothermophilus). The Limited Spearman Karber Method (LSKM) was used to statistically evaluate the sterilization process. The results show that it is possible to sterilize surfaces in a complex tube system with the use of gaseous hydrogen peroxide. A total microbial reduction of 6 log units was reached.

  11. Relative optical navigation around small bodies via Extreme Learning Machine

    NASA Astrophysics Data System (ADS)

    Law, Andrew M.

    To perform close-proximity operations in a low-gravity environment, relative and absolute positions are vital information for the maneuver; hence navigation is inseparably integrated into space travel. Extreme Learning Machine (ELM) is presented as an optical navigation method around small celestial bodies. Optical navigation uses visual observation instruments such as a camera to acquire useful data and determine spacecraft position. The required input data for operation are merely a single image strip and a nadir image. ELM is a machine learning algorithm for training Single-hidden-Layer feed-Forward Networks (SLFNs), a type of neural network (NN). The algorithm is built on the premise that input weights and biases can be randomly assigned and that back-propagation is not required; the learned model consists of the output-layer weights, which are used to calculate a prediction. Together, Extreme Learning Machine Optical Navigation (ELM OpNav) utilizes optical images and the ELM algorithm to train the machine to navigate around a target body. In this thesis the asteroid Vesta is the designated celestial body. The trained ELMs estimate the position of the spacecraft during operation with a single data set. The results show the approach is promising and potentially suitable for on-board navigation.
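
    The ELM training principle described above (random, fixed input weights and biases; output weights solved by least squares) can be sketched in a few lines of NumPy. The image features and position targets below are random placeholders, not the thesis data.

        import numpy as np

        def elm_train(X, T, n_hidden=200, seed=0):
            """Train an extreme learning machine: random hidden layer, least-squares output layer."""
            rng = np.random.default_rng(seed)
            W = rng.normal(size=(X.shape[1], n_hidden))     # random input weights (never updated)
            b = rng.normal(size=n_hidden)                   # random hidden biases
            H = np.tanh(X @ W + b)                          # hidden-layer activations
            beta = np.linalg.pinv(H) @ T                    # output weights via the pseudoinverse
            return W, b, beta

        def elm_predict(X, W, b, beta):
            return np.tanh(X @ W + b) @ beta

        # Placeholder stand-ins for image-derived inputs and relative-position targets
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 64))                     # e.g. flattened image-strip features
        T = rng.normal(size=(1000, 3))                      # e.g. relative position (x, y, z)
        W, b, beta = elm_train(X, T)
        print(elm_predict(X[:5], W, b, beta))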

  12. Two High Throughput Screen Assays for Measurement of TNF-α in THP-1 Cells

    PubMed Central

    Leister, Kristin P; Huang, Ruili; Goodwin, Bonnie L; Chen, Andrew; Austin, Christopher P; Xia, Menghang

    2011-01-01

    Tumor Necrosis Factor-α (TNF-α), a secreted cytokine, plays an important role in inflammatory diseases and immune disorders, and is a potential target for drug development. The traditional assays for detecting TNF-α, enzyme-linked immunosorbent assay (ELISA) and radioimmunoassay, are not suitable for large compound screens: both suffer from a complicated protocol, multiple plate-wash steps and/or excessive radioactive waste. A simple and quick measurement of TNF-α production in a cell-based assay is needed for high-throughput screening to identify lead compounds from a compound library. We have developed and optimized two homogeneous TNF-α assays using the HTRF (homogeneous time-resolved fluorescence) and AlphaLISA assay formats. We validated the HTRF-based TNF-α assay in a 1536-well plate format by screening a library of 1280 pharmacologically active compounds. The active compounds identified in the screen were confirmed in the AlphaLISA TNF-α assay, which uses a bead-based technology, and were also confirmed in a traditional ELISA assay. From this study, several beta-adrenergic agonists were identified as TNF-α inhibitors. We also identified several novel inhibitors of TNF-α, such as BTO-1, CCG-2046, ellipticine, and PD 169316. The results demonstrate that both homogeneous TNF-α assays are robust and suitable for high-throughput screening. PMID:21643507

  13. Generation of an arrayed CRISPR-Cas9 library targeting epigenetic regulators: from high-content screens to in vivo assays

    PubMed Central

    2017-01-01

    ABSTRACT The CRISPR-Cas9 system has revolutionized genome engineering, allowing precise modification of DNA in various organisms. The most popular method for conducting CRISPR-based functional screens involves the use of pooled lentiviral libraries in selection screens coupled with next-generation sequencing. Screens employing genome-scale pooled small guide RNA (sgRNA) libraries are demanding, particularly when complex assays are used. Furthermore, pooled libraries are not suitable for microscopy-based high-content screens or for systematic interrogation of protein function. To overcome these limitations and exploit CRISPR-based technologies to comprehensively investigate epigenetic mechanisms, we have generated a focused sgRNA library targeting 450 epigenetic regulators with multiple sgRNAs in human cells. The lentiviral library is available both in an arrayed and pooled format and allows temporally-controlled induction of gene knock-out. Characterization of the library showed high editing activity of most sgRNAs and efficient knock-out at the protein level in polyclonal populations. The sgRNA library can be used for both selection and high-content screens, as well as for targeted investigation of selected proteins without requiring isolation of knock-out clones. Using a variety of functional assays we show that the library is suitable for both in vitro and in vivo applications, representing a unique resource to study epigenetic mechanisms in physiological and pathological conditions. PMID:29327641

  14. Diagnosis by Volatile Organic Compounds in Exhaled Breath from Lung Cancer Patients Using Support Vector Machine Algorithm

    PubMed Central

    Sakumura, Yuichi; Koyama, Yutaro; Tokutake, Hiroaki; Hida, Toyoaki; Sato, Kazuo; Itoh, Toshio; Akamatsu, Takafumi; Shin, Woosuck

    2017-01-01

    Monitoring exhaled breath is a very attractive, noninvasive screening technique for early diagnosis of diseases, especially lung cancer. However, the technique provides insufficient accuracy because the exhaled air has many crucial volatile organic compounds (VOCs) at very low concentrations (ppb level). We analyzed the breath exhaled by lung cancer patients and healthy subjects (controls) using gas chromatography/mass spectrometry (GC/MS), and performed a subsequent statistical analysis to diagnose lung cancer based on the combination of multiple lung cancer-related VOCs. We detected 68 VOCs as marker species using GC/MS analysis. We reduced the number of VOCs and used support vector machine (SVM) algorithm to classify the samples. We observed that a combination of five VOCs (CHN, methanol, CH3CN, isoprene, 1-propanol) is sufficient for 89.0% screening accuracy, and hence, it can be used for the design and development of a desktop GC-sensor analysis system for lung cancer. PMID:28165388

  15. Diagnosis by Volatile Organic Compounds in Exhaled Breath from Lung Cancer Patients Using Support Vector Machine Algorithm.

    PubMed

    Sakumura, Yuichi; Koyama, Yutaro; Tokutake, Hiroaki; Hida, Toyoaki; Sato, Kazuo; Itoh, Toshio; Akamatsu, Takafumi; Shin, Woosuck

    2017-02-04

    Monitoring exhaled breath is a very attractive, noninvasive screening technique for early diagnosis of diseases, especially lung cancer. However, the technique provides insufficient accuracy because the exhaled air has many crucial volatile organic compounds (VOCs) at very low concentrations (ppb level). We analyzed the breath exhaled by lung cancer patients and healthy subjects (controls) using gas chromatography/mass spectrometry (GC/MS), and performed a subsequent statistical analysis to diagnose lung cancer based on the combination of multiple lung cancer-related VOCs. We detected 68 VOCs as marker species using GC/MS analysis. We reduced the number of VOCs and used support vector machine (SVM) algorithm to classify the samples. We observed that a combination of five VOCs (CHN, methanol, CH₃CN, isoprene, 1-propanol) is sufficient for 89.0% screening accuracy, and hence, it can be used for the design and development of a desktop GC-sensor analysis system for lung cancer.
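
    The final classification step, an SVM over a handful of VOC concentrations, can be sketched as follows; the breath-sample table is a random placeholder, not the GC/MS measurements, and only the column meaning follows the paper.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        # Placeholder table: rows are breath samples, columns are ppb-level concentrations of
        # the five marker VOCs (CHN, methanol, CH3CN, isoprene, 1-propanol); y = 1 for patients.
        rng = np.random.default_rng(6)
        X = rng.lognormal(mean=1.0, sigma=0.5, size=(100, 5))
        y = rng.integers(0, 2, size=100)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        print("cross-validated screening accuracy:", cross_val_score(clf, X, y, cv=5).mean())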

  16. Materials Screening for the Discovery of New Half-Heuslers: Machine Learning versus ab Initio Methods.

    PubMed

    Legrain, Fleur; Carrete, Jesús; van Roekeghem, Ambroise; Madsen, Georg K H; Mingo, Natalio

    2018-01-18

    Machine learning (ML) is increasingly becoming a helpful tool in the search for novel functional compounds. Here we use classification via random forests to predict the stability of half-Heusler (HH) compounds, using only experimentally reported compounds as a training set. Cross-validation yields an excellent agreement between the fraction of compounds classified as stable and the actual fraction of truly stable compounds in the ICSD. The ML model is then employed to screen 71 178 different 1:1:1 compositions, yielding 481 likely stable candidates. The predicted stability of HH compounds from three previous high-throughput ab initio studies is critically analyzed from the perspective of the alternative ML approach. The incomplete consistency among the three separate ab initio studies and between them and the ML predictions suggests that additional factors beyond those considered by ab initio phase stability calculations might be determinant to the stability of the compounds. Such factors can include configurational entropies and quasiharmonic contributions.
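
    The classify-then-screen strategy described can be sketched generically: cross-validate a random forest on known compounds, compare the predicted and actual fractions of stable entries, then score unexplored compositions. The descriptors, labels and candidate set below are random placeholders, not the ICSD-derived data.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_predict

        # Placeholder data: composition-derived descriptors for known 1:1:1 ABC entries
        # (e.g. radii, electronegativities, valence-electron counts); y = 1 if reported stable.
        rng = np.random.default_rng(7)
        X = rng.normal(size=(3000, 20))
        y = rng.integers(0, 2, size=3000)

        clf = RandomForestClassifier(n_estimators=500, random_state=7)
        pred = cross_val_predict(clf, X, y, cv=5)
        print("fraction predicted stable:", pred.mean(), "actual fraction:", y.mean())

        # Train on all reported compounds, then screen the unexplored composition space.
        clf.fit(X, y)
        candidates = rng.normal(size=(71178, 20))            # descriptors of unreported compositions
        print("likely stable candidates:", int((clf.predict(candidates) == 1).sum()))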

  17. Choice and maintenance of equipment for electron crystallography.

    PubMed

    Mills, Deryck J; Vonck, Janet

    2013-01-01

    The choice of equipment for an electron crystallography laboratory will ultimately be determined by the available budget; nevertheless, the ideal lab will have two electron microscopes: a dedicated 300 kV cryo-EM with a field emission gun and a smaller LaB6 machine for screening. The high-end machine should be equipped with photographic film or a very large CCD or CMOS camera for 2D crystal data collection; the screening microscope needs a mid-size CCD for rapid evaluation of crystal samples. The microscope room installations should provide adequate space and a special environment that puts no restrictions on the collection of high-resolution data. Equipment for specimen preparation includes a carbon coater, glow discharge unit, light microscope, plunge freezer, and liquid nitrogen containers and storage dewars. When photographic film is to be used, additional requirements are a film desiccator, dark room, optical diffractometer, and a film scanner. Having the electron microscopes and ancillary equipment well maintained and always in optimum condition facilitates the production of high-quality data.

  18. Active machine learning-driven experimentation to determine compound effects on protein patterns

    PubMed Central

    Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F

    2016-01-01

    High throughput screening determines the effects of many conditions on a given biological target. Currently, to estimate the effects of those conditions on other targets requires either strong modeling assumptions (e.g. similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance. DOI: http://dx.doi.org/10.7554/eLife.10047.001 PMID:26840049
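
    The published algorithm is not reproduced in the record, so the sketch below shows only the generic idea behind active learning-driven experimentation: refit a model after each small batch of "experiments" and pick the next batch where the model is least certain. The oracle, features, and batch size are invented for illustration.

```python
# Generic uncertainty-sampling active-learning loop (a stand-in, not the published method).
# Each row of `pool` represents one possible (compound, protein) experiment.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
pool = rng.normal(size=(1000, 8))                        # unlabeled experiments
true_labels = (pool[:, 0] + pool[:, 1] > 0).astype(int)  # hypothetical oracle outcomes

# start with one known example of each outcome
labeled = [int(np.argmin(true_labels)), int(np.argmax(true_labels))]
model = LogisticRegression()

for _ in range(20):                             # 20 rounds of 5 experiments each
    model.fit(pool[labeled], true_labels[labeled])
    proba = model.predict_proba(pool)[:, 1]
    uncertainty = -np.abs(proba - 0.5)          # closest to 0.5 = most uncertain
    uncertainty[labeled] = -np.inf              # never repeat finished experiments
    batch = np.argsort(uncertainty)[-5:]        # next experiments to perform
    labeled.extend(batch.tolist())

print(f"performed {len(labeled)} of {len(pool)} possible experiments")
```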

  19. An HTRF® Assay for the Protein Kinase ATM.

    PubMed

    Adams, Phillip; Clark, Jonathan; Hawdon, Simon; Hill, Jennifer; Plater, Andrew

    2017-01-01

    Ataxia telangiectasia mutated (ATM) is a serine/threonine kinase that plays a key role in the regulation of DNA damage pathways and checkpoint arrest. In recent years, there has been growing interest in ATM as a therapeutic target due to its association with cancer cell survival following genotoxic stress such as radio- and chemotherapy. Large-scale targeted drug screening campaigns have been hampered, however, by technical issues associated with the production of sufficient quantities of purified ATM and the availability of a suitable high-throughput assay. Using a purified, functionally active recombinant ATM and one of its physiological substrates, p53, we have developed an in vitro FRET-based activity assay that is suitable for high-throughput drug screening.

  20. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    PubMed

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Not all trust is created equal: dispositional and history-based trust in human-automation interactions.

    PubMed

    Merritt, Stephanie M; Ilgen, Daniel R

    2008-04-01

    We provide an empirical demonstration of the importance of attending to human user individual differences in examinations of trust and automation use. Past research has generally supported the notions that machine reliability predicts trust in automation, and trust in turn predicts automation use. However, links between user personality and perceptions of the machine with trust in automation have not been empirically established. On our X-ray screening task, 255 students rated trust and made automation use decisions while visually searching for weapons in X-ray images of luggage. We demonstrate that individual differences affect perceptions of machine characteristics when actual machine characteristics are constant, that perceptions account for 52% of trust variance above the effects of actual characteristics, and that perceptions mediate the effects of actual characteristics on trust. Importantly, we also demonstrate that when administered at different times, the same six trust items reflect two types of trust (dispositional trust and history-based trust) and that these two trust constructs are differentially related to other variables. Interactions were found among user characteristics, machine characteristics, and automation use. Our results suggest that increased specificity in the conceptualization and measurement of trust is required, future researchers should assess user perceptions of machine characteristics in addition to actual machine characteristics, and incorporation of user extraversion and propensity to trust machines can increase prediction of automation use decisions. Potential applications include the design of flexible automation training programs tailored to individuals who differ in systematic ways.

  2. Underground coal mine instrumentation and test

    NASA Technical Reports Server (NTRS)

    Burchill, R. F.; Waldron, W. D.

    1976-01-01

    The need to evaluate the mechanical performance of mine tools and to obtain test performance data from candidate systems dictates that an engineering data recording system be built. Because of the wide range of test parameters which would be evaluated, a general purpose data gathering system was designed and assembled to permit maximum versatility. A primary objective of this program was to provide a specific operating evaluation of a longwall mining machine vibration response under normal operating conditions. A number of mines were visited and a candidate for test evaluation was selected, based upon management cooperation, machine suitability, and mine conditions. Actual mine testing took place in a West Virginia mine.

  3. Technics study on high accuracy crush dressing and sharpening of diamond grinding wheel

    NASA Astrophysics Data System (ADS)

    Jia, Yunhai; Lu, Xuejun; Li, Jiangang; Zhu, Lixin; Song, Yingjie

    2011-05-01

    Mechanical grinding of artificial diamond grinding wheels is the traditional wheel-dressing process, with the rotational speed and infeed depth of the tool wheel as the main process parameters. Suitable process parameters for high-accuracy crush dressing of metal-bonded and resin-bonded diamond grinding wheels were obtained through a series of experiments on a super-hard-material wheel-dressing grinding machine and through analysis of the grinding force. At the same time, the effects of machine sharpening and sprinkled-granule sharpening were compared. These analyses and experiments provide practical guidance for the accurate crush dressing of artificial diamond grinding wheels.

  4. Realization of Intelligent Measurement and Control System for Limb Rehabilitation Based on PLC and Touch Screen

    NASA Astrophysics Data System (ADS)

    Liu, Xiangquan

    Based on the treatment needs of patients with limb movement disorders and on a limb rehabilitation training prototype, the functions of the measurement and control system are analyzed and the system hardware and software are designed. The touch screen, adopted as the host computer and man-machine interface, is responsible for sending commands and displaying training information; the PLC, adopted as the slave computer, is responsible for receiving control commands from the touch screen, collecting sensor data, and regulating motor torque and speed through analog outputs according to the selected training mode, ultimately realizing both active and passive training for limb rehabilitation therapy.

  5. Drugs and Drug-Like Compounds: Discriminating Approved Pharmaceuticals from Screening-Library Compounds

    NASA Astrophysics Data System (ADS)

    Schierz, Amanda C.; King, Ross D.

    Compounds in drug screening-libraries should resemble pharmaceuticals. To operationally test this, we analysed the compounds in terms of known drug-like filters and developed a novel machine learning method to discriminate approved pharmaceuticals from “drug-like” compounds. This method uses both structural features and molecular properties for discrimination. The method has an estimated accuracy of 91% in discriminating between the Maybridge HitFinder library and approved pharmaceuticals, and 99% between the NATDiverse collection (from Analyticon Discovery) and approved pharmaceuticals. These results show that Lipinski’s Rule of 5 for oral absorption is not sufficient to describe “drug-likeness”, nor to serve as the main basis of screening-library design.
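
    As a point of reference for the Rule of 5 mentioned in the conclusion (the paper's own structural-feature machine learning method is not reproduced here), the sketch below applies Lipinski's four criteria with RDKit, assuming RDKit is installed; the aspirin SMILES string is only an example input.

```python
# Hedged illustration of a Lipinski Rule-of-5 filter using RDKit.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def passes_rule_of_five(smiles: str) -> bool:
    """Return True if the compound violates at most one of Lipinski's four criteria."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False
    violations = sum([
        Descriptors.MolWt(mol) > 500,
        Descriptors.MolLogP(mol) > 5,
        Lipinski.NumHDonors(mol) > 5,
        Lipinski.NumHAcceptors(mol) > 10,
    ])
    return violations <= 1

print(passes_rule_of_five("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin -> True
```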

  6. Flow Cytometry Enables Multiplexed Measurements of Genetically Encoded Intramolecular FRET Sensors Suitable for Screening.

    PubMed

    Doucette, Jaimee; Zhao, Ziyan; Geyer, Rory J; Barra, Melanie M; Balunas, Marcy J; Zweifach, Adam

    2016-07-01

    Genetically encoded sensors based on intramolecular FRET between CFP and YFP are used extensively in cell biology research. Flow cytometry has been shown to offer a means to measure CFP-YFP FRET; we suspected it would provide a unique way to conduct multiplexed measurements from cells expressing different FRET sensors, which is difficult to do with microscopy, and that this could be used for screening. We confirmed that flow cytometry accurately measures FRET signals using cells transiently transfected with an ERK activity reporter, comparing responses measured with imaging and cytometry. We created polyclonal long-term transfectant lines, each expressing a different intramolecular FRET sensor, and devised a way to bar-code four distinct populations of cells. We demonstrated the feasibility of multiplexed measurements and determined that robust multiplexed measurements can be conducted in plate format. To validate the suitability of the method for screening, we measured responses from a plate of bacterial extracts that in unrelated experiments we had determined contained the protein kinase C (PKC)-activating compound teleocidin A-1. The multiplexed assay correctly identified the teleocidin A-1-containing well. We propose that multiplexed cytometric FRET measurements will be useful for analyzing cellular function and for screening compound collections. © 2016 Society for Laboratory Automation and Screening.

  7. Applying NISHIJIN historical textile technique for e-Textile.

    PubMed

    Kuroda, Tomohiro; Hirano, Kikuo; Sugimura, Kazushige; Adachi, Satoshi; Igarashi, Hidetsugu; Ueshima, Kazuo; Nakamura, Hideo; Nambu, Masayuki; Doi, Takahiro

    2013-01-01

    The e-Textile is a key technology for continuous ambient health monitoring to increase the quality of life of patients with chronic diseases. The authors introduce techniques from NISHIJIN, a historical Japanese textile tradition that can render almost any pattern from one continuous yarn within a machine-weaving process suited to mixed-flow production. NISHIJIN is therefore well suited to e-Textile production, which requires rapid prototyping and mass production of very complicated patterns. The authors prototyped and evaluated several vests for taking a twelve-lead electrocardiogram. The results show that the prototypes obtain electrocardiograms of sufficient quality for diagnosis.

  8. Technology-assisted title and abstract screening for systematic reviews: a retrospective evaluation of the Abstrackr machine learning tool.

    PubMed

    Gates, Allison; Johnson, Cydney; Hartling, Lisa

    2018-03-12

    Machine learning tools can expedite systematic review (SR) processes by semi-automating citation screening. Abstrackr semi-automates citation screening by predicting relevant records. We evaluated its performance for four screening projects. We used a convenience sample of screening projects completed at the Alberta Research Centre for Health Evidence, Edmonton, Canada: three SRs and one descriptive analysis for which we had used SR screening methods. The projects were heterogeneous with respect to search yield (median 9328; range 5243 to 47,385 records; interquartile range (IQR) 15,688 records), topic (Antipsychotics, Bronchiolitis, Diabetes, Child Health SRs), and screening complexity. We uploaded the records to Abstrackr and screened until it made predictions about the relevance of the remaining records. Across three trials for each project, we compared the predictions to human reviewer decisions and calculated the sensitivity, specificity, precision, false negative rate, proportion missed, and workload savings. Abstrackr's sensitivity was > 0.75 for all projects and the mean specificity ranged from 0.69 to 0.90 with the exception of Child Health SRs, for which it was 0.19. The precision (proportion of records correctly predicted as relevant) varied by screening task (median 26.6%; range 14.8 to 64.7%; IQR 29.7%). The median false negative rate (proportion of records incorrectly predicted as irrelevant) was 12.6% (range 3.5 to 21.2%; IQR 12.3%). The workload savings were often large (median 67.2%, range 9.5 to 88.4%; IQR 23.9%). The proportion missed (proportion of records predicted as irrelevant that were included in the final report, out of the total number predicted as irrelevant) was 0.1% for all SRs and 6.4% for the descriptive analysis. This equated to 4.2% (range 0 to 12.2%; IQR 7.8%) of the records in the final reports. Abstrackr's reliability and the workload savings varied by screening task. Workload savings came at the expense of potentially missing relevant records. How this might affect the results and conclusions of SRs needs to be evaluated. Studies evaluating Abstrackr as the second reviewer in a pair would be of interest to determine if concerns for reliability would diminish. Further evaluations of Abstrackr's performance and usability will inform its refinement and practical utility.
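
    The performance figures reported above can be reproduced from any tool-versus-reviewer comparison. The sketch below computes them from hypothetical sets of record IDs and uses simplified definitions (for example, workload savings taken as the fraction of records the tool predicts to be irrelevant), which may differ in detail from the paper's.

```python
# Screening-performance metrics from a hypothetical prediction/reviewer comparison.
def screening_metrics(predicted_relevant, human_relevant, n_records):
    """Both arguments are sets of record IDs; n_records is the total search yield."""
    tp = len(predicted_relevant & human_relevant)
    fn = len(human_relevant - predicted_relevant)
    fp = len(predicted_relevant - human_relevant)
    tn = n_records - tp - fn - fp
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "precision": tp / (tp + fp),
        "false_negative_rate": fn / (fn + tp),
        # records predicted irrelevant would not need manual screening
        "workload_savings": (tn + fn) / n_records,
    }

print(screening_metrics({1, 2, 3, 5}, {1, 2, 3, 4}, n_records=100))
```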

  9. Video-Out Projection and Lecture Hall Set-Up. Microcomputing Working Paper Series.

    ERIC Educational Resources Information Center

    Gibson, Chris

    This paper details the considerations involved in determining suitable video projection systems for displaying the Apple Macintosh's screen to large groups of people, both in classrooms with approximately 25 people, and in lecture halls with approximately 250. To project the Mac screen to groups in lecture halls, the Electrohome EDP-57 video…

  10. Impact of polymer surface characteristics on the microrheological measurement quality of protein solutions - A tracer particle screening.

    PubMed

    Bauer, Katharina Christin; Schermeyer, Marie-Therese; Seidel, Jonathan; Hubbuch, Jürgen

    2016-05-30

    Microrheological measurements are suitable for identifying the rheological parameters of biopharmaceutical solutions. These give information not only about flow characteristics but also about the interactions and network structures in protein solutions. Microrheological measurements require tracer particles; because of their specific surface characteristics, not all particles yield reliable results in biopharmaceutical systems. In the present work, melamine, PMMA, polystyrene and surface-modified polystyrene were screened as tracer particles under various protein solution conditions. The surface characteristics of the screened tracer particles were evaluated by zeta potential measurements. Furthermore, each tracer particle was used to determine the dynamic viscosity of lysozyme solutions by microrheology and compared to a standard. The results indicate that the selection of the tracer particle had a strong impact on the quality of the microrheological measurement, depending on pH and additive type. Surface-modified polystyrene was the only tracer particle that yielded good microrheological results for all tested conditions. The study indicated that the electrostatic surface charge of the tracer particle had a lesser impact than its hydrophobicity; hydrophobicity is the crucial surface property that must be considered when selecting a suitable tracer particle to achieve high measurement accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Automatic Selection of Suitable Sentences for Language Learning Exercises

    ERIC Educational Resources Information Center

    Pilán, Ildikó; Volodina, Elena; Johansson, Richard

    2013-01-01

    In our study we investigated second and foreign language (L2) sentence readability, an area little explored so far in the case of several languages, including Swedish. The outcome of our research consists of two methods for sentence selection from native language corpora based on Natural Language Processing (NLP) and machine learning (ML)…

  12. An Open-Access Educational Tool for Teaching Motion Dynamics in Multi-Axis Servomotor Control

    ERIC Educational Resources Information Center

    Rivera-Guillen, J. R.; de Jesus Rangel-Magdaleno, J.; de Jesus Romero-Troncoso, R.; Osornio-Rios, R. A.; Guevara-Gonzalez, R. G.

    2012-01-01

    Servomotors are widely used in computerized numerically controlled (CNC) machines, hence motion control is a major topic covered in undergraduate/graduate engineering courses. Despite the fact that several syllabi include the motion dynamics topic in their courses, there are neither suitable tools available for designing and simulating multi-axis…

  13. A System to Enable the Blind to Work Independently on the Center Lathe.

    ERIC Educational Resources Information Center

    Guha, Sujoy K.; Anand, Sneh

    1980-01-01

    A study has shown that with suitable accessories to machines and appropriate work planning, totally blind machinists can perform varied tasks on a lathe independently. Based on the results of the study, simple accessories have been designed and tested for use with the center lathe. (Author/PHR)

  14. Geometric improvement of electrochemical discharge micro-drilling using an ultrasonic-vibrated electrolyte

    NASA Astrophysics Data System (ADS)

    Han, Min-Seop; Min, Byung-Kwon; Lee, Sang Jo

    2009-06-01

    Electrochemical discharge machining (ECDM) is a spark-based micromachining method especially suitable for the fabrication of various microstructures on nonconductive materials, such as glass and some engineering ceramics. However, since the spark discharge frequency is drastically reduced as the machining depth increases, ECDM microhole drilling has had difficulty achieving uniform geometry for machined holes. One of the primary reasons for this is the difficulty of sustaining an adequate electrolyte flow in the narrow gap between the tool and the workpiece, which results in a widened taper at the hole entrance, as well as a significant reduction of the machining depth. In this paper, ultrasonic electrolyte vibration was used to enhance the machining depth of the ECDM drilling process by assuring an adequate electrolyte flow, thus helping to maintain consistent spark generation. Moreover, the stability of the gas film formation, as well as the surface quality of the hole entrance, was improved with the aid of a side-insulated electrode and a pulse-power generator. The side-insulated electrode prevented stray electrolysis and concentrated the spark discharge at the tool tip, while the pulse voltage reduced thermal damage to the workpiece surface by introducing a periodic pulse-off time. Microholes were fabricated in order to investigate the effects of ultrasonic assistance on the overcut and machining depth of the holes. The experimental results demonstrated that the possibility of consistent spark generation and the machinability of microholes were simultaneously enhanced.

  15. 49 CFR Appendix A to Part 1511 - Aviation Security Infrastructure Fee

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... final acceptance testing. This includes such equipment as Metal Detection Devices, Hand Wands, X-ray... such equipment as Metal Detection Devices, Hand Wands, X-ray screening machines, Explosives Trace... as test objects and X-ray radiation surveys, electricity costs and maintenance contract costs...

  16. Virtual Environment Training: Auxiliary Machinery Room (AMR) Watchstation Trainer.

    ERIC Educational Resources Information Center

    Hriber, Dennis C.; And Others

    1993-01-01

    Describes a project implemented at Newport News Shipbuilding that used Virtual Environment Training to improve the performance of submarine crewmen. Highlights include development of the Auxiliary Machine Room (AMR) Watchstation Trainer; Digital Video Interactive (DVI); screen layout; test design and evaluation; user reactions; authoring language;…

  17. Laughing Bear.

    ERIC Educational Resources Information Center

    Seeds, Michael A.; Seeds, Kathryn Anne

    1983-01-01

    Provided is a complete listing (Applesoft Basic) for a children's spelling program. The listing includes a machine language music utility that plays short tunes and uses the Apple's two hi-res screens for animation. Also included is a program that allows pictures to be drawn and saved to animate other programs. (JN)

  18. Analysis of labor employment assessment on production machine to minimize time production

    NASA Astrophysics Data System (ADS)

    Hernawati, Tri; Suliawati; Sari Gumay, Vita

    2018-03-01

    Every company, whether in services or in manufacturing, strives to improve the efficiency of its resource use. One resource with an important role is labor, and each worker has a different level of efficiency for different jobs. Problems concerning the optimal allocation of workers with differing efficiencies to different jobs are called assignment problems, a special case of linear programming. In this research, an analysis of labor assignment to production machines at PT PDM is carried out using the Hungarian algorithm, with the aim of finding the optimal assignment of workers to production machines that minimizes production time. The results showed that the existing labor assignment is not optimal, because its completion time is longer than that of the assignment obtained with the Hungarian algorithm; applying the Hungarian algorithm yields a time saving of 16%.
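
    For reference, the assignment step described above can be reproduced with an off-the-shelf implementation of the Hungarian algorithm; the sketch below uses SciPy's linear_sum_assignment on an invented worker-by-machine time matrix rather than the PT PDM data.

```python
# Minimal Hungarian-algorithm assignment example (times are invented).
import numpy as np
from scipy.optimize import linear_sum_assignment

# time_matrix[i, j] = completion time (minutes) if worker i operates machine j
time_matrix = np.array([
    [14, 11, 16, 13],
    [12, 15, 13, 17],
    [16, 13, 12, 14],
    [11, 14, 15, 12],
])

workers, machines = linear_sum_assignment(time_matrix)  # minimizes total time
for w, m in zip(workers, machines):
    print(f"worker {w} -> machine {m} ({time_matrix[w, m]} min)")
print(f"total production time: {time_matrix[workers, machines].sum()} min")
```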

  19. Design and Implementation of a Hypothermic Machine Perfusion Device for Clinical Preservation of Isolated Organs

    PubMed Central

    Shen, Fei; Yan, Ruqiang

    2017-01-01

    The imbalance between limited organ supply and huge potential need has hindered the development of organ-graft techniques. In this paper a low-cost hypothermic machine perfusion (HMP) device is designed and implemented to maintain a suitable preservation environment and extend the survival of isolated organs. Four necessary elements of this HMP device (the machine perfusion, the physiological parameter monitoring, the thermostatic control and the oxygenation apparatus) are introduced. Within the thermostatic control process in particular, a modified Bayes estimation, which introduces the concept of an improvement factor, is implemented to recognize and reduce measurement errors resulting from sensor faults and noise interference. A fuzzy-PID controller also improves the control accuracy and reduces the computational load on the DSP. Our experiments indicate that the reliability of the instrument meets the design requirements, making it appealing for potential clinical preservation applications. PMID:28587173
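
    The thermostatic control is described only at a high level, so the sketch below shows a bare-bones discrete PID loop of the kind such a controller builds on; the paper's fuzzy gain scheduling and Bayes-based sensor filtering are omitted, and the gains, setpoint, and toy plant model are invented.

```python
# Bare-bones discrete PID temperature loop (illustrative only, not the device firmware).
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

controller = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=4.0)   # hypothermic 4 °C target
temperature = 20.0                                       # start at ambient
for _ in range(300):                                     # 300 one-second steps
    drive = controller.update(temperature, dt=1.0)       # negative drive = cooling
    temperature += 0.01 * drive - 0.02 * (temperature - 20.0)  # toy thermal plant
print(f"final perfusate temperature: {temperature:.2f} °C")
```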

  20. Research and development of Camellia oleifera fruit sheller and sorting machine

    NASA Astrophysics Data System (ADS)

    Kang, Di; Wang, Yong; Fan, Youhua; Chen, Zejun

    2018-01-01

    The Camellia oleifera fruit sheller described in this paper was designed on the principle of kneading and extruding. The machine uses a rolling classification sieve to screen Camellia oleifera fruit of different sizes into the husking device, where the fruit is shelled by the cooperative action of a transport belt and a flexible rubbing washboard. Tests showed that, provided the moisture content of the fruit was below 55%, the motor vibration frequency was 50 Hz and the horizontal angle of the sorting belt was 50–55 degrees, the processing capacity exceeded 900 kg/h, the threshing ratio exceeded 97%, the broken-seed ratio was below 5% and the loss ratio was below 1%. The machine is of great value in actual production and merits wide application.

  1. Research of a smart cutting tool based on MEMS strain gauge

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Zhao, Y. L.; Shao, YW; Hu, T. J.; Zhang, Q.; Ge, X. H.

    2018-03-01

    Cutting force is an important factor affecting machining accuracy, cutting vibration and tool wear, and machining condition monitoring by cutting force measurement is a key technology for intelligent manufacturing. Current cutting force sensors suffer from large volume, complex structure and poor compatibility in practical applications; to address these problems, a smart cutting tool for cutting force measurement is proposed in this paper. Commercial MEMS (Micro-Electro-Mechanical System) strain gauges with high sensitivity and small size are adopted as the transducing elements of the smart tool, and a structurally optimized cutting tool is fabricated for MEMS strain gauge bonding. Static calibration results show that the developed smart cutting tool is able to measure cutting forces in both the X and Y directions, with a cross-interference error within 3%. Its overall accuracy is 3.35% and 3.27% in the X and Y directions, and its sensitivity is 0.1 mV/N, which is very suitable for measuring the small cutting forces encountered in high-speed precision machining. The smart cutting tool is portable and reliable for practical application in CNC machine tools.

  2. Medical Image Data and Datasets in the Era of Machine Learning-Whitepaper from the 2016 C-MIMI Meeting Dataset Session.

    PubMed

    Kohli, Marc D; Summers, Ronald M; Geis, J Raymond

    2017-08-01

    At the first annual Conference on Machine Intelligence in Medical Imaging (C-MIMI), held in September 2016, a conference session on medical image data and datasets for machine learning identified multiple issues. The common theme from attendees was that everyone participating in medical image evaluation with machine learning is data starved. There is an urgent need to find better ways to collect, annotate, and reuse medical imaging data. Unique domain issues with medical image datasets require further study, development, and dissemination of best practices and standards, and a coordinated effort among medical imaging domain experts, medical imaging informaticists, government and industry data scientists, and interested commercial, academic, and government entities. High-level attributes of reusable medical image datasets suitable to train, test, validate, verify, and regulate ML products should be better described. NIH and other government agencies should promote and, where applicable, enforce access to medical image datasets. We should improve communication among medical imaging domain experts, medical imaging informaticists, academic clinical and basic science researchers, government and industry data scientists, and interested commercial entities.

  3. Modelling daily water temperature from air temperature for the Missouri River.

    PubMed

    Zhu, Senlin; Nyarko, Emmanuel Karlo; Hadzima-Nyarko, Marijana

    2018-01-01

    The bio-chemical and physical characteristics of a river are directly affected by water temperature, which thereby affects the overall health of aquatic ecosystems. It is a complex problem to accurately estimate water temperature. Modelling of river water temperature is usually based on a suitable mathematical model and field measurements of various atmospheric factors. In this article, the air-water temperature relationship of the Missouri River is investigated by developing three different machine learning models (Artificial Neural Network (ANN), Gaussian Process Regression (GPR), and Bootstrap Aggregated Decision Trees (BA-DT)). Standard models (linear regression, non-linear regression, and stochastic models) are also developed and compared to machine learning models. Analyzing the three standard models, the stochastic model clearly outperforms the standard linear model and nonlinear model. All three machine learning models have comparable results and outperform the stochastic model, with GPR having slightly better results for stations No. 2 and 3, while BA-DT has slightly better results for station No. 1. The machine learning models are very effective tools which can be used for the prediction of daily river temperature.
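
    As a sketch of the model comparison described above (with synthetic stand-in data rather than Missouri River records, and GPR omitted for brevity), the code below fits a linear baseline, a small neural network, and bagged decision trees to an air-temperature predictor and compares their test RMSE.

```python
# Hedged comparison of regression models for daily water temperature.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
air = rng.uniform(-5, 35, size=(1000, 1))                        # daily air temperature
water = 3.0 + 0.75 * air[:, 0] + rng.normal(0, 1.5, size=1000)   # synthetic water temp

X_tr, X_te, y_tr, y_te = train_test_split(air, water, random_state=0)
models = {
    "linear": LinearRegression(),
    "ANN": MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
    "BA-DT": BaggingRegressor(DecisionTreeRegressor(), n_estimators=50, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.2f} °C")
```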

  4. Implementing Journaling in a Linux Shared Disk File System

    NASA Technical Reports Server (NTRS)

    Preslan, Kenneth W.; Barry, Andrew; Brassow, Jonathan; Cattelan, Russell; Manthei, Adam; Nygaard, Erling; VanOort, Seth; Teigland, David; Tilstra, Mike; O'Keefe, Matthew; hide

    2000-01-01

    In computer systems today, speed and responsiveness are often determined by network and storage subsystem performance. Faster, more scalable networking interfaces like Fibre Channel and Gigabit Ethernet provide the scaffolding from which higher performance computer systems implementations may be constructed, but new thinking is required about how machines interact with network-enabled storage devices. In this paper we describe how we implemented journaling in the Global File System (GFS), a shared-disk, cluster file system for Linux. Our previous three papers on GFS at the Mass Storage Symposium discussed our first three GFS implementations, their performance, and the lessons learned. Our fourth paper describes, appropriately enough, the evolution of GFS version 3 to version 4, which supports journaling and recovery from client failures. In addition, GFS scalability tests extending to 8 machines accessing 8 4-disk enclosures were conducted: these tests showed good scaling. We describe the GFS cluster infrastructure, which is necessary for proper recovery from machine and disk failures in a collection of machines sharing disks using GFS. Finally, we discuss the suitability of Linux for handling the big data requirements of supercomputing centers.

  5. Quantum-assisted Helmholtz machines: A quantum–classical deep learning framework for industrial datasets in near-term devices

    NASA Astrophysics Data System (ADS)

    Benedetti, Marcello; Realpe-Gómez, John; Perdomo-Ortiz, Alejandro

    2018-07-01

    Machine learning has been presented as one of the key applications for near-term quantum technologies, given its high commercial value and wide range of applicability. In this work, we introduce the quantum-assisted Helmholtz machine: a hybrid quantum–classical framework with the potential of tackling high-dimensional real-world machine learning datasets on continuous variables. Instead of using quantum computers only to assist deep learning, as previous approaches have suggested, we use deep learning to extract a low-dimensional binary representation of data, suitable for processing on relatively small quantum computers. Then, the quantum hardware and deep learning architecture work together to train an unsupervised generative model. We demonstrate this concept using 1644 quantum bits of a D-Wave 2000Q quantum device to model a sub-sampled version of the MNIST handwritten digit dataset with 16 × 16 continuous valued pixels. Although we illustrate this concept on a quantum annealer, adaptations to other quantum platforms, such as ion-trap technologies or superconducting gate-model architectures, could be explored within this flexible framework.

  6. [Algorithms, machine intelligence, big data : general considerations].

    PubMed

    Radermacher, F J

    2015-08-01

    We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency of elementary arithmetic operations increase a thousand-fold every 20 years. Although machines have not yet become as "intelligent" as people in the sense of the singularity, they are becoming steadily better. The Internet of Things has again helped to massively increase the efficiency of machines, and big data and suitable analytics do the same. If we let these processes simply continue, our civilization may be endangered in many respects. If the "containment" of these processes succeeds in the context of reasonable global political governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges.

  7. Early experiences in developing and managing the neuroscience gateway.

    PubMed

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T

    2015-02-01

    The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on machines located at national supercomputer centers, dealing with their complex user interfaces, and handling data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use this for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway.

  8. A hybrid analytical model for open-circuit field calculation of multilayer interior permanent magnet machines

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Xia, Changliang; Yan, Yan; Geng, Qiang; Shi, Tingna

    2017-08-01

    Due to the complicated rotor structure and nonlinear saturation of rotor bridges, it is difficult to build a fast and accurate analytical field calculation model for multilayer interior permanent magnet (IPM) machines. In this paper, a hybrid analytical model suitable for the open-circuit field calculation of multilayer IPM machines is proposed by coupling the magnetic equivalent circuit (MEC) method and the subdomain technique. In the proposed analytical model, the rotor magnetic field is calculated by the MEC method based on the Kirchhoff's law, while the field in the stator slot, slot opening and air-gap is calculated by subdomain technique based on the Maxwell's equation. To solve the whole field distribution of the multilayer IPM machines, the coupled boundary conditions on the rotor surface are deduced for the coupling of the rotor MEC and the analytical field distribution of the stator slot, slot opening and air-gap. The hybrid analytical model can be used to calculate the open-circuit air-gap field distribution, back electromotive force (EMF) and cogging torque of multilayer IPM machines. Compared with finite element analysis (FEA), it has the advantages of faster modeling, less computation source occupying and shorter time consuming, and meanwhile achieves the approximate accuracy. The analytical model is helpful and applicable for the open-circuit field calculation of multilayer IPM machines with any size and pole/slot number combination.

  9. Biofilm-forming bacteria with varying tolerance to peracetic acid from a paper machine.

    PubMed

    Rasimus, Stiina; Kolari, Marko; Rita, Hannu; Hoornstra, Douwe; Salkinoja-Salonen, Mirja

    2011-09-01

    Biofilms cause runnability problems in paper machines and are therefore controlled with biocides. Peracetic acid is usually effective in preventing bulky biofilms. This study investigated the microbiological status of a paper machine where low concentrations (≤ 15 ppm active ingredient) of peracetic acid had been used for several years. The paper machine contained a low amount of biofilms. Biofilm-forming bacteria from this environment were isolated and characterized by 16S rRNA gene sequencing, whole-cell fatty acid analysis, biochemical tests, and DNA fingerprinting. Seventy-five percent of the isolates were identified as members of the subclades Sphingomonas trueperi and S. aquatilis, and the others as species of the genera Burkholderia (B. cepacia complex), Methylobacterium, and Rhizobium. Although the isolation media were suitable for the common paper machine biofoulers Deinococcus, Meiothermus, and Pseudoxanthomonas, none of these were found, indicating that peracetic acid had prevented their growth. Spontaneous, irreversible loss of the ability to form biofilm was observed during subculturing of certain isolates of the subclade S. trueperi. The Sphingomonas isolates formed monoculture biofilms that tolerated peracetic acid at concentrations (10 ppm active ingredient) used for antifouling in paper machines. High pH and low conductivity of the process waters favored the peracetic acid tolerance of Sphingomonas sp. biofilms. This appears to be the first report on sphingomonads as biofilm formers in warm-water-using industries.

  10. A comparative study of surface EMG classification by fuzzy relevance vector machine and fuzzy support vector machine.

    PubMed

    Xie, Hong-Bo; Huang, Hu; Wu, Jianhua; Liu, Lei

    2015-02-01

    We present a multiclass fuzzy relevance vector machine (FRVM) learning mechanism and evaluate its performance in classifying multiple hand motions using surface electromyographic (sEMG) signals. The relevance vector machine (RVM) is a sparse Bayesian kernel method which avoids some limitations of the support vector machine (SVM). However, RVM still suffers from the difficulty of possible unclassifiable regions in multiclass problems. We propose two fuzzy membership function-based FRVM algorithms to solve such problems, based on experiments conducted on seven healthy subjects and two amputees with six hand motions. Two feature sets, namely, AR model coefficients and root mean square value (AR-RMS), and wavelet transform (WT) features, are extracted from the recorded sEMG signals. Fuzzy support vector machine (FSVM) analysis was also conducted for wide comparison in terms of accuracy, sparsity, training and testing time, as well as the effect of training sample sizes. FRVM yielded comparable classification accuracy with dramatically fewer support vectors in comparison with FSVM. Furthermore, the processing delay of FRVM was much less than that of FSVM, whilst the training time of FSVM was much shorter than that of FRVM. The results indicate that an FRVM classifier trained with sufficient samples can achieve generalization capability comparable to FSVM, with significant sparsity, in multi-channel sEMG classification, which makes it more suitable for sEMG-based real-time control applications.
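
    The feature-extraction step named above (AR model coefficients plus the RMS value per analysis window) can be sketched as follows; the least-squares AR fit and the synthetic signal are simplifications for illustration, not the authors' processing pipeline.

```python
# Simplified per-window sEMG feature extraction: RMS plus AR coefficients.
import numpy as np

def window_features(window: np.ndarray, ar_order: int = 4) -> np.ndarray:
    """Return [RMS, AR coefficients...] for one sEMG analysis window."""
    rms = np.sqrt(np.mean(window ** 2))
    # least-squares AR fit: predict x[t] from the previous `ar_order` samples
    rows = np.array([window[i:i + ar_order] for i in range(len(window) - ar_order)])
    targets = window[ar_order:]
    ar_coeffs, *_ = np.linalg.lstsq(rows, targets, rcond=None)
    return np.concatenate(([rms], ar_coeffs))

rng = np.random.default_rng(5)
signal = rng.normal(size=2000)                    # placeholder single-channel sEMG
windows = signal.reshape(10, 200)                 # ten non-overlapping windows
features = np.vstack([window_features(w) for w in windows])
print(features.shape)                             # (10, 1 + AR order)
```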

  11. Early experiences in developing and managing the neuroscience gateway

    PubMed Central

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas. T.

    2015-01-01

    The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on machines located at national supercomputer centers, dealing with their complex user interfaces, and handling data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use this for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway. PMID:26523124

  12. Application of target costing in machining

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Bhaskaran; Kokatnur, Ameet; Gupta, Deepak P.

    2004-11-01

    In today's intensely competitive and highly volatile business environment, consistent development of low cost and high quality products meeting the functionality requirements is a key to a company's survival. Companies continuously strive to reduce costs while still producing quality products to stay ahead of the competition. Many companies have turned to target costing to achieve this objective. Target costing is a structured approach to determine the cost at which a proposed product, meeting the quality and functionality requirements, must be produced in order to generate the desired profits. It subtracts the desired profit margin from the company's selling price to establish the manufacturing cost of the product. An extensive literature review revealed that companies in the automotive, electronic and process industries have reaped the benefits of target costing. However, the target costing approach has not been applied in the machining industry; instead, other techniques based on Geometric Programming, Goal Programming, and Lagrange Multipliers have been proposed for application in this industry. These models follow a forward approach, first selecting a set of machining parameters and then determining the machining cost. Hence, in this study we have developed an algorithm to apply the concepts of target costing, which is a backward approach that selects the machining parameters based on the required machining costs and is therefore more suitable for practical applications in process improvement and cost reduction. A target costing model was developed for the turning operation and was successfully validated using practical data.
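
    The backward relationship at the heart of target costing is simple enough to show directly; the toy numbers below are invented and merely illustrate how a target manufacturing cost and the required cost reduction would be derived before any machining parameters are selected.

```python
# Toy backward target-costing calculation (all figures invented).
selling_price = 120.00     # market-driven price per machined part
desired_margin = 0.25      # desired profit as a fraction of the selling price

target_cost = selling_price * (1 - desired_margin)   # cost the process must reach
current_cost = 98.50       # estimated cost with current machining parameters
required_reduction = current_cost - target_cost

print(f"target manufacturing cost: ${target_cost:.2f}")
print(f"required cost reduction:   ${required_reduction:.2f}")
```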

  13. A Machine Learning Application Based in Random Forest for Integrating Mass Spectrometry-Based Metabolomic Data: A Simple Screening Method for Patients With Zika Virus

    PubMed Central

    Melo, Carlos Fernando Odir Rodrigues; Navarro, Luiz Claudio; de Oliveira, Diogo Noin; Guerreiro, Tatiane Melina; Lima, Estela de Oliveira; Delafiori, Jeany; Dabaja, Mohamed Ziad; Ribeiro, Marta da Silva; de Menezes, Maico; Rodrigues, Rafael Gustavo Martins; Morishita, Karen Noda; Esteves, Cibele Zanardi; de Amorim, Aline Lopes Lucas; Aoyagui, Caroline Tiemi; Parise, Pierina Lorencini; Milanez, Guilherme Paier; do Nascimento, Gabriela Mansano; Ribas Freitas, André Ricardo; Angerami, Rodrigo; Costa, Fábio Trindade Maranhão; Arns, Clarice Weis; Resende, Mariangela Ribeiro; Amaral, Eliana; Junior, Renato Passini; Ribeiro-do-Valle, Carolina C.; Milanez, Helaine; Moretti, Maria Luiza; Proenca-Modena, Jose Luiz; Avila, Sandra; Rocha, Anderson; Catharino, Rodrigo Ramos

    2018-01-01

    Recent Zika outbreaks in South America, accompanied by unexpectedly severe clinical complications, have brought much interest in fast and reliable screening methods for ZIKV (Zika virus) identification. Reverse-transcriptase polymerase chain reaction (RT-PCR) is currently the method of choice to detect ZIKV in biological samples. This approach, nonetheless, demands a considerable amount of time and resources such as kits and reagents that, in endemic areas, may place a substantial financial burden on affected individuals and health services, steering them away from RT-PCR analysis. This study presents a powerful combination of high-resolution mass spectrometry and a machine-learning prediction model for data analysis to assess the existence of ZIKV infection across a series of patients who present similar symptoms but are not necessarily infected with the disease. By feeding mass spectrometric data into the developed decision-making algorithm, we were able to provide a set of features that work as a “fingerprint” for this specific pathophysiological condition, even after the acute phase of infection. Since both mass spectrometry and machine learning approaches are well-established and widely used tools within their respective fields, this combination of methods emerges as a distinct alternative for clinical applications, providing a diagnostic screening that is faster and more accurate, with improved cost-effectiveness when compared to existing technologies. PMID:29696139

  14. Irradiate-anneal screening of total dose effects in semiconductor devices

    NASA Technical Reports Server (NTRS)

    Stanley, A. G.; Price, W. E.

    1976-01-01

    Judicious choice of radiation dose and parameter change acceptance criteria, absence of anomalous anneal phenomena, and absence of anomalous reirradiation effects are recognized as essential for a successful irradiation-anneal (IRAN) screening procedure to ensure that no device will fall, upon reirradiation, above parametric limits assigned for the worst case application. Reirradiation and irradiation-anneal behavior of various semiconductor devices are compared and those that do not lend themselves to IRAN screening are singled out. Information needed to judge the suitability of an IRAN type screening program is detailed. Reasons for success of the limited IRAN screening of flight parts for the Mariner Jupiter/Saturn (MJS '77) spacecraft are indicated.

  15. Machine Learning for Social Services: A Study of Prenatal Case Management in Illinois.

    PubMed

    Pan, Ian; Nolan, Laura B; Brown, Rashida R; Khan, Romana; van der Boor, Paul; Harris, Daniel G; Ghani, Rayid

    2017-06-01

    To evaluate the positive predictive value of machine learning algorithms for early assessment of adverse birth risk among pregnant women as a means of improving the allocation of social services. We used administrative data for 6457 women collected by the Illinois Department of Human Services from July 2014 to May 2015 to develop a machine learning model for adverse birth prediction and improve upon the existing paper-based risk assessment. We compared different models and determined the strongest predictors of adverse birth outcomes using positive predictive value as the metric for selection. Machine learning algorithms performed similarly, outperforming the current paper-based risk assessment by up to 36%; a refined paper-based assessment outperformed the current assessment by up to 22%. We estimate that these improvements will allow 100 to 170 additional high-risk pregnant women screened for program eligibility each year to receive services that would have otherwise been unobtainable. Our analysis exhibits the potential for machine learning to move government agencies toward a more data-informed approach to evaluating risk and providing social services. Overall, such efforts will improve the efficiency of allocating resource-intensive interventions.

  16. Autism detection in early childhood (ADEC): reliability and validity data for a Level 2 screening tool for autistic disorder.

    PubMed

    Nah, Yong-Hwee; Young, Robyn L; Brewer, Neil; Berlingeri, Genna

    2014-03-01

    The Autism Detection in Early Childhood (ADEC; Young, 2007) was developed as a Level 2 clinician-administered autistic disorder (AD) screening tool that was time-efficient, suitable for children under 3 years, easy to administer, and suitable for persons with minimal training and experience with AD. A best estimate clinical Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) diagnosis of AD was made for 70 children using all available information and assessment results, except for the ADEC data. A screening study compared these children on the ADEC with 57 children with other developmental disorders and 64 typically developing children. Results indicated high internal consistency (α = .91). Interrater reliability and test-retest reliability of the ADEC were also adequate. ADEC scores reliably discriminated different diagnostic groups after controlling for nonverbal IQ and Vineland Adaptive Behavior Composite scores. Construct validity (using exploratory factor analysis) and concurrent validity using performance on the Autism Diagnostic Observation Schedule (Lord et al., 2000), the Autism Diagnostic Interview-Revised (Le Couteur, Lord, & Rutter, 2003), and DSM-IV-TR criteria were also demonstrated. Signal detection analysis identified the optimal ADEC cutoff score, with the ADEC identifying all children who had an AD (N = 70, sensitivity = 1.0) but overincluding children with other disabilities (N = 13, specificity ranging from .74 to .90). Together, the reliability and validity data indicate that the ADEC has potential to be established as a suitable and efficient screening tool for infants with AD. 2014 APA

  17. The classification of normal screening mammograms

    NASA Astrophysics Data System (ADS)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using common lexicon describing normal appearances. Cases were also assessed on their suitability for a single reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 "low", 10 "medium" and nine "high" difficulties. Data was analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for single reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of cases for the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having `dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was an inverse moderate association between the difficulty of the cases and the recommendations for single reading.

  18. Grid heterogeneity in in-silico experiments: an exploration of drug screening using DOCK on cloud environments.

    PubMed

    Yim, Wen-Wai; Chien, Shu; Kusumoto, Yasuyuki; Date, Susumu; Haga, Jason

    2010-01-01

    Large-scale in-silico screening is a necessary part of drug discovery and Grid computing is one answer to this demand. A disadvantage of using Grid computing is the heterogeneous computational environments characteristic of a Grid. In our study, we have found that for the molecular docking simulation program DOCK, different clusters within a Grid organization can yield inconsistent results. Because DOCK in-silico virtual screening (VS) is currently used to help select chemical compounds to test with in-vitro experiments, such differences have little effect on the validity of using virtual screening before subsequent steps in the drug discovery process. However, it is difficult to predict whether the accumulation of these discrepancies over sequentially repeated VS experiments will significantly alter the results if VS is used as the primary means for identifying potential drugs. Moreover, such discrepancies may be unacceptable for other applications requiring more stringent thresholds. This highlights the need for establishing a more complete solution to provide the best scientific accuracy when executing an application across Grids. One possible solution to platform heterogeneity in DOCK performance explored in our study involved the use of virtual machines as a layer of abstraction. This study investigated the feasibility and practicality of using virtual machine and recent cloud computing technologies in a biological research application. We examined the differences and variations of DOCK VS variables, across a Grid environment composed of different clusters, with and without virtualization. The uniform computer environment provided by virtual machines eliminated inconsistent DOCK VS results caused by heterogeneous clusters; however, the execution time for the DOCK VS increased. In our particular experiments, overhead costs were found to be an average of 41% and 2% in execution time for two different clusters, while the actual magnitudes of the execution time costs were minimal. Despite the increase in overhead, virtual clusters are an ideal solution for Grid heterogeneity. With greater development of virtual cluster technology in Grid environments, the problem of platform heterogeneity may be eliminated through virtualization, allowing greater usage of VS and benefiting all Grid applications in general.

  19. Applying a machine learning model using a locally preserving projection based feature regeneration algorithm to predict breast cancer risk

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qian, Wei; Zheng, Bin

    2018-03-01

    Both conventional and deep machine learning have been used to develop decision-support tools applied in medical imaging informatics. In order to take advantage of both conventional and deep learning approaches, this study aims to investigate the feasibility of applying a locally preserving projection (LPP) based feature regeneration algorithm to build a new machine learning classifier model to predict short-term breast cancer risk. First, a computer-aided image processing scheme was used to segment and quantify breast fibro-glandular tissue volume. Next, 44 initially computed image features related to the bilateral mammographic tissue density asymmetry were extracted. Then, an LPP-based feature combination method was applied to regenerate a new operational feature vector using a maximal variance approach. Last, a k-nearest neighborhood (KNN) algorithm based machine learning classifier using the LPP-generated new feature vectors was developed to predict breast cancer risk. A testing dataset involving negative mammograms acquired from 500 women was used. Among them, 250 were positive and 250 remained negative in the next subsequent mammography screening. Applied to this dataset, the LPP-generated feature vector reduced the number of features from 44 to 4. Using a leave-one-case-out validation method, the area under the ROC curve produced by the KNN classifier significantly increased from 0.62 to 0.68 (p < 0.05) and the odds ratio was 4.60 with a 95% confidence interval of [3.16, 6.70]. The study demonstrated that this new LPP-based feature regeneration approach produced an optimal feature vector and yielded improved performance in assisting to predict the risk of women having breast cancer detected in the next subsequent mammography screening.
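
    The pipeline described above (dimensionality reduction to four features followed by a KNN classifier under leave-one-out validation) can be sketched as follows. This is only an illustrative outline using synthetic data, with PCA standing in for LPP because scikit-learn ships no locality preserving projection, so it is not the authors' implementation.

        # Minimal sketch: reduce 44 density-asymmetry features to 4 components,
        # then classify with KNN under leave-one-out validation.
        # PCA stands in for LPP; the data are synthetic placeholders.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 44))      # 500 cases x 44 image features
        y = np.repeat([0, 1], 250)          # 250 negative, 250 positive follow-ups

        pipe = make_pipeline(PCA(n_components=4), KNeighborsClassifier(n_neighbors=15))
        scores = cross_val_score(pipe, X, y, cv=LeaveOneOut())
        print("leave-one-out accuracy:", scores.mean())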

  20. Influence of grid bar shape on field cleaner performance - Screening tests

    USDA-ARS?s Scientific Manuscript database

    Extractor type cleaners are used on cotton strippers and in the seed cotton cleaning machinery in the ginning process to remove large foreign material such as burrs and sticks. Previous research on the development of extractor type cleaners focused on machine design and operating parameters that max...

  1. Drill Press Work Sample.

    ERIC Educational Resources Information Center

    Shawsheen Valley Regional Vocational-Technical High School, Billerica, MA.

    This manual contains a work sample intended to assess a handicapped student's interest in, and to screen interested students into, a training program in basic machine shop I. (The course is based on the entry level of the drill press operator.) Section 1 describes the assessment, correlates the work performed and worker traits required for…

  2. A Comparison of Machine Learning Algorithms for Chemical Toxicity Classification Using a Simulated Multi-Scale Data Model

    EPA Science Inventory

    Bioactivity profiling using high-throughput in vitro assays can reduce the cost and time required for toxicological screening of environmental chemicals and can also reduce the need for animal testing. Several public efforts are aimed at discovering patterns or classifiers in hig...

  3. Facile high-throughput forward chemical genetic screening by in situ monitoring of glucuronidase-based reporter gene expression in Arabidopsis thaliana

    PubMed Central

    Halder, Vivek; Kombrink, Erich

    2015-01-01

    The use of biologically active small molecules to perturb biological functions holds enormous potential for investigating complex signaling networks. However, in contrast to animal systems, the search for and application of chemical tools for basic discovery in the plant sciences, generally referred to as “chemical genetics,” has only recently gained momentum. In addition to cultured cells, the well-characterized, small-sized model plant Arabidopsis thaliana is suitable for cultivation in microplates, which allows employing diverse cell- or phenotype-based chemical screens. In such screens, a chemical's bioactivity is typically assessed either through scoring its impact on morphological traits or quantifying molecular attributes such as enzyme or reporter activities. Here, we describe a facile forward chemical screening methodology for intact Arabidopsis seedlings harboring the β-glucuronidase (GUS) reporter by directly quantifying GUS activity in situ with 4-methylumbelliferyl-β-D-glucuronide (4-MUG) as substrate. The quantitative nature of this screening assay has an obvious advantage over the also convenient histochemical GUS staining method, as it allows application of statistical procedures and unbiased hit selection based on threshold values as well as distinction between compounds with strong or weak bioactivity. At the same time, the in situ bioassay is very convenient, requiring less effort and time for sample handling in comparison to the conventional quantitative in vitro GUS assay using 4-MUG, as validated with several Arabidopsis lines harboring different GUS reporter constructs. To demonstrate that the developed assay is particularly suitable for large-scale screening projects, we performed a pilot screen for chemical activators or inhibitors of salicylic acid-mediated defense signaling using the Arabidopsis PR1p::GUS line. Importantly, the screening methodology provided here can be adopted for any inducible GUS reporter line. PMID:25688251

  4. Experience of domestic violence routine screening in Family Planning NSW clinics.

    PubMed

    Hunter, Tara; Botfield, Jessica R; Estoesta, Jane; Markham, Pippa; Robertson, Sarah; McGeechan, Kevin

    2017-04-01

    This study reviewed implementation of the Domestic Violence Routine Screening (DVRS) program at Family Planning NSW and outcomes of screening to determine the feasibility of routine screening in a family planning setting and the suitability of this program in the context of women's reproductive and sexual health. A retrospective review of medical records was undertaken of eligible women attending Family Planning NSW clinics between 1 January and 31 December 2015. Modified Poisson regression was used to estimate prevalence ratios and assess association between binary outcomes and client characteristics. Of 13440 eligible women, 5491 were screened (41%). Number of visits, clinic attended, age, employment status and disability were associated with completion of screening. In all, 220 women (4.0%) disclosed domestic violence. Factors associated with disclosure were clinic attended, age group, region of birth, employment status, education and disability. Women who disclosed domestic violence were more likely to have discussed issues related to sexually transmissible infections in their consultation. All women who disclosed were assessed for any safety concerns and offered a range of suitable referral options. Although routine screening may not be appropriate in all health settings, given associations between domestic violence and sexual and reproductive health, a DVRS program is considered appropriate in sexual and reproductive health clinics and appears to be feasible in a service such as Family Planning NSW. Consistent implementation of the program should continue at Family Planning NSW and be expanded to other family planning services in Australia to support identification and early intervention for women affected by domestic violence.

  5. Exploring the optimal integration levels between SAR and optical data for better urban land cover mapping in the Pearl River Delta

    NASA Astrophysics Data System (ADS)

    Zhang, Hongsheng; Xu, Ru

    2018-02-01

    Integrating synthetic aperture radar (SAR) and optical data to improve urban land cover classification has been identified as a promising approach. However, which integration level is the most suitable remains unclear but important to many researchers and engineers. This study aimed to compare different integration levels to provide a scientific reference for a wide range of studies using optical and SAR data. SAR data from TerraSAR-X and ENVISAT ASAR in both WSM and IMP modes were combined with optical data at the pixel, feature and decision levels using four typical machine learning methods. The experimental results indicated that: 1) the feature level, which used both the original images and extracted features, achieved a significant improvement of up to 10% compared to using optical data alone; 2) different levels of fusion required different suitable methods depending on the data distribution and data resolution. For instance, the support vector machine was the most stable at both the feature and decision levels, while random forest was suitable at the pixel level but not at the decision level; 3) by examining the distribution of SAR features, some features (e.g., homogeneity) exhibited a close-to-normal distribution, explaining the improvement from the maximum likelihood method at the feature and decision levels. This indicated the benefits of using texture features from SAR data when combined with optical data for land cover classification. Additionally, the research also showed that combining optical and SAR data does not guarantee improvement compared with using a single data source for urban land cover classification; this depends on the selection of appropriate fusion levels and fusion methods.
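
    As a rough illustration of the feature-level integration described above, the sketch below stacks optical band values with SAR-derived texture features into a single feature matrix and trains two of the classifier families mentioned (SVM and random forest). The arrays are synthetic placeholders rather than TerraSAR-X/ENVISAT ASAR data, so this shows only the mechanics of the fusion step, not the study's actual experiment.

        # Feature-level fusion sketch: concatenate optical bands and SAR texture
        # features per pixel/object, then train standard classifiers.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n = 1000
        optical = rng.normal(size=(n, 4))      # e.g., four optical bands
        sar_texture = rng.normal(size=(n, 3))  # e.g., homogeneity, contrast, entropy
        X_fused = np.hstack([optical, sar_texture])   # the feature-level integration
        y = rng.integers(0, 5, size=n)         # five urban land cover classes (toy)

        for clf in (SVC(kernel="rbf"), RandomForestClassifier(n_estimators=200)):
            acc = cross_val_score(clf, X_fused, y, cv=5).mean()
            print(type(clf).__name__, "fused-feature accuracy:", round(acc, 3))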

  6. An eye movement study for identification of suitable font characters for presentation on a computer screen.

    PubMed

    Banerjee, Jayeeta; Majumdar, Dhurjati; Majumdar, Deepti; Pal, Madhu Sudan

    2010-06-01

    We are experiencing a shift of media: from the printed paper to the computer screen. This transition is modifying the process of how we read and understand a text. It is very difficult to draw conclusions on the suitability of font characters based upon subjective evaluation methods only. The present study evaluates the effect of font type on human cognitive workload during perception of individual alphabets on a computer screen. Twenty-six young subjects volunteered for this study. Subjects were shown individual characters of different font types while their eye movements were recorded. A binocular eye movement recorder was used for eye movement recording. The results showed that different eye movement parameters, such as pupil diameter, number of fixations and fixation duration, were lower for the font type Verdana. The present study recommends the use of the font type Verdana for presentation of individual alphabets on various electronic displays in order to reduce cognitive workload.

  7. Evaluation of Standardized Instruments for Use in Universal Screening of Very Early School-Age Children: Suitability, Technical Adequacy, and Usability

    ERIC Educational Resources Information Center

    Miles, Sandra; Fulbrook, Paul; Mainwaring-Mägi, Debra

    2018-01-01

    Universal screening of very early school-age children (age 4-7 years) is important for early identification of learning problems that may require enhanced learning opportunity. In this context, use of standardized instruments is critical to obtain valid, reliable, and comparable assessment outcomes. A wide variety of standardized instruments is…

  8. Screening for Dyslexia in French-Speaking University Students: An Evaluation of the Detection Accuracy of the "Alouette" Test

    ERIC Educational Resources Information Center

    Cavalli, Eddy; Colé, Pascale; Leloup, Gilles; Poracchia-George, Florence; Sprenger-Charolles, Liliane; El Ahmadi, Abdessadek

    2018-01-01

    Developmental dyslexia is a lifelong impairment affecting 5% to 10% of the population. In French-speaking countries, although a number of standardized tests for dyslexia in children are available, tools suitable to screen for dyslexia in adults are lacking. In this study, we administered the "Alouette" reading test to a normative sample…

  9. Screening for Autism in Iranian Preschoolers: Contrasting M-CHAT and a Scale Developed in Iran

    ERIC Educational Resources Information Center

    Samadi, Sayyed Ali; McConkey, Roy

    2015-01-01

    Suitable screening instruments for the early diagnosis of autism are not readily available for use with preschoolers in non-Western countries. This study evaluated two tools: M-CHAT which is widely used internationally and one developed in Iran called Hiva. A population sample was recruited of nearly 3000 preschoolers in one Iranian city. Parents…

  10. Copper homeostasis gene discovery in Drosophila melanogaster.

    PubMed

    Norgate, Melanie; Southon, Adam; Zou, Sige; Zhan, Ming; Sun, Yu; Batterham, Phil; Camakaris, James

    2007-06-01

    Recent studies have shown a high level of conservation between Drosophila melanogaster and mammalian copper homeostasis mechanisms. These studies have also demonstrated the efficiency with which this species can be used to characterize novel genes, at both the cellular and whole organism level. As a versatile and inexpensive model organism, Drosophila is also particularly useful for gene discovery applications and thus has the potential to be extremely useful in identifying novel copper homeostasis genes and putative disease genes. In order to assess the suitability of Drosophila for this purpose, three screening approaches have been investigated. These include an analysis of the global transcriptional response to copper in both adult flies and an embryonic cell line using DNA microarray analysis. Two mutagenesis-based screens were also utilized. Several candidate copper homeostasis genes have been identified through this work. In addition, the results of each screen were carefully analyzed to identify any factors influencing efficiency and sensitivity. These are discussed here with the aim of maximizing the efficiency of future screens and the most suitable approaches are outlined. Building on this information, there is great potential for the further use of Drosophila for copper homeostasis gene discovery.

  11. Behavioral phenotyping of mice in pharmacological and toxicological research.

    PubMed

    Karl, Tim; Pabst, Reinhard; von Hörsten, Stephan

    2003-07-01

    The evaluation of behavioral effects is an important component for the in vivo screening of drugs or potentially toxic compounds in mice. Ideally, such screening should be composed of monitoring general health, sensory functions, and motor abilities, right before specific behavioral domains are tested. A rational strategy in the design and procedure of testing as well as an effective composition of different well-established and reproducible behavioral tests can minimize the risk of false positive and false negative results in drug screening. In the present review we describe such basic considerations in planning experiments, selecting strains of mice, and propose groups of behavioral tasks suitable for a reliable detection of differences in specific behavioral domains in mice. Screening of general health and neurophysiologic functions (reflexes, sensory abilities) and motor function (pole test, wire hang test, beam walking, rotarod, accelerod, and footprint) as well as specific hypothesis-guided testing in the behavioral domains of learning and memory (water maze, radial maze, conditioned fear, and avoidance tasks), emotionality (open field, hole board, elevated plus maze, and object exploration), nociception (tail flick, hot plate), psychiatric-like conditions (Porsolt swim test, acoustic startle response, and prepulse inhibition), and aggression (isolation-induced aggression, spontaneous aggression, and territorial aggression) are described in further detail. This review is designed to describe a general approach, which increases reliability of behavioral screening. Furthermore, it provides an overview on a selection of specific procedures suitable for but not limited to behavioral screening in pharmacology and toxicology.

  12. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias

    With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by first layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. This developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated to provide 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that compared to the single-algorithm models, the developed multi-model framework with the deep feature selection procedure has improved the forecasting accuracy by up to 30%.
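
    A minimal sketch of such a two-layer ensemble is given below: several first-layer regressors produce individual forecasts and a second-layer blender combines them. Scikit-learn's StackingRegressor is used as a convenient stand-in for the paper's blending algorithm; the features, data and model choices are illustrative assumptions, not the authors' configuration.

        # Two-layer ensemble sketch: first-layer models -> blending regressor.
        import numpy as np
        from sklearn.ensemble import (RandomForestRegressor, GradientBoostingRegressor,
                                      StackingRegressor)
        from sklearn.linear_model import Ridge
        from sklearn.svm import SVR
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(2)
        X = rng.normal(size=(2000, 6))    # e.g., lagged speeds, direction, pressure
        y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=2000)   # synthetic target

        first_layer = [("rf", RandomForestRegressor(n_estimators=100)),
                       ("gbm", GradientBoostingRegressor()),
                       ("svr", SVR())]
        blender = StackingRegressor(estimators=first_layer, final_estimator=Ridge())

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        blender.fit(X_tr, y_tr)
        print("MAE:", round(mean_absolute_error(y_te, blender.predict(X_te)), 3))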

  13. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    PubMed

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated with a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
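
    For readers unfamiliar with COPRAS, the sketch below shows the crisp (non-fuzzy) ranking step with a given weight vector. The decision matrix, weights and benefit/cost split are invented for illustration, and the paper's actual method uses fuzzy numbers and AHP-derived weights rather than these fixed values.

        # Crisp COPRAS ranking sketch: weighted normalised sums over benefit and
        # cost criteria, then relative significance Q and utility degree.
        import numpy as np

        # rows: candidate machine tools; columns: criteria
        X = np.array([[4000.0, 12.0, 0.8, 7.0],
                      [5200.0, 15.0, 0.6, 9.0],
                      [4700.0, 10.0, 0.7, 8.0]])
        weights = np.array([0.35, 0.25, 0.15, 0.25])    # e.g., from (fuzzy) AHP
        benefit = np.array([False, True, False, True])  # cost criteria: price, error

        D = weights * X / X.sum(axis=0)                 # weighted normalised matrix
        S_plus = D[:, benefit].sum(axis=1)              # benefit-criteria sums
        S_minus = D[:, ~benefit].sum(axis=1)            # cost-criteria sums
        Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
        utility = 100.0 * Q / Q.max()                   # utility degree in percent
        print("ranking (best first):", np.argsort(-utility))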

  14. Service Oriented Architecture Security Risks and their Mitigation

    DTIC Science & Technology

    2012-10-01

    this section can be mitigated by making use of suitable authentication, confidentiality, integrity, and authorisation standards such as Security... for authorisation. Machines/non-human users should be clearly identified and authenticated by the identity provision and authentication services... authentication, any security-related attributes for the subject, and the authorisation decisions given based on the security and privilege attributes

  15. Engine Lathe Operator. Instructor's Guide. Part of Single-Tool Skills Program Series. Machine Industries Occupations.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Secondary Curriculum Development.

    Expected to help meet the need for trained operators in metalworking and suitable for use in the adult education programs of school districts, in manpower development and training programs, and in secondary schools, this guide consists of four sections: Introduction, General Job Content, Shop Projects, and Drawings for the Projects. General Job…

  16. A Study of Readability of Texts in Bangla through Machine Learning Approaches

    ERIC Educational Resources Information Center

    Sinha, Manjira; Basu, Anupam

    2016-01-01

    In this work, we have investigated text readability in Bangla language. Text readability is an indicator of the suitability of a given document with respect to a target reader group. Therefore, text readability has huge impact on educational content preparation. The advances in the field of natural language processing have enabled the automatic…

  17. Molecular Dynamics based on a Generalized Born solvation model: application to protein folding

    NASA Astrophysics Data System (ADS)

    Onufriev, Alexey

    2004-03-01

    An accurate description of the aqueous environment is essential for realistic biomolecular simulations, but may become very expensive computationally. We have developed a version of the Generalized Born model suitable for describing large conformational changes in macromolecules. The model represents the solvent implicitly as a continuum with the dielectric properties of water, and includes the charge screening effects of salt. The computational cost associated with the use of this model in Molecular Dynamics simulations is generally considerably smaller than the cost of representing water explicitly. Also, compared to traditional Molecular Dynamics simulations based on explicit water representation, conformational changes occur much faster in an implicit solvation environment due to the absence of viscosity. The combined speed-up allows one to probe conformational changes that occur on much longer effective time-scales. We apply the model to folding of a 46-residue three helix bundle protein (residues 10-55 of protein A, PDB ID 1BDD). Starting from an unfolded structure at 450 K, the protein folds to the lowest energy state in 6 ns of simulation time, which takes about a day on a 16-processor SGI machine. The predicted structure differs from the native one by 2.4 Å (backbone RMSD). Analysis of the structures seen on the folding pathway reveals details of the folding process unavailable from experiment.
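
    The salt screening mentioned above enters through the Generalized Born pair function; a small worked sketch of a Still-type GB energy with a Debye-Hueckel screening factor is shown below. The charges, Born radii and coordinates are illustrative values (not from the 1BDD simulation), and the exact functional form used by the authors may differ in detail.

        # Still-type Generalized Born energy with a salt screening factor
        # exp(-kappa * f_GB) applied to the solvent dielectric term.
        # Charges in elementary units, distances in Angstrom, energy in kcal/mol.
        import numpy as np

        def gb_energy(q, R, coords, eps_in=1.0, eps_out=78.5, kappa=0.1):
            ke = 332.06   # Coulomb constant, kcal*Angstrom/(mol*e^2)
            E = 0.0
            for i in range(len(q)):
                for j in range(len(q)):
                    r2 = np.sum((coords[i] - coords[j]) ** 2)
                    f_gb = np.sqrt(r2 + R[i] * R[j] * np.exp(-r2 / (4 * R[i] * R[j])))
                    screen = 1.0 / eps_in - np.exp(-kappa * f_gb) / eps_out
                    E += -0.5 * ke * screen * q[i] * q[j] / f_gb
            return E

        q = np.array([0.4, -0.4])                      # partial charges
        R = np.array([1.5, 1.7])                       # effective Born radii
        coords = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
        print("GB solvation energy (kcal/mol):", round(gb_energy(q, R, coords), 3))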

  18. Adaptive user displays for intelligent tutoring software.

    PubMed

    Beal, Carole R

    2004-12-01

    Intelligent tutoring software (ITS) holds great promise for K-12 instruction. Yet it is difficult to obtain rich information about users that can be used in realistic educational delivery settings (public school classrooms) in which eye tracking and other user sensing technologies are not suitable. We are pursuing three "cheap and cheerful" strategies to meet this challenge in the context of an ITS for high school math instruction. First, we use detailed representations of student cognitive skills, including tasks to assess individual users' proficiency with abstract reasoning, proficiency with simple math facts and computational skill, and spatial ability. Second, we are using data mining and machine learning algorithms to identify instructional sequences that have been effective with previous students, and to use these patterns to make decisions about current students. Third, we are integrating a simple focus-of-attention tracking system into the software, using inexpensive web cameras. This coarse-grained information can be used to time the display of multimedia hints, explanations, and examples when the user is actually looking at the screen, and to diagnose causes of problem-solving errors. The ultimate goal is to create non-intrusive software that can adapt the display of instructional information in real time to the user's cognitive strengths, motivation, and attention.

  19. 'Controversy'. Propaganda versus evidence based health promotion: the case of breast screening.

    PubMed

    Hann, A

    1999-01-01

    Breast cancer is a serious problem in the developed world, and the common perception of the risks of developing the disease are communicated to the public via a variety of means. This includes leaflets in doctors' surgeries, health promotion campaigns and invitations from well woman clinics to attend for various forms of screening. The national breast cancer screening programme in the UK has a very high compliance rate (which is vital) and a well oiled media machine. This article examines the way in which the risks of developing breast cancer are communicated to women of all ages in the UK, and speculates as to the reason behind the misleading manner in which health promoters offer this information.

  20. Suitability of the "Little DCDQ" for the Identification of DCD in a Selected Group of 3-5-Year-Old South African Children

    ERIC Educational Resources Information Center

    Venter, Amné; Pienaar, Anita E.; Coetzee, Dané

    2015-01-01

    Background: In order to identify Developmental Coordination Disorder (DCD) as soon as possible, we need validated screening instruments that can be used for the early identification of motor coordination delays. The aim of this study was to establish the suitability of the Little Developmental Coordination Disorder Questionnaire (Little DCDQ) for…

  1. SPIKE: AI scheduling techniques for Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Johnston, Mark D.

    1991-09-01

    AI (Artificial Intelligence) scheduling techniques for HST are presented in the form of viewgraphs. The following subject areas are covered: domain; HST constraint timescales; HST scheduling; SPIKE overview; SPIKE architecture; constraint representation and reasoning; use of suitability functions by scheduling agent; SPIKE screen example; advantages of suitability function framework; limiting search and constraint propagation; scheduling search; stochastic search; repair methods; implementation; and status.
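
    The suitability-function idea listed above can be illustrated with a toy example: each constraint maps candidate times to a value between 0 and 1, constraints are combined multiplicatively, and the scheduler prefers times where the combined suitability is highest. The constraints below are invented for illustration and are not SPIKE's actual constraint models.

        # Toy suitability functions over a discretised time horizon.
        import numpy as np

        times = np.arange(0, 48)                                   # candidate hours
        sun_avoidance = np.where((times % 24) > 12, 1.0, 0.2)      # hypothetical
        guide_star = np.clip(np.sin(times / 7.0) + 0.5, 0.0, 1.0)  # hypothetical
        priority = np.full(times.shape, 0.8)

        combined = sun_avoidance * guide_star * priority   # zero wherever a hard
        best = int(np.argmax(combined))                    # constraint would forbid
        print("most suitable start hour:", best,
              "suitability:", round(float(combined[best]), 3))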

  2. Automatic detection of tweets reporting cases of influenza like illnesses in Australia

    PubMed Central

    2015-01-01

    Early detection of disease outbreaks is critical for disease spread control and management. In this work we investigate the suitability of statistical machine learning approaches to automatically detect Twitter messages (tweets) that are likely to report cases of possible influenza like illnesses (ILI). Empirical results obtained on a large set of tweets originating from the state of Victoria, Australia, in a 3.5 month period show evidence that machine learning classifiers are effective in identifying tweets that mention possible cases of ILI (up to 0.736 F-measure, i.e. the harmonic mean of precision and recall), regardless of the specific technique implemented by the classifier investigated in the study. PMID:25870759
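
    A minimal sketch of the kind of statistical classifier described is shown below: bag-of-words features from tweet text feed a linear model. The example tweets, labels and model choice are invented placeholders and do not reproduce the study's corpus or features.

        # Bag-of-words tweet classifier sketch (TF-IDF + linear SVM).
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline

        tweets = ["down with the flu, fever and chills all night",
                  "got my flu shot today at the chemist",
                  "coughing and aching, think I caught influenza",
                  "watching the footy, what a game"]
        labels = [1, 0, 1, 0]     # 1 = reports a possible ILI case

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
        clf.fit(tweets, labels)
        pred = clf.predict(["stuck in bed with a fever and sore throat"])
        print("predicted ILI report:", bool(pred[0]))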

  3. Final Report: Enabling Exascale Hardware and Software Design through Scalable System Virtualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Patrick G.

    2015-02-01

    In this grant, we enhanced the Palacios virtual machine monitor to increase its scalability and suitability for addressing exascale system software design issues. This included a wide range of research on core Palacios features, large-scale system emulation, fault injection, performance monitoring, and VMM extensibility. This research resulted in a large number of high-impact publications in well-known venues, the support of a number of students, and the graduation of two Ph.D. students and one M.S. student. In addition, our enhanced version of the Palacios virtual machine monitor has been adopted as a core element of the Hobbes operating system under active DOE-funded research and development.

  4. Plastic Foam Withstands Greater Temperatures And Pressures

    NASA Technical Reports Server (NTRS)

    Cranston, John A.; Macarthur, Doug

    1993-01-01

    Improved plastic foam suitable for use in foam-core laminated composite parts and in tooling for making fiber/matrix-composite parts. Stronger at high temperatures, more thermally and dimensionally stable, machinable, resistant to chemical degradation, and less expensive. Compatible with variety of matrix resins. Made of polyisocyanurate blown with carbon dioxide and has density of 12 to 15 pounds per cubic foot. Does not contribute to depletion of ozone from atmosphere. Improved foam used in cores of composite panels in such diverse products as aircraft, automobiles, railroad cars, boats, and sporting equipment like surfboards, skis, and skateboards. Also used in thermally stable flotation devices in submersible vehicles. Machined into mandrels upon which filaments are wound to make shells.

  5. Primary prevention of sudden cardiac death of the young athlete: the controversy about the screening electrocardiogram and its innovative artificial intelligence solution.

    PubMed

    Chang, Anthony C

    2012-03-01

    The preparticipation screening for athlete participation in sports typically entails a comprehensive medical and family history and a complete physical examination. A 12-lead electrocardiogram (ECG) can increase the likelihood of detecting cardiac diagnoses such as hypertrophic cardiomyopathy, but this diagnostic test as part of the screening process has engendered considerable controversy. The pro position is supported by argument that international screening protocols support its use, positive diagnosis has multiple benefits, history and physical examination are inadequate, primary prevention is essential, and the cost effectiveness is justified. Although the aforementioned myriad of justifications for routine ECG screening of young athletes can be persuasive, several valid contentions oppose supporting such a policy, namely, that the sudden death incidence is very (too) low, the ECG screening will be too costly, the false-positive rate is too high, resources will be allocated away from other diseases, and manpower is insufficient for its execution. Clinicians, including pediatric cardiologists, have an understandable proclivity for avoiding this prodigious national endeavor. The controversy, however, should not be focused on whether an inexpensive, noninvasive test such as an ECG should be mandated but should instead be directed at just how these tests for young athletes can be performed in the clinical imbroglio of these disease states (with variable genetic penetrance and phenotypic expression) with concomitant fiscal accountability and logistical expediency in this era of economic restraint. This monumental endeavor in any city or region requires two crucial elements well known to business scholars: implementation and execution. The eventual solution for the screening ECG dilemma requires a truly innovative and systematic approach that will liberate us from inadequate conventional solutions. Artificial intelligence, specifically the process termed "machine learning" and "neural networking," involves complex algorithms that allow computers to improve the decision-making process based on repeated input of empirical data (e.g., databases and ECGs). These elements all can be improved with a national database, evidence-based medicine, and in the near future, innovation that entails a Kurzweilian artificial intelligence infrastructure with machine learning and neural networking that will construct the ultimate clinical decision-making algorithm.

  6. Comparison of Test Procedures and Energy Efficiency Criteria in Selected International Standards & Labeling Programs for Copy Machines, External Power Supplies, LED Displays, Residential Gas Cooktops and Televisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Nina; Zhou, Nan; Fridley, David

    2012-03-01

    This report presents a technical review of international minimum energy performance standards (MEPS), voluntary and mandatory energy efficiency labels and test procedures for five products being considered for new or revised MEPS in China: copy machines, external power supply, LED displays, residential gas cooktops and flat-screen televisions. For each product, an overview of the scope of existing international standards and labeling programs, energy values and energy performance metrics and description and detailed summary table of criteria and procedures in major test standards are presented.

  7. Defined, serum/feeder-free conditions for expansion and drug screening of primary B-acute lymphoblastic leukemia.

    PubMed

    Jiang, Zhiwu; Wu, Di; Ye, Wei; Weng, Jianyu; Lai, Peilong; Shi, Pengcheng; Guo, Xutao; Huang, Guohua; Deng, Qiuhua; Tang, Yanlai; Zhao, Hongyu; Cui, Shuzhong; Lin, Simiao; Wang, Suna; Li, Baiheng; Wu, Qiting; Li, Yangqiu; Liu, Pentao; Pei, Duanqing; Du, Xin; Yao, Yao; Li, Peng

    2017-12-05

    Functional screening for compounds represents a major hurdle in the development of rational therapeutics for B-acute lymphoblastic leukemia (B-ALL). In addition, using cell lines as valid models for evaluating responses to novel drug therapies raises serious concerns, as cell lines are prone to genotypic/phenotypic drift and loss of heterogeneity in vitro. Here, we reported that OP9 cells, not OP9-derived adipocytes (OP9TA), support the growth of primary B-ALL cells in vitro. To identify the factors from OP9 cells that support the growth of primary B-ALL cells, we performed RNA-Seq to analyze the gene expression profiles of OP9 and OP9TA cells. We thus developed a defined, serum/feeder-free condition (FI76V) that can support the expansion of a range of clinically distinct primary B-ALL cells that still maintain their leukemia-initiating ability. We demonstrated the suitability of high-throughput drug screening based on our B-ALL culture conditions. Upon screening 378 kinase inhibitors, we identified a cluster of 17 kinase inhibitors that can efficiently kill B-ALL cells in vitro. Importantly, we demonstrated the synergistic cytotoxicity of dinaciclib/BTG226 to B-ALL cells. Taken together, we developed a defined condition for the ex vivo expansion of primary B-ALL cells that is suitable for high-throughput screening of novel compounds.

  8. Semiconductor cooling apparatus

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A. (Inventor); Gaier, James R. (Inventor)

    1993-01-01

    Gas derived graphite fibers generated by the decomposition of an organic gas are joined with a suitable binder. This produces a high thermal conductivity composite material which passively conducts heat from a source, such as a semiconductor, to a heat sink. The fibers may be intercalated. The intercalate can be halogen or halide salt, alkaline metal, or any other species which contributes to the electrical conductivity improvement of the graphite fiber. The fibers are bundled and joined with a suitable binder to form a high thermal conductivity composite material device. The heat transfer device may also be made of intercalated highly oriented pyrolytic graphite and machined, rather than made of fibers.

  9. Quantifying nonhomogeneous colors in agricultural materials. Part II: comparison of machine vision and sensory panel evaluations.

    PubMed

    Balaban, M O; Aparicio, J; Zotarelli, M; Sims, C

    2008-11-01

    The average colors of mangos and apples were measured using machine vision. A method to quantify the perception of nonhomogeneous colors by sensory panelists was developed. Three colors out of several reference colors and their perceived percentage of the total sample area were selected by untrained panelists. Differences between the average colors perceived by panelists and those from the machine vision were reported as DeltaE values (color difference error). Effects of nonhomogeneity of color, and of using real samples or their images in the sensory panels, on DeltaE were evaluated. In general, samples with more nonuniform colors had higher DeltaE values, suggesting that panelists had more difficulty in evaluating more nonhomogeneous colors. There was no significant difference in DeltaE values between the real fruits and their screen images; therefore, images can be used to evaluate color instead of the real samples.
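
    The DeltaE figure referred to above is, in its simplest (CIE76) form, just the Euclidean distance between two colours in CIELAB space; the short sketch below computes it for two illustrative Lab triples (the values are made up, not measurements from the study).

        # CIE76 Delta-E: Euclidean distance between two CIELAB colours.
        import numpy as np

        def delta_e_cie76(lab1, lab2):
            return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

        panel_avg_lab = (62.0, 18.5, 44.0)   # colour reconstructed from panel choices
        machine_lab = (60.5, 20.0, 41.5)     # average colour from machine vision
        print("Delta-E:", round(delta_e_cie76(panel_avg_lab, machine_lab), 2))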

  10. Implementing finite state machines in a computer-based teaching system

    NASA Astrophysics Data System (ADS)

    Hacker, Charles H.; Sitte, Renate

    1999-09-01

    Finite State Machines (FSM) are models for functions commonly implemented in digital circuits such as timers, remote controls, and vending machines. Teaching FSM is core in the curriculum of many university digital electronics or discrete mathematics subjects. Students often have difficulties grasping the theoretical concepts in the design and analysis of FSM. This has prompted the author to develop an MS-Windows™-compatible software package, WinState, that provides a tutorial-style teaching aid for understanding the mechanisms of FSM. The animated computer screen is ideal for visually conveying the required design and analysis procedures. WinState complements other software for combinatorial logic previously developed by the author, and enhances the existing teaching package by adding sequential logic circuits. WinState enables the construction of a student's own FSM, which can be simulated to test the design for functionality and possible errors.
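
    As a flavour of the kind of machine a student might build and simulate in such a tool, the sketch below encodes a small state machine as a transition table and steps it through an input sequence; the coin-operated turnstile is a standard textbook example, not one of WinState's bundled exercises.

        # Tiny finite state machine: transition table plus a simulation loop.
        transitions = {
            ("locked", "coin"): "unlocked",
            ("locked", "push"): "locked",
            ("unlocked", "push"): "locked",
            ("unlocked", "coin"): "unlocked",
        }

        def run_fsm(start, inputs):
            state, trace = start, [start]
            for symbol in inputs:
                state = transitions[(state, symbol)]
                trace.append(state)
            return trace

        print(run_fsm("locked", ["push", "coin", "push", "coin", "coin"]))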

  11. Climatic influence on anthrax suitability in warming northern latitudes.

    PubMed

    Walsh, Michael G; de Smalen, Allard W; Mor, Siobhan M

    2018-06-18

    Climate change is impacting ecosystem structure and function, with potentially drastic downstream effects on human and animal health. Emerging zoonotic diseases are expected to be particularly vulnerable to climate and biodiversity disturbance. Anthrax is an archetypal zoonosis that manifests its most significant burden on vulnerable pastoralist communities. The current study sought to investigate the influence of temperature increases on geographic anthrax suitability in the temperate, boreal, and arctic North, where observed climate impact has been rapid. This study also explored the influence of climate relative to more traditional factors, such as livestock distribution, ungulate biodiversity, and soil-water balance, in demarcating risk. Machine learning was used to model anthrax suitability in northern latitudes. The model identified climate, livestock density and wild ungulate species richness as the most influential features in predicting suitability. These findings highlight the significance of warming temperatures for anthrax ecology in northern latitudes, and suggest potential mitigating effects of interventions targeting megafauna biodiversity conservation in grassland ecosystems, and animal health promotion among small to midsize livestock herds.

  12. Efficient screening of environmental isolates for Saccharomyces cerevisiae strains that are suitable for brewing.

    PubMed

    Fujihara, Hidehiko; Hino, Mika; Takashita, Hideharu; Kajiwara, Yasuhiro; Okamoto, Keiko; Furukawa, Kensuke

    2014-01-01

    We developed an efficient screening method for Saccharomyces cerevisiae strains from environmental isolates. Multiplex PCR was performed targeting four genes of brewing S. cerevisiae (SSU1, AWA1, BIO6, and FLO1). At least three of the four genes were amplified from all S. cerevisiae strains. The use of this method allowed us to successfully obtain S. cerevisiae strains.

  13. A Study on Software-based Sensing Technology for Multiple Object Control in AR Video

    PubMed Central

    Jung, Sungmo; Song, Jae-gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo

    2010-01-01

    Research on Augmented Reality (AR) has recently received attention. With this, the Machine-to-Machine (M2M) market has started to become active and there are numerous efforts to apply it to real life in all sectors of society. To date, the M2M market has applied the existing marker-based AR technology in entertainment, business and other industries. With the existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. This creates a problem in which the relevant markers should be extracted and printed on screen so that loading of multiple objects is enabled. However, since the distance between markers is not measured in the process of detecting and copying markers, the markers can overlap and thus the objects would not be augmented. To solve this problem, a circle having the longest radius needs to be created from a focal point of a marker to be copied, so that no object is copied within the confines of the circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed and overlapping marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms. PMID:22163444

  14. A systematic approach to prioritize drug targets using machine learning, a molecular descriptor-based classification model, and high-throughput screening of plant derived molecules: a case study in oral cancer.

    PubMed

    Randhawa, Vinay; Kumar Singh, Anil; Acharya, Vishal

    2015-12-01

    Systems-biology inspired identification of drug targets and machine learning-based screening of small molecules which modulate their activity have the potential to revolutionize modern drug discovery by complementing conventional methods. To utilize the effectiveness of such pipelines, we first analyzed the dysregulated gene pairs between control and tumor samples and then implemented an ensemble-based feature selection approach to prioritize targets in oral squamous cell carcinoma (OSCC) for therapeutic exploration. Based on the structural information of known inhibitors of CXCR4, one of the best targets identified in this study, a feature selection was implemented for the identification of optimal structural features (molecular descriptors) based on which a classification model was generated. Furthermore, the CXCR4-centered descriptor-based classification model was finally utilized to screen a repository of plant-derived small molecules to obtain potential inhibitors. The application of our methodology may assist effective selection of the best targets, which may have previously been overlooked, and in turn lead to the development of new oral cancer medications. The small molecules identified in this study can be ideal candidates for trials as potential novel anti-oral cancer agents. Importantly, distinct steps of this whole study may provide reference for the analysis of other complex human diseases.
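
    A hedged sketch of a molecular descriptor-based classification step of the general sort described is given below: simple RDKit descriptors are computed for candidate molecules and scored with a trained classifier. The SMILES strings, labels and descriptor choice are toy placeholders, not the study's CXCR4 data or its selected feature set.

        # Descriptor-based screening sketch: RDKit descriptors + random forest.
        from rdkit import Chem
        from rdkit.Chem import Descriptors
        from sklearn.ensemble import RandomForestClassifier

        def featurize(smiles):
            mol = Chem.MolFromSmiles(smiles)
            return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
                    Descriptors.TPSA(mol), Descriptors.NumHDonors(mol)]

        train_smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
        train_labels = [0, 0, 1, 1]      # 1 = active against the target (toy labels)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit([featurize(s) for s in train_smiles], train_labels)

        library = ["CC(C)Cc1ccc(cc1)C(C)C(=O)O", "O=C(O)c1ccccc1"]  # screening set
        probs = clf.predict_proba([featurize(s) for s in library])[:, 1]
        print(dict(zip(library, probs.round(2))))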

  15. A study on software-based sensing technology for multiple object control in AR video.

    PubMed

    Jung, Sungmo; Song, Jae-Gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo

    2010-01-01

    Research on Augmented Reality (AR) has recently received attention. With this, the Machine-to-Machine (M2M) market has started to become active and there are numerous efforts to apply it to real life in all sectors of society. To date, the M2M market has applied the existing marker-based AR technology in entertainment, business and other industries. With the existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. This creates a problem in which the relevant markers should be extracted and printed on screen so that loading of multiple objects is enabled. However, since the distance between markers is not measured in the process of detecting and copying markers, the markers can overlap and thus the objects would not be augmented. To solve this problem, a circle having the longest radius needs to be created from a focal point of a marker to be copied, so that no object is copied within the confines of the circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed and overlapping marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms.

  16. In Silico Prediction of Physicochemical Properties of Environmental Chemicals Using Molecular Fingerprints and Machine Learning

    EPA Science Inventory

    There are little available toxicity data on the vast majority of chemicals in commerce. High-throughput screening (HTS) studies, such as those being carried out by the U.S. Environmental Protection Agency (EPA) ToxCast program in partnership with the federal Tox21 research progra...

  17. The IBM PC as an Online Search Machine. Part 5: Searching through Crosstalk.

    ERIC Educational Resources Information Center

    Kolner, Stuart J.

    1985-01-01

    This last of a five-part series on using the IBM personal computer for online searching highlights a brief review, search process, making the connection, switching between screens and modes, online transaction, capture buffer controls, coping with options, function keys, script files, processing downloaded information, note to TELEX users, and…

  18. Machine Learning of Human Pluripotent Stem Cell-Derived Engineered Cardiac Tissue Contractility for Automated Drug Classification.

    PubMed

    Lee, Eugene K; Tran, David D; Keung, Wendy; Chan, Patrick; Wong, Gabriel; Chan, Camie W; Costa, Kevin D; Li, Ronald A; Khine, Michelle

    2017-11-14

    Accurately predicting cardioactive effects of new molecular entities for therapeutics remains a daunting challenge. Immense research effort has been focused toward creating new screening platforms that utilize human pluripotent stem cell (hPSC)-derived cardiomyocytes and three-dimensional engineered cardiac tissue constructs to better recapitulate human heart function and drug responses. As these new platforms become increasingly sophisticated and high throughput, the drug screens result in larger multidimensional datasets. Improved automated analysis methods must therefore be developed in parallel to fully comprehend the cellular response across a multidimensional parameter space. Here, we describe the use of machine learning to comprehensively analyze 17 functional parameters derived from force readouts of hPSC-derived ventricular cardiac tissue strips (hvCTS) electrically paced at a range of frequencies and exposed to a library of compounds. The generated metric is then effective for determining the cardioactivity of a given drug. Furthermore, we demonstrate a classification model that can automatically predict the mechanistic action of an unknown cardioactive drug. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Micro Dot Patterning on the Light Guide Panel Using Powder Blasting.

    PubMed

    Jang, Ho Su; Cho, Myeong Woo; Park, Dong Sam

    2008-02-08

    This study aims to develop a micromachining technology for a light guide panel (LGP) mold, whereby micro dot patterns are formed on a LGP surface by a single injection process instead of existing screen printing processes. The micro powder blasting technique is applied to form micro dot patterns on the LGP mold surface. The optimal conditions for masking, laminating, exposure, and developing processes to form the micro dot patterns are first experimentally investigated. A LGP mold with masked micro patterns is then machined using the micro powder blasting method and the machinability of the micro dot patterns is verified. A prototype LGP is test-injected using the developed LGP mold, and a shape analysis of the patterns and performance testing of the injected LGP are carried out. As an additional approach, matte finishing, a special surface treatment method, is applied to the mold surface to improve the light diffusion characteristics, uniformity and brightness of the LGP. The results of this study show that the applied powder blasting method can be successfully used to manufacture LGPs with micro patterns by just a single injection using the developed mold and thereby replace existing screen printing methods.

  20. A method to screen and evaluate tissue adhesives for joint repair applications

    PubMed Central

    2012-01-01

    Background Tissue adhesives are useful means for various medical procedures. Since varying requirements mean that no single adhesive can meet all needs, bond strength testing remains one of the key applications used to screen for new products and study the influence of experimental variables. This study was conducted to develop an easy-to-use method to screen and evaluate tissue adhesives for tissue engineering applications. Method Tissue grips were designed to facilitate the reproducible production of substrate tissue and adhesive strength measurements in universal testing machines. Porcine femoral condyles were used to generate osteochondral test tissue cylinders (substrates) of different shapes. Viability of substrates was tested using PI/FDA staining. Self-bonding properties were determined to examine the reusability of substrates (n = 3). Serial measurements (n = 5) in different operation modes (OM) were performed to analyze the bonding strength of tissue adhesives in bone (OM-1) and cartilage tissue, either in isolation (OM-2) or under specific requirements in joint repair such as filling cartilage defects with clinically applied fibrin/PLGA-cell-transplants (OM-3) or tissues (OM-4). The efficiency of the method was determined on the basis of the adhesive properties of fibrin glue for different assembly times (30 s, 60 s). Seven randomly generated collagen formulations were analyzed to examine the potential of the method to identify new tissue adhesives. Results Viability analysis of test tissue cylinders revealed vital cells (>80%) in cartilage components even 48 h post preparation. Reuse (n = 10) of the test substrate did not significantly change adhesive characteristics. The adhesive strength of fibrin varied in the different test settings (OM-1: 7.1 kPa, OM-2: 2.6 kPa, OM-3: 32.7 kPa, OM-4: 30.1 kPa) and increased with assembly time (2.4-fold on average). The screening of the different collagen formulations revealed a substance with significantly higher adhesive strength on cartilage (14.8 kPa) and bone tissue (11.8 kPa) compared to fibrin, and also considerable adhesive properties when filling defects with cartilage tissue (23.2 kPa). Conclusion The method confirmed the adhesive properties of fibrin and demonstrated the dependence of adhesive properties on the applied settings. Furthermore, the method was suitable to screen for potential adhesives and to identify a promising candidate for cartilage and bone applications. The method can offer simple, replicable and efficient evaluation of adhesive properties in ex vivo specimens and may be a useful supplement to existing methods in clinically relevant settings. PMID:22984926

  1. [Selection of a melanin concentrating hormone receptor-1 (MCHR1) antagonists' focused library and its biological screening with AequoScreen].

    PubMed

    Flachner, Beáta; Hajdú, István; Dobi, Krisztina; Lorincz, Zsolt; Cseh, Sándor; Dormán, György

    2013-01-01

    Target-focused libraries can be rapidly selected by 2D virtual screening methods from multimillion-compound repositories if structures of active compounds are available. In the present study, a multi-step virtual and in vitro screening cascade is reported to select Melanin Concentrating Hormone Receptor-1 (MCHR1) antagonists. The 2D similarity search combined with physicochemical parameter filtering is suitable for selecting candidates from a multimillion-compound repository. The seeds of the first round of virtual screening were collected from the literature and commercial databases, while the seeds of the second round were the hits of the first round. In vitro screening underlined the efficiency of our approach, as in the second screening round the hit rate (8.6%) improved significantly compared to the first round (1.9%), with antagonist activities reaching even below 10 nM.
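
    The first step of such a cascade (2D similarity to known actives plus a physicochemical filter) can be sketched as follows with RDKit Morgan fingerprints and Tanimoto similarity; the seed structures, repository entries, molecular weight window and similarity cutoff are all placeholders, not actual MCHR1 chemistry.

        # 2D similarity search sketch: Morgan fingerprints, Tanimoto similarity
        # to seed actives, plus a simple physicochemical (molecular weight) filter.
        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem, Descriptors

        def fp(smiles):
            mol = Chem.MolFromSmiles(smiles)
            return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

        seeds = [fp("CC(=O)Nc1ccc(O)cc1"), fp("CCOC(=O)c1ccccc1N")]   # toy "actives"
        repository = ["CC(=O)Nc1ccc(OC)cc1", "c1ccccc1", "CCOC(=O)c1ccccc1NC"]

        candidates = []
        for smi in repository:
            mol = Chem.MolFromSmiles(smi)
            if not (150 <= Descriptors.MolWt(mol) <= 500):     # toy property filter
                continue
            sim = max(DataStructs.TanimotoSimilarity(fp(smi), s) for s in seeds)
            candidates.append((smi, round(sim, 2)))

        # compounds above a chosen Tanimoto cutoff would go forward to in vitro assay
        print(sorted(candidates, key=lambda t: -t[1]))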

  2. Freeform diamond machining of complex monolithic metal optics for integral field systems

    NASA Astrophysics Data System (ADS)

    Dubbeldam, Cornelis M.; Robertson, David J.; Preuss, Werner

    2004-09-01

    Implementation of the optical designs of image slicing Integral Field Systems requires accurate alignment of a large number of small (and therefore difficult to manipulate) optical components. In order to facilitate the integration of these complex systems, the Astronomical Instrumentation Group (AIG) of the University of Durham, in collaboration with the Labor für Mikrozerspanung (Laboratory for Precision Machining - LFM) of the University of Bremen, have developed a technique for fabricating monolithic multi-faceted mirror arrays using freeform diamond machining. Using this technique, the inherent accuracy of the diamond machining equipment is exploited to achieve the required relative alignment accuracy of the facets, as well as an excellent optical surface quality for each individual facet. Monolithic arrays manufactured using this freeform diamond machining technique were successfully applied in the Integral Field Unit for the GEMINI Near-InfraRed Spectrograph (GNIRS IFU), which was recently installed at GEMINI South. Details of their fabrication process and optical performance are presented in this paper. In addition, the direction of current development work, conducted under the auspices of the Durham Instrumentation R&D Program supported by the UK Particle Physics and Astronomy Research Council (PPARC), will be discussed. The main emphasis of this research is to improve further the optical performance of diamond machined components, as well as to streamline the production and quality control processes with a view to making this technique suitable for multi-IFU instruments such as KMOS etc., which require series production of large quantities of optical components.

  3. Vibration Damping Analysis of Lightweight Structures in Machine Tools

    PubMed Central

    Aggogeri, Francesco; Borboni, Alberto; Merlo, Angelo; Pellegrini, Nicola; Ricatto, Raffaele

    2017-01-01

    The dynamic behaviour of a machine tool (MT) directly influences the machining performance. The adoption of lightweight structures may reduce the effects of undesired vibrations and increase the workpiece quality. This paper aims to present and compare a set of hybrid materials that may be excellent candidates to fabricate the MT moving parts. The selected materials have high dynamic characteristics and capacity to dampen mechanical vibrations. In this way, starting from the kinematic model of a milling machine, this study evaluates a number of prototypes made of Al foam sandwiches (AFS), Al corrugated sandwiches (ACS) and composite materials reinforced by carbon fibres (CFRP). These prototypes represented the Z-axis ram of a commercial milling machine. The static and dynamical properties have been analysed by using both finite element (FE) simulations and experimental tests. The obtained results show that the proposed structures may be a valid alternative to the conventional materials of MT moving parts, increasing machining performance. In particular, the AFS prototype highlighted a damping ratio that is 20 times greater than a conventional ram (e.g., steel). Its application is particularly suitable to minimize unwanted oscillations during high-speed finishing operations. The results also show that the CFRP structure guarantees high stiffness with a weight reduced by 48.5%, suggesting effective applications in roughing operations, saving MT energy consumption. The ACS structure has a good trade-off between stiffness and damping and may represent a further alternative, if correctly evaluated. PMID:28772653

  4. 24 CFR 982.54 - Administrative plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... or a specified category of families; (4) Occupancy policies, including: (i) Definition of what group... conducting required HQS inspections; and (23) PHA screening of applicants for family behavior or suitability...

  5. Chiasmus

    NASA Astrophysics Data System (ADS)

    Cady, Stephen

    2009-02-01

    Chiasmus is a responsive and dynamically reflective, two-sided volumetric surface that embodies phenomenological issues such as the formation of images, observer and machine perception, and the dynamics of the screen as a space of image reception. It consists of a square grid of 64 individually motorized cube elements engineered to move linearly. Each cube is controlled by custom software that analyzes video imagery for luminance values and sends these values to the motor control mechanisms to coordinate the individual movements. The resolution of the sculptural screen into individual movements allows its volume to alter dynamically, presenting an observer with novel and unique perspectives of its mobile form.
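
    A minimal sketch of the kind of luminance-to-motion mapping described above, assuming a hypothetical 8 x 8 downsampling of the video frame and an invented travel range; this is not the artist's actual control software.

        # Map video luminance to linear displacements for an 8 x 8 grid of cube actuators.
        # Illustrative sketch; grid size, travel range and the averaging scheme are assumptions.
        import numpy as np

        GRID = 8            # 8 x 8 = 64 cubes
        MAX_TRAVEL_MM = 50  # hypothetical linear travel of each cube

        def cube_targets(frame_gray):
            """frame_gray: 2-D array of luminance values in [0, 255]."""
            h, w = frame_gray.shape
            targets = np.zeros((GRID, GRID))
            for r in range(GRID):
                for c in range(GRID):
                    cell = frame_gray[r * h // GRID:(r + 1) * h // GRID,
                                      c * w // GRID:(c + 1) * w // GRID]
                    targets[r, c] = cell.mean() / 255.0 * MAX_TRAVEL_MM
            return targets  # one displacement command per motorized cube

        frame = np.random.randint(0, 256, (240, 320))
        print(cube_targets(frame).round(1))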

  6. Review on sugar beet salt stress studies in Iran

    NASA Astrophysics Data System (ADS)

    Khayamim, S.; Noshad, H.; Jahadakbar, M. R.; Fotuhi, K.

    2017-07-01

    The increase of saline lands in most regions of the world, including Iran, the limited scope for raising production through land improvement, and the threat that saline water and soils pose to crop production make related research and the development of salt-tolerant varieties increasingly important. Salt stress has been studied at the Sugar Beet Seed Institute of Iran (SBSI) for many years, and new screening methods for stress tolerance continue to be developed on the basis of this work. Previous SBSI research is reviewed here in three categories: agronomy, breeding and biotechnology. In the agronomy studies, a suitable planting medium, EC level, growth stage and traits for salinity-tolerance screening were determined, and agronomic techniques such as planting date, planting method and suitable nutrition for sugar beet under salt stress were introduced. Sand was salinized by the saline treatments about twice as much as perlite, so large-sized perlite is a suitable medium for salinity studies. Screening of sugar beet genotypes for salt tolerance should be conducted at EC = 20 in the laboratory and EC = 16 dS/m in the greenhouse. Although sugar beet seed germination has been regarded as the stage most susceptible to salinity, establishment appears to be even more susceptible, with salinity causing a 70-80% decrease in plant establishment. Measuring leaf Na, K and total carbohydrate at the establishment stage is useful for faster screening of genotypes, given the high and significant correlation of these traits at establishment with yield at harvest. In the breeding work, SBSI genotypes with a drought-tolerance background would be useful for salinity stress studies, and there remains a need for more research in the field of biotechnology in Iran.

  7. [Suitability of screening for diabetes mellitus in women with a history of gestational diabetes].

    PubMed

    Álvarez-Silvares, E; Domínguez-Vigo, P; Domínguez-Sánchez, J; González-González, A

    To assess the long-term suitability of screening for type 2 diabetes mellitus in Primary Care among women with a previous diagnosis of gestational diabetes. The secondary objective was to determine whether clinical factors modified the usefulness of the screening. An observational cohort study was performed, including all patients diagnosed with gestational diabetes between 2000 and 2009 (n=470) in the University Hospital Complex of Ourense. The electronic medical records were reviewed to confirm the diagnosis of gestational diabetes and the year of the last fasting blood glucose measurement. The mean follow-up time was 12.9 years. Screening was considered adequate if a fasting blood glucose had been recorded within the last 3 years. The following variables were analysed: adequacy of screening for type 2 diabetes mellitus, age, body mass index, gestational diabetes in more than one pregnancy, and rural/urban environment. A descriptive analysis of the data was performed, using the Chi-squared and Student's t-tests to determine differences between subgroups. Statistical significance was set at P<.05. The long-term monitoring of these patients was very irregular. Only 67.08% of the study group underwent type 2 diabetes mellitus screening. The level of follow-up was not associated with age, BMI, place of residence, or year of diagnosis. In patients with more than one episode of gestational diabetes, subsequent blood glucose monitoring was achieved in 94.1%. The adequacy of screening in our area is very irregular and leaves considerable room for improvement. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  8. Factors Influencing the Clinical Stratification of Suitability to Drive after Stroke: A Qualitative Study.

    PubMed

    Stapleton, Tadhg; Connolly, Deirdre; O'Neill, Desmond

    2015-01-01

    While a clinical pre-selection screening process for a stroke patient's suitability for driving has been acknowledged, little is known about the factors or processes influencing this screening, which is typically conducted by clinicians practicing at a generalist level. This study explored the clinical stratification process through semi-structured interviews with senior occupational therapists (n = 17) and stroke physicians (n = 7), using a qualitative description methodology. The findings revealed a three-way stratification of stroke patients in the clinical setting: those who are fit to drive, those who are unfit to drive, and a "maybe" group who need more detailed assessment and observation. Factors that had a major influence on this clinically based stratification of driving suitability were clients' levels of awareness, insight, and impulsivity. A period of prolonged contact with the client was preferred, allowing clinicians to build a comprehensive picture of the person before making the stratification decision. A mix of assessment approaches, including standardized assessment but with increased emphasis on naturalistic observation of functional performance, underpinned the clinical stratification process. This study uncovers some of the factors and processes influencing the early clinical stratification of driving suitability after stroke, and highlights the contribution of the generalist practitioner in the fitness-to-drive assessment continuum.

  9. Predictions of BuChE inhibitors using support vector machine and naive Bayesian classification techniques in drug discovery.

    PubMed

    Fang, Jiansong; Yang, Ranyao; Gao, Li; Zhou, Dan; Yang, Shengqian; Liu, Ai-Lin; Du, Guan-hua

    2013-11-25

    Butyrylcholinesterase (BuChE, EC 3.1.1.8) is an important pharmacological target for Alzheimer's disease (AD) treatment. However, the currently available BuChE inhibitor screening assays are expensive, labor-intensive, and compound-dependent. It is necessary to develop robust in silico methods to predict the activities of BuChE inhibitors for lead identification. In this investigation, support vector machine (SVM) models and naive Bayesian models were built to discriminate BuChE inhibitors (BuChEIs) from non-inhibitors. Each molecule was initially represented by 1870 structural descriptors (1235 from ADRIANA.Code, 334 from MOE, and 301 from Discovery Studio). Correlation analysis and a stepwise variable selection method were applied to identify activity-related descriptors for the prediction models. Additionally, structural fingerprint descriptors were added to improve the predictive ability of the models, which was measured by cross-validation, a test set validation with 1001 compounds, and an external test set validation with 317 diverse chemicals. The best two models gave Matthews correlation coefficients of 0.9551 and 0.9550 for the test set and 0.9132 and 0.9221 for the external test set. To demonstrate the practical applicability of the models in virtual screening, we screened an in-house data set of 3601 compounds, and 30 compounds were selected for further bioactivity assay. The assay results showed that 10 out of 30 compounds exerted significant BuChE inhibitory activity, with IC50 values ranging from 0.32 to 22.22 μM, and three new scaffolds of BuChE inhibitors were identified for the first time. To the best of our knowledge, this is the first report on BuChE inhibitors using machine learning approaches. The models generated from the SVM and naive Bayesian approaches successfully predicted BuChE inhibitors. The study proved the feasibility of a new method for predicting the bioactivities of ligands and discovering novel lead compounds.
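
    A hedged sketch of the classification step described above, using scikit-learn's SVM and naive Bayes scored with the Matthews correlation coefficient; the descriptor matrix and labels are random placeholders rather than the ADRIANA.Code/MOE/Discovery Studio descriptors used in the study.

        # Discriminate inhibitors from non-inhibitors with SVM and naive Bayes models,
        # scored by the Matthews correlation coefficient. Data are synthetic placeholders.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import matthews_corrcoef

        X = np.random.rand(600, 50)          # hypothetical descriptor matrix
        y = np.random.randint(0, 2, 600)     # 1 = BuChE inhibitor, 0 = non-inhibitor
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        for name, model in [("SVM", SVC(kernel="rbf", C=10.0)), ("Naive Bayes", GaussianNB())]:
            model.fit(X_tr, y_tr)
            print(name, "MCC:", matthews_corrcoef(y_te, model.predict(X_te)))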

  10. A Machine Learning Ensemble Classifier for Early Prediction of Diabetic Retinopathy.

    PubMed

    S K, Somasundaram; P, Alli

    2017-11-09

    Diabetic retinopathy (DR), a retinal vascular disease, is the main complication of diabetes and can lead to blindness. Regular screening for early DR detection is a labor- and resource-intensive task, so automatic, computational detection of DR is an attractive solution. An automatic method can determine the presence of an abnormality in fundus images (FI) more reliably, but the classification step is often performed poorly. Recently, a few studies have analysed texture discrimination in FI to distinguish healthy images; however, feature extraction (FE) was hampered by high dimensionality. Therefore, a Machine Learning Bagging Ensemble Classifier (ML-BEC) is designed to identify retinal features for DR diagnosis and early detection. The ML-BEC method comprises two stages. The first stage extracts candidate objects from retinal images (RI). The candidate objects, or features, for DR diagnosis include blood vessels, the optic nerve, neural tissue, the neuroretinal rim, and optic disc size, thickness and variance. These features are initially extracted by applying a machine learning technique, t-distributed Stochastic Neighbor Embedding (t-SNE). t-SNE constructs a probability distribution over pairs of high-dimensional images, separating them into similar and dissimilar pairs, and then defines a similar probability distribution over the points in a low-dimensional map, minimizing the Kullback-Leibler divergence between the two distributions with respect to the locations of the points in the map. The second stage applies ensemble classifiers to the extracted features to provide accurate analysis of digital FI using machine learning. In this stage, automatic detection for a DR screening system using a Bagging Ensemble Classifier (BEC) is investigated. Through voting, the bagging process in ML-BEC minimizes the error due to the variance of the base classifier. With publicly available retinal image databases, the classifier is trained with 25% of the RI. Results show that the ensemble classifier can achieve better classification accuracy (CA) than single classification models. Empirical experiments suggest that the machine learning-based ensemble classifier is also efficient in reducing DR classification time (CT).
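
    The two-stage idea (t-SNE embedding followed by a bagging ensemble) can be sketched as follows, assuming a generic per-image feature matrix; the data are synthetic and the feature set is simplified compared with the vessel and optic-disc measurements listed above.

        # Sketch of the two-stage ML-BEC idea: embed features with t-SNE, then classify
        # with a bagging ensemble of decision trees. Data are synthetic placeholders.
        import numpy as np
        from sklearn.manifold import TSNE
        from sklearn.ensemble import BaggingClassifier
        from sklearn.model_selection import cross_val_score

        X = np.random.rand(200, 64)          # hypothetical per-image feature vectors
        y = np.random.randint(0, 2, 200)     # 1 = diabetic retinopathy, 0 = healthy

        X_low = TSNE(n_components=2, random_state=0).fit_transform(X)  # low-dimensional map
        bec = BaggingClassifier(n_estimators=50, random_state=0)       # decision trees by default
        print("CV accuracy:", cross_val_score(bec, X_low, y, cv=5).mean())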

  11. The Concept of C2 Communication and Information Support

    DTIC Science & Technology

    2004-06-01

    • Communication and information literacy, • Sensors: technology and systematic development as a branch, • Military prognosis research (combat models), • Visualization of actions, suitable forms of information presentation, • Techniques of learning for CIS users, • Man-machine interface.

  12. Ductile Binder Phase For Use With Almgb14 And Other Hard Ceramic Materials

    DOEpatents

    Cook, Bruce A.; Russell, Alan; Harringa, Joel

    2005-07-26

    This invention relates to a ductile binder phase for use with AlMgB14 and other hard materials. The ductile binder phase, a cobalt-manganese alloy, is used in appropriate quantities to tailor good hardness and reasonable fracture toughness for hard materials so they can be used suitably in industrial machining and grinding applications.

  13. Decision support system for diabetic retinopathy using discrete wavelet transform.

    PubMed

    Noronha, K; Acharya, U R; Nayak, K P; Kamath, S; Bhandary, S V

    2013-03-01

    Prolonged duration of the diabetes may affect the tiny blood vessels of the retina causing diabetic retinopathy. Routine eye screening of patients with diabetes helps to detect diabetic retinopathy at the early stage. It is very laborious and time-consuming for the doctors to go through many fundus images continuously. Therefore, decision support system for diabetic retinopathy detection can reduce the burden of the ophthalmologists. In this work, we have used discrete wavelet transform and support vector machine classifier for automated detection of normal and diabetic retinopathy classes. The wavelet-based decomposition was performed up to the second level, and eight energy features were extracted. Two energy features from the approximation coefficients of two levels and six energy values from the details in three orientations (horizontal, vertical and diagonal) were evaluated. These features were fed to the support vector machine classifier with various kernel functions (linear, radial basis function, polynomial of orders 2 and 3) to evaluate the highest classification accuracy. We obtained the highest average classification accuracy, sensitivity and specificity of more than 99% with support vector machine classifier (polynomial kernel of order 3) using three discrete wavelet transform features. We have also proposed an integrated index called Diabetic Retinopathy Risk Index using clinically significant wavelet energy features to identify normal and diabetic retinopathy classes using just one number. We believe that this (Diabetic Retinopathy Risk Index) can be used as an adjunct tool by the doctors during the eye screening to cross-check their diagnosis.
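
    A rough sketch of the feature pipeline described above, assuming PyWavelets and scikit-learn: two levels of 2-D wavelet decomposition, eight energy features, and an SVM with a degree-3 polynomial kernel. The images and labels are placeholders, not fundus photographs.

        # Two-level 2-D wavelet decomposition and eight energy features (2 approximation
        # + 6 detail energies) fed to a polynomial-kernel SVM. Data are synthetic.
        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def wavelet_energy_features(img):
            cA1, (cH1, cV1, cD1) = pywt.dwt2(img, "db1")    # level 1
            cA2, (cH2, cV2, cD2) = pywt.dwt2(cA1, "db1")    # level 2
            bands = [cA1, cA2, cH1, cV1, cD1, cH2, cV2, cD2]
            return np.array([np.sum(b.astype(float) ** 2) for b in bands])  # energies

        X = np.array([wavelet_energy_features(np.random.rand(128, 128)) for _ in range(80)])
        y = np.random.randint(0, 2, 80)       # 1 = diabetic retinopathy, 0 = normal
        clf = SVC(kernel="poly", degree=3).fit(X, y)
        print("training accuracy:", clf.score(X, y))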

  14. An fMRI and effective connectivity study investigating miss errors during advice utilization from human and machine agents.

    PubMed

    Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; de Visser, Ewart; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank

    2017-10-01

    As society becomes more reliant on machines and automation, understanding how people utilize advice is a necessary endeavor. Our objective was to reveal the underlying neural associations during advice utilization from expert human and machine agents with fMRI and multivariate Granger causality analysis. During an X-ray luggage-screening task, participants accepted or rejected good or bad advice from either the human or machine agent framed as experts with manipulated reliability (high miss rate). We showed that the machine-agent group decreased their advice utilization compared to the human-agent group and these differences in behaviors during advice utilization could be accounted for by high expectations of reliable advice and changes in attention allocation due to miss errors. Brain areas involved with the salience and mentalizing networks, as well as sensory processing involved with attention, were recruited during the task and the advice utilization network consisted of attentional modulation of sensory information with the lingual gyrus as the driver during the decision phase and the fusiform gyrus as the driver during the feedback phase. Our findings expand on the existing literature by showing that misses degrade advice utilization, which is represented in a neural network involving salience detection and self-processing with perceptual integration.

  15. Automating the application of smart materials for protein crystallization.

    PubMed

    Khurshid, Sahir; Govada, Lata; El-Sharif, Hazim F; Reddy, Subrayal M; Chayen, Naomi E

    2015-03-01

    The fabrication and validation of the first semi-liquid nonprotein nucleating agent to be administered automatically to crystallization trials is reported. This research builds upon prior demonstration of the suitability of molecularly imprinted polymers (MIPs; known as `smart materials') for inducing protein crystal growth. Modified MIPs of altered texture suitable for high-throughput trials are demonstrated to improve crystal quality and to increase the probability of success when screening for suitable crystallization conditions. The application of these materials is simple, time-efficient and will provide a potent tool for structural biologists embarking on crystallization trials.

  16. A Virtual Astronomical Research Machine in No Time (VARMiNT)

    NASA Astrophysics Data System (ADS)

    Beaver, John

    2012-05-01

    We present early results of using virtual machine software to help make astronomical research computing accessible to a wider range of individuals. Our Virtual Astronomical Research Machine in No Time (VARMiNT) is an Ubuntu Linux virtual machine with free, open-source software already installed and configured (and in many cases documented). The purpose of VARMiNT is to provide a ready-to-go astronomical research computing environment that can be freely shared between researchers, or between amateur and professional, teacher and student, etc., and to circumvent the often-difficult task of configuring a suitable computing environment from scratch. Thus we hope that VARMiNT will make it easier for individuals to engage in research computing even if they have no ready access to the facilities of a research institution. We describe our current version of VARMiNT and some of the ways it is being used at the University of Wisconsin - Fox Valley, a two-year teaching campus of the University of Wisconsin System, as a means to enhance student independent study research projects and to facilitate collaborations with researchers at other locations. We also outline some future plans and prospects.

  17. LED light design method for high contrast and uniform illumination imaging in machine vision.

    PubMed

    Wu, Xiaojun; Gao, Guangming

    2018-03-01

    In machine vision, illumination is critical in determining the complexity of the inspection algorithms. Proper lighting yields clear, sharp images with high contrast and low noise between the object of interest and the background, which helps the target to be located, measured, or inspected. In contrast to the empirical, trial-and-error convention of selecting off-the-shelf LED lights in machine vision, an optimization algorithm for LED light design is proposed in this paper. It is composed of contrast optimization modeling and a uniform illumination technique for non-normal incidence (UINI). The contrast optimization model is built on surface reflection characteristics, e.g., roughness, refractive index, and light direction, to maximize the contrast between the features of interest and the background. The UINI maintains the uniformity of the lighting optimized by the contrast optimization model. Simulation and experimental results demonstrate that the optimization algorithm is effective and suitable for producing images with high contrast and uniformity, which should be instructive for the design of LED illumination systems in machine vision.

  18. A machine learning system to improve heart failure patient assistance.

    PubMed

    Guidi, Gabriele; Pettenati, Maria Chiara; Melillo, Paolo; Iadanza, Ernesto

    2014-11-01

    In this paper, we present a clinical decision support system (CDSS) for the analysis of heart failure (HF) patients, providing outputs such as an HF severity evaluation and HF-type prediction, as well as a management interface that compares different patients' follow-ups. The whole system is composed of an intelligent core and an HF special-purpose management tool, which also acts as the interface for training and using the artificial intelligence. To implement the intelligent functions, we adopted a machine learning approach. In this paper, we compare the performance of a neural network (NN), a support vector machine, a genetically produced fuzzy-rule system, and a classification and regression tree together with its direct evolution, the random forest, in analyzing our database. The best performance in both the HF severity evaluation and HF-type prediction functions is obtained by using the random forest algorithm. The management tool allows the cardiologist to populate a "supervised database" suitable for machine learning during his or her regular outpatient consultations. The idea comes from the fact that in the literature there are few databases of this type, and they are not scalable to our case.
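
    A minimal comparison in the spirit of the abstract, assuming scikit-learn stand-ins for the classifier families mentioned (the genetically produced fuzzy-rule system is omitted); the data are synthetic placeholders for the heart-failure follow-ups.

        # Compare neural network, SVM, CART and random forest classifiers with
        # cross-validation. Data are synthetic placeholders, not patient records.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        X = np.random.rand(300, 20)          # hypothetical follow-up measurements
        y = np.random.randint(0, 3, 300)     # hypothetical HF-severity classes

        models = {"NN": MLPClassifier(max_iter=1000),
                  "SVM": SVC(),
                  "CART": DecisionTreeClassifier(),
                  "Random forest": RandomForestClassifier(n_estimators=200)}
        for name, model in models.items():
            print(name, cross_val_score(model, X, y, cv=5).mean())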

  19. High productivity mould robotic milling in Al-5083

    NASA Astrophysics Data System (ADS)

    Urresti, Iker; Arrazola, Pedro Jose; Ørskov, Klaus Bonde; Pelegay, Jose Angel

    2018-05-01

    Until very recently, industrial serial robots were mostly limited to welding, handling or spray-painting operations. However, some industries have already recognized their important capabilities in terms of flexibility, working space, adaptability and cost, and robots are now being seriously considered for certain metal machining tasks. Robot-based machining is therefore presented as a cost-saving and flexible manufacturing alternative to conventional CNC machines, especially for roughing or even pre-roughing of large parts. Nevertheless, there are still drawbacks, usually referred to as low rigidity, accuracy and repeatability. As a result, process productivity is often sacrificed, yielding low Material Removal Rates (MRR) and making the approach uncompetitive. In this paper, different techniques to increase productivity are presented, through an appropriate selection of cutting strategies and parameters. Rough milling tests in Al-5083 are reported in which High Feed Milling (HFM) is implemented as a productive cutting strategy, and the experimental modal analysis known as tap-testing is used to choose suitable cutting conditions. Competitive productivity rates are achieved, and process stability is checked through cutting-force measurements in order to prove the effectiveness of experimental modal analysis for robotic machining.

  20. Comparison of four machine learning algorithms for their applicability in satellite-based optical rainfall retrievals

    NASA Astrophysics Data System (ADS)

    Meyer, Hanna; Kühnlein, Meike; Appelhans, Tim; Nauss, Thomas

    2016-03-01

    Machine learning (ML) algorithms have successfully been demonstrated to be valuable tools in satellite-based rainfall retrievals which show the practicability of using ML algorithms when faced with high dimensional and complex data. Moreover, recent developments in parallel computing with ML present new possibilities for training and prediction speed and therefore make their usage in real-time systems feasible. This study compares four ML algorithms - random forests (RF), neural networks (NNET), averaged neural networks (AVNNET) and support vector machines (SVM) - for rainfall area detection and rainfall rate assignment using MSG SEVIRI data over Germany. Satellite-based proxies for cloud top height, cloud top temperature, cloud phase and cloud water path serve as predictor variables. The results indicate an overestimation of rainfall area delineation regardless of the ML algorithm (averaged bias = 1.8) but a high probability of detection ranging from 81% (SVM) to 85% (NNET). On a 24-hour basis, the performance of the rainfall rate assignment yielded R2 values between 0.39 (SVM) and 0.44 (AVNNET). Though the differences in the algorithms' performance were rather small, NNET and AVNNET were identified as the most suitable algorithms. On average, they demonstrated the best performance in rainfall area delineation as well as in rainfall rate assignment. NNET's computational speed is an additional advantage in work with large datasets such as in remote sensing based rainfall retrievals. However, since no single algorithm performed considerably better than the others we conclude that further research in providing suitable predictors for rainfall is of greater necessity than an optimization through the choice of the ML algorithm.
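
    For reference, the verification scores mentioned above (area bias, probability of detection, and R² for rate assignment) can be computed as in the sketch below; the rain masks and rates are synthetic, and the exact score definitions used by the authors may differ slightly.

        # Verification scores for rain-area delineation (frequency bias, POD) and
        # rain-rate assignment (R^2). Illustrative only; values are synthetic.
        import numpy as np

        def bias_and_pod(obs_rain, pred_rain):
            hits   = np.sum(obs_rain & pred_rain)
            misses = np.sum(obs_rain & ~pred_rain)
            false  = np.sum(~obs_rain & pred_rain)
            bias = (hits + false) / (hits + misses)   # >1 means over-estimated rain area
            pod  = hits / (hits + misses)             # probability of detection
            return bias, pod

        def r_squared(obs_rate, pred_rate):
            ss_res = np.sum((obs_rate - pred_rate) ** 2)
            ss_tot = np.sum((obs_rate - obs_rate.mean()) ** 2)
            return 1.0 - ss_res / ss_tot

        obs  = np.random.rand(10000) < 0.2            # hypothetical observed rain mask
        pred = np.random.rand(10000) < 0.3            # hypothetical predicted rain mask
        print(bias_and_pod(obs, pred))
        print(r_squared(np.random.rand(500), np.random.rand(500)))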

  1. Deposition and micro electrical discharge machining of CVD-diamond layers incorporated with silicon

    NASA Astrophysics Data System (ADS)

    Kühn, R.; Berger, T.; Prieske, M.; Börner, R.; Hackert-Oschätzchen, M.; Zeidler, H.; Schubert, A.

    2017-10-01

    In metal forming, lubricants have to be used to prevent corrosion or to reduce friction and tool wear. From an economical and ecological point of view, the aim is to avoid the usage of lubricants. For dry deep drawing of aluminum sheets it is intended to apply locally micro-structured wear-resistant carbon based coatings onto steel tools. One type of these coatings are diamond layers prepared by chemical vapor deposition (CVD). Due to the high strength of diamond, milling processes are unsuitable for micro-structuring of these layers. In contrast to this, micro electrical discharge machining (micro EDM) is a suitable process for micro-structuring CVD-diamond layers. Due to its non-contact nature and its process principle of ablating material by melting and evaporating, it is independent of the hardness, brittleness or toughness of the workpiece material. In this study the deposition and micro electrical discharge machining of silicon incorporated CVD-diamond (Si-CVD-diamond) layers were presented. For this, 10 µm thick layers were deposited on molybdenum plates by a laser-induced plasma CVD process (LaPlas-CVD). For the characterization of the coatings RAMAN- and EDX-analyses were conducted. Experiments in EDM were carried out with a tungsten carbide tool electrode with a diameter of 90 µm to investigate the micro-structuring of Si-CVD-diamond. The impact of voltage, discharge energy and tool polarity on process speed and resulting erosion geometry were analyzed. The results show that micro EDM is a suitable technology for micro-structuring of silicon incorporated CVD-diamond layers.

  2. Measurement-induced operation of two-ion quantum heat machines

    NASA Astrophysics Data System (ADS)

    Chand, Suman; Biswas, Asoka

    2017-03-01

    We show how one can implement a quantum heat machine by using two interacting trapped ions in the presence of a thermal bath. The electronic states of the ions act as the working substance, while the vibrational mode is modelled as the cold bath. The heat exchange with the cold bath is mimicked by projective measurement of the electronic states. We show how such a measurement in a suitable basis can lead to either a quantum heat engine or a refrigerator, each undergoing a quantum Otto cycle. The local magnetic field is changed adiabatically during the heat cycle. The performance of the heat machine depends upon the interaction strength between the ions, the magnetic fields, and the measurement cost. In our model, the coupling to the hot and cold baths is never switched on and off in an alternating fashion during the heat cycle, unlike in other existing proposals for quantum heat engines. This makes our proposal experimentally realizable using current trapped-ion technology.
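
    As a rough numerical illustration of the quantum Otto cycle mentioned above, the sketch below evaluates a generic two-level working substance whose energy gap is switched between two values (units with k_B = 1); it is a textbook simplification, not the measurement-driven two-ion protocol of the paper.

        # Ideal quantum Otto cycle for a single two-level working substance whose energy
        # gap is changed adiabatically between E_h and E_c. Generic textbook sketch only.
        import math

        def excited_population(gap, temperature):
            return 1.0 / (1.0 + math.exp(gap / temperature))  # thermal occupation

        E_h, E_c = 2.0, 1.0          # hypothetical level splittings (hot / cold strokes)
        T_h, T_c = 5.0, 0.5          # hypothetical bath temperatures

        p_h = excited_population(E_h, T_h)    # after contact with the hot bath
        p_c = excited_population(E_c, T_c)    # after contact with the cold bath
        heat_in  = E_h * (p_h - p_c)          # absorbed during the hot isochore
        work_out = (E_h - E_c) * (p_h - p_c)  # net work over the cycle
        print("work:", work_out, "efficiency:", work_out / heat_in, "Otto limit:", 1 - E_c / E_h)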

  3. Measurement-induced operation of two-ion quantum heat machines.

    PubMed

    Chand, Suman; Biswas, Asoka

    2017-03-01

    We show how one can implement a quantum heat machine by using two interacting trapped ions in the presence of a thermal bath. The electronic states of the ions act as the working substance, while the vibrational mode is modelled as the cold bath. The heat exchange with the cold bath is mimicked by projective measurement of the electronic states. We show how such a measurement in a suitable basis can lead to either a quantum heat engine or a refrigerator, each undergoing a quantum Otto cycle. The local magnetic field is changed adiabatically during the heat cycle. The performance of the heat machine depends upon the interaction strength between the ions, the magnetic fields, and the measurement cost. In our model, the coupling to the hot and cold baths is never switched on and off in an alternating fashion during the heat cycle, unlike in other existing proposals for quantum heat engines. This makes our proposal experimentally realizable using current trapped-ion technology.

  4. Effect of casting geometry on mechanical properties of two nickel-base superalloys

    NASA Technical Reports Server (NTRS)

    Johnston, J. R.; Dreshfield, R. L.; Collins, H. E.

    1976-01-01

    An investigation was performed to determine mechanical properties of two rhenium-free modifications of alloy TRW, and to evaluate the suitability of the alloy for use in a small integrally cast turbine rotor. The two alloys were initially developed using stress rupture properties of specimens machined from solid gas turbine blades. Properties in this investigation were determined from cast to size bars and bars cut from 3.8 by 7.6 by 17.8 cm blocks. Specimens machined from blocks had inferior tensile strength and always had markedly poorer rupture lives than cast to size bars. At 1,000 C the cast to size bars had shorter rupture lives than those machined from blades. Alloy R generally had better properties than alloy S in the conditions evaluated. The results show the importance of casting geometry on mechanical properties of nickel base superalloys and suggest that the geometry of a component can be simulated when developing alloys for that component.

  5. Design consideration in constructing high performance embedded Knowledge-Based Systems (KBS)

    NASA Technical Reports Server (NTRS)

    Dalton, Shelly D.; Daley, Philip C.

    1988-01-01

    As the hardware trends for artificial intelligence (AI) involve more and more complexity, the process of optimizing the computer system design for a particular problem will also increase in complexity. Space applications of knowledge based systems (KBS) will often require an ability to perform both numerically intensive vector computations and real time symbolic computations. Although parallel machines can theoretically achieve the speeds necessary for most of these problems, if the application itself is not highly parallel, the machine's power cannot be utilized. A scheme is presented which will provide the computer systems engineer with a tool for analyzing machines with various configurations of array, symbolic, scaler, and multiprocessors. High speed networks and interconnections make customized, distributed, intelligent systems feasible for the application of AI in space. The method presented can be used to optimize such AI system configurations and to make comparisons between existing computer systems. It is an open question whether or not, for a given mission requirement, a suitable computer system design can be constructed for any amount of money.

  6. A video, text, and speech-driven realistic 3-d virtual head for human-machine interface.

    PubMed

    Yu, Jun; Wang, Zeng-Fu

    2015-05-01

    A multiple-input-driven realistic facial animation system based on a 3-D virtual head for human-machine interfaces is proposed. The system can be driven independently by video, text, and speech, and can thus interact with humans through diverse interfaces. A combination of a parameterized model and a muscular model is used to obtain a tradeoff between computational efficiency and high realism of the 3-D facial animation. An online appearance model is used to track 3-D facial motion from video in a particle-filtering framework, and multiple measurements, i.e., the pixel color values of the input image and the Gabor wavelet coefficients of the illumination ratio image, are fused to reduce the influence of lighting and person dependence in the construction of the online appearance model. A tri-phone model is used to reduce the computational cost of visual co-articulation in speech-synchronized viseme synthesis without sacrificing performance. Objective and subjective experiments show that the system is suitable for human-machine interaction.

  7. [Extension of cardiac monitoring function by used of ordinary ECG machine].

    PubMed

    Chen, Zhencheng; Jiang, Yong; Ni, Lili; Wang, Hongyan

    2002-06-01

    This paper describes a portable monitoring system with a liquid crystal display (LCD), built on a readily available ordinary ECG machine; it has low power consumption and is suited to China's specific conditions. In addition to the overall scheme of the system, the paper presents the design of the hardware and software. An 80C196 single-chip microcomputer is used as the central microprocessor, and the real-time electrocardiac signal is processed and analyzed in the system. Along with the functions of an ordinary monitor, the machine offers five types of arrhythmia analysis, alarms, display freezing, and automatic paper recording; it is convenient to carry and can be powered by alternating current (AC) or direct current (DC). The hardware circuit is simplified and the software structure is optimized; multiple low-power design measures and an LCD unit are adopted. Easy to use and low in cost, this portable monitoring system should have a valuable influence on patient monitoring in China.

  8. WARP: Weight Associative Rule Processor. A dedicated VLSI fuzzy logic megacell

    NASA Technical Reports Server (NTRS)

    Pagni, A.; Poluzzi, R.; Rizzotto, G. G.

    1992-01-01

    During the last five years Fuzzy Logic has gained enormous popularity in the academic and industrial worlds. The success of this new methodology has led the microelectronics industry to create a new class of machines, called Fuzzy Machines, to overcome the limitations of traditional computing systems when utilized as Fuzzy Systems. This paper gives an overview of the methods by which Fuzzy Logic data structures are represented in the machines (each with its own advantages and inefficiencies). Next, the paper introduces WARP (Weight Associative Rule Processor) which is a dedicated VLSI megacell allowing the realization of a fuzzy controller suitable for a wide range of applications. WARP represents an innovative approach to VLSI Fuzzy controllers by utilizing different types of data structures for characterizing the membership functions during the various stages of the Fuzzy processing. WARP dedicated architecture has been designed in order to achieve high performance by exploiting the computational advantages offered by the different data representations.

  9. Development of thermal model to analyze thermal flux distribution in thermally enhanced machining of high chrome white cast iron

    NASA Astrophysics Data System (ADS)

    Ravi, A. M.; Murigendrappa, S. M.

    2018-04-01

    In recent times, thermally enhanced machining (TEM) has slowly been gaining ground for cutting hard metals such as high chrome white cast iron (HCWCI), which are impossible to machine by conventional procedures. Setting suitable cutting parameters and positioning the heat source against the work appear to be critical for enhancing the machinability of the work material. In this research, an oxy-LPG flame was used as the heat source and HCWCI as the workpiece. ANSYS-CFD-Flow software was used to develop a transient thermal model to analyze the thermal flux distribution on the work surface during TEM of HCWCI using cubic boron nitride (CBN) tools. A non-contact infrared thermal sensor was used to measure the surface temperature continuously at different positions, and the measurements validate the thermal model results. The results confirm that the thermal model is a reliable predictive tool for thermal flux distribution analysis in the TEM process.

  10. Evaluation of Two New Chromogenic Media, CHROMagar MRSA and S. aureus ID, for Identifying Staphylococcus aureus and Screening Methicillin-Resistant S. aureus

    PubMed Central

    Hedin, Göran; Fang, Hong

    2005-01-01

    Thirty-nine methicillin-resistant Staphylococcus aureus (MRSA) isolates with diverse genetic backgrounds and two reference strains were correctly identified as S. aureus on CHROMagar MRSA and S. aureus ID media. Growth inhibition on CHROMagar MRSA was noted. A combination of cefoxitin disk and S. aureus ID was found suitable for rapid MRSA screening. PMID:16081989

  11. Site Characterization Technologies for DNAPL Investigations

    EPA Pesticide Factsheets

    This document is intended to help managers at sites with potential or confirmed DNAPL contamination identify suitable characterization technologies, screen the technologies for potential application, learn about applications at similar sites, and...

  12. Towards Automated Screening of Two-dimensional Crystals

    PubMed Central

    Cheng, Anchi; Leung, Albert; Fellmann, Denis; Quispe, Joel; Suloway, Christian; Pulokas, James; Carragher, Bridget; Potter, Clinton S.

    2007-01-01

    Screening trials to determine the presence of two-dimensional (2D) protein crystals suitable for three-dimensional structure determination using electron crystallography is a very labor-intensive process. Methods compatible with fully automated screening have been developed for the process of crystal production by dialysis and for producing negatively stained grids of the resulting trials. Further automation via robotic handling of the EM grids, and semi-automated transmission electron microscopic imaging and evaluation of the trial grids is also possible. We, and others, have developed working prototypes for several of these tools and tested and evaluated them in a simple screen of 24 crystallization conditions. While further development of these tools is certainly required for a turn-key system, the goal of fully automated screening appears to be within reach. PMID:17977016

  13. Designing focused chemical libraries enriched in protein-protein interaction inhibitors using machine-learning methods.

    PubMed

    Reynès, Christelle; Host, Hélène; Camproux, Anne-Claude; Laconde, Guillaume; Leroux, Florence; Mazars, Anne; Deprez, Benoit; Fahraeus, Robin; Villoutreix, Bruno O; Sperandio, Olivier

    2010-03-05

    Protein-protein interactions (PPIs) may represent one of the next major classes of therapeutic targets. So far, only a minute fraction of the estimated 650,000 PPIs that comprise the human interactome are known with a tiny number of complexes being drugged. Such intricate biological systems cannot be cost-efficiently tackled using conventional high-throughput screening methods. Rather, time has come for designing new strategies that will maximize the chance for hit identification through a rationalization of the PPI inhibitor chemical space and the design of PPI-focused compound libraries (global or target-specific). Here, we train machine-learning-based models, mainly decision trees, using a dataset of known PPI inhibitors and of regular drugs in order to determine a global physico-chemical profile for putative PPI inhibitors. This statistical analysis unravels two important molecular descriptors for PPI inhibitors characterizing specific molecular shapes and the presence of a privileged number of aromatic bonds. The best model has been transposed into a computer program, PPI-HitProfiler, that can output from any drug-like compound collection a focused chemical library enriched in putative PPI inhibitors. Our PPI inhibitor profiler is challenged on the experimental screening results of 11 different PPIs among which the p53/MDM2 interaction screened within our own CDithem platform, that in addition to the validation of our concept led to the identification of 4 novel p53/MDM2 inhibitors. Collectively, our tool shows a robust behavior on the 11 experimental datasets by correctly profiling 70% of the experimentally identified hits while removing 52% of the inactive compounds from the initial compound collections. We strongly believe that this new tool can be used as a global PPI inhibitor profiler prior to screening assays to reduce the size of the compound collections to be experimentally screened while keeping most of the true PPI inhibitors. PPI-HitProfiler is freely available on request from our CDithem platform website, www.CDithem.com.
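
    A toy illustration of the decision-tree profiling idea, assuming just the two descriptor families highlighted above (a molecular-shape descriptor and the aromatic-bond count); the values and labels are synthetic, and PPI-HitProfiler itself is not reproduced here.

        # Train a small decision tree to flag putative PPI inhibitors from two
        # descriptor families. Descriptor values and labels are synthetic placeholders.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(0)
        shape_descr    = rng.normal(0.0, 1.0, 400)   # hypothetical shape descriptor
        aromatic_bonds = rng.integers(0, 25, 400)    # aromatic bond count
        X = np.column_stack([shape_descr, aromatic_bonds])
        y = rng.integers(0, 2, 400)                  # 1 = known PPI inhibitor, 0 = regular drug

        tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
        print(export_text(tree, feature_names=["shape", "aromatic_bonds"]))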

  14. Designing Focused Chemical Libraries Enriched in Protein-Protein Interaction Inhibitors using Machine-Learning Methods

    PubMed Central

    Reynès, Christelle; Host, Hélène; Camproux, Anne-Claude; Laconde, Guillaume; Leroux, Florence; Mazars, Anne; Deprez, Benoit; Fahraeus, Robin; Villoutreix, Bruno O.; Sperandio, Olivier

    2010-01-01

    Protein-protein interactions (PPIs) may represent one of the next major classes of therapeutic targets. So far, only a minute fraction of the estimated 650,000 PPIs that comprise the human interactome are known with a tiny number of complexes being drugged. Such intricate biological systems cannot be cost-efficiently tackled using conventional high-throughput screening methods. Rather, time has come for designing new strategies that will maximize the chance for hit identification through a rationalization of the PPI inhibitor chemical space and the design of PPI-focused compound libraries (global or target-specific). Here, we train machine-learning-based models, mainly decision trees, using a dataset of known PPI inhibitors and of regular drugs in order to determine a global physico-chemical profile for putative PPI inhibitors. This statistical analysis unravels two important molecular descriptors for PPI inhibitors characterizing specific molecular shapes and the presence of a privileged number of aromatic bonds. The best model has been transposed into a computer program, PPI-HitProfiler, that can output from any drug-like compound collection a focused chemical library enriched in putative PPI inhibitors. Our PPI inhibitor profiler is challenged on the experimental screening results of 11 different PPIs among which the p53/MDM2 interaction screened within our own CDithem platform, that in addition to the validation of our concept led to the identification of 4 novel p53/MDM2 inhibitors. Collectively, our tool shows a robust behavior on the 11 experimental datasets by correctly profiling 70% of the experimentally identified hits while removing 52% of the inactive compounds from the initial compound collections. We strongly believe that this new tool can be used as a global PPI inhibitor profiler prior to screening assays to reduce the size of the compound collections to be experimentally screened while keeping most of the true PPI inhibitors. PPI-HitProfiler is freely available on request from our CDithem platform website, www.CDithem.com. PMID:20221258

  15. Combinatorial support vector machines approach for virtual screening of selective multi-target serotonin reuptake inhibitors from large compound libraries.

    PubMed

    Shi, Z; Ma, X H; Qin, C; Jia, J; Jiang, Y Y; Tan, C Y; Chen, Y Z

    2012-02-01

    Selective multi-target serotonin reuptake inhibitors enhance antidepressant efficacy. Their discovery can be facilitated by multiple methods, including in silico ones. In this study, we developed and tested an in silico method, combinatorial support vector machines (COMBI-SVMs), for virtual screening (VS) multi-target serotonin reuptake inhibitors of seven target pairs (serotonin transporter paired with noradrenaline transporter, H(3) receptor, 5-HT(1A) receptor, 5-HT(1B) receptor, 5-HT(2C) receptor, melanocortin 4 receptor and neurokinin 1 receptor respectively) from large compound libraries. COMBI-SVMs trained with 917-1951 individual target inhibitors correctly identified 22-83.3% (majority >31.1%) of the 6-216 dual inhibitors collected from literature as independent testing sets. COMBI-SVMs showed moderate to good target selectivity in misclassifying as dual inhibitors 2.2-29.8% (majority <15.4%) of the individual target inhibitors of the same target pair and 0.58-7.1% of the other 6 targets outside the target pair. COMBI-SVMs showed low dual inhibitor false hit rates (0.006-0.056%, 0.042-0.21%, 0.2-4%) in screening 17 million PubChem compounds, 168,000 MDDR compounds, and 7-8181 MDDR compounds similar to the dual inhibitors. Compared with similarity searching, k-NN and PNN methods, COMBI-SVM produced comparable dual inhibitor yields, similar target selectivity, and lower false hit rate in screening 168,000 MDDR compounds. The annotated classes of many COMBI-SVMs identified MDDR virtual hits correlate with the reported effects of their predicted targets. COMBI-SVM is potentially useful for searching selective multi-target agents without explicit knowledge of these agents. Copyright © 2011 Elsevier Inc. All rights reserved.
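
    The combinatorial SVM idea can be sketched as one SVM per individual target, with dual-inhibitor hits taken as the compounds predicted active by both models; the fingerprints and labels below are random placeholders, not the serotonin-transporter or paired-target data sets used in the study.

        # Combinatorial SVM sketch: train one SVM per target, then keep only compounds
        # predicted active by both models as putative dual inhibitors. Synthetic data.
        import numpy as np
        from sklearn.svm import SVC

        X_sert  = np.random.rand(800, 100); y_sert  = np.random.randint(0, 2, 800)  # target 1 set
        X_other = np.random.rand(800, 100); y_other = np.random.randint(0, 2, 800)  # target 2 set

        svm_sert  = SVC().fit(X_sert, y_sert)
        svm_other = SVC().fit(X_other, y_other)

        library = np.random.rand(5000, 100)              # compound library to screen
        dual_hits = (svm_sert.predict(library) == 1) & (svm_other.predict(library) == 1)
        print("putative dual inhibitors:", int(dual_hits.sum()))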

  16. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation

    PubMed Central

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision made with imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic preference relation is integrated into the AHP to handle the imprecise and vague information and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is presented through a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment. PMID:26368541
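
    For orientation, the crisp (non-fuzzy) COPRAS ranking step that follows the AHP weighting can be sketched as below with a made-up decision matrix; the paper itself works with fuzzy numbers and a closeness coefficient, so this is only the underlying idea.

        # Crisp COPRAS ranking: rows are candidate machine tools, columns are criteria.
        # Decision matrix, weights and criterion types are invented for illustration.
        import numpy as np

        X = np.array([[3000.0, 0.8, 120.0],       # hypothetical: cost, accuracy, power
                      [2500.0, 0.6, 100.0],
                      [3500.0, 0.9, 150.0]])
        weights = np.array([0.5, 0.3, 0.2])       # e.g. obtained from the AHP stage
        benefit = np.array([False, True, True])   # first criterion is a cost criterion

        D = X / X.sum(axis=0) * weights           # weighted normalized matrix
        S_plus  = D[:, benefit].sum(axis=1)       # sums over benefit criteria
        S_minus = D[:, ~benefit].sum(axis=1)      # sums over cost criteria
        Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
        utility = Q / Q.max() * 100.0             # utility degree of each alternative (%)
        print(utility.round(1), "best alternative index:", int(utility.argmax()))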

  17. Review of Cuttability Indices and A New Rockmass Classification Approach for Selection of Surface Miners

    NASA Astrophysics Data System (ADS)

    Dey, Kaushik; Ghose, A. K.

    2011-09-01

    Rock excavation is carried out either by drilling and blasting or by using rock-cutting machines such as rippers, bucket wheel excavators, surface miners and road headers. The economics of mechanised rock excavation by rock-cutting machines largely depends on the achieved production rates. Thus, assessment of performance (productivity) is important prior to deploying a rock-cutting machine. To this end, several researchers have classified rockmass in different ways and have developed cuttability indices that correlate directly with machine performance. However, most of these indices were developed to assess the performance of road headers and tunnel-boring machines, apart from a few developed earlier when the ripper was a popular excavating machine. Presently, around 400 surface miners are in operation worldwide, of which 105 are in India. Until now, no rockmass classification system has been available to assess the performance of surface miners, which are deployed largely on a trial-and-error basis or based on the performance charts provided by the manufacturer. In this context, it is logical to establish a suitable cuttability index to predict the performance of surface miners. In this paper, the existing cuttability indices are reviewed and a new cuttability index is proposed. A new relationship is also developed to predict the output of surface miners using the proposed cuttability index.

  18. Development of a Crush and Mix Machine for Composite Brick Fabrication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sothea, Kruy; Fazli, Nik; Hamdi, M.

    2011-01-17

    People are increasingly concerned about environmental protection. Municipal solid wastes (MSW) have harmful effects on the environment and on human health. In addition, the amount of municipal solid waste is increasing with economic development and population density, especially in developing countries, and only a small percentage is recycled. To address this problem, a composite brick forming machine was designed and developed to make bricks from a combination of MSW and mortar. The machine consists of two independent parts: a crusher and mixer part, and a molding part. This paper explores the design of the crusher and mixer part. The crusher is able to cut MSW such as wood, paper and plastic into small pieces. There are two mixers; one is used for making mortar and the other for making slurry. FEA analyses were carried out to verify that the critical parts of the crusher have suitable strength, ensuring that the crusher can run properly with high efficiency. Experiments with the crusher show that it has high performance in cutting MSW, and the mixers also work with high efficiency. The results of composite brick testing show that the machine performs well. This is an innovative crush and mix machine which is portable and economical, using MSW as a replacement for sand.

  19. Transferability of glass lens molding

    NASA Astrophysics Data System (ADS)

    Katsuki, Masahide

    2006-02-01

    Spherical lenses have been used for a long time, but it is well known that they inherently suffer from spherical aberration, coma and other aberrations, so aspheric lenses have recently attracted attention. Plastic lenses are easily molded with injection machines at relatively low cost and are suitable for mass production. Glass lenses, on the other hand, have several excellent features such as a high refractive index and heat resistance. Many aspheric glass lenses have come to be used in the latest digital cameras and mobile phone camera modules. It is very difficult to produce aspheric glass lenses by the conventional process of curve generating and polishing. To solve this problem, glass molding machines were developed and are spreading through the market. A high-precision mold is necessary to mold glass lenses with a glass molding machine. The mold core is ground or turned by a high-precision NC aspheric generator. To obtain higher transferability from the mold core, the functions of the molding machine and the molding conditions are very important. However, because of the high molding temperature, thermal expansion and contraction of the mold and glass material are factors that are hard to avoid. In this session, the following items are presented: [1] the technology of glass molding and the molding machine; [2] an analysis of the transferability of glass molding, with data from molded glass lenses; [3] a discussion of compensation of molding shape error, with examples.

  20. Development of a Crush and Mix Machine for Composite Brick Fabrication

    NASA Astrophysics Data System (ADS)

    Sothea, Kruy; Fazli, Nik; Hamdi, M.; Aoyama, Hideki

    2011-01-01

    People are increasingly concerned about environmental protection. Municipal solid wastes (MSW) have harmful effects on the environment and on human health. In addition, the amount of municipal solid waste is increasing with economic development and population density, especially in developing countries, and only a small percentage is recycled. To address this problem, a composite brick forming machine was designed and developed to make bricks from a combination of MSW and mortar. The machine consists of two independent parts: a crusher and mixer part, and a molding part. This paper explores the design of the crusher and mixer part. The crusher is able to cut MSW such as wood, paper and plastic into small pieces. There are two mixers; one is used for making mortar and the other for making slurry. FEA analyses were carried out to verify that the critical parts of the crusher have suitable strength, ensuring that the crusher can run properly with high efficiency. Experiments with the crusher show that it has high performance in cutting MSW, and the mixers also work with high efficiency. The results of composite brick testing show that the machine performs well. This is an innovative crush and mix machine which is portable and economical, using MSW as a replacement for sand.

  1. Stereoscopic optical viewing system

    DOEpatents

    Tallman, C.S.

    1986-05-02

    An improved optical system which provides the operator with a stereoscopic viewing field and depth of vision, particularly suitable for use in various machines such as electron or laser beam welding and drilling machines. The system features two separate but independently controlled optical viewing assemblies from the eyepiece to a spot directly above the working surface. Each optical assembly comprises a combination of eyepieces, turning prisms, telephoto lenses for providing magnification, achromatic imaging relay lenses and final stage pentagonal turning prisms. Adjustment for variations in distance from the turning prisms to the workpiece, necessitated by varying part sizes and configurations and by the operator's visual acuity, is provided separately for each optical assembly by means of separate manual controls at the operator console or within easy reach of the operator.

  2. Spectral and spatial characterisation of laser-driven positron beams

    DOE PAGES

    Sarri, G.; Warwick, J.; Schumaker, W.; ...

    2016-10-18

    The generation of high-quality relativistic positron beams is a central area of research in experimental physics, due to their potential relevance in a wide range of scientific and engineering areas, ranging from fundamental science to practical applications. There is now growing interest in developing hybrid machines that will combine plasma-based acceleration techniques with more conventional radio-frequency accelerators, in order to minimise the size and cost of these machines. Here we report on recent experiments on laser-driven generation of high-quality positron beams using a relatively low energy and potentially table-top laser system. Lastly, the results obtained indicate that current technology allows the creation, in a compact setup, of positron beams suitable for injection into radio-frequency accelerators.

  3. Development Of Knowledge Systems For Trouble Shooting Complex Production Machinery

    NASA Astrophysics Data System (ADS)

    Sanford, Richard L.; Novak, Thomas; Meigs, James R.

    1987-05-01

    This paper discusses the use of knowledge base system software for microcomputers to aid repairmen in diagnosing electrical failures in complex mining machinery. The knowledge base is constructed to allow the user to input initial symptoms of the failed machine, and the most probable cause of failure is traced through the knowledge base, with the software requesting additional information such as voltage or resistance measurements as needed. Although the case study presented is for an underground mining machine, results have application to any industry using complex machinery. Two commercial expert-system development tools (M1™ and Insight 2+™) and an AI language (Turbo Prolog™) are discussed with emphasis on ease of application and suitability for this study.

  4. Espresso coffee foam delays cooling of the liquid phase.

    PubMed

    Arii, Yasuhiro; Nishizawa, Kaho

    2017-04-01

    Espresso coffee foam, called crema, is known to be a marker of the quality of espresso coffee extraction. However, the role of foam in coffee temperature has not been quantitatively clarified. In this study, we used an automatic machine for espresso coffee extraction. We evaluated whether the foam prepared using the machine was suitable for foam analysis. After extraction, the percentage and consistency of the foam were measured using various techniques, and changes in the foam volume were tracked over time. Our extraction method, therefore, allowed consistent preparation of high-quality foam. We also quantitatively determined that the foam phase slowed cooling of the liquid phase after extraction. High-quality foam plays an important role in delaying the cooling of espresso coffee.

  5. Stereoscopic optical viewing system

    DOEpatents

    Tallman, Clifford S.

    1987-01-01

    An improved optical system which provides the operator with a stereoscopic viewing field and depth of vision, particularly suitable for use in various machines such as electron or laser beam welding and drilling machines. The system features two separate but independently controlled optical viewing assemblies from the eyepiece to a spot directly above the working surface. Each optical assembly comprises a combination of eyepieces, turning prisms, telephoto lenses for providing magnification, achromatic imaging relay lenses and final-stage pentagonal turning prisms. Adjustment for variations in distance from the turning prisms to the workpiece, necessitated by varying part sizes and configurations and by the operator's visual acuity, is provided separately for each optical assembly by means of separate manual controls at the operator console or within easy reach of the operator.

  6. The performance of disk arrays in shared-memory database machines

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.; Hong, Wei

    1993-01-01

    In this paper, we examine how disk arrays and shared memory multiprocessors lead to an effective method for constructing database machines for general-purpose complex query processing. We show that disk arrays can lead to cost-effective storage systems if they are configured from suitably small form-factor disk drives. We introduce the storage system metric data temperature as a way to evaluate how well a disk configuration can sustain its workload, and we show that disk arrays can sustain the same data temperature as a more expensive mirrored-disk configuration. We use the metric to evaluate the performance of disk arrays in XPRS, an operational shared-memory multiprocessor database system being developed at the University of California, Berkeley.
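
    As a rough illustration of the data temperature metric mentioned above, the sketch below treats data temperature as sustainable I/O accesses per second divided by stored capacity in gigabytes; this interpretation and all drive figures are illustrative assumptions, not values from the paper.

        # Hypothetical "data temperature" comparison (all numbers invented for illustration).
        def data_temperature(ios_per_second, capacity_gb):
            """Sustainable accesses per second per gigabyte of stored data."""
            return ios_per_second / capacity_gb

        # A mirrored pair of large drives: 1 GB usable, both spindles can serve reads.
        mirrored_pair = data_temperature(ios_per_second=2 * 60, capacity_gb=1.0)
        # An array of eight small form-factor drives: more spindles per gigabyte stored.
        small_drive_array = data_temperature(ios_per_second=8 * 40, capacity_gb=8 * 0.3)

        print(f"mirrored pair: {mirrored_pair:.0f} IO/s per GB")
        print(f"disk array:    {small_drive_array:.0f} IO/s per GB")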

  7. Cloud Detection of Optical Satellite Images Using Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Lee, Kuan-Yi; Lin, Chao-Hung

    2016-06-01

    Cloud cover is generally present in optical remote-sensing images, which limits the usage of acquired images and increases the difficulty of data analysis, such as image compositing, correction of atmospheric effects, calculation of vegetation indices, land cover classification, and land cover change detection. In previous studies, thresholding has been a common and useful method for cloud detection. However, a selected threshold is usually suitable only for certain cases or local study areas, and it may fail in other cases. In other words, thresholding-based methods are data-sensitive. Moreover, there are many exceptions to handle, and the environment changes dynamically, so using the same threshold value on various data is not effective. In this study, a threshold-free method based on Support Vector Machine (SVM) is proposed, which avoids the abovementioned problems. The main idea of this study is to adopt a statistical model to detect clouds instead of a subjective thresholding-based method. The features used in a classifier are the key to a successful classification. Therefore, the Automatic Cloud Cover Assessment (ACCA) algorithm, which is based on the physical characteristics of clouds, is used to distinguish clouds from other objects. Similarly, the Fmask algorithm (Zhu et al., 2012) uses many thresholds and criteria to screen clouds, cloud shadows, and snow; the feature extraction is therefore based on both the ACCA algorithm and Fmask. Spatial and temporal information are also important for satellite images; consequently, the co-occurrence matrix and the temporal variance with uniformity of the major principal axis are used in the proposed method. We aim to classify images into three groups: cloud, non-cloud, and others. In the experiments, images acquired by the Landsat 7 Enhanced Thematic Mapper Plus (ETM+), containing agricultural landscapes, snow areas, and islands, are tested. Experimental results demonstrate that the detection accuracy of the proposed method is better than that of related methods.
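
    As a minimal sketch of the threshold-free idea described above, the code below trains a support vector machine on per-pixel feature vectors and assigns each pixel to one of three classes (cloud, non-cloud, other). The synthetic features and labelling rule are stand-ins for the ACCA/Fmask-derived, texture and temporal descriptors, not the study's actual data.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)

        # Synthetic per-pixel features standing in for ACCA/Fmask-style band ratios,
        # co-occurrence texture and temporal-variance descriptors (assumption).
        n_pixels, n_features = 3000, 6
        X = rng.normal(size=(n_pixels, n_features))
        # Labels: 0 = cloud, 1 = non-cloud, 2 = other (synthetic rule, illustration only).
        y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int) + 2 * (X[:, 2] < -1.2).astype(int)
        y = np.clip(y, 0, 2)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

        # An RBF-kernel SVM replaces fixed, hand-tuned thresholds with a learned decision boundary.
        clf = SVC(kernel="rbf", C=10.0, gamma="scale")
        clf.fit(X_train, y_train)
        print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))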

  8. The Lick-Gaertner automatic measuring system

    NASA Technical Reports Server (NTRS)

    Vasilevskis, S.; Popov, W. A.

    1971-01-01

    The Lick-Gaertner automatic equipment has been designed mainly for the measurement of stellar proper motions with reference to galaxies, and consists of two main components: the survey machine and the automatic measuring engine. The survey machine is used for initial inspection and selection of objects for subsequent measurement. Two plates, up to 17 x 17 inches each, are surveyed simultaneously by means of projection on a screen. The approximate positions of selected objects are measured by two optical screws: helical lines cut through an aluminum coating on glass cylinders. These approximate coordinates, with a precision of the order of 0.03 mm, are transmitted to a card punch by encoders connected to the cylinders.

  9. Molecular graph convolutions: moving beyond fingerprints

    NASA Astrophysics Data System (ADS)

    Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick

    2016-08-01

    Molecular "fingerprints" encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement.

  10. Digital imaging biomarkers feed machine learning for melanoma screening.

    PubMed

    Gareau, Daniel S; Correa da Rosa, Joel; Yagerman, Sarah; Carucci, John A; Gulati, Nicholas; Hueto, Ferran; DeFazio, Jennifer L; Suárez-Fariñas, Mayte; Marghoob, Ashfaq; Krueger, James G

    2017-07-01

    We developed an automated approach for generating quantitative image analysis metrics (imaging biomarkers) that are then analysed with a set of 13 machine learning algorithms to generate an overall risk score that is called a Q-score. These methods were applied to a set of 120 "difficult" dermoscopy images of dysplastic nevi and melanomas that were subsequently excised/classified. This approach yielded 98% sensitivity and 36% specificity for melanoma detection, approaching sensitivity/specificity of expert lesion evaluation. Importantly, we found strong spectral dependence of many imaging biomarkers in blue or red colour channels, suggesting the need to optimize spectral evaluation of pigmented lesions. © 2016 The Authors. Experimental Dermatology Published by John Wiley & Sons Ltd.
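
    The overall risk score described above can be read as an aggregation of several classifiers' outputs into a single number. The sketch below averages the predicted melanoma probabilities of three scikit-learn models trained on synthetic "imaging biomarker" vectors; the feature values, the particular models and the averaging rule are illustrative assumptions, not the authors' 13-algorithm pipeline.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

        rng = np.random.default_rng(1)

        # Synthetic "imaging biomarker" vectors for benign (0) vs melanoma (1) lesions (assumption).
        X = rng.normal(size=(200, 10))
        y = (X[:, 0] - 0.7 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

        models = [
            LogisticRegression(max_iter=1000),
            RandomForestClassifier(n_estimators=100, random_state=0),
            GradientBoostingClassifier(random_state=0),
        ]
        for m in models:
            m.fit(X, y)

        def risk_score(x_row):
            """Illustrative overall score: mean predicted melanoma probability across models."""
            probs = [m.predict_proba(x_row.reshape(1, -1))[0, 1] for m in models]
            return float(np.mean(probs))

        print("risk score for first lesion:", round(risk_score(X[0]), 3))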

  11. Amazing structure of respirasome: unveiling the secrets of cell respiration.

    PubMed

    Guo, Runyu; Gu, Jinke; Wu, Meng; Yang, Maojun

    2016-12-01

    The respirasome, a huge molecular machine that carries out cellular respiration, has gained growing attention since its discovery, because respiration is the most indispensable biological process in almost all living creatures. The concept of the respirasome has renewed our understanding of respiratory chain organization, and most recently, the structure of the respirasome solved by Yang's group from Tsinghua University (Gu et al. Nature 537(7622):639-643, 2016) presented for the first time the detailed interactions within this huge molecular machine and provided important information for drug design and screening. The study of cellular respiration, however, has a long history. Here, we briefly recount the winding history of respiratory chain investigation and then describe the amazing structure of the respirasome.

  12. Molecular graph convolutions: moving beyond fingerprints.

    PubMed

    Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick

    2016-08-01

    Molecular "fingerprints" encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph-atoms, bonds, distances, etc.-which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement.

  13. RE-Powering’s Electronic Decision Tree

    EPA Pesticide Factsheets

    Developed by US EPA's RE-Powering America's Land Initiative, the RE-Powering Decision Trees tool guides interested parties through a process to screen sites for their suitability for solar photovoltaic or wind installations.

  14. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
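
    To make the cause-effect structure concrete, here is a hand-rolled sketch of a tiny Bayesian network in which team skill, process maturity and problem complexity drive a binary "suitable" outcome. The network structure and every probability value are invented for illustration and are not the model developed in the paper.

        from itertools import product

        # Prior probabilities for the driving factors (all values invented).
        p_skill_high = 0.6
        p_maturity_high = 0.5
        p_complexity_high = 0.4

        # P(suitable | skill, maturity, complexity): a made-up conditional probability table.
        p_suitable = {
            (True, True, True): 0.80, (True, True, False): 0.95,
            (True, False, True): 0.60, (True, False, False): 0.85,
            (False, True, True): 0.40, (False, True, False): 0.70,
            (False, False, True): 0.15, (False, False, False): 0.45,
        }

        def prob_suitable():
            """Marginal P(suitable) obtained by summing over all parent configurations."""
            total = 0.0
            for skill, maturity, complexity in product([True, False], repeat=3):
                p_parents = ((p_skill_high if skill else 1 - p_skill_high)
                             * (p_maturity_high if maturity else 1 - p_maturity_high)
                             * (p_complexity_high if complexity else 1 - p_complexity_high))
                total += p_parents * p_suitable[(skill, maturity, complexity)]
            return total

        print("P(software suitable) =", round(prob_suitable(), 3))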

  15. Co-Located Collaborative Learning Video Game with Single Display Groupware

    ERIC Educational Resources Information Center

    Infante, Cristian; Weitz, Juan; Reyes, Tomas; Nussbaum, Miguel; Gomez, Florencia; Radovic, Darinka

    2010-01-01

    Role Game is a co-located CSCL video game played by three students sitting at one machine sharing a single screen, each with their own input device. Inspired by video console games, Role Game enables students to learn by doing, acquiring social abilities and mastering subject matter in a context of co-located collaboration. After describing the…

  16. Level 2 Screening with the PDD Behavior Inventory: Subgroup Profiles and Implications for Differential Diagnosis

    ERIC Educational Resources Information Center

    Cohen, Ira L.; Liu, Xudong; Hudson, Melissa; Gillis, Jennifer; Cavalari, Rachel N. S.; Romanczyk, Raymond G.; Karmel, Bernard Z.; Gardner, Judith M.

    2017-01-01

    The PDD Behavior Inventory (PDDBI) has recently been shown, in a large multisite study, to discriminate well between autism spectrum disorder (ASD) and other groups when its scores were examined using a machine learning tool, Classification and Regression Trees (CART). Discrimination was good for toddlers, preschoolers, and school-age children;…

  17. Cutawl Techniques and Silk Screen; Commercial and Advertising Art--Intermediate: 9185.03.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    The course comprises two comprehensive courses totaling 135 hours of classwork. Orientation to commercial and advertising art is a necessary prerequisite to entry into the course. The first half of the course introduces the student to the function and operation of the cutawl machine. Through supervised classroom practice, the student…

  18. Diamond Smoothing Tools

    NASA Technical Reports Server (NTRS)

    Voronov, Oleg

    2007-01-01

    Diamond smoothing tools have been proposed for use in conjunction with diamond cutting tools that are used in many finish-machining operations. Diamond machining (including finishing) is often used, for example, in fabrication of precise metal mirrors. A diamond smoothing tool according to the proposal would have a smooth spherical surface. For a given finish machining operation, the smoothing tool would be mounted next to the cutting tool. The smoothing tool would slide on the machined surface left behind by the cutting tool, plastically deforming the surface material and thereby reducing the roughness of the surface, closing microcracks and otherwise generally reducing or eliminating microscopic surface and subsurface defects, and increasing the microhardness of the surface layer. It has been estimated that if smoothing tools of this type were used in conjunction with cutting tools on sufficiently precise lathes, it would be possible to reduce the roughness of machined surfaces to as little as 3 nm. A tool according to the proposal would consist of a smoothing insert in a metal holder. The smoothing insert would be made from a diamond/metal functionally graded composite rod preform, which, in turn, would be made by sintering together a bulk single-crystal or polycrystalline diamond, a diamond powder, and a metallic alloy at high pressure. To form the spherical smoothing tip, the diamond end of the preform would be subjected to flat grinding, conical grinding, spherical grinding using diamond wheels, and finally spherical polishing and/or buffing using diamond powders. If the diamond were a single crystal, then it would be crystallographically oriented, relative to the machining motion, to minimize its wear and maximize its hardness. Spherically polished diamonds could also be useful for purposes other than smoothing in finish machining: They would likely also be suitable for use as heat-resistant, wear-resistant, unlubricated sliding-fit bearing inserts.

  19. Three-dimensional tool radius compensation for multi-axis peripheral milling

    NASA Astrophysics Data System (ADS)

    Chen, Youdong; Wang, Tianmiao

    2013-05-01

    Few functions for 3D tool radius compensation are applied to generating executable motion control commands in existing computer numerical control (CNC) systems. Once the tool radius changes, especially when the tool size changes with tool wear during machining, a new NC program has to be recreated. A generic 3D tool radius compensation method for multi-axis peripheral milling in CNC systems is presented. The offset path is calculated by offsetting the tool path along the direction of the offset vector with a given distance. The offset vector is perpendicular to both the tangent vector of the tool path and the orientation vector of the tool axis relative to the workpiece. The orientation vector equations of the tool axis relative to the workpiece are obtained through the homogeneous coordinate transformation matrix and the forward kinematics of a generalized kinematics model of multi-axis machine tools. To avoid cutting into the corner formed by two adjacent tool paths, the coordinates of the offset path at the intersection point are calculated according to the transition type, which is determined by the angle between the two tool-path tangent vectors at the corner. Through verification with the solid cutting simulation software VERICUT® using different tool radii on a table-tilting type five-axis machine tool, and through a real machining experiment of machining a soup spoon on a five-axis machine tool with the developed CNC system, the effectiveness of the proposed 3D tool radius compensation method is confirmed. The proposed compensation method is suitable, in a general form, for all kinds of three- to five-axis machine tools.
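
    The core geometric step described above, offsetting each tool-path point along a vector perpendicular to both the path tangent and the tool-axis orientation, can be sketched in a few lines of NumPy. The sample point, tangent, tool axis and radius below are arbitrary illustration values, not data from the paper.

        import numpy as np

        def offset_point(point, tangent, tool_axis, radius):
            """Offset a tool-path point along the unit vector perpendicular to both
            the path tangent and the tool-axis orientation (illustrative sketch)."""
            v = np.cross(tangent, tool_axis)
            n = np.linalg.norm(v)
            if n < 1e-12:
                raise ValueError("tangent and tool axis are parallel; offset direction undefined")
            return point + radius * v / n

        # Arbitrary example: one path point, its tangent, the tool axis and a 5 mm radius.
        p = np.array([10.0, 0.0, 0.0])
        t = np.array([1.0, 0.0, 0.0])    # tool-path tangent
        a = np.array([0.0, 0.0, 1.0])    # tool-axis orientation relative to the workpiece
        print(offset_point(p, t, a, radius=5.0))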

  20. Intelligent, Energy Saving Power Supply and Control System of Hoisting Mine Machine with Compact and Hybrid Drive System / Inteligentne, Energooszczędne Układy Zasilania I Sterowania Górniczych Maszyn Wyciągowych Z Napędem Zintegrowanym Lub Hybrydowym

    NASA Astrophysics Data System (ADS)

    Szymański, Zygmunt

    2015-03-01

    This paper analyses the suitability of compact and hybrid drive systems for mine hoisting machines. It reviews constructional solutions for hoisting-machine drive systems with AC and DC motors and presents a concept for a modern, energy-saving hoisting-machine supply system composed of a compact motor, a transistor- or thyristor-converter supply, and an intelligent control system built on multilevel microprocessor controllers. The paper also analyses the suitability of selected artificial-intelligence methods for hoisting-machine control, automation and modern diagnostic systems, limiting the analysis to fuzzy-logic methods, genetic algorithms, and modern second- and third-generation neural networks. These methods enable the realization of complex hoisting-machine control algorithms that ensure energy-saving operating conditions, monitoring of operating parameters, and predictive diagnostics of the hoisting machine's technical state, minimizing the number of failure states. A concept for a hoisting-machine control and diagnostic system based on fuzzy-logic neural-network control is presented, together with selected control algorithms and the results of computer simulations carried out for particular mathematical models of the hoisting machine. The results of the theoretical investigation were partly verified in laboratory and industrial experiments.

  1. Screening for non-alcoholic fatty liver disease in children: do guidelines provide enough guidance?

    PubMed

    Koot, B G P; Nobili, V

    2017-09-01

    Non-alcoholic fatty liver disease (NAFLD) is the most common chronic liver disease in children in the industrialized world. Its high prevalence and important health risks make NAFLD highly suitable for screening. In practice, screening is widely, albeit not consistently, performed. This review examines the recommendations on screening for NAFLD in children. Recommendations on screening were reviewed from major paediatric obesity guidelines and NAFLD guidelines, and a literature overview is provided on open questions and controversies. Screening for NAFLD is advocated in all obesity and most NAFLD guidelines. Guidelines are not uniform in whom to screen, and most do not specify how screening should be performed in practice. Screening for NAFLD remains controversial, owing to the lack of a highly accurate screening tool, limited knowledge to predict the natural course of NAFLD and limited data on its cost effectiveness. Guidelines provide little guidance on how screening should be performed. Screening for NAFLD remains controversial because not all conditions for screening are fully met. Consensus is needed on the optimal use of currently available screening tools. Research should focus on new accurate screening tools, the natural history of NAFLD and the cost effectiveness of different screening strategies in children. © 2017 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of World Obesity Federation.

  2. The MORPHEUS II protein crystallization screen

    PubMed Central

    Gorrec, Fabrice

    2015-01-01

    High-quality macromolecular crystals are a prerequisite for the process of protein structure determination by X-ray diffraction. Unfortunately, the relative yield of diffraction-quality crystals from crystallization experiments is often very low. In this context, innovative crystallization screen formulations are continuously being developed. In the past, MORPHEUS, a screen in which each condition integrates a mix of additives selected from the Protein Data Bank, a cryoprotectant and a buffer system, was developed. Here, MORPHEUS II, a follow-up to the original 96-condition initial screen, is described. Reagents were selected to yield crystals when none might be observed in traditional initial screens. Besides, the screen includes heavy atoms for experimental phasing and small polyols to ensure the cryoprotection of crystals. The suitability of the resulting novel conditions is shown by the crystallization of a broad variety of protein samples and their efficiency is compared with commercially available conditions. PMID:26144227

  3. The MORPHEUS II protein crystallization screen.

    PubMed

    Gorrec, Fabrice

    2015-07-01

    High-quality macromolecular crystals are a prerequisite for the process of protein structure determination by X-ray diffraction. Unfortunately, the relative yield of diffraction-quality crystals from crystallization experiments is often very low. In this context, innovative crystallization screen formulations are continuously being developed. In the past, MORPHEUS, a screen in which each condition integrates a mix of additives selected from the Protein Data Bank, a cryoprotectant and a buffer system, was developed. Here, MORPHEUS II, a follow-up to the original 96-condition initial screen, is described. Reagents were selected to yield crystals when none might be observed in traditional initial screens. Besides, the screen includes heavy atoms for experimental phasing and small polyols to ensure the cryoprotection of crystals. The suitability of the resulting novel conditions is shown by the crystallization of a broad variety of protein samples and their efficiency is compared with commercially available conditions.

  4. Conceptual design of thermal energy storage systems for near term electric utility applications. Volume 1: Screening of concepts

    NASA Technical Reports Server (NTRS)

    Hausz, W.; Berkowitz, B. J.; Hare, R. C.

    1978-01-01

    Over forty thermal energy storage (TES) concepts gathered from the literature and personal contacts were studied for their suitability for the electric utility application of storing energy off-peak for discharge during peak hours. Twelve selections were derived from the concepts for screening; they used as storage media high temperature water (HTW), hot oil, molten salts, and packed beds of solids such as rock. HTW required pressure containment by prestressed cast-iron or concrete vessels, or lined underground cavities. Both steam generation from storage and feedwater heating from storage were studied. Four choices were made for further study during the project. Economic comparison by electric utility standard cost practices and near-term availability (low technical risk) were the principal criteria, but suitability for utility use, conservation potential, and environmental hazards were also considered.

  5. Proton-irradiation technology for high-frequency high-current silicon welding diode manufacturing

    NASA Astrophysics Data System (ADS)

    Lagov, P. B.; Drenin, A. S.; Zinoviev, M. A.

    2017-05-01

    Different proton irradiation regimes were tested to provide more than 20 kHz operating frequency, soft "snap-less" reverse-recovery behavior, and low forward voltage drop and leakage current for a 50 mm diameter 7 kA/400 V welding diode Al/Si/Mo structure. A silicon diode with such parameters is well suited to new-generation high-frequency resistance welding machines for robotic welding.

  6. Titles for Technology: An Annotated Bibliography. Compiled at the 1967 Summer Institute of Technology for Children (Marlton, N.J.)

    ERIC Educational Resources Information Center

    New Jersey State Dept. of Education, Trenton. Div. of Vocational Education.

    This annotated bibliography includes about 400 books which are suitable for use in elementary industrial arts. These books, available in the state library system of New Jersey, are organized under 50 topics such as: (1) Automation, (2) Graphic Arts, (3) Machines, (4) Space Travel, and (5) Tools and Measuring. Most of the citations are children's…

  7. Heart Rate Monitor

    NASA Technical Reports Server (NTRS)

    1984-01-01

    In the mid-1970s, NASA saw a need for a long-term electrocardiographic electrode suitable for use on astronauts. Heart Rate Inc.'s insulated capacitive electrode is constructed of a thin dielectric film applied to a stainless steel surface, originally developed under a grant by Texas Technical University. HRI, Inc. was awarded a NASA license and continued development of a heart rate monitor for use on exercise machines for the physical fitness and medical markets.

  8. Statistical models for the distribution of modulus of elasticity and modulus of rupture in lumber with implications for reliability calculations

    Treesearch

    Steve P. Verrill; Frank C. Owens; David E. Kretschmann; Rubin Shmulsky

    2017-01-01

    It is common practice to assume that a two-parameter Weibull probability distribution is suitable for modeling lumber properties. Verrill and co-workers demonstrated theoretically and empirically that the modulus of rupture (MOR) distribution of visually graded or machine stress rated (MSR) lumber is not distributed as a Weibull. Instead, the tails of the MOR...

  9. The Invasive Species Forecasting System (ISFS): An iRODS-Based, Cloud-Enabled Decision Support System for Invasive Species Habitat Suitability Modeling

    NASA Technical Reports Server (NTRS)

    Gill, Roger; Schnase, John L.

    2012-01-01

    The Invasive Species Forecasting System (ISFS) is an online decision support system that allows users to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of interest, such as a national park, monument, forest, or refuge. Target customers for ISFS are natural resource managers and decision makers who have a need for scientifically valid, model-based predictions of the habitat suitability of plant species of management concern. In a joint project involving NASA and the Maryland Department of Natural Resources, ISFS has been used to model the potential distribution of Wavyleaf Basketgrass in Maryland's Chesapeake Bay Watershed. Maximum entropy techniques are used to generate predictive maps using predictor datasets derived from remotely sensed data and climate simulation outputs. The workflow to run a model is implemented in an iRODS microservice using a custom ISFS file driver that clips and re-projects data to geographic regions of interest, then shells out to perform MaxEnt processing on the input data. When the model completes, all output files and maps from the model run are registered in iRODS and made accessible to the user. The ISFS user interface is a web browser that uses the iRODS PHP client to interact with the ISFS/iRODS server. ISFS is designed to reside in a VMware virtual machine running SLES 11 and iRODS 3.0. The ISFS virtual machine is hosted in a VMware vSphere private cloud infrastructure to deliver the online service.

  10. Deep Learning Accurately Predicts Estrogen Receptor Status in Breast Cancer Metabolomics Data.

    PubMed

    Alakwaa, Fadhl M; Chaudhary, Kumardeep; Garmire, Lana X

    2018-01-05

    Metabolomics holds the promise as a new technology to diagnose highly heterogeneous diseases. Conventionally, metabolomics data analysis for diagnosis is done using various statistical and machine learning based classification methods. However, it remains unknown if deep neural network, a class of increasingly popular machine learning methods, is suitable to classify metabolomics data. Here we use a cohort of 271 breast cancer tissues, 204 positive estrogen receptor (ER+), and 67 negative estrogen receptor (ER-) to test the accuracies of feed-forward networks, a deep learning (DL) framework, as well as six widely used machine learning models, namely random forest (RF), support vector machines (SVM), recursive partitioning and regression trees (RPART), linear discriminant analysis (LDA), prediction analysis for microarrays (PAM), and generalized boosted models (GBM). DL framework has the highest area under the curve (AUC) of 0.93 in classifying ER+/ER- patients, compared to the other six machine learning algorithms. Furthermore, the biological interpretation of the first hidden layer reveals eight commonly enriched significant metabolomics pathways (adjusted P-value <0.05) that cannot be discovered by other machine learning methods. Among them, protein digestion and absorption and ATP-binding cassette (ABC) transporters pathways are also confirmed in integrated analysis between metabolomics and gene expression data in these samples. In summary, deep learning method shows advantages for metabolomics based breast cancer ER status classification, with both the highest prediction accuracy (AUC = 0.93) and better revelation of disease biology. We encourage the adoption of feed-forward networks based deep learning method in the metabolomics research community for classification.
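
    As a minimal sketch of the comparison described above, the code below evaluates a feed-forward neural network against a random forest by cross-validated AUC on synthetic data standing in for metabolite intensities; the sample size, feature count and labelling rule are assumptions, not the study's cohort.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Synthetic stand-in for metabolomics data: 271 samples x 50 metabolites (assumption).
        X = rng.normal(size=(271, 50))
        y = (X[:, :5].sum(axis=1) + rng.normal(scale=1.0, size=271) > 0).astype(int)  # ER+ vs ER-

        models = {
            "feed-forward net": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0),
            "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
        }
        for name, model in models.items():
            auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
            print(f"{name}: mean cross-validated AUC = {auc:.2f}")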

  11. Quantitative structure-activity relationship analysis and virtual screening studies for identifying HDAC2 inhibitors from known HDAC bioactive chemical libraries.

    PubMed

    Pham-The, H; Casañola-Martin, G; Diéguez-Santana, K; Nguyen-Hai, N; Ngoc, N T; Vu-Duc, L; Le-Thi-Thu, H

    2017-03-01

    Histone deacetylases (HDAC) are emerging as promising targets in cancer, neuronal diseases and immune disorders. Computational modelling approaches have been widely applied for the virtual screening and rational design of novel HDAC inhibitors. In this study, different machine learning (ML) techniques were applied for the development of models that accurately discriminate HDAC2 inhibitors from non-inhibitors. The obtained models showed encouraging results, with the global accuracy in the external set ranging from 0.83 to 0.90. Various aspects related to the comparison of modelling techniques, applicability domain and descriptor interpretations were discussed. Finally, consensus predictions of these models were used for screening HDAC2 inhibitors from four chemical libraries whose bioactivities against HDAC1, HDAC3, HDAC6 and HDAC8 are known. According to the results of the virtual screening assays, structures of some hits with pair-isoform-selective activity (between HDAC2 and other HDACs) were revealed. This study illustrates the power of ML-based QSAR approaches for the screening and discovery of potent, isoform-selective HDACIs.

  12. Wear-screening and joint simulation studies vs. materials selection and prosthesis design.

    PubMed

    Clarke, I C

    1982-01-01

    Satisfactory friction and wear performance of orthopaedic biomaterials is an essential criterion for both hemiarthroplasty and total joint replacements. This report will chart the clinical historical experience of candidate biomaterials with their wear resistance and compare/contrast these data to experimental test predictions. The latter review will encompass publications dealing with both joint simulators and the more basic friction and wear screening devices. Special consideration will be given to the adequacy of the test protocol, the design of the experimental machines, and the accuracy of the measurement techniques. The discussion will then center on clinical reality vs. experimental adequacy and summarize current developments.

  13. Machine learning approaches to investigate the impact of PCBs on the transcriptome of the common bottlenose dolphin (Tursiops truncatus).

    PubMed

    Mancia, Annalaura; Ryan, James C; Van Dolah, Frances M; Kucklick, John R; Rowles, Teresa K; Wells, Randall S; Rosel, Patricia E; Hohn, Aleta A; Schwacke, Lori H

    2014-09-01

    As top-level predators, common bottlenose dolphins (Tursiops truncatus) are particularly sensitive to chemical and biological contaminants that accumulate and biomagnify in the marine food chain. This work investigates the potential use of microarray technology and gene expression profile analysis to screen common bottlenose dolphins for exposure to environmental contaminants through the immunological and/or endocrine perturbations associated with these agents. A dolphin microarray representing 24,418 unigene sequences was used to analyze blood samples collected from 47 dolphins during capture-release health assessments from five different US coastal locations (Beaufort, NC, Sarasota Bay, FL, Saint Joseph Bay, FL, Sapelo Island, GA and Brunswick, GA). Organohalogen contaminants including pesticides, polychlorinated biphenyl congeners (PCBs) and polybrominated diphenyl ether congeners were determined in blubber biopsy samples from the same animals. A subset of samples (n = 10, males; n = 8, females) with the highest and the lowest measured values of PCBs in their blubber was used as strata to determine the differential gene expression of the exposure extremes through machine learning classification algorithms. A set of genes associated primarily with nuclear and DNA stability, cell division and apoptosis regulation, intra- and extra-cellular traffic, and immune response activation was selected by the algorithm for identifying the two exposure extremes. In order to test the hypothesis that these gene expression patterns reflect PCB exposure, we next investigated the blood transcriptomes of the remaining dolphin samples using machine-learning approaches, including K-nn and Support Vector Machines classifiers. Using the derived gene sets, the algorithms worked very well (100% success rate) at classifying dolphins according to the contaminant load accumulated in their blubber. These results suggest that gene expression profile analysis may provide a valuable means to screen for indicators of chemical exposure. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Identifying Green Infrastructure from Social Media and Crowdsourcing- An Image Based Machine-Learning Approach.

    NASA Astrophysics Data System (ADS)

    Rai, A.; Minsker, B. S.

    2016-12-01

    In this work we introduce a novel dataset, GRID (GReen Infrastructure Detection Dataset), and a framework for identifying urban green storm water infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties, and cities typically do not know where GI is located, making study of its impacts or siting of new GI difficult. We use object recognition learning methods (template matching, a sliding-window approach, and the Random Hough Forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding-window method outperformed the other methods and achieved an average F-measure (a combined metric of precision and recall) of 0.78.
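
    The sliding-window step mentioned above can be sketched as follows: a fixed-size window is slid across an image array and each patch is scored by a classifier, with windows above a threshold kept as candidate detections. The patch scorer below is a trivial stand-in, not the trained detector from the study, and the image is random noise.

        import numpy as np

        def sliding_window_detect(image, score_fn, window=32, stride=16, threshold=0.51):
            """Slide a square window over a 2-D array and return (row, col, score)
            for every window whose score meets the threshold."""
            hits = []
            rows, cols = image.shape
            for r in range(0, rows - window + 1, stride):
                for c in range(0, cols - window + 1, stride):
                    patch = image[r:r + window, c:c + window]
                    score = score_fn(patch)
                    if score >= threshold:
                        hits.append((r, c, round(float(score), 3)))
            return hits

        # Stand-in scorer: mean intensity of the patch (a real system would use a trained classifier).
        rng = np.random.default_rng(0)
        fake_image = rng.random((128, 128))
        print(sliding_window_detect(fake_image, score_fn=lambda p: p.mean()))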

  15. Screening on oil-decomposing microorganisms and application in organic waste treatment machine.

    PubMed

    Lu, Yi-Tong; Chen, Xiao-Bin; Zhou, Pei; Li, Zhen-Hong

    2005-01-01

    Y3, an oil-decomposing mixture of two bacterial strains (Bacillus sp. and Pseudomonas sp.), was isolated after 50 d of domestication under conditions in which oil was used as the limiting carbon source. The decomposition rate achieved by Y3 was higher than that achieved by either individual strain, indicating a synergistic effect of the two bacteria. Under the conditions T = 25-40 degrees C, pH = 6-8, HRT (hydraulic retention time) = 36 h and an oil concentration of 0.1%, Y3 yielded the highest decomposition rate of 95.7%. Y3 was also applied in an organic waste treatment machine, with a certain proportion of activated bacteria added to the stuffing. A series of tests including humidity, pH, temperature, C/N ratio and oil percentage of the stuffing were carried out to check the efficacy of oil decomposition. Results showed that the oil content of the stuffing with inoculum was only half that of the control. Furthermore, the bacteria also helped maintain stable operation of the machine. Therefore, the bacterial mixture, as well as the machines in this study, could be very useful for waste treatment.

  16. A Machine Learning Framework for Plan Payment Risk Adjustment.

    PubMed

    Rose, Sherri

    2016-12-01

    To introduce cross-validation and a nonparametric machine learning framework for plan payment risk adjustment and then assess whether they have the potential to improve risk adjustment. 2011-2012 Truven MarketScan database. We compare the performance of multiple statistical approaches within a broad machine learning framework for estimation of risk adjustment formulas. Total annual expenditure was predicted using age, sex, geography, inpatient diagnoses, and hierarchical condition category variables. The methods included regression, penalized regression, decision trees, neural networks, and an ensemble super learner, all in concert with screening algorithms that reduce the set of variables considered. The performance of these methods was compared based on cross-validated R². Our results indicate that a simplified risk adjustment formula selected via this nonparametric framework maintains much of the efficiency of a traditional larger formula. The ensemble approach also outperformed classical regression and all other algorithms studied. The implementation of cross-validated machine learning techniques provides novel insight into risk adjustment estimation, possibly allowing for a simplified formula, thereby reducing incentives for increased coding intensity as well as the ability of insurers to "game" the system with aggressive diagnostic upcoding. © Health Research and Educational Trust.
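
    As a minimal sketch of the framework's central comparison, the code below contrasts a classical linear regression with a boosted ensemble on synthetic enrollee records, scoring both by cross-validated R². The variables, their effects on annual expenditure and the models chosen are illustrative assumptions, not the study's formulas or data.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 2000

        # Synthetic enrollee records: age, sex and five condition-category flags (assumption).
        age = rng.integers(18, 90, n)
        sex = rng.integers(0, 2, n)
        conditions = rng.integers(0, 2, (n, 5))
        X = np.column_stack([age, sex, conditions])
        spend = 500 + 40 * age + 3000 * conditions.sum(axis=1) + rng.normal(0, 2000, n)

        for name, model in [("linear regression", LinearRegression()),
                            ("boosted ensemble", GradientBoostingRegressor(random_state=0))]:
            r2 = cross_val_score(model, X, spend, cv=5, scoring="r2").mean()
            print(f"{name}: cross-validated R^2 = {r2:.2f}")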

  17. Learning algorithms for human-machine interfaces.

    PubMed

    Danziger, Zachary; Fishbach, Alon; Mussa-Ivaldi, Ferdinando A

    2009-05-01

    The goal of this study is to create and examine machine learning algorithms that adapt in a controlled and cadenced way to foster a harmonious learning environment between the user and the controlled device. To evaluate these algorithms, we have developed a simple experimental framework. Subjects wear an instrumented data glove that records finger motions. The high-dimensional glove signals remotely control the joint angles of a simulated planar two-link arm on a computer screen, which is used to acquire targets. A machine learning algorithm was applied to adaptively change the transformation between finger motion and the simulated robot arm. This algorithm was either LMS gradient descent or the Moore-Penrose (MP) pseudoinverse transformation. Both algorithms modified the glove-to-joint angle map so as to reduce the endpoint errors measured in past performance. The MP group performed worse than the control group (subjects not exposed to any machine learning), while the LMS group outperformed the control subjects. However, the LMS subjects failed to achieve better generalization than the control subjects, and after extensive training converged to the same level of performance as the control subjects. These results highlight the limitations of coadaptive learning using only endpoint error reduction.
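
    The two map-update rules compared above can be written compactly: a Moore-Penrose pseudoinverse solution computed in one shot versus an iterative LMS gradient-descent update of the glove-to-joint-angle matrix. The dimensions, learning rate and synthetic glove data below are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, n_glove, n_joints = 200, 18, 2

        G = rng.normal(size=(n_samples, n_glove))      # recorded glove signals (synthetic)
        W_true = rng.normal(size=(n_glove, n_joints))
        Theta = G @ W_true + rng.normal(scale=0.05, size=(n_samples, n_joints))  # target joint angles

        # 1) Moore-Penrose pseudoinverse: one-shot least-squares estimate of the map.
        W_mp = np.linalg.pinv(G) @ Theta

        # 2) LMS gradient descent: incremental updates that reduce past endpoint error.
        W_lms = np.zeros((n_glove, n_joints))
        lr = 1e-3
        for _ in range(50):                            # several passes over the recorded data
            for g, theta in zip(G, Theta):
                err = theta - g @ W_lms
                W_lms += lr * np.outer(g, err)

        for name, W in [("pseudoinverse", W_mp), ("LMS", W_lms)]:
            rmse = np.sqrt(np.mean((G @ W - Theta) ** 2))
            print(f"{name}: training RMSE = {rmse:.3f}")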

  18. Learning Algorithms for Human–Machine Interfaces

    PubMed Central

    Fishbach, Alon; Mussa-Ivaldi, Ferdinando A.

    2012-01-01

    The goal of this study is to create and examine machine learning algorithms that adapt in a controlled and cadenced way to foster a harmonious learning environment between the user and the controlled device. To evaluate these algorithms, we have developed a simple experimental framework. Subjects wear an instrumented data glove that records finger motions. The high-dimensional glove signals remotely control the joint angles of a simulated planar two-link arm on a computer screen, which is used to acquire targets. A machine learning algorithm was applied to adaptively change the transformation between finger motion and the simulated robot arm. This algorithm was either LMS gradient descent or the Moore–Penrose (MP) pseudoinverse transformation. Both algorithms modified the glove-to-joint angle map so as to reduce the endpoint errors measured in past performance. The MP group performed worse than the control group (subjects not exposed to any machine learning), while the LMS group outperformed the control subjects. However, the LMS subjects failed to achieve better generalization than the control subjects, and after extensive training converged to the same level of performance as the control subjects. These results highlight the limitations of coadaptive learning using only endpoint error reduction. PMID:19203886

  19. Solubility prediction, solvate and cocrystal screening as tools for rational crystal engineering.

    PubMed

    Loschen, Christoph; Klamt, Andreas

    2015-06-01

    The fact that novel drug candidates are becoming increasingly insoluble is a major problem of current drug development. Computational tools may address this issue by screening for suitable solvents or by identifying potential novel cocrystal formers that increase bioavailability. In contrast to other more specialized methods, the fluid phase thermodynamics approach COSMO-RS (conductor-like screening model for real solvents) allows for a comprehensive treatment of drug solubility, solvate and cocrystal formation and many other thermodynamics properties in liquids. This article gives an overview of recent COSMO-RS developments that are of interest for drug development and contains several new application examples for solubility prediction and solvate/cocrystal screening. For all property predictions COSMO-RS has been used. The basic concept of COSMO-RS consists of using the screening charge density as computed from first principles calculations in combination with fast statistical thermodynamics to compute the chemical potential of a compound in solution. The fast and accurate assessment of drug solubility and the identification of suitable solvents, solvate or cocrystal formers is nowadays possible and may be used to complement modern drug development. Efficiency is increased by avoiding costly quantum-chemical computations using a database of previously computed molecular fragments. COSMO-RS theory can be applied to a range of physico-chemical properties, which are of interest in rational crystal engineering. Most notably, in combination with experimental reference data, accurate quantitative solubility predictions in any solvent or solvent mixture are possible. Additionally, COSMO-RS can be extended to the prediction of cocrystal formation, which results in considerable predictive accuracy concerning coformer screening. In a recent variant costly quantum chemical calculations are avoided resulting in a significant speed-up and ease-of-use. © 2015 Royal Pharmaceutical Society.

  20. Identification of an Antimicrobial Agent Effective against Methicillin-Resistant Staphylococcus aureus Persisters Using a Fluorescence-Based Screening Strategy

    PubMed Central

    Kim, Wooseong; Conery, Annie L.; Rajamuthiah, Rajmohan; Fuchs, Beth Burgwyn; Ausubel, Frederick M.; Mylonakis, Eleftherios

    2015-01-01

    Persisters are a subpopulation of normal bacterial cells that show tolerance to conventional antibiotics. Persister cells are responsible for recalcitrant chronic infections and new antibiotics effective against persisters would be a major development in the treatment of these infections. Using the reporter dye SYTOX Green that only stains cells with permeabilized membranes, we developed a fluorescence-based screening assay in a 384-well format for identifying compounds that can kill methicillin-resistant Staphylococcus aureus (MRSA) persisters. The assay proved robust and suitable for high throughput screening (Z′-factor: >0.7). In screening a library of hits from a previous screen, which identified compounds that had the ability to block killing of the nematode Caenorhabditis by MRSA, we discovered that the low molecular weight compound NH125, a bacterial histidine kinase inhibitor, kills MRSA persisters by causing cell membrane permeabilization, and that 5 μg/mL of the compound can kill all cells to the limit of detection in a 10^8 CFU/mL culture of MRSA persisters within 3 h. Furthermore, NH125 disrupts 50% of established MRSA biofilms at 20 μg/mL and completely eradicates biofilms at 160 μg/mL. Our results suggest that the SYTOX Green screening assay is suitable for large-scale projects to identify small molecules effective against MRSA persisters and should be easily adaptable to a broad range of pathogens that form persisters. Since NH125 has strong bactericidal properties against MRSA persisters and high selectivity to bacteria, we believe NH125 is a good anti-MRSA candidate drug that should be further evaluated. PMID:26039584
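
    The assay-quality statistic quoted above, the Z′-factor, has a standard definition based on the means and standard deviations of the positive and negative controls. The sketch below computes it for invented fluorescence readings; the control values are not data from the screen.

        import numpy as np

        def z_prime_factor(positive_controls, negative_controls):
            """Z' = 1 - 3*(sigma_pos + sigma_neg) / |mu_pos - mu_neg| (standard definition)."""
            pos = np.asarray(positive_controls, dtype=float)
            neg = np.asarray(negative_controls, dtype=float)
            return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

        # Invented SYTOX Green-style readings for killed (positive) and untreated (negative) wells.
        rng = np.random.default_rng(0)
        positives = rng.normal(loc=10000, scale=600, size=32)
        negatives = rng.normal(loc=1500, scale=300, size=32)
        print("Z' =", round(z_prime_factor(positives, negatives), 2))   # > 0.5 indicates a robust assay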

  1. Unintended transplantation of three organs from an HIV-positive donor: report of the analysis of an adverse event in a regional health care service in Italy.

    PubMed

    Bellandi, T; Albolino, S; Tartaglia, R; Filipponi, F

    2010-01-01

    In February 2007, three organs from a human immunodeficiency virus (HIV)-positive donor were transplanted at two hospitals in the Tuscany Regional Health Care Service, owing to a chain of errors during the donation process. The heart-beating donor was a 41-year-old woman who died as a result of head trauma. The patient's history did not highlight any risky behavior. The available data on previous hospital admissions reported a negative result on HIV testing. During the donation process, the result of the lab test performed for evaluation of organ suitability was mistakenly transcribed from positive to negative. This wrong negative result was then included in the donation record without any cross-check. Therefore, the Regional Transplant Center allocated the liver and both kidneys. The patient also donated tissues, and a second laboratory conducted an evaluation of suitability for the tissue banks. During this process, only 5 days after the successful transplantation procedures, the positive HIV result was fed back to the Regional Transplant Center and the previous error discovered. The transplanted patients were immediately assessed and then treated with antiretroviral medications. A national commission soon performed a systems analysis of the adverse event. In addition to the active error committed during the manual transcription of the HIV lab test result, the commission also identified technological factors, such as the lack of integration between the lab machine, the laboratory information system (LIS), and the donor record, as well as organizational factors, such as the distribution to two different labs of the suitability evaluation for organs and tissues. Recommendations included: automatic transmission of lab test results from the lab machine to the LIS and to the donor record, centralization of lab tests for suitability evaluation of organs and tissues, and a training program to develop a proactive quality and safety culture in the regional network of donation and transplantation. Copyright 2010. Published by Elsevier Inc.

  2. [Prenatal risk calculation: comparison between Fast Screen pre I plus software and ViewPoint software. Evaluation of the risk calculation algorithms].

    PubMed

    Morin, Jean-François; Botton, Eléonore; Jacquemard, François; Richard-Gireme, Anouk

    2013-01-01

    The Fetal Medicine Foundation (FMF) has developed a new algorithm called Prenatal Risk Calculation (PRC) to evaluate Down syndrome screening based on free hCGβ, PAPP-A and nuchal translucency. The peculiarity of this algorithm is that it uses the degree of extremeness (DoE) instead of the multiple of the median (MoM). Biologists measuring maternal serum markers on Kryptor™ machines (Thermo Fisher Scientific) use the Fast Screen pre I plus software for the prenatal risk calculation. This software integrates the PRC algorithm. Our study evaluates the data of 2,092 patient files, of which 19 show a fœtal abnormality. These files were first evaluated with the ViewPoint software, which is based on MoM. The link between DoE and MoM was analyzed and the different calculated risks were compared. The study shows that the Fast Screen pre I plus software gives the same risk results as the ViewPoint software, but yields significantly fewer false positive results.

  3. High-throughput screening of high Monascus pigment-producing strain based on digital image processing.

    PubMed

    Xia, Meng-lei; Wang, Lan; Yang, Zhi-xia; Chen, Hong-zhang

    2016-04-01

    This work proposes a new method that applies image processing and a support vector machine (SVM) to the screening of mold strains. Taking Monascus as an example, morphological characteristics of Monascus colonies were quantified by image processing, and the association between these characteristics and pigment production capability was determined by SVM. On this basis, a highly automated screening strategy was achieved. The accuracy of the proposed strategy is 80.6%, which is comparable with existing methods (81.1% for microplate and 85.4% for flask). Meanwhile, the screening of 500 colonies takes only 20-30 min, which is the highest rate among all published results. By applying this automated method, 13 strains with high predicted production were obtained, and the best one produced 2.8-fold as much pigment (226 U/mL) and 1.9-fold as much lovastatin (51 mg/L) as the parent strain. The current study provides an effective and promising method for strain improvement.

  4. Machine learning of molecular electronic properties in chemical compound space

    NASA Astrophysics Data System (ADS)

    Montavon, Grégoire; Rupp, Matthias; Gobre, Vivekanand; Vazquez-Mayagoitia, Alvaro; Hansen, Katja; Tkatchenko, Alexandre; Müller, Klaus-Robert; Anatole von Lilienfeld, O.

    2013-09-01

    The combination of modern scientific computing with electronic structure theory can lead to an unprecedented amount of data amenable to intelligent data analysis for the identification of meaningful, novel and predictive structure-property relationships. Such relationships enable high-throughput screening for relevant properties in an exponentially growing pool of virtual compounds that are synthetically accessible. Here, we present a machine learning model, trained on a database of ab initio calculation results for thousands of organic molecules, that simultaneously predicts multiple electronic ground- and excited-state properties. The properties include atomization energy, polarizability, frontier orbital eigenvalues, ionization potential, electron affinity and excitation energies. The machine learning model is based on a deep multi-task artificial neural network, exploiting the underlying correlations between various molecular properties. The input is identical to ab initio methods, i.e. nuclear charges and Cartesian coordinates of all atoms. For small organic molecules, the accuracy of such a ‘quantum machine’ is similar, and sometimes superior, to modern quantum-chemical methods—at negligible computational cost.

  5. The desktop interface in intelligent tutoring systems

    NASA Technical Reports Server (NTRS)

    Baudendistel, Stephen; Hua, Grace

    1987-01-01

    The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.

  6. A state-based approach to trend recognition and failure prediction for the Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Nelson, Kyle S.; Hadden, George D.

    1992-01-01

    A state-based reasoning approach to trend recognition and failure prediction for the Attitude Determination and Control System (ADCS) of the Space Station Freedom (SSF) is described. The problem domain is characterized by features (e.g., trends and impending failures) that develop over a variety of time spans, anywhere from several minutes to several years. Our state-based reasoning approach, coupled with intelligent data screening, allows features to be tracked as they develop in a time-dependent manner. That is, each state machine has the ability to encode a time frame for the feature it detects. As features are detected, they are recorded and can be used as input to other state machines, creating a hierarchical feature recognition scheme. Furthermore, each machine can operate independently of the others, allowing simultaneous tracking of features. State-based reasoning was implemented in the trend recognition and the prognostic modules of a prototype Space Station Freedom Maintenance and Diagnostic System (SSFMDS) developed at Honeywell's Systems and Research Center.

  7. Development of machine learning models to predict inhibition of 3-dehydroquinate dehydratase.

    PubMed

    de Ávila, Maurício Boff; de Azevedo, Walter Filgueira

    2018-04-20

    In this study, we describe the development of new machine learning models to predict inhibition of the enzyme 3-dehydroquinate dehydratase (DHQD). This enzyme catalyses the third step of the shikimate pathway, which is responsible for the synthesis of chorismate, a natural precursor of aromatic amino acids. The enzymes of the shikimate pathway are absent in humans, which makes them attractive protein targets for the design of antimicrobial drugs. We focus our study on the crystallographic structures of DHQD in complex with competitive inhibitors, for which experimental inhibition constant data are available. Application of supervised machine learning techniques was able to generate a robust DHQD-targeted model to predict binding affinity. Combination of high-resolution crystallographic structures and binding information indicates that the prevalence of intermolecular electrostatic interactions between DHQD and competitive inhibitors is of pivotal importance for the binding affinity against this enzyme. The present findings can be used to speed up virtual screening studies focused on the DHQD structure. © 2018 John Wiley & Sons A/S.

  8. Ice Storm Supercomputer

    ScienceCinema

    None

    2018-05-01

    A new Idaho National Laboratory supercomputer is helping scientists create more realistic simulations of nuclear fuel. Dubbed "Ice Storm," this 2048-processor machine allows researchers to model and predict the complex physics behind nuclear reactor behavior. And with a new visualization lab, the team can see the results of its simulations on the big screen. For more information about INL research, visit http://www.facebook.com/idahonationallaboratory.

  9. Graphing Calculators in the Secondary Mathematics Classroom. Monograph #21.

    ERIC Educational Resources Information Center

    Eckert, Paul; And Others

    The objective of this presentation is to focus on the use of a hand-held graphics calculator. The specific machine referred to in this monograph is the Casio fx-7000G, chosen because of its low cost, its large viewing screen, its versatility, and its simple operation. Sections include: (1) "Basic Operations with the Casio fx-7000G"; (2) "Graphical…

  10. Utilizing residues from in-woods flail processing

    Treesearch

    Ronald K. Baughman; Bryce J. Stokes; William F. Watson

    1990-01-01

    A Barkbuster 1100 tub grinder has been employed to process debris discharged by a Manitowoc VFDD-1642. The machine successfully passed the material through a 7.62 cm screen and discharged the reduced debris into a chip van for transport. Fuel production is directly dependent upon the production of clean chips by the flail/chipper portion of the system and the available...

  11. Apple (LCSI) LOGO vs. MIT (Terrapin/Krell) LOGO: A Comparison for Grades 2 thru 4.

    ERIC Educational Resources Information Center

    Wappler, Reinhold D.

    Two LOGO dialects are compared for appropriateness for use with second, third, and fourth grade students on the basis of 18 months of experience with teaching the LOGO programming language at this level in a four-machine laboratory setting. Benefits and drawbacks of the dialects are evaluated in the areas of editing, screen modes, debugging,…

  12. Automated EEG-based screening of depression using deep convolutional neural network.

    PubMed

    Acharya, U Rajendra; Oh, Shu Lih; Hagiwara, Yuki; Tan, Jen Hong; Adeli, Hojjat; Subha, D P

    2018-07-01

    In recent years, advanced neurocomputing and machine learning techniques have been used for Electroencephalogram (EEG)-based diagnosis of various neurological disorders. In this paper, a novel computer model is presented for EEG-based screening of depression using a deep neural network machine learning approach, known as a Convolutional Neural Network (CNN). The proposed technique does not require a semi-manually selected set of features to be fed into a classifier for classification. It learns automatically and adaptively from the input EEG signals to differentiate EEGs obtained from depressive and normal subjects. The model was tested using EEGs obtained from 15 normal and 15 depressed patients. The algorithm attained accuracies of 93.5% and 96.0% using EEG signals from the left and right hemisphere, respectively. It was discovered in this research that the EEG signals from the right hemisphere are more distinctive in depression than those from the left hemisphere. This discovery is consistent with recent research indicating that depression is associated with a hyperactive right hemisphere. An exciting extension of this research would be diagnosis of different stages and severity of depression and development of a Depression Severity Index (DSI). Copyright © 2018 Elsevier B.V. All rights reserved.
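
    The end-to-end feature learning the abstract describes can be sketched with a small 1D convolutional network that maps a raw EEG segment to a two-class output. The layer sizes, segment length, and single training step below are illustrative assumptions in PyTorch, not the architecture or data reported in the paper.

      # Illustrative 1D CNN for binary EEG classification (depressed vs. normal).
      # Layer sizes, segment length, and the training step are assumptions only.
      import torch
      import torch.nn as nn

      class EEGConvNet(nn.Module):
          def __init__(self, n_samples=2000):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv1d(1, 8, kernel_size=5), nn.ReLU(), nn.MaxPool1d(2),
                  nn.Conv1d(8, 16, kernel_size=5), nn.ReLU(), nn.MaxPool1d(2),
              )
              with torch.no_grad():
                  flat = self.features(torch.zeros(1, 1, n_samples)).numel()
              self.classifier = nn.Linear(flat, 2)

          def forward(self, x):                      # x: (batch, 1, n_samples)
              return self.classifier(self.features(x).flatten(1))

      model = EEGConvNet()
      x = torch.randn(4, 1, 2000)                    # four dummy EEG segments
      loss = nn.CrossEntropyLoss()(model(x), torch.tensor([0, 1, 0, 1]))
      loss.backward()                                # one illustrative training step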

  13. Micro Dot Patterning on the Light Guide Panel Using Powder Blasting

    PubMed Central

    Jang, Ho Su; Cho, Myeong Woo; Park, Dong Sam

    2008-01-01

    The aim of this study is to develop a micromachining technology for a light guide panel (LGP) mold, whereby micro dot patterns are formed on a LGP surface by a single injection process instead of existing screen printing processes. The micro powder blasting technique is applied to form micro dot patterns on the LGP mold surface. The optimal conditions for the masking, laminating, exposure, and developing processes to form the micro dot patterns are first experimentally investigated. A LGP mold with masked micro patterns is then machined using the micro powder blasting method and the machinability of the micro dot patterns is verified. A prototype LGP is test-injected using the developed LGP mold, and a shape analysis of the patterns and performance testing of the injected LGP are carried out. As an additional approach, matte finishing, a special surface treatment method, is applied to the mold surface to improve the light diffusion characteristics, uniformity and brightness of the LGP. The results of this study show that the applied powder blasting method can be successfully used to manufacture LGPs with micro patterns by just a single injection using the developed mold and thereby replace existing screen printing methods. PMID:27879740

  14. Advice Taking from Humans and Machines: An fMRI and Effective Connectivity Study.

    PubMed

    Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank

    2016-01-01

    With new technological advances, advice can come from different sources such as machines or humans, but how individuals respond to such advice and the neural correlates involved need to be better understood. We combined functional MRI and multivariate Granger causality analysis with an X-ray luggage-screening task to investigate the neural basis and corresponding effective connectivity involved with advice utilization from agents framed as experts. Participants were asked to accept or reject good or bad advice from a human or machine agent with low reliability (high false alarm rate). We showed that unreliable advice decreased performance overall and participants interacting with the human agent had a greater depreciation of advice utilization during bad advice compared to the machine agent. These differences in advice utilization are conceivably due to reevaluation of expectations arising from association of dispositional credibility for each agent. We demonstrated that differences in advice utilization engaged brain regions that may be associated with evaluation of personal characteristics and traits (precuneus, posterior cingulate cortex, temporoparietal junction) and interoception (posterior insula). We found that the right posterior insula and left precuneus were the drivers of the advice utilization network that were reciprocally connected to each other and also projected to all other regions. Our behavioral and neuroimaging results have significant implications for society because of progressions in technology and increased interactions with machines.

  15. Advice Taking from Humans and Machines: An fMRI and Effective Connectivity Study

    PubMed Central

    Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank

    2016-01-01

    With new technological advances, advice can come from different sources such as machines or humans, but how individuals respond to such advice and the neural correlates involved need to be better understood. We combined functional MRI and multivariate Granger causality analysis with an X-ray luggage-screening task to investigate the neural basis and corresponding effective connectivity involved with advice utilization from agents framed as experts. Participants were asked to accept or reject good or bad advice from a human or machine agent with low reliability (high false alarm rate). We showed that unreliable advice decreased performance overall and participants interacting with the human agent had a greater depreciation of advice utilization during bad advice compared to the machine agent. These differences in advice utilization are conceivably due to reevaluation of expectations arising from association of dispositional credibility for each agent. We demonstrated that differences in advice utilization engaged brain regions that may be associated with evaluation of personal characteristics and traits (precuneus, posterior cingulate cortex, temporoparietal junction) and interoception (posterior insula). We found that the right posterior insula and left precuneus were the drivers of the advice utilization network that were reciprocally connected to each other and also projected to all other regions. Our behavioral and neuroimaging results have significant implications for society because of progressions in technology and increased interactions with machines. PMID:27867351

  16. Cheminformatic models based on machine learning for pyruvate kinase inhibitors of Leishmania mexicana.

    PubMed

    Jamal, Salma; Scaria, Vinod

    2013-11-19

    Leishmaniasis is a neglected tropical disease which affects approximately 12 million individuals worldwide and is caused by the parasite Leishmania. The current drugs used in the treatment of Leishmaniasis are highly toxic, and the widespread emergence of drug-resistant strains necessitates the development of new therapeutic options. The high-throughput screen data available have made it possible to generate computational predictive models which have the ability to assess the active scaffolds in a chemical library followed by their ADME/toxicity properties in biological trials. In the present study, we have used publicly available, high-throughput screen datasets of chemical moieties which have been adjudged to target the pyruvate kinase enzyme of L. mexicana (LmPK). A machine learning approach was used to create computational models capable of predicting the biological activity of novel antileishmanial compounds. Further, we evaluated the molecules using a substructure-based approach to identify the common substructures contributing to their activity. We generated computational models based on machine learning methods and evaluated the performance of these models based on various statistical figures of merit. The random forest based approach was determined to be the most sensitive and gave the best accuracy as well as ROC. We further added a substructure-based approach to analyze the molecules to identify potentially enriched substructures in the active dataset. We believe that the models developed in the present study would lead to a reduction in the cost and length of clinical studies, and hence newer drugs would appear faster in the market, providing better healthcare options to the patients.
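
    A minimal sketch of the kind of activity classifier the study evaluates is shown below, using scikit-learn's random forest with cross-validation. The bit-vector descriptors and activity labels are randomly generated placeholders, not the LmPK screening data, so the reported score only demonstrates the workflow.

      # Minimal random-forest bioactivity classifier of the kind evaluated in the
      # abstract; descriptors and labels are synthetic placeholders.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.integers(0, 2, size=(500, 256)).astype(float)   # bit-vector descriptors
      y = rng.integers(0, 2, size=500)                         # active / inactive labels

      clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
      auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
      print(f"mean ROC AUC: {auc.mean():.2f}")                 # ~0.5 on random labels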

  17. High-Throughput Gene Expression Profiles to Define Drug Similarity and Predict Compound Activity.

    PubMed

    De Wolf, Hans; Cougnaud, Laure; Van Hoorde, Kirsten; De Bondt, An; Wegner, Joerg K; Ceulemans, Hugo; Göhlmann, Hinrich

    2018-04-01

    By adding biological information, beyond the chemical properties and desired effect of a compound, uncharted compound areas and connections can be explored. In this study, we add transcriptional information for 31K compounds of Janssen's primary screening deck, using the HT L1000 platform, and assess (a) the transcriptional connection score for generating compound similarities, (b) machine learning algorithms for generating target activity predictions, and (c) the scaffold hopping potential of the resulting hits. We demonstrate that the transcriptional connection score is best computed from the significant genes only and should be interpreted within its confidence interval, for which we provide the statistics. These guidelines help to reduce noise, increase reproducibility, and enable the separation of specific and promiscuous compounds. The added value of machine learning is demonstrated for the NR3C1 and HSP90 targets. Support Vector Machine models yielded balanced accuracy values ≥80% when the expression values from DDIT4 & SERPINE1 and TMEM97 & SPR were used to predict the NR3C1 and HSP90 activity, respectively. Combining both models resulted in 22 new and confirmed HSP90-independent NR3C1 inhibitors, providing two scaffolds (i.e., pyrimidine and pyrazolo-pyrimidine), which could potentially be of interest in the treatment of depression (i.e., inhibiting the glucocorticoid receptor (i.e., NR3C1), while leaving its chaperone, HSP90, unaffected). As such, the initial hit rate increased by a factor of 300, as less, but more specific, chemistry could be screened, based on the upfront computed activity predictions.
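
    The two-gene SVM idea (e.g., predicting NR3C1 activity from DDIT4 and SERPINE1 expression) can be sketched as follows; the expression values and activity labels are simulated, so the score only illustrates the modelling step, not the reported ≥80% balanced accuracy.

      # Two-feature SVM in the spirit of the NR3C1 predictor built from DDIT4 and
      # SERPINE1 expression; data below are simulated, not the L1000 measurements.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      inactive = rng.normal(0.0, 1.0, size=(100, 2))   # columns: DDIT4, SERPINE1 (z-scores)
      active = rng.normal(1.5, 1.0, size=(100, 2))
      X = np.vstack([inactive, active])
      y = np.array([0] * 100 + [1] * 100)

      model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      score = cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy")
      print(f"balanced accuracy: {score.mean():.2f}")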

  18. Cervical cancer screening uptake and challenges in Malawi from 2011 to 2015: retrospective cohort study.

    PubMed

    Msyamboza, Kelias Phiri; Phiri, Twambilire; Sichali, Wesley; Kwenda, Willy; Kachale, Fanny

    2016-08-17

    Malawi has the highest cervical cancer incidence and mortality in the world, with age-standardized rates (ASR) of 75.9 and 49.8 per 100,000 population, respectively. In response, the Ministry of Health established a cervical cancer screening programme using visual inspection with acetic acid (VIA) and treatment of precancerous lesions with cryotherapy. This paper highlights the roll-out, the integration with family planning services and the HIV ART programme, and the uptake and challenges of the VIA and cryotherapy programme. We analyzed program data, supportive supervision, and quarterly and annual reports from the National Cervical Cancer Control Program. We evaluated the uptake and challenges of screening services by age, HIV serostatus and trends over a five-year period (2011-2015). Between 2011 and 2015, the number of cervical cancer screening sites, the number of women screened and the coverage per annum increased from 75 to 130, 15,331 to 49,301 and 9.3 % to 26.5 %, respectively. In this five-year period, a total of 145,015 women were screened. Of these, 7,349 (5.1 %) and 6,289 (4.3 %) were VIA positive and suspect cancer, respectively. Overall, 13,638 (9.4 %) were detected to be VIA positive or had suspect cancer. Of the 48,588 women with known age screened in 2015, 13,642 (28.1 %), 27,275 (56.1 %) and 7,671 (15.8 %) were aged 29 years or less, 30-45 years, and 46 years or more, respectively. Among 39,101 women with data on HIV serostatus, 21,546 (55.1 %) were HIV negative, 6,209 (15.9 %) were HIV positive and for 11,346 (29.0 %) the status was unknown. The VIA positivity rate and the prevalence of suspect cancer were significantly higher in HIV positive than HIV negative women (8.8 % vs 5.0 %, 6.4 % vs 3.0 %) and in women aged 30-45 years than in women aged 29 years or less (5.6 % vs 2.3 %, 2.6 % vs 1.2 %), respectively (all p < 0.05). The main challenge of the programme was failure to treat VIA positive women eligible for cryotherapy. Over the five-year period, the programme treated only 1,001 (43.3 %) out of 2,311 eligible women, and only 266 (31.8 %) of the 836 women with a large lesion or suspect cancer who were referred received health care at the referral centre. The reasons for failure to provide cryotherapy treatment were stock-out of gas, a faulty/broken cryotherapy machine (usually connectors or probes), or no cryotherapy machine at all in the whole district. For women with a large lesion or suspect cancer, lack of a loop electrosurgical excision procedure (LEEP) machine or inadequate gynaecologists at the referral centre were the major reasons. Cancer radiotherapy services were not available in Malawi. This study provided data on the VIA positivity rate, the prevalence of suspect cancer, the failure rate of cryotherapy, and the challenges in the provision of cryotherapy and LEEP treatment in Malawi. These data could be used as a baseline for monitoring and evaluation of the Human Papillomavirus (HPV) vaccination programme, which the country introduced in 2013, the linkage of cervical cancer screening to women on HIV ART, and the long-term effect of ART and voluntary male medical circumcision on the prevalence and incidence of cervical cancer.

  19. Population screening for genetic disorders in the 21st century: evidence, economics, and ethics.

    PubMed

    Grosse, S D; Rogowski, W H; Ross, L F; Cornel, M C; Dondorp, W J; Khoury, M J

    2010-01-01

    Proposals for population screening for genetic diseases require careful scrutiny by decision makers because of the potential for harms and the need to demonstrate benefits commensurate with the opportunity cost of resources expended. We review current evidence-based processes used in the United States, the United Kingdom, and the Netherlands to assess genetic screening programs, including newborn screening programs, carrier screening, and organized cascade testing of relatives of patients with genetic syndromes. In particular, we address critical evidentiary, economic, and ethical issues that arise in the appraisal of screening tests offered to the population. Specific case studies include newborn screening for congenital adrenal hyperplasia and cystic fibrosis and adult screening for hereditary hemochromatosis. Organizations and countries often reach different conclusions about the suitability of screening tests for implementation on a population basis. Deciding when and how to introduce pilot screening programs is challenging. In certain cases, e.g., hereditary hemochromatosis, a consensus does not support general screening although cascade screening may be cost-effective. Genetic screening policies have often been determined by technological capability, advocacy, and medical opinion rather than through a rigorous evidence-based review process. Decision making should take into account principles of ethics and opportunity costs. Copyright 2009 S. Karger AG, Basel.

  20. Tablet coating by injection molding technology - Optimization of coating formulation attributes and coating process parameters.

    PubMed

    Desai, Parind M; Puri, Vibha; Brancazio, David; Halkude, Bhakti S; Hartman, Jeremy E; Wahane, Aniket V; Martinez, Alexander R; Jensen, Keith D; Harinath, Eranda; Braatz, Richard D; Chun, Jung-Hoon; Trout, Bernhardt L

    2018-01-01

    We developed and evaluated a solvent-free injection molding (IM) coating technology that could be suitable for continuous manufacturing via incorporation with IM tableting. Coating formulations (coating polymers and plasticizers) were prepared using hot-melt extrusion and screened via stress-strain analysis employing a universal testing machine. Selected coating formulations were studied for their melt flow characteristics. Tablets were coated using a vertical injection molding unit. Process parameters such as softening temperature, injection pressure, and cooling temperature played a very important role in IM coating processing. IM coating employing polyethylene oxide (PEO) based formulations required sufficient room humidity (>30% RH) to avoid immediate cracks, whereas other formulations were insensitive to the room humidity. Tested formulations based on Eudragit E PO and Kollicoat IR had unsuitable mechanical properties. Three coating formulations based on hydroxypropyl pea starch, PEO 1,000,000 and Opadry had favorable mechanical (<700 MPa Young's modulus, >35% elongation, >95×10^4 J/m^3 toughness) and melt flow (>0.4 g/min) characteristics, which rendered acceptable IM coats. These three formulations increased the dissolution time by 10, 15 and 35 min, respectively (75% drug release), compared to the uncoated tablets (15 min). Coated tablets stored in several environmental conditions remained stable to cracking for the evaluated 8-week time period. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Erectile function in cardiovascular patients: its significance and a quick assessment using a visual-scale questionnaire.

    PubMed

    Glavaš, Sandra; Valenčić, Lara; Trbojević, Natasa; Tomašić, Ana-Marija; Turčić, Nikolina; Tibauth, Sara; Ružić, Alen

    2015-12-01

    The aim of this study was to investigate the connection between erectile dysfunction (ED) and cardiovascular diseases and to test a novel visual-scale questionnaire (VEF) we propose for the assessment of erectile function. Erectile function was assessed in 170 male cardiovascular patients under the age of 70 by the use of several self-administered questionnaires: the International Index of Erectile Function-5 (IIEF-5); the Massachusetts Male Aging Study questionnaires (MMAS Sexual Activity Questionnaire and MMAS Single Question), and finally, VEF. Patients’ mean age was 55.65 ± 9.97 y. The most common indications for hospitalization were coronary artery disease (CAD) (n = 82, 48%), and decompensated chronic heart failure (n = 30, 18%). The prevalence of ED as determined by IIEF-5 was 58% (n = 99). Patients with ED were on average 5.7 years older (P = 0.0001), had a higher frequency of diabetes (by 19%, P < 0.01), and a somewhat higher level of uric acid (by 72 μmol/l, P < 0.01). Results of the VEF correlated significantly with those of other questionnaires. Three different machine learning algorithms demonstrated a greater accuracy of VEF than IIEF-5 and MMAS Sexual Activity Questionnaire in predicting ED severity. ED is highly prevalent among cardiovascular patients. The Visual Scale Erectile Function questionnaire (VEF) is a simple and valid tool, suitable for quick screening of this condition.

  2. Evaluation of hot-melt extrusion and injection molding for continuous manufacturing of immediate-release tablets.

    PubMed

    Melocchi, Alice; Loreti, Giulia; Del Curto, Maria Dorly; Maroni, Alessandra; Gazzaniga, Andrea; Zema, Lucia

    2015-06-01

    The exploitation of hot-melt extrusion and injection molding for the manufacturing of immediate-release (IR) tablets was preliminarily investigated in view of their special suitability for continuous manufacturing, which represents a current goal of pharmaceutical production because of its possible advantages in terms of improved sustainability. Tablet-forming agents were initially screened based on processability by a single-screw extruder and a micromolding machine as well as the disintegration/dissolution behavior of extruded/molded prototypes. Various polymers, such as low-viscosity hydroxypropylcellulose, polyvinyl alcohol, polyvinyl alcohol-polyethylene glycol graft copolymer, and various sodium starch glycolate grades (e.g., Explotab(®) CLV), that could be processed with no need for technological aids, except for a plasticizer, were identified. Furthermore, the feasibility of both extruded and molded IR tablets from low-viscosity hydroxypropylcellulose or Explotab(®) CLV was assessed. Explotab(®) CLV, in particular, showed thermoplastic properties and a very good aptitude as a tablet-forming agent, starting from which disintegrating tablets were successfully obtained by either technique. Prototypes containing a poorly soluble model drug (furosemide), based on both a simple formulation (Explotab(®) CLV and water/glycerol as plasticizers) and formulations including dissolution/disintegration adjuvants (soluble and effervescent excipients), were shown to fulfill the USP 37 dissolution requirements for furosemide tablets. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  3. Study of electronic structure of liquid Pb

    NASA Astrophysics Data System (ADS)

    Vora, A. M.; Gajjar, P. N.

    2018-04-01

    Fiolhais et al.'s universal model potential, in conjunction with the hard sphere technique of Percus and Yevick, has been used to study the electronic structure, Fermi energy and density of states of liquid Pb. The screening influence of the different forms of the local field correction functions proposed by Hartree (H) and Taylor (T) on the aforesaid properties is studied; it reflects the changing effects of screening and is found suitable for the present study.

  4. Improved Online Support Vector Machines Spam Filtering Using String Kernels

    NASA Astrophysics Data System (ADS)

    Amayri, Ola; Bouguila, Nizar

    A major bottleneck in electronic communications is the enormous dissemination of spam emails. Developing suitable filters that can adequately capture those emails and achieve a high performance rate has become a main concern. Support vector machines (SVMs) have made a large contribution to the development of spam email filtering. Based on SVMs, the crucial problems in email classification are the feature mapping of input emails and the choice of the kernels. In this paper, we present a thorough investigation of several distance-based kernels, propose the use of string kernels, and prove their efficiency in blocking spam emails. We detail feature mapping variants in text classification (TC) that yield improved performance for standard SVMs in the filtering task. Furthermore, to cope with real-time scenarios we propose an online active framework for spam filtering.
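
    The string-kernel idea can be illustrated with a toy spectrum (k-mer count) kernel plugged into scikit-learn's precomputed-kernel SVM. The four short "emails" and their labels are placeholders, so this only demonstrates the mechanics, not the filtering performance discussed above.

      # Toy spectrum (k-mer count) string kernel used with a precomputed-kernel SVM,
      # illustrating the string-kernel idea; the emails and labels are placeholders.
      from collections import Counter
      import numpy as np
      from sklearn.svm import SVC

      def spectrum_kernel(a, b, k=3):
          """Count shared k-mers between two strings (a simple spectrum kernel)."""
          ca = Counter(a[i:i + k] for i in range(len(a) - k + 1))
          cb = Counter(b[i:i + k] for i in range(len(b) - k + 1))
          return float(sum(ca[s] * cb[s] for s in ca.keys() & cb.keys()))

      emails = ["win a free prize now", "meeting moved to friday",
                "free prize claim now", "project status for friday"]
      labels = [1, 0, 1, 0]                              # 1 = spam, 0 = ham

      gram = np.array([[spectrum_kernel(a, b) for b in emails] for a in emails])
      clf = SVC(kernel="precomputed").fit(gram, labels)
      print(clf.predict(gram))                           # predictions on the training set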

  5. Structural design of the Sandia 34-M Vertical Axis Wind Turbine

    NASA Astrophysics Data System (ADS)

    Berg, D. E.

    Sandia National Laboratories, as the lead DOE laboratory for Vertical Axis Wind Turbine (VAWT) development, is currently designing a 34-meter diameter Darrieus-type VAWT. This turbine will be a research test bed which provides a focus for advancing technology and validating design and fabrication techniques in a size range suitable for utility use. Structural data from this machine will allow structural modeling to be refined and verified for a turbine on which the gravity effects and stochastic wind loading are significant. Performance data from it will allow aerodynamic modeling to be refined and verified. The design effort incorporates Sandia's state-of-the-art analysis tools in the design of a complete machine. The analytic tools used in this design are discussed and the conceptual design procedure is described.

  6. Design of a cardiac monitor in terms of parameters of QRS complex.

    PubMed

    Chen, Zhen-cheng; Ni, Li-li; Su, Ke-ping; Wang, Hong-yan; Jiang, Da-zong

    2002-08-01

    Objective. To design a portable cardiac monitor system based on an available ordinary ECG machine that works on the basis of QRS parameters. Method. The 80196 single-chip microcomputer was used as the central microprocessor, and the real-time electrocardiac signal was collected and analyzed in the system. Result. Apart from the performance of an ordinary monitor, this machine also possesses the following functions: arrhythmia analysis, HRV analysis, alarm, freeze, and automatic paper recording. Convenient to carry, the system is powered by AC or DC sources. Stability, low power and low cost are emphasized in the hardware design, and a modularization method is applied in the software design. Conclusion. Ease of use and low cost make the portable monitor system suitable for use under simple conditions.

  7. Online Artifact Removal for Brain-Computer Interfaces Using Support Vector Machines and Blind Source Separation

    PubMed Central

    Halder, Sebastian; Bensch, Michael; Mellinger, Jürgen; Bogdan, Martin; Kübler, Andrea; Birbaumer, Niels; Rosenstiel, Wolfgang

    2007-01-01

    We propose a combination of blind source separation (BSS) and independent component analysis (ICA) (signal decomposition into artifacts and nonartifacts) with support vector machines (SVMs) (automatic classification) that are designed for online usage. In order to select a suitable BSS/ICA method, three ICA algorithms (JADE, Infomax, and FastICA) and one BSS algorithm (AMUSE) are evaluated to determine their ability to isolate electromyographic (EMG) and electrooculographic (EOG) artifacts into individual components. An implementation of the selected BSS/ICA method with SVMs trained to classify EMG and EOG artifacts, which enables the usage of the method as a filter in measurements with online feedback, is described. This filter is evaluated on three BCI datasets as a proof-of-concept of the method. PMID:18288259

  8. Online artifact removal for brain-computer interfaces using support vector machines and blind source separation.

    PubMed

    Halder, Sebastian; Bensch, Michael; Mellinger, Jürgen; Bogdan, Martin; Kübler, Andrea; Birbaumer, Niels; Rosenstiel, Wolfgang

    2007-01-01

    We propose a combination of blind source separation (BSS) and independent component analysis (ICA) (signal decomposition into artifacts and nonartifacts) with support vector machines (SVMs) (automatic classification) that are designed for online usage. In order to select a suitable BSS/ICA method, three ICA algorithms (JADE, Infomax, and FastICA) and one BSS algorithm (AMUSE) are evaluated to determine their ability to isolate electromyographic (EMG) and electrooculographic (EOG) artifacts into individual components. An implementation of the selected BSS/ICA method with SVMs trained to classify EMG and EOG artifacts, which enables the usage of the method as a filter in measurements with online feedback, is described. This filter is evaluated on three BCI datasets as a proof-of-concept of the method.
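
    The decompose, classify, remove, and reconstruct pipeline described above can be sketched with scikit-learn's FastICA standing in for the BSS/ICA stage and an SVM flagging artifact components. The EEG array, the per-component features, and the classifier's training data are random placeholders, so this only illustrates the data flow, not the evaluated filter.

      # Sketch of a decompose -> classify -> remove -> reconstruct pipeline using
      # scikit-learn's FastICA as a stand-in for the BSS/ICA stage; the EEG array,
      # component features, and artifact-classifier training data are placeholders.
      import numpy as np
      from scipy.stats import kurtosis
      from sklearn.decomposition import FastICA
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      eeg = rng.normal(size=(10, 5000))                      # 10 channels x 5000 samples

      ica = FastICA(n_components=10, random_state=0)
      sources = ica.fit_transform(eeg.T)                     # (samples, components)

      # Per-component features; high kurtosis / high amplitude often flags EOG/EMG.
      feats = np.column_stack([kurtosis(sources, axis=0), sources.std(axis=0)])

      # Placeholder artifact classifier (in practice trained on labelled components).
      train_X = rng.normal(size=(40, 2))
      train_y = np.array([0, 1] * 20)
      svm = SVC().fit(train_X, train_y)
      is_artifact = svm.predict(feats).astype(bool)

      sources[:, is_artifact] = 0.0                          # suppress artifact sources
      cleaned = ica.inverse_transform(sources).T             # back to channel space
      print(cleaned.shape)                                   # (10, 5000)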

  9. A flexible ultrasound transducer array with micro-machined bulk PZT.

    PubMed

    Wang, Zhe; Xue, Qing-Tang; Chen, Yuan-Quan; Shu, Yi; Tian, He; Yang, Yi; Xie, Dan; Luo, Jian-Wen; Ren, Tian-Ling

    2015-01-23

    This paper proposes a novel flexible piezoelectric micro-machined ultrasound transducer, which is based on PZT and a polyimide substrate. The transducer is made on the polyimide substrate and packaged with medical polydimethylsiloxane. Instead of etching the PZT ceramic, this paper proposes a method of putting diced PZT blocks into holes in the polyimide which are pre-etched. The device works in d31 mode and the electromechanical coupling factor is 22.25%. Its flexibility, good conformal contact with skin surfaces and proper resonant frequency make the device suitable for heart imaging. The flexibly packaged ultrasound transducer also shows good waterproof performance after hundreds of ultrasonic electrical tests in water. It is a promising ultrasound transducer and will be an effective supplementary ultrasound imaging method in practical applications.

  10. [Coloration of mica glass ceramic for use in dental CAD/CAM system].

    PubMed

    Sun, Ying; Wang, Zhong-yi; Tian, Jie-mo; Cao, Xiao-gang

    2003-03-01

    An intrinsically colored machinable glass-ceramic containing tetrasilicic fluormica as the predominant crystal phase, intended for molar crowns in a dental CAD/CAM system, was studied. Orthogonal design analysis was used to select an appropriate base formula, coloration and heat treatment process. The factors influencing the color appearance of the mica glass ceramic were the nucleation agent and the ratio of Mg(2+) to K(+) in the base formula; cerium oxide (CeO(2)) was used as the main colorant. The preferred heat treatment was 650 degrees C for 1 h and 1,000 degrees C or 1,050 degrees C for 3 h - 4 h. This mica glass-ceramic could provide 4 to 5 color appearances for dental use, and it showed excellent machinability, which makes it eminently suitable for use in dental CAD/CAM systems.

  11. Wearable Technology in Medicine: Machine-to-Machine (M2M) Communication in Distributed Systems.

    PubMed

    Schmucker, Michael; Yildirim, Kemal; Igel, Christoph; Haag, Martin

    2016-01-01

    Smart wearables are capable of supporting physicians during various processes in medical emergencies. Nevertheless, it is almost impossible to operate several computers without neglecting a patient's treatment. Thus, it is necessary to set up a distributed network consisting of two or more computers to exchange data or initiate remote procedure calls (RPC). If it is not possible to create flawless connections between those devices, it is not possible to transfer medically relevant data to the most suitable device or to control one device with another. This paper shows how wearables can be paired and what problems occur when trying to pair several wearables. Furthermore, it describes which scenarios are possible in the context of emergency medicine/paramedicine.

  12. Sugeno-Fuzzy Expert System Modeling for Quality Prediction of Non-Contact Machining Process

    NASA Astrophysics Data System (ADS)

    Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.

    2018-03-01

    Modeling can be categorised into four main domains: prediction, optimisation, estimation and calibration. In this paper, the Takagi-Sugeno-Kang (TSK) fuzzy logic method is examined as a prediction modelling method to investigate the taper quality of laser lathing, which seeks to replace traditional lathe machines with 3D laser lathing in order to achieve the desired cylindrical shape of stock materials. Three design parameters were selected: feed rate, cutting speed and depth of cut. A total of twenty-four experiments were conducted with eight sequential runs and replicated three times. The TSK fuzzy predictive model was found to achieve a 99% accuracy rate, which suggests that the model is a suitable and practical method for the non-linear laser lathing process.
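
    The core of a TSK (Sugeno) model is a set of rules whose outputs are crisp linear functions of the inputs, combined by a weighted average of the rule firing strengths. The sketch below shows that mechanism for two made-up rules over the feed rate; the membership parameters and consequent coefficients are illustrative only and are not the model fitted in the paper.

      # Minimal first-order Sugeno (TSK) inference over two rules, illustrating the
      # weighted-average defuzzification step; all parameters are made up.
      import numpy as np

      def gauss(x, c, s):                       # Gaussian membership function
          return np.exp(-0.5 * ((x - c) / s) ** 2)

      def tsk_predict(feed, speed, depth):
          # Rule 1: low feed  -> taper = 0.02 + 0.001*speed + 0.05*depth
          # Rule 2: high feed -> taper = 0.10 + 0.002*speed + 0.08*depth
          # (a full model would also attach memberships to cutting speed and depth)
          w1 = gauss(feed, c=50.0, s=20.0)
          w2 = gauss(feed, c=150.0, s=20.0)
          y1 = 0.02 + 0.001 * speed + 0.05 * depth
          y2 = 0.10 + 0.002 * speed + 0.08 * depth
          return (w1 * y1 + w2 * y2) / (w1 + w2)   # weighted-average defuzzification

      print(tsk_predict(feed=80.0, speed=30.0, depth=0.5))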

  13. Simultaneous multi-component seismic denoising and reconstruction via K-SVD

    NASA Astrophysics Data System (ADS)

    Hou, Sian; Zhang, Feng; Li, Xiangyang; Zhao, Qiang; Dai, Hengchang

    2018-06-01

    Data denoising and reconstruction play an increasingly significant role in seismic prospecting for their value in enhancing effective signals, dealing with surface obstacles and reducing acquisition costs. In this paper, we propose a novel method to denoise and reconstruct multicomponent seismic data simultaneously. The method lies within the framework of machine learning, and its key points are the definition of a suitable weight function and a modified inner product operator. The purpose of these two elements is to perform machine learning on data with missing traces when the random noise deviation is unknown, and to build a mathematical relationship between the components so that all the information of the multi-component data is incorporated. Two examples, using synthetic and real multicomponent data, demonstrate that the new method is a feasible alternative for multi-component seismic data processing.

  14. Grid generation methodology and CFD simulations in sliding vane compressors and expanders

    NASA Astrophysics Data System (ADS)

    Bianchi, Giuseppe; Rane, Sham; Kovacevic, Ahmed; Cipollone, Roberto; Murgia, Stefano; Contaldi, Giulio

    2017-08-01

    The limiting factor for the employment of advanced 3D CFD tools in the analysis and design of rotary vane machines is the unavailability of methods for generating computational grids suitable for fast and reliable numerical analysis. The paper addresses this challenge by presenting the development of an analytical grid generation method for vane machines that is based on user-defined nodal displacement. In particular, mesh boundaries are defined as parametric curves generated using trigonometrical modelling of the axial cross section of the machine, while the distribution of computational nodes is performed using algebraic algorithms with transfinite interpolation, post-orthogonalisation and smoothing. Algebraic control functions are introduced for the distribution of nodes on the rotor and casing boundaries in order to achieve good grid quality in terms of cell size and expansion. In this way, the moving and deforming fluid domain of the sliding vane machine is discretized, and the conservation of intrinsic quantities is ensured by maintaining the cell connectivity and structure. For validation of the generated grids, a mid-size air compressor and a small-scale expander for Organic Rankine Cycle applications have been investigated in this paper. Remarks on the implementation of the mesh motion algorithm, the stability and robustness experienced with the ANSYS CFX solver, as well as the obtained flow results are presented.
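
    Transfinite interpolation of the kind mentioned above can be illustrated with a generic 2D Coons-patch blend of four boundary curves; the annular-sector boundaries below are simple analytic placeholders rather than a real vane-machine cross section, and no orthogonalisation or smoothing step is included.

      # Generic 2D transfinite (Coons) interpolation between four boundary curves.
      # The boundaries are placeholder arcs/edges, not an actual vane-machine cell.
      import numpy as np

      def transfinite_grid(bottom, top, left, right, nu=11, nv=6):
          """Coons-patch blend of four boundary curves, each mapping [0,1] -> (x, y)."""
          grid = np.zeros((nu, nv, 2))
          for i, u in enumerate(np.linspace(0.0, 1.0, nu)):
              for j, v in enumerate(np.linspace(0.0, 1.0, nv)):
                  linear = (1 - v) * bottom(u) + v * top(u) \
                         + (1 - u) * left(v) + u * right(v)
                  corners = (1 - u) * (1 - v) * bottom(0) + u * (1 - v) * bottom(1) \
                          + (1 - u) * v * top(0) + u * v * top(1)
                  grid[i, j] = linear - corners
          return grid

      # Placeholder boundaries: an annular sector (crude stand-in for a vane cell).
      bottom = lambda u: np.array([np.cos(u * np.pi / 3), np.sin(u * np.pi / 3)])        # inner arc, r=1
      top    = lambda u: 2.0 * np.array([np.cos(u * np.pi / 3), np.sin(u * np.pi / 3)])  # outer arc, r=2
      left   = lambda v: np.array([1.0 + v, 0.0])                                        # radial edge at 0 deg
      right  = lambda v: (1.0 + v) * np.array([np.cos(np.pi / 3), np.sin(np.pi / 3)])    # radial edge at 60 deg

      mesh = transfinite_grid(bottom, top, left, right)
      print(mesh.shape)        # (11, 6, 2) grid of node coordinates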

  15. A High Performance Torque Sensor for Milling Based on a Piezoresistive MEMS Strain Gauge

    PubMed Central

    Qin, Yafei; Zhao, Yulong; Li, Yingxue; Zhao, You; Wang, Peng

    2016-01-01

    In high-speed and high-precision machining applications, it is important to monitor the machining process in order to ensure high product quality. For this purpose, it is essential to develop a dynamometer with high sensitivity and high natural frequency which is suited to these conditions. This paper describes the design, calibration and performance of a milling torque sensor based on a piezoresistive MEMS strain gauge. A detailed design study is carried out to optimize the two mutually contradictory indicators, sensitivity and natural frequency. The developed torque sensor principally consists of a thin-walled cylinder and a piezoresistive MEMS strain gauge bonded on the surface of the sensing element where the shear strain is maximum. The strain gauge includes eight piezoresistors, four of which are connected in a full Wheatstone bridge circuit, which is used to measure the applied torque during machining procedures. Experimental static calibration results show that the sensitivity of the torque sensor has been improved to 0.13 mV/Nm. A modal impact test indicates that the natural frequency of the torque sensor reaches 1216 Hz, which is suitable for high-speed machining processes. The dynamic test results indicate that the developed torque sensor is stable and practical for monitoring the milling process. PMID:27070620
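
    As a rough plausibility check on the quoted sensitivity, an idealized full Wheatstone bridge with four active gauges gives an output of approximately V_out = GF × ε × V_ex. The gauge factor, excitation voltage and strain-per-torque value below are assumed numbers chosen only to show the arithmetic; they are not taken from the paper.

      # Idealized full Wheatstone bridge output for a shear-strain torque sensor:
      # with four active gauges, V_out ~= GF * eps * V_ex. All numbers are assumptions.
      gauge_factor = 100.0        # piezoresistive MEMS gauges can far exceed metal-foil GF ~ 2
      v_excitation = 5.0          # bridge excitation voltage, V
      shear_strain = 2.6e-7       # assumed strain produced by an applied torque of 1 N*m

      v_out = gauge_factor * shear_strain * v_excitation
      print(f"bridge output: {v_out * 1e3:.2f} mV per N*m")   # ~0.13 mV/Nm with these numbers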

  16. Novel diesel exhaust filters for underground mining vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bickel, K.L.; Taubert, T.R.

    1995-12-31

    The U.S. Bureau of Mines (USBM) pioneered the development of disposable filters for reducing diesel particulate emissions from permissible mining machines. The USBM is now evaluating filter media that can withstand the high exhaust temperatures on nonpermissible machines. The goal of the evaluation is to find an inexpensive medium that can be cleaned or disposed of after use, and will reduce particulate emissions by 50 % or more. This report summarizes the results from screening tests of a lava rock and woven fiberglass filter media. The lava rock media exhibited low collection efficiencies, but with very low increases in exhaust back pressure. Preliminary results indicate a collection efficiency exceeding 80 % for the woven fiber media. Testing of both media is continuing.

  17. Molecular graph convolutions: moving beyond fingerprints

    PubMed Central

    Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick

    2016-01-01

    Molecular “fingerprints” encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement. PMID:27558503
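
    A generic neighbor-aggregation graph convolution layer conveys the basic idea of learning directly from a molecular graph; the sketch below is not the specific "Weave" modules described in the paper, and the three-atom molecule and feature values are toy placeholders.

      # Generic neighbor-aggregation graph convolution over a small molecular graph;
      # not the paper's Weave architecture, and the features are toy values.
      import torch
      import torch.nn as nn

      class GraphConvLayer(nn.Module):
          def __init__(self, in_dim, out_dim):
              super().__init__()
              self.linear = nn.Linear(in_dim, out_dim)

          def forward(self, atom_feats, adjacency):
              # Average each atom's features with its bonded neighbours, then transform.
              deg = adjacency.sum(dim=1, keepdim=True).clamp(min=1.0)
              aggregated = (adjacency @ atom_feats) / deg + atom_feats
              return torch.relu(self.linear(aggregated))

      # Toy 3-atom molecule: atoms 0-1 and 1-2 bonded.
      adjacency = torch.tensor([[0., 1., 0.],
                                [1., 0., 1.],
                                [0., 1., 0.]])
      atom_feats = torch.randn(3, 4)                 # 4 input features per atom

      layer = GraphConvLayer(4, 8)
      print(layer(atom_feats, adjacency).shape)      # torch.Size([3, 8])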

  18. Cloud Forensics Issues

    DTIC Science & Technology

    2014-07-01

    voluminous threat environment. Today we regularly construct seamless encrypted communications between machines through SSL or other TLS. These do not...return to the web application and the user. As a prerequisite to end-to-end communication, an SSL, or other suitable TLS, is set up between each of the...a TLS connection is established between the requestor and the service provider, within which a WS-Security package will be sent to the service

  19. The Event Based Language and Its Multiple Processor Implementations.

    DTIC Science & Technology

    1980-01-01

    10 6.1 "Recursive" Linear Fibonacci ................................................ 105 6.2 The Readers Writers Problem...kinds. Examples of such systems are: C.mmp [Wu-72], Pluribus [He-73], Data Flow [ De -75], the boolean n-cube parallel machine [Su-77], and the MuNet [Wa...concurrency within programs; therefore, we hate concentrated on two types of systems which seem suitable: a processor network, and a data flow processor [ De -77

  20. JPRS Report, Soviet Union, Economic Affairs

    DTIC Science & Technology

    1988-10-18

    "Commodities—The Mirror of Cost Accounting"] [Text] A number of large-scale decisions directed toward increasing the production of high-quality...suitable in the sphere of scientific research and experimental design work. It is known, for example, that the number of blueprints, specifications, or...the situation, Yu. Kozyrev, deputy chief of the Department for Problems of the Machine Building Complex of the USSR State Committee for Science and
