Sample records for machine consisted primarily

  1. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  2. Hand Planting Versus Machine Planting of Bottomland Red Oaks on Former Agricultural Fields in Louisiana's Mississippi Alluvial Plain: Sixth-Year Results

    Treesearch

    Alexander J. Michalek; Brian Roy Lockhart; Thomas J. Dean; Bobby D. Keeland; John W. McCoy

    2002-01-01

    Interest in restoring bottomland hardwoods on abandoned agricultural fields has increased considerably over the past 15 years, due primarily to federal cost-share programs such as the Conservation Reserve Program and the Wetlands Reserve Program. While a variety of artificial regeneration techniques are available to afforest these lands, none have met with consistently...

  3. Preliminary design of a 100 kW turbine generator

    NASA Technical Reports Server (NTRS)

    Puthoff, R. L.; Sirocky, P. J.

    1974-01-01

    The National Science Foundation and the Lewis Research Center have engaged jointly in a Wind Energy Program which includes the design and erection of a 100 kW wind turbine generator. The machine consists primarily of a rotor turbine, transmission, shaft, alternator, and tower. The rotor, measuring 125 feet in diameter and consisting of two variable pitch blades, operates at 40 rpm and generates 100 kW of electrical power at a wind velocity of 18 mph. The entire assembly is placed on top of a tower 100 feet above ground level.
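
    As a rough plausibility check of the quoted figures, the standard wind-power relation P = ½ρAv³ can be evaluated for a 125-foot rotor at 18 mph. The sketch below uses assumed textbook values for air density; the implied overall efficiency is simply rated power divided by the power available in the wind, and none of these numbers come from the report itself.

```python
import math

# Assumed standard values (not from the report): sea-level air density.
rho = 1.225                    # kg/m^3
diameter_m = 125 * 0.3048      # 125 ft rotor diameter in metres
v = 18 * 0.44704               # 18 mph wind speed in m/s

area = math.pi * (diameter_m / 2) ** 2     # swept area, m^2
p_available = 0.5 * rho * area * v ** 3    # kinetic power in the wind, W

# Overall efficiency implied by delivering the rated 100 kW from that wind power.
efficiency = 100e3 / p_available
print(f"available wind power ~ {p_available / 1e3:.0f} kW")
print(f"implied overall efficiency ~ {efficiency:.0%}")
```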

  4. Machine Learning. Part 1. A Historical and Methodological Analysis.

    DTIC Science & Technology

    1983-05-31

    Machine learning has always been an integral part of artificial intelligence, and its methodology has evolved in concert with the major concerns of the field. In response to the difficulties of encoding ever-increasing volumes of knowledge in modern AI systems, many researchers have recently turned their attention to machine learning as a means to overcome the knowledge acquisition bottleneck. Part 1 of this paper presents a taxonomic analysis of machine learning organized primarily by learning strategies and secondarily by

  5. 3D unstructured-mesh radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morel, J.

    1997-12-31

    Three unstructured-mesh radiation transport codes are currently being developed at Los Alamos National Laboratory. The first code is ATTILA, which uses an unstructured tetrahedral mesh in conjunction with standard Sn (discrete-ordinates) angular discretization, standard multigroup energy discretization, and linear-discontinuous spatial differencing. ATTILA solves the standard first-order form of the transport equation using source iteration in conjunction with diffusion-synthetic acceleration of the within-group source iterations. ATTILA is designed to run primarily on workstations. The second code is DANTE, which uses a hybrid finite-element mesh consisting of arbitrary combinations of hexahedra, wedges, pyramids, and tetrahedra. DANTE solves several second-order self-adjoint forms of the transport equation including the even-parity equation, the odd-parity equation, and a new equation called the self-adjoint angular flux equation. DANTE also offers three angular discretization options: $S_n$ (discrete-ordinates), $P_n$ (spherical harmonics), and $SP_n$ (simplified spherical harmonics). DANTE is designed to run primarily on massively parallel message-passing machines, such as the ASCI-Blue machines at LANL and LLNL. The third code is PERICLES, which uses the same hybrid finite-element mesh as DANTE, but solves the standard first-order form of the transport equation rather than a second-order self-adjoint form. PERICLES uses a standard $S_n$ discretization in angle in conjunction with trilinear-discontinuous spatial differencing, and diffusion-synthetic acceleration of the within-group source iterations. PERICLES was initially designed to run on workstations, but a version for massively parallel message-passing machines will be built. The three codes will be described in detail and computational results will be presented.

  6. Design, fabrication and performance of two grazing incidence telescopes for celestial extreme ultraviolet astronomy

    NASA Technical Reports Server (NTRS)

    Lampton, M.; Cash, W.; Malina, R. F.; Bowyer, S.

    1977-01-01

    The design and performance of grazing incidence telescopes for celestial extreme ultraviolet (EUV) astronomy are described. The telescopes basically consist of a star tracker, collimator, grazing incidence mirror, vacuum box lid, vacuum housing, filters, a ranicon detector, an electronics box, and an aspect camera. For the survey mirror a Wolter-Schwarzschild type II configuration was selected. Diamond-turning was used for mirror fabrication, a technique which machines surfaces to the order of 10 microns over the required dimensions. The design of the EUV spectrometer is discussed with particular reference to the optics for a primarily spectroscopic application and the fabrication of the f/10 optics.

  7. A linear helicon plasma device with controllable magnetic field gradient.

    PubMed

    Barada, Kshitish K; Chattopadhyay, P K; Ghosh, J; Kumar, Sunil; Saxena, Y C

    2012-06-01

    Current free double layers (CFDLs) are localized potential structures having spatial dimensions of the order of Debye lengths and potential drops of more than the local electron temperature across them. CFDLs do not need a current to be sustained and hence differ from current-driven double layers. Helicon antenna produced plasmas in an expanded chamber along with an expanding magnetic field have shown the existence of CFDLs near the expansion region. A helicon plasma device has been designed, fabricated, and installed at the Institute for Plasma Research, India to study the role of the maximum magnetic field gradient, as well as its location with respect to the geometrical expansion region of the chamber, in CFDL formation. The special feature of this machine, consisting of two chambers of different radii, is its capability of producing different magnetic field gradients near the physical boundary between the two chambers, either by changing the current in one particular coil in the direction opposite to that in the other coils and/or by varying the position of this particular coil. Although the machine is primarily designed for CFDL experiments, it is also capable of carrying out many basic plasma physics experiments such as wave propagation, wave coupling, and plasma instabilities in a varying magnetic field topology. In this paper, we present the details of the machine construction, its special features, and some preliminary results on the production and characterization of helicon plasma in this machine.

  8. EEG-based emotion recognition in music listening.

    PubMed

    Lin, Yuan-Pin; Wang, Chi-Hong; Jung, Tzyy-Ping; Wu, Tien-Lin; Jeng, Shyh-Kang; Duann, Jeng-Ren; Chen, Jyh-Horng

    2010-07-01

    Ongoing brain activity can be recorded as electroencephalograph (EEG) to discover the links between emotional states and brain activity. This study applied machine-learning algorithms to categorize EEG dynamics according to subject self-reported emotional states during music listening. A framework was proposed to optimize EEG-based emotion recognition by systematically 1) seeking emotion-specific EEG features and 2) exploring the efficacy of the classifiers. Support vector machine was employed to classify four emotional states (joy, anger, sadness, and pleasure) and obtained an averaged classification accuracy of 82.29% +/- 3.06% across 26 subjects. Further, this study identified 30 subject-independent features that were most relevant to emotional processing across subjects and explored the feasibility of using fewer electrodes to characterize the EEG dynamics during music listening. The identified features were primarily derived from electrodes placed near the frontal and the parietal lobes, consistent with many of the findings in the literature. This study might lead to a practical system for noninvasive assessment of the emotional states in practical or clinical applications.
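
    A minimal sketch of the classification step described above, assuming band-power-style EEG features and four emotion labels. The data here are random placeholders rather than recorded EEG, and the feature layout and SVM parameters are illustrative, not those of the study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: rows are trials, columns are EEG band-power features
# (e.g. delta/theta/alpha/beta/gamma power per electrode); labels are the
# four self-reported emotional states. Real features would come from EEG.
rng = np.random.default_rng(0)
X = rng.normal(size=(240, 60))
y = rng.integers(0, 4, size=240)   # 0=joy, 1=anger, 2=sadness, 3=pleasure

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2%}")
```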

  9. Importance of polarity change in the electrical discharge machining

    NASA Astrophysics Data System (ADS)

    Schulze, H.-P.

    2017-10-01

    The polarity change in electrical discharge machining is still a problem and is often performed completely unmotivated or at random. The polarity must first be clearly designated, i.e. the anodic part must be unambiguously assigned to either the tool or the workpiece; normally, the polarity of the workpiece electrode is named. In this paper it is shown which fundamental causes determine the structural behavior of the cathode and anode, and when it makes sense to change the polarity. The polarity change depends primarily on the materials that are used as cathode and anode; a distinction must be made between pure metals and complex materials. Secondarily, the polarity change is also affected by the process energy source (PES) and the supply line. The polarity change is mostly driven by the requirement that removal be maximized on the workpiece while removal (wear) of the tool is kept minimal. A second factor that makes a polarity change necessary is the use of electrical discharge machining in combination with other machining methods, such as electrochemical machining (ECM).

  10. UPEML: a machine-portable CDC Update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Young, M.F.

    1984-12-01

    UPEML is a machine-portable CDC Update emulation program. UPEML is written in ANSI standard Fortran-77 and is relatively simple and compact. It is capable of emulating a significant subset of the standard CDC Update functions including program library creation and subsequent modification. Machine-portability is an essential attribute of UPEML. It was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX 11/780 and the IBM 3081.

  11. Unsupervised classification of major depression using functional connectivity MRI.

    PubMed

    Zeng, Ling-Li; Shen, Hui; Liu, Li; Hu, Dewen

    2014-04-01

    The current diagnosis of psychiatric disorders, including major depressive disorder, is based largely on self-reported symptoms and clinical signs and may therefore be influenced by patients' behaviors and psychiatrists' bias. This study aims at developing an unsupervised machine learning approach for the accurate identification of major depression based on single resting-state functional magnetic resonance imaging scans in the absence of clinical information. Twenty-four medication-naive patients with major depression and 29 demographically similar healthy individuals underwent resting-state functional magnetic resonance imaging. We first clustered the voxels within the perigenual cingulate cortex into two subregions, a subgenual region and a pregenual region, according to their distinct resting-state functional connectivity patterns and showed that a maximum margin clustering-based unsupervised machine learning approach extracted sufficient information from the subgenual cingulate functional connectivity map to differentiate depressed patients from healthy controls with a group-level clustering consistency of 92.5% and an individual-level classification consistency of 92.5%. It was also revealed that the subgenual cingulate functional connectivity network with the highest discriminative power primarily included the ventrolateral and ventromedial prefrontal cortex, superior temporal gyri and limbic areas, indicating that these connections may play critical roles in the pathophysiology of major depression. The current study suggests that subgenual cingulate functional connectivity network signatures may provide promising objective biomarkers for the diagnosis of major depression and that maximum margin clustering-based unsupervised machine learning approaches may have the potential to inform clinical practice and aid in research on psychiatric disorders. Copyright © 2013 Wiley Periodicals, Inc.
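
    The study's classifier is a maximum margin clustering approach, which is not available in common libraries; the sketch below uses k-means as a simple stand-in to show the general two-group unsupervised split on connectivity features and how a group-level consistency score could be computed. All data are synthetic placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

# Placeholder functional-connectivity features: one row per subject
# (24 patients + 29 controls in the study), one column per connection.
rng = np.random.default_rng(1)
features = rng.normal(size=(53, 100))
true_group = np.array([0] * 24 + [1] * 29)   # known only for evaluation

# k-means with two clusters as a simple stand-in for maximum margin clustering.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Cluster labels are arbitrary, so score consistency against both assignments.
consistency = max(np.mean(labels == true_group), np.mean(labels != true_group))
print(f"group-level clustering consistency: {consistency:.1%}")
```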

  12. A linear helicon plasma device with controllable magnetic field gradient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barada, Kshitish K.; Chattopadhyay, P. K.; Ghosh, J.

    2012-06-15

    Current free double layers (CFDLs) are localized potential structures having spatial dimensions of the order of Debye lengths and potential drops of more than the local electron temperature across them. CFDLs do not need a current to be sustained and hence differ from current-driven double layers. Helicon antenna produced plasmas in an expanded chamber along with an expanding magnetic field have shown the existence of CFDLs near the expansion region. A helicon plasma device has been designed, fabricated, and installed at the Institute for Plasma Research, India to study the role of the maximum magnetic field gradient, as well as its location with respect to the geometrical expansion region of the chamber, in CFDL formation. The special feature of this machine, consisting of two chambers of different radii, is its capability of producing different magnetic field gradients near the physical boundary between the two chambers, either by changing the current in one particular coil in the direction opposite to that in the other coils and/or by varying the position of this particular coil. Although the machine is primarily designed for CFDL experiments, it is also capable of carrying out many basic plasma physics experiments such as wave propagation, wave coupling, and plasma instabilities in a varying magnetic field topology. In this paper, we present the details of the machine construction, its special features, and some preliminary results on the production and characterization of helicon plasma in this machine.

  13. Automated Solar Module Assembly Line

    NASA Technical Reports Server (NTRS)

    Bycer, M.

    1979-01-01

    The gathering of information that led to the design approach of the machine is discussed, along with a summary of the findings in the areas of study and a description of each station of the machine. The machine is a cell stringing and string applique machine which is flexible in design, capable of handling a variety of cells and assembling strings of cells which can then be placed in a matrix up to 4 ft x 2 ft in series or parallel arrangement. The target machine cycle is 5 seconds per cell. This machine is primarily adapted to 100 mm round cells with one or two tabs between cells. It places finished strings of up to twelve cells in a matrix of up to six such strings arranged in series or in parallel.

  14. Universal Tool Grinder Operator Instructor's Guide. Part of Single-Tool Skills Program Machine Industries Occupations.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Div. of Curriculum Development.

    The document is an instructor's guide for a course on universal tool grinder operation. The course is designed to train people in making complicated machine setups and performing precision grinding operations and, although intended primarily for adult learners, it can be adapted for high school use. The guide is divided into three parts: (1) the…

  15. Optimization of large matrix calculations for execution on the Cray X-MP vector supercomputer

    NASA Technical Reports Server (NTRS)

    Hornfeck, William A.

    1988-01-01

    A considerable volume of large computational computer codes was developed for NASA over the past twenty-five years. This code represents algorithms developed for machines of an earlier generation. With the emergence of the vector supercomputer as a viable, commercially available machine, an opportunity exists to evaluate optimization strategies to improve the efficiency of existing software. This result is primarily due to architectural differences between the latest generation of large-scale machines and the earlier, mostly uniprocessor, machines. A software package being used by NASA to perform computations on large matrices is described, and a strategy for conversion to the Cray X-MP vector supercomputer is also described.
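
    The original codes are Fortran targeting the Cray X-MP, but the optimization strategy, replacing scalar loops with vector operations, can be illustrated with a small NumPy analogue; this is only an illustration of the idea, not the NASA software.

```python
import numpy as np

n = 100
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# Scalar-style triple loop, typical of codes written for earlier
# uniprocessor machines.
c_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        s = 0.0
        for k in range(n):
            s += a[i, k] * b[k, j]
        c_loop[i, j] = s

# Vectorized form: the whole matrix product is expressed as one operation,
# which a vector (or SIMD) unit can stream through efficiently.
c_vec = a @ b

print(np.allclose(c_loop, c_vec))
```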

  16. The identification of cis-regulatory elements: A review from a machine learning perspective.

    PubMed

    Li, Yifeng; Chen, Chih-Yu; Kaye, Alice M; Wasserman, Wyeth W

    2015-12-01

    The majority of the human genome consists of non-coding regions that have been called junk DNA. However, recent studies have unveiled that these regions contain cis-regulatory elements, such as promoters, enhancers, silencers, insulators, etc. These regulatory elements can play crucial roles in controlling gene expressions in specific cell types, conditions, and developmental stages. Disruption to these regions could contribute to phenotype changes. Precisely identifying regulatory elements is key to deciphering the mechanisms underlying transcriptional regulation. Cis-regulatory events are complex processes that involve chromatin accessibility, transcription factor binding, DNA methylation, histone modifications, and the interactions between them. The development of next-generation sequencing techniques has allowed us to capture these genomic features in depth. Applied analysis of genome sequences for clinical genetics has increased the urgency for detecting these regions. However, the complexity of cis-regulatory events and the deluge of sequencing data require accurate and efficient computational approaches, in particular, machine learning techniques. In this review, we describe machine learning approaches for predicting transcription factor binding sites, enhancers, and promoters, primarily driven by next-generation sequencing data. Data sources are provided in order to facilitate testing of novel methods. The purpose of this review is to attract computational experts and data scientists to advance this field. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.
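
    As one concrete example of the kind of sequence-based predictor the review surveys, the sketch below trains a logistic regression on k-mer count features. The sequences and labels are random toys, so the reported accuracy only demonstrates the workflow, not real enhancer prediction.

```python
from itertools import product

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy k-mer feature approach to enhancer classification; sequences are random
# placeholders rather than real ChIP-seq-derived regions.
K = 4
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]
INDEX = {kmer: i for i, kmer in enumerate(KMERS)}

def kmer_counts(seq: str) -> np.ndarray:
    """Count occurrences of every length-K substring in a DNA sequence."""
    v = np.zeros(len(KMERS))
    for i in range(len(seq) - K + 1):
        v[INDEX[seq[i:i + K]]] += 1
    return v

rng = np.random.default_rng(7)
seqs = ["".join(rng.choice(list("ACGT"), 200)) for _ in range(400)]
labels = rng.integers(0, 2, size=400)   # 1 = enhancer, 0 = background (toy labels)

X = np.array([kmer_counts(s) for s in seqs])
acc = cross_val_score(LogisticRegression(max_iter=2000), X, labels, cv=5).mean()
print(f"cross-validated accuracy (random labels, so ~chance): {acc:.2%}")
```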

  17. A Life Study of Ausforged, Standard Forged and Standard Machined AISI M-50 Spur Gears

    NASA Technical Reports Server (NTRS)

    Townsend, D. P.; Bamberger, E. N.; Zaretsky, E. V.

    1975-01-01

    Tests were conducted at 350 K (170 F) with three groups of 8.9 cm (3.5 in.) pitch diameter spur gears made of vacuum induction melted (VIM), consumable-electrode vacuum-arc remelted (VAR) AISI M-50 steel and one group of vacuum-arc remelted (VAR) AISI 9310 steel. The pitting fatigue life of the standard forged and ausforged gears was approximately five times that of the VAR AISI 9310 gears and ten times the bending fatigue life of the standard machined VIM-VAR AISI M-50 gears run under identical conditions. There was a slight decrease in the 10-percent life of the ausforged gears from that for the standard forged gears, but the difference is not statistically significant. The standard machined gears failed primarily by gear tooth fracture while the forged and ausforged VIM-VAR AISI M-50 and the VAR AISI 9310 gears failed primarily by surface pitting fatigue. The ausforged gears had a slightly greater tendency to fail by tooth fracture than the standard forged gears.

  18. 2007 SB14 Source Reduction Plan/Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, L

    2007-07-24

    Aqueous solutions (mixed waste) generated from various LLNL operations, such as debris washing, sample preparation and analysis, and equipment maintenance and cleanout, were combined for storage in the B695 tank farm. Prior to combination the individual waste streams had different codes depending on the particular generating process and waste characteristics. The largest streams were CWC 132, 791, 134, 792. Several smaller waste streams were also included. This combined waste stream was treated at LLNL's waste treatment facility using a vacuum filtration and cool vapor evaporation process in preparation for discharge to sanitary sewer. Prior to discharge, the treated waste stream was sampled and the results were reviewed by LLNL's water monitoring specialists. The treated solution was discharged following confirmation that it met the discharge criteria. A major source, accounting for 50% of this waste stream, is metal machining, cutting and grinding operations in the engineering machine shops in B321/B131. An additional 7% was from similar operations in B131 and B132S. This waste stream primarily contains metal cuttings from machined parts, machining coolant and water, with small amounts of tramp oil from the machining and grinding equipment. Several waste reduction measures for the B321 machine shop have been taken, including the use of a small point-of-use filtering/tramp-oil coalescing/UV-sterilization coolant recycling unit, and improved management techniques (testing and replenishing) for coolants. The recycling unit had some operational problems during 2006. The machine shop is planning to have it repaired in the near future. Quarterly waste generation data prepared by the Environmental Protection Department's P2 Team are regularly provided to engineering shops as well as other facilities so that generators can track the effectiveness of their waste minimization efforts.

  19. A Tool for Assessing the Text Legibility of Digital Human Machine Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roger Lew; Ronald L. Boring; Thomas A. Ulrich

    2015-08-01

    A tool intended to aid qualified professionals in the assessment of the legibility of text presented on a digital display is described. The assessment of legibility is primarily for the purposes of designing and analyzing human machine interfaces in accordance with NUREG-0700 and MIL-STD 1472G. The tool addresses shortcomings of existing guidelines by providing more accurate metrics of text legibility with greater sensitivity to design alternatives.

  20. CHISSL: A Human-Machine Collaboration Space for Unsupervised Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Dustin L.; Komurlu, Caner; Blaha, Leslie M.

    We developed CHISSL, a human-machine interface that utilizes supervised machine learning in an unsupervised context to help the user group unlabeled instances by her own mental model. The user primarily interacts via correction (moving a misplaced instance into its correct group) or confirmation (accepting that an instance is placed in its correct group). Concurrent with the user's interactions, CHISSL trains a classification model guided by the user's grouping of the data. It then predicts the group of unlabeled instances and arranges some of these alongside the instances manually organized by the user. We hypothesize that this mode of human and machine collaboration is more effective than Active Learning, wherein the machine decides for itself which instances should be labeled by the user. We found supporting evidence for this hypothesis in a pilot study where we applied CHISSL to organize a collection of handwritten digits.

  1. A predictive machine learning approach for microstructure optimization and materials design

    NASA Astrophysics Data System (ADS)

    Liu, Ruoqian; Kumar, Abhishek; Chen, Zhengzhang; Agrawal, Ankit; Sundararaghavan, Veera; Choudhary, Alok

    2015-06-01

    This paper addresses an important materials engineering question: How can one identify the complete space (or as much of it as possible) of microstructures that are theoretically predicted to yield the desired combination of properties demanded by a selected application? We present a problem involving design of magnetoelastic Fe-Ga alloy microstructure for enhanced elastic, plastic and magnetostrictive properties. While theoretical models for computing properties given the microstructure are known for this alloy, inversion of these relationships to obtain microstructures that lead to desired properties is challenging, primarily due to the high dimensionality of microstructure space, multi-objective design requirement and non-uniqueness of solutions. These challenges render traditional search-based optimization methods incompetent in terms of both searching efficiency and result optimality. In this paper, a route to address these challenges using a machine learning methodology is proposed. A systematic framework consisting of random data generation, feature selection and classification algorithms is developed. Experiments with five design problems that involve identification of microstructures that satisfy both linear and nonlinear property constraints show that our framework outperforms traditional optimization methods with the average running time reduced by as much as 80% and with optimality that would not be achieved otherwise.
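
    A hedged sketch of the framework's three stages (random data generation, feature selection, classification). The microstructure descriptors and the property-constraint rule below are synthetic placeholders, and the specific models are assumptions rather than the paper's exact choices.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# 1) Random data generation: synthetic "microstructure" descriptors and a
#    placeholder rule marking which ones satisfy the property constraints.
rng = np.random.default_rng(2)
X = rng.uniform(size=(5000, 30))                      # microstructure features
y = (X[:, 0] + 0.5 * X[:, 3] ** 2 > 0.9).astype(int)  # hypothetical constraint

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 2) Feature selection and 3) classification, chained as one pipeline.
model = make_pipeline(
    SelectKBest(f_classif, k=10),
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2%}")
```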

  2. Welding Behavior of Free Machining Stainless Steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BROOKS,JOHN A.; ROBINO,CHARLES V.; HEADLEY,THOMAS J.

    2000-07-24

    The weld solidification and cracking behavior of sulfur bearing free machining austenitic stainless steel was investigated for both gas-tungsten arc (GTA) and pulsed laser beam weld processes. The GTA weld solidification was consistent with those predicted with existing solidification diagrams and the cracking response was controlled primarily by solidification mode. The solidification behavior of the pulsed laser welds was complex, and often contained regions of primary ferrite and primary austenite solidification, although in all cases the welds were found to be completely austenite at room temperature. Electron backscattered diffraction (EBSD) pattern analysis indicated that the nature of the base metal at the time of solidification plays a primary role in initial solidification. The solid state transformation of austenite to ferrite at the fusion zone boundary, and ferrite to austenite on cooling may both be massive in nature. A range of alloy compositions that exhibited good resistance to solidification cracking and was compatible with both welding processes was identified. The compositional range is bounded by laser weldability at lower $Cr_{eq}/Ni_{eq}$ ratios and by the GTA weldability at higher ratios. It was found with both processes that the limiting ratios were somewhat dependent upon sulfur content.

  3. Probability machines: consistent probability estimation using nonparametric learning machines.

    PubMed

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
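
    The key idea, that a consistent nonparametric regression machine applied to a 0/1 response estimates the conditional probability directly, can be sketched with a regression forest on synthetic data. The data and hyperparameters are illustrative, not those of the paper or its R packages.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic binary-outcome data with a known probability so the estimate can
# be checked; real use would take patient characteristics instead.
rng = np.random.default_rng(3)
X = rng.normal(size=(4000, 5))
p_true = 1.0 / (1.0 + np.exp(-X[:, 0] + 0.5 * X[:, 1]))
y = rng.binomial(1, p_true)   # 0/1 response

X_tr, X_te, y_tr, y_te, p_tr, p_te = train_test_split(X, y, p_true, random_state=0)

# Regression forest on the 0/1 response: predictions are probability
# estimates, not hard class labels.
rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=20, random_state=0)
rf.fit(X_tr, y_tr)
p_hat = rf.predict(X_te)

print(f"mean absolute error vs true probability: {np.mean(np.abs(p_hat - p_te)):.3f}")
```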

  4. Rock sampling. [method for controlling particle size distribution

    NASA Technical Reports Server (NTRS)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaballa, H; O’Brien, M; Riegel, A

    Purpose: To develop a daily quality assurance (QA) device that can test the 6DoF (degrees of freedom) couch repositioning accuracy, prior to SBRT treatment deliveries, with an accuracy of ±0.3 degrees and ±0.3 mm. Methods: A daily QA phantom is designed with a focus on the derived center of projections of its markers, rather than tracking its individual markers one at a time. This approach can be the most favorable to address the intended machining accuracy of the QA phantom and the CBCT spatial resolution limitations, primarily the 1 mm minimum slice thickness, simultaneously. With the current design, ±0.1 mm congruence of the resultant center of gravity of the markers with reference CT (0.6 mm minimum slice thickness) vs CBCT (1.0 mm minimum slice thickness) can be achieved. If successful, the QA device should be qualified to test 6DoF couch performance with a gauged accuracy of ±0.3 degrees/±0.3 mm. Testing is performed for the Varian True Beam 2.0 6DoF system. Results: Once the QA phantom is constructed and tested, agreement of the center of gravity of the reference CT scan and the CBCT scan of ±0.1 mm is achieved. This has translated into a consistent 3D-3D match on the treatment machine, CT vs CBCT, with a repetitive ±0.1 mm variation, thus exceeding our expectations. We have deployed the phantom for daily QA on one of our accelerators, and found that the QA time has increased by only 10 minutes. Conclusion: A 6DoF phantom has been designed (patent pending) and built with a realistic work flow in mind where the daily couch accuracy QA checks take less than 10 minutes. Current developments include integration with Varian's Machine Performance Check consistency module.

  6. Software Deficiency Issues Confronting the Utilization of ’Non-von Neumann’ Architectures

    DTIC Science & Technology

    1989-01-01

    upon work done by Charles Babbage nearly 100 years before. Hence, the "Babbage" machine that was designed in the 1820's and 1830's is generally ... functionality. For example, consider the world's first computer designer, Charles Babbage, who primarily designed in the 1820's what is considered to be the ... primarily considered as direct descendants of ideas that were devised in the 1930's, these ideas were basically rediscoveries of what Charles Babbage

  7. Tool feed influence on the machinability of CO(2) laser optics.

    PubMed

    Arnold, J B; Steger, P J; Saito, T T

    1975-08-01

    Influence of tool feed on the reflectivity of diamond-machined surfaces was evaluated using materials (gold, silver, and copper) from which CO(2) laser optics are primarily produced. Fifteen specimens were machined by holding all machining parameters constant except tool feed. Tool feed was allowed to vary by controlled amounts from one evaluation zone (or part) to another. Past experience has verified that the quality of a diamond-machined surface is not a function of the cutting velocity; therefore, this experiment was conducted on the basis that a variation in cutting velocity was not an influencing factor on the diamond-turning process. Inspection results of the specimens indicated that tool feeds significantly higher than 5.1 μm/rev (200 μin./rev) produced detrimental effects on the machined surfaces. In some cases, at feeds as high as 13 μm/rev (500 μin./rev), visible scoring was evident. Those surfaces produced with tool feeds less than 5.1 μm/rev had little difference in reflectivity. Measurements indicated that their reflectivity existed in a range from 96.7% to 99.3% at 10.6 μm.

  8. Automated solar module assembly line

    NASA Technical Reports Server (NTRS)

    Bycer, M.

    1980-01-01

    The solar module assembly machine which Kulicke and Soffa delivered under this contract is a cell tabbing and stringing machine capable of handling a variety of cells and assembling strings up to 4 feet long, which can then be placed into a module array up to 2 feet by 4 feet in a series or parallel arrangement, and in a straight or interdigitated array format. The machine cycle is 5 seconds per solar cell. This machine is primarily adapted to 3 inch diameter round cells with two tabs between cells. Pulsed heat is used as the bond technique for solar cell interconnects. The solar module assembly machine unloads solar cells from a cassette, automatically orients them, applies flux and solders interconnect ribbons onto the cells. It then inverts the tabbed cells, connects them into cell strings, and delivers them into a module array format using a track-mounted vacuum lance, from which they are taken to test and cleaning benches prior to final encapsulation into finished solar modules. Throughout the machine the solar cell is handled very carefully, and any contact with the collector side of the cell is avoided or minimized.

  9. AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT.

    PubMed

    Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred

    2016-01-01

    In order to maximize assets, manufacturers should use real-time knowledge garnered from ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that make integration difficult. Recently, there has been a push in applying the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the "Islands of Automation" dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper will be on the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we will leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility will be studied with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing.
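
    OEE is conventionally computed as availability × performance × quality. The sketch below shows that calculation from shift-level aggregates that a machine-monitoring stream could supply; the field names are illustrative and are not literal MTConnect data items or ISO 22400 element names.

```python
from dataclasses import dataclass

@dataclass
class ShiftData:
    # Illustrative aggregates a machine-monitoring stream could provide;
    # these are not literal MTConnect data-item names.
    planned_minutes: float
    downtime_minutes: float
    ideal_cycle_minutes: float   # ideal time to produce one part
    parts_produced: int
    parts_good: int

def oee(d: ShiftData) -> float:
    """Overall Equipment Effectiveness = availability * performance * quality."""
    run_time = d.planned_minutes - d.downtime_minutes
    availability = run_time / d.planned_minutes
    performance = (d.ideal_cycle_minutes * d.parts_produced) / run_time
    quality = d.parts_good / d.parts_produced
    return availability * performance * quality

shift = ShiftData(planned_minutes=480, downtime_minutes=45,
                  ideal_cycle_minutes=1.2, parts_produced=300, parts_good=290)
print(f"OEE = {oee(shift):.1%}")
```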

  10. 40 CFR 1074.5 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... activity engaged in as a vocation. Construction equipment or vehicle means any internal combustion engine... vehicle means any internal combustion engine-powered machine primarily used in the commercial production... STATE STANDARDS AND PROCEDURES FOR WAIVER OF FEDERAL PREEMPTION FOR NONROAD ENGINES AND NONROAD VEHICLES...

  11. 40 CFR 1074.5 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... activity engaged in as a vocation. Construction equipment or vehicle means any internal combustion engine... vehicle means any internal combustion engine-powered machine primarily used in the commercial production... STATE STANDARDS AND PROCEDURES FOR WAIVER OF FEDERAL PREEMPTION FOR NONROAD ENGINES AND NONROAD VEHICLES...

  12. Machine learning for characterization of insect vector feeding

    USDA-ARS?s Scientific Manuscript database

    Insects that feed by ingesting plant and animal fluids cause devastating damage to humans, livestock, and agriculture worldwide, primarily by transmitting phytopathogenic and zoonotic pathogens. The feeding processes required for successful disease transmission by sucking insects can be recorded by ...

  13. Traction sheave elevator, hoisting unit and machine space

    DOEpatents

    Hakala, Harri; Mustalahti, Jorma; Aulanko, Esko

    2000-01-01

    Traction sheave elevator consisting of an elevator car moving along elevator guide rails, a counterweight moving along counterweight guide rails, a set of hoisting ropes (3) on which the elevator car and counterweight are suspended, and a drive machine unit (6) driving a traction sheave (7) acting on the hoisting ropes (3) and placed in the elevator shaft. The drive machine unit (6) is of a flat construction. A wall of the elevator shaft is provided with a machine space with its open side facing towards the shaft, the essential parts of the drive machine unit (6) being placed in the space. The hoisting unit (9) of the traction sheave elevator consists of a substantially discoidal drive machine unit (6) and an instrument panel (8) mounted on the frame (20) of the hoisting unit.

  14. 26 CFR 1.188-1 - Amortization of certain expenditures for qualified on-the-job training and child care facilities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... of the training, placement is to be based primarily upon the skills learned through the training... training facility for purposes of section 188 simply because new employees receive training on the machines...

  15. 26 CFR 1.188-1 - Amortization of certain expenditures for qualified on-the-job training and child care facilities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of the training, placement is to be based primarily upon the skills learned through the training... training facility for purposes of section 188 simply because new employees receive training on the machines...

  16. Uniform Slavic Transliteration Alphabet (USTA).

    ERIC Educational Resources Information Center

    Dekleva, Borut

    The Uniform Slavic Transliteration Alphabet (USTA) was designed primarily with the following objectives: to aid librarians (catalogers and bibliographers), information scientists, transliterators, and editors of bibliographic works of the many Slavic tongues; and to serve as original research for the further development of a machine-readable…

  17. 26 CFR 1.188-1 - Amortization of certain expenditures for qualified on-the-job training and child care facilities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... of the training, placement is to be based primarily upon the skills learned through the training... training facility for purposes of section 188 simply because new employees receive training on the machines...

  18. 26 CFR 1.188-1 - Amortization of certain expenditures for qualified on-the-job training and child care facilities.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... of the training, placement is to be based primarily upon the skills learned through the training... training facility for purposes of section 188 simply because new employees receive training on the machines...

  19. 26 CFR 1.188-1 - Amortization of certain expenditures for qualified on-the-job training and child care facilities.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... of the training, placement is to be based primarily upon the skills learned through the training... training facility for purposes of section 188 simply because new employees receive training on the machines...

  20. Machine learning for characterization of insect vector feeding

    USDA-ARS?s Scientific Manuscript database

    Insects that feed by ingesting plant and animal fluids cause devastating damage to humans, livestock, and agriculture worldwide, primarily by transmitting pathogens of plants and animals. The feeding processes required for successful pathogen transmission by sucking insects can be recorded by monito...

  1. A predictive machine learning approach for microstructure optimization and materials design

    DOE PAGES

    Liu, Ruoqian; Kumar, Abhishek; Chen, Zhengzhang; ...

    2015-06-23

    This paper addresses an important materials engineering question: How can one identify the complete space (or as much of it as possible) of microstructures that are theoretically predicted to yield the desired combination of properties demanded by a selected application? We present a problem involving design of magnetoelastic Fe-Ga alloy microstructure for enhanced elastic, plastic and magnetostrictive properties. While theoretical models for computing properties given the microstructure are known for this alloy, inversion of these relationships to obtain microstructures that lead to desired properties is challenging, primarily due to the high dimensionality of microstructure space, multi-objective design requirement and non-uniqueness of solutions. These challenges render traditional search-based optimization methods incompetent in terms of both searching efficiency and result optimality. In this paper, a route to address these challenges using a machine learning methodology is proposed. A systematic framework consisting of random data generation, feature selection and classification algorithms is developed. In conclusion, experiments with five design problems that involve identification of microstructures that satisfy both linear and nonlinear property constraints show that our framework outperforms traditional optimization methods with the average running time reduced by as much as 80% and with optimality that would not be achieved otherwise.

  2. Foam-Mixing-And-Dispensing Machine

    NASA Technical Reports Server (NTRS)

    Chong, Keith Y.; Toombs, Gordon R.; Jackson, Richard J.

    1996-01-01

    Time-and-money-saving machine produces consistent, homogeneously mixed foam, enhancing production efficiency. Automatically mixes and dispenses polyurethane foam in quantities specified by weight. Consists of cart-mounted, air-driven proportioning unit; air-activated mechanical mixing gun; programmable timer/counter, and controller.

  3. Development of a kolanut peeling device.

    PubMed

    Kareem, I; Owolarafe, O K; Ajayi, O A

    2014-10-01

    A kolanut peeling machine was designed, constructed and evaluated for the postharvest processing of the seed. The peeling machine consists of a standing frame, peeling unit and hopper. The peeling unit consists of a special paddle, which mixes the kolanuts, rubs them against one another and against the wall of the barrel, and also conveys the kolanuts to the outlet. The performance of the kolanut peeling machine was evaluated for its peeling efficiency at different moisture contents (53.0, 57.6, and 61.4 % w.b.) and speeds of operation. The result of the analysis of variance shows that the main factors and their interaction had significant effects (p < 0.05) on the peeling efficiency of the machine. The result also shows that the peeling efficiency of the machine increased as the moisture content increased and decreased with increasing machine speed. The highest efficiency of the machine was 60.3 % at a moisture content of 61.4 % w.b. and a speed of 40 rpm.

  4. Guidelines for Developing and Reporting Machine Learning Predictive Models in Biomedical Research: A Multidisciplinary View

    PubMed Central

    2016-01-01

    Background As more and more researchers are turning to big data for new opportunities of biomedical discoveries, machine learning models, as the backbone of big data analysis, are mentioned more often in biomedical journals. However, owing to the inherent complexity of machine learning methods, they are prone to misuse. Because of the flexibility in specifying machine learning models, the results are often insufficiently reported in research articles, hindering reliable assessment of model validity and consistent interpretation of model outputs. Objective To attain a set of guidelines on the use of machine learning predictive models within clinical settings to make sure the models are correctly applied and sufficiently reported so that true discoveries can be distinguished from random coincidence. Methods A multidisciplinary panel of machine learning experts, clinicians, and traditional statisticians were interviewed, using an iterative process in accordance with the Delphi method. Results The process produced a set of guidelines that consists of (1) a list of reporting items to be included in a research article and (2) a set of practical sequential steps for developing predictive models. Conclusions A set of guidelines was generated to enable correct application of machine learning models and consistent reporting of model specifications and results in biomedical research. We believe that such guidelines will accelerate the adoption of big data analysis, particularly with machine learning methods, in the biomedical research community. PMID:27986644

  5. Posthumanism: beyond humanism?

    PubMed

    Valera, Luca

    2014-01-01

    The focal point of posthumanism lies not in an uncritical acceptance of technological promises, as is the case for transhumanism, but in a total contamination and hybridization of human beings with other living beings and with machines (these are the two main forms of contamination). The change of perspective undertaken by posthumanism would thus be a paradigmatic shift in anthropology. As with ecologism, posthumanism, in order to obtain total contamination and man's openness to otherness, proposes the elimination and fluidification of boundaries, thereby even denying man's identity and, with it, the very possibility of openness. However, by denying identity, one denies the condition of possibility of thought as it has been manifested in history until now: hence we understand how posthumanism is, primarily, not an adequate philosophical reflection but a narrative that originates from certain eminently human requirements and that discloses its deeply anthropogenic roots.

  6. Konnen Computer das Sprachproblem losen (Can Computers Solve the Language Problem)?

    ERIC Educational Resources Information Center

    Zeilinger, Michael

    1972-01-01

    Various computer applications in linguistics, primarily speech synthesis and machine translation, are reviewed. Although the computer proves useful for statistics, dictionary building and programmed instruction, the promulgation of a world auxiliary language is considered a more human and practical solution to the international communication…

  7. AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT

    PubMed Central

    Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred

    2017-01-01

    In order to maximize assets, manufacturers should use real-time knowledge garnered from ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that make integration difficult. Recently, there has been a push in applying the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the “Islands of Automation” dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper will be on the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we will leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility will be studied with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing. PMID:28691121

  8. Automatic soldering machine

    NASA Technical Reports Server (NTRS)

    Stein, J. A.

    1974-01-01

    Fully-automatic tube-joint soldering machine can be used to make leakproof joints in aluminum tubes of 3/16 to 2 in. in diameter. The machine consists of a temperature-control unit, heater transformer and heater head, vibrator, and associated circuitry, controls, and indicators.

  9. Decoding the ecological function of accessory genome

    USDA-ARS?s Scientific Manuscript database

    Shiga toxin-producing Escherichia coli O157:H7 primarily resides in cattle asymptomatically, and can be transmitted to humans through food. A study by Lupolova et al applied a machine-learning approach to complex pan-genome information and predicted that only a small subset of bovine isolates have t...

  10. Low-cost boring mill

    NASA Technical Reports Server (NTRS)

    Hibdon, R. A.

    1979-01-01

    Portable unit and special fixture serve as a boring mill. The machine, fabricated primarily from scrap metal, was designed and set up in about 12 working days. It has reduced setup and boring time by 66 percent as compared with existing boring mills, thereby making the latter available for other jobs. The unit can be operated by one man.

  11. Comparing statistical and machine learning classifiers: alternatives for predictive modeling in human factors research.

    PubMed

    Carnahan, Brian; Meyer, Gérard; Kuntz, Lois-Ann

    2003-01-01

    Multivariate classification models play an increasingly important role in human factors research. In the past, these models have been based primarily on discriminant analysis and logistic regression. Models developed from machine learning research offer the human factors professional a viable alternative to these traditional statistical classification methods. To illustrate this point, two machine learning approaches--genetic programming and decision tree induction--were used to construct classification models designed to predict whether or not a student truck driver would pass his or her commercial driver license (CDL) examination. The models were developed and validated using the curriculum scores and CDL exam performances of 37 student truck drivers who had completed a 320-hr driver training course. Results indicated that the machine learning classification models were superior to discriminant analysis and logistic regression in terms of predictive accuracy. Actual or potential applications of this research include the creation of models that more accurately predict human performance outcomes.
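
    A minimal sketch comparing decision tree induction with logistic regression on placeholder curriculum-score data, mirroring the comparison described above; the data, feature count, and pass threshold are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Placeholder data standing in for curriculum scores and CDL pass/fail
# outcomes; the study itself used 37 student drivers.
rng = np.random.default_rng(4)
scores = rng.uniform(50, 100, size=(200, 6))                    # six curriculum scores
passed = (scores.mean(axis=1) + rng.normal(0, 5, 200) > 75).astype(int)

models = [("decision tree", DecisionTreeClassifier(max_depth=3, random_state=0)),
          ("logistic regression", LogisticRegression(max_iter=1000))]
for name, model in models:
    acc = cross_val_score(model, scores, passed, cv=5).mean()
    print(f"{name}: {acc:.2%}")
```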

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartolac, S; Letourneau, D; University of Toronto, Toronto, Ontario

    Purpose: Application of process control theory in quality assurance programs promises to allow earlier identification of problems and potentially better quality in delivery than traditional paradigms based primarily on tolerances and action levels. The purpose of this project was to characterize underlying seasonal variations in linear accelerator output that can be used to improve performance or trigger preemptive maintenance. Methods: Runtime plots of daily (6 MV) output data, acquired using in-house ion chamber based devices over three years for fifteen linear accelerators of varying make and model, were reviewed. Shifts in output due to known interventions with the machines were subtracted from the data to model an uncorrected scenario for each linear accelerator. Observable linear trends were also removed from the data prior to evaluation of periodic variations. Results: Runtime plots of output revealed sinusoidal, seasonal variations that were consistent across all units, irrespective of manufacturer, model or age of machine. The average amplitude of the variation was on the order of 1%. Peak and minimum variations were found to correspond to early April and September, respectively. Approximately 48% of output adjustments made over the period examined were potentially avoidable if baseline levels had corresponded to the mean output, rather than to points near a peak or valley. Linear trends were observed for three of the fifteen units, with annual increases in output ranging from 2–3%. Conclusion: Characterization of cyclical seasonal trends allows for better separation of potentially innate accelerator behaviour from other behaviours (e.g. linear trends) that may be better described as true out of control states (i.e. non-stochastic deviations from otherwise expected behavior) and could indicate service requirements. Results also pointed to an optimal setpoint for accelerators such that output of machines is maintained within set tolerances and interventions are required less frequently.
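
    A hedged sketch of how the roughly 1% annual sinusoidal component could be fitted and removed from daily output data before applying control limits; the simulated data, amplitude, and phase below are illustrative, not measurements from the fifteen accelerators.

```python
import numpy as np
from scipy.optimize import curve_fit

# Simulated three years of daily output (% of baseline): ~1% amplitude
# annual cycle plus measurement noise, mimicking the reported behaviour.
days = np.arange(3 * 365)
rng = np.random.default_rng(5)
output = 100 + 1.0 * np.sin(2 * np.pi * days / 365 + 0.6) + rng.normal(0, 0.3, days.size)

def annual_cycle(t, amp, phase, mean):
    """Annual sinusoid with free amplitude, phase, and mean level."""
    return mean + amp * np.sin(2 * np.pi * t / 365 + phase)

(amp, phase, mean), _ = curve_fit(annual_cycle, days, output, p0=[1.0, 0.0, 100.0])
residual = output - annual_cycle(days, amp, phase, mean)

print(f"fitted seasonal amplitude: {abs(amp):.2f}%")
print(f"residual std after removing the cycle: {residual.std():.2f}%")
```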

  13. A Systematic Strategy for Screening and Application of Specific Biomarkers in Hepatotoxicity Using Metabolomics Combined With ROC Curves and SVMs.

    PubMed

    Li, Yubo; Wang, Lei; Ju, Liang; Deng, Haoyue; Zhang, Zhenzhu; Hou, Zhiguo; Xie, Jiabin; Wang, Yuming; Zhang, Yanjun

    2016-04-01

    Current studies that evaluate toxicity based on metabolomics have primarily focused on the screening of biomarkers while largely neglecting further verification and biomarker applications. For this reason, we used drug-induced hepatotoxicity as an example to establish a systematic strategy for screening specific biomarkers and applied these biomarkers to evaluate whether drugs have potential hepatotoxicity. Carbon tetrachloride (5 ml/kg), acetaminophen (1500 mg/kg), and atorvastatin (5 mg/kg) were used to establish rat hepatotoxicity models. Fifteen common biomarkers were screened by multivariate statistical analysis and integration analysis of the metabolomics data. The receiver operating characteristic curve was used to evaluate the sensitivity and specificity of the biomarkers. We obtained 10 specific biomarker candidates with an area under the curve greater than 0.7. Then, a support vector machine model was established by extracting specific biomarker candidate data from the hepatotoxic and nonhepatotoxic drugs; the accuracy of the model was 94.90% (92.86% sensitivity and 92.59% specificity), and the results demonstrated that those ten biomarkers are specific. Six drugs were then used to predict hepatotoxicity with the support vector machine model; the prediction results were consistent with the biochemical and histopathological results, demonstrating that the model was reliable. Thus, this support vector machine model can be applied to discriminate between the hepatic and nonhepatic toxicity of drugs. This approach not only presents a new strategy for screening specific biomarkers with greater diagnostic significance but also provides a new evaluation pattern for hepatotoxicity, and it will be a highly useful tool in toxicity estimation and disease diagnoses. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
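
    A minimal sketch of the screening-then-classification strategy, assuming synthetic metabolite intensities: candidate biomarkers are ranked by per-marker ROC AUC, those above 0.7 are retained, and an SVM is trained on the retained panel. Everything here except the 0.7 AUC cutoff mentioned above is an assumption.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic metabolite intensities: 15 candidate biomarkers, with the first
# few genuinely shifted in the "hepatotoxic" group.
rng = np.random.default_rng(6)
n = 120
y = rng.integers(0, 2, size=n)        # 0 = non-hepatotoxic, 1 = hepatotoxic
X = rng.normal(size=(n, 15))
X[:, :5] += y[:, None] * 1.2          # informative markers

# Screen by per-marker ROC AUC and keep markers with AUC > 0.7.
aucs = np.array([roc_auc_score(y, X[:, j]) for j in range(X.shape[1])])
keep = aucs > 0.7
print(f"{keep.sum()} specific biomarker candidates retained")

# SVM on the retained panel, as in the discrimination step described above.
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
acc = cross_val_score(svm, X[:, keep], y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2%}")
```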

  14. ALE: automated label extraction from GEO metadata.

    PubMed

    Giles, Cory B; Brown, Chase A; Ripperger, Michael; Dennis, Zane; Roopnarinesingh, Xiavan; Porter, Hunter; Perz, Aleksandra; Wren, Jonathan D

    2017-12-28

    NCBI's Gene Expression Omnibus (GEO) is a rich community resource containing millions of gene expression experiments from human, mouse, rat, and other model organisms. However, information about each experiment (metadata) is in the format of an open-ended, non-standardized textual description provided by the depositor. Thus, classification of experiments for meta-analysis by factors such as gender, age of the sample donor, and tissue of origin is not feasible without assigning labels to the experiments. Automated approaches are preferable for this, primarily because of the size and volume of the data to be processed, but also because they ensure standardization and consistency. While some of these labels can be extracted directly from the textual metadata, many of the data available do not contain explicit text informing the researcher about the age and gender of the subjects within the study. To bridge this gap, machine-learning methods can be trained to use the gene expression patterns associated with the text-derived labels to refine label-prediction confidence. Our analysis shows that only 26% of metadata text contains information about gender and 21% about age. In order to ameliorate the lack of available labels for these data sets, we first extract labels from the textual metadata for each GEO RNA dataset and evaluate the performance against a gold standard of manually curated labels. We then use machine-learning methods to predict labels, based upon the gene expression of the samples, and compare this to the text-based method. Here we present an automated method to extract labels for age, gender, and tissue from textual metadata and GEO data using both a heuristic approach as well as machine learning. We show the two methods together improve the accuracy of label assignment to GEO samples.
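
    As an illustration of the heuristic half of such an approach, a text-mining pass over the free-text metadata might look like the sketch below; the regular expressions and field names are assumptions for the example, not the rules used by ALE.

      import re

      # Toy heuristic label extractor for free-text GEO metadata (illustrative only).
      AGE_RE = re.compile(r"age[^0-9]{0,10}(\d{1,3})", re.I)
      SEX_RE = re.compile(r"\b(male|female|m|f)\b", re.I)

      def extract_labels(metadata_text: str) -> dict:
          labels = {"age": None, "gender": None}
          if (m := AGE_RE.search(metadata_text)):
              labels["age"] = int(m.group(1))
          if (m := SEX_RE.search(metadata_text)):
              token = m.group(1).lower()
              labels["gender"] = "male" if token in ("male", "m") else "female"
          return labels

      print(extract_labels("Sample from a 63 year old female donor, liver tissue; age: 63"))

    Samples whose metadata yield no match would then fall through to the expression-based classifier trained on the text-derived labels.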

  15. Failure prediction using machine learning and time series in optical network.

    PubMed

    Wang, Zhilong; Zhang, Min; Wang, Danshi; Song, Chuang; Liu, Min; Li, Jin; Lou, Liqi; Liu, Zhuo

    2017-08-07

    In this paper, we propose a performance monitoring and failure prediction method in optical networks based on machine learning. The primary algorithms of this method are the support vector machine (SVM) and double exponential smoothing (DES). With a focus on risk-aware models in optical networks, the proposed protection plan primarily investigates how to predict the risk of an equipment failure. To the best of our knowledge, this important problem has not yet been fully considered. Experimental results showed that the average prediction accuracy of our method was 95% when predicting the optical equipment failure state. This finding means that our method can forecast an equipment failure risk with high accuracy. Therefore, our proposed DES-SVM method can effectively improve traditional risk-aware models to protect services from possible failures and enhance the optical network stability.
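
    The DES half of the proposed DES-SVM scheme is a standard double exponential smoothing (Holt) forecast of an equipment-health indicator, whose predicted values are then classified by a trained SVM. A minimal sketch under those assumptions (the readings and smoothing constants below are made up):

      def double_exponential_smoothing(series, alpha=0.5, beta=0.3, horizon=1):
          """Forecast the next `horizon` points of an equipment-health indicator
          with double exponential smoothing (Holt's linear method)."""
          level, trend = series[0], series[1] - series[0]
          for x in series[1:]:
              last_level = level
              level = alpha * x + (1 - alpha) * (level + trend)
              trend = beta * (level - last_level) + (1 - beta) * trend
          return [level + (i + 1) * trend for i in range(horizon)]

      # Example: a slowly degrading optical power reading (dBm); values are invented.
      readings = [-2.0, -2.1, -2.1, -2.3, -2.4, -2.6, -2.9]
      print(double_exponential_smoothing(readings, horizon=3))
      # The forecast values would then be fed to a trained SVM classifier that
      # labels the predicted state as normal or at risk of failure.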

  16. Notes on a storage manager for the Clouds kernel

    NASA Technical Reports Server (NTRS)

    Pitts, David V.; Spafford, Eugene H.

    1986-01-01

    The Clouds project is research directed towards producing a reliable distributed computing system. The initial goal is to produce a kernel which provides a reliable environment with which a distributed operating system can be built. The Clouds kernel consists of a set of replicated subkernels, each of which runs on a machine in the Clouds system. Each subkernel is responsible for the management of resources on its machine; the subkernel components communicate to provide the cooperation necessary to meld the various machines into one kernel. The implementation of a kernel-level storage manager that supports reliability is documented. The storage manager is a part of each subkernel and maintains the secondary storage residing at each machine in the distributed system. In addition to providing the usual data transfer services, the storage manager ensures that data being stored survives machine and system crashes, and that the secondary storage of a failed machine is recovered (made consistent) automatically when the machine is restarted. Since the storage manager is part of the Clouds kernel, efficiency of operation is also a concern.

  17. Neutron spectral measurements in an intense photon field associated with a high-energy x-ray radiotherapy machine.

    PubMed

    Holeman, G R; Price, K W; Friedman, L F; Nath, R

    1977-01-01

    High-energy x-ray radiotherapy machines in the supermegavoltage region generate complex neutron energy spectra which make an exact evaluation of neutron shielding difficult. Fast neutrons resulting from photonuclear reactions in the x-ray target and collimators undergo successive collisions in the surrounding materials and are moderated by varying amounts. In order to examine the neutron radiation exposures quantitatively, the neutron energy spectra have been measured inside and outside the treatment room of a Sagittaire medical linear accelerator (25-MV x rays) located at Yale-New Haven Hospital. The measurements were made using a Bonner spectrometer consisting of 2-, 3-, 5-, 8-, 10- and 12-in.-diameter polyethylene spheres with 6Li and 7Li thermoluminescent dosimeter (TLD) chips at the centers, in addition to bare and cadmium-covered chips. The individual TLD chips were calibrated for neutron and photon response. The spectrometer was calibrated using a known PuBe spectrum. Spectrometer measurements were made at the Yale Electron Accelerator Laboratory and results compared with a neutron time-of-flight spectrometer and an activation technique. The agreement between the results from these independent methods is found to be good, except for the measurements in the direct photon beam. Quality factors have been inferred for the neutron fields inside and outside the treatment room. Values of the inferred quality factors fall primarily between 4 and 8, depending on location.

  18. Machine Learning for the Knowledge Plane

    DTIC Science & Technology

    2006-06-01

    this idea is to combine techniques from machine learning with new architectural concepts in networking to make the internet self-aware and self...work on the machine learning portion of the Knowledge Plane. This consisted of three components: (a) we wrote a document formulating the various

  19. Guidelines for Developing and Reporting Machine Learning Predictive Models in Biomedical Research: A Multidisciplinary View.

    PubMed

    Luo, Wei; Phung, Dinh; Tran, Truyen; Gupta, Sunil; Rana, Santu; Karmakar, Chandan; Shilton, Alistair; Yearwood, John; Dimitrova, Nevenka; Ho, Tu Bao; Venkatesh, Svetha; Berk, Michael

    2016-12-16

    As more and more researchers are turning to big data for new opportunities of biomedical discoveries, machine learning models, as the backbone of big data analysis, are mentioned more often in biomedical journals. However, owing to the inherent complexity of machine learning methods, they are prone to misuse. Because of the flexibility in specifying machine learning models, the results are often insufficiently reported in research articles, hindering reliable assessment of model validity and consistent interpretation of model outputs. The objective was to attain a set of guidelines on the use of machine learning predictive models within clinical settings, to make sure the models are correctly applied and sufficiently reported so that true discoveries can be distinguished from random coincidence. A multidisciplinary panel of machine learning experts, clinicians, and traditional statisticians was interviewed, using an iterative process in accordance with the Delphi method. The process produced a set of guidelines that consists of (1) a list of reporting items to be included in a research article and (2) a set of practical sequential steps for developing predictive models. A set of guidelines was generated to enable correct application of machine learning models and consistent reporting of model specifications and results in biomedical research. We believe that such guidelines will accelerate the adoption of big data analysis, particularly with machine learning methods, in the biomedical research community. ©Wei Luo, Dinh Phung, Truyen Tran, Sunil Gupta, Santu Rana, Chandan Karmakar, Alistair Shilton, John Yearwood, Nevenka Dimitrova, Tu Bao Ho, Svetha Venkatesh, Michael Berk. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 16.12.2016.

  20. Unit: Making Life Easier, Inspection Pack, National Trial Print.

    ERIC Educational Resources Information Center

    Australian Science Education Project, Toorak, Victoria.

    As a part of the unit materials in the series produced by the Australian Science Education Project, this teacher edition is primarily composed of three sections: a core relating to a bicycle, tests, and options. The core is concerned with basic properties of a machine such as force multiplication, speed multiplication, energy dissipation, and…

  1. Criteria for the Segmentation of Vowels on Duplex Oscillograms.

    ERIC Educational Resources Information Center

    Naeser, Margaret A.

    This paper develops criteria for the segmentation of vowels on duplex oscillograms. Previous vowel duration studies have primarily used sound spectrograms. The use of duplex oscillograms, rather than sound spectrograms, permits faster production (real time) at less expense (adding machine paper may be used). The speech signal can be more spread…

  2. [A new machinability test machine and the machinability of composite resins for core built-up].

    PubMed

    Iwasaki, N

    2001-06-01

    A new machinability test machine especially for dental materials was contrived. The purpose of this study was to evaluate the effects of grinding conditions on machinability of core built-up resins using this machine, and to confirm the relationship between machinability and other properties of composite resins. The experimental machinability test machine consisted of a dental air-turbine handpiece, a control weight unit, a driving unit of the stage fixing the test specimen, and so on. The machinability was evaluated as the change in volume after grinding using a diamond point. Five kinds of core built-up resins and human teeth were used in this study. The machinabilities of these composite resins increased with an increasing load during grinding, and decreased with repeated grinding. There was no obvious correlation between the machinability and Vickers' hardness; however, a negative correlation was observed between machinability and scratch width.

  3. Modeling of Geometric Error in Linear Guide Way to Improved the vertical three-axis CNC Milling machine’s accuracy

    NASA Astrophysics Data System (ADS)

    Kwintarini, Widiyanti; Wibowo, Agung; Arthaya, Bagus M.; Yuwana Martawirya, Yatna

    2018-03-01

    The purpose of this study was to improve the accuracy of vertical three-axis CNC milling machines through a general approach based on mathematical modeling of machine tool geometric errors. Geometric errors are an important source of CNC machine inaccuracy during both the manufacturing process and the assembly phase, and controlling them is a prerequisite for building high-accuracy machines. Accuracy of the three-axis vertical milling machine is improved by identifying the geometric errors and their positional error parameters in the machine tool and arranging them in a mathematical model. The geometric error of the machine tool comprises twenty-one error parameters: nine linear error parameters, nine angular error parameters and three perpendicularity (squareness) error parameters. The mathematical modeling approach calculates the alignment and angular errors in the components that support the machine motion, namely the linear guide ways and linear motion elements. The purpose of using this mathematical modeling approach is to identify geometric errors that can serve as a reference during the design, assembly and maintenance stages to improve the accuracy of CNC machines. Mathematically modeling geometric errors in CNC machine tools illustrates the relationship between alignment error, position and angle on the linear guide ways of three-axis vertical milling machines.
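
    A common way to express such a geometric error model is to compose small-angle homogeneous transformation matrices, one per axis, each carrying that axis's linear and angular error terms. The sketch below illustrates only that idea; the error values are invented, and the squareness terms and position dependence of the full twenty-one-parameter model are omitted.

      import numpy as np

      def error_htm(dx, dy, dz, ex, ey, ez):
          """Small-angle homogeneous transform for one axis: three linear errors
          (dx, dy, dz, in mm) and three angular errors (ex, ey, ez, in rad)."""
          return np.array([[1.0, -ez,  ey, dx],
                           [ez,  1.0, -ex, dy],
                           [-ey, ex,  1.0, dz],
                           [0.0, 0.0, 0.0, 1.0]])

      # Illustrative (made-up) error parameters for the X, Y and Z axes at one
      # position along each guide way.
      Tx = error_htm(2e-3, 1e-3, 0.5e-3, 5e-6, 8e-6, 4e-6)
      Ty = error_htm(0.8e-3, 1.5e-3, 1e-3, 6e-6, 3e-6, 7e-6)
      Tz = error_htm(1e-3, 0.7e-3, 2e-3, 4e-6, 9e-6, 2e-6)

      tool_nominal = np.array([0.0, 0.0, 0.0, 1.0])   # nominal tool-tip position
      tool_actual = Tx @ Ty @ Tz @ tool_nominal        # position including errors
      print("volumetric error (mm):", tool_actual[:3] - tool_nominal[:3])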

  4. Multiple Cylinder Free-Piston Stirling Machinery

    NASA Astrophysics Data System (ADS)

    Berchowitz, David M.; Kwon, Yong-Rak

    In order to improve the specific power of piston-cylinder type machinery, there is a point in capacity or power where an advantage accrues with increasing number of piston-cylinder assemblies. In the case of Stirling machinery where primary energy is transferred across the casing wall of the machine, this consideration is even more important. This is due primarily to the difference in scaling of basic power and the required heat transfer. Heat transfer is found to be progressively limited as the size of the machine increases. Multiple cylinder machines tend to preserve the surface area to volume ratio at more favorable levels. In addition, the spring effect of the working gas in the so-called alpha configuration is often sufficient to provide a high frequency resonance point that improves the specific power. There are a number of possible multiple cylinder configurations. The simplest is an opposed pair of piston-displacer machines (beta configuration). A three-cylinder machine requires stepped pistons to obtain proper volume phase relationships. Four to six cylinder configurations are also possible. A small demonstrator inline four cylinder alpha machine has been built to demonstrate both cooling operation and power generation. Data from this machine verifies theoretical expectations and is used to extrapolate the performance of future machines. Vibration levels are discussed and it is argued that some multiple cylinder machines have no linear component to the casing vibration but may have a nutating couple. Example applications are discussed ranging from general purpose coolers, computer cooling, exhaust heat power extraction and some high power engines.

  5. Machining heavy plastic sections

    NASA Technical Reports Server (NTRS)

    Stalkup, O. M.

    1967-01-01

    Machining technique produces consistently satisfactory plane-parallel optical surfaces for pressure windows, made of plexiglass, required to support a photographic study of liquid rocket combustion processes. The surfaces are machined and polished to the required tolerances and show no degradation from stress relaxation over periods as long as 6 months.

  6. 32 CFR 286.29 - Collection of fees and fee rates.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... consists of two parts; individual time (hereafter referred to as human time), and machine time. (i) Human... support, operator, programmer, database administrator, or action officer). (ii) Machine time. Machine time... the time of providing the documents to the requester or recipient when the requester specifically...

  7. 32 CFR 286.29 - Collection of fees and fee rates.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... consists of two parts; individual time (hereafter referred to as human time), and machine time. (i) Human... support, operator, programmer, database administrator, or action officer). (ii) Machine time. Machine time... the time of providing the documents to the requester or recipient when the requester specifically...

  8. 32 CFR 286.29 - Collection of fees and fee rates.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... consists of two parts; individual time (hereafter referred to as human time), and machine time. (i) Human... support, operator, programmer, database administrator, or action officer). (ii) Machine time. Machine time... the time of providing the documents to the requester or recipient when the requester specifically...

  9. View southwest of machine shops, building 18 section visible in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View southwest of machine shops, building 18 section visible in foreground; building 16 section on left. This structure consists of two formerly separate buildings. Jet Lowe, HAER staff photographer, summer 1995. - Naval Base Philadelphia-Philadelphia Naval Shipyard, Machine Shops, League Island, Philadelphia, Philadelphia County, PA

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Overview of the Machine-Tool Task Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, G.P.

    1981-06-08

    The Machine Tool Task Force (MTTF) surveyed the state of the art of machine tool technology for material removal for two and one-half years. This overview gives a brief summary of the approach, specific subjects covered, principal conclusions and some of the key recommendations aimed at improving the technology and advancing the productivity of machine tools. The Task Force consisted of 123 experts from the US and other countries. Their findings are documented in a five-volume report, Technology of Machine Tools.

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  13. Touchable Tornadoes

    ERIC Educational Resources Information Center

    Gilhousen, David

    2004-01-01

    In this article, the author discusses a tornado-producing machine that he used in teacher-led, student assisted demonstrations in order to reinforce concepts learned during a unit on weather. The machine, or simulator, was powered by a hair dryer, fan, and cool-mist humidifier. The machine consists of a demonstration table containing a plenum box,…

  14. Learning Machine, Vietnamese Based Human-Computer Interface.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    The sixth session of IT@EDU98 consisted of seven papers on the topic of the learning machine--Vietnamese based human-computer interface, and was chaired by Phan Viet Hoang (Informatics College, Singapore). "Knowledge Based Approach for English Vietnamese Machine Translation" (Hoang Kiem, Dinh Dien) presents the knowledge base approach,…

  15. Distributed state machine supervision for long-baseline gravitational-wave detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rollins, Jameson Graef, E-mail: jameson.rollins@ligo.org

    The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two identical yet independent, widely separated, long-baseline gravitational-wave detectors. Each Advanced LIGO detector consists of complex optical-mechanical systems isolated from the ground by multiple layers of active seismic isolation, all controlled by hundreds of fast, digital, feedback control systems. This article describes a novel state machine-based automation platform developed to handle the automation and supervisory control challenges of these detectors. The platform, called Guardian, consists of distributed, independent, state machine automaton nodes organized hierarchically for full detector control. User code is written in standard Python and the platform is designed to facilitate the fast-paced development process associated with commissioning the complicated Advanced LIGO instruments. While developed specifically for the Advanced LIGO detectors, Guardian is a generic state machine automation platform that is useful for experimental control at all levels, from simple table-top setups to large-scale multi-million dollar facilities.

  16. Improved Prediction of Non-methylated Islands in Vertebrates Highlights Different Characteristic Sequence Patterns

    PubMed Central

    Vingron, Martin

    2016-01-01

    Non-methylated islands (NMIs) of DNA are genomic regions that are important for gene regulation and development. A recent study of genome-wide non-methylation data in vertebrates by Long et al. (eLife 2013;2:e00348) has shown that many experimentally identified non-methylated regions do not overlap with classically defined CpG islands which are computationally predicted using simple DNA sequence features. This is especially true in cold-blooded vertebrates such as Danio rerio (zebrafish). In order to investigate how predictive DNA sequence is of a region’s methylation status, we applied a supervised learning approach using a spectrum kernel support vector machine, to see if a more complex model and supervised learning can be used to improve non-methylated island prediction and to understand the sequence properties of these regions. We demonstrate that DNA sequence is highly predictive of methylation status, and that in contrast to existing CpG island prediction methods our method is able to provide more useful predictions of NMIs genome-wide in all vertebrate organisms that were studied. Our results also show that in cold-blooded vertebrates (Anolis carolinensis, Xenopus tropicalis and Danio rerio) where genome-wide classical CpG island predictions consist primarily of false positives, longer primarily AT-rich DNA sequence features are able to identify these regions much more accurately. PMID:27984582
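
    For a linear classifier, a spectrum kernel SVM of the kind described reduces to counting k-mers in each sequence and training on the count vectors (the spectrum kernel is the inner product of those vectors). The sketch below shows that reduction on toy sequences; the choice of k, the training sequences and the labels are illustrative assumptions, not the study's data.

      from itertools import product
      import numpy as np
      from sklearn.svm import LinearSVC

      K = 3
      KMERS = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=K))}

      def spectrum_features(seq: str) -> np.ndarray:
          """k-mer (spectrum) counts for one DNA sequence."""
          counts = np.zeros(len(KMERS))
          for i in range(len(seq) - K + 1):
              kmer = seq[i:i + K]
              if kmer in KMERS:
                  counts[KMERS[kmer]] += 1
          return counts

      # Toy training set: label 1 = non-methylated island, 0 = background.
      seqs = ["CGCGCGATCGCG", "ATATATATATTA", "GCGCGGCGCGCG", "TTTTAAATATAT"]
      labels = [1, 0, 1, 0]
      X = np.vstack([spectrum_features(s) for s in seqs])
      model = LinearSVC(C=1.0).fit(X, labels)
      print(model.predict(spectrum_features("CGCGCGCGCGAT").reshape(1, -1)))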

  17. Computer Technology and Educational Equity. ERIC/CUE Urban Diversity Series, Number 91.

    ERIC Educational Resources Information Center

    Gordon, Edmund W.; Armour-Thomas, Eleanor

    The impact of the technological revolution on education is examined in this monograph, which focuses primarily on computers. First, the history of the educational uses of a variety of media (film, radio, television, teaching machines, and videodisc systems) is traced and assessed. As instructional aids, it is said, the media economize teachers'…

  18. Intermountain Range plant names and symbols

    Treesearch

    A. Perry Plummer; Stephen B. Monsen; Richard Stevens

    1977-01-01

    This revised alphabetical list of botanical and common names of vascular plants that primarily grow on wildlands of the Intermountain region and adjacent areas has been assembled for use in quickly recording occurrence of plants in the field and for rapid machine processing of field data. Included are plants found in Utah, Nevada, southern Idaho, and Wyoming, and most...

  19. Modeling a Linear Generator for Energy Harvesting Applications

    DTIC Science & Technology

    2014-12-01

    sensors where electrical power is not available (e.g., wireless sensors on train cars). While piezoelectric harvesters are primarily utilized in... [Remainder of excerpt is table-of-contents and figure-caption residue: "Ship and the Future of Electricity Generation"; "Unmanned Sensor Energy Needs"; "Figure 8. Example two-pole, three-phase salient-pole synchronous machine showing the general layout of windings and major axis".]

  20. CoCom and the Future of Conventional Arms Exports in the Former Communist Bloc

    DTIC Science & Technology

    1993-12-01

    structure and background of each of these firms. The first example is the Godollo Machine Factory, which was primarily involved in repair and renovation of... regularized. Such an attitude is noteworthy because the potential for growth in the space industry is so great. Not only does Russia produce the "Energia

  1. Autonomous biomorphic robots as platforms for sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tilden, M.; Hasslacher, B.; Mainieri, R.

    1996-10-01

    The idea of building autonomous robots that can carry out complex and nonrepetitive tasks is an old one, so far unrealized in any meaningful hardware. Tilden has shown recently that there are simple, processor-free solutions to building autonomous mobile machines that continuously adapt to unknown and hostile environments, are designed primarily to survive, and are extremely resistant to damage. These devices use smart mechanics and simple (low component count) electronic neuron control structures having the functionality of biological organisms from simple invertebrates to sophisticated members of the insect and crab family. These devices are paradigms for the development of autonomous machines that can carry out directed goals. The machine then becomes a robust survivalist platform that can carry sensors or instruments. These autonomous roving machines, now in an early stage of development (several proof-of-concept prototype walkers have been built), can be developed so that they are inexpensive, robust, and versatile carriers for a variety of instrument packages. Applications are immediate and many, in areas as diverse as prosthetics, medicine, space, construction, nanoscience, defense, remote sensing, environmental cleanup, and biotechnology.

  2. Machine Shop. Module 8: CNC (Computerized Numerical Control). Instructor's Guide.

    ERIC Educational Resources Information Center

    Crosswhite, Dwight

    This document consists of materials for a five-unit course on the following topics: (1) safety guidelines; (2) coordinates and dimensions; (3) numerical control math; (4) programming for numerical control machines; and (5) setting and operating the numerical control machine. The instructor's guide begins with a list of competencies covered in the…

  3. Design study and performance analysis of 12S-14P field excitation flux switching motor for hybrid electric vehicle

    NASA Astrophysics Data System (ADS)

    Husin, Zhafir Aizat; Sulaiman, Erwan; Khan, Faisal; Mazlan, Mohamed Mubin Aizat; Othman, Syed Muhammad Naufal Syed

    2015-05-01

    This paper presents a new structure for a 12-slot, 14-pole field excitation flux switching motor (FEFSM) as an alternative non-permanent-magnet (PM) machine candidate for HEV drives. A design study, performance analysis and optimization of the field excitation flux switching machine, which uses no rare-earth magnets, are carried out for hybrid electric vehicle drive applications. The stator of the proposed machine consists of an iron core made of electromagnetic steel, armature coils and field excitation coils as the only field mmf source. The rotor consists only of stacked iron and is therefore robust and appropriate for high-speed operation. The design target is a machine with maximum torque, power and power density of more than 210 Nm, 123 kW and 3.5 kW/kg, respectively, which competes with the interior permanent magnet synchronous machine used in existing hybrid electric vehicles. Design feasibility studies on the FEFSM based on 2D-FEA and a deterministic optimization method are applied to design the proposed machine.

  4. Foods Sold in School Vending Machines are Associated with Overall Student Dietary Intake

    PubMed Central

    Rovner, Alisha J.; Nansel, Tonja R.; Wang, Jing; Iannotti, Ronald J.

    2010-01-01

    Purpose To examine the association between foods sold in school vending machines and students’ dietary behaviors. Methods The 2005-2006 US Health Behavior in School Aged Children (HBSC) survey was administered to 6th to 10th graders and school administrators. Students’ dietary intake was estimated with a brief food frequency measure. Administrators completed questions about foods sold in vending machines. For each food intake behavior, a multilevel regression analysis modeled students (level 1) nested within schools (level 2), with the corresponding food sold in vending machines as the main predictor. Control variables included gender, grade, family affluence and school poverty. Analyses were conducted separately for 6th to 8th and 9th to 10th grades. Results Eighty-three percent of schools (152 schools, 5,930 students) had vending machines which primarily sold foods of minimal nutritional values (soft drinks, chips and sweets). In younger grades, availability of fruits/vegetables and chocolate/sweets was positively related to the corresponding food intake, with vending machine content and school poverty explaining 70.6% of between-school variation in fruit/vegetable consumption, and 71.7% in sweets consumption. In older grades, there was no significant effect of foods available in vending machines on reported consumption of those foods. Conclusions Vending machines are widely available in US public schools. In younger grades, school vending machines were related to students’ diets positively or negatively, depending on what was sold in them. Schools are in a powerful position to influence children’s diets; therefore attention to foods sold in them is necessary in order to try to improve children’s diets. PMID:21185519
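
    The multilevel analysis described above (students at level 1 nested within schools at level 2, with the corresponding vending machine offering as the school-level predictor) could be set up roughly as in the sketch below; the data frame, values and column names are invented for illustration and are not the HBSC variables.

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical student-level records; values and column names are made up.
      data = pd.DataFrame({
          "fruit_veg_intake": [3.1, 2.5, 4.0, 1.8, 2.2, 3.6, 2.9, 1.5, 2.8, 3.3, 2.0, 2.6],
          "vending_fruit_veg": [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0],  # sold in the school's machines?
          "school_poverty":    [0.2, 0.2, 0.2, 0.6, 0.6, 0.6, 0.1, 0.1, 0.1, 0.5, 0.5, 0.5],
          "school_id":         [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
      })

      # Random intercept for each school captures between-school variation.
      model = smf.mixedlm("fruit_veg_intake ~ vending_fruit_veg + school_poverty",
                          data=data, groups=data["school_id"]).fit()
      print(model.summary())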

  5. Food sold in school vending machines is associated with overall student dietary intake.

    PubMed

    Rovner, Alisha J; Nansel, Tonja R; Wang, Jing; Iannotti, Ronald J

    2011-01-01

    To examine the association between food sold in school vending machines and the dietary behaviors of students. The 2005-2006 U.S. Health Behavior in School-aged Children survey was administered to 6th to 10th graders and school administrators. Dietary intake in students was estimated with a brief food frequency measure. School administrators completed questions regarding food sold in vending machines. For each food intake behavior, a multilevel regression analysis modeled students (level 1) nested within schools (level 2), with the corresponding food sold in vending machines as the main predictor. Control variables included gender, grade, family affluence, and school poverty index. Analyses were conducted separately for 6th to 8th and 9th-10th grades. In all, 83% of the schools (152 schools; 5,930 students) had vending machines that primarily sold food of minimal nutritional values (soft drinks, chips, and sweets). In younger grades, availability of fruit and/or vegetables and chocolate and/or sweets was positively related to the corresponding food intake, with vending machine content and school poverty index providing an explanation for 70.6% of between-school variation in fruit and/or vegetable consumption and 71.7% in sweets consumption. Among the older grades, there was no significant effect of food available in vending machines on reported consumption of those food. Vending machines are widely available in public schools in the United States. In younger grades, school vending machines were either positively or negatively related to the diets of the students, depending on what was sold in them. Schools are in a powerful position to influence the diets of children; therefore, attention to the food sold at school is necessary to try to improve their diets. Copyright © 2011 Society for Adolescent Health and Medicine. All rights reserved.

  6. Minicomputer front end. [Modcomp II/CP as buffer between CDC 6600 and PDP-9 at graphics stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudson, J.A.

    1976-01-01

    Sandia Labs developed an Interactive Graphics System (SIGS) that was established on a CDC 6600 using a communication scheme based on the Control Data Corporation product IGS. As implemented at Sandia, the graphics station consists primarily of a PDP-9 with a Vector General display. A system is being developed which uses a minicomputer (Modcomp II/CP) as the buffer machine for the graphics stations. The original SIGS required a dedicated peripheral processor (PP) on the CDC 6600 to handle the communication with the stations; however, with the Modcomp handling the actual communication protocol, the PP is only assigned as needed to handle data transfer within the CDC 6600 portion of SIGS. The new system will thus support additional graphics stations with less impact on the CDC 6600. This paper discusses the design philosophy of the system, and the hardware and software used to implement it. 1 figure.

  7. Documentation for the machine-readable character coded version of the SKYMAP catalogue

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.

    1981-01-01

    The SKYMAP catalogue is a compilation of astronomical data prepared primarily for purposes of attitude guidance for satellites. In addition to the SKYMAP Master Catalogue data base, a software package of data base management and utility programs is available. The tape version of the SKYMAP Catalogue, as received by the Astronomical Data Center (ADC), contains logical records consisting of a combination of binary and EBCDIC data. Certain character coded data in each record are redundant in that the same data are present in binary form. In order to facilitate wider use of all SKYMAP data by the astronomical community, a formatted (character) version was prepared by eliminating all redundant character data and converting all binary data to character form. The character version of the catalogue is described. The document is intended to fully describe the formatted tape so that users can process the data without problems or guesswork; it should be distributed with any character version of the catalogue.

  8. Virtual Manufacturing Techniques Designed and Applied to Manufacturing Activities in the Manufacturing Integration and Technology Branch

    NASA Technical Reports Server (NTRS)

    Shearrow, Charles A.

    1999-01-01

    One of the identified goals of EM3 is to implement virtual manufacturing by the time the year 2000 has ended. To realize this goal of a true virtual manufacturing enterprise the initial development of a machinability database and the infrastructure must be completed. This will consist of the containment of the existing EM-NET problems and developing machine, tooling, and common materials databases. To integrate the virtual manufacturing enterprise with normal day to day operations the development of a parallel virtual manufacturing machinability database, virtual manufacturing database, virtual manufacturing paradigm, implementation/integration procedure, and testable verification models must be constructed. Common and virtual machinability databases will include the four distinct areas of machine tools, available tooling, common machine tool loads, and a materials database. The machine tools database will include the machine envelope, special machine attachments, tooling capacity, location within NASA-JSC or with a contractor, and availability/scheduling. The tooling database will include available standard tooling, custom in-house tooling, tool properties, and availability. The common materials database will include materials thickness ranges, strengths, types, and their availability. The virtual manufacturing databases will consist of virtual machines and virtual tooling directly related to the common and machinability databases. The items to be completed are the design and construction of the machinability databases, virtual manufacturing paradigm for NASA-JSC, implementation timeline, VNC model of one bridge mill and troubleshoot existing software and hardware problems with EN4NET. The final step of this virtual manufacturing project will be to integrate other production sites into the databases bringing JSC's EM3 into a position of becoming a clearing house for NASA's digital manufacturing needs creating a true virtual manufacturing enterprise.

  9. Applying machine learning to identify autistic adults using imitation: An exploratory study.

    PubMed

    Li, Baihua; Sharma, Arjun; Meng, James; Purushwalkam, Senthil; Gowen, Emma

    2017-01-01

    Autism spectrum condition (ASC) is primarily diagnosed by behavioural symptoms including social, sensory and motor aspects. Although stereotyped, repetitive motor movements are considered during diagnosis, quantitative measures that identify kinematic characteristics in the movement patterns of autistic individuals are poorly studied, preventing advances in understanding the aetiology of motor impairment, or whether a wider range of motor characteristics could be used for diagnosis. The aim of this study was to investigate whether data-driven machine learning based methods could be used to address some fundamental problems with regard to identifying discriminative test conditions and kinematic parameters to classify between ASC and neurotypical controls. Data were based on a previous task in which 16 ASC participants and 14 age- and IQ-matched controls observed and then imitated a series of hand movements. Forty kinematic parameters extracted from eight imitation conditions were analysed using machine learning based methods. Two optimal imitation conditions and the nine most significant kinematic parameters were identified and compared with some standard attribute evaluators. To our knowledge, this is the first attempt to apply machine learning to kinematic movement parameters measured during imitation of hand movements to investigate the identification of ASC. Although based on a small sample, the work demonstrates the feasibility of applying machine learning methods to analyse high-dimensional data and suggests the potential of machine learning for identifying kinematic biomarkers that could contribute to the diagnostic classification of autism.

  10. Performance evaluation of the croissant production line with reparable machines

    NASA Astrophysics Data System (ADS)

    Tsarouhas, Panagiotis H.

    2015-03-01

    In this study, analytical probability models were developed for an automated, bufferless serial production system that consists of n machines in series with a common transfer mechanism and control system. Both time to failure and time to repair a failure are assumed to follow exponential distributions. Applying those models, the effect of system parameters on system performance in an actual croissant production line was studied. The production line consists of six workstations with different numbers of repairable machines in series. Mathematical models of the croissant production line have been developed using a Markov process. The strength of this study is in the classification of the whole system into states representing failures of different machines. Failure and repair data from the actual production environment have been used to estimate reliability and maintainability for each machine, each workstation, and the entire line based on the analytical models. The analysis provides useful insight into the system's behaviour, helps to find inherent design faults and suggests optimal modifications to upgrade the system and improve its performance.
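
    For a single repairable machine with exponential failure rate λ and repair rate μ, the two-state Markov model gives a steady-state availability of μ/(λ+μ); treating a bufferless series line as up only when every machine is up then yields a quick estimate of line availability. The sketch below uses invented rates and simplifies the paper's full multi-state model, which distinguishes which machine has failed.

      # Availability of a bufferless series line of repairable machines, assuming
      # exponential time to failure (rate lam) and time to repair (rate mu) per
      # machine; the rates are illustrative, not the croissant-line data.
      failure_rates = [0.01, 0.02, 0.015, 0.01, 0.03, 0.02]   # failures per hour
      repair_rates  = [0.5, 0.4, 0.5, 0.6, 0.3, 0.4]          # repairs per hour

      machine_availability = [mu / (lam + mu) for lam, mu in zip(failure_rates, repair_rates)]

      # In a bufferless series line, every machine must be up for the line to run.
      line_availability = 1.0
      for a in machine_availability:
          line_availability *= a

      for i, a in enumerate(machine_availability, 1):
          print(f"machine {i}: availability {a:.3f}")
      print(f"line availability: {line_availability:.3f}")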

  11. Heuristic for Critical Machine Based a Lot Streaming for Two-Stage Hybrid Production Environment

    NASA Astrophysics Data System (ADS)

    Vivek, P.; Saravanan, R.; Chandrasekaran, M.; Pugazhenthi, R.

    2017-03-01

    Lot streaming in a hybrid flowshop (HFS) is encountered in many real-world problems. This paper deals with a heuristic approach for lot streaming based on critical machine consideration for a two-stage hybrid flowshop. The first stage has two identical parallel machines and the second stage has only one machine. The second-stage machine is considered critical for valid reasons, and this kind of problem is known to be NP-hard. A mathematical model was developed for the selected problem. The simulation modelling and analysis were carried out in Extend V6 software. A heuristic was developed for obtaining the optimal lot streaming schedule. Eleven cases of lot streaming were considered. The proposed heuristic was verified and validated by real-time simulation experiments. All possible lot streaming strategies, and all possible sequences under each lot streaming strategy, were simulated and examined. The heuristic yielded the optimal schedule consistently in all eleven cases. An identification procedure for selecting the best lot streaming strategy was suggested.

  12. Machine Shop. Module 1: Machine Shop Orientation and Math. Instructor's Guide.

    ERIC Educational Resources Information Center

    Curtis, Donna; Nobles, Jack

    This document consists of materials for a six-unit course on employment in the machine shop setting, safety, basic math skills, geometric figures and forms, math applications, and right triangles. The instructor's guide begins with a list of competencies covered in the module, descriptions of the materials included, an explanation of how to use…

  13. Machine Shop Suggested Job and Task Sheets. Part I. 25 Elementary Jobs.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    This volume consists of elementary job and task sheets adaptable for use in the regular vocational industrial education programs for the training of machinists and machine shop operators. Twenty-five simple machine shop job sheets are included. Some or all of this material is provided for each job sheet: an introductory sheet with aim, checking…

  14. Machine Shop Suggested Job and Task Sheets. Part II. 21 Advanced Jobs.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    This volume consists of advanced job and task sheets adaptable for use in the regular vocational industrial education programs for the training of machinists and machine shop operators. Twenty-one advanced machine shop job sheets are included. Some or all of this material is provided for each job: an introductory sheet with aim, checking…

  15. A microcomputer network for control of a continuous mining machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiffbauer, W.H.

    1993-12-31

    This report details a microcomputer-based control and monitoring network that was developed in-house by the U.S. Bureau of Mines and installed on a continuous mining machine. The network consists of microcomputers that are connected together via a single twisted-pair cable. Each microcomputer was developed to provide a particular function in the control process. Machine-mounted microcomputers, in conjunction with the appropriate sensors, provide closed-loop control of the machine, navigation, and environmental monitoring. Off-the-machine microcomputers provide remote control of the machine, sensor status, and a connection to the network so that external computers can access network data and control the continuous mining machine. Because of the network's generic structure, it can be installed on most mining machines.

  16. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed.

  17. Grading options for western hemlock "pulpwood" logs from southeastern Alaska.

    Treesearch

    David W. Green; Kent A. McDonald; John Dramm; Kenneth Kilborn

    Properties and grade yield are estimated for structural lumber produced from No. 3, No. 4, and low-end No. 2 grade western hemlock logs of the type previously used primarily for the production of pulp chips. Estimates are given for production in the Structural Framing, Machine Stress Rating, and Laminating Stock grading systems. The information shows that significant...

  18. Fighting Through a Logistics Cyber Attack

    DTIC Science & Technology

    2015-06-19

    [Flattened timeline table of weapon technologies and dates omitted: Chariot, Gunpowder, Machine Gun, Tanks, Aircraft, Radar, Nuclear Weapons, Satellites, GPS, Cyber Weapon.] ...primarily remained in the scientific and academic communities for the next 22 years (Griffiths, 2002). The Internet as we recognize it today... Griffiths (2002) defines the Web as an abstract space of information containing hyperlinked documents and other resources, identified by their Uniformed

  19. SCSI Communication Test Bus

    NASA Technical Reports Server (NTRS)

    Hua, Chanh V.; D'Ambrose, John J.; Jaworski, Richard C.; Halula, Elaine M.; Thornton, David N.; Heligman, Robert L.; Turner, Michael R.

    1990-01-01

    Small Computer System Interface (SCSI) communication test bus provides high-data-rate, standard interconnection enabling communication among International Business Machines (IBM) Personal System/2 Micro Channel, other devices connected to Micro Channel, test equipment, and host computer. Serves primarily as nonintrusive input/output attachment to PS/2 Micro Channel bus, providing rapid communication for debugger. Opens up possibility of using debugger in real-time applications.

  20. Progress on SOFIA primary mirror

    NASA Astrophysics Data System (ADS)

    Geyl, Roland; Tarreau, Michel

    2000-06-01

    REOSC, SAGEM Group, has a significant contribution to the SOFIA project with the design and fabrication of the 2.7-m primary mirror and its fixtures as well as the M3 mirror tower assembly. This paper will primarily report the progress made on the primary mirror design and the first important manufacturing step: its lightweighting by machining pockets from the rear side of the blank.

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational speciality areas within the U.S. machine tool and metals-related…

  4. Embedded control system for computerized franking machine

    NASA Astrophysics Data System (ADS)

    Shi, W. M.; Zhang, L. B.; Xu, F.; Zhan, H. W.

    2007-12-01

    This paper presents a novel control system for a franking machine. A methodology for operating a franking machine using functional controls consisting of connection, configuration and the franking electromechanical drive is studied. A set of enabling technologies for synthesizing postage management software architectures on microprocessor-based embedded systems is proposed. The cryptographic algorithm that calculates mail items is analyzed to enhance postal indicia accountability and security. The study indicated that the franking machine offers reliability, performance and flexibility in printing mail items.

  5. Laser Machining Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook, [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level employment as laser machining technicians. The program was developed through a modification of the DACUM (Developing a Curriculum) technique. The course syllabi volume…

  6. Microcomputer network for control of a continuous mining machine. Information circular/1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiffbauer, W.H.

    1993-01-01

    The paper details a microcomputer-based control and monitoring network that was developed in-house by the U.S. Bureau of Mines, and installed on a Joy 14 continuous mining machine. The network consists of microcomputers that are connected together via a single twisted pair cable. Each microcomputer was developed to provide a particular function in the control process. Machine-mounted microcomputers in conjunction with the appropriate sensors provide closed-loop control of the machine, navigation, and environmental monitoring. Off-the-machine microcomputers provide remote control of the machine, sensor status, and a connection to the network so that external computers can access network data and control the continuous mining machine. Although the network was installed on a Joy 14 continuous mining machine, its use extends beyond it. Its generic structure lends itself to installation onto most mining machine types.

  7. Discrete Adjoint Sensitivity Analysis of Hybrid Dynamical Systems With Switching

    DOE PAGES

    Zhang, Hong; Abhyankar, Shrirang; Constantinescu, Emil; ...

    2017-01-24

    Sensitivity analysis is an important tool for describing power system dynamic behavior in response to parameter variations. It is a central component in preventive and corrective control applications. The existing approaches for sensitivity calculations, namely, finite-difference and forward sensitivity analysis, require a computational effort that increases linearly with the number of sensitivity parameters. In this paper, we investigate, implement, and test a discrete adjoint sensitivity approach whose computational effort is effectively independent of the number of sensitivity parameters. The proposed approach is highly efficient for calculating sensitivities of larger systems and is consistent, within machine precision, with the function whose sensitivity we are seeking. This is an essential feature for use in optimization applications. Moreover, our approach includes a consistent treatment of systems with switching, such as dc exciters, by deriving and implementing the adjoint jump conditions that arise from state-dependent and time-dependent switchings. The accuracy and the computational efficiency of the proposed approach are demonstrated in comparison with the forward sensitivity analysis approach. In conclusion, this paper focuses primarily on the power system dynamics, but the approach is general and can be applied to hybrid dynamical systems in a broader range of fields.

  8. Discrete Adjoint Sensitivity Analysis of Hybrid Dynamical Systems With Switching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hong; Abhyankar, Shrirang; Constantinescu, Emil

    Sensitivity analysis is an important tool for describing power system dynamic behavior in response to parameter variations. It is a central component in preventive and corrective control applications. The existing approaches for sensitivity calculations, namely, finite-difference and forward sensitivity analysis, require a computational effort that increases linearly with the number of sensitivity parameters. In this paper, we investigate, implement, and test a discrete adjoint sensitivity approach whose computational effort is effectively independent of the number of sensitivity parameters. The proposed approach is highly efficient for calculating sensitivities of larger systems and is consistent, within machine precision, with the function whose sensitivity we are seeking. This is an essential feature for use in optimization applications. Moreover, our approach includes a consistent treatment of systems with switching, such as dc exciters, by deriving and implementing the adjoint jump conditions that arise from state-dependent and time-dependent switchings. The accuracy and the computational efficiency of the proposed approach are demonstrated in comparison with the forward sensitivity analysis approach. In conclusion, this paper focuses primarily on the power system dynamics, but the approach is general and can be applied to hybrid dynamical systems in a broader range of fields.

  9. "Diagonalization" of a compound Atwood machine

    NASA Astrophysics Data System (ADS)

    Crawford, Frank S.

    1987-06-01

    We consider a simple Atwood machine consisting of a massless frictionless pulley no. 0 supporting two masses m1 and m2 connected by a massless flexible string. We show that the string that supports massless pulley no. 0 "thinks" it is simply supporting a mass m0, with m0 = 4m1m2/(m1+m2). This result, together with Einstein's equivalence principle, allows us to solve easily those compound Atwood machines created by replacing one or both of m1 and m2 in machine no. 0 by an Atwood machine. We may then replace the masses in these new machines by machines, etc. The complete solution can be written down immediately, without solving simultaneous equations. Finally we give the effective mass of an Atwood machine whose pulley has nonzero mass and moment of inertia.
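
    The quoted effective mass is the standard textbook result and follows from the string tension in an ideal Atwood machine; a brief derivation (not taken from the article itself):

      a = \frac{m_1 - m_2}{m_1 + m_2}\, g, \qquad
      T = m_2 (g + a) = \frac{2 m_1 m_2}{m_1 + m_2}\, g .

    Both string segments pull down on the massless pulley, so the string supporting it carries a force 2T and therefore "thinks" it holds

      m_0 = \frac{2T}{g} = \frac{4 m_1 m_2}{m_1 + m_2} .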

  10. Abrasive slurry composition for machining boron carbide

    DOEpatents

    Duran, Edward L.

    1985-01-01

    An abrasive slurry particularly suited for use in drilling or machining boron carbide consists essentially of a suspension of boron carbide and/or silicon carbide grit in a carrier solution consisting essentially of a dilute solution of alkylaryl polyether alcohol in octyl alcohol. The alkylaryl polyether alcohol functions as a wetting agent which improves the capacity of the octyl alcohol for carrying the grit in suspension, yet without substantially increasing the viscosity of the carrier solution.

  11. Abrasive slurry composition for machining boron carbide

    DOEpatents

    Duran, E.L.

    1984-11-29

    An abrasive slurry particularly suited for use in drilling or machining boron carbide consists essentially of a suspension of boron carbide and/or silicon carbide grit in a carrier solution consisting essentially of a dilute solution of alkylaryl polyether alcohol in octyl alcohol. The alkylaryl polyether alcohol functions as a wetting agent which improves the capacity of the octyl alcohol for carrying the grit in suspension, yet without substantially increasing the viscosity of the carrier solution.

  12. Transducer-actuator systems and methods for performing on-machine measurements and automatic part alignment

    DOEpatents

    Barkman, William E.; Dow, Thomas A.; Garrard, Kenneth P.; Marston, Zachary

    2016-07-12

    Systems and methods for performing on-machine measurements and automatic part alignment, including: a measurement component operable for determining the position of a part on a machine; and an actuation component operable for adjusting the position of the part by contacting the part with a predetermined force responsive to the determined position of the part. The measurement component consists of a transducer. The actuation component consists of a linear actuator. Optionally, the measurement component and the actuation component consist of a single linear actuator operable for contacting the part with a first lighter force for determining the position of the part and with a second harder force for adjusting the position of the part. The actuation component is utilized in a substantially horizontal configuration and the effects of gravitational drop of the part are accounted for in the force applied and the timing of the contact.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angers, Crystal Plume; Bottema, Ryan; Buckley, Les

    Purpose: Treatment unit uptime statistics are typically used to monitor radiation equipment performance. The Ottawa Hospital Cancer Centre has introduced the use of Quality Control (QC) test success as a quality indicator for equipment performance and overall health of the equipment QC program. Methods: Implemented in 2012, QATrack+ is used to record and monitor over 1100 routine machine QC tests each month for 20 treatment and imaging units ( http://qatrackplus.com/ ). Using an SQL (structured query language) script, automated queries of the QATrack+ database are used to generate program metrics such as the number of QC tests executed and the percentage of tests passing, at tolerance, or at action. These metrics are compared against machine uptime statistics already reported within the program. Results: Program metrics for 2015 show good correlation between the pass rate of QC tests and uptime for a given machine. For the nine conventional linacs, the QC test success rate was consistently greater than 97%. The corresponding uptimes for these units are better than 98%. Machines that consistently show higher failure or tolerance rates in the QC tests have lower uptimes. This points to either poor machine performance requiring corrective action or to problems with the QC program. Conclusions: QATrack+ significantly improves the organization of QC data but can also aid in overall equipment management. Complementing machine uptime statistics with QC test metrics provides a more complete picture of overall machine performance and can be used to identify areas of improvement in the machine service and QC programs.
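
    As a rough illustration of the kind of automated database query described above, the snippet below rolls test results up into pass/tolerance/action percentages per unit. The table and column names are hypothetical placeholders, not the actual QATrack+ schema, and sqlite3 merely stands in for the production database backend.

        # Sketch of an automated QC-metrics query; schema names are hypothetical.
        import sqlite3

        QUERY = """
        SELECT unit_name,
               COUNT(*)                                     AS tests_run,
               100.0 * SUM(status = 'ok')        / COUNT(*) AS pct_pass,
               100.0 * SUM(status = 'tolerance') / COUNT(*) AS pct_tolerance,
               100.0 * SUM(status = 'action')    / COUNT(*) AS pct_action
        FROM qc_test_instances
        WHERE test_date BETWEEN ? AND ?
        GROUP BY unit_name
        ORDER BY pct_pass DESC;
        """

        def monthly_metrics(db_path, start, end):
            # Returns one row of aggregate QC metrics per treatment/imaging unit.
            with sqlite3.connect(db_path) as con:
                return con.execute(QUERY, (start, end)).fetchall()

        # Example call (path and dates are illustrative):
        # for row in monthly_metrics("qatrack.db", "2015-01-01", "2015-01-31"):
        #     print(row)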

  14. Assisted navigation based on shared-control, using discrete and sparse human-machine interfaces.

    PubMed

    Lopes, Ana C; Nunes, Urbano; Vaz, Luís

    2010-01-01

    This paper presents a shared-control approach for Assistive Mobile Robots (AMR), which depends on the user's ability to navigate a semi-autonomous powered wheelchair using a sparse and discrete human-machine interface (HMI). This system is primarily intended to help users with severe motor disabilities that prevent them from using standard human-machine interfaces. Scanning interfaces and Brain Computer Interfaces (BCI), which characteristically provide a small set of commands issued sparsely, are possible HMIs. This shared-control approach is intended to be applied in an Assisted Navigation Training Framework (ANTF) that is used to train users' ability to steer a powered wheelchair in an appropriate manner, given the restrictions imposed by their limited motor capabilities. A shared controller based on user characterization is proposed. This controller is able to combine the information provided by the local motion planning level with the commands issued sparsely by the user. Simulation results of the proposed shared-control method are presented.

  15. Bioterrorism-related Inhalational Anthrax in an Elderly Woman, Connecticut, 2001

    PubMed Central

    Mead, Paul; Armstrong, Gregory L.; Painter, John; Kelley, Katherine A.; Hoffmaster, Alex R.; Mayo, Donald; Barden, Diane; Ridzon, Renee; Parashar, Umesh; Teshale, Eyasu Habtu; Williams, Jen; Noviello, Stephanie; Perz, Joseph F.; Mast, Eric E.; Swerdlow, David L.; Hadler, James L.

    2003-01-01

    On November 20, 2001, inhalational anthrax was confirmed in an elderly woman from rural Connecticut. To determine her exposure source, we conducted an extensive epidemiologic, environmental, and laboratory investigation. Molecular subtyping showed that her isolate was indistinguishable from isolates associated with intentionally contaminated letters. No samples from her home or community yielded Bacillus anthracis, and she received no first-class letters from facilities known to have processed intentionally contaminated letters. Environmental sampling in the regional Connecticut postal facility yielded B. anthracis spores from 4 (31%) of 13 sorting machines. One extensively contaminated machine primarily processes bulk mail. A second machine that does final sorting of bulk mail for her zip code yielded B. anthracis on the column of bins for her carrier route. The evidence suggests she was exposed through a cross-contaminated bulk mail letter. Such cross-contamination of letters and postal facilities has implications for managing the response to future B. anthracis–contaminated mailings. PMID:12781007

  16. UPEML Version 2. 0: A machine-portable CDC Update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Young, M.F.

    1987-05-01

    UPEML is a machine-portable CDC Update emulation program. UPEML is written in ANSI standard Fortran-77 and is relatively simple and compact. It is capable of emulating a significant subset of the standard CDC Update functions, including program library creation and subsequent modification. Machine portability is an essential attribute of UPEML. UPEML was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX 11/780 and the IBM 3081. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both the COS and CTSS operating systems, on APOLLO workstations, and on the HP-9000. Version 2.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the compile file. Further enhancements include checks for overlapping corrections, processing of nested calls to common decks, and reads and addfiles from alternate input files.

  17. An Automated Classification Technique for Detecting Defects in Battery Cells

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2006-01-01

    Battery cell defect classification is primarily done manually by a human conducting a visual inspection to determine if the battery cell is acceptable for a particular use or device. Human visual inspection is a time-consuming task when compared to an inspection process conducted by a machine vision system. Human inspection is also subject to human error and fatigue over time. We present a machine vision technique that can be used to automatically identify defective sections of battery cells via a morphological feature-based classifier using an adaptive two-dimensional fast Fourier transformation technique. The initial area of interest is automatically classified as either an anode or cathode cell view as well as classified as an acceptable or a defective battery cell. Each battery cell is labeled and cataloged for comparison and analysis. The result is the implementation of an automated machine vision technique that provides a highly repeatable and reproducible method of identifying and quantifying defects in battery cells.
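
    In the spirit of the two-dimensional FFT feature extraction described above, the sketch below computes simple frequency-ring energy features for an image patch; the specific features and toy data are illustrative assumptions, not the authors' morphological feature set or classifier.

        # Illustrative 2D-FFT ring-energy features for an image patch (assumed setup).
        import numpy as np

        def fft_ring_features(patch, n_rings=4):
            """Fraction of 2-D FFT magnitude energy in concentric frequency rings."""
            spec = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
            h, w = spec.shape
            yy, xx = np.mgrid[0:h, 0:w]
            r = np.hypot(yy - h / 2, xx - w / 2)
            r_max = r.max()
            feats = []
            for i in range(n_rings):
                mask = (r >= i * r_max / n_rings) & (r < (i + 1) * r_max / n_rings)
                feats.append(spec[mask].sum() / spec.sum())
            return np.array(feats)

        # Toy patches: a smooth "good" region vs. a scratched "defective" region.
        rng = np.random.default_rng(4)
        good = rng.normal(0.5, 0.02, size=(64, 64))
        bad = good.copy()
        bad[30:34, :] += 0.5   # a scratch adds high-frequency energy to the spectrum
        print("good:", fft_ring_features(good).round(3))
        print("bad: ", fft_ring_features(bad).round(3))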

  18. Research as a guide for teaching introductory mechanics: An illustration in the context of the Atwood's machine

    NASA Astrophysics Data System (ADS)

    McDermott, Lillian C.; Shaffer, Peter S.; Somers, Mark D.

    1994-01-01

    A problem on the Atwood's machine is often introduced early in the teaching of dynamics to demonstrate the application of Newton's laws to the motion of a compound system. In a series of preliminary studies, student understanding of the Atwood's machine was examined after this topic had been covered in a typical calculus-based course. Analysis of the data revealed that many students had serious difficulties with the acceleration, the internal and external forces, and the role of the string. The present study was undertaken to obtain more detailed information about the nature and prevalence of these difficulties and thus provide a sound basis for the design of more effective instruction. The context for the investigation is a group of related problems involving less complicated compound systems. Specific examples illustrate how this research, which was conducted primarily in a classroom setting, has served as a guide in the development of tutorial materials to supplement the lectures and textbook in a standard introductory course.

  19. Laboratory pavement polishing device (wear machine) versus field friction test units and accumulative traffic (ADT). Phase 2. Correlation study

    NASA Astrophysics Data System (ADS)

    Godwin, H. F.; Loyed, D. B.; Miley, W. G.; Page, G. C.

    1981-04-01

    The degree to which pavement wear (vehicular traffic) could be predicted from testing samples of in-service pavements in the laboratory pavement polishing device was determined. This investigation was made on asphaltic concrete pavements, primarily friction courses used in Florida. These pavements were tested at different levels of accumulative traffic (ADT) for approximately 2 years.

  20. Risk estimation using probability machines.

    PubMed

    Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D

    2014-03-01

    Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from.
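
    A minimal sketch of the "probability machine" idea, using scikit-learn's random forest to estimate conditional probabilities and a counterfactual-style risk difference; the simulated data and the particular contrast are illustrative assumptions, not the authors' code.

        # Random forest as a probability machine (illustrative, assumed data-generating model).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 5000
        x1 = rng.integers(0, 2, n)            # binary exposure of interest
        x2 = rng.normal(size=n)               # continuous covariate
        logit = -0.5 + 1.0 * x1 + 0.8 * x2    # assumed logistic data-generating model
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        X = np.column_stack([x1, x2])
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=25, random_state=0)
        rf.fit(X, y)

        # Conditional probability estimates for each subject.
        p_hat = rf.predict_proba(X)[:, 1]

        # Counterfactual contrast: set x1 to 1 and to 0 for everyone and compare predictions.
        X1 = X.copy(); X1[:, 0] = 1
        X0 = X.copy(); X0[:, 0] = 0
        risk_diff = rf.predict_proba(X1)[:, 1].mean() - rf.predict_proba(X0)[:, 1].mean()
        print("average risk difference for x1:", round(risk_diff, 3))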

  1. Recent laser upgrades at Sandia’s Z-backlighter facility in order to accommodate new requirements for magnetized liner inertial fusion on the Z-machine

    DOE PAGES

    Schwarz, Jens; Rambo, Patrick; Armstrong, Darrell; ...

    2016-10-21

    The Z-backlighter laser facility primarily consists of two high energy, high-power laser systems. Z-Beamlet laser (ZBL) (Rambo et al., Appl. Opt. 44, 2421 (2005)) is a multi-kJ-class, nanosecond laser operating at 1054 nm which is frequency doubled to 527 nm in order to provide x-ray backlighting of high energy density events on the Z-machine. Z-Petawatt (ZPW) (Schwarz et al., J. Phys.: Conf. Ser. 112, 032020 (2008)) is a petawatt-class system operating at 1054 nm delivering up to 500 J in 500 fs for backlighting and various short-pulse laser experiments (see also Figure 10 for a facility overview). With the development of the magnetized liner inertial fusion (MagLIF) concept on the Z-machine, the primary backlighting missions of ZBL and ZPW have been adjusted accordingly. As a result, we have focused our recent efforts on increasing the output energy of ZBL from 2 to 4 kJ at 527 nm by modifying the fiber front end to now include extra bandwidth (for stimulated Brillouin scattering suppression). The MagLIF concept requires a well-defined/behaved beam for interaction with the pressurized fuel. Hence we have made great efforts to implement an adaptive optics system on ZBL and have explored the use of phase plates. We are also exploring concepts to use ZPW as a backlighter for ZBL driven MagLIF experiments. Alternatively, ZPW could be used as an additional fusion fuel pre-heater or as a temporally flexible high energy pre-pulse. All of these concepts require the ability to operate the ZPW in a nanosecond long-pulse mode, in which the beam can co-propagate with ZBL. Finally, some of the proposed modifications are complete and most of them are well on their way.

  2. Recent laser upgrades at Sandia’s Z-backlighter facility in order to accommodate new requirements for magnetized liner inertial fusion on the Z-machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwarz, Jens; Rambo, Patrick; Armstrong, Darrell

    The Z-backlighter laser facility primarily consists of two high energy, high-power laser systems. Z-Beamlet laser (ZBL) (Rambo et al., Appl. Opt. 44, 2421 (2005)) is a multi-kJ-class, nanosecond laser operating at 1054 nm which is frequency doubled to 527 nm in order to provide x-ray backlighting of high energy density events on the Z-machine. Z-Petawatt (ZPW) (Schwarz et al., J. Phys.: Conf. Ser. 112, 032020 (2008)) is a petawatt-class system operating at 1054 nm delivering up to 500 J in 500 fs for backlighting and various short-pulse laser experiments (see also Figure 10 for a facility overview). With the development of the magnetized liner inertial fusion (MagLIF) concept on the Z-machine, the primary backlighting missions of ZBL and ZPW have been adjusted accordingly. As a result, we have focused our recent efforts on increasing the output energy of ZBL from 2 to 4 kJ at 527 nm by modifying the fiber front end to now include extra bandwidth (for stimulated Brillouin scattering suppression). The MagLIF concept requires a well-defined/behaved beam for interaction with the pressurized fuel. Hence we have made great efforts to implement an adaptive optics system on ZBL and have explored the use of phase plates. We are also exploring concepts to use ZPW as a backlighter for ZBL driven MagLIF experiments. Alternatively, ZPW could be used as an additional fusion fuel pre-heater or as a temporally flexible high energy pre-pulse. All of these concepts require the ability to operate the ZPW in a nanosecond long-pulse mode, in which the beam can co-propagate with ZBL. Finally, some of the proposed modifications are complete and most of them are well on their way.

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. AAA+ Machines of Protein Destruction in Mycobacteria.

    PubMed

    Alhuwaider, Adnan Ali H; Dougan, David A

    2017-01-01

    The bacterial cytosol is a complex mixture of macromolecules (proteins, DNA, and RNA), which collectively are responsible for an enormous array of cellular tasks. Proteins are central to most, if not all, of these tasks, and as such their maintenance (commonly referred to as protein homeostasis or proteostasis) is vital for cell survival during normal and stressful conditions. The two key aspects of protein homeostasis are (i) the correct folding and assembly of proteins (coupled with their delivery to the correct cellular location) and (ii) the timely removal of unwanted or damaged proteins from the cell, which are performed by molecular chaperones and proteases, respectively. A major class of proteins that contribute to both of these tasks is the AAA+ (ATPases associated with a variety of cellular activities) protein superfamily. Although much is known about the structure of these machines and how they function in the model Gram-negative bacterium Escherichia coli, we are only just beginning to discover the molecular details of these machines and how they function in mycobacteria. Here we review the different AAA+ machines that contribute to proteostasis in mycobacteria. Primarily we will focus on the recent advances in the structure and function of AAA+ proteases, the substrates they recognize, and the cellular pathways they control. Finally, we will discuss the recent developments related to these machines as novel drug targets.

  12. Photoelectron studies of machined brass surfaces

    NASA Astrophysics Data System (ADS)

    Potts, A. W.; Merrison, J. P.; Tournas, A. D.; Yacoot, A.

    UV photoelectron spectroscopy has been used to determine the surface composition of machined brass. The results show a considerable change between the photoelectron surface composition and the bulk composition of the same sample determined by energy-dispersive X-ray fluorescence. On the surface the lead composition is increased by ˜900 G. This is consistent with the important part that lead is believed to play in improving the machinability of this alloy.

  13. Machine vision system for online inspection of freshly slaughtered chickens

    USDA-ARS?s Scientific Manuscript database

    A machine vision system was developed and evaluated for the automation of online inspection to differentiate freshly slaughtered wholesome chickens from systemically diseased chickens. The system consisted of an electron-multiplying charge-coupled-device camera used with an imaging spectrograph and ...

  14. High efficiency machining technology and equipment for edge chamfer of KDP crystals

    NASA Astrophysics Data System (ADS)

    Chen, Dongsheng; Wang, Baorui; Chen, Jihong

    2016-10-01

    Potassium dihydrogen phosphate (KDP) is a type of nonlinear optical crystal material. To inhibit transverse stimulated Raman scattering of the laser beam and thus enhance the optical performance of the optics, the edges of the large-sized KDP crystal need to be removed to form chamfered faces with high surface quality (RMS<5 nm). However, as the depth of cut (DOC) in fly cutting is very small, its machining efficiency is too low to be acceptable for chamfering of the KDP crystal, since the amount of material to be removed is on the order of millimeters. This paper proposes a novel hybrid machining method, which combines precision grinding with fly cutting, for crack-free and high-efficiency chamfering of KDP crystals. A specialized machine tool, which adopts an aerostatic-bearing linear slide and an aerostatic-bearing spindle, was developed for chamfering the KDP crystal. The aerostatic-bearing linear slide consists of an aerostatic bearing guide with a linearity of 0.1 μm/100 mm and a linear motor, achieving linear feeding with high precision and high dynamic performance. The vertical spindle consists of an aerostatic bearing spindle with an axial rotation accuracy of 0.05 μm and a fork-type flexible-coupling precision drive mechanism. Machining experiments on fly cutting and grinding were carried out, and optimized machining parameters were obtained from a series of experiments. A surface roughness of 2.4 nm was obtained. The machining efficiency can be improved sixfold using the combined method while producing the same machined surface quality.

  15. Environmental Impact Research Program. Mechanical Site Preparation Techniques. Section 5.7.1, US Army Corps of Engineers Wildlife Resources Management Manual.

    DTIC Science & Technology

    1986-07-01

    diameter. The machine can also chop shrub thickets of Gambel's oak (Quercus gambelii) and chokecherry (Prunus virginiana) into 4- to 6-in. pieces and...bush hog is the side-mounted hog that can be hydraulically lifted up to 15 ft for pruning tree limbs and shrubs. This implement is used primarily on

  16. Institutional Challenges to Developing Metrics of Success in Irregular Warfare

    DTIC Science & Technology

    2011-12-01

    more resources applied to a conflict require a larger military organization to manage and utilize the resources. Additionally, the culture of the...organizational culture closely resembles a “machine bureaucracy,” that is, primarily focused on the internal efficiency of the system and is more...evaluating the effects of their activities. Finally, from the effects of national imperative, the culture of the military organization, and the

  17. Tree shaking machine aids cone collection in a Douglas-fir seed orchard.

    Treesearch

    Donald L. Copes; William K. Randall

    1983-01-01

    A boom-type tree shaker was used in a Douglas-fir seed orchard to remove cones from 7- to 9-meter tall grafted Douglas-fir trees. An average of 55 percent of the cones were removed by shaking, while damage inflicted to the upper crown was confined primarily to branch and leader breakage in the top three internodes. Damage to the lower bole, where the shaker head...

  18. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    PubMed

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcomes using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs we review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
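
    A small sketch of the comparison described above: several learning machines and logistic regression estimating outcome probabilities on data that are genuinely logistic. The simulated setup is an assumption for illustration, not the paper's simulation design.

        # Compare probability estimates from learning machines with logistic regression
        # on assumed logistic data (illustrative, not the paper's simulations).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.normal(size=(4000, 2))
        p_true = 1.0 / (1.0 + np.exp(-(X[:, 0] - 0.5 * X[:, 1])))
        y = rng.binomial(1, p_true)

        models = {
            "logistic": LogisticRegression(),
            "k-NN": KNeighborsClassifier(n_neighbors=100),
            "RF": RandomForestClassifier(n_estimators=300, min_samples_leaf=50, random_state=1),
            "SVM": SVC(probability=True),   # Platt-scaled probabilities
        }
        for name, m in models.items():
            p_hat = m.fit(X, y).predict_proba(X)[:, 1]
            print(f"{name:8s} mean abs. error vs true probability: {np.abs(p_hat - p_true).mean():.3f}")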

  19. Diamond turning machine controller implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garrard, K.P.; Taylor, L.W.; Knight, B.F.

    The standard controller for a Pneumo ASG 2500 Diamond Turning Machine, an Allen Bradley 8200, has been replaced with a custom high-performance design. This controller consists of four major components. Axis position feedback information is provided by a Zygo Axiom 2/20 laser interferometer with 0.1 micro-inch resolution. Hardware interface logic couples the computer's digital and analog I/O channels to the diamond turning machine's analog motor controllers, the laser interferometer, and other machine status and control information. It also provides front panel switches for operator override of the computer controller and implements the emergency stop sequence. The remaining two components, the control computer hardware and software, are discussed in detail below.

  20. Air-Bearing Table for Machine Shops

    NASA Technical Reports Server (NTRS)

    Ambrisco, D.

    1986-01-01

    Frequent workpiece repositioning made easier. Air-bearing table facilitates movement of heavy workpiece during machining or between repeated operations at different positions. Table assembly consists of workpiece supporting fixture riding on air bearing. Table especially useful for inertia welding, in which ease of mobility is important.

  1. Geometric improvement of electrochemical discharge micro-drilling using an ultrasonic-vibrated electrolyte

    NASA Astrophysics Data System (ADS)

    Han, Min-Seop; Min, Byung-Kwon; Lee, Sang Jo

    2009-06-01

    Electrochemical discharge machining (ECDM) is a spark-based micromachining method especially suitable for the fabrication of various microstructures on nonconductive materials, such as glass and some engineering ceramics. However, since the spark discharge frequency drops drastically as the machining depth increases, ECDM microhole drilling has had difficulty achieving uniform geometry in machined holes. One of the primary reasons for this is the difficulty of sustaining an adequate electrolyte flow in the narrow gap between the tool and the workpiece, which results in a widened taper at the hole entrance, as well as a significant reduction of the machining depth. In this paper, ultrasonic electrolyte vibration was used to enhance the machining depth of the ECDM drilling process by assuring an adequate electrolyte flow, thus helping to maintain consistent spark generation. Moreover, the stability of the gas film formation, as well as the surface quality of the hole entrance, was improved with the aid of a side-insulated electrode and a pulse-power generator. The side-insulated electrode prevented stray electrolysis and concentrated the spark discharge at the tool tip, while the pulse voltage reduced thermal damage to the workpiece surface by introducing a periodic pulse-off time. Microholes were fabricated in order to investigate the effects of ultrasonic assistance on the overcut and machining depth of the holes. The experimental results demonstrated that consistent spark generation and the machinability of microholes were simultaneously enhanced.

  2. Allocating dissipation across a molecular machine cycle to maximize flux

    PubMed Central

    Brown, Aidan I.; Sivak, David A.

    2017-01-01

    Biomolecular machines consume free energy to break symmetry and make directed progress. Nonequilibrium ATP concentrations are the typical free energy source, with one cycle of a molecular machine consuming a certain number of ATP, providing a fixed free energy budget. Since evolution is expected to favor rapid-turnover machines that operate efficiently, we investigate how this free energy budget can be allocated to maximize flux. Unconstrained optimization eliminates intermediate metastable states, indicating that flux is enhanced in molecular machines with fewer states. When maintaining a set number of states, we show that—in contrast to previous findings—the flux-maximizing allocation of dissipation is not even. This result is consistent with the coexistence of both “irreversible” and reversible transitions in molecular machine models that successfully describe experimental data, which suggests that, in evolved machines, different transitions differ significantly in their dissipation. PMID:29073016

  3. A time-shared machine repair problem with mixed spares under N-policy

    NASA Astrophysics Data System (ADS)

    Jain, Madhu; Shekhar, Chandra; Shukla, Shalini

    2016-06-01

    The present investigation deals with a machine repair problem consisting of cold and warm standby machines. The machines are subject to breakdown and are repaired by a permanent repairman operating under N-policy. There is provision for one additional removable repairman, who is called upon when the workload of failed machines crosses a certain threshold level and is removed as soon as the workload falls back to that level. Both repairmen recover the failed machines on a time-shared basis, meaning that each repairman divides his repair effort simultaneously among all the failed machines that have joined the system for repair. A Markovian model is developed with queue-dependent rates and solved analytically using a recursive technique. Various performance indices are derived and used to construct a cost function. A numerical illustration, simulation, and sensitivity analysis are provided.
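
    As a simplified illustration of the recursive solution technique mentioned above, the sketch below solves a basic finite-source machine-repair birth-death chain with queue-dependent rates. The rates and parameter values are assumptions; the paper's full model (N-policy, mixed spares, a second removable repairman, time sharing) only changes how the state-dependent rates are defined.

        # Simplified machine-repair chain solved by the recursive (birth-death) technique:
        # state n = number of failed machines, lambda_n = failure rate, mu_n = repair rate.
        M, S = 5, 2                      # operating machines and warm standbys (illustrative)
        lam, nu, mu = 0.1, 0.05, 1.0     # unit failure, standby failure, repair rates (assumed)

        def lambda_n(n):
            # All M units operate while standbys remain; afterwards fewer units operate.
            operating = M if n <= S else M - (n - S)
            standbys = max(S - n, 0)
            return operating * lam + standbys * nu

        def mu_n(n):
            return mu                    # single permanent repairman

        # Recursive solution of the balance equations: p[n+1] = (lambda_n / mu_{n+1}) * p[n]
        p = [1.0]
        for n in range(M + S):
            p.append(p[-1] * lambda_n(n) / mu_n(n + 1))
        total = sum(p)
        p = [x / total for x in p]

        expected_failed = sum(n * pn for n, pn in enumerate(p))
        print("steady-state P(n failed):", [round(x, 4) for x in p])
        print("expected number of failed machines:", round(expected_failed, 3))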

  4. Structural design of morphing trailing edge actuated by SMA

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Xu, Zhiwei; Zhu, Qian

    2013-09-01

    In this paper, the morphing trailing edge is designed to achieve up and down deflection under aerodynamic load. After a detailed and accurate computational analysis to determine the SMA specifications and layout, a solid model is created in CATIA and the structures of the morphing wing trailing edge are produced by CNC machining. A DSP-based measurement and control system is designed to carry out the control experiments on the morphing wing trailing edge. Finally, based on the force analysis, the trailing edge is fabricated in four sections of aluminum alloy and the arrangement scheme of the SMA wires is determined. A precise integral control experiment was performed to assess the control effect. The experiment consists of deflection-angle tests of the third joint and of the integral structure. The ultimate deflection angle is measured first in these two tests, and control experiments at different angles are then performed within this range. The results show that the deflection error is less than 4% and the response time is less than 6.7 s, so precise control of the morphing trailing edge is preliminarily realized.

  5. Age, Gender, and Fine-Grained Ethnicity Prediction using Convolutional Neural Networks for the East Asian Face Dataset

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinivas, Nisha; Rose, Derek C; Bolme, David S

    This paper examines the difficulty associated with performing machine-based automatic demographic prediction on a sub-population of Asian faces. We introduce the Wild East Asian Face dataset (WEAFD), a new and unique dataset to the research community. This dataset consists primarily of labeled face images of individuals from East Asian countries, including Vietnam, Burma, Thailand, China, Korea, Japan, Indonesia, and Malaysia. East Asian Turk annotators were used to judge the age and fine-grained ethnicity attributes in order to reduce the impact of the other-race effect and improve the quality of annotations. We focus on predicting the age, gender, and fine-grained ethnicity of an individual by providing baseline results with a convolutional neural network (CNN). Fine-grained ethnicity prediction refers to predicting the ethnicity of an individual by country or sub-region (Chinese, Japanese, Korean, etc.) of East Asia. Performance for two CNN architectures is presented, highlighting the difficulty of these tasks and showcasing potential design considerations that ease network optimization by promoting region-based feature extraction.

  6. The Universal Anaesthesia Machine (UAM): assessment of a new anaesthesia workstation built with global health in mind.

    PubMed

    de Beer, D A H; Nesbitt, F D; Bell, G T; Rapuleng, A

    2017-04-01

    The Universal Anaesthesia Machine has been developed as a complete anaesthesia workstation for use in low- and middle-income countries, where the provision of safe general anaesthesia is often compromised by unreliable supply of electricity and anaesthetic gases. We performed a functional and clinical assessment of this anaesthetic machine, with particular reference to novel features and functioning in the intended environment. The Universal Anaesthesia Machine was found to be reliable, safe and consistent across a range of tests during targeted functional testing. © 2016 The Association of Anaesthetists of Great Britain and Ireland.

  7. Jacks--A Study of Simple Machines.

    ERIC Educational Resources Information Center

    Parsons, Ralph

    This vocational physics individualized student instructional module on jacks (simple machines used to lift heavy objects) contains student prerequisites and objectives, an introduction, and sections on the ratchet bumper jack, the hydraulic jack, the screw jack, and load limitations. Designed with a laboratory orientation, each section consists of…

  8. 29 CFR 1910.262 - Textiles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... cutter is a machine consisting of one or more rotary blades used for the purpose of cutting textile... shall mean the point of contact between two in-running rolls. (25) Openers and pickers. Openers and... roller, doctor blades, etc. The machine is used for printing fabrics. (29) Ranges (bleaching continuous...

  9. 29 CFR 1910.262 - Textiles.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... cutter is a machine consisting of one or more rotary blades used for the purpose of cutting textile... shall mean the point of contact between two in-running rolls. (25) Openers and pickers. Openers and... roller, doctor blades, etc. The machine is used for printing fabrics. (29) Ranges (bleaching continuous...

  10. 29 CFR 1910.262 - Textiles.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... cutter is a machine consisting of one or more rotary blades used for the purpose of cutting textile... shall mean the point of contact between two in-running rolls. (25) Openers and pickers. Openers and... roller, doctor blades, etc. The machine is used for printing fabrics. (29) Ranges (bleaching continuous...

  11. 29 CFR 1910.262 - Textiles.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... cutter is a machine consisting of one or more rotary blades used for the purpose of cutting textile... shall mean the point of contact between two in-running rolls. (25) Openers and pickers. Openers and... roller, doctor blades, etc. The machine is used for printing fabrics. (29) Ranges (bleaching continuous...

  12. 29 CFR 1910.262 - Textiles.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... cutter is a machine consisting of one or more rotary blades used for the purpose of cutting textile... shall mean the point of contact between two in-running rolls. (25) Openers and pickers. Openers and... roller, doctor blades, etc. The machine is used for printing fabrics. (29) Ranges (bleaching continuous...

  13. Machine Learning Techniques for Stellar Light Curve Classification

    NASA Astrophysics Data System (ADS)

    Hinners, Trisha A.; Tat, Kevin; Thorp, Rachel

    2018-07-01

    We apply machine learning techniques in an attempt to predict and classify stellar properties from noisy and sparse time-series data. We preprocessed over 94 GB of Kepler light curves from the Mikulski Archive for Space Telescopes (MAST) to classify according to 10 distinct physical properties using both representation learning and feature engineering approaches. Studies using machine learning in the field have been primarily done on simulated data, making our study one of the first to use real light-curve data for machine learning approaches. We tuned our data using previous work with simulated data as a template and achieved mixed results between the two approaches. Representation learning using a long short-term memory recurrent neural network produced no successful predictions, but our work with feature engineering was successful for both classification and regression. In particular, we were able to achieve values for stellar density, stellar radius, and effective temperature with low error (∼2%–4%) and good accuracy (∼75%) for classifying the number of transits for a given star. The results show promise for improvement for both approaches upon using larger data sets with a larger minority class. This work has the potential to provide a foundation for future tools and techniques to aid in the analysis of astrophysical data.
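
    A toy sketch of the feature-engineering route described above: each light curve is reduced to a handful of summary statistics and fed to a random forest. The features and synthetic data are illustrative assumptions, not the paper's engineered feature set or the Kepler/MAST data.

        # Feature engineering + random forest for light-curve classification (illustrative).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        def light_curve_features(flux):
            """Reduce a flux time series to a fixed-length feature vector."""
            flux = (flux - np.median(flux)) / (np.std(flux) + 1e-12)
            return np.array([
                np.mean(flux), np.std(flux),
                np.percentile(flux, 5), np.percentile(flux, 95),
                np.mean(np.abs(np.diff(flux))),        # point-to-point scatter
                ((flux[:-1] * flux[1:]) < 0).mean(),   # zero-crossing rate
            ])

        # Toy data: two classes of synthetic light curves (sinusoidal vs. flat + noise).
        rng = np.random.default_rng(2)
        t = np.linspace(0, 10, 500)
        curves = [np.sin(2 * np.pi * rng.uniform(0.5, 2) * t) + 0.1 * rng.normal(size=t.size)
                  for _ in range(200)]
        curves += [0.1 * rng.normal(size=t.size) for _ in range(200)]
        X = np.array([light_curve_features(c) for c in curves])
        y = np.array([1] * 200 + [0] * 200)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())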

  14. Design Guidelines and Criteria for User/Operator Transactions with Battlefield Automated Systems. Volume 5. Background Literature

    DTIC Science & Technology

    1981-02-01

    the machine. ARI's efforts in this area focus on human performance problems related to interactions with command and control centers, and on issues...improvement of the user-machine interface. Lacking consistent design principles, current practice results in a fragmented and unsystematic approach to system...complexity in the user-machine interface of BAS, ARI supported this effort for development of an online language for Army tactical intelligence

  15. Risk estimation using probability machines

    PubMed Central

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  16. A Boltzmann machine for the organization of intelligent machines

    NASA Technical Reports Server (NTRS)

    Moed, Michael C.; Saridis, George N.

    1990-01-01

    A three-tier structure consisting of organization, coordination, and execution levels forms the architecture of an intelligent machine using the principle of increasing precision with decreasing intelligence from a hierarchically intelligent control. This system has been formulated as a probabilistic model, where uncertainty and imprecision can be expressed in terms of entropies. The optimal strategy for decision planning and task execution can be found by minimizing the total entropy in the system. The focus is on the design of the organization level as a Boltzmann machine. Since this level is responsible for planning the actions of the machine, the Boltzmann machine is reformulated to use entropy as the cost function to be minimized. Simulated annealing, expanding subinterval random search, and the genetic algorithm are presented as search techniques to efficiently find the desired action sequence and illustrated with numerical examples.
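
    A minimal simulated-annealing sketch of the kind of search described above, minimizing a cost over discrete action sequences; the action set and the placeholder cost function are illustrative assumptions, not the entropy formulation of the paper.

        # Simulated annealing over action sequences with a placeholder cost standing in
        # for the entropy minimized by the organization level (illustrative assumptions).
        import math, random

        ACTIONS = list(range(8))   # hypothetical primitive actions
        LENGTH = 6                 # length of the action sequence to plan

        def cost(seq):
            # Placeholder "entropy": penalize repeated actions and large jumps.
            return sum(1.0 if a == b else 0.2 * abs(a - b) for a, b in zip(seq, seq[1:]))

        def anneal(steps=20000, t0=2.0, alpha=0.9995):
            random.seed(0)
            current = [random.choice(ACTIONS) for _ in range(LENGTH)]
            best, t = list(current), t0
            for _ in range(steps):
                cand = list(current)
                cand[random.randrange(LENGTH)] = random.choice(ACTIONS)
                d = cost(cand) - cost(current)
                # Accept downhill moves always, uphill moves with Boltzmann probability.
                if d < 0 or random.random() < math.exp(-d / t):
                    current = cand
                    if cost(current) < cost(best):
                        best = list(current)
                t *= alpha
            return best, cost(best)

        print(anneal())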

  17. Design Evaluation for Personnel, Training and Human Factors (DEPTH) Final Report.

    DTIC Science & Technology

    1998-01-17

    human activity was primarily intended to facilitate man-machine design analyses of complex systems. By importing computer aided design (CAD) data, the human figure models and analysis algorithms can help to ensure components can be seen, reached, lifted and removed by most maintainers. These simulations are also useful for logistics data capture, training, and task analysis. DEPTH was also found to be useful in obtaining task descriptions for technical

  18. Machine-Learning Techniques for the Determination of Attrition of Forces Due to Atmospheric Conditions

    DTIC Science & Technology

    2018-02-01

    the possibility of a correlation between aircraft incidents in the National Transportation Safety Board database and meteorological conditions. If a...strong correlation could be found, it could be used to derive a model to predict aircraft incidents and become part of a decision support tool for...techniques, primarily the random forest algorithm, were used to explore the possibility of a correlation between aircraft incidents in the National

  19. ISS Habitability Data Collection and Preliminary Findings

    NASA Technical Reports Server (NTRS)

    Thaxton, Sherry (Principal Investigator); Greene, Maya; Schuh, Susan; Williams, Thomas; Archer, Ronald; Vasser, Katie

    2017-01-01

    Habitability is the relationship between an individual and their surroundings (i.e. the interplay of the person, machines, environment, and mission). The purpose of this study is to assess habitability and human factors on the ISS to better prepare for future long-duration space flights. Scheduled data collection sessions primarily require the use of iSHORT (iPad app) to capture near real-time habitability feedback and analyze vehicle layout and space utilization.

  20. Europe Report, Science and Technology.

    DTIC Science & Technology

    1986-05-29

    developed the institutional software training of today, primarily in institutions of higher learning or in study courses. TV BASIC-- it really would be a...only to develop software but also to offer large catalogs to users of its machines. Bull is studying the product-marketing strategies of companies such...by DEC, an XCON + XSEL package). A prototype is under study; it was developed on an SPS 9 with Kool and a relational data base. The first

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This paper is actually a composite of two papers dealing with automation and computerized control of underground mining equipment. The paper primarily discusses drills, haulage equipment, and tunneling machines. It compares the performance and cost benefits of conventional equipment to the new automated methods. The companies involved are iron ore mining companies in Scandinavia. The papers also discuss the different equipment using air power, water power, hydraulic power, and computer power. The different drill rigs are compared for performance and cost.

  2. Slot Machine Preferences of Pathological and Recreational Gamblers Are Verbally Constructed

    ERIC Educational Resources Information Center

    Dixon, Mark R.; Bihler, Holly L.; Nastally, Becky L.

    2011-01-01

    The current study attempted to alter preferences for concurrently available slot machines of equal payout through the development of equivalence classes and subsequent transfers of functions. Participants rated stimuli consisting of words thought to be associated with having a gambling problem (e.g., "desperation" and "debt"), words associated…

  3. Miniature Gas-Circulating Machine

    NASA Technical Reports Server (NTRS)

    Swift, Walter L.; Valenzuela, Javier A.; Sixsmith, Herbert; Nutt, William E.

    1993-01-01

    Proposed gas-circulating machine consists essentially of centrifugal pump driven by induction motor. Noncontact bearings suppress wear and contamination. Used to circulate helium (or possibly hydrogen or another gas) in regeneration sorption-compressor refrigeration system aboard spacecraft. Also proves useful in terrestrial applications in which long life, reliability, and low contamination essential.

  4. Drilling Precise Orifices and Slots

    NASA Technical Reports Server (NTRS)

    Richards, C. W.; Seidler, J. E.

    1983-01-01

    Reaction control thrustor injector requires precisely machined orifices and slots. Tooling setup consists of rotary table, numerical control system and torque sensitive drill press. Components used to drill oxidizer orifices. Electric discharge machine drills fuel-feed orifices. Device automates production of identical parts so several are completed in less time than previously.

  5. Machine Shop Practice, 13-2. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Army Ordnance Center and School, Aberdeen Proving Ground, MD.

    This military-developed text consists of self-instructional materials dealing with the basic tools and equipment used in metalworking shops. Covered in the individual lessons are the following topics: materials and processes; shop mathematics; blueprint reading and sketching; handtools, measuring instruments, and basic metalworking machines;…

  6. Machine Shop. Student Learning Guide.

    ERIC Educational Resources Information Center

    Palm Beach County Board of Public Instruction, West Palm Beach, FL.

    This student learning guide contains eight modules for completing a course in machine shop. It is designed especially for use in Palm Beach County, Florida. Each module covers one task, and consists of a purpose, performance objective, enabling objectives, learning activities and resources, information sheets, student self-check with answer key,…

  7. Integrating Machine Learning into Space Operations

    NASA Astrophysics Data System (ADS)

    Kelly, K. G.

    There are significant challenges with managing activities in space, which for the scope of this paper are primarily the identification of objects in orbit, maintaining accurate estimates of the orbits of those objects, detecting changes to those orbits, warning of possible collisions between objects and detection of anomalous behavior. The challenges come from the large amounts of data to be processed, which is often incomplete and noisy, limitations on the ability to influence objects in space and the overall strategic importance of space to national interests. The focus of this paper is on defining an approach to leverage the improved capabilities that are possible using state of the art machine learning in a way that empowers operations personnel without sacrificing the security and mission assurance associated with manual operations performed by trained personnel. There has been significant research in the development of algorithms and techniques for applying machine learning in this domain, but deploying new techniques into such a mission critical domain is difficult and time consuming. Establishing a common framework could improve the efficiency with which new techniques are integrated into operations and the overall effectiveness at providing improvements.

  8. Modeling Stochastic Kinetics of Molecular Machines at Multiple Levels: From Molecules to Modules

    PubMed Central

    Chowdhury, Debashish

    2013-01-01

    A molecular machine is either a single macromolecule or a macromolecular complex. In spite of the striking superficial similarities between these natural nanomachines and their man-made macroscopic counterparts, there are crucial differences. Molecular machines in a living cell operate stochastically in an isothermal environment far from thermodynamic equilibrium. In this mini-review we present a catalog of the molecular machines and an inventory of the essential toolbox for theoretically modeling these machines. The tool kits include 1), nonequilibrium statistical-physics techniques for modeling machines and machine-driven processes; and 2), statistical-inference methods for reverse engineering a functional machine from the empirical data. The cell is often likened to a microfactory in which the machineries are organized in modular fashion; each module consists of strongly coupled multiple machines, but different modules interact weakly with each other. This microfactory has its own automated supply chain and delivery system. Buoyed by the success achieved in modeling individual molecular machines, we advocate integration of these models in the near future to develop models of functional modules. A system-level description of the cell from the perspective of molecular machinery (the mechanome) is likely to emerge from further integrations that we envisage here. PMID:23746505

  9. Machine translation project alternatives analysis

    NASA Technical Reports Server (NTRS)

    Bajis, Catherine J.; Bedford, Denise A. D.

    1993-01-01

    The Machine Translation Project consists of several components, two of which, the Project Plan and the Requirements Analysis, have already been delivered. The Project Plan details the overall rationale, objectives and time-table for the project as a whole. The Requirements Analysis compares a number of available machine translation systems, their capabilities, possible configurations, and costs. The Alternatives Analysis has resulted in a number of conclusions and recommendations to the NASA STI program concerning the acquisition of specific MT systems and related hardware and software.

  10. Engineering of Impulse Mechanism for Mechanical Hander Power Tools

    NASA Astrophysics Data System (ADS)

    Nikolaevich Drozdov, Anatoliy

    2018-03-01

    Human safety in cities should be addressed through an integrated, multidisciplinary approach that considers both safety and environmental issues in the technical equipment used to support urban life and technological development. An important task in this regard is the creation of safe equipment that combines improved environmental properties with high technological performance. This applies primarily to mechanized hand-held power tools, the subclass of technological machines with built-in motors whose weight is fully or partially borne by the operator's hands while the operator feeds and controls the machine. This subclass of machines is characterized by certain features: a built-in motor, the operator carrying at least part of the machine's weight during work, and feed and control provided by the operator's muscular effort. Therefore, alongside the commonly accepted technical and economic characteristics, ergonomic characteristics are important for such machines, and regulating their levels ensures the safety of the operator. Ergonomic characteristics include vibration, noise, mass, and the feed force applied by the operator. Vibration is a consequence of the dynamics of the operator-machine-workpiece (environment) system, in which engine energy is redistributed among all structural elements, causing their instability. In these machines, vibration is caused by technological and design factors: unbalance of individual drive components (in transformative mechanisms), clearances (in impact mechanisms), and other sources. This article describes a new design of impulse mechanism for hand-held power tools (wrenches, screwdrivers) with enhanced torque. The article substantiates a simulation model of the dynamic compression process in the operating chamber during impact, provides simulation results, and outlines further lines of research.

  11. Deep Restricted Kernel Machines Using Conjugate Feature Duality.

    PubMed

    Suykens, Johan A K

    2017-08-01

    The aim of this letter is to propose a theory of deep restricted kernel machines offering new foundations for deep learning with kernel machines. From the viewpoint of deep learning, it is partially related to restricted Boltzmann machines, which are characterized by visible and hidden units in a bipartite graph without hidden-to-hidden connections, and to deep learning extensions such as deep belief networks and deep Boltzmann machines. From the viewpoint of kernel machines, it includes least squares support vector machines for classification and regression, kernel principal component analysis (PCA), matrix singular value decomposition, and Parzen-type models. A key element is to first characterize these kernel machines in terms of so-called conjugate feature duality, yielding a representation with visible and hidden units. It is shown how this is related to the energy form in restricted Boltzmann machines, with continuous variables in a nonprobabilistic setting. In this new framework of so-called restricted kernel machine (RKM) representations, the dual variables correspond to hidden features. Deep RKMs are obtained by coupling the RKMs. The method is illustrated for a deep RKM consisting of three levels with a least squares support vector machine regression level and two kernel PCA levels. In its primal form, deep feedforward neural networks can also be trained within this framework.
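
    For reference, the least squares support vector machine level mentioned above has a particularly compact dual form. The following is a minimal NumPy sketch of plain LS-SVM regression (a textbook formulation, not the restricted kernel machine coupling proposed in the letter); the kernel width and regularization value are arbitrary illustrative choices.

      import numpy as np

      def rbf_kernel(A, B, sigma=1.0):
          """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2 * sigma ** 2))

      # Toy 1-D regression data; sigma and gamma are arbitrary for illustration.
      rng = np.random.default_rng(0)
      X = rng.uniform(-3, 3, size=(80, 1))
      y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)

      gamma = 10.0
      K = rbf_kernel(X, X)

      # LS-SVM regression dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
      n = len(y)
      A = np.zeros((n + 1, n + 1))
      A[0, 1:] = 1.0
      A[1:, 0] = 1.0
      A[1:, 1:] = K + np.eye(n) / gamma
      b_alpha = np.linalg.solve(A, np.concatenate(([0.0], y)))
      b, alpha = b_alpha[0], b_alpha[1:]

      # Prediction: f(x) = sum_i alpha_i k(x, x_i) + b
      X_grid = np.linspace(-3, 3, 5).reshape(-1, 1)
      print(rbf_kernel(X_grid, X) @ alpha + b)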

  12. Adaptive machine and its thermodynamic costs

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.; Wang, Q. A.

    2013-03-01

    We study the minimal thermodynamically consistent model for an adaptive machine that transfers particles from a higher chemical potential reservoir to a lower one. This model describes the essentials of inhomogeneous catalysis. It is supposed to function with the maximal current under uncertain chemical potentials: if they change, the machine tunes its own structure, fitting it to the maximal current under the new conditions. This adaptation is possible under two limitations: (i) The degree of freedom that controls the machine's structure has to have a stored energy (described via a negative temperature). The origin of this result is traced back to the Le Chatelier principle. (ii) The machine has to malfunction in a constant environment due to structural fluctuations, whose relative magnitude is controlled solely by the stored energy. We argue that several features of the adaptive machine are similar to those of living organisms (energy storage, aging).

  13. Nanoscale swimmers: hydrodynamic interactions and propulsion of molecular machines

    NASA Astrophysics Data System (ADS)

    Sakaue, T.; Kapral, R.; Mikhailov, A. S.

    2010-06-01

    Molecular machines execute nearly regular cyclic conformational changes as a result of ligand binding and product release. This cyclic conformational dynamics is generally non-reciprocal, so that under time reversal a different sequence of machine conformations is visited. Since such changes occur in a solvent, coupling to solvent hydrodynamic modes will generally result in self-propulsion of the molecular machine. These effects are investigated for a class of coarse-grained models of protein machines consisting of a set of beads interacting through pair-wise additive potentials. Hydrodynamic effects are incorporated through a configuration-dependent mobility tensor, and expressions for the propulsion linear and angular velocities, as well as the stall force, are obtained. In the limit where conformational changes are small so that linear response theory is applicable, it is shown that propulsion is exponentially small; thus, propulsion is a nonlinear phenomenon. The results are illustrated by computations on a simple model molecular machine.

  14. Knowledge-based load leveling and task allocation in human-machine systems

    NASA Technical Reports Server (NTRS)

    Chignell, M. H.; Hancock, P. A.

    1986-01-01

    Conventional human-machine systems use task allocation policies which are based on the premise of a flexible human operator. This individual is most often required to compensate for and augment the capabilities of the machine. The development of artificial intelligence and improved technologies have allowed for a wider range of task allocation strategies. In response to these issues a Knowledge Based Adaptive Mechanism (KBAM) is proposed for assigning tasks to human and machine in real time, using a load leveling policy. This mechanism employs an online workload assessment and compensation system which is responsive to variations in load through an intelligent interface. This interface consists of a loading strategy reasoner which has access to information about the current status of the human-machine system as well as a database of admissible human/machine loading strategies. Difficulties standing in the way of successful implementation of the load leveling strategy are examined.

  15. Predicting competency in automated machine use in an acquired brain injury population using neuropsychological measures.

    PubMed

    Crowe, Simon F; Mahony, Kate; Jackson, Martin

    2004-08-01

    The purpose of the current study was to explore whether performance on standardised neuropsychological measures could predict functional ability with automated machines and services among people with an acquired brain injury (ABI). Participants were 45 individuals who met the criteria for mild, moderate or severe ABI and 15 control participants matched on demographic variables including age and education. Each participant was required to complete a battery of neuropsychological tests, as well as performing three automated service delivery tasks: a transport automated ticketing machine, an automated teller machine (ATM) and an automated telephone service. The results showed a consistently high relationship between the neuropsychological measures, both as single predictors and in combination, and level of competency with the automated machines. Automated machines are part of a relatively new phenomenon in service delivery and offer an ecologically valid functional measure of performance that represents a true indication of functional disability.

  16. Three-dimensional analysis of tubular permanent magnet machines

    NASA Astrophysics Data System (ADS)

    Chai, J.; Wang, J.; Howe, D.

    2006-04-01

    This paper presents results from a three-dimensional finite element analysis of a tubular permanent magnet machine, and quantifies the influence of the laminated modules from which the stator core is assembled on the flux linkage and thrust force capability as well as on the self- and mutual inductances. The three-dimensional finite element (FE) model accounts for the nonlinear, anisotropic magnetization characteristic of the laminated stator structure, and for the voids which exist between the laminated modules. Predicted results are compared with those deduced from an axisymmetric FE model. It is shown that the emf and thrust force deduced from the three-dimensional model are significantly lower than those which are predicted from an axisymmetric field analysis, primarily as a consequence of the teeth and yoke being more highly saturated due to the presence of the voids in the laminated stator core.

  17. Characterization of uranium surfaces machined with aqueous propylene glycol-borax or perchloroethylene-mineral oil coolants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cristy, S.S.; Bennett, R.K. Jr.; Dillon, J.J.

    1986-12-31

    The use of perchloroethylene (perc) as an ingredient in coolants for machining enriched uranium at the Oak Ridge Y-12 Plant has been discontinued because of environmental concerns. A new coolant was substituted in December 1985, which consists of an aqueous solution of propylene glycol with borax (sodium tetraborate) added as a nuclear poison and with a nitrite added as a corrosion inhibitor. Uranium surfaces machined using the two coolants were compared with respect to residual contamination and corrosion or corrosion potential; the condition of surfaces machined with the aqueous propylene glycol-borax coolant was found to be better than that of enriched uranium machined with the perc-mineral oil coolant. The boron residues on the final-finished parts machined with the borax-containing coolant were not sufficient to cause problems in further processing. All evidence indicated that the enriched uranium surfaces machined with the borax-containing coolant will be as satisfactory as those machined with the perc coolant.

  18. Harvesting small trees for bio-energy

    Treesearch

    John Klepac; Robert Rummer; Jason Thompson

    2011-01-01

    A conventional whole-tree logging operation consisting of 4-wheeled and 3-wheeled saw-head feller-bunchers, two grapple skidders and a chipper that produces dirty chips was monitored across several stands and machine performance evaluated. Stands were inventoried to determine density, volume, and basal area per acre and will be used to relate machine performance to...

  19. Machine for preparing phosphors for the fluorimetric determination of uranium

    USGS Publications Warehouse

    Stevens, R.E.; Wood, W.H.; Goetz, K.G.; Horr, C.A.

    1956-01-01

    The time saved by use of a machine for preparing many phosphors at one time increases the rate of productivity of the fluorimetric method for determining uranium. The machine prepares 18 phosphors at a time and eliminates the tedious and time-consuming step of preparing them by hand, while improving the precision of the method in some localities. The machine consists of a ring burner over which the platinum dishes, containing uranium and flux, are rotated. By placing the machine in an inclined position the molten flux comes into contact with all surfaces within the dish as the dishes rotate over the flame. Precision is improved because the heating and cooling conditions are the same for each of the 18 phosphors in one run as well as for successive runs.

  20. Three’s Company: The Efficacy of Third-Party Intervention in Support of Counterinsurgency

    DTIC Science & Technology

    2009-10-26

    counterinsurgency is an appreciation for the unique role of the third-party intervener. Because these studies primarily examine conflicts ... Thomas X... insurgency as a novel "technology of military conflict." ... In a related study, Jason Lyall and Isaiah Wilson study conflict outcome and insurgencies ... D. Fearon and David D. Laitin, "Ethnicity, Insurgency, and Civil War" ... Jason Lyall and Isaiah Wilson, "Rage Against the Machines: Explaining

  1. Model for Assembly Line Re-Balancing Considering Additional Capacity and Outsourcing to Face Demand Fluctuations

    NASA Astrophysics Data System (ADS)

    Samadhi, TMAA; Sumihartati, Atin

    2016-02-01

    The most critical stage in the garment industry is the sewing process, because it generally consists of a number of operations and a large number of sewing machines for each operation. It therefore requires a balancing method that can assign tasks to work stations with balanced workloads. Many studies on assembly line balancing assume a new assembly line, but in reality, due to demand fluctuations and demand increases, re-balancing is needed. To cope with these fluctuating demand changes, additional capacity can be obtained by investing in spare sewing machines and by paying for sewing services through outsourcing. This study develops an assembly line balancing (ALB) model for an existing line to cope with fluctuating demand changes. Capacity redesign is decided when the fluctuating demand exceeds the available capacity, through a combination of investment in new machines and outsourcing, while minimizing the cost of idle capacity in the future. The objective of the model is to minimize the total cost of the assembly line, which consists of operating costs, machine costs, added-capacity costs, losses due to idle capacity and outsourcing costs. The model is formulated as an integer program. The model is tested on one year of demand data with an existing fleet of 41 sewing machines. The results show that an additional maximum capacity of up to 76 machines is required when demand increases by 60% over the average, at equal cost parameters.
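
    To make the cost trade-off concrete, here is a small, hypothetical integer program in the same spirit (not the authors' model): the demand figures, per-machine capacity, and cost coefficients below are invented for illustration, and the PuLP library is used only as a convenient solver interface.

      import pulp

      # Hypothetical data -- none of these numbers come from the paper.
      demand = [5200, 6100, 8400, 7300]      # units required per period
      existing_machines = 41                  # machines already on the line
      cap_per_machine = 150                   # units one machine can sew per period
      machine_cost = 900                      # cost of one added machine per period
      outsource_cost = 4                      # cost per outsourced unit
      idle_cost = 1                           # penalty per unit of unused capacity

      prob = pulp.LpProblem("line_rebalancing", pulp.LpMinimize)
      T = range(len(demand))
      add = [pulp.LpVariable(f"added_machines_{t}", lowBound=0, cat="Integer") for t in T]
      out = [pulp.LpVariable(f"outsourced_units_{t}", lowBound=0) for t in T]
      prod = [pulp.LpVariable(f"inhouse_units_{t}", lowBound=0) for t in T]
      idle = [pulp.LpVariable(f"idle_capacity_{t}", lowBound=0) for t in T]

      # Objective: added-machine cost + outsourcing cost + idle-capacity penalty.
      prob += pulp.lpSum(machine_cost * add[t] + outsource_cost * out[t] + idle_cost * idle[t] for t in T)

      for t in T:
          capacity = cap_per_machine * (existing_machines + add[t])
          prob += prod[t] <= capacity              # cannot sew more than capacity
          prob += prod[t] + out[t] >= demand[t]    # demand must be covered
          prob += idle[t] >= capacity - prod[t]    # idle capacity is penalized

      prob.solve()
      for t in T:
          print(t, int(add[t].value()), out[t].value(), idle[t].value())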

  2. A Brief Description of My Projects

    NASA Technical Reports Server (NTRS)

    Barnes, Tobin

    2016-01-01

    My internship was in the IDC, which consists of a machine shop and an array of design space. During my tour I worked on a wide variety of projects, some of which included design, research, machining, and fabrication. I gained further knowledge of some machines that I had prior experience on, such as the lathe and Hurco CNC machines. The first thing we did was complete our checkout in the machine shop, which went pretty well since I was already familiar with most of the machines. I also did a couple of practice parts on some of the machines: I made a name block on the CNC machine and also used the vertical milling machine to complete this project. Another project was machining a hammer with my initials using the lathe and CNC machine; this project took much longer since I had to set up a cylindrical piece on the CNC machine. The first project that I began work on was the Systems Engineering & Management Advancement Program (SEPMAP) Hexacopter project, where I helped assemble and modify one of their particle capture doors on their boxes. After a while we ended up helping them make a hinge and holes to reduce the weight of their design. We also helped the NASA Extreme Environment Mission Operations (NEEMO) team with some of their name tags and the assembly of some of their underwater parts. One of the more challenging projects was a rail that came in with a rather oddly drawn part. The biggest project that I worked on was the solar array project, which consisted of a variety of machining and 3D printing and took about three rounds of re-designing to come up with a final prototype. Along with this project I also had to modify a thermos, which was rather simple since I just had to draw up a part and print it out on the 3D printer. I also learned how to use Pro E/Creo Parametric to design a square block and print it on the 3D printer. All of these projects increased my experience with the machines and equipment that I used. I also got to refine my design skills and better understand how to modify and improve my designs.

  3. Modeling stochastic kinetics of molecular machines at multiple levels: from molecules to modules.

    PubMed

    Chowdhury, Debashish

    2013-06-04

    A molecular machine is either a single macromolecule or a macromolecular complex. In spite of the striking superficial similarities between these natural nanomachines and their man-made macroscopic counterparts, there are crucial differences. Molecular machines in a living cell operate stochastically in an isothermal environment far from thermodynamic equilibrium. In this mini-review we present a catalog of the molecular machines and an inventory of the essential toolbox for theoretically modeling these machines. The tool kits include 1), nonequilibrium statistical-physics techniques for modeling machines and machine-driven processes; and 2), statistical-inference methods for reverse engineering a functional machine from the empirical data. The cell is often likened to a microfactory in which the machineries are organized in modular fashion; each module consists of strongly coupled multiple machines, but different modules interact weakly with each other. This microfactory has its own automated supply chain and delivery system. Buoyed by the success achieved in modeling individual molecular machines, we advocate integration of these models in the near future to develop models of functional modules. A system-level description of the cell from the perspective of molecular machinery (the mechanome) is likely to emerge from further integrations that we envisage here. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  4. Effects of retrofit emission controls and work practices on perchloroethylene exposures in small dry-cleaning shops.

    PubMed

    Ewers, Lynda M; Ruder, Avima M; Petersen, Martin R; Earnest, G Scott; Goldenhar, Linda M

    2002-02-01

    The effectiveness of commercially available interventions for reducing workers' perchloroethylene exposures in three small dry-cleaning shops was evaluated. Depending upon machine configuration, the intervention consisted of the addition of either a refrigerated condenser or a closed-loop carbon adsorber to the existing dry-cleaning machine. These relatively inexpensive (less than $5000) engineering controls were designed to reduce perchloroethylene emissions when dry-cleaning machine doors were opened for loading or unloading. Effectiveness of the interventions was judged by comparing pre- and postintervention perchloroethylene exposures using three types of measurements in each shop: (1) full-shift, personal breathing zone, air monitoring, (2) next-morning, end-exhaled worker breath concentrations of perchloroethylene, and (3) differences in the end-exhaled breath perchloroethylene concentrations before and after opening the dry-cleaning machine door. In general, measurements supported the hypothesis that machine operators' exposures to perchloroethylene can be reduced. However, work practices, especially maintenance practices, influenced exposures more than was originally anticipated. Only owners of dry-cleaning machines in good repair, with few leaks, should consider retrofitting them, and only after consultation with their machine's manufacturer. If machines are in poor condition, a new machine or alternative technology should be considered. Shop owners and employees should never circumvent safety features on dry-cleaning machines.

  5. Testing of the Support Vector Machine for Binary-Class Classification

    NASA Technical Reports Server (NTRS)

    Scholten, Matthew

    2011-01-01

    The Support Vector Machine is a powerful algorithm, useful in classifying data into species. The Support Vector Machines implemented in this research were used as classifiers for the final stage in a Multistage Autonomous Target Recognition system. A single kernel SVM known as SVMlight, and a modified version known as a Support Vector Machine with K-Means Clustering, were used. These SVM algorithms were tested as classifiers under varying conditions. Image noise levels varied, and the orientation of the targets changed. The classifiers were then optimized to demonstrate their maximum potential as classifiers. Results demonstrate the reliability of SVM as a method for classification. From trial to trial, SVM produces consistent results.
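
    As a concrete illustration of a binary-class SVM classifier of the kind described above (not the SVMlight or K-means-clustered variants used in the study), the following sketch trains a scikit-learn RBF-kernel SVM on synthetic two-class data and checks how accuracy degrades as noise is added to the test features; the data and noise levels are assumptions for illustration.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      # Synthetic two-class data standing in for target/non-target feature vectors.
      X, y = make_classification(n_samples=600, n_features=20, n_informative=10, random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

      # RBF-kernel SVM; C and gamma would normally be tuned (the "optimization" step).
      clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
      print("clean accuracy:", clf.score(X_test, y_test))

      # Crude stand-in for increasing image noise: perturb the test features.
      rng = np.random.default_rng(0)
      for sigma in (0.1, 0.5, 1.0):
          noisy = X_test + rng.normal(scale=sigma, size=X_test.shape)
          print(f"accuracy at noise sigma={sigma}:", clf.score(noisy, y_test))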

  6. minimega

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Fritz, John Floren

    2013-08-27

    Minimega is a simple emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines including Windows, Linux, and Android. Minimega attempts to allow experiments to be brought up quickly with nearly no configuration. Minimega also includes tools for simple cluster management, as well as tools for creating Linux based virtual machine images.

  7. Mind, Machine, and Creativity: An Artist's Perspective

    ERIC Educational Resources Information Center

    Sundararajan, Louise

    2014-01-01

    Harold Cohen is a renowned painter who has developed a computer program, AARON, to create art. While AARON has been hailed as one of the most creative AI programs, Cohen consistently rejects the claims of machine creativity. Questioning the possibility for AI to model human creativity, Cohen suggests in so many words that the human mind takes a…

  8. THE COMPUTER CONCEPT OF SELF-INSTRUCTIONAL DEVICES.

    ERIC Educational Resources Information Center

    SILBERMAN, HARRY F.

    THE COMPUTER SYSTEM CONCEPT WILL BE DEVELOPED IN TWO WAYS--FIRST, A DESCRIPTION WILL BE MADE OF THE SMALL COMPUTER-BASED TEACHING MACHINE WHICH IS BEING USED AS A RESEARCH TOOL, SECOND, A DESCRIPTION WILL BE MADE OF THE LARGE COMPUTER LABORATORY FOR AUTOMATED SCHOOL SYSTEMS WHICH ARE BEING DEVELOPED. THE FIRST MACHINE CONSISTS OF THREE ELEMENTS--…

  9. Apprentice Machinist (AFSC 53130), Volumes 1-4, and Change Supplement (AFSC 42730).

    ERIC Educational Resources Information Center

    Air Univ., Gunter AFS, Ala. Extension Course Inst.

    This four-volume student learning package is designed for use by Air Force personnel enrolled in a self-study extension course for apprentice machinists. The package consists of four volumes of instructional text and four workbooks. Covered in the individual volumes are machine shop fundamentals, lathe work, shaper and contour machine work, and…

  10. Documentation for the machine-readable version of the third Santiago-Pulkovo Fundamental Stars Catalogue (SPF-3)

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.

    1982-01-01

    The machine-readable version of a catalog of right ascensions of 671 bright stars is described. The observations were made in a series consisting of 70 stars observed along the meridian from +42 deg to the pole in upper culmination and from the pole to -70 deg in lower culmination.

  11. LOLA Project Artists

    NASA Image and Video Library

    1965-08-10

    Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: "This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled." (p. 379) Ellis J. White described the simulator as follows: "Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2, 3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere." -- Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo (Washington: NASA, 1995), p. 379; Ellis J. White, "Discussion of Three Typical Langley Research Center Simulation Programs," Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  12. LOLA Project

    NASA Image and Video Library

    1964-10-28

    Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million dollars. James Hansen wrote: "This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled." (p. 379) Ellis J. White further described LOLA in his paper "Discussion of Three Typical Langley Research Center Simulation Programs," "Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere." -- Published in James R. Hansen, Spaceflight Revolution, NASA SP-4308, p. 379; Ellis J. White, "Discussion of Three Typical Langley Research Center Simulation Programs," Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  13. Systematic elucidation and in vivo validation of sequences enriched in hindbrain transcriptional control

    PubMed Central

    Burzynski, Grzegorz M.; Reed, Xylena; Taher, Leila; Stine, Zachary E.; Matsui, Takeshi; Ovcharenko, Ivan; McCallion, Andrew S.

    2012-01-01

    Illuminating the primary sequence encryption of enhancers is central to understanding the regulatory architecture of genomes. We have developed a machine learning approach to decipher motif patterns of hindbrain enhancers and identify 40,000 sequences in the human genome that we predict display regulatory control that includes the hindbrain. Consistent with their roles in hindbrain patterning, MEIS1, NKX6-1, as well as HOX and POU family binding motifs contributed strongly to this enhancer model. Predicted hindbrain enhancers are overrepresented at genes expressed in hindbrain and associated with nervous system development, and primarily reside in the areas of open chromatin. In addition, 77 (0.2%) of these predictions are identified as hindbrain enhancers on the VISTA Enhancer Browser, and 26,000 (60%) overlap enhancer marks (H3K4me1 or H3K27ac). To validate these putative hindbrain enhancers, we selected 55 elements distributed throughout our predictions and six low scoring controls for evaluation in a zebrafish transgenic assay. When assayed in mosaic transgenic embryos, 51/55 elements directed expression in the central nervous system. Furthermore, 30/34 (88%) predicted enhancers analyzed in stable zebrafish transgenic lines directed expression in the larval zebrafish hindbrain. Subsequent analysis of sequence fragments selected based upon motif clustering further confirmed the critical role of the motifs contributing to the classifier. Our results demonstrate the existence of a primary sequence code characteristic to hindbrain enhancers. This code can be accurately extracted using machine-learning approaches and applied successfully for de novo identification of hindbrain enhancers. This study represents a critical step toward the dissection of regulatory control in specific neuronal subtypes. PMID:22759862

  14. An introductory analysis of digital infrared thermal imaging guided oral cancer detection using multiresolution rotation invariant texture features

    NASA Astrophysics Data System (ADS)

    Chakraborty, M.; Das Gupta, R.; Mukhopadhyay, S.; Anjum, N.; Patsa, S.; Ray, J. G.

    2017-03-01

    This manuscript presents an analytical treatment of the feasibility of multi-scale Gabor filter bank responses for non-invasive oral cancer pre-screening and detection in the long-infrared spectrum. The inability of present healthcare technology to detect oral cancer in its budding stage manifests in a high mortality rate. The paper contributes a step towards automation in non-invasive computer-aided oral cancer detection using a combination of image processing and machine intelligence paradigms. Previous works have shown a discriminative difference in facial temperature distribution between normal subjects and patients. The proposed work, for the first time, exploits this difference further by representing the facial Region of Interest (ROI) using multiscale rotation-invariant Gabor filter bank responses followed by classification using a Radial Basis Function (RBF) kernelized Support Vector Machine (SVM). The study reveals an initial increase in classification accuracy with increasing image scale followed by degradation of performance, an indication that adding ever finer scales tends to embed noisy information instead of discriminative texture patterns. Moreover, the performance is consistently better for filter responses from profile faces compared to frontal faces. This is primarily attributed to the inability of Gabor kernels to analyze low spatial frequency components over a small facial surface area. On our dataset comprising 81 malignant, 59 pre-cancerous, and 63 normal subjects, we achieve state-of-the-art accuracy of 85.16% for normal vs. precancerous and 84.72% for normal vs. malignant classification. This sets a benchmark for further investigation of multiscale feature extraction paradigms in the IR spectrum for oral cancer detection.
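
    To make the feature pipeline concrete, the sketch below is a minimal, hypothetical implementation of rotation-invariant Gabor features followed by an RBF-kernel SVM, using scikit-image and scikit-learn; the filter frequencies, orientation count, and the synthetic "thermal patches" are all assumptions for illustration and not the authors' configuration.

      import numpy as np
      from skimage.filters import gabor
      from sklearn.svm import SVC

      def gabor_features(image, frequencies=(0.1, 0.2, 0.3), n_orient=6):
          """Mean Gabor response energy per frequency, averaged over orientations;
          averaging over theta is what makes the feature rotation invariant."""
          feats = []
          for f in frequencies:
              energies = []
              for k in range(n_orient):
                  theta = k * np.pi / n_orient
                  real, imag = gabor(image, frequency=f, theta=theta)
                  energies.append(np.mean(real ** 2 + imag ** 2))
              feats.append(np.mean(energies))
          return np.array(feats)

      # Synthetic stand-ins for facial ROI thermal patches of two classes.
      rng = np.random.default_rng(0)
      X, y = [], []
      for label in (0, 1):
          for _ in range(20):
              patch = rng.normal(loc=0.5 * label, scale=1.0, size=(32, 32))
              X.append(gabor_features(patch))
              y.append(label)

      clf = SVC(kernel="rbf", gamma="scale").fit(np.array(X), np.array(y))
      print("training accuracy:", clf.score(np.array(X), np.array(y)))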

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheu, R; Ghafar, R; Powers, A

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and who have not yet had their first chart check. Monitoring similar "real" events with our in-house software creates a safety net as its propagation does not rely on individual users' input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which allows the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs were eliminated. Additionally, all physics chart checks were completed timely. Prompt notifications of treatment record inconsistency and machine overrides have decreased the amount of time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not the panacea of safety and efficiency. The WBDS creates a more thorough system of checks to provide a safer and near error-less working environment.
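
    The chart-check screening described above can be pictured with a short query sketch. The table and column names below are hypothetical placeholders (the real Mosaiq schema is not given here), and the connection string is an assumption; the point is only to show how a dashboard might pull "patients past their first fraction without an initial chart check" directly over ODBC rather than relying on QCL completion.

      import pyodbc

      # Hypothetical DSN and credentials -- replace with the site's actual ODBC settings.
      conn = pyodbc.connect("DSN=MosaiqDB;UID=readonly;PWD=secret")

      # Hypothetical table and column names standing in for the real schema.
      QUERY = """
      SELECT p.patient_id, p.last_name, MIN(tx.tx_datetime) AS first_fraction
      FROM patients AS p
      JOIN treatments AS tx ON tx.patient_id = p.patient_id
      LEFT JOIN chart_checks AS cc
             ON cc.patient_id = p.patient_id AND cc.check_type = 'INITIAL'
      WHERE cc.patient_id IS NULL          -- no initial physics check recorded yet
      GROUP BY p.patient_id, p.last_name   -- one row per patient on treatment
      """

      for row in conn.cursor().execute(QUERY):
          # Each row is a patient who has started treatment but still needs a check.
          print(row.patient_id, row.last_name, row.first_fraction)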

  16. A note on the hygiene of meat mincing machines

    PubMed Central

    Dempster, J. F.

    1973-01-01

    Two mincing machines were cleaned by different methods, i.e. (a) a detergent/sterilizer method and (b) scrubbing parts in boiling (98·8° C.) water. Initial results indicated that, on reassembly, post-treatment contamination took place. Efforts to clean each machine as consisting of two distinct parts, (a) the casing and (b) removable parts, were more satisfactory. Four other mincers which could be completely dis-assembled were satisfactorily cleaned, but only in terms of percentage organisms surviving and not in terms of actual numbers surviving. PMID:4520512

  17. Energy Saving Melting and Revert Reduction Technology: Aging of Graphitic Cast Irons and Machinability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richards, Von L.

    2012-09-19

    The objective of this task was to determine whether ductile iron and compacted graphite iron exhibit age strengthening to a statistically significant extent. Further, this effort identified the mechanism by which gray iron age strengthens and the mechanism by which age-strengthening improves the machinability of gray cast iron. These results were then used to determine whether age strengthening improves the machinability of ductile iron and compacted graphite iron alloys in order to develop a predictive model of alloy factor effects on age strengthening. The results of this work will lead to reduced section sizes, and corresponding weight and energy savings. Improved machinability will reduce scrap and enhance casting marketability. Technical Conclusions: Age strengthening was demonstrated to occur in gray iron, ductile iron and compacted graphite iron. Machinability was demonstrated to be improved by age strengthening when free ferrite was present in the microstructure, but not in a fully pearlitic microstructure. Age strengthening only occurs when there is residual nitrogen in solid solution in the ferrite, whether the ferrite is free ferrite or the ferrite lamellae within pearlite. Age strengthening can be accelerated by Mn at about 0.5% in excess of the Mn/S balance. Estimated energy savings over ten years is 13.05 trillion BTU, based primarily on yield improvement and size reduction of castings for equivalent service. Also it is estimated that the heavy truck end use of lighter castings for equivalent service requirement will result in a diesel fuel energy savings of 131 trillion BTU over ten years.

  18. SoilGrids250m: Global gridded soil information based on machine learning

    PubMed Central

    Mendes de Jesus, Jorge; Heuvelink, Gerard B. M.; Ruiperez Gonzalez, Maria; Kilibarda, Milan; Blagotić, Aleksandar; Shangguan, Wei; Wright, Marvin N.; Geng, Xiaoyuan; Bauer-Marschallinger, Bernhard; Guevara, Mario Antonio; Vargas, Rodrigo; MacMillan, Robert A.; Batjes, Niels H.; Leenaars, Johan G. B.; Ribeiro, Eloi; Wheeler, Ichsani; Mantel, Stephan; Kempen, Bas

    2017-01-01

    This paper describes the technical development and accuracy assessment of the most recent and improved version of the SoilGrids system at 250m resolution (June 2016 update). SoilGrids provides global predictions for standard numeric soil properties (organic carbon, bulk density, Cation Exchange Capacity (CEC), pH, soil texture fractions and coarse fragments) at seven standard depths (0, 5, 15, 30, 60, 100 and 200 cm), in addition to predictions of depth to bedrock and distribution of soil classes based on the World Reference Base (WRB) and USDA classification systems (ca. 280 raster layers in total). Predictions were based on ca. 150,000 soil profiles used for training and a stack of 158 remote sensing-based soil covariates (primarily derived from MODIS land products, SRTM DEM derivatives, climatic images and global landform and lithology maps), which were used to fit an ensemble of machine learning methods—random forest and gradient boosting and/or multinomial logistic regression—as implemented in the R packages ranger, xgboost, nnet and caret. The results of 10–fold cross-validation show that the ensemble models explain between 56% (coarse fragments) and 83% (pH) of variation with an overall average of 61%. Improvements in the relative accuracy considering the amount of variation explained, in comparison to the previous version of SoilGrids at 1 km spatial resolution, range from 60 to 230%. Improvements can be attributed to: (1) the use of machine learning instead of linear regression, (2) to considerable investments in preparing finer resolution covariate layers and (3) to insertion of additional soil profiles. Further development of SoilGrids could include refinement of methods to incorporate input uncertainties and derivation of posterior probability distributions (per pixel), and further automation of spatial modeling so that soil maps can be generated for potentially hundreds of soil variables. Another area of future research is the development of methods for multiscale merging of SoilGrids predictions with local and/or national gridded soil products (e.g. up to 50 m spatial resolution) so that increasingly more accurate, complete and consistent global soil information can be produced. SoilGrids are available under the Open Data Base License. PMID:28207752
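
    The published ensemble is fit in R (ranger, xgboost, nnet, caret); as a rough analogue in Python, the sketch below cross-validates a random forest and a gradient-boosting regressor on synthetic covariates and averages their predictions, to show the general "stacked covariates, ensemble of tree learners" pattern rather than the actual SoilGrids pipeline; the data dimensions are illustrative only.

      import numpy as np
      from sklearn.datasets import make_regression
      from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
      from sklearn.model_selection import KFold, cross_val_score

      # Synthetic stand-in: 158 covariates per profile location, one soil property target.
      X, y = make_regression(n_samples=1000, n_features=158, n_informative=40, noise=10.0, random_state=0)

      rf = RandomForestRegressor(n_estimators=100, n_jobs=-1, random_state=0)
      gb = GradientBoostingRegressor(n_estimators=100, random_state=0)

      # 10-fold cross-validation of each learner, mirroring the paper's evaluation style.
      cv = KFold(n_splits=10, shuffle=True, random_state=0)
      for name, model in (("random forest", rf), ("gradient boosting", gb)):
          r2 = cross_val_score(model, X, y, cv=cv, scoring="r2")
          print(f"{name}: mean cross-validated R^2 = {r2.mean():.2f}")

      # A simple ensemble: average the predictions of the two fitted models.
      rf.fit(X, y)
      gb.fit(X, y)
      ensemble = 0.5 * (rf.predict(X) + gb.predict(X))
      print("ensemble in-sample R^2:", 1 - ((y - ensemble) ** 2).sum() / ((y - y.mean()) ** 2).sum())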

  19. Mainstream Smoke Levels of Volatile Organic Compounds in 50 U.S. Domestic Cigarette Brands Smoked With the ISO and Canadian Intense Protocols.

    PubMed

    Pazo, Daniel Y; Moliere, Fallon; Sampson, Maureen M; Reese, Christopher M; Agnew-Heard, Kimberly A; Walters, Matthew J; Holman, Matthew R; Blount, Benjamin C; Watson, Clifford H; Chambers, David M

    2016-09-01

    A significant portion of the increased risk of cancer and respiratory disease from exposure to cigarette smoke is attributed to volatile organic compounds (VOCs). In this study, 21 VOCs were quantified in mainstream cigarette smoke from 50 U.S. domestic brand varieties that included high market share brands and 2 Kentucky research cigarettes (3R4F and 1R5F). Mainstream smoke was generated under ISO 3308 and Canadian Intense (CI) smoking protocols with linear smoking machines with a gas sampling bag collection followed by solid phase microextraction/gas chromatography/mass spectrometry (SPME/GC/MS) analysis. For both protocols, mainstream smoke VOC amounts among the different brand varieties were strongly correlated between the majority of the analytes. Overall, Pearson correlation (r) ranged from 0.68 to 0.99 for ISO and 0.36 to 0.95 for CI. However, monoaromatic compounds were found to increase disproportionately compared to unsaturated, nitro, and carbonyl compounds under the CI smoking protocol where filter ventilation is blocked. Overall, machine-generated "vapor phase" amounts (µg/cigarette) are primarily attributed to smoking protocol (e.g., blocking of vent holes, puff volume, and puff duration) and filter ventilation. A possible cause for the disproportionate increase in monoaromatic compounds could be increased pyrolysis under low oxygen conditions associated with the CI protocol. This is the most comprehensive assessment of volatile organic compounds (VOCs) in cigarette smoke to date, encompassing 21 toxic VOCs, 50 different cigarette brand varieties, and 2 different machine smoking protocols (ISO and CI). For most analytes, relative proportions remain consistent among U.S. cigarette brand varieties regardless of smoking protocol; however, the CI smoking protocol did cause up to a factor of 6 increase in the proportion of monoaromatic compounds. This study serves as a basis to assess VOC exposure as cigarette smoke is a principal source of overall population-level VOC exposure in the United States. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  20. SoilGrids250m: Global gridded soil information based on machine learning.

    PubMed

    Hengl, Tomislav; Mendes de Jesus, Jorge; Heuvelink, Gerard B M; Ruiperez Gonzalez, Maria; Kilibarda, Milan; Blagotić, Aleksandar; Shangguan, Wei; Wright, Marvin N; Geng, Xiaoyuan; Bauer-Marschallinger, Bernhard; Guevara, Mario Antonio; Vargas, Rodrigo; MacMillan, Robert A; Batjes, Niels H; Leenaars, Johan G B; Ribeiro, Eloi; Wheeler, Ichsani; Mantel, Stephan; Kempen, Bas

    2017-01-01

    This paper describes the technical development and accuracy assessment of the most recent and improved version of the SoilGrids system at 250m resolution (June 2016 update). SoilGrids provides global predictions for standard numeric soil properties (organic carbon, bulk density, Cation Exchange Capacity (CEC), pH, soil texture fractions and coarse fragments) at seven standard depths (0, 5, 15, 30, 60, 100 and 200 cm), in addition to predictions of depth to bedrock and distribution of soil classes based on the World Reference Base (WRB) and USDA classification systems (ca. 280 raster layers in total). Predictions were based on ca. 150,000 soil profiles used for training and a stack of 158 remote sensing-based soil covariates (primarily derived from MODIS land products, SRTM DEM derivatives, climatic images and global landform and lithology maps), which were used to fit an ensemble of machine learning methods-random forest and gradient boosting and/or multinomial logistic regression-as implemented in the R packages ranger, xgboost, nnet and caret. The results of 10-fold cross-validation show that the ensemble models explain between 56% (coarse fragments) and 83% (pH) of variation with an overall average of 61%. Improvements in the relative accuracy considering the amount of variation explained, in comparison to the previous version of SoilGrids at 1 km spatial resolution, range from 60 to 230%. Improvements can be attributed to: (1) the use of machine learning instead of linear regression, (2) to considerable investments in preparing finer resolution covariate layers and (3) to insertion of additional soil profiles. Further development of SoilGrids could include refinement of methods to incorporate input uncertainties and derivation of posterior probability distributions (per pixel), and further automation of spatial modeling so that soil maps can be generated for potentially hundreds of soil variables. Another area of future research is the development of methods for multiscale merging of SoilGrids predictions with local and/or national gridded soil products (e.g. up to 50 m spatial resolution) so that increasingly more accurate, complete and consistent global soil information can be produced. SoilGrids are available under the Open Data Base License.

  1. Screw compressor analysis from a vibration point-of-view

    NASA Astrophysics Data System (ADS)

    Hübel, D.; Žitek, P.

    2017-09-01

    Vibrations are a very typical feature of all compressors and are given great attention in the industry. The reason for this interest is primarily the negative influence that it can have on both the operating staff and the entire machine's service life. The purpose of this work is to describe the methodology of screw compressor analysis from a vibration point-of-view. This analysis is an essential part of the design of vibro-diagnostics of screw compressors with regard to their service life.

  2. [Styles of programming 1952-1972].

    PubMed

    van den Bogaard, Adrienne

    2008-01-01

    In the field of history of computing, the construction of the early computers has received much scholarly attention. However, these machines have not only been important because of their logical design and their engineering, but also because of the programming practices that emerged around these first machines. This article compares two styles of programming that developed around Dutch 'first computers'. The first style is represented by Edsger Wybe Dijkstra (1930-2002), who would receive the Turing Award for his work in 1972. Dijkstra developed a mathematical style of programming--a program was something you should be able to design mathematically and prove it logically. The second style is represented by Willem Louis van der Poel (born 1926). For him, programming is 'trickology'. A program is primarily a technical artefact that should work: a program is something you play with, comparable to the way one solves a puzzle.

  3. Predicting adverse hemodynamic events in critically ill patients.

    PubMed

    Yoon, Joo H; Pinsky, Michael R

    2018-06-01

    The art of predicting future hemodynamic instability in the critically ill has rapidly become a science with the advent of advanced analytical processes based on computer-driven machine learning techniques. How these methods have progressed beyond severity scoring systems to interface with decision support is summarized. Data mining of large multidimensional clinical time-series databases using a variety of machine learning tools has led to our ability to identify alert artifact and filter it from bedside alarms, display real-time risk stratification at the bedside to aid in clinical decision-making, and predict the subsequent development of cardiorespiratory insufficiency hours before these events occur. This fast-evolving field is primarily limited by the linkage of high-quality granular data to physiologic rationale across heterogeneous clinical care domains. Using advanced analytic tools to glean knowledge from clinical data streams is rapidly becoming a reality whose clinical impact potential is great.

  4. A New High-Speed Oil-Free Turbine Engine Rotordynamic Simulator Test Rig

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.

    2007-01-01

    A new test rig has been developed for simulating high-speed turbomachinery rotor systems using Oil-Free foil air bearing technology. Foil air bearings have been used in turbomachinery, primarily air cycle machines, for the past four decades to eliminate the need for oil lubrication. The goal of applying this bearing technology to other classes of turbomachinery has prompted the fabrication of this test rig. The facility gives bearing designers the capability to test potential bearing designs with shafts that simulate the rotating components of a target machine without the high cost of building "make-and-break" hardware. The data collected from this rig can be used to make design changes to the shaft and bearings in subsequent design iterations. This paper describes the new test rig and demonstrates its capabilities through the initial run with a simulated shaft system.

  5. Machine learning in motion control

    NASA Technical Reports Server (NTRS)

    Su, Renjeng; Kermiche, Noureddine

    1989-01-01

    The existing methodologies for robot programming originate primarily from robotic applications to manufacturing, where uncertainties of the robots and their task environment may be minimized by repeated off-line modeling and identification. In space application of robots, however, a higher degree of automation is required for robot programming because of the desire of minimizing the human intervention. We discuss a new paradigm of robotic programming which is based on the concept of machine learning. The goal is to let robots practice tasks by themselves and the operational data are used to automatically improve their motion performance. The underlying mathematical problem is to solve the problem of dynamical inverse by iterative methods. One of the key questions is how to ensure the convergence of the iterative process. There have been a few small steps taken into this important approach to robot programming. We give a representative result on the convergence problem.
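
    The "dynamical inverse by iteration" idea can be illustrated with a toy iterative learning control loop, one common realization of letting a robot practice the same trajectory repeatedly (not necessarily the authors' exact scheme): after each trial the input is corrected with the previous trial's tracking error. The plant parameters and learning gain below are made up for illustration; the update converges here because |1 - L*b| < 1.

      import numpy as np

      # Toy first-order plant y(t+1) = a*y(t) + b*u(t); parameters are assumed.
      a, b = 0.9, 0.5
      T = 50                                            # samples per practice trial
      ref = np.sin(np.linspace(0, 2 * np.pi, T + 1))    # desired trajectory

      def run_trial(u):
          """Simulate one practice run of the plant and return its output."""
          y = np.zeros(T + 1)
          for t in range(T):
              y[t + 1] = a * y[t] + b * u[t]
          return y

      u = np.zeros(T)            # start with no knowledge of the required input
      L = 1.0                    # learning gain
      for trial in range(15):
          y = run_trial(u)
          e = ref[1:] - y[1:]    # tracking error observed during the trial
          u = u + L * e          # P-type iterative learning update
          print(f"trial {trial:2d}: max |error| = {np.max(np.abs(e)):.4f}")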

  6. Communication Studies of DMP and SMP Machines

    NASA Technical Reports Server (NTRS)

    Sohn, Andrew; Biswas, Rupak; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Understanding the interplay between machines and problems is key to obtaining high performance on parallel machines. This paper investigates the interplay between programming paradigms and communication capabilities of parallel machines. In particular, we explicate the communication capabilities of the IBM SP-2 distributed-memory multiprocessor and the SGI PowerCHALLENGEarray symmetric multiprocessor. Two benchmark problems of bitonic sorting and Fast Fourier Transform are selected for experiments. Communication-efficient algorithms are developed to exploit the overlapping capabilities of the machines. Programs are written in Message-Passing Interface for portability and identical codes are used for both machines. Various data sizes and message sizes are used to test the machines' communication capabilities. Experimental results indicate that the communication performance of the multiprocessors are consistent with the size of messages. The SP-2 is sensitive to message size but yields a much higher communication overlapping because of the communication co-processor. The PowerCHALLENGEarray is not highly sensitive to message size and yields a low communication overlapping. Bitonic sorting yields lower performance compared to FFT due to a smaller computation-to-communication ratio.
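
    A minimal way to probe the message-size sensitivity discussed above is a two-rank ping-pong timing loop. The sketch below uses mpi4py (MPI, as in the paper) with arbitrary message sizes and repetition counts chosen for illustration; it is not the bitonic-sort or FFT benchmark itself.

      # Run with: mpiexec -n 2 python pingpong.py
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      reps = 50

      for nbytes in (1 << 10, 1 << 14, 1 << 18, 1 << 22):   # 1 KB .. 4 MB
          buf = np.zeros(nbytes, dtype=np.uint8)
          comm.Barrier()
          t0 = MPI.Wtime()
          for _ in range(reps):
              if rank == 0:
                  comm.Send(buf, dest=1)
                  comm.Recv(buf, source=1)
              else:
                  comm.Recv(buf, source=0)
                  comm.Send(buf, dest=0)
          dt = (MPI.Wtime() - t0) / (2 * reps)              # one-way time per message
          if rank == 0:
              print(f"{nbytes:>8d} bytes: {dt * 1e6:8.1f} us, {nbytes / dt / 1e6:8.1f} MB/s")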

  7. Using human brain activity to guide machine learning.

    PubMed

    Fong, Ruth C; Scheirer, Walter J; Cox, David D

    2018-03-29

    Machine learning is a field of computer science that builds algorithms that learn. In many cases, machine learning algorithms are used to recreate a human ability like adding a caption to a photo, driving a car, or playing a game. While the human brain has long served as a source of inspiration for machine learning, little effort has been made to directly use data collected from working brains as a guide for machine learning algorithms. Here we demonstrate a new paradigm of "neurally-weighted" machine learning, which takes fMRI measurements of human brain activity from subjects viewing images, and infuses these data into the training process of an object recognition learning algorithm to make it more consistent with the human brain. After training, these neurally-weighted classifiers are able to classify images without requiring any additional neural data. We show that our neural-weighting approach can lead to large performance gains when used with traditional machine vision features, as well as to significant improvements with already high-performing convolutional neural network features. The effectiveness of this approach points to a path forward for a new class of hybrid machine learning algorithms which take both inspiration and direct constraints from neuronal data.

  8. Learning in stochastic neural networks for constraint satisfaction problems

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.; Adorf, Hans-Martin

    1989-01-01

    Researchers describe a newly-developed artificial neural network algorithm for solving constraint satisfaction problems (CSPs) which includes a learning component that can significantly improve the performance of the network from run to run. The network, referred to as the Guarded Discrete Stochastic (GDS) network, is based on the discrete Hopfield network but differs from it primarily in that auxiliary networks (guards) are asymmetrically coupled to the main network to enforce certain types of constraints. Although the presence of asymmetric connections implies that the network may not converge, it was found that, for certain classes of problems, the network often quickly converges to find satisfactory solutions when they exist. The network can run efficiently on serial machines and can find solutions to very large problems (e.g., N-queens for N as large as 1024). One advantage of the network architecture is that network connection strengths need not be instantiated when the network is established: they are needed only when a participating neural element transitions from off to on. They have exploited this feature to devise a learning algorithm, based on consistency techniques for discrete CSPs, that updates the network biases and connection strengths and thus improves the network performance.
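
    The N-queens example lends itself to a compact illustration. The sketch below is not the GDS network itself; it is a min-conflicts-style stochastic repair search, a much simpler relative of the approach described, included only to show how stochastic local moves can solve large N-queens instances; the board size and step limit are arbitrary.

      import random

      def min_conflicts_queens(n, max_steps=200_000, seed=0):
          rng = random.Random(seed)
          col = [rng.randrange(n) for _ in range(n)]   # col[c] = row of the queen in column c

          def conflicts(c, r):
              """Number of other queens attacking square (row r, column c)."""
              return sum(1 for c2 in range(n) if c2 != c and
                         (col[c2] == r or abs(col[c2] - r) == abs(c2 - c)))

          for _ in range(max_steps):
              conflicted = [c for c in range(n) if conflicts(c, col[c]) > 0]
              if not conflicted:
                  return col                            # every queen is unattacked
              c = rng.choice(conflicted)
              # Move that queen to the least-conflicted row in its column.
              col[c] = min(range(n), key=lambda r: conflicts(c, r))
          return None

      solution = min_conflicts_queens(64)
      print("solved" if solution else "no solution found within the step limit")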

  9. A Suggested Set of Job and Task Sheets for Machine Shop Training.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    This set of job and task sheets consists of three multi-part jobs that are adaptable for use in regular vocational industrial education programs for training machinists and machine shop operators. After completing the sheets included in this volume, students should be able to construct a planer jack, a radius cutter, and a surface gage. Each job…

  10. Development of a cost-effective machine vision system for in-field sorting and grading of apples: Fruit orientation and size estimation

    USDA-ARS?s Scientific Manuscript database

    The objective of this research was to develop an in-field apple presorting and grading system to separate undersized and defective fruit from fresh market-grade apples. To achieve this goal, a cost-effective machine vision inspection prototype was built, which consisted of a low-cost color camera, L...

  11. Electric field prediction for a human body-electric machine system.

    PubMed

    Ioannides, Maria G; Papadopoulos, Peter J; Dimitropoulou, Eugenia

    2004-01-01

    A system consisting of an electric machine and a human body is studied and the resulting electric field is predicted. A 3-phase induction machine operating at full load is modeled considering its geometry, windings, and materials. A human model is also constructed approximating its geometry and the electric properties of tissues. Using the finite element technique the electric field distribution in the human body is determined for a distance of 1 and 5 m from the machine and its effects are studied. Particularly, electric field potential variations are determined at specific points inside the human body and for these points the electric field intensity is computed and compared to the limit values for exposure according to international standards.

  12. Process Development and Micro-Machining of MARBLE Foam-Cored Rexolite Hemi-Shell Ablator Capsules

    DOE PAGES

    Randolph, Randall Blaine; Oertel, John A.; Schmidt, Derek William; ...

    2016-06-30

    For this study, machined CH hemi-shell ablator capsules have been successfully produced by the MST-7 Target Fabrication Team at Los Alamos National Laboratory. Process development and micro-machining techniques have been developed to produce capsules for both the Omega and National Ignition Facility (NIF) campaigns. These capsules are gas filled up to 10 atm and consist of a machined plastic hemi-shell outer layer that accommodates various specially engineered low-density polystyrene foam cores. Machining and assembly of the two-part, step-jointed plastic hemi-shell outer layer required development of new techniques, processes, and tooling while still meeting very aggressive shot schedules for both campaigns. Finally, problems encountered and process improvements are discussed that trace this unique, complex capsule design approach from the first Omega proof-of-concept version to the larger NIF version.

  13. Development of hand rehabilitation system for paralysis patient - universal design using wire-driven mechanism.

    PubMed

    Yamaura, Hiroshi; Matsushita, Kojiro; Kato, Ryu; Yokoi, Hiroshi

    2009-01-01

    We have developed a hand rehabilitation system for patients suffering from paralysis or contracture. It consists of two components: a hand rehabilitation machine, which moves human finger joints with motors, and a data glove, which provides control of the movement of finger joints attached to the rehabilitation machine. The machine is based on the arm structure type of hand rehabilitation machine; a motor indirectly moves a finger joint via a closed four-link mechanism. We employ a wire-driven mechanism and develop a compact design that can control all three joints (i.e., PIP, DIP, and MP) of a finger and that offers a wider range of joint motion than conventional systems. Furthermore, we demonstrate the hand rehabilitation process, in which the finger joints of the left hand attached to the machine are controlled by the finger joints of the right hand wearing the data glove.

  14. Effects of virtualization on a scientific application - Running a hyperspectral radiative transfer code on virtual machines.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tikotekar, Anand A; Vallee, Geoffroy R; Naughton III, Thomas J

    2008-01-01

    The topic of system-level virtualization has recently begun to receive interest for high performance computing (HPC). This is in part due to the isolation and encapsulation offered by the virtual machine. These traits enable applications to customize their environments and maintain consistent software configurations in their virtual domains. Additionally, there are mechanisms that can be used for fault tolerance like live virtual machine migration. Given these attractive benefits to virtualization, a fundamental question arises: how does this affect my scientific application? We use this as the premise for our paper and observe a real-world scientific code running on a Xen virtual machine. We studied the effects of running a radiative transfer simulation, Hydrolight, on a virtual machine. We discuss our methodology and report observations regarding the usage of virtualization with this application.

  15. Trust metrics in information fusion

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2014-05-01

    Trust is an important concept for machine intelligence and is not consistent across many applications. In this paper, we seek to understand trust from a variety of factors: humans, sensors, communications, intelligence processing algorithms and human-machine displays of information. In modeling the various aspects of trust, we provide an example from machine intelligence that supports the various attributes of measuring trust such as sensor accuracy, communication timeliness, machine processing confidence, and display throughput to convey the various attributes that support user acceptance of machine intelligence results. The example used is fusing video and text whereby an analyst needs trust information in the identified imagery track. We use the proportional conflict redistribution rule as an information fusion technique that handles conflicting data from trusted and mistrusted sources. The discussion of the many forms of trust explored in the paper seeks to provide a systems-level design perspective for information fusion trust quantification.

  16. An FMS Dynamic Production Scheduling Algorithm Considering Cutting Tool Failure and Cutting Tool Life

    NASA Astrophysics Data System (ADS)

    Setiawan, A.; Wangsaputra, R.; Martawirya, Y. Y.; Halim, A. H.

    2016-02-01

    This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to the unavailability of cutting tools, caused either by cutting tool failure or by reaching the cutting tool life limit. The FMS consists of parallel identical machines integrated with an automatic material handling system, and it runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. A job usually takes two stages. Each stage has sequential operations allocated to machines considering the cutting tool life. In the real situation, a cutting tool can fail before its life limit is reached. The objective in this paper is to develop a dynamic scheduling algorithm for when a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps: the first step generates the initial schedule, the second step determines the cutting tool failure time, the third step determines the system status at the cutting tool failure time, and the fourth step reschedules the unfinished jobs. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules result in different starting and completion times for each operation compared with the initial schedule.
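
    A minimal sketch of the four-step rescheduling flow described above, assuming a simple operation record and an earliest-available-machine dispatch rule; the data structures and function names are hypothetical, not the authors' implementation.

      from dataclasses import dataclass

      @dataclass
      class Operation:
          job: str
          machine: int
          start: float
          end: float

      def reschedule_on_tool_failure(schedule, failure_time, failed_machine):
          """Steps 3-4 of the four-step flow sketched from the abstract: capture the
          system status at the failure time, then reactively reschedule unfinished work.
          (Steps 1-2 -- the initial schedule and the failure time -- are inputs here.)"""
          # Step 3: system status at the cutting-tool failure time.
          unaffected = [op for op in schedule
                        if op.end <= failure_time
                        or (op.machine != failed_machine and op.start < failure_time)]
          to_reschedule = [op for op in schedule if op not in unaffected]

          # Step 4: reactive rescheduling of remaining operations on the working machines.
          working = sorted({op.machine for op in schedule} - {failed_machine})
          free_at = {m: max([failure_time] + [op.end for op in unaffected if op.machine == m])
                     for m in working}
          new_schedule = []
          for op in sorted(to_reschedule, key=lambda op: op.start):
              m = min(free_at, key=free_at.get)            # earliest-available machine
              start = free_at[m]
              duration = op.end - op.start
              new_schedule.append(Operation(op.job, m, start, start + duration))
              free_at[m] = start + duration
          return unaffected + new_schedule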

  17. Fuzzy Logic-Based Audio Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Malcangi, M.

    2008-11-01

    Audio and audio-pattern recognition is becoming one of the most important technologies for automatically controlling embedded systems. Fuzzy logic may be the most important enabling methodology due to its ability to rapidly and economically model such applications. An audio and audio-pattern recognition engine based on fuzzy logic has been developed for use in very low-cost and deeply embedded systems to automate human-to-machine and machine-to-machine interaction. This engine consists of simple digital signal-processing algorithms for feature extraction and normalization, and a set of pattern-recognition rules manually tuned or automatically tuned by a self-learning process.
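
    As a rough illustration of a manually tuned fuzzy rule base of the kind described, the toy sketch below classifies a single audio frame from two assumed features (zero-crossing rate and energy), using triangular membership functions and min for the AND within each rule; all membership parameters and rules are made up for illustration.

      import numpy as np

      def trimf(x, a, b, c):
          """Triangular membership function with corners a <= b <= c."""
          return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                       (c - x) / (c - b + 1e-12)), 0.0)

      def classify_frame(zero_cross_rate, energy):
          """Toy fuzzy rule base: two hand-tuned rules, min used for AND."""
          zcr_low  = trimf(zero_cross_rate, 0.0, 0.05, 0.15)
          zcr_high = trimf(zero_cross_rate, 0.10, 0.30, 0.60)
          eng_low  = trimf(energy, 0.0, 0.01, 0.05)
          eng_high = trimf(energy, 0.03, 0.20, 1.00)

          voiced_like = min(zcr_low, eng_high)   # rule 1: low ZCR AND high energy
          noise_like  = min(zcr_high, eng_low)   # rule 2: high ZCR AND low energy
          return {"voiced": float(voiced_like), "noise": float(noise_like)}

      print(classify_frame(zero_cross_rate=0.04, energy=0.25))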

  18. Principle and design of small-sized and high-definition x-ray machine

    NASA Astrophysics Data System (ADS)

    Zhao, Anqing

    2010-10-01

    The paper discusses the circuit design and working principles of a VMOS PWM-type 75 kV, 10 mA high-frequency X-ray machine. The system mainly consists of a silicon controlled rectifier, a VMOS-tube PWM-type high-frequency high-voltage inverter circuit, a filament inverter circuit, a high-voltage rectifier filter circuit, and an X-ray tube. The working process is carried out under the control of a single-chip microcomputer. Due to its small size and high imaging resolution, the X-ray machine is mostly adopted for emergency medical diagnosis and specific circumstances where nondestructive tests are conducted.

  19. Design of robotic cells based on relative handling modules with use of SolidWorks system

    NASA Astrophysics Data System (ADS)

    Gaponenko, E. V.; Anciferov, S. I.

    2018-05-01

    The article presents a diagrammed engineering solution for a robotic cell with six degrees of freedom for machining complex parts, consisting of a base with a tool installation module and a part machining module, both built as parallel-structure mechanisms. The output links of the part machining module and the tool installation module can each move along the X-Y-Z coordinate axes. A 3D model of the complex is designed in the SolidWorks system. It will be used further for carrying out engineering calculations and mathematical analysis and for obtaining all required documentation.

  20. Computer-Aided Drafting and Design Series. Educational Resources for the Machine Tool Industry, Course Syllabi, [and] Instructor's Handbook. Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 2-year vocational training program to prepare students for entry-level employment in computer-aided drafting and design in the machine tool industry. The program was developed through a modification of the DACUM (Developing a Curriculum)…

  1. Machine-learning-assisted materials discovery using failed experiments

    NASA Astrophysics Data System (ADS)

    Raccuglia, Paul; Elbert, Katherine C.; Adler, Philip D. F.; Falk, Casey; Wenny, Malia B.; Mollo, Aurelio; Zeller, Matthias; Friedler, Sorelle A.; Schrier, Joshua; Norquist, Alexander J.

    2016-05-01

    Inorganic-organic hybrid materials such as organically templated metal oxides, metal-organic frameworks (MOFs) and organohalide perovskites have been studied for decades, and hydrothermal and (non-aqueous) solvothermal syntheses have produced thousands of new materials that collectively contain nearly all the metals in the periodic table. Nevertheless, the formation of these compounds is not fully understood, and development of new compounds relies primarily on exploratory syntheses. Simulation- and data-driven approaches (promoted by efforts such as the Materials Genome Initiative) provide an alternative to experimental trial-and-error. Three major strategies are: simulation-based predictions of physical properties (for example, charge mobility, photovoltaic properties, gas adsorption capacity or lithium-ion intercalation) to identify promising target candidates for synthetic efforts; determination of the structure-property relationship from large bodies of experimental data, enabled by integration with high-throughput synthesis and measurement tools; and clustering on the basis of similar crystallographic structure (for example, zeolite structure classification or gas adsorption properties). Here we demonstrate an alternative approach that uses machine-learning algorithms trained on reaction data to predict reaction outcomes for the crystallization of templated vanadium selenites. We used information on ‘dark’ reactions—failed or unsuccessful hydrothermal syntheses—collected from archived laboratory notebooks from our laboratory, and added physicochemical property descriptions to the raw notebook information using cheminformatics techniques. We used the resulting data to train a machine-learning model to predict reaction success. When carrying out hydrothermal synthesis experiments using previously untested, commercially available organic building blocks, our machine-learning model outperformed traditional human strategies, and successfully predicted conditions for new organically templated inorganic product formation with a success rate of 89 per cent. Inverting the machine-learning model reveals new hypotheses regarding the conditions for successful product formation.
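
    The sketch below shows the general shape of such a reaction-outcome classifier on synthetic stand-in data, using a random forest rather than the authors' specific model; the descriptors, labels, and importance-based "inversion" step are illustrative assumptions only.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      # Hypothetical stand-in for a notebook-derived reaction table: each row is a
      # reaction described by physicochemical descriptors, each label is success/failure.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(600, 12))          # e.g. pH, reaction time, polarizability, ...
      y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=600) > 0).astype(int)

      model = RandomForestClassifier(n_estimators=300, random_state=0)
      print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

      # Fitting and then inspecting feature importances is one simple way to "invert"
      # the model and generate hypotheses about which conditions drive success.
      model.fit(X, y)
      print("most influential descriptor index:", int(np.argmax(model.feature_importances_)))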

  2. Surface orientation effects on bending properties of surgical mesh are independent of tensile properties.

    PubMed

    Simon, David D; Andrews, Sharon M; Robinson-Zeigler, Rebecca; Valdes, Thelma; Woods, Terry O

    2018-02-01

    Current mechanical testing of surgical mesh focuses primarily on tensile properties even though implanted devices are not subjected to pure tensile loads. Our objective was to determine the flexural (bending) properties of surgical mesh and determine if they correlate with mesh tensile properties. The flexural rigidity values of 11 different surgical mesh designs were determined along three textile directions (machine, cross-machine, and 45° to machine; n = 5 for each) using ASTM D1388-14 while tracking surface orientation. Tensile testing was also performed on the same specimens using ASTM D882-12. Linear regressions were performed to compare mesh flexural rigidity to mesh thickness, areal mass density, filament diameter, ultimate tensile strength, and maximum extension. Of 33 mesh specimen groups, 30 had significant differences in flexural rigidity values when comparing surface orientations (top and bottom). Flexural rigidity and mesh tensile properties also varied with textile direction (machine and cross-machine). There was no strong correlation between the flexural and tensile properties, with mesh thickness having the best overall correlation with flexural rigidity. Currently, surface orientation is not indicated on marketed surgical mesh, and a single mesh may behave differently depending on the direction of loading. The lack of correlation between flexural stiffness and tensile properties indicates the need to examine mesh bending stiffness to provide a more comprehensive understanding of surgical mesh mechanical behaviors. Further investigation is needed to determine if these flexural properties result in the surgical mesh behaving mechanically different depending on implantation direction. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 854-862, 2018. © 2017 Wiley Periodicals, Inc.

  3. Thermodynamic analysis of resources used in manufacturing processes.

    PubMed

    Gutowski, Timothy G; Branham, Matthew S; Dahmus, Jeffrey B; Jones, Alissa J; Thiriez, Alexandre

    2009-03-01

    In this study we use a thermodynamic framework to characterize the material and energy resources used in manufacturing processes. The analysis and data span a wide range of processes from "conventional" processes such as machining, casting, and injection molding, to the so-called "advanced machining" processes such as electrical discharge machining and abrasive waterjet machining, and to the vapor-phase processes used in semiconductor and nanomaterials fabrication. In all, 20 processes are analyzed. The results show that the intensity of materials and energy used per unit of mass of material processed (measured either as specific energy or exergy) has increased by at least 6 orders of magnitude over the past several decades. The increase of material/energy intensity use has been primarily a consequence of the introduction of new manufacturing processes, rather than changes in traditional technologies. This phenomenon has been driven by the desire for precise small-scale devices and product features and enabled by stable and declining material and energy prices over this period. We illustrate the relevance of thermodynamics (including exergy analysis) for all processes in spite of the fact that long-lasting focus in manufacturing has been on product quality--not necessarily energy/material conversion efficiency. We promote the use of thermodynamics tools for analysis of manufacturing processes within the context of rapidly increasing relevance of sustainable human enterprises. We confirm that exergy analysis can be used to identify where resources are lost in these processes, which is the first step in proposing and/or redesigning new more efficient processes.

  4. A semi-automated process for the production of custom-made shoes

    NASA Technical Reports Server (NTRS)

    Farmer, Franklin H.

    1991-01-01

    A more efficient, cost-effective and timely way of designing and manufacturing custom footwear is needed. A potential solution to this problem lies in the use of computer-aided design and manufacturing (CAD/CAM) techniques in the production of custom shoes. A prototype computer-based system was developed, and the system is primarily a software entity which directs and controls a 3-D scanner, a lathe or milling machine, and a pattern-cutting machine to produce the shoe last and the components to be assembled into a shoe. The steps in this process are: (1) scan the surface of the foot to obtain a 3-D image; (2) thin the foot surface data and create a tiled wire model of the foot; (3) interactively modify the wire model of the foot to produce a model of the shoe last; (4) machine the last; (5) scan the surface of the last and verify that it correctly represents the last model; (6) design cutting patterns for shoe uppers; (7) cut uppers; (8) machine an inverse mold for the shoe innersole/sole combination; (9) mold the innersole/sole; and (10) assemble the shoe. For all its capabilities, this system still requires the direction and assistance of skilled operators, and shoemakers to assemble the shoes. Currently, the system is running on a SUN3/260 workstation with a TAAC application accelerator. The software elements of the system are written in either Fortran or C and run under a UNIX operating system.

  5. Discriminative analysis of schizophrenia using support vector machine and recursive feature elimination on structural MRI images.

    PubMed

    Lu, Xiaobing; Yang, Yongzhe; Wu, Fengchun; Gao, Minjian; Xu, Yong; Zhang, Yue; Yao, Yongcheng; Du, Xin; Li, Chengwei; Wu, Lei; Zhong, Xiaomei; Zhou, Yanling; Fan, Ni; Zheng, Yingjun; Xiong, Dongsheng; Peng, Hongjun; Escudero, Javier; Huang, Biao; Li, Xiaobo; Ning, Yuping; Wu, Kai

    2016-07-01

    Structural abnormalities in schizophrenia (SZ) patients have been well documented with structural magnetic resonance imaging (MRI) data using voxel-based morphometry (VBM) and region of interest (ROI) analyses. However, these analyses can only detect group-wise differences and thus, have a poor predictive value for individuals. In the present study, we applied a machine learning method that combined support vector machine (SVM) with recursive feature elimination (RFE) to discriminate SZ patients from normal controls (NCs) using their structural MRI data. We first employed both VBM and ROI analyses to compare gray matter volume (GMV) and white matter volume (WMV) between 41 SZ patients and 42 age- and sex-matched NCs. The method of SVM combined with RFE was used to discriminate SZ patients from NCs using significant between-group differences in both GMV and WMV as input features. We found that SZ patients showed GM and WM abnormalities in several brain structures primarily involved in the emotion, memory, and visual systems. An SVM with a RFE classifier using the significant structural abnormalities identified by the VBM analysis as input features achieved the best performance (an accuracy of 88.4%, a sensitivity of 91.9%, and a specificity of 84.4%) in the discriminative analyses of SZ patients. These results suggested that distinct neuroanatomical profiles associated with SZ patients might provide a potential biomarker for disease diagnosis, and machine-learning methods can reveal neurobiological mechanisms in psychiatric diseases.
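
    A minimal sketch of an SVM-with-RFE pipeline of the kind described, on synthetic stand-in data; the regional volume features and group labels below are simulated, not the study's MRI measurements.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.feature_selection import RFE
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Synthetic stand-in for per-region GMV/WMV differences (rows: subjects, columns: regions).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(83, 40))                 # 41 patients + 42 controls, as in the study
      y = np.array([1] * 41 + [0] * 42)
      X[y == 1, :5] += 0.8                          # a few "abnormal" regions carry signal

      # Linear SVM with recursive feature elimination, then a linear SVM classifier.
      clf = make_pipeline(
          StandardScaler(),
          RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10, step=1),
          SVC(kernel="linear", C=1.0),
      )
      print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())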

  6. Helium Leak Detection of Vessels in Fuel Transfer Cell (FTC) of Prototype Fast Breeder Reactor (PFBR)

    NASA Astrophysics Data System (ADS)

    Dutta, N. G.

    2012-11-01

    Bharatiya Nabhikiya Vidyut Nigam (BHAVINI) is engaged in the construction of the 500 MW Prototype Fast Breeder Reactor (PFBR) at Kalpakkam, Chennai. In this very important and prestigious national programme, the Special Product Division (SPD) of M/s Kay Bouvet Engg. Pvt. Ltd. (M/s KBEPL), Satara, is contributing in a major way by supplying many important sub-assemblies for PFBR, such as the Under Water Trolley (UWT), Airlocks (PAL, EAL), the Container and Storage Rack (CSR), and Vessels in the Fuel Transfer Cell (FTC). SPD of KBEPL caters to the requirements of government departments such as the Department of Atomic Energy (DAE), BARC and Defence, government undertakings such as NPCIL, BHAVINI and BHEL, and other precision heavy engineering industries. SPD is equipped with large horizontal boring machines, vertical boring machines, plano-milling machines, a Vertical Turret Lathe (VTL), a radial drilling machine, different types of welding machines, etc. PFBR is a 500 MWe sodium-cooled pool-type reactor in which energy is produced by fast-neutron fission of mixed uranium-plutonium oxide pellets; it also breeds uranium by converting thorium placed along with the fuel rods in the reactor. In the long run, the breeder reactor produces more fuel than it consumes. India has taken the lead in pursuing the Fast Breeder Reactor Programme to produce electricity primarily because India has large reserves of thorium. To use thorium as fuel in the future, thorium has to be converted into uranium by PFBR technology.

  7. Modeling Music Emotion Judgments Using Machine Learning Methods

    PubMed Central

    Vempala, Naresh N.; Russo, Frank A.

    2018-01-01

    Emotion judgments and five channels of physiological data were obtained from 60 participants listening to 60 music excerpts. Various machine learning (ML) methods were used to model the emotion judgments inclusive of neural networks, linear regression, and random forests. Input for models of perceived emotion consisted of audio features extracted from the music recordings. Input for models of felt emotion consisted of physiological features extracted from the physiological recordings. Models were trained and interpreted with consideration of the classic debate in music emotion between cognitivists and emotivists. Our models supported a hybrid position wherein emotion judgments were influenced by a combination of perceived and felt emotions. In comparing the different ML approaches that were used for modeling, we conclude that neural networks were optimal, yielding models that were flexible as well as interpretable. Inspection of a committee machine, encompassing an ensemble of networks, revealed that arousal judgments were predominantly influenced by felt emotion, whereas valence judgments were predominantly influenced by perceived emotion. PMID:29354080
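
    A small sketch of one of the modeling routes described (a neural network regressor mapping features to valence/arousal judgments), on synthetic stand-in data; the feature set, targets and network size are assumptions, not the authors' configuration.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      # Hypothetical stand-in data: rows are excerpts, columns are audio features
      # (for perceived emotion) or physiological features (for felt emotion);
      # targets are valence and arousal ratings.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 20))
      y = np.column_stack([X[:, :5].mean(axis=1), X[:, 5:10].mean(axis=1)])  # [valence, arousal]

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
      net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=1)
      net.fit(X_tr, y_tr)
      print("held-out R^2:", net.score(X_te, y_te))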

  8. Modeling Music Emotion Judgments Using Machine Learning Methods.

    PubMed

    Vempala, Naresh N; Russo, Frank A

    2017-01-01

    Emotion judgments and five channels of physiological data were obtained from 60 participants listening to 60 music excerpts. Various machine learning (ML) methods were used to model the emotion judgments inclusive of neural networks, linear regression, and random forests. Input for models of perceived emotion consisted of audio features extracted from the music recordings. Input for models of felt emotion consisted of physiological features extracted from the physiological recordings. Models were trained and interpreted with consideration of the classic debate in music emotion between cognitivists and emotivists. Our models supported a hybrid position wherein emotion judgments were influenced by a combination of perceived and felt emotions. In comparing the different ML approaches that were used for modeling, we conclude that neural networks were optimal, yielding models that were flexible as well as interpretable. Inspection of a committee machine, encompassing an ensemble of networks, revealed that arousal judgments were predominantly influenced by felt emotion, whereas valence judgments were predominantly influenced by perceived emotion.

  9. How state taxes and policies targeting soda consumption modify the association between school vending machines and student dietary behaviors: a cross-sectional analysis.

    PubMed

    Taber, Daniel R; Chriqui, Jamie F; Vuillaume, Renee; Chaloupka, Frank J

    2014-01-01

    Sodas are widely sold in vending machines and other school venues in the United States, particularly in high school. Research suggests that policy changes have reduced soda access, but the impact of reduced access on consumption is unclear. This study was designed to identify student, environmental, or policy characteristics that modify the associations between school vending machines and student dietary behaviors. Data on school vending machine access and student diet were obtained as part of the National Youth Physical Activity and Nutrition Study (NYPANS) and linked to state-level data on soda taxes, restaurant taxes, and state laws governing the sale of soda in schools. Regression models were used to: 1) estimate associations between vending machine access and soda consumption, fast food consumption, and lunch source, and 2) determine if associations were modified by state soda taxes, restaurant taxes, laws banning in-school soda sales, or student characteristics (race/ethnicity, sex, home food access, weight loss behaviors.). Contrary to the hypothesis, students tended to consume 0.53 fewer servings of soda/week (95% CI: -1.17, 0.11) and consume fast food on 0.24 fewer days/week (95% CI: -0.44, -0.05) if they had in-school access to vending machines. They were also less likely to consume soda daily (23.9% vs. 27.9%, average difference  =  -4.02, 95% CI: -7.28, -0.76). However, these inverse associations were observed primarily among states with lower soda and restaurant tax rates (relative to general food tax rates) and states that did not ban in-school soda sales. Associations did not vary by any student characteristics except for weight loss behaviors. Isolated changes to the school food environment may have unintended consequences unless policymakers incorporate other initiatives designed to discourage overall soda consumption.
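
    The effect-modification analysis described above can be sketched with a regression containing an interaction term; the simulated data and variable names below are placeholders, not NYPANS data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Simulated stand-in: the key idea is the interaction term, which lets the
      # vending-machine association differ between low-tax and high-tax states.
      rng = np.random.default_rng(0)
      n = 2000
      df = pd.DataFrame({
          "vending_access": rng.integers(0, 2, n),
          "low_soda_tax":   rng.integers(0, 2, n),
      })
      df["soda_servings"] = (
          3.0 - 0.5 * df.vending_access * df.low_soda_tax + rng.normal(0, 1.5, n)
      )

      model = smf.ols("soda_servings ~ vending_access * low_soda_tax", data=df).fit()
      print(model.params)   # the interaction coefficient captures the effect modification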

  10. How State Taxes and Policies Targeting Soda Consumption Modify the Association between School Vending Machines and Student Dietary Behaviors: A Cross-Sectional Analysis

    PubMed Central

    Taber, Daniel R.; Chriqui, Jamie F.; Vuillaume, Renee; Chaloupka, Frank J.

    2014-01-01

    Background Sodas are widely sold in vending machines and other school venues in the United States, particularly in high school. Research suggests that policy changes have reduced soda access, but the impact of reduced access on consumption is unclear. This study was designed to identify student, environmental, or policy characteristics that modify the associations between school vending machines and student dietary behaviors. Methods Data on school vending machine access and student diet were obtained as part of the National Youth Physical Activity and Nutrition Study (NYPANS) and linked to state-level data on soda taxes, restaurant taxes, and state laws governing the sale of soda in schools. Regression models were used to: 1) estimate associations between vending machine access and soda consumption, fast food consumption, and lunch source, and 2) determine if associations were modified by state soda taxes, restaurant taxes, laws banning in-school soda sales, or student characteristics (race/ethnicity, sex, home food access, weight loss behaviors.) Results Contrary to the hypothesis, students tended to consume 0.53 fewer servings of soda/week (95% CI: -1.17, 0.11) and consume fast food on 0.24 fewer days/week (95% CI: -0.44, -0.05) if they had in-school access to vending machines. They were also less likely to consume soda daily (23.9% vs. 27.9%, average difference = -4.02, 95% CI: -7.28, -0.76). However, these inverse associations were observed primarily among states with lower soda and restaurant tax rates (relative to general food tax rates) and states that did not ban in-school soda sales. Associations did not vary by any student characteristics except for weight loss behaviors. Conclusion Isolated changes to the school food environment may have unintended consequences unless policymakers incorporate other initiatives designed to discourage overall soda consumption. PMID:25083906

  11. Present and Future of M2M

    NASA Astrophysics Data System (ADS)

    Ono, Satoru; Watanabe, Takashi

    In recent years, rapid progress in hardware and software technologies has made tiny, low-cost information devices (hereinafter referred to as Machines) widely available. M2M (Machine to Machine) has attracted much attention: many tiny machines are connected to each other through networks with minimal human intervention to provide smooth and intelligent management. M2M is a promising core technology providing timely, flexible, efficient and comprehensive service at low cost. M2M has a wide variety of applications, including energy management systems, environmental monitoring systems, intelligent transport systems, industrial automation systems and others. M2M consists of terminals and the networks that connect them. In this paper, we mainly focus on M2M networking and discuss the future direction of the technology.

  12. Reliability Analysis of Uniaxially Ground Brittle Materials

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.

    1995-01-01

    The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.
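
    A minimal sketch of the two-parameter Weibull strength model with a simple weakest-link size effect, which underlies the kind of size-effect and reliability predictions reported; the numbers are hypothetical and this is not the Batdorf mixed-mode analysis used in the study.

      import numpy as np

      def weibull_failure_prob(stress, m, sigma0, area, area0=1.0):
          """Two-parameter Weibull failure probability with a simple area size effect.

          P_f = 1 - exp(-(area / area0) * (stress / sigma0)**m)
          sigma0 is the characteristic strength of a specimen of reference area area0.
          """
          return 1.0 - np.exp(-(area / area0) * (stress / sigma0) ** m)

      # Example: how a plate with 10x the stressed area of a flexure bar shifts the
      # failure probability at the same stress levels (weakest-link scaling).
      m, sigma0 = 10.0, 400.0                      # hypothetical modulus and strength (MPa)
      stresses = np.linspace(200, 500, 4)
      print(weibull_failure_prob(stresses, m, sigma0, area=1.0))
      print(weibull_failure_prob(stresses, m, sigma0, area=10.0))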

  13. Ultrashort pulse laser machining of metals and alloys

    DOEpatents

    Perry, Michael D.; Stuart, Brent C.

    2003-09-16

    The invention consists of a method for high precision machining (cutting, drilling, sculpting) of metals and alloys. By using pulses of a duration in the range of 10 femtoseconds to 100 picoseconds, extremely precise machining can be achieved with essentially no heat or shock affected zone. Because the pulses are so short, there is negligible thermal conduction beyond the region removed resulting in negligible thermal stress or shock to the material beyond approximately 0.1-1 micron (dependent upon the particular material) from the laser machined surface. Due to the short duration, the high intensity (>10^12 W/cm^2) associated with the interaction converts the material directly from the solid-state into an ionized plasma. Hydrodynamic expansion of the plasma eliminates the need for any ancillary techniques to remove material and produces extremely high quality machined surfaces with negligible redeposition either within the kerf or on the surface. Since there is negligible heating beyond the depth of material removed, the composition of the remaining material is unaffected by the laser machining process. This enables high precision machining of alloys and even pure metals with no change in grain structure.

  14. Walking robot: A design project for undergraduate students

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The design and construction of the University of Maryland walking machine was completed during the 1989 to 1990 academic year. It was required that the machine be capable of completing a number of tasks including walking a straight line, turning to change direction, and maneuvering over an obstacle such as a set of stairs. The machine consists of two sets of four telescoping legs that alternately support the entire structure. A gear box and crank arm assembly is connected to the leg sets to provide the power required for the translational motion of the machine. By retracting all eight legs, the robot comes to rest on a central Bigfoot support. Turning is accomplished by rotating the machine about this support. The machine can be controlled by using either a user-operated remote tether or the onboard computer for the execution of control commands. Absolute encoders are attached to all motors to provide the control computer with information regarding the status of the motors. Long and short range infrared sensors provide the computer with feedback information regarding the machine's position relative to a series of stripes and reflectors. These infrared sensors simulate how the robot might sense and gain information about the environment of Mars.

  15. Nanowire nanocomputer as a finite-state machine.

    PubMed

    Yao, Jun; Yan, Hao; Das, Shamik; Klemic, James F; Ellenbogen, James C; Lieber, Charles M

    2014-02-18

    Implementation of complex computer circuits assembled from the bottom up and integrated on the nanometer scale has long been a goal of electronics research. It requires a design and fabrication strategy that can address individual nanometer-scale electronic devices, while enabling large-scale assembly of those devices into highly organized, integrated computational circuits. We describe how such a strategy has led to the design, construction, and demonstration of a nanoelectronic finite-state machine. The system was fabricated using a design-oriented approach enabled by a deterministic, bottom-up assembly process that does not require individual nanowire registration. This methodology allowed construction of the nanoelectronic finite-state machine through modular design using a multitile architecture. Each tile/module consists of two interconnected crossbar nanowire arrays, with each cross-point consisting of a programmable nanowire transistor node. The nanoelectronic finite-state machine integrates 180 programmable nanowire transistor nodes in three tiles or six total crossbar arrays, and incorporates both sequential and arithmetic logic, with extensive intertile and intratile communication that exhibits rigorous input/output matching. Our system realizes the complete 2-bit logic flow and clocked control over state registration that are required for a finite-state machine or computer. The programmable multitile circuit was also reprogrammed to a functionally distinct 2-bit full adder with 32-set matched and complete logic output. These steps forward and the ability of our unique design-oriented deterministic methodology to yield more extensive multitile systems suggest that proposed general-purpose nanocomputers can be realized in the near future.
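
    Functionally, the demonstrated circuits can be summarized by small behavioural models; the sketch below gives a 2-bit full adder (whose 32 input combinations correspond to the "32-set" matched output mentioned) and a generic clocked state-update loop, purely as software illustrations of the logic, not of the nanowire hardware.

      def full_adder_2bit(a, b, carry_in=0):
          """Behavioural model of a 2-bit full adder: adds two 2-bit numbers plus a
          carry, returning a 2-bit sum and a carry-out bit."""
          total = (a & 0b11) + (b & 0b11) + (carry_in & 0b1)
          return total & 0b11, (total >> 2) & 0b1          # (sum, carry_out)

      def run_fsm(inputs, transition, state=0):
          """Clocked finite-state machine: one state update per input symbol.
          transition maps (state, input) -> next state."""
          for x in inputs:
              state = transition[(state, x)]
          return state

      # Example: a 2-bit counter that increments on input 1 and holds on input 0.
      transition = {(s, x): (s + x) % 4 for s in range(4) for x in (0, 1)}
      print(full_adder_2bit(0b11, 0b01))        # (0b00, 1): 3 + 1 = 4
      print(run_fsm([1, 1, 0, 1], transition))  # state 3 after three increments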

  16. Nanowire nanocomputer as a finite-state machine

    PubMed Central

    Yao, Jun; Yan, Hao; Das, Shamik; Klemic, James F.; Ellenbogen, James C.; Lieber, Charles M.

    2014-01-01

    Implementation of complex computer circuits assembled from the bottom up and integrated on the nanometer scale has long been a goal of electronics research. It requires a design and fabrication strategy that can address individual nanometer-scale electronic devices, while enabling large-scale assembly of those devices into highly organized, integrated computational circuits. We describe how such a strategy has led to the design, construction, and demonstration of a nanoelectronic finite-state machine. The system was fabricated using a design-oriented approach enabled by a deterministic, bottom–up assembly process that does not require individual nanowire registration. This methodology allowed construction of the nanoelectronic finite-state machine through modular design using a multitile architecture. Each tile/module consists of two interconnected crossbar nanowire arrays, with each cross-point consisting of a programmable nanowire transistor node. The nanoelectronic finite-state machine integrates 180 programmable nanowire transistor nodes in three tiles or six total crossbar arrays, and incorporates both sequential and arithmetic logic, with extensive intertile and intratile communication that exhibits rigorous input/output matching. Our system realizes the complete 2-bit logic flow and clocked control over state registration that are required for a finite-state machine or computer. The programmable multitile circuit was also reprogrammed to a functionally distinct 2-bit full adder with 32-set matched and complete logic output. These steps forward and the ability of our unique design-oriented deterministic methodology to yield more extensive multitile systems suggest that proposed general-purpose nanocomputers can be realized in the near future. PMID:24469812

  17. Prosthetic EMG control enhancement through the application of man-machine principles

    NASA Technical Reports Server (NTRS)

    Simcox, W. A.

    1977-01-01

    An area in medicine that appears suited to man-machine principles is rehabilitation research, particularly when the motor aspects of the body are involved. If one considers the limb, whether functional or not, as the machine, the brain as the controller and the neuromuscular system as the man-machine interface, the human body is reduced to a man-machine system that can benefit from the principles behind such systems. The area of rehabilitation that this paper deals with is that of an arm amputee and his prosthetic device. Reducing this area to its man-machine basics, the problem becomes one of attaining natural multiaxis prosthetic control using electromyographic activity (EMG) as the means of communication between man and prosthesis. In order to use EMG as the communication channel it must be amplified and processed to yield a high information signal suitable for control. The most common processing scheme employed is termed Mean Value Processing. This technique for extracting the useful EMG signal consists of a differential-to-single-ended conversion of the surface activity followed by rectification and smoothing.

  18. Stress and Strain Distributions during Machining of Ti-6Al-4V at Ambient and Cryogenic Temperatures

    NASA Astrophysics Data System (ADS)

    Rahman, Md. Fahim

    Dry and liquid-nitrogen pre-cooled Ti-6Al-4V samples were machined at a cutting speed of 43.2 m/min and at low (0.1 mm/rev) to high (0.4 mm/rev) feed rates to understand the effects of temperature and strain rate on chip microstructures. During cryogenic machining, it was observed that between feed rates of 0.10 and 0.30 mm/rev, a 25% reduction in pressure on the tool occurred. A smaller number of chips and lower tool/chip contact time and temperature were observed (compared to dry machining under ambient conditions). An in-situ set-up consisting of a microscope and a lathe was constructed and helped to propose a novel serrated chip formation mechanism when microstructures (strain localization) and surface roughness were considered. Dimpled fracture surfaces observed in high-speed-machined chips were formed due to stable crack propagation, which was also recorded during in-situ machining. An instability criterion was developed that showed easier strain localization within the 0.10-0.30 mm/rev feed rate range.

  19. A micro-machined source transducer for a parametric array in air.

    PubMed

    Lee, Haksue; Kang, Daesil; Moon, Wonkyu

    2009-04-01

    Parametric array applications in air, such as highly directional parametric loudspeaker systems, usually rely on large radiators to generate the high-intensity primary beams required for nonlinear interactions. However, a conventional transducer, as a primary wave projector, requires a great deal of electrical power because its electroacoustic efficiency is very low due to the large characteristic mechanical impedance in air. The feasibility of a micro-machined ultrasonic transducer as an efficient finite-amplitude wave projector was studied. A piezoelectric micro-machined ultrasonic transducer array consisting of lead zirconate titanate uni-morph elements was designed and fabricated for this purpose. Theoretical and experimental evaluations showed that a micro-machined ultrasonic transducer array can be used as an efficient source transducer for a parametric array in air. The beam patterns and propagation curves of the difference frequency wave and the primary wave generated by the micro-machined ultrasonic transducer array were measured. Although the theoretical results were based on ideal parametric array models, the theoretical data explained the experimental results reasonably well. These experiments demonstrated the potential of micro-machined primary wave projectors.

  20. Comparison of shear wave velocities on ultrasound elastography between different machines, transducers, and acquisition depths: a phantom study.

    PubMed

    Shin, Hyun Joo; Kim, Myung-Joon; Kim, Ha Yan; Roh, Yun Ho; Lee, Mi-Jung

    2016-10-01

    To investigate consistency in shear wave velocities (SWVs) on ultrasound elastography using different machines, transducers and acquisition depths. The SWVs were measured using an elasticity phantom with a Young's modulus of 16.9 kPa, with three recently introduced ultrasound elastography machines (A, B and C from different vendors) and two transducers (low and high frequencies) at four depths (2, 3, 4 and 5 cm). Mean SWVs from 15 measurements and coefficient of variations (CVs) were compared between three machines, two transducers and four acquisition depths. The SWVs using the high frequency transducer were not acquired at 5 cm depth in machine B, and a high frequency transducer was not available in machine C. The mean SWVs in the three machines were different (p ≤ 0.002). The CVs were 0-0.09 in three machines. The mean SWVs between the two transducers were different (p < 0.001) except at 4 and 5 cm depths in machine A. The SWVs were affected by the acquisition depths in all conditions (p < 0.001). There is considerable difference in SWVs on ultrasound elastography depending on different machines, transducers and acquisition depths. Caution is needed when using the cutoff values of SWVs in different conditions. • The shear wave velocities (SWVs) are different between different ultrasound elastography machines • The SWVs are also different between different transducers and acquisition depths • Caution is needed when using the cutoff SWVs measured under different conditions.
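
    The comparison statistics used (mean SWV and coefficient of variation per machine/transducer/depth condition) reduce to a few lines; the measurement values below are placeholders, not the study's data.

      import numpy as np

      # Coefficient of variation: CV = standard deviation / mean, per condition.
      measurements = {
          ("machine A", "2 cm"): [2.31, 2.28, 2.35, 2.30, 2.33],
          ("machine B", "2 cm"): [2.10, 2.12, 2.08, 2.15, 2.11],
      }
      for condition, swv in measurements.items():
          swv = np.asarray(swv)
          print(condition, "mean SWV =", round(swv.mean(), 3),
                "CV =", round(swv.std(ddof=1) / swv.mean(), 3))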

  1. Signatures of compact halos of sterile-neutrino dark matter

    NASA Astrophysics Data System (ADS)

    Kühnel, Florian; Ohlsson, Tommy

    2017-11-01

    We investigate compact halos of sterile-neutrino dark matter and examine observable signatures with respect to neutrino and photon emission. Primarily, we consider two cases: primordial black-hole halos and ultracompact minihalos. In both cases, we find that there exists a broad range of possible parameter choices such that detection in the near future with x-ray and gamma-ray telescopes may well be possible. In fact, for energies above 10 TeV, the neutrino telescope IceCube would be a splendid detection machine for such macroscopic dark-matter candidates.

  2. Land Vehicle Navigation ? A Worldwide Perspective

    NASA Astrophysics Data System (ADS)

    French, Robert L.

    This paper was presented at the NAV '90 conference and was first published in the Journal in 1991 (Vol. 44, p. 25). It is followed by comments from Christopher Querée.The future shakeout and consolidation of vehicle navigation technologies and systems approaches will occur primarily in the vehicle location, mobile data communications, and man/machine interface areas. Digital maps will not be directly affected because, although there is still a dearth of formal standards, there is already a high degree of uniformity among approaches being pursued in all parts of the world.

  3. Fourdrinier-Machine Tender (paper & pulp, wallboard) 539.782; Back Tender, Paper Machine (paper & pulp) 534.782--Technical Report on Development of USTES Aptitude Test Battery.

    ERIC Educational Resources Information Center

    Manpower Administration (DOL), Washington, DC. U.S. Training and Employment Service.

    The United States Training and Employment Service General Aptitude Test Battery (GATB), first published in 1947, has been included in a continuing program of research to validate the tests against success in many different occupations. The GATB consists of 12 tests which measure nine aptitudes: General Learning Ability; Verbal Aptitude; Numerical…

  4. Improving Energy Efficiency in CNC Machining

    NASA Astrophysics Data System (ADS)

    Pavanaskar, Sushrut S.

    We present our work on analyzing and improving the energy efficiency of multi-axis CNC milling process. Due to the differences in energy consumption behavior, we treat 3- and 5-axis CNC machines separately in our work. For 3-axis CNC machines, we first propose an energy model that estimates the energy requirement for machining a component on a specified 3-axis CNC milling machine. Our model makes machine-specific predictions of energy requirements while also considering the geometric aspects of the machining toolpath. Our model - and the associated software tool - facilitate direct comparison of various alternative toolpath strategies based on their energy-consumption performance. Further, we identify key factors in toolpath planning that affect energy consumption in CNC machining. We then use this knowledge to propose and demonstrate a novel toolpath planning strategy that may be used to generate new toolpaths that are inherently energy-efficient, inspired by research on digital micrography -- a form of computational art. For 5-axis CNC machines, the process planning problem consists of several sub-problems that researchers have traditionally solved separately to obtain an approximate solution. After illustrating the need to solve all sub-problems simultaneously for a truly optimal solution, we propose a unified formulation based on configuration space theory. We apply our formulation to solve a problem variant that retains key characteristics of the full problem but has lower dimensionality, allowing visualization in 2D. Given the complexity of the full 5-axis toolpath planning problem, our unified formulation represents an important step towards obtaining a truly optimal solution. With this work on the two types of CNC machines, we demonstrate that without changing the current infrastructure or business practices, machine-specific, geometry-based, customized toolpath planning can save energy in CNC machining.
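
    A minimal sketch of a machine-specific toolpath energy estimate, assuming the common decomposition into a constant power draw over time plus a cutting term proportional to material removed; the constants, the segment representation and the comparison are illustrative assumptions, not the authors' model.

      # E = P_idle * t + e_cut * V_removed, summed over toolpath segments.
      def toolpath_energy(segments, p_idle_w=1200.0, e_cut_j_per_mm3=2.5, feed_mm_s=20.0):
          """segments: list of (length_mm, volume_removed_mm3) pairs."""
          energy_j = 0.0
          for length_mm, volume_mm3 in segments:
              t = length_mm / feed_mm_s                 # time spent on this segment
              energy_j += p_idle_w * t                  # constant (idle/spindle) power draw
              energy_j += e_cut_j_per_mm3 * volume_mm3  # cutting energy for removed material
          return energy_j

      # Comparing two alternative toolpaths for the same part (hypothetical numbers):
      zigzag = [(120.0, 300.0)] * 40
      spiral = [(100.0, 300.0)] * 40                    # shorter non-cutting/engagement moves
      print(toolpath_energy(zigzag), toolpath_energy(spiral))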

  5. Information about Student Enrollment, College Staff and the Budget.

    ERIC Educational Resources Information Center

    College of the Canyons, Santa Clarita, CA. Office of Institutional Development.

    Consisting primarily of charts and tables, this report provides historical data on student enrollment, college staff, and the budget at California's College of the Canyons, focusing primarily on the period from 1990-94. The first section provides tables on student enrollment, including total headcount; enrollment by gender, age group,…

  6. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
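
    The statistical complexity referred to here is the Shannon entropy of the stationary distribution over an ɛ-machine's causal states; the sketch below computes it for the standard Golden Mean process as a worked example (the choice of process is ours, for illustration only).

      import numpy as np

      # Golden Mean process (no two consecutive 0s): two causal states A and B.
      # From A: emit 1 -> A (p=0.5), emit 0 -> B (p=0.5). From B: emit 1 -> A (p=1.0).
      T = np.array([[0.5, 0.5],
                    [1.0, 0.0]])

      # Stationary distribution pi satisfies pi = pi @ T (left eigenvector for eigenvalue 1).
      eigvals, eigvecs = np.linalg.eig(T.T)
      pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      pi = pi / pi.sum()

      C_mu = -np.sum(pi * np.log2(pi))
      print("stationary distribution:", pi)               # ~[2/3, 1/3]
      print("statistical complexity C_mu (bits):", C_mu)  # ~0.918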

  7. Boosted Regression Trees Outperforms Support Vector Machines in Predicting (Regional) Yields of Winter Wheat from Single and Cumulated Dekadal Spot-VGT Derived Normalized Difference Vegetation Indices

    NASA Astrophysics Data System (ADS)

    Stas, Michiel; Dong, Qinghan; Heremans, Stien; Zhang, Beier; Van Orshoven, Jos

    2016-08-01

    This paper compares two machine learning techniques to predict regional winter wheat yields. The models, based on Boosted Regression Trees (BRT) and Support Vector Machines (SVM), are constructed of Normalized Difference Vegetation Indices (NDVI) derived from low resolution SPOT VEGETATION satellite imagery. Three types of NDVI-related predictors were used: Single NDVI, Incremental NDVI and Targeted NDVI. BRT and SVM were first used to select features with high relevance for predicting the yield. Although the exact selections differed between the prefectures, certain periods with high influence scores for multiple prefectures could be identified. The same period of high influence stretching from March to June was detected by both machine learning methods. After feature selection, BRT and SVM models were applied to the subset of selected features for actual yield forecasting. Whereas both machine learning methods returned very low prediction errors, BRT seems to slightly but consistently outperform SVM.
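
    A compact sketch of the model comparison on synthetic stand-in data (the dekadal NDVI predictors and yields below are simulated): boosted regression trees via scikit-learn's gradient boosting against an RBF support vector regressor.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.svm import SVR
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in: rows are prefecture-years, columns are dekadal NDVI values;
      # the target is the regional winter wheat yield.
      rng = np.random.default_rng(0)
      X = rng.uniform(0.1, 0.9, size=(200, 36))
      y = 2.0 + 4.0 * X[:, 8:17].mean(axis=1) + rng.normal(0, 0.2, 200)  # spring dekads matter most

      brt = GradientBoostingRegressor(random_state=0)                    # boosted regression trees
      svm = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))

      for name, model in [("BRT", brt), ("SVM", svm)]:
          print(name, cross_val_score(model, X, y, cv=5, scoring="r2").mean())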

  8. Developing Lathing Parameters for PBX 9501

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodrum, Randall Brock

    This thesis presents the work performed on lathing PBX 9501 to gather and analyze cutting force and temperature data during the machining process. This data will be used to decrease federal-regulation-constrained machining time of the high explosive PBX 9501. The effects of the machining parameters depth of cut, surface feet per minute, and inches per revolution on cutting force and cutting interface temperature were evaluated. Cutting tools of 0.005-inch and 0.05-inch tip radius were tested to determine what effect the tool shape had on the machining process as well. A consistently repeatable relationship of temperature to changing depth of cut and surface feet per minute is found, while only a weak dependence was found on changing inches per revolution. Results also show the relation of cutting force to depth of cut and inches per revolution, while weak dependence on SFM is found. Conclusions suggest rapid, shallow cuts optimize machining time for a billet of PBX 9501, while minimizing temperature increase and cutting force.

  9. Feasibility study of a brine boiling machine by solar energy

    NASA Astrophysics Data System (ADS)

    Phayom, W.

    2018-06-01

    This study presented the technical and operational feasibility of a brine boiling machine using solar energy instead of firewood or husk for salt production. The solar salt brine boiling machine consisted of a boiling chamber with an enhanced thermal efficiency through use of a solar brine heater. The stainless steel solar salt brine boiling chamber had dimensions of 60 cm x 70 cm x 20 cm. The steel brine heater had dimensions of 70 cm x 80 cm x 20 cm. The tilt angle of both the boiling chamber and brine heater was 20 degrees from horizontal. The brine temperature in the reservoir tank was 42°C with a flow rate of 6.64 L/h discharging into the solar boiling machine. It was found that the thermal efficiency and overall efficiency of the solar salt brine boiling machine were 0.63 and 0.38, respectively, at a solar irradiance of 787.6 W/m2. The results show that using solar energy for a salt production system is feasible.

  10. The impact of machine learning techniques in the study of bipolar disorder: A systematic review.

    PubMed

    Librenza-Garcia, Diego; Kotzian, Bruno Jaskulski; Yang, Jessica; Mwangi, Benson; Cao, Bo; Pereira Lima, Luiza Nunes; Bermudez, Mariane Bagatin; Boeira, Manuela Vianna; Kapczinski, Flávio; Passos, Ives Cavalcante

    2017-09-01

    Machine learning techniques provide new methods to predict diagnosis and clinical outcomes at an individual level. We aim to review the existing literature on the use of machine learning techniques in the assessment of subjects with bipolar disorder. We systematically searched PubMed, Embase and Web of Science for articles published in any language up to January 2017. We found 757 abstracts and included 51 studies in our review. Most of the included studies used multiple levels of biological data to distinguish the diagnosis of bipolar disorder from other psychiatric disorders or healthy controls. We also found studies that assessed the prediction of clinical outcomes and studies using unsupervised machine learning to build more consistent clinical phenotypes of bipolar disorder. We concluded that given the clinical heterogeneity of samples of patients with BD, machine learning techniques may provide clinicians and researchers with important insights in fields such as diagnosis, personalized treatment and prognosis orientation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Modeling Gas and Gas Hydrate Accumulation in Marine Sediments Using a K-Nearest Neighbor Machine-Learning Technique

    NASA Astrophysics Data System (ADS)

    Wood, W. T.; Runyan, T. E.; Palmsten, M.; Dale, J.; Crawford, C.

    2016-12-01

    Natural gas (primarily methane) and gas hydrate accumulations require certain bio-geochemical as well as physical conditions, some of which are poorly sampled and/or poorly understood. We exploit recent advances in the prediction of seafloor porosity and heat flux via machine learning techniques (e.g., random forests and Bayesian networks) to predict the occurrence of gas and subsequently gas hydrate in marine sediments. The technique we use for prediction (actually guided interpolation) of key parameters in this study is K-nearest neighbors (KNN). KNN requires only minimal pre-processing of the data and predictors, and requires minimal run-time input, so the results are almost entirely data-driven. Specifically, we use new estimates of sedimentation rate and sediment type, along with recently derived compaction modeling, to estimate profiles of porosity and age. We combine the compaction with seafloor heat flux to estimate temperature with depth and geologic age, which, together with estimates of organic carbon and models of methanogenesis, yields limits on the production of methane. Results include geospatial predictions of gas (and gas hydrate) accumulations, with quantitative estimates of uncertainty. The Generic Earth Modeling System (GEMS) we have developed to derive the machine learning estimates is modular and easily updated with new algorithms or data.
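
    A minimal sketch of the K-nearest-neighbor guided interpolation idea on synthetic stand-in data; the predictors, target field and neighbor settings are assumptions, not the GEMS configuration.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      # Predict a sparsely sampled seafloor property everywhere from a few geospatial
      # predictors; coordinates, predictors and the target field are synthetic.
      rng = np.random.default_rng(0)
      n_obs = 500
      predictors = np.column_stack([
          rng.uniform(-60, 60, n_obs),        # latitude
          rng.uniform(-180, 180, n_obs),      # longitude
          rng.uniform(0, 6000, n_obs),        # water depth (m)
          rng.uniform(0, 20, n_obs),          # sedimentation rate (cm/kyr)
      ])
      heat_flux = (50 + 0.01 * predictors[:, 2] + 2.0 * predictors[:, 3]
                   + rng.normal(0, 5, n_obs))  # mW/m^2, synthetic target

      # Distance-weighted KNN; in practice the predictors would be scaled first.
      knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
      knn.fit(predictors, heat_flux)

      grid_point = np.array([[10.0, -45.0, 3500.0, 8.0]])
      print("predicted heat flux:", knn.predict(grid_point)[0])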

  12. KARL: A Knowledge-Assisted Retrieval Language. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Triantafyllopoulos, Spiros

    1985-01-01

    Data classification and storage are tasks typically performed by application specialists. In contrast, information users are primarily non-computer specialists who use information in their decision-making and other activities. Interaction efficiency between such users and the computer is often reduced by machine requirements and resulting user reluctance to use the system. This thesis examines the problems associated with information retrieval for non-computer specialist users, and proposes a method for communicating in restricted English that uses knowledge of the entities involved, relationships between entities, and basic English language syntax and semantics to translate the user requests into formal queries. The proposed method includes an intelligent dictionary, syntax and semantic verifiers, and a formal query generator. In addition, the proposed system has a learning capability that can improve portability and performance. With the increasing demand for efficient human-machine communication, the significance of this thesis becomes apparent. As human resources become more valuable, software systems that will assist in improving the human-machine interface will be needed and research addressing new solutions will be of utmost importance. This thesis presents an initial design and implementation as a foundation for further research and development into the emerging field of natural language database query systems.

  13. Process Development and Micro-Machining of MARBLE Foam-Cored Rexolite Hemi-Shell Ablator Capsules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randolph, Randall Blaine; Oertel, John A.; Schmidt, Derek William

    For this study, machined CH hemi-shell ablator capsules have been successfully produced by the MST-7 Target Fabrication Team at Los Alamos National Laboratory. Process development and micro-machining techniques have been developed to produce capsules for both the Omega and National Ignition Facility (NIF) campaigns. These capsules are gas filled up to 10 atm and consist of a machined plastic hemi-shell outer layer that accommodates various specially engineered low-density polystyrene foam cores. Machining and assembly of the two-part, step-jointed plastic hemi-shell outer layer required development of new techniques, processes, and tooling while still meeting very aggressive shot schedules for both campaigns. Finally, problems encountered and process improvements will be discussed that describe this very unique, complex capsule design approach through the first Omega proof-of-concept version to the larger NIF version.

  14. Dictionary of Basic Military Terms

    DTIC Science & Technology

    1965-04-01

    having nuclear charges. 101 ATOMNAYA SILOVAYA (ENERGETICHESKAYA) KORABEL'NAYA (SUDOVAYA) USTANOVKA (atomic power plant for ship propulsion) - A special...atomic power plant for ship propulsion consists of an atomic "boiler," or reactor, a turbine (steam or gas), and electromechanical machinery. The...type, is mounted on a heavy artillery tractor chassis. A high-speed trench-digging machine can dig trenches to a depth of 1.5 meters. The machine's

  15. The National Longitudinal Study of the High School Class of 1972 (NLS-72), Fifth Follow-Up (1986) Data File [machine-readable data file].

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    This machine-readable data file (MRDF) contains information from the fifth follow-up survey of the National Longitudinal Study of the High School Class of 1972. The survey was carried out along with the third survey of the High School and Beyond Study. The fifth follow-up data file consists of 12,841 records. The data tape contains information on…

  16. The Effect of a Brief Acceptance and Commitment Therapy Intervention on the Near-Miss Effect in Problem Gamblers

    ERIC Educational Resources Information Center

    Nastally, Becky L.; Dixon, Mark R.

    2012-01-01

    In the current study, 3 participants with a history of problem gambling were exposed to computerized slot machine play consisting of outcomes that depicted wins, losses, and near misses (2 out of 3 identical slot machine symbols). Participants were asked to rate each type of outcome in terms of its closeness to a win on a scale of 1 to 10 before…

  17. Rapid Assemblers for Voxel-Based VLSI Robotics

    DTIC Science & Technology

    2014-02-12

    relied on coin-cell batteries with high energy density, but low power density. Each of the actuators presented requires relatively high power...The device consists of a low power DC-DC low-to-high voltage converter operated by 4A cell batteries and an assembler, which is a grid of electrodes...design, simulate and fabricate complex 3D machines, as well as to repair, adapt and recycle existing machines, and to perform rigorous design

  18. Human Factors and Robotics: Current Status and Future Prospects.

    DTIC Science & Technology

    1981-10-01

    relatively simple "pick and place" machines which have mechanical arms and hands for transferring workpieces, and may be reprogrammable. Japan’s...tasks can consist of self-monitoring of activity or the control of other machines. Spot welding in the manufacture of automobiles represents probably...application alone. On an automobile production line, the robot must be able to remember several different body styles (e.g., 2-door versus 4-door) with

  19. FMEA and RAM Analysis for the Multi Canister Overpack (MCO) Handling Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SWENSON, C.E.

    2000-06-01

    The Failure Modes and Effects Analysis and the Reliability, Availability, and Maintainability Analysis performed for the Multi-Canister Overpack Handling Machine (MHM) has shown that the current design provides for a safe system, but the reliability of the system (primarily due to the complexity of the interlocks and permissive controls) is relatively low. No specific failure modes were identified where significant consequences to the public occurred, or where significant impact to nearby workers should be expected. The overall reliability calculation for the MHM shows a 98.1 percent probability of operating for eight hours without failure, and an availability of the MHM of 90 percent. The majority of the reliability issues are found in the interlocks and controls. The availability of appropriate spare parts and maintenance personnel, coupled with well written operating procedures, will play a more important role in successful mission completion for the MHM than other less complicated systems.
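
    As a rough consistency check on the figures quoted above, and assuming a constant-failure-rate (exponential) reliability model (our assumption; the report does not state the model), the 98.1 percent probability of surviving an eight-hour mission implies a particular failure rate, derived in the short sketch below.

        # Back out an implied failure rate from R(8 h) = 0.981 under an assumed
        # constant-failure-rate (exponential) model: R(t) = exp(-lambda * t).
        import math

        R_mission, t_mission = 0.981, 8.0          # values quoted in the abstract
        lam = -math.log(R_mission) / t_mission     # implied failures per hour
        mtbf = 1.0 / lam                           # implied mean time between failures
        print(f"implied failure rate: {lam:.5f} /h, MTBF: {mtbf:.0f} h")
        # For comparison, availability = uptime / (uptime + downtime); the abstract
        # quotes 90 percent for the MHM, which also reflects maintenance time.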

  20. Industry's tireless eyes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-08-01

    This article reports that there are literally hundreds of machine vision systems from which to choose. They range in cost from $10,000 to $1,000,000. Most have been designed for specific applications; the same systems, if used for a different application, may fail dismally. How can you avoid wasting money on inferior, useless, or nonexpandable systems? A good reference is the Automated Vision Association in Ann Arbor, Mich., a trade group comprised of North American machine vision manufacturers. Reputable suppliers caution users to do their homework before making an investment. Important considerations include comprehensive details on the objects to be viewed, that is, quantity, shape, dimension, size, and configuration details; lighting characteristics and variations; and component orientation details. Then, what do you expect the system to do: inspect, locate components, or aid in robotic vision? Other criteria include system speed and related accuracy and reliability. What are the projected benefits and system paybacks? Examine primarily paybacks associated with scrap and rework reduction as well as reduced warranty costs.

  1. Exploring Deep Learning and Sparse Matrix Format Selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Y.; Liao, C.; Shen, X.

    We proposed to explore the use of Deep Neural Networks (DNN) for addressing these longstanding barriers. The recent rapid progress of DNN technology has had a large impact in many fields, significantly improving prediction accuracy over traditional machine learning techniques in image classification, speech recognition, machine translation, and so on. To some degree, these tasks resemble the decision making in many HPC tasks, including the aforementioned format selection for SpMV and linear solver selection. For instance, sparse matrix format selection is akin to image classification (e.g., telling whether an image contains a dog or a cat); in both problems, the right decisions are primarily determined by the spatial patterns of the elements in an input. For image classification, the patterns are of pixels, and for sparse matrix format selection, they are of non-zero elements. DNN could be naturally applied if we regard a sparse matrix as an image and the format selection or solver selection as classification problems.
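
    A hedged illustration of the idea of treating a sparse matrix's non-zero pattern like an image: the sketch below rasterizes the pattern into a fixed-size density grid that a CNN (or any classifier) could consume. The grid size and the numpy/scipy plumbing are illustrative assumptions; the study's actual DNN architecture and features are not reproduced here.

        # Rasterize a sparse matrix's non-zero pattern into a fixed-size "image"
        # (density histogram), the kind of input a CNN format-selector could use.
        # Illustrative only; not the network or features used in the study.
        import numpy as np
        from scipy import sparse

        def pattern_image(A, size=32):
            A = sparse.coo_matrix(A)
            rows = (A.row * size // A.shape[0]).astype(int)
            cols = (A.col * size // A.shape[1]).astype(int)
            img = np.zeros((size, size))
            np.add.at(img, (rows, cols), 1.0)
            return img / max(img.max(), 1.0)   # normalize densities to [0, 1]

        A = sparse.random(1000, 1000, density=0.001, format="coo", random_state=0)
        img = pattern_image(A)
        print(img.shape, float(img.max()))     # 32x32 "image" ready for a classifier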

  2. APS deposition facility upgrades and future plans

    NASA Astrophysics Data System (ADS)

    Conley, Ray; Shi, Bing; Erdmann, Mark; Izzo, Scott; Assoufid, Lahsen; Goetze, Kurt; Mooney, Tim; Lauer, Kenneth

    2014-09-01

    The Advanced Photon Source (APS) has recently invested resources to upgrade or replace aging deposition systems with modern equipment. Of the three existing deposition systems, one will receive an upgrade, while two are being replaced. A design which adds a three-substrate planetary for the APS rotary deposition system is almost complete. The replacement for the APS large deposition system, dubbed the "Modular Deposition System", has been conceptually designed and is in the procurement process. Eight cathodes will sputter horizontally on mirrors up to 1.5 meters in length. This new instrument is designed to interface with ion-milling instruments and various metrology equipment for ion-beam figuring. A third linear machine, called the APS Profile Coating System, has two cathodes and is designed to accept substrates up to 200mm in length. While this machine is primarily intended for fabrication of figured KB mirrors using the profile-coating technique, it has also been used to produce multilayer monochromators for beamline use.

  3. Transparent process migration: Design alternatives and the Sprite implementation

    NASA Technical Reports Server (NTRS)

    Douglis, Fred; Ousterhout, John

    1991-01-01

    The Sprite operating system allows executing processes to be moved between hosts at any time. We use this process migration mechanism to offload work onto idle machines, and also to evict migrated processes when idle workstations are reclaimed by their owners. Sprite's migration mechanism provides a high degree of transparency both for migrated processes and for users. Idle machines are identified, and eviction is invoked, automatically by daemon processes. On Sprite it takes up to a few hundred milliseconds on SPARCstation 1 workstations to perform a remote exec, while evictions typically occur in a few seconds. The pmake program uses remote invocation to invoke tasks concurrently. Compilations commonly obtain speedup factors in the range of three to six; they are limited primarily by contention for centralized resources such as file servers. CPU-bound tasks such as simulations can make more effective use of idle hosts, obtaining as much as eight-fold speedup over a period of hours. Process migration has been in regular service for over two years.

  4. The Performance of the NAS HSPs in 1st Half of 1994

    NASA Technical Reports Server (NTRS)

    Bergeron, Robert J.; Walter, Howard (Technical Monitor)

    1995-01-01

    During the first six months of 1994, the NAS (Numerical Aerodynamic Simulation) 16-CPU Y-MP C90 Von Neumann (VN) delivered an average throughput of 4.045 GFLOPS while the ACSF (Aeronautics Consolidated Supercomputer Facility) 8-CPU Y-MP C90 Eagle averaged 1.658 GFLOPS. The VN rate represents a machine efficiency of 26.3% whereas the Eagle rate corresponds to a machine efficiency of 21.6%. VN displayed a greater efficiency than Eagle primarily because the stronger workload demand for its CPU cycles allowed it to devote more time to user programs and less time to idle. An additional factor increasing VN efficiency was the ability of the UNICOS 8.0 Operating System to deliver a larger fraction of CPU time to user programs. Although measurements indicate increasing vector length for both workloads, insufficient vector lengths continue to hinder HSP (High Speed Processor) performance. To improve HSP performance, NAS should continue to encourage the HSP users to modify their codes to increase program vector length.
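
    The efficiency figures quoted above are consistent with dividing delivered throughput by an assumed peak of roughly 0.96 GFLOPS per C90 CPU; that per-CPU peak is our assumption for the check below, not a number stated in the abstract.

        # Rough check of the quoted machine efficiencies, assuming a peak of about
        # 0.96 GFLOPS per Y-MP C90 CPU (our assumption, not stated in the abstract).
        peak_per_cpu = 0.96  # GFLOPS, assumed

        for name, cpus, delivered in [("VN", 16, 4.045), ("Eagle", 8, 1.658)]:
            efficiency = delivered / (cpus * peak_per_cpu)
            print(f"{name}: {efficiency:.1%}")   # roughly 26.3% and 21.6%, matching the text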

  5. Predicting and explaining inflammation in Crohn's disease patients using predictive analytics methods and electronic medical record data.

    PubMed

    Reddy, Bhargava K; Delen, Dursun; Agrawal, Rupesh K

    2018-01-01

    Crohn's disease is among the chronic inflammatory bowel diseases that impact the gastrointestinal tract. Understanding and predicting the severity of inflammation in real-time settings is critical to disease management. Extant literature has primarily focused on studies that are conducted in clinical trial settings to investigate the impact of a drug treatment on the remission status of the disease. This research proposes an analytics methodology where three different types of prediction models are developed to predict and to explain the severity of inflammation in patients diagnosed with Crohn's disease. The results show that machine-learning-based analytic methods such as gradient boosting machines can predict the inflammation severity with a very high accuracy (area under the curve = 92.82%), followed by regularized regression and logistic regression. According to the findings, a combination of baseline laboratory parameters, patient demographic characteristics, and disease location are among the strongest predictors of inflammation severity in Crohn's disease patients.
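
    A minimal sketch of the gradient-boosting approach named above, using scikit-learn with synthetic data standing in for the electronic medical record features (laboratory parameters, demographics, disease location); the AUC reported in the abstract is of course not reproduced by this toy.

        # Gradient boosting classifier with AUC evaluation, as a generic sketch of
        # the approach described above. Data are synthetic stand-ins for EMR features.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        gbm = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, gbm.predict_proba(X_te)[:, 1])
        print(f"test AUC: {auc:.3f}")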

  6. Can a Smartphone Diagnose Parkinson Disease? A Deep Neural Network Method and Telediagnosis System Implementation.

    PubMed

    Zhang, Y N

    2017-01-01

    Parkinson's disease (PD) is primarily diagnosed by clinical examinations, such as walking tests, handwriting tests, and MRI diagnostics. In this paper, we propose a machine learning based PD telediagnosis method for smartphones. Classification of PD using speech records is a challenging task because the classification accuracy is still lower than doctor-level. Here we demonstrate automatic classification of PD using time-frequency features, stacked autoencoders (SAE), and a K nearest neighbor (KNN) classifier. The KNN classifier can produce promising classification results from the useful representations learned by the SAE. Empirical results show that the proposed method achieves better performance in all tested cases across classification tasks, demonstrating that machine learning can classify PD with a level of competence comparable to a doctor. The paper concludes that a smartphone can therefore potentially provide low-cost PD diagnostic care. This paper also gives an implementation on a browser/server system and reports the running time cost. Both advantages and disadvantages of the proposed telediagnosis system are discussed.

  7. Can a Smartphone Diagnose Parkinson Disease? A Deep Neural Network Method and Telediagnosis System Implementation

    PubMed Central

    2017-01-01

    Parkinson's disease (PD) is primarily diagnosed by clinical examinations, such as walking tests, handwriting tests, and MRI diagnostics. In this paper, we propose a machine learning based PD telediagnosis method for smartphones. Classification of PD using speech records is a challenging task because the classification accuracy is still lower than doctor-level. Here we demonstrate automatic classification of PD using time-frequency features, stacked autoencoders (SAE), and a K nearest neighbor (KNN) classifier. The KNN classifier can produce promising classification results from the useful representations learned by the SAE. Empirical results show that the proposed method achieves better performance in all tested cases across classification tasks, demonstrating that machine learning can classify PD with a level of competence comparable to a doctor. The paper concludes that a smartphone can therefore potentially provide low-cost PD diagnostic care. This paper also gives an implementation on a browser/server system and reports the running time cost. Both advantages and disadvantages of the proposed telediagnosis system are discussed. PMID:29075547

  8. Prediction of outcome in internet-delivered cognitive behaviour therapy for paediatric obsessive-compulsive disorder: A machine learning approach.

    PubMed

    Lenhard, Fabian; Sauer, Sebastian; Andersson, Erik; Månsson, Kristoffer Nt; Mataix-Cols, David; Rück, Christian; Serlachius, Eva

    2018-03-01

    There are no consistent predictors of treatment outcome in paediatric obsessive-compulsive disorder (OCD). One reason for this might be the use of suboptimal statistical methodology. Machine learning is an approach to efficiently analyse complex data. Machine learning has been widely used within other fields, but has rarely been tested in the prediction of paediatric mental health treatment outcomes. The aim was to test four different machine learning methods in the prediction of treatment response in a sample of paediatric OCD patients who had received Internet-delivered cognitive behaviour therapy (ICBT). Participants were 61 adolescents (12-17 years) who enrolled in a randomized controlled trial and received ICBT. All clinical baseline variables were used to predict strictly defined treatment response status three months after ICBT. Four machine learning algorithms were implemented. For comparison, we also employed a traditional logistic regression approach. Multivariate logistic regression could not detect any significant predictors. In contrast, all four machine learning algorithms performed well in the prediction of treatment response, with 75 to 83% accuracy. The results suggest that machine learning algorithms can successfully be applied to predict paediatric OCD treatment outcome. Validation studies and studies in other disorders are warranted. Copyright © 2017 John Wiley & Sons, Ltd.
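
    A sketch of the kind of comparison the abstract describes: several machine learning classifiers and a plain logistic regression evaluated with cross-validation on the same baseline variables. The specific algorithms, data, and the 75 to 83 percent figures are not reproduced; everything below is illustrative.

        # Cross-validated comparison of several classifiers against logistic regression,
        # in the spirit of the study above. Synthetic data; results are illustrative.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
        from sklearn.svm import SVC
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=61, n_features=15, random_state=1)
        models = {
            "logistic regression": LogisticRegression(max_iter=1000),
            "random forest": RandomForestClassifier(random_state=1),
            "gradient boosting": GradientBoostingClassifier(random_state=1),
            "SVM": SVC(),
        }
        for name, model in models.items():
            acc = cross_val_score(model, X, y, cv=5).mean()
            print(f"{name}: {acc:.2f} mean CV accuracy")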

  9. Vowel Imagery Decoding toward Silent Speech BCI Using Extreme Learning Machine with Electroencephalogram

    PubMed Central

    Kim, Jongin; Park, Hyeong-jun

    2016-01-01

    The purpose of this study is to classify EEG data on imagined speech in a single trial. We recorded EEG data while five subjects imagined different vowels, /a/, /e/, /i/, /o/, and /u/. We divided each single trial dataset into thirty segments and extracted features (mean, variance, standard deviation, and skewness) from all segments. To reduce the dimension of the feature vector, we applied a feature selection algorithm based on the sparse regression model. These features were classified using a support vector machine with a radial basis function kernel, an extreme learning machine, and two variants of an extreme learning machine with different kernels. Because each single trial consisted of thirty segments, our algorithm decided the label of the single trial by selecting the most frequent output among the outputs of the thirty segments. As a result, we observed that the extreme learning machine and its variants achieved better classification rates than the support vector machine with a radial basis function kernel and linear discrimination analysis. Thus, our results suggested that EEG responses to imagined speech could be successfully classified in a single trial using an extreme learning machine with a radial basis function and linear kernel. This study with classification of imagined speech might contribute to the development of silent speech BCI systems. PMID:28097128
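
    A small sketch of the per-segment feature extraction and majority vote described above: each trial is split into thirty segments, four statistics are computed per segment, a classifier labels each segment, and the trial label is the most frequent segment label. The channel count, segment length, classifier choice, and synthetic data are all assumptions made for illustration.

        # Per-segment features (mean, variance, std, skewness) and majority voting over
        # 30 segments per trial, sketched from the description above. Synthetic "EEG".
        import numpy as np
        from scipy.stats import skew
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        n_trials, n_segments, seg_len = 40, 30, 64   # assumed sizes, single channel

        def segment_features(segment):
            return [segment.mean(), segment.var(), segment.std(), skew(segment)]

        # Synthetic trials for two imagined vowels (class 0 and 1)
        trials = rng.normal(size=(n_trials, n_segments, seg_len))
        labels = np.repeat([0, 1], n_trials // 2)
        trials[labels == 1] += 0.3                   # weak class difference

        X = np.array([[segment_features(s) for s in trial] for trial in trials])
        clf = SVC(kernel="rbf").fit(X.reshape(-1, 4), np.repeat(labels, n_segments))

        # Trial label = most frequent label among its 30 segment predictions
        # (evaluated on the training trials purely to show the voting step)
        seg_pred = clf.predict(X.reshape(-1, 4)).reshape(n_trials, n_segments)
        trial_pred = (seg_pred.mean(axis=1) > 0.5).astype(int)
        print("trial-level accuracy:", (trial_pred == labels).mean())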

  10. The universal numbers. From Biology to Physics.

    PubMed

    Marchal, Bruno

    2015-12-01

    I will explain how the mathematicians have discovered the universal numbers, or abstract computer, and I will explain some abstract biology, mainly self-reproduction and embryogenesis. Then I will explain how and why, and in which sense, some of those numbers can dream and why their dreams can glue together and must, when we assume computationalism in cognitive science, generate a phenomenological physics, as part of a larger phenomenological theology (in the sense of the greek theologians). The title should have been "From Biology to Physics, through the Phenomenological Theology of the Universal Numbers", if that was not too long for a title. The theology will consist mainly, like in some (neo)platonist greek-indian-chinese tradition, in the truth about numbers' relative relations, with each other, and with themselves. The main difference between Aristotle and Plato is that Aristotle (especially in its common and modern christian interpretation) makes reality WYSIWYG (What you see is what you get: reality is what we observe, measure, i.e. the natural material physical science) where for Plato and the (rational) mystics, what we see might be only the shadow or the border of something else, which might be non physical (mathematical, arithmetical, theological, …). Since Gödel, we know that Truth, even just the Arithmetical Truth, is vastly bigger than what the machine can rationally justify. Yet, with Church's thesis, and the mechanizability of the diagonalizations involved, machines can apprehend this and can justify their limitations, and get some sense of what might be true beyond what they can prove or justify rationally. Indeed, the incompleteness phenomenon introduces a gap between what is provable by some machine and what is true about that machine, and, as Gödel saw already in 1931, the existence of that gap is accessible to the machine itself, once it has enough provability abilities. Incompleteness separates the true from the provable, and machines can justify this in some way. More importantly, incompleteness entails the distinction between many intensional variants of provability. For example, the absence of reflexion (beweisbar(⌜A⌝) → A with beweisbar being Gödel's provability predicate) makes it impossible for the machine's provability to obey the axioms usually taken for a theory of knowledge. The most important consequence of this in the machine's possible phenomenology is that it provides sense, indeed arithmetical sense, to intensional variants of provability, like the logics of provability-and-truth, which at the propositional level can be mirrored by the logic of provable-and-true statements (beweisbar(⌜A⌝) ∧ A). It is incompleteness which makes this logic different from the logic of provability. Other variants, like provable-and-consistent, or provable-and-consistent-and-true, appear in the same way, and inherit the incompleteness splitting, unlike beweisbar(⌜A⌝) ∧ A. I will recall the thought experiments which motivate the use of those intensional variants to associate a knower and an observer in some canonical way to the machines or the numbers. We will in this way get an abstract and phenomenological theology of a machine M through the true logics of its true self-referential abilities (even if not provable, or knowable, by the machine itself), in those different intensional senses.
Cognitive science and theoretical physics motivate the study of those logics with the arithmetical interpretation of the atomic sentences restricted to the "verifiable" (Σ1) sentences, which is the way to study the theology of the computationalist machine. This provides a logic of the observable, as expected by the Universal Dovetailer Argument, which will be recalled briefly, and which can lead to a comparison of the machine's logic of physics with the empirical logic of the physicists (like quantum logic). This leads also to a series of open problems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. A study of the effectiveness of machine learning methods for classification of clinical interview fragments into a large number of categories.

    PubMed

    Hasan, Mehedi; Kotov, Alexander; Carcone, April; Dong, Ming; Naar, Sylvie; Hartlieb, Kathryn Brogan

    2016-08-01

    This study examines the effectiveness of state-of-the-art supervised machine learning methods in conjunction with different feature types for the task of automatic annotation of fragments of clinical text based on codebooks with a large number of categories. We used a collection of motivational interview transcripts consisting of 11,353 utterances, which were manually annotated by two human coders as the gold standard, and experimented with state-of-art classifiers, including Naïve Bayes, J48 Decision Tree, Support Vector Machine (SVM), Random Forest (RF), AdaBoost, DiscLDA, Conditional Random Fields (CRF) and Convolutional Neural Network (CNN) in conjunction with lexical, contextual (label of the previous utterance) and semantic (distribution of words in the utterance across the Linguistic Inquiry and Word Count dictionaries) features. We found out that, when the number of classes is large, the performance of CNN and CRF is inferior to SVM. When only lexical features were used, interview transcripts were automatically annotated by SVM with the highest classification accuracy among all classifiers of 70.8%, 61% and 53.7% based on the codebooks consisting of 17, 20 and 41 codes, respectively. Using contextual and semantic features, as well as their combination, in addition to lexical ones, improved the accuracy of SVM for annotation of utterances in motivational interview transcripts with a codebook consisting of 17 classes to 71.5%, 74.2%, and 75.1%, respectively. Our results demonstrate the potential of using machine learning methods in conjunction with lexical, semantic and contextual features for automatic annotation of clinical interview transcripts with near-human accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
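
    A compact sketch of an SVM utterance classifier that combines lexical (bag-of-words) features with a contextual feature (the label of the previous utterance), along the lines described above. The toy utterances, codes, and feature plumbing are invented for illustration and are not the study's codebooks or data.

        # SVM annotation of utterances using lexical features plus the previous
        # utterance's label as a contextual feature. Toy data, illustrative only.
        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.svm import LinearSVC
        from scipy.sparse import hstack, csr_matrix

        utterances = ["how are you feeling today", "i want to eat healthier",
                      "that sounds like a good plan", "tell me more about that",
                      "i am not sure i can do it", "you can start with small steps"]
        codes      = ["QUESTION", "CHANGE_TALK", "REFLECTION",
                      "QUESTION", "SUSTAIN_TALK", "REFLECTION"]

        vec = CountVectorizer()
        X_lex = vec.fit_transform(utterances)

        # Contextual feature: index of the previous utterance's code (first = -1)
        code_ids = {c: i for i, c in enumerate(sorted(set(codes)))}
        prev = np.array([[-1]] + [[code_ids[c]] for c in codes[:-1]])
        X = hstack([X_lex, csr_matrix(prev)], format="csr")

        clf = LinearSVC().fit(X, codes)
        print(clf.predict(X[:2]))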

  12. Information about Student Enrollment, College Staff and the Budget.

    ERIC Educational Resources Information Center

    College of the Canyons, Santa Clarita, CA. Office of Institutional Development.

    Consisting primarily of charts and tables, this report provides historical data on student enrollment, college staff, and the budget at California's College of the Canyons, focusing primarily on the period from 1991 to 1995. The first section provides tables on student enrollment, including total headcount; enrollment by full-/part-time status,…

  13. Espresso coffee foam delays cooling of the liquid phase.

    PubMed

    Arii, Yasuhiro; Nishizawa, Kaho

    2017-04-01

    Espresso coffee foam, called crema, is known to be a marker of the quality of espresso coffee extraction. However, the role of foam in coffee temperature has not been quantitatively clarified. In this study, we used an automatic machine for espresso coffee extraction. We evaluated whether the foam prepared using the machine was suitable for foam analysis. After extraction, the percentage and consistency of the foam were measured using various techniques, and changes in the foam volume were tracked over time. Our extraction method, therefore, allowed consistent preparation of high-quality foam. We also quantitatively determined that the foam phase slowed cooling of the liquid phase after extraction. High-quality foam plays an important role in delaying the cooling of espresso coffee.

  14. Use of computer systems and process information for blast furnace operations at U. S. Steel, Gary Works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sherman, G.J.; Zmierski, M.L.

    1994-09-01

    US Steel Iron Producing Div. consists of four operating blast furnaces ranging in process control capabilities from 1950's and 1960's era hardware to state of the art technology. The oldest control system consists of a large number of panels containing numerous relays, indicating lights, selector switches, push buttons, analog controllers, strip chart recorders and annunciators. In contrast, the state of the art control system utilizes remote I/O, two sets of redundant PLC's, redundant charge director computer, redundant distributed control system, high resolution video-graphic display system and supervisory computer for real-time data acquisition. Process data are collected and archived on two DEC VAX computers, one for No. 13 blast furnace and the other for the three south end furnaces. Historical trending, data analysis and reporting are available to iron producing personnel through terminals and PC's connected directly to the systems, dial-up modems and various network configurations. These two machines are part of the iron producing network which allows them to pass and receive information from each other as well as numerous other sources throughout the division. This configuration allows personnel to access most pertinent furnace information from a single source. The basic objective of the control systems is to charge raw materials to the top of the furnace at aim weights and sequence, while maintaining blast conditions at the bottom of the furnace at required temperature, pressure and composition. Control changes by the operators are primarily supervisory based on review of system generated plots and tables.

  15. Effet de l'usinage sur les proprietes mecaniques en tension et controle non-destructif des materiaux composites

    NASA Astrophysics Data System (ADS)

    Genereux, Louis-Alexandre

    The main goal of this work is to evaluate the impact of milling operations on the integrity of unidirectional carbon/epoxy laminates. Milling, often used for finishing composite structures, causes some damage in the form of craters, cracks and thermal damage to the matrix. Here, two approaches are used to qualify and quantify the amount of damage. First, two nondestructive testing methods, namely immersion ultrasonic inspection and pulsed thermography, are evaluated on samples with artificial defects. These techniques are then used on machined samples with realistic machining damage. Only ultrasonic inspection allowed the detection and quantification of the machining damage, and only if the damage is at the surface of the laminate. The depth of damage depends primarily on the fiber orientation of the first ply with respect to the cutting direction. The ultrasonic inspections are also accompanied by scanning electron microscope observations. The second approach is to check whether the presence of the machining damage affects the mechanical properties of the laminate. To do this, static tensile tests are performed on samples prepared by three different methods, namely by abrasive diamond saw, by saw cut followed by sanding, and finally by milling. The results show that the damage caused by the milling operation is not severe enough to affect the ultimate stress and elastic modulus. Despite this, it would be interesting, for future work, to investigate this aspect in fatigue rather than with static tests. The presence of damage on the edge might promote delamination during cyclic loads.

  16. UPEML Version 3.0: A machine-portable CDC update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Haill, T.A.

    1992-04-01

    UPEML is a machine-portable program that emulates a subset of the functions of the standard CDC Update. Machine-portability has been achieved by conforming to ANSI standards for Fortran-77. UPEML is compact and fairly efficient; however, it only allows a restricted syntax as compared with the CDC Update. This program was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX/VMS mainframes and UNIX workstations. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both UNICOS and CTSS operating systems, and on Sun, HP, Stardent and IBM workstations. UPEML was originally released with the ITS electron/photon Monte Carlo transport package, which was developed on a CDC-7600 and makes extensive use of conditional file structure to combine several problem geometry and machine options into a single program file. UPEML 3.0 is an enhanced version of the original code and is being independently released for use at any installation or with any code package. Version 3.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the complete file. Version 3.0 also checks for overlapping corrections, allows processing of nested calls to common decks, and allows the use of alternate files in READ and ADDFILE commands. Finally, UPEML Version 3.0 allows the assignment of input and output files at runtime on the control line.

  18. 36 CFR 910.2 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: public improvements construction and square development. Public improvements construction consists of... configuration, and pedestrian amenities. Square development consists of design and construction of development projects primarily on city blocks, known as squares, within the Development Area. These development...

  19. 36 CFR 910.2 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: public improvements construction and square development. Public improvements construction consists of... configuration, and pedestrian amenities. Square development consists of design and construction of development projects primarily on city blocks, known as squares, within the Development Area. These development...

  20. 36 CFR 910.2 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: public improvements construction and square development. Public improvements construction consists of... configuration, and pedestrian amenities. Square development consists of design and construction of development projects primarily on city blocks, known as squares, within the Development Area. These development...

  1. 36 CFR 910.2 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: public improvements construction and square development. Public improvements construction consists of... configuration, and pedestrian amenities. Square development consists of design and construction of development projects primarily on city blocks, known as squares, within the Development Area. These development...

  2. A Solution Method of Scheduling Problem with Worker Allocation by a Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Osawa, Akira; Ida, Kenichi

    In the scheduling problem with worker allocation (SPWA) proposed by Iima et al., the worker's skill level on each machine is assumed to be the same for all workers. In the real world, however, each worker has a different skill level on each machine. For that reason, we propose a new model of SPWA in which each worker has a different skill level on each machine. To solve the problem, we propose a new GA for SPWA consisting of the following three new procedures: shortening of idle time, repairing infeasible solutions into feasible solutions, and a new selection method for the GA. The effectiveness of the proposed algorithm is demonstrated by numerical experiments using benchmark problems for job-shop scheduling.

  3. Mind, Machine, and Creativity: An Artist's Perspective.

    PubMed

    Sundararajan, Louise

    2014-06-01

    Harold Cohen is a renowned painter who has developed a computer program, AARON, to create art. While AARON has been hailed as one of the most creative AI programs, Cohen consistently rejects the claims of machine creativity. Questioning the possibility for AI to model human creativity, Cohen suggests in so many words that the human mind takes a different route to creativity, a route that privileges the relational, rather than the computational, dimension of cognition. This unique perspective on the tangled web of mind, machine, and creativity is explored by an application of three relational models of the mind to an analysis of Cohen's talks and writings, which are available on his website: www.aaronshome.com.

  4. Mind, Machine, and Creativity: An Artist's Perspective

    PubMed Central

    Sundararajan, Louise

    2014-01-01

    Harold Cohen is a renowned painter who has developed a computer program, AARON, to create art. While AARON has been hailed as one of the most creative AI programs, Cohen consistently rejects the claims of machine creativity. Questioning the possibility for AI to model human creativity, Cohen suggests in so many words that the human mind takes a different route to creativity, a route that privileges the relational, rather than the computational, dimension of cognition. This unique perspective on the tangled web of mind, machine, and creativity is explored by an application of three relational models of the mind to an analysis of Cohen's talks and writings, which are available on his website: www.aaronshome.com. PMID:25541564

  5. Joint optimization of maintenance, buffers and machines in manufacturing lines

    NASA Astrophysics Data System (ADS)

    Nahas, Nabil; Nourelfath, Mustapha

    2018-01-01

    This article considers a series manufacturing line composed of several machines separated by intermediate buffers of finite capacity. The goal is to find the optimal number of preventive maintenance actions performed on each machine, the optimal selection of machines and the optimal buffer allocation plan that minimize the total system cost, while providing the desired system throughput level. The mean times between failures of all machines are assumed to increase when applying periodic preventive maintenance. To estimate the production line throughput, a decomposition method is used. The decision variables in the formulated optimal design problem are buffer levels, types of machines and times between preventive maintenance actions. Three heuristic approaches are developed to solve the formulated combinatorial optimization problem. The first heuristic consists of a genetic algorithm, the second is based on the nonlinear threshold accepting metaheuristic and the third is an ant colony system. The proposed heuristics are compared and their efficiency is shown through several numerical examples. It is found that the nonlinear threshold accepting algorithm outperforms the genetic algorithm and ant colony system, while the genetic algorithm provides better results than the ant colony system for longer manufacturing lines.
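
    A toy sketch of the threshold-accepting idea that the article finds most effective, applied to a simplified buffer-allocation subproblem with a made-up cost and throughput model; the real formulation uses a decomposition method to estimate line throughput and also optimizes machine selection and maintenance intervals, none of which is reproduced here.

        # Threshold accepting over buffer allocations for a short line, with a toy
        # cost/throughput model standing in for the decomposition method of the paper.
        import random

        random.seed(0)
        n_buffers, max_cap, target_throughput = 4, 20, 0.85

        def throughput(buffers):            # toy surrogate: diminishing returns per buffer
            return min(0.95, 0.5 + 0.03 * sum(min(b, 10) for b in buffers))

        def cost(buffers):                  # buffer cost + penalty if throughput target missed
            penalty = 1000.0 if throughput(buffers) < target_throughput else 0.0
            return sum(buffers) + penalty

        current = [max_cap] * n_buffers
        best = list(current)
        threshold = 5.0
        for it in range(2000):
            neighbor = list(current)
            i = random.randrange(n_buffers)
            neighbor[i] = min(max_cap, max(1, neighbor[i] + random.choice([-1, 1])))
            if cost(neighbor) <= cost(current) + threshold:   # accept within threshold
                current = neighbor
            if cost(current) < cost(best):
                best = list(current)
            threshold *= 0.999                                # shrink threshold over time
        print("best buffers:", best, "cost:", cost(best), "throughput:", round(throughput(best), 3))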

  6. University of Maryland walking robot: A design project for undergraduate students

    NASA Technical Reports Server (NTRS)

    Olsen, Bob; Bielec, Jim; Hartsig, Dave; Oliva, Mani; Grotheer, Phil; Hekmat, Morad; Russell, David; Tavakoli, Hossein; Young, Gary; Nave, Tom

    1990-01-01

    The design and construction required that the walking robot machine be capable of completing a number of tasks including walking in a straight line, turning to change direction, and maneuvering over an obstacle such as a set of stairs. The machine consists of two sets of four telescoping legs that alternately support the entire structure. A gear-box and crank-arm assembly is connected to the leg sets to provide the power required for the translational motion of the machine. By retracting all eight legs, the robot comes to rest on a central Bigfoot support. Turning is accomplished by rotating the machine about this support. The machine can be controlled by using either a user-operated remote tether or the on-board computer for the execution of control commands. Absolute encoders are attached to all motors (leg, main drive, and Bigfoot) to provide the control computer with information regarding the status of the motors (up-down motion, forward or reverse rotation). Long and short range infrared sensors provide the computer with feedback information regarding the machine's relative position to a series of stripes and reflectors. These infrared sensors simulate how the robot might sense and gain information about the environment of Mars.

  7. Hybrid Power Management for Office Equipment

    NASA Astrophysics Data System (ADS)

    Gingade, Ganesh P.

    Office machines (such as printers, scanners, fax machines, and copiers) can consume significant amounts of power. Few studies have been devoted to power management of office equipment. Most office machines have sleep modes to save power. Power management of these machines is usually timeout-based: a machine sleeps after being idle long enough. Setting the timeout duration can be difficult: if it is too long, the machine wastes power during idleness. If it is too short, the machine sleeps too soon and too often, and the wakeup delay can significantly degrade productivity. Thus, power management is a tradeoff between saving energy and keeping response times short. Many power management policies have been published and one policy may outperform another in some scenarios. There is no definite conclusion about which policy is always better. This thesis describes two methods for office equipment power management. The first method adaptively reduces power based on a constraint of the wakeup delay. The second method is a hybrid with multiple candidate policies and it selects the most appropriate power management policy. Using six months of request traces from 18 different offices, we demonstrate that the hybrid policy outperforms individual policies. We also discover that power management based on business hours does not produce consistent energy savings.
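
    A small simulation sketch of the timeout-based policy and the energy/delay trade-off described above: longer timeouts waste idle power, shorter timeouts cause more wakeups. The power numbers, wakeup penalty, and request trace below are made up for illustration and are not drawn from the thesis's office traces.

        # Simulate a timeout-based sleep policy over a synthetic request trace and
        # report energy used and number of delayed requests. Numbers are illustrative.
        import random

        random.seed(1)
        P_ACTIVE, P_SLEEP, WAKE_ENERGY = 50.0, 2.0, 200.0   # watts, watts, joules (assumed)

        # Synthetic trace: request arrival times (seconds) over one hour
        arrivals = sorted(random.uniform(0, 3600) for _ in range(40))

        def evaluate(timeout):
            energy, wakeups, last = 0.0, 0, 0.0
            for t in arrivals:
                idle = t - last
                if idle > timeout:                       # machine slept during this gap
                    energy += timeout * P_ACTIVE + (idle - timeout) * P_SLEEP + WAKE_ENERGY
                    wakeups += 1                         # this request sees a wakeup delay
                else:
                    energy += idle * P_ACTIVE            # stayed awake the whole gap
                last = t
            return energy, wakeups

        for timeout in (30, 300, 1800):
            e, w = evaluate(timeout)
            print(f"timeout {timeout:5d} s -> {e/1000:.0f} kJ, {w} delayed requests")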

  8. Tool life and surface integrity aspects when drilling nickel alloy

    NASA Astrophysics Data System (ADS)

    Kannan, S.; Pervaiz, S.; Vincent, S.; Karthikeyan, R.

    2018-04-01

    Nickel based super alloys manufactured through powder metallurgy (PM) route are required to increase the operational efficiency of gas turbine engines. They are material of choice for high pressure components due to their superior high temperature strength, excellent corrosion, oxidation and creep resistance. This unique combination of mechanical and thermal properties makes them even more difficult-to-machine. In this paper, the hole making process using coated carbide inserts by drilling and plunge milling for a nickel-based powder metallurgy super alloy has been investigated. Tool life and process capability studies were conducted using optimized process parameters using high pressure coolants. The experimental trials were directed towards an assessment of the tendency for surface malformations and detrimental residual stress profiles. Residual stresses in both the radial and circumferential directions have been evaluated as a function of depth from the machined surface using the target strain gauge / center hole drilling method. Circumferential stresses near workpiece surface and at depth of 512 µm in the starting material was primarily circumferential compression which was measured to be average of –404 MPa. However, the radial stresses near workpiece surface was tensile and transformed to be compressive in nature at depth of 512 µm in the starting material (average: -87 Mpa). The magnitude and the depth below the machined surface in both radial and circumferential directions were primarily tensile in nature which increased with hole number due to a rise of temperature at the tool–workpiece interface with increasing tool wear. These profiles are of critical importance for the selection of cutting strategies to ensure avoidance/minimization of tensile residual stresses that can be detrimental to the fatigue performance of the components. These results clearly show a tendency for the circumferential stresses to be more tensile than the radial stresses. Overall the results indicate that the effect of drilling and milling parameters is most marked in terms of surface quality in the circumferential direction. Material removal rates and tool flank wear must be maintained within the control limits to maintain hole integrity.

  9. Toward transient finite element simulation of thermal deformation of machine tools in real-time

    NASA Astrophysics Data System (ADS)

    Naumann, Andreas; Ruprecht, Daniel; Wensch, Joerg

    2018-01-01

    Finite element models without simplifying assumptions can accurately describe the spatial and temporal distribution of heat in machine tools as well as the resulting deformation. In principle, this allows to correct for displacements of the Tool Centre Point and enables high precision manufacturing. However, the computational cost of FE models and restriction to generic algorithms in commercial tools like ANSYS prevents their operational use since simulations have to run faster than real-time. For the case where heat diffusion is slow compared to machine movement, we introduce a tailored implicit-explicit multi-rate time stepping method of higher order based on spectral deferred corrections. Using the open-source FEM library DUNE, we show that fully coupled simulations of the temperature field are possible in real-time for a machine consisting of a stock sliding up and down on rails attached to a stand.
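
    As a toy illustration of implicit-explicit splitting for a slowly diffusing temperature field driven by a fast-moving heat source (the regime the paper targets with its multi-rate spectral-deferred-correction scheme), the sketch below uses plain first-order IMEX Euler on a 1D rod. It is not the higher-order method of the paper, and all sizes and coefficients are arbitrary.

        # First-order IMEX Euler for a 1D heat equation with a moving source:
        # diffusion treated implicitly, the fast-moving source explicitly.
        # A toy stand-in for the multi-rate SDC scheme described above.
        import numpy as np

        n, L, alpha, dt, steps = 50, 1.0, 1e-3, 0.05, 200
        dx = L / (n - 1)
        x = np.linspace(0.0, L, n)
        T = np.zeros(n)

        # Implicit operator (I - dt*alpha*Laplacian) with fixed-temperature ends
        A = np.eye(n)
        for i in range(1, n - 1):
            A[i, i - 1] = A[i, i + 1] = -dt * alpha / dx**2
            A[i, i] = 1.0 + 2.0 * dt * alpha / dx**2

        for k in range(steps):
            pos = 0.5 + 0.4 * np.sin(0.5 * k * dt)            # fast-moving heat source
            source = np.exp(-((x - pos) / 0.05) ** 2)          # Gaussian heat input
            rhs = T + dt * source                              # explicit source term
            rhs[0] = rhs[-1] = 0.0                             # boundary temperatures
            T = np.linalg.solve(A, rhs)                        # implicit diffusion step

        print("max temperature rise:", round(float(T.max()), 3))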

  10. Discovering Fine-grained Sentiment in Suicide Notes

    PubMed Central

    Wang, Wenbo; Chen, Lu; Tan, Ming; Wang, Shaojun; Sheth, Amit P.

    2012-01-01

    This paper presents our solution for the i2b2 sentiment classification challenge. Our hybrid system consists of machine learning and rule-based classifiers. For the machine learning classifier, we investigate a variety of lexical, syntactic and knowledge-based features, and show how much these features contribute to the performance of the classifier through experiments. For the rule-based classifier, we propose an algorithm to automatically extract effective syntactic and lexical patterns from training examples. The experimental results show that the rule-based classifier outperforms the baseline machine learning classifier using unigram features. By combining the machine learning classifier and the rule-based classifier, the hybrid system gains a better trade-off between precision and recall, and yields the highest micro-averaged F-measure (0.5038), which is better than the mean (0.4875) and median (0.5027) micro-average F-measures among all participating teams. PMID:22879770
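
    A minimal sketch of combining a rule-based classifier with a machine learning classifier and scoring with a micro-averaged F-measure, in the spirit of the hybrid system above. The rules, labels, toy data, and the rules-first back-off strategy are all invented for illustration.

        # Hybrid classifier: apply hand-written rules first, fall back to a unigram
        # ML model otherwise; score with micro-averaged F1. Toy data, illustrative.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.metrics import f1_score

        train_texts = ["i am so sorry", "i love you all", "why did this happen",
                       "thank you for everything", "i feel hopeless", "forgive me please"]
        train_labels = ["sorrow", "love", "sorrow", "thankfulness", "hopelessness", "forgiveness"]

        vec = CountVectorizer()
        ml = MultinomialNB().fit(vec.fit_transform(train_texts), train_labels)

        def rule_based(text):                       # invented lexical patterns
            if "thank" in text: return "thankfulness"
            if "forgive" in text: return "forgiveness"
            return None

        def hybrid_predict(texts):
            out = []
            for t in texts:
                label = rule_based(t)
                out.append(label if label else ml.predict(vec.transform([t]))[0])
            return out

        test_texts = ["thank you so much", "i feel hopeless tonight"]
        test_labels = ["thankfulness", "hopelessness"]
        print("micro-F1:", f1_score(test_labels, hybrid_predict(test_texts), average="micro"))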

  11. Be Stars in the Open Cluster NGC 6830

    NASA Astrophysics Data System (ADS)

    Yu, Po-Chieh; Lin, Chien-Cheng; Lin, Hsing-Wen; Lee, Chien-De; Konidaris, Nick; Ngeow, Chow-Choong; Ip, Wing-Huen; Chen, Wen-Ping; Chen, Hui-Chen; Malkan, Matthew A.; Chang, Chan-Kao; Laher, Russ; Huang, Li-Ching; Cheng, Yu-Chi; Edelson, Rick; Ritter, Andreas; Quimby, Robert; Ben-Ami, Sagi; Ofek, Eran. O.; Surace, Jason; Kulkarni, Shrinivas R.

    2016-05-01

    We report the discovery of two new Be stars, and re-identify one known Be star in the open cluster NGC 6830. Eleven Hα emitters were discovered using the Hα imaging photometry of the Palomar Transient Factory Survey. Stellar membership of the candidates was verified with photometric and kinematic information using 2MASS data and proper motions. The spectroscopic confirmation was carried out by using the Shane 3 m telescope at the Lick observatory. Based on their spectral types, three Hα emitters were confirmed as Be stars with Hα equivalent widths greater than -10 Å. Two objects were also observed by the new spectrograph spectral energy distribution-machine (SED-machine) on the Palomar 60-inch Telescope. The SED-machine results show strong Hα emission lines, which are consistent with the results of the Lick observations. The high efficiency of the SED-machine can provide rapid observations for Be stars in a comprehensive survey in the future.

  12. Coal-Quality Information - Key to the Efficient and Environmentally Sound Use of Coal

    USGS Publications Warehouse

    Finkleman, Robert B.

    1997-01-01

    The rock that we refer to as coal is derived principally from decomposed organic matter (plants) consisting primarily of the element carbon. When coal is burned, it produces energy in the form of heat, which is used to power machines such as steam engines or to drive turbines that produce electricity. Almost 60 percent of the electricity produced in the United States is derived from coal combustion. Coal is an extraordinarily complex material. In addition to organic matter, coal contains water (up to 40 or more percent by weight for some lignitic coals), oils, gases (such as methane), waxes (used to make shoe polish), and perhaps most importantly, inorganic matter (fig. 1). The inorganic matter--minerals and trace elements--cause many of the health, environmental, and technological problems attributed to coal use (fig. 2). 'Coal quality' is the term used to refer to the properties and characteristics of coal that influence its behavior and use. Among the coal-quality characteristics that will be important for future coal use are the concentrations, distribution, and forms of the many elements contained in the coal that we intend to burn. Knowledge of these quality characteristics in U.S. coal deposits may allow us to use this essential energy resource more efficiently and effectively and with less undesirable environmental impact.

  13. Decoding of top-down cognitive processing for SSVEP-controlled BMI

    PubMed Central

    Min, Byoung-Kyong; Dähne, Sven; Ahn, Min-Hee; Noh, Yung-Kyun; Müller, Klaus-Robert

    2016-01-01

    We present a fast and accurate non-invasive brain-machine interface (BMI) based on demodulating steady-state visual evoked potentials (SSVEPs) in electroencephalography (EEG). Our study reports an SSVEP-BMI that, for the first time, decodes primarily based on top-down and not bottom-up visual information processing. The experimental setup presents a grid-shaped flickering line array that the participants observe while intentionally attending to a subset of flickering lines representing the shape of a letter. While the flickering pixels stimulate the participant’s visual cortex uniformly with equal probability, the participant’s intention groups the strokes and thus perceives a ‘letter Gestalt’. We observed decoding accuracy of 35.81% (up to 65.83%) with a regularized linear discriminant analysis; on average 2.05-fold, and up to 3.77-fold greater than chance levels in multi-class classification. Compared to the EEG signals, an electrooculogram (EOG) did not significantly contribute to decoding accuracies. Further analysis reveals that the top-down SSVEP paradigm shows the most focalised activation pattern around occipital visual areas; Granger causality analysis consistently revealed prefrontal top-down control over early visual processing. Taken together, the present paradigm provides the first neurophysiological evidence for the top-down SSVEP BMI paradigm, which potentially enables multi-class intentional control of EEG-BMIs without using gaze-shifting. PMID:27808125
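
    A brief sketch of the regularized linear discriminant analysis decoder mentioned above, using scikit-learn's shrinkage LDA on synthetic multi-class feature data standing in for SSVEP amplitudes; the class count, feature sizes, and chance level here are assumptions, and the reported accuracies are not reproduced.

        # Regularized (shrinkage) LDA for multi-class decoding, as a generic stand-in
        # for the SSVEP letter decoder described above. Synthetic feature data.
        from sklearn.datasets import make_classification
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        # e.g. 26 classes (letters); features stand in for SSVEP responses
        X, y = make_classification(n_samples=520, n_features=60, n_informative=30,
                                   n_classes=26, n_clusters_per_class=1, random_state=0)
        lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
        acc = cross_val_score(lda, X, y, cv=5).mean()
        print(f"cross-validated accuracy: {acc:.2%} (chance ~ {1/26:.2%})")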

  14. Decoding of top-down cognitive processing for SSVEP-controlled BMI

    NASA Astrophysics Data System (ADS)

    Min, Byoung-Kyong; Dähne, Sven; Ahn, Min-Hee; Noh, Yung-Kyun; Müller, Klaus-Robert

    2016-11-01

    We present a fast and accurate non-invasive brain-machine interface (BMI) based on demodulating steady-state visual evoked potentials (SSVEPs) in electroencephalography (EEG). Our study reports an SSVEP-BMI that, for the first time, decodes primarily based on top-down and not bottom-up visual information processing. The experimental setup presents a grid-shaped flickering line array that the participants observe while intentionally attending to a subset of flickering lines representing the shape of a letter. While the flickering pixels stimulate the participant’s visual cortex uniformly with equal probability, the participant’s intention groups the strokes and thus perceives a ‘letter Gestalt’. We observed decoding accuracy of 35.81% (up to 65.83%) with a regularized linear discriminant analysis; on average 2.05-fold, and up to 3.77-fold greater than chance levels in multi-class classification. Compared to the EEG signals, an electrooculogram (EOG) did not significantly contribute to decoding accuracies. Further analysis reveals that the top-down SSVEP paradigm shows the most focalised activation pattern around occipital visual areas; Granger causality analysis consistently revealed prefrontal top-down control over early visual processing. Taken together, the present paradigm provides the first neurophysiological evidence for the top-down SSVEP BMI paradigm, which potentially enables multi-class intentional control of EEG-BMIs without using gaze-shifting.

  15. Improved and Cost Effective Machining Techniques for Tracked Combat Vehicle Parts

    DTIC Science & Technology

    1983-10-01

    steel is shown in Figure 7-7 and consists of tempered martensite. Three of the alloys which are used in the gas turbine engine are cast 17-4PH stainless steel, Inconel 718 and Inconel 713. The 17-4PH stainless steel was machined in the solution treated and aged condition. The microstructure as shown...

  16. Feasibility of Virtual Machine and Cloud Computing Technologies for High Performance Computing

    DTIC Science & Technology

    2014-05-01

    Red Hat Enterprise Linux; SaaS, software as a service; VM, virtual machine; vNUMA, virtual non-uniform memory access; WRF, weather research and forecasting...previously mentioned in Chapter I Section B1 of this paper, which is used to run the weather research and forecasting (WRF) model in their experiments...against a VMware virtualization solution of WRF. The experiment consisted of running WRF in a standard configuration between the D-VTM and VMware while

  17. SAINT: A combined simulation language for modeling man-machine systems

    NASA Technical Reports Server (NTRS)

    Seifert, D. J.

    1979-01-01

    SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for the design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described and applications of SAINT are discussed.

  18. FLUXCOM - Overview and First Synthesis

    NASA Astrophysics Data System (ADS)

    Jung, M.; Ichii, K.; Tramontana, G.; Camps-Valls, G.; Schwalm, C. R.; Papale, D.; Reichstein, M.; Gans, F.; Weber, U.

    2015-12-01

    We present a community effort aiming at generating an ensemble of global gridded flux products by upscaling FLUXNET data using an array of different machine learning methods including regression/model tree ensembles, neural networks, and kernel machines. We produced products for gross primary production, terrestrial ecosystem respiration, net ecosystem exchange, latent heat, sensible heat, and net radiation for two experimental protocols: 1) at a high spatial and 8-daily temporal resolution (5 arc-minute) using only remote sensing based inputs for the MODIS era; 2) 30 year records of daily, 0.5 degree spatial resolution by incorporating meteorological driver data. Within each set-up, all machine learning methods were trained with the same input data for carbon and energy fluxes respectively. Sets of input driver variables were derived using an extensive formal variable selection exercise. The performance of the extrapolation capacities of the approaches is assessed with a fully internally consistent cross-validation. We perform cross-consistency checks of the gridded flux products with independent data streams from atmospheric inversions (NEE), sun-induced fluorescence (GPP), catchment water balances (LE, H), satellite products (Rn), and process-models. We analyze the uncertainties of the gridded flux products and for example provide a breakdown of the uncertainty of mean annual GPP originating from different machine learning methods, different climate input data sets, and different flux partitioning methods. The FLUXCOM archive will provide an unprecedented source of information for water, energy, and carbon cycle studies.
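
    A schematic of the upscaling workflow described above: train several regressors on site-level data, cross-validate them, then average their gridded predictions into an ensemble product. The predictors, flux target, and grid below are synthetic placeholders for the FLUXNET and remote-sensing inputs, and only two of the many FLUXCOM methods are stood in for here.

        # Ensemble upscaling sketch: fit several regressors to site data, check them
        # with cross-validation, then average their predictions over a grid.
        # Synthetic stand-ins for FLUXNET towers and gridded driver data.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Site-level "drivers" (e.g. radiation, temperature, NDVI) and a flux (e.g. GPP)
        X_sites = rng.uniform(0, 1, size=(200, 3))
        y_sites = 2.0 * X_sites[:, 0] + X_sites[:, 2] + rng.normal(0, 0.1, 200)

        models = [RandomForestRegressor(random_state=0),
                  MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)]
        for m in models:
            r2 = cross_val_score(m, X_sites, y_sites, cv=5, scoring="r2").mean()
            print(type(m).__name__, "CV R2:", round(r2, 2))
            m.fit(X_sites, y_sites)

        # Upscale: average the members' predictions for every grid cell's drivers
        X_grid = rng.uniform(0, 1, size=(1000, 3))
        ensemble_flux = np.mean([m.predict(X_grid) for m in models], axis=0)
        print("grid mean flux:", round(float(ensemble_flux.mean()), 2))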

  19. Influence of "in series" elastic resistance on muscular performance during a biceps-curl set on the cable machine.

    PubMed

    García-López, David; Herrero, Azael J; González-Calvo, Gustavo; Rhea, Matthew R; Marín, Pedro J

    2010-09-01

    This study aimed to investigate the role of elastic resistance (ER) applied "in series" to a pulley-cable (PC) machine on the number of repetitions performed, kinematics parameters, and perceived exertion during a biceps-curl set to failure with a submaximal load (70% of the 1 repetition maximum). Twenty-one undergraduate students (17 men and 4 women) performed, on 2 different days, 1 biceps-curl set on the PC machine. Subjects were randomly assigned to complete 2 experimental conditions in a cross-over fashion: conventional PC mode or ER + PC mode. Results indicate ER applied "in series" to a PC machine significantly reduces (p < 0.05) the maximal number of repetitions and results in a smooth and consistent decline in mean acceleration throughout the set, in comparison to the conventional PC mode. Although no significant differences were found concerning intrarepetition kinematics, the ER trended to reduce (18.6%) the peak acceleration of the load. With a more uniformly distributed external resistance, a greater average muscle tension could have been achieved throughout the range of movement, leading to greater fatigue that could explain the lower number of maximal repetitions achieved. The application of force in a smooth, consistent fashion during each repetition of an exercise, while avoiding active deceleration, is expected to enhance the benefits of the resistance exercise, especially for those seeking greater increases in muscular hypertrophy.

  20. Lawrence Livermore National Laboratory ULTRA-350 Test Bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, D J; Wulff, T A; Carlisle, K

    2001-04-10

    LLNL has many in-house designed high precision machine tools. Some of these tools include the Large Optics Diamond Turning Machine (LODTM) [1], Diamond Turning Machine No.3 (DTM-3) and two Precision Engineering Research Lathes (PERL-I and PERL-II). These machines have accuracy in the sub-micron range and in most cases position resolution in the couple of nanometers range. All of these machines are built with similar underlying technologies. The machines use capstan drive technology, laser interferometer position feedback, tachometer velocity feedback, permanent magnet (PM) brush motors and analog velocity and position loop servo compensation [2]. The machine controller does not perform any servo compensation; it simply computes the difference between the commanded position and the actual position (the following error) and sends this to a D/A converter for the analog servo position loop. LLNL is designing a new high precision diamond turning machine. The machine is called the ULTRA 350 [3]. In contrast to many of the proven technologies discussed above, the plan for the new machine is to use brushless linear motors, high precision linear scales, machine controller motor commutation and digital servo compensation for the velocity and position loops. Although none of these technologies are new and all have been used in industry, their application to high precision diamond turning is limited. To minimize the risks of these technologies in the new machine design, LLNL has established a test bed to evaluate these technologies for application in high precision diamond turning. The test bed is primarily composed of commercially available components. This includes the slide with opposed hydrostatic bearings, the oil system, the brushless PM linear motor, the two-phase input three-phase output linear motor amplifier and the system controller. The linear scales are not yet commercially available but use a common electronic output format. As of this writing, the final verdict for the use of these technologies is still out, but the first part of the work has been completed with promising results. The goal of this part of the work was to close a servo position loop around a slide incorporating these technologies and to measure the performance. This paper discusses the tests that were set up for system evaluation and the results of the measurements made. Some very promising results include slide positioning at the nanometer level and slow-speed slide direction reversal at less than 100 nm/min with no observed discontinuities. This is very important for machine contouring in diamond turning. As a point of reference, at 100 nm/min it would take the slide almost 7 years to complete the full designed travel of 350 mm. This speed has been demonstrated without the use of a velocity sensor. The velocity is derived from the position sensor. With what has been learned on the test bed, the paper finishes with a brief comparison of the old and new technologies. The emphasis of this comparison is on the servo performance as illustrated with Bode plot diagrams.
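
    The legacy control scheme described above, in which the controller computes only the following error and hands it to a D/A converter for the analog servo loops, can be illustrated with a small sketch. The scaling factor and limits below are invented for illustration and are not LLNL's values.

    ```python
    # Illustrative sketch (not LLNL's controller code): the controller computes the
    # following error and converts it to a clamped D/A voltage for the analog loop.
    def following_error_to_dac(commanded_um, actual_um, volts_per_um=0.05, dac_limit_v=10.0):
        """Convert a position following error (micrometers) to a clamped D/A voltage."""
        error_um = commanded_um - actual_um          # following error
        volts = error_um * volts_per_um              # gain chosen by the analog loop design
        return max(-dac_limit_v, min(dac_limit_v, volts))

    # Example: commanded 100.000 um, slide measured at 99.998 um by the interferometer.
    print(following_error_to_dac(100.000, 99.998))   # small positive voltage to close the error
    ```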

  2. 36 CFR § 910.2 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: public improvements construction and square development. Public improvements construction consists of... configuration, and pedestrian amenities. Square development consists of design and construction of development projects primarily on city blocks, known as squares, within the Development Area. These development...

  3. SU-F-T-226: QA Management for a Large Institution with Multiple Campuses for FMEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, G; Chan, M; Lovelock, D

    2016-06-15

    Purpose: To redesign our radiation therapy QA program with the goal to improve quality, efficiency, and consistency among a growing number of campuses at a large institution. Methods: A QA committee was established with at least one physicist representing each of our six campuses (22 linacs). Weekly meetings were scheduled to advise on and update current procedures, to review end-to-end and other test results, and to prepare composite reports for internal and external audits. QA procedures for treatment and imaging equipment were derived from TG Reports 142 and 66, practice guidelines, and feedback from ACR evaluations. The committee focused on reaching a consensus on a single QA program among all campuses using the same type of equipment and reference data. Since the recommendations for tolerances referenced to baseline data were subject to interpretation in some instances, the committee reviewed the characteristics of all machines and quantified any variations before choosing between treatment planning system (i.e. treatment planning system commissioning data that is representative for all machines) or machine-specific values (i.e. commissioning data of the individual machines) as baseline data. Results: The configured QA program will be followed strictly by all campuses. Inventory of available equipment has been compiled, and additional equipment acquisitions for the QA program are made as needed. Dosimetric characteristics are evaluated for all machines using the same methods to ensure consistency of beam data where possible. In most cases, baseline data refer to treatment planning system commissioning data but machine-specific values are used as reference where it is deemed appropriate. Conclusion: With a uniform QA scheme, variations in QA procedures are kept to a minimum. With a centralized database, data collection and analysis are simplified. This program will facilitate uniformity in patient treatments and analysis of large amounts of QA data campus-wide, which will ultimately facilitate FMEA.

  4. Machine learning to predict the occurrence of bisphosphonate-related osteonecrosis of the jaw associated with dental extraction: A preliminary report.

    PubMed

    Kim, Dong Wook; Kim, Hwiyoung; Nam, Woong; Kim, Hyung Jun; Cha, In-Ho

    2018-04-23

    The aim of this study was to build and validate five types of machine learning models that can predict the occurrence of BRONJ associated with dental extraction in patients taking bisphosphonates for the management of osteoporosis. A retrospective review of the medical records was conducted to obtain cases and controls for the study. A total of 125 patients, consisting of 41 cases and 84 controls, were selected for the study. Five machine learning prediction algorithms including multivariable logistic regression model, decision tree, support vector machine, artificial neural network, and random forest were implemented. The outputs of these models were compared with each other and also with conventional methods, such as serum CTX level. Area under the receiver operating characteristic (ROC) curve (AUC) was used to compare the results. The performance of machine learning models was significantly superior to conventional statistical methods and single predictors. The random forest model yielded the best performance (AUC = 0.973), followed by artificial neural network (AUC = 0.915), support vector machine (AUC = 0.882), logistic regression (AUC = 0.844), decision tree (AUC = 0.821), drug holiday alone (AUC = 0.810), and CTX level alone (AUC = 0.630). Machine learning methods showed superior performance in predicting BRONJ associated with dental extraction compared to conventional statistical methods using drug holiday and serum CTX level. Machine learning can thus be applied in a wide range of clinical studies. Copyright © 2017. Published by Elsevier Inc.
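
    As a rough illustration of the model comparison reported above, the sketch below evaluates the same five classifier families with cross-validated ROC AUC on a synthetic stand-in for the 41-case/84-control table; it is not the authors' code or data.

    ```python
    # Hedged sketch: compare five classifier families by cross-validated ROC AUC.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.neural_network import MLPClassifier
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    # Synthetic stand-in for a 125-patient table (roughly 84 controls / 41 cases).
    X, y = make_classification(n_samples=125, n_features=10, weights=[0.67, 0.33], random_state=0)

    models = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "decision_tree": DecisionTreeClassifier(random_state=0),
        "support_vector_machine": SVC(probability=True, random_state=0),
        "neural_network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
        "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
    }

    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for name, model in models.items():
        auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc").mean()
        print(f"{name}: mean AUC = {auc:.3f}")
    ```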

  5. Type 2 Diabetes Screening Test by Means of a Pulse Oximeter.

    PubMed

    Moreno, Enrique Monte; Lujan, Maria Jose Anyo; Rusinol, Montse Torrres; Fernandez, Paqui Juarez; Manrique, Pilar Nunez; Trivino, Cristina Aragon; Miquel, Magda Pedrosa; Rodriguez, Marife Alvarez; Burguillos, M Jose Gonzalez

    2017-02-01

    In this paper, we propose a method for screening for the presence of type 2 diabetes by means of the signal obtained from a pulse oximeter. The screening system consists of two parts: the first analyzes the signal obtained from the pulse oximeter, and the second consists of a machine-learning module. The system consists of a front end that extracts a set of features from the pulse oximeter signal. These features are based on physiological considerations. The set of features was the input to a machine-learning algorithm that determined the class of the input sample, i.e., whether the subject had diabetes or not. The machine-learning algorithms were random forests, gradient boosting, and linear discriminant analysis as benchmark. The system was tested on a database of [Formula: see text] subjects (two samples per subject) collected from five community health centers. The mean receiver operating characteristic area found was [Formula: see text]% (median value [Formula: see text]% and range [Formula: see text]%), with a specificity = [Formula: see text]% for a threshold that gave a sensitivity = [Formula: see text]%. We present a screening method for detecting diabetes that has a performance comparable to the glycated haemoglobin (haemoglobin A1c HbA1c) test, does not require blood extraction, and yields results in less than 5 min.
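
    A hedged sketch of the two-part pipeline described above: a front end that turns a pulse-oximeter waveform into a small feature vector, followed by a machine-learning classifier. The specific features, sampling rate, and data below are assumptions for illustration only.

    ```python
    # Minimal sketch of a front end (feature extraction) plus classifier; not the authors' features.
    import numpy as np
    from scipy.signal import find_peaks
    from sklearn.ensemble import GradientBoostingClassifier

    def ppg_features(signal, fs=100.0):
        """Extract a small, physiology-inspired feature vector from one PPG recording."""
        peaks, _ = find_peaks(signal, distance=int(0.4 * fs))   # candidate pulse peaks
        intervals = np.diff(peaks) / fs                          # beat-to-beat intervals (s)
        return np.array([
            signal.mean(), signal.std(),                         # amplitude statistics
            intervals.mean() if intervals.size else 0.0,         # mean pulse period
            intervals.std() if intervals.size else 0.0,          # pulse-rate variability
        ])

    # Train a classifier on feature vectors (X) with labels (y); both synthetic here.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    y = (X[:, 2] + 0.5 * rng.normal(size=200) > 0).astype(int)
    clf = GradientBoostingClassifier(random_state=0).fit(X, y)

    # Classify one synthetic 60 s waveform through the same front end.
    print(clf.predict(ppg_features(np.sin(np.linspace(0, 60, 6000)))[None, :]))
    ```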

  6. BNL 56 MHz HOM damper prototype fabrication at JLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huque, N.; McIntyre, G.; Daly, E. F.

    A prototype Higher-Order Mode (HOM) Damper was fabricated at JLab for the Relativistic Heavy-Ion Collider’s (RHIC) 56 MHz cavity at Brookhaven National Laboratory (BNL). Primarily constructed from high RRR Niobium and Sapphire, the coaxial damper presented significant challenges in electron-beam welding (EBW), brazing and machining via acid etching. The results of the prototype operation brought about changes in the damper design, due to overheating braze alloys and possible multi-pacting. Five production HOM dampers are currently being fabricated at JLab. This paper outlines the challenges faced in the fabrication process, and the solutions put in place.

  7. BNL 56 MHz HOM Damper Prototype Fabrication at JLab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huque, Naeem A.; Daly, Edward F.; Clemens, William A.

    A prototype Higher-Order Mode (HOM) Damper was fabricated at JLab for the Relativistic Heavy-Ion Collider's (RHIC) 56 MHz cavity at Brookhaven National Laboratory (BNL). Primarily constructed from high RRR Niobium and Sapphire, the coaxial damper presented significant challenges in electron-beam welding (EBW), brazing and machining via acid etching. The results of the prototype operation brought about changes in the damper design, due to overheating braze alloys and possible multi-pacting. Five production HOM dampers are currently being fabricated at JLab. This paper outlines the challenges faced in the fabrication process, and the solutions put in place.

  8. The development of mixer machine for organic animal feed production: Proposed study

    NASA Astrophysics Data System (ADS)

    Leman, A. M.; Wahab, R. Abdul; Zakaria, Supaat; Feriyanto, Dafit; Nor, M. I. F. Che Mohd; Muzarpar, Syafiq

    2017-09-01

    Mixer machines play a major role in producing a homogeneous composition of animal feed. Long production times, inhomogeneity, and minor agglomeration have been observed with existing mixers. Therefore, this paper proposes a continuous mixer to enhance mixing efficiency and shorten the mixing process, thereby abbreviating the whole animal feed production process. Torque, torsion, bending, power, and energy consumption will be calculated for the mixer machine process. The proposed mixer machine is designed with two layered buckets to allow continuity of the mixing process. Mixing is performed by 4 blades of various arm lengths (50, 100, 150, and 225 mm) rotating clockwise at 60 rpm. The machine is therefore expected to produce a homogeneous composition of animal feed, verified through nutrition analysis, with a short mixing time of approximately 5 minutes. The resulting animal feed will be suitable for various animals, including poultry and aquatic fish, and the mixer will accept various organic materials for animal feed production. The paper also highlights related areas such as a continuous animal feed supply chain and bio-based animal feed.

  9. Design and fabrication of a freeform phase plate for high-order ocular aberration correction

    NASA Astrophysics Data System (ADS)

    Yi, Allen Y.; Raasch, Thomas W.

    2005-11-01

    In recent years it has become possible to measure and in some instances to correct the high-order aberrations of human eyes. We have investigated the correction of wavefront error of human eyes by using phase plates designed to compensate for that error. The wavefront aberrations of the four eyes of two subjects were experimentally determined, and compensating phase plates were machined with an ultraprecision diamond-turning machine equipped with four independent axes. A slow-tool servo freeform trajectory was developed for the machine tool path. The machined phase-correction plates were measured and compared with the original design values to validate the process. The position of the phase-plate relative to the pupil is discussed. The practical utility of this mode of aberration correction was investigated with visual acuity testing. The results are consistent with the potential benefit of aberration correction but also underscore the critical positioning requirements of this mode of aberration correction. This process is described in detail from optical measurements, through machining process design and development, to final results.

  10. Banknotes and unattended cash transactions

    NASA Astrophysics Data System (ADS)

    Bernardini, Ronald R.

    2000-04-01

    There is a 64 billion dollar annual unattended cash transaction business in the US with 10 to 20 million daily transactions. Even small problems with the machine readability of banknotes can quickly become a major problem to the machine manufacturer and consumer. Traditional note designs incorporate overt security features for visual validation by the public. Many of these features such as fine line engraving, microprinting and watermarks are unsuitable as machine readable features in low cost note acceptors. Current machine readable features, mostly covert, were designed and implemented with the central banks in mind. These features are only usable by the banks' large, high speed currency sorting and validation equipment. New note designs should consider and provide for low cost note acceptors, implementing features developed for inexpensive sensing technologies. Machine readable features are only as good as their consistency. Quality of security features as well as that of the overall printing process must be maintained to ensure reliable and secure operation of note readers. Variations in printing and of the components used to make the note are among the major causes of poor performance in low cost note acceptors. The involvement of machine manufacturers in new currency designs will aid note producers in the design of a note that is machine friendly, helping to secure the acceptance of the note by the public as well as acting as a deterrent to fraud.

  11. Homopolar machine for reversible energy storage and transfer systems

    DOEpatents

    Stillwagon, Roy E.

    1978-01-01

    A homopolar machine designed to operate as a generator and motor in reversibly storing and transferring energy between the machine and a magnetic load coil for a thermo-nuclear reactor. The machine rotor comprises hollow thin-walled cylinders or sleeves which form the basis of the system by utilizing substantially all of the rotor mass as a conductor thus making it possible to transfer substantially all the rotor kinetic energy electrically to the load coil in a highly economical and efficient manner. The rotor is divided into multiple separate cylinders or sleeves of modular design, connected in series and arranged to rotate in opposite directions but maintain the supply of current in a single direction to the machine terminals. A stator concentrically disposed around the sleeves consists of a hollow cylinder having a number of excitation coils each located radially outward from the ends of adjacent sleeves. Current collected at an end of each sleeve by sleeve slip rings and brushes is transferred through terminals to the magnetic load coil. Thereafter, electrical energy returned from the coil then flows through the machine which causes the sleeves to motor up to the desired speed in preparation for repetition of the cycle. To eliminate drag on the rotor between current pulses, the brush rigging is designed to lift brushes from all slip rings in the machine.

  12. Homopolar machine for reversible energy storage and transfer systems

    DOEpatents

    Stillwagon, Roy E.

    1981-01-01

    A homopolar machine designed to operate as a generator and motor in reversibly storing and transferring energy between the machine and a magnetic load coil for a thermo-nuclear reactor. The machine rotor comprises hollow thin-walled cylinders or sleeves which form the basis of the system by utilizing substantially all of the rotor mass as a conductor thus making it possible to transfer substantially all the rotor kinetic energy electrically to the load coil in a highly economical and efficient manner. The rotor is divided into multiple separate cylinders or sleeves of modular design, connected in series and arranged to rotate in opposite directions but maintain the supply of current in a single direction to the machine terminals. A stator concentrically disposed around the sleeves consists of a hollow cylinder having a number of excitation coils each located radially outward from the ends of adjacent sleeves. Current collected at an end of each sleeve by sleeve slip rings and brushes is transferred through terminals to the magnetic load coil. Thereafter, electrical energy returned from the coil then flows through the machine which causes the sleeves to motor up to the desired speed in preparation for repetition of the cycle. To eliminate drag on the rotor between current pulses, the brush rigging is designed to lift brushes from all slip rings in the machine.

  13. The influence of the focus position on laser machining and laser micro-structuring monocrystalline diamond surface

    NASA Astrophysics Data System (ADS)

    Wu, Mingtao; Guo, Bing; Zhao, Qingliang; Fan, Rongwei; Dong, Zhiwei; Yu, Xin

    2018-06-01

    Micro-structured surfaces on diamond are widely used in microelectronics, optical elements, MEMS and NEMS components, ultra-precision machining tools, etc. The efficient micro-structuring of diamond material is still a challenging task. In this article, the influence of the focus position on laser machining and laser micro-structuring of a monocrystalline diamond surface was researched. First, the ablation threshold of monocrystalline diamond and its incubation effect were determined and discussed. As the accumulated laser pulses ranged from 40 to 5000, the laser ablation threshold decreased from 1.48 J/cm2 to 0.97 J/cm2. Subsequently, the variations of the ablation width and ablation depth in laser machining were studied. With enough pulse energy, the ablation width mainly depended on the laser propagation attributes while the ablation depth was a complex function of the focus position. Raman analysis was used to detect the variation of the laser-machined diamond surface after the laser machining experiments. Graphite formation was discovered on the machined diamond surface, and graphitization was enhanced after the defocusing quantity exceeded 45 μm. Finally, several micro-structured surfaces with the defined micro-structure patterns and structuring ratios were successfully fabricated on the diamond surface simply by adjusting the defocusing quantity. The experimental structuring ratio was consistent with the theoretical analysis.

  14. Orbital-free bond breaking via machine learning

    NASA Astrophysics Data System (ADS)

    Snyder, John C.; Rupp, Matthias; Hansen, Katja; Blooston, Leo; Müller, Klaus-Robert; Burke, Kieron

    2013-12-01

    Using a one-dimensional model, we explore the ability of machine learning to approximate the non-interacting kinetic energy density functional of diatomics. This nonlinear interpolation between Kohn-Sham reference calculations can (i) accurately dissociate a diatomic, (ii) be systematically improved with increased reference data and (iii) generate accurate self-consistent densities via a projection method that avoids directions with no data. With relatively few densities, the error due to the interpolation is smaller than typical errors in standard exchange-correlation functionals.
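
    The interpolation idea above can be sketched with kernel ridge regression mapping densities on a grid to kinetic energies; the densities and the von Weizsäcker-like target used here are synthetic stand-ins for Kohn-Sham reference data, not the authors' model.

    ```python
    # Hedged sketch: kernel machine interpolating T[n] from densities sampled on a 1D grid.
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    grid = np.linspace(0, 1, 100)
    dx = grid[1] - grid[0]

    def density(width, center):
        n = np.exp(-((grid - center) ** 2) / (2 * width ** 2))
        return n / (n.sum() * dx)                         # normalize to one particle

    def vw_kinetic(n):
        # von Weizsaecker-like functional, used only as a cheap stand-in target
        return float(np.sum(np.gradient(n, dx) ** 2 / (8 * n)) * dx)

    rng = np.random.default_rng(0)
    params = rng.uniform([0.05, 0.3], [0.2, 0.7], size=(60, 2))
    X = np.array([density(w, c) for w, c in params])      # densities on the grid as features
    y = np.array([vw_kinetic(n) for n in X])

    model = KernelRidge(kernel="rbf", gamma=1.0, alpha=1e-6).fit(X, y)
    test = density(0.12, 0.5)
    print(model.predict(test[None, :])[0], vw_kinetic(test))   # ML estimate vs reference value
    ```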

  15. minimega v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crussell, Jonathan; Erickson, Jeremy; Fritz, David

    minimega is an emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines including Windows, Linux, and Android. minimega allows experiments to be brought up quickly with almost no configuration. minimega also includes tools for simple cluster management, as well as tools for creating Linux-based virtual machines. This release of minimega includes new emulated sensors for Android devices to improve the fidelity of testbeds that include mobile devices. Emulated sensors include GPS and

  16. Successful attack on permutation-parity-machine-based neural cryptography.

    PubMed

    Seoane, Luís F; Ruttor, Andreas

    2012-02-01

    An algorithm is presented which implements a probabilistic attack on the key-exchange protocol based on permutation parity machines. Instead of imitating the synchronization of the communicating partners, the strategy consists of a Monte Carlo method to sample the space of possible weights during inner rounds and an analytic approach to convey the extracted information from one outer round to the next one. The results show that the protocol under attack fails to synchronize faster than an eavesdropper using this algorithm.

  17. Liquid lens: advances in adaptive optics

    NASA Astrophysics Data System (ADS)

    Casey, Shawn Patrick

    2010-12-01

    'Liquid lens' technologies promise significant advancements in machine vision and optical communications systems. Adaptations for machine vision, human vision correction, and optical communications are used to exemplify the versatile nature of this technology. Utilization of liquid lens elements allows the cost effective implementation of optical velocity measurement. The project consists of a custom image processor, camera, and interface. The images are passed into customized pattern recognition and optical character recognition algorithms. A single camera would be used for both speed detection and object recognition.

  18. Template For Aiming An X-Ray Machine

    NASA Technical Reports Server (NTRS)

    Morphet, W. J.

    1994-01-01

    Relatively inexpensive template helps in aligning x-ray machine with phenolic ring to be inspected for flaws. Phenolic ring in original application part of rocket nozzle. Concept also applicable to x-ray inspection of other rings. Template contains alignment holes for adjusting orientation, plus target spot for adjusting lateral position, of laser spotting beam. (Laser spotting beam coincides with the x-ray beam, turned on later, after alignment completed.) Use of template decreases positioning time and error, providing consistent sensitivity for detection of flaws.

  19. Data presentation techniques for rotating machinery malfunction diagnosis

    NASA Technical Reports Server (NTRS)

    Spettel, T.

    1985-01-01

    Baseline steady state data is excellent for documentation of vibration signals at normal operating conditions. Assuming that a set of initial data was acquired with the machinery in a good state of repair, any future changes or deterioration in mechanical condition can be easily compared to the baseline information. Often this type of comparison will yield sufficient information for evaluation of the problem. However, many malfunctions require the analysis of transient data in order to identify the malfunction. Steady-state data formats consist of: Time Base Waveform, Orbit, Spectrum. Transient data formats consist of: Polar, Bode, Cascade. Our objective is to demonstrate the use of the above formats to diagnose a machine malfunction. A turbine-driven compressor train is chosen as an example. The machine train outline drawing is shown.

  20. Static Frequency Converter System Installed and Tested

    NASA Technical Reports Server (NTRS)

    Brown, Donald P.; Sadhukhan, Debashis

    2003-01-01

    A new Static Frequency Converter (SFC) system has been installed and tested at the NASA Glenn Research Center's Central Air Equipment Building to provide consistent, reduced motor start times and improved reliability for the building's 14 large exhausters and compressors. The operational start times have been consistent around 2 min, 20 s per machine. This is at least a 3-min improvement (per machine) over the old variable-frequency motor generator sets. The SFC was designed and built by Asea Brown Boveri (ABB) and installed by Encompass Design Group (EDG) as part of a Construction of Facilities project managed by Glenn (Robert Scheidegger, project manager). The authors designed the Central Process Distributed Control Systems interface and control between the programmable logic controller, solid-state exciter, and switchgear, which was constructed by Gilcrest Electric.

  1. In vitro assessment of cutting efficiency and durability of zirconia removal diamond rotary instruments.

    PubMed

    Kim, Joon-Soo; Bae, Ji-Hyeon; Yun, Mi-Jung; Huh, Jung-Bo

    2017-06-01

    Recently, zirconia removal diamond rotary instruments have become commercially available for efficient cutting of zirconia. However, research on the cutting efficiency and cutting characteristics of zirconia removal diamond rotary instruments is limited. The purpose of this in vitro study was to assess and compare the cutting efficiency, durability, and wear patterns of zirconia removal diamond rotary instruments with those of conventional diamond rotary instruments. In addition, the surface characteristics of the cut zirconia were assessed. Block specimens of 3 mol% yttrium cation-doped tetragonal zirconia polycrystal were machined 10 times for 1 minute each using a high-speed handpiece with 6 types of diamond rotary instrument from 2 manufacturers at a constant force of 2 N (n=5). An electronic scale was used to measure the lost weight after each cut in order to evaluate the cutting efficiency. Field emission scanning electron microscopy was used to evaluate diamond rotary instrument wear patterns and machined zirconia block surface characteristics. Data were statistically analyzed using the Kruskal-Wallis test, followed by the Mann-Whitney U test (α=.05). Zirconia removal fine grit diamond rotary instruments showed cutting efficiency that was reduced compared with conventional fine grit diamond rotary instruments. Diamond grit fracture was the most dominant diamond rotary instrument wear pattern in all groups. All machined zirconia surfaces were primarily subjected to plastic deformation, which is evidence of ductile cutting. Zirconia blocks machined with zirconia removal fine grit diamond rotary instruments showed the least incidence of surface flaws. Although zirconia removal diamond rotary instruments did not show improved cutting efficiency compared with conventional diamond rotary instruments, the machined zirconia surface showed smoother furrows of plastic deformation and fewer surface flaws. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
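
    A brief sketch of the statistical comparison named above (Kruskal-Wallis followed by pairwise Mann-Whitney U), using invented weight-loss-per-cut values rather than the study's measurements.

    ```python
    # Hedged sketch: nonparametric comparison of cutting efficiency across instrument groups.
    import numpy as np
    from scipy.stats import kruskal, mannwhitneyu

    rng = np.random.default_rng(0)
    groups = {
        "conventional_fine": rng.normal(2.0, 0.3, size=5),   # mg of zirconia removed per cut (invented)
        "zirconia_fine":     rng.normal(1.6, 0.3, size=5),
        "zirconia_coarse":   rng.normal(2.4, 0.3, size=5),
    }

    h, p = kruskal(*groups.values())
    print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.3f}")

    u, p = mannwhitneyu(groups["conventional_fine"], groups["zirconia_fine"], alternative="two-sided")
    print(f"Mann-Whitney U (conventional vs zirconia fine): U = {u:.1f}, p = {p:.3f}")
    ```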

  2. State but not district nutrition policies are associated with less junk food in vending machines and school stores in US public schools.

    PubMed

    Kubik, Martha Y; Wall, Melanie; Shen, Lijuan; Nanney, Marilyn S; Nelson, Toben F; Laska, Melissa N; Story, Mary

    2010-07-01

    Policy that targets the school food environment has been advanced as one way to increase the availability of healthy food at schools and healthy food choice by students. Although both state- and district-level policy initiatives have focused on school nutrition standards, it remains to be seen whether these policies translate into healthy food practices at the school level, where student behavior will be impacted. To examine whether state- and district-level nutrition policies addressing junk food in school vending machines and school stores were associated with less junk food in school vending machines and school stores. Junk food was defined as foods and beverages with low nutrient density that provide calories primarily through fats and added sugars. A cross-sectional study design was used to assess self-report data collected by computer-assisted telephone interviews or self-administered mail questionnaires from state-, district-, and school-level respondents participating in the School Health Policies and Programs Study 2006. The School Health Policies and Programs Study, administered every 6 years since 1994 by the Centers for Disease Control and Prevention, is considered the largest, most comprehensive assessment of school health policies and programs in the United States. A nationally representative sample (n=563) of public elementary, middle, and high schools was studied. Logistic regression adjusted for school characteristics, sampling weights, and clustering was used to analyze data. Policies were assessed for strength (required, recommended, neither required nor recommended prohibiting junk food) and whether strength was similar for school vending machines and school stores. School vending machines and school stores were more prevalent in high schools (93%) than middle (84%) and elementary (30%) schools. For state policies, elementary schools that required prohibiting junk food in school vending machines and school stores offered less junk food than elementary schools that neither required nor recommended prohibiting junk food (13% vs 37%; P=0.006). Middle schools that required prohibiting junk food in vending machines and school stores offered less junk food than middle schools that recommended prohibiting junk food (71% vs 87%; P=0.07). Similar associations were not evident for district-level policies or high schools. Policy may be an effective tool to decrease junk food in schools, particularly in elementary and middle schools. Copyright 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  3. State but not District Nutrition Policies Are Associated with Less Junk Food in Vending Machines and School Stores in US Public Schools

    PubMed Central

    KUBIK, MARTHA Y.; WALL, MELANIE; SHEN, LIJUAN; NANNEY, MARILYN S.; NELSON, TOBEN F.; LASKA, MELISSA N.; STORY, MARY

    2012-01-01

    Background Policy that targets the school food environment has been advanced as one way to increase the availability of healthy food at schools and healthy food choice by students. Although both state- and district-level policy initiatives have focused on school nutrition standards, it remains to be seen whether these policies translate into healthy food practices at the school level, where student behavior will be impacted. Objective To examine whether state- and district-level nutrition policies addressing junk food in school vending machines and school stores were associated with less junk food in school vending machines and school stores. Junk food was defined as foods and beverages with low nutrient density that provide calories primarily through fats and added sugars. Design A cross-sectional study design was used to assess self-report data collected by computer-assisted telephone interviews or self-administered mail questionnaires from state-, district-, and school-level respondents participating in the School Health Policies and Programs Study 2006. The School Health Policies and Programs Study, administered every 6 years since 1994 by the Centers for Disease Control and Prevention, is considered the largest, most comprehensive assessment of school health policies and programs in the United States. Subjects/setting A nationally representative sample (n = 563) of public elementary, middle, and high schools was studied. Statistical analysis Logistic regression adjusted for school characteristics, sampling weights, and clustering was used to analyze data. Policies were assessed for strength (required, recommended, neither required nor recommended prohibiting junk food) and whether strength was similar for school vending machines and school stores. Results School vending machines and school stores were more prevalent in high schools (93%) than middle (84%) and elementary (30%) schools. For state policies, elementary schools that required prohibiting junk food in school vending machines and school stores offered less junk food than elementary schools that neither required nor recommended prohibiting junk food (13% vs 37%; P = 0.006). Middle schools that required prohibiting junk food in vending machines and school stores offered less junk food than middle schools that recommended prohibiting junk food (71% vs 87%; P = 0.07). Similar associations were not evident for district-level policies or high schools. Conclusions Policy may be an effective tool to decrease junk food in schools, particularly in elementary and middle schools. PMID:20630161

  4. A new self-regulated self-excited single-phase induction generator using a squirrel cage three-phase induction machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fukami, Tadashi; Imamura, Michinori; Kaburaki, Yuichi

    1995-12-31

    A new single-phase capacitor self-excited induction generator with self-regulating feature is presented. The new generator consists of a squirrel cage three-phase induction machine and three capacitors connected in series and parallel with a single phase load. The voltage regulation of this generator is very small due to the effect of the three capacitors. Moreover, since a Y-connected stator winding is employed, the waveform of the output voltage becomes sinusoidal. In this paper the system configuration and the operating principle of the new generator are explained, and the basic characteristics are also investigated by means of a simple analysis and experiments with a laboratory machine.

  5. Portable programming on parallel/networked computers using the Application Portable Parallel Library (APPL)

    NASA Technical Reports Server (NTRS)

    Quealy, Angela; Cole, Gary L.; Blech, Richard A.

    1993-01-01

    The Application Portable Parallel Library (APPL) is a subroutine-based library of communication primitives that is callable from applications written in FORTRAN or C. APPL provides a consistent programmer interface to a variety of distributed and shared-memory multiprocessor MIMD machines. The objective of APPL is to minimize the effort required to move parallel applications from one machine to another, or to a network of homogeneous machines. APPL encompasses many of the message-passing primitives that are currently available on commercial multiprocessor systems. This paper describes APPL (version 2.3.1) and its usage, reports the status of the APPL project, and indicates possible directions for the future. Several applications using APPL are discussed, as well as performance and overhead results.

  6. Torsion effect of swing frame on the measurement of horizontal two-plane balancing machine

    NASA Astrophysics Data System (ADS)

    Wang, Qiuxiao; Wang, Dequan; He, Bin; Jiang, Pan; Wu, Zhaofu; Fu, Xiaoyan

    2017-03-01

    In this paper, a vibration model of the swing frame of a two-plane balancing machine is first established to calculate the position of the swing frame's vibration center. The formula for the torsional stiffness of the spring plate twisting around the vibration center is then deduced using the superposition principle. Finally, dynamic balancing experiments demonstrate the inadequacy of the A-B-C algorithm, which ignores the torsion effect, and show that the torsional stiffness deduced from experiments is consistent with the torsional stiffness calculated by theory. The experimental data show the influence of the torsion effect of the swing frame on the separation ratio of sided balancing machines, which reveals the sources of measurement error and assesses the application scope of the A-B-C algorithm.

  7. ERA 1103 UNIVAC 2 Calculating Machine

    NASA Image and Video Library

    1955-09-21

    The new 10-by 10-Foot Supersonic Wind Tunnel at the Lewis Flight Propulsion Laboratory included high tech data acquisition and analysis systems. The reliable gathering of pressure, speed, temperature, and other data from test runs in the facilities was critical to the research process. Throughout the 1940s and early 1950s female employees, known as computers, recorded all test data and performed initial calculations by hand. The introduction of punch card computers in the late 1940s gradually reduced the number of hands-on calculations. In the mid-1950s new computational machines were installed in the office building of the 10-by 10-Foot tunnel. The new systems included this UNIVAC 1103 vacuum tube computer—the lab’s first centralized computer system. The programming was done on paper tape and fed into the machine. The 10-by 10 computer center also included the Lewis-designed Computer Automated Digital Encoder (CADDE) and Digital Automated Multiple Pressure Recorder (DAMPR) systems which converted test data to binary-coded decimal numbers and recorded test pressures automatically, respectively. The systems primarily served the 10-by 10, but were also applied to the other large facilities. Engineering Research Associates (ERA) developed the initial UNIVAC computer for the Navy in the late 1940s. In 1952 the company designed a commercial version, the UNIVAC 1103. The 1103 was the first computer designed by Seymour Cray and the first commercially successful computer.

  8. Architectural requirements for the Red Storm computing system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camp, William J.; Tomkins, James Lee

    This report is based on the Statement of Work (SOW) describing the various requirements for delivering a new supercomputer system to Sandia National Laboratories (Sandia) as part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative (ASCI) program. This system is named Red Storm and will be a distributed memory, massively parallel processor (MPP) machine built primarily out of commodity parts. The requirements presented here distill extensive architectural and design experience accumulated over a decade and a half of research, development and production operation of similar machines at Sandia. Red Storm will have an unusually high bandwidth, low latency interconnect, specially designed hardware and software reliability features, a light weight kernel compute node operating system and the ability to rapidly switch major sections of the machine between classified and unclassified computing environments. Particular attention has been paid to architectural balance in the design of Red Storm, and it is therefore expected to achieve an atypically high fraction of its peak speed of 41 TeraOPS on real scientific computing applications. In addition, Red Storm is designed to be upgradeable to many times this initial peak capability while still retaining appropriate balance in key design dimensions. Installation of the Red Storm computer system at Sandia's New Mexico site is planned for 2004, and it is expected that the system will be operated for a minimum of five years following installation.

  9. Ambiguity and variability of database and software names in bioinformatics.

    PubMed

    Duck, Geraint; Kovacevic, Aleksandar; Robertson, David L; Stevens, Robert; Nenadic, Goran

    2015-01-01

    There are numerous options available to achieve various tasks in bioinformatics, but until recently, there were no tools that could systematically identify mentions of databases and tools within the literature. In this paper we explore the variability and ambiguity of database and software name mentions and compare dictionary and machine learning approaches to their identification. Through the development and analysis of a corpus of 60 full-text documents manually annotated at the mention level, we report high variability and ambiguity in database and software mentions. On a test set of 25 full-text documents, a baseline dictionary look-up achieved an F-score of 46 %, highlighting not only variability and ambiguity but also the extensive number of new resources introduced. A machine learning approach achieved an F-score of 63 % (with precision of 74 %) and 70 % (with precision of 83 %) for strict and lenient matching respectively. We characterise the issues with various mention types and propose potential ways of capturing additional database and software mentions in the literature. Our analyses show that identification of mentions of databases and tools is a challenging task that cannot be achieved by relying on current manually-curated resource repositories. Although machine learning shows improvement and promise (primarily in precision), more contextual information needs to be taken into account to achieve a good degree of accuracy.
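
    The strict versus lenient matching scores quoted above can be illustrated with a small scoring function over gold and predicted mention spans; the spans below are made up for illustration.

    ```python
    # Hedged sketch: F-score under strict (exact span) vs lenient (any overlap) matching.
    def f_score(gold, predicted, lenient=False):
        """gold/predicted are sets of (start, end) character spans; lenient counts any overlap."""
        def overlaps(a, b):
            return a[0] < b[1] and b[0] < a[1]
        if lenient:
            tp = sum(any(overlaps(p, g) for g in gold) for p in predicted)
        else:
            tp = len(predicted & gold)
        precision = tp / len(predicted) if predicted else 0.0
        recall = tp / len(gold) if gold else 0.0
        return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

    gold = {(10, 15), (40, 52), (70, 78)}          # annotated mention spans (invented)
    predicted = {(10, 15), (41, 52), (90, 95)}     # system output (invented)
    print(f_score(gold, predicted), f_score(gold, predicted, lenient=True))
    ```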

  10. Integration of Machining and Inspection in Aerospace Manufacturing

    NASA Astrophysics Data System (ADS)

    Simpson, Bart; Dicken, Peter J.

    2011-12-01

    The main challenge for aerospace manufacturers today is to develop the ability to produce high-quality products on a consistent basis as quickly as possible and at the lowest-possible cost. At the same time, rising material prices are making the cost of scrap higher than ever so making it more important to minimise waste. Proper inspection and quality control methods are no longer a luxury; they are an essential part of every manufacturing operation that wants to grow and be successful. However, simply bolting on some quality control procedures to the existing manufacturing processes is not enough. Inspection must be fully-integrated with manufacturing for the investment to really produce significant improvements. The traditional relationship between manufacturing and inspection is that machining is completed first on the company's machine tools and the components are then transferred to dedicated inspection equipment to be approved or rejected. However, as machining techniques become more sophisticated, and as components become larger and more complex, there are a growing number of cases where closer integration is required to give the highest productivity and the biggest reductions in wastage. Instead of a simple linear progression from CAD to CAM to machining to inspection, a more complicated series of steps is needed, with extra data needed to fill any gaps in the information available at the various stages. These new processes can be grouped under the heading of "adaptive machining". The programming of most machining operations is based around knowing three things: the position of the workpiece on the machine, the starting shape of the material to be machined, and the final shape that needs to be achieved at the end of the operation. Adaptive machining techniques allow successful machining when at least one of those elements is unknown, by using in-process measurement to close the information gaps in the process chain. It also allows any errors to be spotted earlier in the manufacturing process, so helping the problems to be resolved more quickly and at lower cost.

  11. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    PubMed Central

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-01-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively. PMID:27271840
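
    The axis aligned bounding box (AABB) collision test named above reduces to a per-axis overlap check; a minimal sketch follows, with hypothetical box coordinates.

    ```python
    # Minimal sketch of an AABB overlap test as used for collision detection (coordinates invented).
    from dataclasses import dataclass

    @dataclass
    class AABB:
        min_corner: tuple   # (x, y, z)
        max_corner: tuple   # (x, y, z)

    def intersects(a: AABB, b: AABB) -> bool:
        """Two boxes collide only if their extents overlap on every axis."""
        return all(a.min_corner[i] <= b.max_corner[i] and b.min_corner[i] <= a.max_corner[i]
                   for i in range(3))

    tool = AABB((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
    workpiece = AABB((0.5, 0.5, 0.5), (2.0, 2.0, 2.0))
    print(intersects(tool, workpiece))   # True: the tool box penetrates the workpiece box
    ```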

  12. Development and validation of a machine learning algorithm and hybrid system to predict the need for life-saving interventions in trauma patients.

    PubMed

    Liu, Nehemiah T; Holcomb, John B; Wade, Charles E; Batchinsky, Andriy I; Cancio, Leopoldo C; Darrah, Mark I; Salinas, José

    2014-02-01

    Accurate and effective diagnosis of actual injury severity can be problematic in trauma patients. Inherent physiologic compensatory mechanisms may prevent accurate diagnosis and mask true severity in many circumstances. The objective of this project was the development and validation of a multiparameter machine learning algorithm and system capable of predicting the need for life-saving interventions (LSIs) in trauma patients. Statistics based on means, slopes, and maxima of various vital sign measurements corresponding to 79 trauma patient records generated over 110,000 feature sets, which were used to develop, train, and implement the system. Comparisons among several machine learning models proved that a multilayer perceptron would best implement the algorithm in a hybrid system consisting of a machine learning component and basic detection rules. Additionally, 295,994 feature sets from 82 h of trauma patient data showed that the system can obtain 89.8 % accuracy within 5 min of recorded LSIs. Use of machine learning technologies combined with basic detection rules provides a potential approach for accurately assessing the need for LSIs in trauma patients. The performance of this system demonstrates that machine learning technology can be implemented in a real-time fashion and potentially used in a critical care environment.
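
    A hedged sketch of the feature construction described above: means, slopes, and maxima computed over windows of vital-sign samples and fed to a multilayer perceptron. Window length, variables, and labels are illustrative assumptions, not the study's configuration.

    ```python
    # Hedged sketch: window statistics (mean, max, slope) per vital sign, then an MLP classifier.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def window_features(window):
        """window: array of shape (n_samples, n_vitals), e.g. heart rate, SpO2, blood pressure."""
        t = np.arange(window.shape[0])
        slopes = [np.polyfit(t, window[:, j], 1)[0] for j in range(window.shape[1])]
        return np.concatenate([window.mean(axis=0), window.max(axis=0), slopes])

    rng = np.random.default_rng(0)
    windows = rng.normal(size=(300, 60, 3))                 # 300 synthetic windows, 3 vital signs
    X = np.array([window_features(w) for w in windows])
    y = rng.integers(0, 2, size=300)                        # 1 = life-saving intervention needed (synthetic)

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, y)
    print(clf.predict(X[:5]))
    ```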

  13. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a different way that is consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system with the task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
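
    A toy sketch of the bidding step described above (not the authors' multi-agent implementation): the shop floor manager broadcasts a required process, capable machines return bids, and the lowest bid wins the contract.

    ```python
    # Illustrative contract-net style bidding; machine names, rates, and times are invented.
    from dataclasses import dataclass

    @dataclass
    class Machine:
        name: str
        processes: dict   # process name -> (processing_time_h_per_part, cost_per_hour)

        def bid(self, process, quantity):
            if process not in self.processes:
                return None                                   # machine cannot offer this process
            hours, rate = self.processes[process]
            return hours * quantity * rate

    def broadcast(machines, process, quantity):
        bids = {m.name: m.bid(process, quantity) for m in machines}
        bids = {k: v for k, v in bids.items() if v is not None}
        return min(bids, key=bids.get), bids                  # winning machine and all bids

    shop = [Machine("mill_1", {"milling": (0.5, 80)}),
            Machine("mill_2", {"milling": (0.4, 110)}),
            Machine("lathe_1", {"turning": (0.3, 60)})]
    print(broadcast(shop, "milling", quantity=20))            # mill_1 wins with the lowest total cost
    ```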

  14. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
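
    A minimal sketch of the motion-synthesis step, assuming Gaussian process regression from a pair of independent variables (here, hypothetically, barbell load and subject height) to a joint-angle trajectory over the squat cycle; the training data are synthetic.

    ```python
    # Hedged sketch: GPR mapping independent variables to a knee-angle trajectory (synthetic data).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    X = rng.uniform([20.0, 1.5], [100.0, 2.0], size=(40, 2))   # (load kg, height m), hypothetical variables
    phases = np.linspace(0, 1, 50)                              # normalized squat cycle
    # Synthetic knee-angle trajectories, one row per training example.
    Y = np.array([90 + 60 * np.sin(np.pi * phases) * (load / 100) for load, _ in X])

    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=[30.0, 0.2]) + WhiteKernel(1e-2),
                                   normalize_y=True).fit(X, Y)
    new_subject = np.array([[60.0, 1.75]])
    trajectory = gpr.predict(new_subject)[0]                    # predicted knee angle over the cycle
    print(trajectory[:5])
    ```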

  15. Development of hardware system using temperature and vibration maintenance models integration concepts for conventional machines monitoring: a case study

    NASA Astrophysics Data System (ADS)

    Adeyeri, Michael Kanisuru; Mpofu, Khumbulani; Kareem, Buliaminu

    2016-03-01

    This article describes the integration of temperature and vibration models for maintenance monitoring of conventional machinery parts whose optimal functionality is affected by abnormal changes in temperature and vibration values, resulting in machine failures, machine breakdowns, poor product quality, inability to meet customers' demand, and poor inventory control, to mention a few. The work entails the use of temperature and vibration sensors as monitoring probes, programmed in a microcontroller using the C language. The developed hardware consists of an ADXL345 vibration sensor, an AD594/595 type K thermocouple temperature sensor, a microcontroller, a graphic liquid crystal display, a real-time clock, and other components. The hardware is divided into two parts: one based at the workstation (mainly meant to monitor machine behaviour) and the other at the base station (meant to receive machine information transmitted from the workstation), working cooperatively. The resulting hardware was calibrated, tested using model verification, and validated through least-squares and regression analysis using data read from the gearboxes of extruding and cutting machines used for polyethylene bag production. The results confirmed the correlation between time, vibration, and temperature, reflecting the effective formulation of the developed concept.
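
    A small sketch of the least-squares validation step described above, fitting vibration amplitude against gearbox temperature; the readings are invented for illustration.

    ```python
    # Hedged sketch: ordinary least-squares fit relating temperature to vibration (invented readings).
    import numpy as np

    hours = np.arange(0, 10, 0.5)
    temperature_c = 35 + 2.1 * hours + np.random.default_rng(0).normal(0, 0.8, hours.size)
    vibration_g = 0.10 + 0.012 * temperature_c + np.random.default_rng(1).normal(0, 0.01, hours.size)

    # Fit vibration = a * temperature + b and report the correlation of the fit.
    a, b = np.polyfit(temperature_c, vibration_g, 1)
    predicted = a * temperature_c + b
    r = np.corrcoef(vibration_g, predicted)[0, 1]
    print(f"slope = {a:.4f} g/degC, intercept = {b:.3f} g, correlation r = {r:.3f}")
    ```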

  16. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment.

    PubMed

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S; Phoon, Sin Ye

    2016-06-07

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively.

  17. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-06-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively.

  18. Machine vision based quality inspection of flat glass products

    NASA Astrophysics Data System (ADS)

    Zauner, G.; Schagerl, M.

    2014-03-01

    This application paper presents a machine vision solution for the quality inspection of flat glass products. A contact image sensor (CIS) is used to generate digital images of the glass surfaces. The presented machine vision based quality inspection at the end of the production line aims to classify five different glass defect types. The defect images are usually characterized by very little `image structure', i.e. homogeneous regions without distinct image texture. Additionally, these defect images usually consist of only a few pixels. At the same time the appearance of certain defect classes can be very diverse (e.g. water drops). We used simple state-of-the-art image features like histogram-based features (std. deviation, kurtosis, skewness), geometric features (form factor/elongation, eccentricity, Hu-moments) and texture features (grey level run length matrix, co-occurrence matrix) to extract defect information. The main contribution of this work lies in the systematic evaluation of various machine learning algorithms to identify appropriate classification approaches for this specific class of images. In this way, the following machine learning algorithms were compared: decision tree (J48), random forest, JRip rules, naive Bayes, Support Vector Machine (multi class), neural network (multilayer perceptron) and k-Nearest Neighbour. We used a representative image database of 2300 defect images and applied cross validation for evaluation purposes.
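    The histogram-based part of the feature set named above (standard deviation, kurtosis, skewness) can be computed directly from a grey-level defect patch, as in the short sketch below. The patch is synthetic, and the real pipeline uses a much richer feature set.

    ```python
    # Histogram-based defect features: standard deviation, kurtosis, skewness.
    import numpy as np
    from scipy.stats import kurtosis, skew

    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(32, 32)).astype(float)  # stand-in for a defect image

    values = patch.ravel()
    features = {
        "std": float(np.std(values)),
        "kurtosis": float(kurtosis(values)),  # Fisher definition: 0 for a normal distribution
        "skewness": float(skew(values)),
    }
    print(features)
    ```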

  19. Efficient full-chip SRAF placement using machine learning for best accuracy and improved consistency

    NASA Astrophysics Data System (ADS)

    Wang, Shibing; Baron, Stanislas; Kachwala, Nishrin; Kallingal, Chidam; Sun, Dezheng; Shu, Vincent; Fong, Weichun; Li, Zero; Elsaid, Ahmad; Gao, Jin-Wei; Su, Jing; Ser, Jung-Hoon; Zhang, Quan; Chen, Been-Der; Howell, Rafael; Hsu, Stephen; Luo, Larry; Zou, Yi; Zhang, Gary; Lu, Yen-Wen; Cao, Yu

    2018-03-01

    Various computational approaches from rule-based to model-based methods exist to place Sub-Resolution Assist Features (SRAF) in order to increase process window for lithography. Each method has its advantages and drawbacks, and typically requires the user to make a trade-off between time of development, accuracy, consistency and cycle time. Rule-based methods, used since the 90 nm node, require long development time and struggle to achieve good process window performance for complex patterns. Heuristically driven, their development is often iterative and involves significant engineering time from multiple disciplines (Litho, OPC and DTCO). Model-based approaches have been widely adopted since the 20 nm node. While the development of model-driven placement methods is relatively straightforward, they often become computationally expensive when high accuracy is required. Furthermore these methods tend to yield less consistent SRAFs due to the nature of the approach: they rely on a model which is sensitive to the pattern placement on the native simulation grid, and can be impacted by such related grid dependency effects. Those undesirable effects tend to become stronger when more iterations or complexity are needed in the algorithm to achieve required accuracy. ASML Brion has developed a new SRAF placement technique on the Tachyon platform that is assisted by machine learning and significantly improves the accuracy of full chip SRAF placement while keeping consistency and runtime under control. A Deep Convolutional Neural Network (DCNN) is trained using the target wafer layout and corresponding Continuous Transmission Mask (CTM) images. These CTM images have been fully optimized using the Tachyon inverse mask optimization engine. The neural network generated SRAF guidance map is then used to place SRAF on full-chip. This is different from our existing full-chip MB-SRAF approach which utilizes a SRAF guidance map (SGM) of mask sensitivity to improve the contrast of optical image at the target pattern edges. In this paper, we demonstrate that machine learning assisted SRAF placement can achieve a superior process window compared to the SGM model-based SRAF method, while keeping the full-chip runtime affordable, and maintain consistency of SRAF placement . We describe the current status of this machine learning assisted SRAF technique and demonstrate its application to full chip mask synthesis and discuss how it can extend the computational lithography roadmap.

  20. Glaucomatous patterns in Frequency Doubling Technology (FDT) perimetry data identified by unsupervised machine learning classifiers.

    PubMed

    Bowd, Christopher; Weinreb, Robert N; Balasubramanian, Madhusudhanan; Lee, Intae; Jang, Giljin; Yousefi, Siamak; Zangwill, Linda M; Medeiros, Felipe A; Girkin, Christopher A; Liebmann, Jeffrey M; Goldbaum, Michael H

    2014-01-01

    The variational Bayesian independent component analysis-mixture model (VIM), an unsupervised machine-learning classifier, was used to automatically separate Matrix Frequency Doubling Technology (FDT) perimetry data into clusters of healthy and glaucomatous eyes, and to identify axes representing statistically independent patterns of defect in the glaucoma clusters. FDT measurements were obtained from 1,190 eyes with normal FDT results and 786 eyes with abnormal FDT results from the UCSD-based Diagnostic Innovations in Glaucoma Study (DIGS) and African Descent and Glaucoma Evaluation Study (ADAGES). For all eyes, VIM input was 52 threshold test points from the 24-2 test pattern, plus age. FDT mean deviation was -1.00 dB (S.D. = 2.80 dB) and -5.57 dB (S.D. = 5.09 dB) in FDT-normal eyes and FDT-abnormal eyes, respectively (p<0.001). VIM identified meaningful clusters of FDT data and positioned a set of statistically independent axes through the mean of each cluster. The optimal VIM model separated the FDT fields into 3 clusters. Cluster N contained primarily normal fields (1109/1190, specificity 93.1%) and clusters G1 and G2 combined, contained primarily abnormal fields (651/786, sensitivity 82.8%). For clusters G1 and G2 the optimal number of axes were 2 and 5, respectively. Patterns automatically generated along axes within the glaucoma clusters were similar to those known to be indicative of glaucoma. Fields located farther from the normal mean on each glaucoma axis showed increasing field defect severity. VIM successfully separated FDT fields from healthy and glaucoma eyes without a priori information about class membership, and identified familiar glaucomatous patterns of loss.

  1. Machine Vision For Industrial Control:The Unsung Opportunity

    NASA Astrophysics Data System (ADS)

    Falkman, Gerald A.; Murray, Lawrence A.; Cooper, James E.

    1984-05-01

    Vision modules have primarily been developed to relieve those pressures newly brought into existence by Inspection (QUALITY) and Robotic (PRODUCTIVITY) mandates. Industrial Control pressure stems on the other hand from the older first industrial revolution mandate of throughput. Satisfying such pressure calls for speed in both imaging and decision making. Vision companies have, however, put speed on a backburner or ignore it entirely because most modules are computer/software based which limits their speed potential. Increasingly, the keynote being struck at machine vision seminars is that "Visual and Computational Speed Must Be Increased and Dramatically!" There are modular hardwired-logic systems that are fast but, all too often, they are not very bright. Such units: Measure the fill factor of bottles as they spin by, Read labels on cans, Count stacked plastic cups or Monitor the width of parts streaming past the camera. Many are only a bit more complex than a photodetector. Once in place, most of these units are incapable of simple upgrading to a new task and are Vision's analog to the robot industry's pick and place (RIA TYPE E) robot. Vision thus finds itself amidst the same quandries that once beset the Robot Industry of America when it tried to define a robot, excluded dumb ones, and was left with only slow machines whose unit volume potential is shatteringly low. This paper develops an approach to meeting the need of a vision system that cuts a swath into the terra incognita of intelligent, high-speed vision processing. Main attention is directed to vision for industrial control. Some presently untapped vision application areas that will be serviced include: Electronics, Food, Sports, Pharmaceuticals, Machine Tools and Arc Welding.

  2. Sensitivity of Support Vector Machine Predictions of Passive Microwave Brightness Temperature Over Snow-covered Terrain in High Mountain Asia

    NASA Astrophysics Data System (ADS)

    Ahmad, J. A.; Forman, B. A.

    2017-12-01

    High Mountain Asia (HMA) serves as a water supply source for over 1.3 billion people, primarily in south-east Asia. Most of this water originates as snow (or ice) that melts during the summer months and contributes to the run-off downstream. In spite of its critical role, there is still considerable uncertainty regarding the total amount of snow in HMA and its spatial and temporal variation. In this study, the NASA Land Information System (LIS) is used to model the hydrologic cycle over the Indus basin. In addition, the ability of support vector machines (SVM), a machine learning technique, to predict passive microwave brightness temperatures at a specific frequency and polarization as a function of LIS-derived land surface model output is explored in a sensitivity analysis. Multi-frequency, multi-polarization passive microwave brightness temperatures as measured by the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) over the Indus basin are used as training targets during the SVM training process. Normalized sensitivity coefficients (NSC) are then computed to assess the sensitivity of a well-trained SVM to each LIS-derived state variable. Preliminary results conform with the known first-order physics. For example, input states directly linked to physical temperature like snow temperature, air temperature, and vegetation temperature have positive NSCs, whereas input states that increase volume scattering such as snow water equivalent or snow density yield negative NSCs. Air temperature exhibits the largest sensitivity coefficients due to its inherent, high-frequency variability. Adherence of this machine learning algorithm to the first-order physics bodes well for its potential use in LIS as the observation operator within a radiance data assimilation system aimed at improving regional- and continental-scale snow estimates.
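    The normalized sensitivity coefficients described above can be approximated for any trained regressor by finite differences about a reference state, NSC_i ≈ (∂Tb/∂x_i)·(x_i/Tb). The sketch below does this for a scikit-learn SVR trained on synthetic data; the state variables, their ranges, and the target relation are invented stand-ins for the LIS-derived states and AMSR-E brightness temperatures.

    ```python
    # Finite-difference normalized sensitivity coefficients for a trained SVM regressor.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    X = np.column_stack([
        rng.uniform(250, 273, 300),   # snow temperature (K)
        rng.uniform(240, 290, 300),   # air temperature (K)
        rng.uniform(0.01, 0.5, 300),  # snow water equivalent (m)
        rng.uniform(100, 400, 300),   # snow density (kg m^-3)
    ])
    tb = 180 + 0.3 * X[:, 1] - 60 * X[:, 2] - 0.02 * X[:, 3] + rng.normal(0, 1, 300)
    svm = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0)).fit(X, tb)

    x0 = X.mean(axis=0)
    tb0 = svm.predict(x0.reshape(1, -1))[0]
    nsc = []
    for i in range(X.shape[1]):
        step = 0.01 * (X[:, i].max() - X[:, i].min())
        hi, lo = x0.copy(), x0.copy()
        hi[i] += step
        lo[i] -= step
        grad = (svm.predict(hi.reshape(1, -1))[0] - svm.predict(lo.reshape(1, -1))[0]) / (2 * step)
        nsc.append(grad * x0[i] / tb0)
    print(np.round(nsc, 4))
    ```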

  3. Optimization of the production process using virtual model of a workspace

    NASA Astrophysics Data System (ADS)

    Monica, Z.

    2015-11-01

    Optimization of the production process is an element of the design cycle consisting of problem definition, modelling, simulation, optimization and implementation. Without the use of simulation techniques, only larger or smaller improvements of the process can be achieved, not true optimization (i.e., the best result obtainable for the conditions under which the process operates). Optimization generally comprises management actions that ultimately bring savings in time, resources and raw materials and improve the performance of a specific process, whether it is a service or a manufacturing process. The savings are generated by improving and increasing the efficiency of the processes. Optimization consists primarily of organizational activities that require very little investment or rely solely on changing the organization of work. Modern companies operating in a market economy show a significant increase in interest in modern methods of production management and services. This trend is driven by high competitiveness: companies that want to succeed are forced to continually modify the way they are managed and to respond flexibly to changing demand. Modern methods of production management not only imply a stable position of the company in its sector, but also improve health and safety within the company and contribute to more efficient rules for work standardization. This is why the paper presents the application of an environment such as Siemens NX to create a virtual model of a production system and to simulate and optimize its operation. The analyzed system is a robotized workcell consisting of machine tools, industrial robots, conveyors, auxiliary equipment and buffers. The control program realizing the main task in the virtual workcell can be defined in the software. Using this tool, it is possible to optimize both the object trajectory and the cooperation process.

  4. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  5. Way to nanogrinding technology

    NASA Astrophysics Data System (ADS)

    Miyashita, Masakazu

    1990-11-01

    The precision finishing of hard and brittle material components such as single-crystal silicon wafers and magnetic heads consists of lapping and polishing, which depend heavily on skilled labor. This process is based on traditional optical production technology and is entirely different from the automated mass-production techniques of the automobile industry. Instead of traditional lapping and polishing, nanogrinding is proposed as a new stock-removal machining process to generate optical surfaces on brittle materials. With this new technology, a damage-free surface equivalent to that produced by lapping and polishing can be obtained on brittle materials, and free curvatures can also be generated. The technology is based on the motion-copying principle, the same principle used in machining metal parts. The new nanogrinding technology is anticipated to be adopted as a machining technique suitable for automated mass production, because it is expected to provide stable machining at the quality level of optical production that has traditionally been achieved by lapping and polishing.

  6. Evaluation of Iron Loss in Interior Permanent Magnet Synchronous Motor with Consideration of Rotational Field

    NASA Astrophysics Data System (ADS)

    Ma, Lei; Sanada, Masayuki; Morimoto, Shigeo; Takeda, Yoji; Kaido, Chikara; Wakisaka, Takeaki

    Loss evaluation is an important issue in the design of electrical machines. Due to their complicated structure and flux distribution, it is difficult to predict the iron loss in such machines exactly. This paper studies the iron loss in interior permanent magnet synchronous motors based on the finite element method. Iron loss test data of the core material are used to fit the hysteresis and eddy current loss constants. For motors in practical operation, additional iron losses due to rotation of the flux density vector and the harmonic flux density distribution make the calculated values deviate from the measured ones. A revision is made to account for these excess iron losses, which exist under practical operating conditions. The calculated results show good consistency with the experimental ones. The proposed method provides a possible way to predict the iron loss of an electrical machine with good precision, and may be helpful in selecting the core material best suited to a given machine.
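    The hysteresis and eddy-current constants referred to above are commonly obtained by a least-squares fit of a two-term loss model, P = k_h f B^2 + k_e f^2 B^2, to core-loss test data. The sketch below shows that fitting step only; the model form and the data values are illustrative assumptions, not the paper's.

    ```python
    # Least-squares fit of the classical two-term iron-loss model
    #   P = k_h * f * B**2 + k_e * f**2 * B**2
    # to (frequency, flux density, loss) test points. All numbers are fabricated.
    import numpy as np

    f = np.array([50.0, 50.0, 100.0, 100.0, 200.0, 200.0])  # frequency (Hz)
    B = np.array([1.0, 1.5, 1.0, 1.5, 1.0, 1.5])            # flux density (T)
    P = np.array([1.2, 2.8, 3.1, 7.2, 8.9, 20.5])           # measured core loss (W/kg)

    A = np.column_stack([f * B**2, f**2 * B**2])             # linear in k_h and k_e
    (k_h, k_e), *_ = np.linalg.lstsq(A, P, rcond=None)
    print(f"k_h = {k_h:.4e}  k_e = {k_e:.4e}")
    ```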

  7. Cross-platform normalization of microarray and RNA-seq data for machine learning applications

    PubMed Central

    Thompson, Jeffrey A.; Tan, Jie

    2016-01-01

    Large, publicly available gene expression datasets are often analyzed with the aid of machine learning algorithms. Although RNA-seq is increasingly the technology of choice, a wealth of expression data already exist in the form of microarray data. If machine learning models built from legacy data can be applied to RNA-seq data, larger, more diverse training datasets can be created and validation can be performed on newly generated data. We developed Training Distribution Matching (TDM), which transforms RNA-seq data for use with models constructed from legacy platforms. We evaluated TDM, as well as quantile normalization, nonparanormal transformation, and a simple log2 transformation, on both simulated and biological datasets of gene expression. Our evaluation included both supervised and unsupervised machine learning approaches. We found that TDM exhibited consistently strong performance across settings and that quantile normalization also performed well in many circumstances. We also provide a TDM package for the R programming language. PMID:26844019

  8. Machine learning plus optical flow: a simple and sensitive method to detect cardioactive drugs

    NASA Astrophysics Data System (ADS)

    Lee, Eugene K.; Kurokawa, Yosuke K.; Tu, Robin; George, Steven C.; Khine, Michelle

    2015-07-01

    Current preclinical screening methods do not adequately detect cardiotoxicity. Using human induced pluripotent stem cell-derived cardiomyocytes (iPS-CMs), more physiologically relevant preclinical or patient-specific screening to detect potential cardiotoxic effects of drug candidates may be possible. However, one of the persistent challenges for developing a high-throughput drug screening platform using iPS-CMs is the need to develop a simple and reliable method to measure key electrophysiological and contractile parameters. To address this need, we have developed a platform that combines machine learning paired with brightfield optical flow as a simple and robust tool that can automate the detection of cardiomyocyte drug effects. Using three cardioactive drugs of different mechanisms, including those with primarily electrophysiological effects, we demonstrate the general applicability of this screening method to detect subtle changes in cardiomyocyte contraction. Requiring only brightfield images of cardiomyocyte contractions, we detect changes in cardiomyocyte contraction comparable to - and even superior to - fluorescence readouts. This automated method serves as a widely applicable screening tool to characterize the effects of drugs on cardiomyocyte function.

  9. Machine learnt bond order potential to investigate the low thermal conductivity of stanene nanostructures

    NASA Astrophysics Data System (ADS)

    Cherukara, Mathew; Narayanan, Badri; Kinaci, Alper; Sasikumar, Kiran; Gray, Stephen; Chan, Maria; Sankaranarayanan, Subramanian

    The growth of stanene on a Bi2Te3 substrate has engendered a great deal of interest, in part due to stanene's predicted exotic properties. In particular, stanene shows promise in topological insulation, large-gap 2D quantum spin Hall states, lossless electrical conduction, enhanced thermoelectricity, and topological superconductivity. However, atomistic investigations of growth mechanisms (needed to guide synthesis), phonon transport (crucial for designing thermoelectrics), and thermo-mechanical behavior of stanene are scarce. This paucity is primarily due to the lack of inter-atomic potentials that can accurately capture atomic interactions in stanene. To address this, we have developed a machine learnt bond-order potential (BOP) based on Tersoff's formalism that can accurately capture bond breaking/formation events, structure, energetics, thermodynamics, thermal conductivity, and mechanical properties of single layer tin, using a training set derived from density functional theory calculations. Finally, we employed our newly developed BOP to study anisotropy in thermal conductivity of stanene sheets, temperature induced rippling, as well as dependence of anharmonicity and thermal conductivity on temperature.

  10. Utilizing Skylab data in on-going resources management programs in the state of Ohio

    NASA Technical Reports Server (NTRS)

    Baldridge, P. E. (Principal Investigator); Goesling, P. H.; Martin, T. A.; Wukelic, G. E.; Stephan, J. G.; Smail, H. E.; Ebbert, T. F.

    1975-01-01

    The author has identified the following significant results. The use of Skylab imagery for total area woodland surveys was found to be more accurate and cheaper than conventional surveys using aerial photo-plot techniques. Machine-aided (primarily density slicing) analyses of Skylab 190A and 190B color and infrared color photography demonstrated the feasibility of using such data for differentiating major timber classes including pines, hardwoods, mixed, cut, and brushland providing such analyses are made at scales of 1:24,000 and larger. Manual and machine-assisted image analysis indicated that spectral and spatial capabilities of Skylab EREP photography are adequate to distinguish most parameters of current, coal surface mining concern associated with: (1) active mining, (2) orphan lands, (3) reclaimed lands, and (4) active reclamation. Excellent results were achieved when comparing Skylab and aerial photographic interpretations of detailed surface mining features. Skylab photographs when combined with other data bases (e.g., census, agricultural land productivity, and transportation networks), provide a comprehensive, meaningful, and integrated view of major elements involved in the urbanization/encroachment process.

  11. Sensory substitution: closing the gap between basic research and widespread practical visual rehabilitation.

    PubMed

    Maidenbaum, Shachar; Abboud, Sami; Amedi, Amir

    2014-04-01

    Sensory substitution devices (SSDs) have come a long way since first developed for visual rehabilitation. They have produced exciting experimental results, and have furthered our understanding of the human brain. Unfortunately, they are still not used for practical visual rehabilitation, and are currently considered as reserved primarily for experiments in controlled settings. Over the past decade, our understanding of the neural mechanisms behind visual restoration has changed as a result of converging evidence, much of which was gathered with SSDs. This evidence suggests that the brain is more than a pure sensory-machine but rather is a highly flexible task-machine, i.e., brain regions can maintain or regain their function in vision even with input from other senses. This complements a recent set of more promising behavioral achievements using SSDs and new promising technologies and tools. All these changes strongly suggest that the time has come to revive the focus on practical visual rehabilitation with SSDs and we chart several key steps in this direction such as training protocols and self-train tools. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  12. Design of cylindrical pipe automatic welding control system based on STM32

    NASA Astrophysics Data System (ADS)

    Chen, Shuaishuai; Shen, Weicong

    2018-04-01

    The development of the modern economy is rapidly increasing the demand for pipeline construction, and pipeline welding has become an important link in that construction. At present, manual welding methods are still widely used at home and abroad, and field pipe welding in particular lacks miniature, portable automatic welding equipment. An automated welding system consists of a control system, comprising a lower-computer control panel and a host-computer operating interface, together with the automatic welding machine mechanisms and welding power systems coordinated by the control system. In this paper, a new control system for automatic pipe welding based on the lower-computer control panel and the host-computer interface is proposed, which has many advantages over traditional automatic welding machines.

  13. Support Vector Machine-Based Endmember Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Filippi, Anthony M; Archibald, Richard K

    Introduced in this paper is the utilization of Support Vector Machines (SVMs) to automatically perform endmember extraction from hyperspectral data. The strengths of SVM are exploited to provide a fast and accurate calculated representation of high-dimensional data sets that may consist of multiple distributions. Once this representation is computed, the number of distributions can be determined without prior knowledge. For each distribution, an optimal transform can be determined that preserves informational content while reducing the data dimensionality, and hence, the computational cost. Finally, endmember extraction for the whole data set is accomplished. Results indicate that this Support Vector Machine-Based Endmember Extraction (SVM-BEE) algorithm has the capability of autonomously determining endmembers from multiple clusters with computational speed and accuracy, while maintaining a robust tolerance to noise.

  14. Re-designing a mechanism for higher speed: A case history from textile machinery

    NASA Astrophysics Data System (ADS)

    Douglas, S. S.; Rooney, G. T.

    A central issue in the generation of general mechanism design software, the formulation of suitable objective functions, is discussed. There is a consistent drive towards higher speeds in the development of industrial sewing machines. This has led to experimental analyses of dynamic performance and to a search for improved design methods. The experimental work highlighted the importance of smoothness of motion at high speed, of component inertias, and of frame structural stiffness. Smoothness is associated with transmission properties and harmonic analysis. These are added to the other design requirements of synchronization, mechanism size, and function. Some of the mechanism trains in overedge sewing machines are shown. All these trains are designed by digital optimization. The design software combines analysis of the sewing machine mechanisms, formulation of objectives in numerical terms, and suitable mathematical optimization techniques.

  15. Application of coordinate transform on ball plate calibration

    NASA Astrophysics Data System (ADS)

    Wei, Hengzheng; Wang, Weinong; Ren, Guoying; Pei, Limei

    2015-02-01

    For the ball plate calibration method with coordinate measurement machine (CMM) equipped with laser interferometer, it is essential to adjust the ball plate parallel to the direction of laser beam. It is very time-consuming. To solve this problem, a method based on coordinate transformation between machine system and object system is presented. With the fixed points' coordinates of the ball plate measured in the object system and machine system, the transformation matrix between the coordinate systems is calculated. The laser interferometer measurement data error due to the placement of ball plate can be corrected with this transformation matrix. Experimental results indicate that this method is consistent with the handy adjustment method. It avoids the complexity of ball plate adjustment. It also can be applied to the ball beam calibration.
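    The transformation between the object and machine coordinate systems can be estimated from corresponding fixed-point coordinates with the standard SVD-based (Kabsch) rigid-body fit sketched below. This is one common way to obtain such a transformation matrix; the paper's exact procedure may differ, and the fixed-point coordinates used here are hypothetical.

    ```python
    # SVD-based rigid-body fit: find R, t such that P_mach ~ R @ P_obj + t.
    import numpy as np

    def rigid_transform(P_obj, P_mach):
        c_obj, c_mach = P_obj.mean(axis=0), P_mach.mean(axis=0)
        H = (P_obj - c_obj).T @ (P_mach - c_mach)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:       # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = c_mach - R @ c_obj
        return R, t

    # Hypothetical fixed points measured in both coordinate systems (mm)
    P_obj = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 50]], dtype=float)
    theta = np.deg2rad(5.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    P_mach = P_obj @ R_true.T + np.array([10.0, -5.0, 2.0])

    R, t = rigid_transform(P_obj, P_mach)
    print(np.allclose(R, R_true), np.round(t, 3))
    ```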

  16. Anechoic chamber in industrial plants. [construction materials and structural design

    NASA Technical Reports Server (NTRS)

    Halpert, E.; Juncu, O.; Lorian, R.; Marfievici, D.; Mararu, I.

    1974-01-01

    A light anechoic chamber for routine acoustical measurements in the machine building industry is reported. The outer housing of the chamber consists of modules cast in glass fiber reinforced polyester resin; the inner housing consists of pyramidal modules cut out of sound absorbing slates. The parameters of this anechoic chamber facilitate acoustical measurements according to ISO and CAEM recommendations.

  17. Rotor internal friction instability

    NASA Technical Reports Server (NTRS)

    Bently, D. E.; Muszynska, A.

    1985-01-01

    Two aspects of internal friction affecting the stability of rotating machines are discussed. The first role of internal friction consists of decreasing the level of effective damping during rotor subsynchronous and backward precessional vibrations caused by some other instability mechanisms. The second role of internal friction consists of creating rotor instability, i.e., causing self-excited subsynchronous vibrations. Experimental test results document both of these aspects.

  18. Creation of operation algorithms for combined operation of anti-lock braking system (ABS) and electric machine included in the combined power plant

    NASA Astrophysics Data System (ADS)

    Bakhmutov, S. V.; Ivanov, V. G.; Karpukhin, K. E.; Umnitsyn, A. A.

    2018-02-01

    The paper considers an Anti-lock Braking System (ABS) operation algorithm that enables hybrid braking, i.e. a braking process combining friction brake mechanisms with an e-machine (electric machine) operating in energy recovery mode. The material presented focuses only on rectilinear motion of the vehicle. The ABS task consists in maintaining the target wheel slip ratio, which depends on the tyre-road adhesion coefficient; this coefficient was estimated from the vehicle deceleration. In the course of the computational studies, the following hybrid braking algorithm was determined: at an adhesion coefficient ≤0.1, driving-axle braking occurs only through the e-machine operating in energy recovery mode; in other cases, depending on the adhesion coefficient, the e-machine provides a brake torque ranging from 35 to 100% of its maximum available brake torque. Virtual tests showed that the wheel slip ratio values are close to the required ones. Thus, the algorithm makes it possible to implement hybrid braking with the two sources of brake torque.
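    The torque split quoted above (regeneration only at adhesion coefficients up to 0.1, otherwise an e-machine share between 35 and 100% of its available torque) can be sketched as below. The linear interpolation law, the high-adhesion reference value, and all numbers other than the quoted 0.1 / 35% / 100% figures are assumptions made for illustration, not the authors' algorithm.

    ```python
    # Hedged sketch of the hybrid braking torque split on the driving axle.
    def emachine_share(mu: float) -> float:
        """Assumed fraction of the available e-machine torque used at adhesion mu."""
        if mu <= 0.1:
            return 1.0                                   # regenerative braking only
        mu_hi = 0.8                                      # assumed high-adhesion reference
        return 0.35 + 0.65 * min((mu - 0.1) / (mu_hi - 0.1), 1.0)

    def split_brake_torque(T_demand: float, T_em_max: float, mu: float):
        """Return (e-machine torque, friction-brake torque) in N*m for the driving axle."""
        if mu <= 0.1:
            # Friction brakes on the driving axle stay released at very low adhesion
            return min(T_demand, T_em_max), 0.0
        T_em = min(T_demand, emachine_share(mu) * T_em_max)
        return T_em, T_demand - T_em

    print(split_brake_torque(T_demand=150.0, T_em_max=300.0, mu=0.05))  # (150.0, 0.0)
    print(split_brake_torque(T_demand=800.0, T_em_max=300.0, mu=0.6))   # e-machine plus friction
    ```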

  19. Development of a sterilizing in-place application for a production machine using Vaporized Hydrogen Peroxide.

    PubMed

    Mau, T; Hartmann, V; Burmeister, J; Langguth, P; Häusler, H

    2004-01-01

    The use of steam in sterilization processes is limited by the implementation of heat-sensitive components inside the machines to be sterilized. Alternative low-temperature sterilization methods need to be found and their suitability evaluated. Vaporized Hydrogen Peroxide (VHP) technology was adapted for a production machine consisting of highly sensitive pressure sensors and thermo-labile air tube systems. This new kind of "cold" surface sterilization, known from the Barrier Isolator Technology, is based on the controlled release of hydrogen peroxide vapour into sealed enclosures. A mobile VHP generator was used to generate the hydrogen peroxide vapour. The unit was combined with the air conduction system of the production machine. Terminal vacuum pumps were installed to distribute the gas within the production machine and for its elimination. In order to control the sterilization process, different physical process monitors were incorporated. The validation of the process was based on biological indicators (Geobacillus stearothermophilus). The Limited Spearman Karber Method (LSKM) was used to statistically evaluate the sterilization process. The results show that it is possible to sterilize surfaces in a complex tube system with the use of gaseous hydrogen peroxide. A total microbial reduction of 6 log units was reached.

  20. An easy-to-use calculating machine to simulate steady state and non-steady-state preparative separations by multiple dual mode counter-current chromatography with semi-continuous loading of feed mixtures.

    PubMed

    Kostanyan, Artak E; Shishilov, Oleg N

    2018-06-01

    Multiple dual mode counter-current chromatography (MDM CCC) separation processes with semi-continuous large sample loading consist of a succession of two counter-current steps: with "x" phase (first step) and "y" phase (second step) flow periods. A feed mixture dissolved in the "x" phase is continuously loaded into a CCC machine at the beginning of the first step of each cycle over a constant time with the volumetric rate equal to the flow rate of the pure "x" phase. An easy-to-use calculating machine is developed to simulate the chromatograms and the amounts of solutes eluted with the phases at each cycle for steady-state (the duration of the flow periods of the phases is kept constant for all the cycles) and non-steady-state (with variable duration of alternating phase elution steps) separations. Using the calculating machine, the separation of mixtures containing up to five components can be simulated and designed. Examples of the application of the calculating machine for the simulation of MDM CCC processes are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Laser machining of explosives

    DOEpatents

    Perry, Michael D.; Stuart, Brent C.; Banks, Paul S.; Myers, Booth R.; Sefcik, Joseph A.

    2000-01-01

    The invention consists of a method for machining (cutting, drilling, sculpting) of explosives (e.g., TNT, TATB, PETN, RDX, etc.). By using pulses of a duration in the range of 5 femtoseconds to 50 picoseconds, extremely precise and rapid machining can be achieved with essentially no heat or shock affected zone. In this method, material is removed by a nonthermal mechanism. A combination of multiphoton and collisional ionization creates a critical density plasma in a time scale much shorter than electron kinetic energy is transferred to the lattice. The resulting plasma is far from thermal equilibrium. The material is in essence converted from its initial solid-state directly into a fully ionized plasma on a time scale too short for thermal equilibrium to be established with the lattice. As a result, there is negligible heat conduction beyond the region removed resulting in negligible thermal stress or shock to the material beyond a few microns from the laser machined surface. Hydrodynamic expansion of the plasma eliminates the need for any ancillary techniques to remove material and produces extremely high quality machined surfaces. There is no detonation or deflagration of the explosive in the process and the material which is removed is rendered inert.

  2. Detection of inter-turn short-circuit at start-up of induction machine based on torque analysis

    NASA Astrophysics Data System (ADS)

    Pietrowski, Wojciech; Górny, Konrad

    2017-12-01

    Interest in new diagnostic methods for induction machines has grown recently. The research presented in the paper addresses the diagnostics of an induction machine based on torque pulsation under an inter-turn short-circuit during machine start-up. Three numerical techniques were used: finite element analysis, signal analysis, and artificial neural networks (ANN). The elaborated numerical model of the faulty machine consists of field, circuit, and motion equations. A voltage-excited supply allowed the torque waveform during start-up to be determined. The inter-turn short-circuit was treated as a galvanic connection between two points of the stator winding. The waveforms were calculated for different numbers of shorted turns, from 0 to 55. Because the waveforms are non-stationary, a wavelet packet decomposition was used to analyse the torque. The results of this analysis were used as the input vector for the ANN, whose response was the number of shorted turns in the stator winding. Special attention was paid to comparing the responses of a general regression neural network (GRNN) and a multi-layer perceptron (MLP). Based on the results of the research, the efficiency of the developed algorithm can be inferred.
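    The processing chain described above (wavelet packet decomposition of the torque waveform, sub-band features fed to a neural network that estimates the number of shorted turns) can be sketched with PyWavelets and scikit-learn as below. The synthetic torque signals, the wavelet choice, and the network size are illustrative assumptions; a GRNN is not readily available in scikit-learn, so only an MLP regressor is shown.

    ```python
    # Wavelet-packet sub-band energies of a torque waveform -> MLP estimate of shorted turns.
    import numpy as np
    import pywt
    from sklearn.neural_network import MLPRegressor

    def wp_energies(signal, wavelet="db4", level=3):
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        return np.array([np.sum(np.square(node.data)) for node in wp.get_level(level, order="freq")])

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 1024)
    X, y = [], []
    for n_short in range(0, 56, 5):                       # simulated fault severities (turns)
        torque = np.sin(2 * np.pi * 50 * t) + 0.02 * n_short * np.sin(2 * np.pi * 100 * t)
        torque += 0.05 * rng.normal(size=t.size)          # measurement noise
        X.append(wp_energies(torque))
        y.append(n_short)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(X, y)
    print(np.round(model.predict(X[:3]), 1))              # rough check on the training data
    ```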

  3. The machine-readable Durchmusterungen - Classical catalogs in contemporary form. [for positional astronomy and identification of stars

    NASA Technical Reports Server (NTRS)

    Warren, Wayne H., Jr.; Ochsenbein, Francois; Rappaport, Barry N.

    1990-01-01

    The entire series of Durchmusterung (DM) catalogs (Bonner, Southern, Cordoba, Cape Photographic) has been computerized through a collaborative effort among institutions and individuals in France and the United States of America. Complete verification of the data, both manually and by computer, the inclusion of all supplemental stars (represented by lower case letters), complete representation of all numerical data, and a consistent format for all catalogs, should make this collection of machine-readable data a valuable addition to digitized astronomical archives.

  4. The semantics of Chemical Markup Language (CML): dictionaries and conventions.

    PubMed

    Murray-Rust, Peter; Townsend, Joe A; Adams, Sam E; Phadungsukanan, Weerapong; Thomas, Jens

    2011-10-14

    The semantic architecture of CML consists of conventions, dictionaries and units. The conventions conform to a top-level specification and each convention can constrain compliant documents through machine-processing (validation). Dictionaries conform to a dictionary specification which also imposes machine validation on the dictionaries. Each dictionary can also be used to validate data in a CML document, and provide human-readable descriptions. An additional set of conventions and dictionaries are used to support scientific units. All conventions, dictionaries and dictionary elements are identifiable and addressable through unique URIs.

  5. The semantics of Chemical Markup Language (CML): dictionaries and conventions

    PubMed Central

    2011-01-01

    The semantic architecture of CML consists of conventions, dictionaries and units. The conventions conform to a top-level specification and each convention can constrain compliant documents through machine-processing (validation). Dictionaries conform to a dictionary specification which also imposes machine validation on the dictionaries. Each dictionary can also be used to validate data in a CML document, and provide human-readable descriptions. An additional set of conventions and dictionaries are used to support scientific units. All conventions, dictionaries and dictionary elements are identifiable and addressable through unique URIs. PMID:21999509

  6. Development of an After-Sales Support Inter-Enterprise Collaboration System Using Information Technologies

    NASA Astrophysics Data System (ADS)

    Kimura, Toshiaki; Kasai, Fumio; Kamio, Yoichi; Kanda, Yuichi

    This research paper discusses a manufacturing support system that provides not only maintenance services but also consulting services for manufacturing systems consisting of multi-vendor machine tools. To do this, the system enables inter-enterprise collaboration between engineering companies and machine tool vendors. The system is called the "After-Sales Support Inter-enterprise collaboration System using information Technologies" (ASSIST). This paper describes the concept behind the planned ASSIST, the development of a prototype of the system, and the results of its test operation.

  7. The Atwood machine revisited using smartphones

    NASA Astrophysics Data System (ADS)

    Monteiro, Martín; Stari, Cecilia; Cabeza, Cecilia; Marti, Arturo C.

    2015-09-01

    The Atwood machine is a simple device used for centuries to demonstrate Newton's second law. It consists of two supports containing different masses joined by a string. Here we propose an experiment in which a smartphone is fixed to one support. With the aid of the built-in accelerometer of the smartphone, the vertical acceleration is registered. By redistributing the masses of the supports, a linear relationship between the mass difference and the vertical acceleration is obtained. In this experiment, the use of a smartphone contributes to enhance a classical demonstration.
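    For reference, the linear relationship mentioned above follows from the standard idealized analysis (massless, inextensible string over a frictionless, massless pulley); because redistributing mass between the supports keeps the total m_1 + m_2 fixed, the measured acceleration is proportional to the mass difference:

    ```latex
    \begin{align}
      m_1 g - T &= m_1 a, \\
      T - m_2 g &= m_2 a, \\
      a &= \frac{m_1 - m_2}{m_1 + m_2}\, g \;\propto\; (m_1 - m_2)
          \quad \text{for fixed total mass } m_1 + m_2 .
    \end{align}
    ```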

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    none,

    The Fusion Energy Science Advisory Committee was asked to conduct a review of the Fusion Materials Research Program (the Structural Materials portion of the Fusion Program) by Dr. Martha Krebs, Director of Energy Research for the Department of Energy. This request was motivated by the fact that significant changes have been made in the overall direction of the Fusion Program, from one primarily focused on the milestones necessary to the construction of successively larger machines to one where the necessary scientific basis for an attractive fusion energy system is better understood. It was in this context that the review of current scientific excellence and recommendations for future goals and balance within the Program was requested.

  9. Advances in sputtered and ion plated solid film lubrication

    NASA Technical Reports Server (NTRS)

    Spalvins, T.

    1985-01-01

    Glow discharge, or ion-assisted, vacuum deposition techniques, primarily sputtering and ion plating, have rapidly emerged and offer great potential for depositing solid lubricants. The increased energizing of these deposition processes leads to improved adherence and coherence, favorable morphological growth, higher density, and reduced residual stresses in the film. These techniques are of particular importance where the tribo-components of high-precision machines require very thin, uniform lubricating films (on the order of 0.2 μm) that do not interfere with component tolerances. The performance of sputtered MoS2 films and ion-plated Au and Pb films is described in terms of film thickness, coefficient of friction, and wear life.

  10. Fundamental research in artificial intelligence at NASA

    NASA Technical Reports Server (NTRS)

    Friedland, Peter

    1990-01-01

    This paper describes basic research at NASA in the field of artificial intelligence. The work is conducted at the Ames Research Center and the Jet Propulsion Laboratory, primarily under the auspices of the NASA-wide Artificial Intelligence Program in the Office of Aeronautics, Exploration and Technology. The research is aimed at solving long-term NASA problems in missions operations, spacecraft autonomy, preservation of corporate knowledge about NASA missions and vehicles, and management/analysis of scientific and engineering data. From a scientific point of view, the research is broken into the categories of: planning and scheduling; machine learning; and design of and reasoning about large-scale physical systems.

  11. In-process fault detection for textile fabric production: onloom imaging

    NASA Astrophysics Data System (ADS)

    Neumann, Florian; Holtermann, Timm; Schneider, Dorian; Kulczycki, Ashley; Gries, Thomas; Aach, Til

    2011-05-01

    Constant and traceable high fabric quality is of high importance both for technical and for high-quality conventional fabrics. Usually, quality inspection is carried out by trained personnel, whose detection rate and maximum period of concentration are limited. Low-resolution automated fabric inspection machines using texture analysis were developed. Since 2003, systems for in-process inspection on weaving machines ("onloom") have been commercially available. With these, defects can be detected, but not measured quantitatively with precision. Most systems are also prone to inevitable machine vibrations. Feedback loops for fault prevention are not established. Technology has evolved since 2003: camera and computer prices dropped, resolutions were enhanced, and recording speeds increased. These are the preconditions for real-time processing of high-resolution images. So far, these new technological achievements are not used in textile fabric production. For efficient use, a measurement system must be integrated into the weaving process, and new algorithms for defect detection and measurement must be developed. The goal of the joint project is the development of a modern machine vision system for nondestructive onloom fabric inspection. The system consists of a vibration-resistant machine integration, a high-resolution machine vision system, and new, reliable, and robust algorithms with a quality database for defect documentation. The system is meant to detect, measure, and classify at least 80 % of economically relevant defects. Concepts for feedback loops into the weaving process will be pointed out.

  12. Revision of Import and Export Requirements for Controlled Substances, Listed Chemicals, and Tableting and Encapsulating Machines, Including Changes To Implement the International Trade Data System (ITDS); Revision of Reporting Requirements for Domestic Transactions in Listed Chemicals and Tableting and Encapsulating Machines; and Technical Amendments. Final rule.

    PubMed

    2016-12-30

    The Drug Enforcement Administration is updating its regulations for the import and export of tableting and encapsulating machines, controlled substances, and listed chemicals, and its regulations relating to reports required for domestic transactions in listed chemicals, gamma-hydroxybutyric acid, and tableting and encapsulating machines. In accordance with Executive Order 13563, the Drug Enforcement Administration has reviewed its import and export regulations and reporting requirements for domestic transactions in listed chemicals (and gamma-hydroxybutyric acid) and tableting and encapsulating machines, and evaluated them for clarity, consistency, continued accuracy, and effectiveness. The amendments clarify certain policies and reflect current procedures and technological advancements. The amendments also allow for the implementation, as applicable to tableting and encapsulating machines, controlled substances, and listed chemicals, of the President's Executive Order 13659 on streamlining the export/import process and requiring the government-wide utilization of the International Trade Data System (ITDS). This rule additionally contains amendments that implement recent changes to the Controlled Substances Import and Export Act (CSIEA) for reexportation of controlled substances among members of the European Economic Area made by the Improving Regulatory Transparency for New Medical Therapies Act. The rule also includes additional substantive and technical and stylistic amendments.

  13. A Review of Current Machine Learning Methods Used for Cancer Recurrence Modeling and Prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemphill, Geralyn M.

    Cancer has been characterized as a heterogeneous disease consisting of many different subtypes. The early diagnosis and prognosis of a cancer type has become a necessity in cancer research. A major challenge in cancer management is the classification of patients into appropriate risk groups for better treatment and follow-up. Such risk assessment is critically important in order to optimize the patient’s health and the use of medical resources, as well as to avoid cancer recurrence. This paper focuses on the application of machine learning methods for predicting the likelihood of a recurrence of cancer. It is not meant to be an extensive review of the literature on the subject of machine learning techniques for cancer recurrence modeling. Other recent papers have performed such a review, and I will rely heavily on the results and outcomes from these papers. The electronic databases that were used for this review include PubMed, Google, and Google Scholar. Query terms used include “cancer recurrence modeling”, “cancer recurrence and machine learning”, “cancer recurrence modeling and machine learning”, and “machine learning for cancer recurrence and prediction”. The most recent and most applicable papers to the topic of this review have been included in the references. It also includes a list of modeling and classification methods to predict cancer recurrence.

  14. Application of target costing in machining

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Bhaskaran; Kokatnur, Ameet; Gupta, Deepak P.

    2004-11-01

    In today's intensely competitive and highly volatile business environment, consistent development of low cost and high quality products meeting the functionality requirements is a key to a company's survival. Companies continuously strive to reduce the costs while still producing quality products to stay ahead in the competition. Many companies have turned to target costing to achieve this objective. Target costing is a structured approach to determine the cost at which a proposed product, meeting the quality and functionality requirements, must be produced in order to generate the desired profits. It subtracts the desired profit margin from the company's selling price to establish the manufacturing cost of the product. Extensive literature review revealed that companies in automotive, electronic and process industries have reaped the benefits of target costing. However target costing approach has not been applied in the machining industry, but other techniques based on Geometric Programming, Goal Programming, and Lagrange Multiplier have been proposed for application in this industry. These models follow a forward approach, by first selecting a set of machining parameters, and then determining the machining cost. Hence in this study we have developed an algorithm to apply the concepts of target costing, which is a backward approach that selects the machining parameters based on the required machining costs, and is therefore more suitable for practical applications in process improvement and cost reduction. A target costing model was developed for turning operation and was successfully validated using practical data.
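    The backward, target-costing logic described above can be illustrated for turning: start from the allowable machining cost (selling price less the required margin and other costs) and search for cutting parameters whose unit cost does not exceed it. The classical cost model below (machining time plus tool-change and tool costs via Taylor's tool-life equation) and every number in it are assumptions for illustration, not the authors' model or data.

    ```python
    # Backward selection of cutting speed from a target machining cost (turning).
    import numpy as np

    D, L, f = 50.0, 120.0, 0.25               # workpiece dia (mm), length (mm), feed (mm/rev)
    c0, t_change, tool_cost = 1.2, 2.0, 4.0   # machine rate ($/min), tool-change time (min), $/edge
    C_taylor, n_taylor = 400.0, 0.25          # Taylor tool-life constants: v * T**n = C

    def unit_cost(v):                          # v: cutting speed (m/min)
        t_m = np.pi * D * L / (1000.0 * v * f)        # machining time per part (min)
        T = (C_taylor / v) ** (1.0 / n_taylor)        # tool life (min)
        return c0 * t_m + (c0 * t_change + tool_cost) * (t_m / T)

    selling_price, margin, other_costs = 25.0, 0.30, 16.5
    target_machining_cost = selling_price * (1.0 - margin) - other_costs   # backward step

    speeds = np.linspace(60.0, 350.0, 300)
    costs = np.array([unit_cost(v) for v in speeds])
    feasible = speeds[costs <= target_machining_cost]
    print(f"target machining cost: {target_machining_cost:.2f} $/part")
    print("feasible speed range (m/min):",
          (round(feasible.min(), 1), round(feasible.max(), 1)) if feasible.size else "none")
    ```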

  15. Core Muscle Activity, Exercise Preference, and Perceived Exertion during Core Exercise with Elastic Resistance versus Machine.

    PubMed

    Vinstrup, Jonas; Sundstrup, Emil; Brandt, Mikkel; Jakobsen, Markus D; Calatayud, Joaquin; Andersen, Lars L

    2015-01-01

    Objectives. To investigate core muscle activity, exercise preferences, and perceived exertion during two selected core exercises performed with elastic resistance versus a conventional training machine. Methods. 17 untrained men aged 26-67 years participated in surface electromyography (EMG) measurements of five core muscles during torso-twists performed from left to right with elastic resistance and in the machine, respectively. The order of the exercises was randomized and each exercise consisted of 3 repetitions performed at a 10 RM load. EMG amplitude was normalized (nEMG) to maximum voluntary isometric contraction (MVC). Results. A higher right erector spinae activity in the elastic exercise compared with the machine exercise (50% [95% CI 36-64] versus 32% [95% CI 18-46] nEMG) was found. By contrast, the machine exercise, compared with the elastic exercise, showed higher left external oblique activity (77% [95% CI 64-90] versus 54% [95% CI 40-67] nEMG). For the rectus abdominis, right external oblique, and left erector spinae muscles there were no significant differences. Furthermore, 76% preferred the torso-twist with elastic resistance over the machine exercise. Perceived exertion (Borg CR10) was not significantly different between machine (5.8 [95% CI 4.88-6.72]) and elastic exercise (5.7 [95% CI 4.81-6.59]). Conclusion. Torso-twists using elastic resistance showed higher activity of the erector spinae, whereas torso-twist in the machine resulted in higher activity of the external oblique. For the remaining core muscles the two training modalities induced similar muscular activation. In spite of similar perceived exertion the majority of the participants preferred the exercise using elastic resistance.

  16. Predicting hydrofacies and hydraulic conductivity from direct-push data using a data-driven relevance vector machine approach: Motivations, algorithms, and application

    NASA Astrophysics Data System (ADS)

    Paradis, Daniel; Lefebvre, René; Gloaguen, Erwan; Rivera, Alfonso

    2015-01-01

    The spatial heterogeneity of hydraulic conductivity (K) exerts a major control on groundwater flow and solute transport. The heterogeneous spatial distribution of K can be imaged using indirect geophysical data as long as reliable relations exist to link geophysical data to K. This paper presents a nonparametric learning machine approach to predict aquifer K from cone penetrometer tests (CPT) coupled with a soil moisture and resistivity probe (SMR) using relevance vector machines (RVMs). The learning machine approach is demonstrated with an application to a heterogeneous unconsolidated littoral aquifer in a 12 km2 subwatershed, where relations between K and multiparameters CPT/SMR soundings appear complex. Our approach involved fuzzy clustering to define hydrofacies (HF) on the basis of CPT/SMR and K data prior to the training of RVMs for HFs recognition and K prediction on the basis of CPT/SMR data alone. The learning machine was built from a colocated training data set representative of the study area that includes K data from slug tests and CPT/SMR data up-scaled at a common vertical resolution of 15 cm with K data. After training, the predictive capabilities of the learning machine were assessed through cross validation with data withheld from the training data set and with K data from flowmeter tests not used during the training process. Results show that HF and K predictions from the learning machine are consistent with hydraulic tests. The combined use of CPT/SMR data and RVM-based learning machine proved to be powerful and efficient for the characterization of high-resolution K heterogeneity for unconsolidated aquifers.

  17. Multidimensional Interactive Radiology Report and Analysis: standardization of workflow and reporting for renal mass tracking and quantification

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Ma, Kevin; Yepes, Fernando; Nadamuni, Mridula; Nayyar, Megha; Liu, Brent; Duddalwar, Vinay; Lepore, Natasha

    2015-12-01

    A conventional radiology report primarily consists of a large amount of unstructured text, and lacks clear, concise, consistent and content-rich information. Hence, an area of unmet clinical need consists of developing better ways to communicate radiology findings and information specific to each patient. Here, we design a new workflow and reporting system that combines and integrates advances in engineering technology with those from the medical sciences, the Multidimensional Interactive Radiology Report and Analysis (MIRRA). Until recently, clinical standards have primarily relied on 2D images for the purpose of measurement, but with the advent of 3D processing, many of the manually measured metrics can be automated, leading to better reproducibility and less subjective measurement placement. Hence, we make use this newly available 3D processing in our workflow. Our pipeline is used here to standardize the labeling, tracking, and quantifying of metrics for renal masses.

  18. Monocular depth perception using image processing and machine learning

    NASA Astrophysics Data System (ADS)

    Hombali, Apoorv; Gorde, Vaibhav; Deshpande, Abhishek

    2011-10-01

    This paper primarily exploits some of the more obscure, but inherent properties of camera and image to propose a simpler and more efficient way of perceiving depth. The proposed method involves the use of a single stationary camera at an unknown perspective and an unknown height to determine depth of an object on unknown terrain. In achieving so a direct correlation between a pixel in an image and the corresponding location in real space has to be formulated. First, a calibration step is undertaken whereby the equation of the plane visible in the field of view is calculated along with the relative distance between camera and plane by using a set of derived spatial geometrical relations coupled with a few intrinsic properties of the system. The depth of an unknown object is then perceived by first extracting the object under observation using a series of image processing steps followed by exploiting the aforementioned mapping of pixel and real space coordinate. The performance of the algorithm is greatly enhanced by the introduction of reinforced learning making the system independent of hardware and environment. Furthermore the depth calculation function is modified with a supervised learning algorithm giving consistent improvement in results. Thus, the system uses the experience in past and optimizes the current run successively. Using the above procedure a series of experiments and trials are carried out to prove the concept and its efficacy.
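    The core geometric step, mapping a pixel to a point on a calibrated ground plane, can be sketched as follows under a pinhole-camera assumption: back-project the pixel to a viewing ray and intersect it with the plane. The intrinsic matrix and plane parameters below are invented placeholders, and the paper's learning-based refinements are not reproduced.

    ```python
    import numpy as np

    # Assumed pinhole intrinsics (fx, fy, cx, cy) and a calibrated ground plane
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    plane_n = np.array([0.0, -1.0, 0.2])          # plane normal in camera coordinates
    plane_n = plane_n / np.linalg.norm(plane_n)
    plane_d = 1.5                                 # points X on the plane satisfy n . X = d

    def pixel_to_ground(u, v):
        """Intersect the viewing ray of pixel (u, v) with the ground plane."""
        ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction through the camera centre
        t = plane_d / (plane_n @ ray)                    # scale so the point lies on the plane
        point = t * ray                                  # 3D point in camera coordinates
        return point, np.linalg.norm(point)              # location and its depth (range)

    point, depth = pixel_to_ground(400, 300)
    print(point, depth)
    ```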

  19. Assessing ADHD symptoms in children and adults: evaluating the role of objective measures.

    PubMed

    Emser, Theresa S; Johnston, Blair A; Steele, J Douglas; Kooij, Sandra; Thorell, Lisa; Christiansen, Hanna

    2018-05-18

    Diagnostic guidelines recommend using a variety of methods to assess and diagnose ADHD. Applying subjective measures always incorporates risks such as informant biases or large differences between ratings obtained from diverse sources. Furthermore, it has been demonstrated that ratings and tests seem to assess somewhat different constructs. The use of objective measures might thus yield valuable information for diagnosing ADHD. This study aims at evaluating the role of objective measures when trying to distinguish between individuals with ADHD and controls. Our sample consisted of children (n = 60) and adults (n = 76) diagnosed with ADHD and matched controls who completed self- and observer ratings as well as objective tasks. Diagnosis was primarily based on clinical interviews. A popular pattern recognition approach, support vector machines, was used to predict the diagnosis. We observed relatively high accuracies of 79% (adults) and 78% (children) when applying solely objective measures. Predicting an ADHD diagnosis using both subjective and objective measures exceeded the accuracy of objective measures for both adults (89.5%) and children (86.7%), with the subjective variables proving to be the most relevant. We argue that objective measures are more robust against rater bias and errors inherent in subjective measures and may be more replicable. Considering the high accuracy we found in our study for objective measures alone, we think that they should be incorporated in diagnostic procedures for assessing ADHD.
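    A minimal sketch of the pattern-recognition step, a support vector machine scored by cross-validation on objective features alone versus objective plus subjective features, is given below. The synthetic data, feature counts, and SVM settings are assumptions for illustration; on random data the accuracies will hover near chance.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    n = 136                                  # e.g. ADHD cases plus matched controls
    X_obj = rng.normal(size=(n, 8))          # objective task measures (stand-ins)
    X_subj = rng.normal(size=(n, 12))        # self-/observer-rating scores (stand-ins)
    y = rng.integers(0, 2, size=n)           # 1 = ADHD diagnosis, 0 = control

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

    acc_objective = cross_val_score(clf, X_obj, y, cv=5).mean()
    acc_combined = cross_val_score(clf, np.hstack([X_obj, X_subj]), y, cv=5).mean()
    print(f"objective only: {acc_objective:.2f}, objective + subjective: {acc_combined:.2f}")
    ```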

  20. A hybrid MLP-CNN classifier for very fine resolution remotely sensed image classification

    NASA Astrophysics Data System (ADS)

    Zhang, Ce; Pan, Xin; Li, Huapeng; Gardiner, Andy; Sargent, Isabel; Hare, Jonathon; Atkinson, Peter M.

    2018-06-01

    The contextual-based convolutional neural network (CNN) with deep architecture and pixel-based multilayer perceptron (MLP) with shallow structure are well-recognized neural network algorithms, representing the state-of-the-art deep learning method and the classical non-parametric machine learning approach, respectively. The two algorithms, which have very different behaviours, were integrated in a concise and effective way using a rule-based decision fusion approach for the classification of very fine spatial resolution (VFSR) remotely sensed imagery. The decision fusion rules, designed primarily based on the classification confidence of the CNN, reflect the generally complementary patterns of the individual classifiers. In consequence, the proposed ensemble classifier MLP-CNN harvests the complementary results acquired from the CNN based on deep spatial feature representation and from the MLP based on spectral discrimination. Meanwhile, limitations of the CNN due to the adoption of convolutional filters such as the uncertainty in object boundary partition and loss of useful fine spatial resolution detail were compensated. The effectiveness of the ensemble MLP-CNN classifier was tested in both urban and rural areas using aerial photography together with an additional satellite sensor dataset. The MLP-CNN classifier achieved promising performance, consistently outperforming the pixel-based MLP, spectral and textural-based MLP, and the contextual-based CNN in terms of classification accuracy. This research paves the way to effectively address the complicated problem of VFSR image classification.
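    A minimal sketch of confidence-based decision fusion of this kind is shown below: where the CNN's top softmax probability is high its label is kept, otherwise the MLP label is used. The threshold, class count, and probability arrays are illustrative assumptions, not the paper's exact rule set.

    ```python
    import numpy as np

    def fuse_predictions(cnn_probs, mlp_probs, confidence_threshold=0.8):
        """Rule-based fusion of per-pixel class probabilities.

        cnn_probs, mlp_probs: arrays of shape (n_pixels, n_classes).
        Keeps the CNN label where its top softmax probability is high,
        and falls back to the MLP label elsewhere.
        """
        cnn_labels = cnn_probs.argmax(axis=1)
        mlp_labels = mlp_probs.argmax(axis=1)
        cnn_confident = cnn_probs.max(axis=1) >= confidence_threshold
        return np.where(cnn_confident, cnn_labels, mlp_labels)

    # Illustrative use with random per-pixel probabilities over 5 land-cover classes
    rng = np.random.default_rng(0)
    cnn_p = rng.dirichlet(np.ones(5), size=1000)
    mlp_p = rng.dirichlet(np.ones(5), size=1000)
    fused = fuse_predictions(cnn_p, mlp_p)
    print(fused[:10])
    ```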

  1. The design and development of a high-throughput magneto-mechanostimulation device for cartilage tissue engineering.

    PubMed

    Brady, Mariea A; Vaze, Reva; Amin, Harsh D; Overby, Darryl R; Ethier, C Ross

    2014-02-01

    To recapitulate the in vivo environment and create neo-organoids that replace lost or damaged tissue requires the engineering of devices, which provide appropriate biophysical cues. To date, bioreactors for cartilage tissue engineering have focused primarily on biomechanical stimulation. There is a significant need for improved devices for articular cartilage tissue engineering capable of simultaneously applying multiple biophysical (electrokinetic and mechanical) stimuli. We have developed a novel high-throughput magneto-mechanostimulation bioreactor, capable of applying static and time-varying magnetic fields, as well as multiple and independently adjustable mechanical loading regimens. The device consists of an array of 18 individual stations, each of which uses contactless magnetic actuation and has an integrated Hall Effect sensing system, enabling the real-time measurements of applied field, force, and construct thickness, and hence, the indirect measurement of construct mechanical properties. Validation tests showed precise measurements of thickness, within 14 μm of gold standard calliper measurements; further, applied force was measured to be within 0.04 N of desired force over a half hour dynamic loading, which was repeatable over a 3-week test period. Finally, construct material properties measured using the bioreactor were not significantly different (p=0.97) from those measured using a standard materials testing machine. We present a new method for articular cartilage-specific bioreactor design, integrating combinatorial magneto-mechanostimulation, which is very attractive from functional and cost viewpoints.

  2. Wear rate quantifying in real-time using the charged particle surface activation

    NASA Astrophysics Data System (ADS)

    Alexandreanu, B.; Popa-Simil, L.; Voiculescu, D.; Racolta, P. M.

    1997-02-01

    Surface activation, commonly known as Thin Layer Activation (TLA), is currently employed in over 30 accelerator laboratories around the world for wear and/or corrosion monitoring in industrial plants [1-6]. TLA was primarily designed and developed to meet requirements of potential industrial partners, in order to transfer this technique from research to industry. The method consists of accelerated ion bombardment of a surface of interest, e.g., a machine part subjected to wear. Loss of material owing to wear, erosive corrosion or abrasion is characterized by monitoring the resultant changes in radioactivity. In principle, depending upon the case at hand, one may choose to measure either the remnant activity of the component of interest or to monitor the activity of the debris. For applications of the second type, especially when a lubricating agent is involved, dedicated installations have been constructed and adapted to an engine or a tribological testing stand in order to assure oil circulation around an externally placed detection gauge. This way, the wear particles suspended in the lubricant can be detected and the material loss rates quantified in real time. Moreover, in specific cases, such as the one presented in this paper, remnant activity measurements prove to be useful tools for complementary results. This paper provides a detailed presentation of such a case: in situ resistance-to-wear testing of two types of piston rings.

  3. Photovoltaic generating systems in rural schools in Neuquen Province, Argentina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawand, T.A.; Campbell, J.

    1997-12-01

    During the period 1994-95, solar photovoltaic systems were installed at a number of schools in Neuquen Province, Argentina, by the Provincial electric utility, Ente Provincial de Energia del Neuquen. This was undertaken with funds provided by the Inter-American Development Bank. In all, there are 12 schools that have had photovoltaic generating systems installed. These generating systems are designed to provide electricity for the basic needs at the schools: primarily for lighting, and to operate small electrical appliances such as communication radios, televisions, VCRs, AM/FM and short-wave radios. They do not provide enough energy to operate large consumption appliances such as washing machines, microwaves, refrigerators, power tools, etc. The program of provision of PV systems was supplemented with training on simple systems for cooking food or drying fruit, etc. These techniques are primarily intended for demonstration at the schools thus serving an educational role with the hope that they will be transmitted in time to the families of the students where the need is manifested the most.

  4. The need for calcium imaging in nonhuman primates: New motor neuroscience and brain-machine interfaces.

    PubMed

    O'Shea, Daniel J; Trautmann, Eric; Chandrasekaran, Chandramouli; Stavisky, Sergey; Kao, Jonathan C; Sahani, Maneesh; Ryu, Stephen; Deisseroth, Karl; Shenoy, Krishna V

    2017-01-01

    A central goal of neuroscience is to understand how populations of neurons coordinate and cooperate in order to give rise to perception, cognition, and action. Nonhuman primates (NHPs) are an attractive model with which to understand these mechanisms in humans, primarily due to the strong homology of their brains and the cognitively sophisticated behaviors they can be trained to perform. Using electrode recordings, the activity of one to a few hundred individual neurons may be measured electrically, which has enabled many scientific findings and the development of brain-machine interfaces. Despite these successes, electrophysiology samples sparsely from neural populations and provides little information about the genetic identity and spatial micro-organization of recorded neurons. These limitations have spurred the development of all-optical methods for neural circuit interrogation. Fluorescent calcium signals serve as a reporter of neuronal responses, and when combined with post-mortem optical clearing techniques such as CLARITY, provide dense recordings of neuronal populations, spatially organized and annotated with genetic and anatomical information. Here, we advocate that this methodology, which has been of tremendous utility in smaller animal models, can and should be developed for use with NHPs. We review here several of the key opportunities and challenges for calcium-based optical imaging in NHPs. We focus on motor neuroscience and brain-machine interface design as representative domains of opportunity within the larger field of NHP neuroscience. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Function allocation for humans and automation in the context of team dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; John O'Hara; Jacques Hugo

    Within Human Factors Engineering, a decision-making process called function allocation (FA) is used during the design life cycle of complex systems to distribute the system functions, often identified through a functional requirements analysis, to all human and automated machine agents (or teammates) involved in controlling the system. Most FA methods make allocation decisions primarily by comparing the capabilities of humans and automation, but then also by considering secondary factors such as cost, regulations, and the health and safety of workers. The primary analysis of the strengths and weaknesses of humans and machines, however, is almost always considered in terms of individual human or machine capabilities. Yet, FA is fundamentally about teamwork in that the goal of the FA decision-making process is to determine the optimal allocation of functions among agents. Given this framing of FA, and the increasing use and sophistication of automation, there are two related social psychological issues that current FA methods need to address more thoroughly. First, many principles for effective human teamwork are not considered as central decision points or in the iterative hypothesis and testing phase in most FA methods, when it is clear that social factors have numerous positive and negative effects on individual and team capabilities. Second, social psychological factors affecting team performance can be difficult to translate to automated agents, and most FA methods currently do not account for this effect. The implications of these issues are discussed.

  6. 27 CFR 479.32a - Reduced rate of tax for small importers and manufacturers.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... MACHINE GUNS, DESTRUCTIVE DEVICES, AND CERTAIN OTHER FIREARMS Special (Occupational) Taxes § 479.32a... 50% control over a group consisting of corporations and one, or more, partnerships and/or sole...

  7. 27 CFR 479.32a - Reduced rate of tax for small importers and manufacturers.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... MACHINE GUNS, DESTRUCTIVE DEVICES, AND CERTAIN OTHER FIREARMS Special (Occupational) Taxes § 479.32a... 50% control over a group consisting of corporations and one, or more, partnerships and/or sole...

  8. 27 CFR 479.32a - Reduced rate of tax for small importers and manufacturers.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... MACHINE GUNS, DESTRUCTIVE DEVICES, AND CERTAIN OTHER FIREARMS Special (Occupational) Taxes § 479.32a... 50% control over a group consisting of corporations and one, or more, partnerships and/or sole...

  9. 27 CFR 479.32a - Reduced rate of tax for small importers and manufacturers.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... MACHINE GUNS, DESTRUCTIVE DEVICES, AND CERTAIN OTHER FIREARMS Special (Occupational) Taxes § 479.32a... 50% control over a group consisting of corporations and one, or more, partnerships and/or sole...

  10. A Simplified Design with a Toothed Belt and Non-Circular Pulleys to Separate Parts from a Magazine File

    NASA Astrophysics Data System (ADS)

    Hanke, U.; Modler, K.-H.; Neumann, R.; Fischer, C.

    The objective of this paper is to simplify a very complex guidance mechanism currently used for lid separation in a packaging machine. The task of this machine is to pick up a lid from a magazine file, rotate it through 180° and place it on tins. The developed mechanism works successfully but with a very complex construction. It consists of a planetary cam mechanism, combined with a toothed gear (with a constant transmission ratio) and a guiding mechanism with a toothed belt and circular pulleys. Such complex constructions are very common in industrial solutions. The authors' aim is to show a much simpler design that solves the same problem. They developed a guidance mechanism realizing the same function, consisting only of a toothed belt with non-circular pulleys. The parts used are standard, commercially available components.

  11. Machine-learning in astronomy

    NASA Astrophysics Data System (ADS)

    Hobson, Michael; Graff, Philip; Feroz, Farhan; Lasenby, Anthony

    2014-05-01

    Machine-learning methods may be used to perform many tasks required in the analysis of astronomical data, including: data description and interpretation, pattern recognition, prediction, classification, compression, inference and many more. An intuitive and well-established approach to machine learning is the use of artificial neural networks (NNs), which consist of a group of interconnected nodes, each of which processes information that it receives and then passes this product on to other nodes via weighted connections. In particular, I discuss the first public release of the generic neural network training algorithm, called SkyNet, and demonstrate its application to astronomical problems focusing on its use in the BAMBI package for accelerated Bayesian inference in cosmology, and the identification of gamma-ray bursters. The SkyNet and BAMBI packages, which are fully parallelised using MPI, are available at http://www.mrao.cam.ac.uk/software/.

  12. Study of Environmental Data Complexity using Extreme Learning Machine

    NASA Astrophysics Data System (ADS)

    Leuenberger, Michael; Kanevski, Mikhail

    2017-04-01

    The main goals of environmental data science using machine learning algorithms deal, in a broad sense, with the calibration, prediction, and visualization of hidden relationships between input and output variables. In order to optimize the models and to understand the phenomenon under study, the characterization of the complexity (at different levels) should be taken into account. Therefore, the identification of the linear or non-linear behavior between input and output variables adds valuable information for understanding the phenomenon's complexity. The present research highlights and investigates the different issues that can occur when identifying the complexity (linear/non-linear) of environmental data using machine learning algorithms. In particular, the main attention is paid to the description of a self-consistent methodology for the use of Extreme Learning Machines (ELM, Huang et al., 2006), which have recently gained great popularity. By applying two ELM models (with linear and non-linear activation functions) and comparing their efficiency, the degree of linearity can be quantified. The considered approach is accompanied by case studies on simulated and real high-dimensional, multivariate data. In conclusion, the current challenges and future developments in complexity quantification using environmental data mining are discussed. References - Huang, G.-B., Zhu, Q.-Y., Siew, C.-K., 2006. Extreme learning machine: theory and applications. Neurocomputing 70 (1-3), 489-501. - Kanevski, M., Pozdnoukhov, A., Timonin, V., 2009. Machine Learning for Spatial Environmental Data. EPFL Press; Lausanne, Switzerland, p.392. - Leuenberger, M., Kanevski, M., 2015. Extreme Learning Machines for spatial environmental data. Computers and Geosciences 85, 64-73.
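    A minimal Extreme Learning Machine sketch in the spirit of Huang et al. (2006) is shown below: hidden-layer weights are drawn at random and only the output weights are solved by least squares. Fitting one model with a non-linear (tanh) hidden layer and one with a purely linear mapping, and comparing their errors, illustrates the kind of linearity check described above; the data, sizes, and activation choices are placeholders.

    ```python
    import numpy as np

    def fit_elm(X, y, n_hidden=50, activation=np.tanh, seed=0):
        """Train an ELM: random hidden layer, output weights solved by least squares."""
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = activation(X @ W + b)
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return W, b, beta

    def predict_elm(X, W, b, beta, activation=np.tanh):
        return activation(X @ W + b) @ beta

    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(400, 3))
    y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=400)   # mildly non-linear target

    W, b, beta = fit_elm(X, y, activation=np.tanh)                 # non-linear ELM
    Wl, bl, betal = fit_elm(X, y, activation=lambda z: z)          # linear counterpart
    err_nl = np.mean((predict_elm(X, W, b, beta) - y) ** 2)
    err_lin = np.mean((predict_elm(X, Wl, bl, betal, activation=lambda z: z) - y) ** 2)
    print(f"non-linear MSE {err_nl:.4f} vs linear MSE {err_lin:.4f}")
    ```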

  13. Predicting knee replacement damage in a simulator machine using a computational model with a consistent wear factor.

    PubMed

    Zhao, Dong; Sakoda, Hideyuki; Sawyer, W Gregory; Banks, Scott A; Fregly, Benjamin J

    2008-02-01

    Wear of ultrahigh molecular weight polyethylene remains a primary factor limiting the longevity of total knee replacements (TKRs). However, wear testing on a simulator machine is time consuming and expensive, making it impractical for iterative design purposes. The objectives of this paper were first, to evaluate whether a computational model using a wear factor consistent with the TKR material pair can predict accurate TKR damage measured in a simulator machine, and second, to investigate how choice of surface evolution method (fixed or variable step) and material model (linear or nonlinear) affect the prediction. An iterative computational damage model was constructed for a commercial knee implant in an AMTI simulator machine. The damage model combined a dynamic contact model with a surface evolution model to predict how wear plus creep progressively alter tibial insert geometry over multiple simulations. The computational framework was validated by predicting wear in a cylinder-on-plate system for which an analytical solution was derived. The implant damage model was evaluated for 5 million cycles of simulated gait using damage measurements made on the same implant in an AMTI machine. Using a pin-on-plate wear factor for the same material pair as the implant, the model predicted tibial insert wear volume to within 2% error and damage depths and areas to within 18% and 10% error, respectively. Choice of material model had little influence, while inclusion of surface evolution affected damage depth and area but not wear volume predictions. Surface evolution method was important only during the initial cycles, where variable step was needed to capture rapid geometry changes due to the creep. Overall, our results indicate that accurate TKR damage predictions can be made with a computational model using a constant wear factor obtained from pin-on-plate tests for the same material pair, and furthermore, that surface evolution method matters only during the initial "break in" period of the simulation.
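    The constant-wear-factor idea can be illustrated with Archard's law, in which the per-cycle linear wear at a contact point is the wear factor times the contact pressure times the sliding distance, accumulated over cycles. The sketch below is only a schematic of that relation; the wear factor, pressures, and sliding distances are invented placeholders, and the creep and surface-evolution steps of the actual damage model are omitted.

    ```python
    import numpy as np

    # Assumed (placeholder) wear factor for the material pair, in mm^3/(N*mm)
    WEAR_FACTOR = 1.0e-9

    def archard_depth_increment(pressure_mpa, sliding_mm, k=WEAR_FACTOR):
        """Linear wear depth increment per cycle (mm): h = k * p * s.

        pressure_mpa is contact pressure in MPa (= N/mm^2) and sliding_mm is the
        sliding distance per cycle in mm, so k * p * s has units of mm.
        """
        return k * pressure_mpa * sliding_mm

    # Placeholder contact history at three insert surface nodes for one gait cycle
    pressure = np.array([8.0, 12.0, 5.0])   # MPa
    sliding = np.array([6.0, 4.0, 8.0])     # mm of sliding per cycle

    cycles = 5_000_000
    depth = archard_depth_increment(pressure, sliding) * cycles
    print(depth)   # accumulated linear wear depth (mm) at each node
    ```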

  14. In vivo and in vitro performance of a China-made hemodialysis machine: a multi-center prospective controlled study.

    PubMed

    Wang, Yong; Chen, Xiang-Mei; Cai, Guang-Yan; Li, Wen-Ge; Zhang, Ai-Hua; Hao, Li-Rong; Shi, Ming; Wang, Rong; Jiang, Hong-Li; Luo, Hui-Min; Zhang, Dong; Sun, Xue-Feng

    2017-08-02

    To evaluate the in vivo and in vitro performance of a China-made dialysis machine (SWS-4000). This was a multi-center prospective controlled study consisting of both long-term in vitro evaluations and cross-over in vivo tests in 132 patients. The China-made SWS-4000 dialysis machine was compared with a German-made dialysis machine (Fresenius 4008) with regard to Kt/V values, URR values, and dialysis-related adverse reactions in patients on maintenance hemodialysis, as well as the ultrafiltration rate, the concentration of electrolytes in the proportioned dialysate, the rate of heparin injection, the flow rate of the blood pump, and the rate of malfunction. The Kt/V and URR values at the 1st and 4th weeks of dialysis as well as the incidence of adverse effects did not differ between the two groups in cross-over in vivo tests (P > 0.05). There were no significant differences between the two groups in the error values of the ultrafiltration rate, the rate of heparin injection or the concentrations of electrolytes in the proportioned dialysate at different time points under different parameter settings. At weeks 2 and 24, with the flow rate of the blood pump set at 300 mL/min, the actual error of the SWS-4000 dialysis machine was significantly higher than that of the Fresenius 4008 dialysis machine (P < 0.05), but there was no significant difference at other time points or under other settings (P > 0.05). The malfunction rate was higher in the SWS-4000 group than in the Fresenius 4008 group (P < 0.05). The in vivo performance of the SWS-4000 dialysis machine is roughly comparable to that of the Fresenius 4008 dialysis machine; however, the malfunction rate of the former is higher than that of the latter in in vitro tests. The stability and long-term accuracy of the SWS-4000 dialysis machine remain to be improved.
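    The two adequacy indices compared in the study can be computed from pre- and post-dialysis urea values; the sketch below uses the standard URR definition and the widely used second-generation Daugirdas single-pool Kt/V estimate. The numerical inputs are invented, and this is not the trial's actual calculation code.

    ```python
    import math

    def urr(pre_urea, post_urea):
        """Urea reduction ratio (%)."""
        return 100.0 * (pre_urea - post_urea) / pre_urea

    def sp_kt_v(pre_urea, post_urea, session_hours, ultrafiltration_l, post_weight_kg):
        """Second-generation Daugirdas single-pool Kt/V estimate."""
        r = post_urea / pre_urea
        return (-math.log(r - 0.008 * session_hours)
                + (4 - 3.5 * r) * ultrafiltration_l / post_weight_kg)

    # Illustrative (invented) values for one session
    print(f"URR  = {urr(25.0, 8.0):.1f} %")
    print(f"Kt/V = {sp_kt_v(25.0, 8.0, session_hours=4, ultrafiltration_l=2.5, post_weight_kg=62):.2f}")
    ```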

  15. Characterizing the SEMG patterns with myofascial pain using a multi-scale wavelet model through machine learning approaches.

    PubMed

    Lin, Yu-Ching; Yu, Nan-Ying; Jiang, Ching-Fen; Chang, Shao-Hsia

    2018-06-02

    In this paper, we introduce a newly developed multi-scale wavelet model for the interpretation of surface electromyography (SEMG) signals and validate the model's capability to characterize changes in neuromuscular activation in cases with myofascial pain syndrome (MPS) via machine learning methods. The SEMG data collected from normal (N = 30; 27 women, 3 men) and MPS subjects (N = 26; 22 women, 4 men) were adopted for this retrospective analysis. SEMG signals were measured from the taut-band loci on both sides of the trapezius muscle on the upper back while each subject performed a cyclic bilateral backward shoulder extension movement within 1 min. Classification accuracy of the SEMG model to differentiate MPS patients from normal subjects was 77% using template matching and 60% using K-means clustering. Classification consistency between the two machine learning methods was 87% in the normal group and 93% in the MPS group. The 2D feature graphs derived from the proposed multi-scale model revealed distinct patterns between normal subjects and MPS patients. The classification consistency using template matching and K-means clustering suggests the potential of using the proposed model to characterize interference pattern changes induced by MPS. Copyright © 2018. Published by Elsevier Ltd.

  16. Comparaison de méthodes d'identification des paramètres d'une machine asynchrone

    NASA Astrophysics Data System (ADS)

    Bellaaj-Mrabet, N.; Jelassi, K.

    1998-07-01

    Interest in genetic algorithms (GAs) is expanding rapidly. This paper first applies a GA to the identification of induction motor parameters. We then compare its performance with classical methods such as maximum likelihood and conventional electrotechnical methods. These methods are applied to three induction motors of different power ratings, and the results are compared against a set of criteria. Genetic algorithms are adaptive methods increasingly used to solve certain optimization problems. The present work consists, on the one hand, of applying a GA to parameter identification problems for electrical machines and, on the other hand, of comparing its performance with classical methods such as the maximum likelihood method and the electrotechnical method based on no-load and short-circuit tests. These methods are applied to induction machines of different power ratings. The results obtained are compared according to a set of criteria, allowing conclusions to be drawn on the validity and performance of each method.

  17. IRB Process Improvements: A Machine Learning Analysis.

    PubMed

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.

  18. A design of the u-health monitoring system using a Nintendo DS game machine.

    PubMed

    Lee, Sangjoon; Kim, Jinkwon; Kim, Jungkuk; Lee, Myoungho

    2009-01-01

    In this paper, we used a handheld Nintendo DS game machine to build a u-health monitoring system. The system consists of four parts: a biosignal acquisition device, a wireless sensor network device, a wireless base station for connecting to the internet, and display units, namely a personal computer and a Nintendo DS game machine. The biosignal acquisition device can acquire 7 channels of data: 3 channels of ECG (electrocardiogram), plus 3-axis accelerometer and tilt sensor data. The acquired data reach the internet through the wireless sensor network and the base station. In the experiment, we displayed the biosignals simultaneously on a personal computer monitor and on the LCD of a Nintendo DS over a wireless internet protocol, with the monitoring devices placed at one side of an office building. The results show that the proposed system can transmit a patient's biosignal data effectively over long periods and long distances. The proposed u-health monitoring system could be operated in ambulances, general hospitals, and geriatric institutions as a u-health monitoring device.

  19. Construction and Design of a full size sTGC prototype for the ATLAS New Small Wheel upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    For the forthcoming Phase-I upgrade to the LHC (2018/19), the first station of the ATLAS muon end-cap system, the Small Wheel, will need to be replaced. The New Small Wheel (NSW) will have to operate in a high background radiation region while reconstructing muon tracks with high precision as well as furnishing information for the Level-1 trigger. In particular, the precision reconstruction of tracks requires a spatial resolution of about 100 μm, and the Level-1 trigger track segments have to be reconstructed with an angular resolution of approximately 1 mrad. The NSW will have two chamber technologies: one primarily devoted to the Level-1 trigger function, the small-strip Thin Gap Chambers (sTGC), and one dedicated to precision tracking, the Micromegas detectors (MM). Each single sTGC plane of a quadruplet consists of an anode layer of 50 μm gold-plated tungsten wire sandwiched between two resistive cathode layers. Behind one of the resistive cathode layers, a PCB with precisely machined strips (hence the name sTGC) spaced every 3.2 mm allows a position resolution ranging from 70 to 150 μm to be achieved, depending on the incident particle angle. Behind the second cathode, a PCB containing an arrangement of pads allows for a fast coincidence between successive sTGC layers to tag the passage of a track and to read out only the corresponding strips for triggering. To profit from the high accuracy of each of the sTGC planes for trigger purposes, the relative geometrical position of the planes has to be controlled to within about 40 μm in parallelism and, due to the various incident angles, to within about 80 μm in the relative distance between the planes, so as to achieve the overall angular resolution of 1 mrad. The needed accuracy in the position and parallelism of the strips is achieved by machining brass inserts together with the strip patterns into the cathode boards in a single step. The inserts can then be used as external references on a granite table. Precision methods are used to maintain high accuracy when combining four single detector gaps first into two doublets and then into a quadruplet. We will present results on the ongoing construction of full size (∼1 x 1 m) sTGC quadruplet prototypes before full construction starts in 2015. (authors)

  20. Index to selected machine-readable geohydrologic data for Precambrian through Cretaceous rocks in Kansas

    USGS Publications Warehouse

    Spinazola, J.M.; Hansen, C.V.; Underwood, E.J.; Kenny, J.F.; Wolf, R.J.

    1987-01-01

    Machine-readable geohydrologic data for Precambrian through Cretaceous rocks in Kansas were compiled as part of the USGS Central Midwest Regional Aquifer System Analysis. The geohydrologic data include log, water quality, water level, hydraulics, and water use information. The log data consist of depths to the top of selected geologic formations determined from about 275 sites with geophysical logs and formation lithologies from about 190 sites with lithologic logs. The water quality data consist of about 10,800 analyses, of which about 1,200 are proprietary. The water level data consist of about 4,480 measured water levels and about 4,175 equivalent freshwater hydraulic heads, of which about 3,745 are proprietary. The hydraulics data consist of results from about 30 specific capacity tests and about 20 aquifer tests, and interpretations of about 285 drill stem tests (of which about 60 are proprietary) and about 75 core-sample analyses. The water use data consist of estimates of freshwater withdrawals from Precambrian through Cretaceous geohydrologic units for each of the 105 counties in Kansas. Average yearly withdrawals were estimated for each decade from 1940 to 1980. All the log and water use data and the nonproprietary parts of the water quality, water level, and hydraulics data are available on magnetic tape from the USGS office in Lawrence, Kansas. (Author's abstract)

  1. Artificial Neural Network Based Fault Diagnostics of Rolling Element Bearings Using Time-Domain Features

    NASA Astrophysics Data System (ADS)

    Samanta, B.; Al-Balushi, K. R.

    2003-03-01

    A procedure is presented for fault diagnosis of rolling element bearings through artificial neural network (ANN). The characteristic features of time-domain vibration signals of the rotating machinery with normal and defective bearings have been used as inputs to the ANN consisting of input, hidden and output layers. The features are obtained from direct processing of the signal segments using very simple preprocessing. The input layer consists of five nodes, one each for root mean square, variance, skewness, kurtosis and normalised sixth central moment of the time-domain vibration signals. The inputs are normalised in the range of 0.0 and 1.0 except for the skewness which is normalised between -1.0 and 1.0. The output layer consists of two binary nodes indicating the status of the machine—normal or defective bearings. Two hidden layers with different number of neurons have been used. The ANN is trained using backpropagation algorithm with a subset of the experimental data for known machine conditions. The ANN is tested using the remaining set of data. The effects of some preprocessing techniques like high-pass, band-pass filtration, envelope detection (demodulation) and wavelet transform of the vibration signals, prior to feature extraction, are also studied. The results show the effectiveness of the ANN in diagnosis of the machine condition. The proposed procedure requires only a few features extracted from the measured vibration data either directly or with simple preprocessing. The reduced number of inputs leads to faster training requiring far less iterations making the procedure suitable for on-line condition monitoring and diagnostics of machines.
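    A minimal sketch of the five time-domain features used as ANN inputs (RMS, variance, skewness, kurtosis, and normalized sixth central moment) is given below for a single vibration segment. Normalizing the sixth central moment by the sixth power of the standard deviation is an assumption made here by analogy with kurtosis, and the signal itself is synthetic.

    ```python
    import numpy as np
    from scipy.stats import skew, kurtosis

    def time_domain_features(x):
        """RMS, variance, skewness, kurtosis and normalized 6th central moment of a segment."""
        x = np.asarray(x, dtype=float)
        sigma = x.std()
        m6 = np.mean((x - x.mean()) ** 6) / sigma ** 6   # 6th central moment / sigma^6 (assumed normalization)
        return np.array([
            np.sqrt(np.mean(x ** 2)),        # root mean square
            x.var(),                         # variance
            skew(x),                         # skewness
            kurtosis(x, fisher=False),       # kurtosis
            m6,
        ])

    # Illustrative vibration segment (white-noise stand-in for an accelerometer signal)
    rng = np.random.default_rng(0)
    segment = rng.normal(0.0, 0.2, 2048)
    print(time_domain_features(segment))
    ```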

  2. Study on intelligent processing system of man-machine interactive garment frame model

    NASA Astrophysics Data System (ADS)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

    A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor device, voice processing module, mechanical parts and data centralized acquisition devices. The sensor device is used to collect information on the environment changes brought by the body near the clothes frame model, the data collection device is used to collect the information of the environment change induced by the sensor device, voice processing module is used for speech recognition of nonspecific person to achieve human-machine interaction, mechanical moving parts are used to make corresponding mechanical responses to the information processed by data collection device.it is connected with data acquisition device by a means of one-way connection. There is a one-way connection between sensor device and data collection device, two-way connection between data acquisition device and voice processing module. The data collection device is one-way connection with mechanical movement parts. The intelligent processing system can judge whether it needs to interact with the customer, realize the man-machine interaction instead of the current rigid frame model.

  3. Open Architecture Data System for NASA Langley Combined Loads Test System

    NASA Technical Reports Server (NTRS)

    Lightfoot, Michael C.; Ambur, Damodar R.

    1998-01-01

    The Combined Loads Test System (COLTS) is a new structures test complex that is being developed at NASA Langley Research Center (LaRC) to test large curved panels and cylindrical shell structures. These structural components are representative of aircraft fuselage sections of subsonic and supersonic transport aircraft and cryogenic tank structures of reusable launch vehicles. Test structures are subjected to combined loading conditions that simulate realistic flight load conditions. The facility consists of two pressure-box test machines and one combined loads test machine. Each test machine possesses a unique set of requirements or research data acquisition and real-time data display. Given the complex nature of the mechanical and thermal loads to be applied to the various research test articles, each data system has been designed with connectivity attributes that support both data acquisition and data management functions. This paper addresses the research driven data acquisition requirements for each test machine and demonstrates how an open architecture data system design not only meets those needs but provides robust data sharing between data systems including the various control systems which apply spectra of mechanical and thermal loading profiles.

  4. Airline policies for passengers with obstructive sleep apnoea who require in-flight continuous positive airways pressure.

    PubMed

    Walker, Jacqueline; Kelly, Paul T; Beckert, Lutz

    2010-04-01

    The aim of this study was to investigate the current policies of Australian and New Zealand airlines on the use of in-flight CPAP by passengers with OSA. A survey was conducted of 53 commercial airlines servicing international routes. Information was obtained from airline call centres and websites. The policies, approval schemes and costs associated with in-flight use of CPAP were documented for individual airlines. Of the 53 airlines contacted, 28 (53%) were able to support passengers requiring in-flight CPAP. All these airlines required passengers to bring their own machines, and allowed the use of battery-operated machines. Six airlines (21%) allowed passengers to plug their machines into the aircraft power supply. The majority of airlines (19, 68%) did not charge passengers for the use of CPAP, while 9 (32%) were unsure of their charging policies. Many airlines only permitted certain models of CPAP machine or battery types. Many airlines are unaware of CPAP. Those who are, have relatively consistent policies concerning the use of in-flight CPAP.

  5. Design of a Versatile, Teleoperable, Towable Lifting Machine with Robotic Capabilities for Use in Nasa's Lunar Base Operations

    NASA Technical Reports Server (NTRS)

    Harris, Elizabeth; Ogle, James; Schoppe, Dean

    1989-01-01

    The lifting machine will assist in lifting cargo off landers sent to the Moon and in the construction of a lunar base. Three possible designs were considered for the overall configuration of the lifting machine: the variable angle crane, the tower crane, and the gantry crane. Alternate designs were developed for the major components of the lifting machine. A teleoperable, variable angle crane was chosen as the final design. The design consists of a telescoping boom mounted to a chassis that is supported by two conical wheels for towing and four outriggers for stability. Attached to the end of the boom is a seven degree of freedom robot arm for light, dexterous, lifting operations. A cable and hook are suspended from the end of the boom for heavy, gross, lifting operations. Approximate structural sizes were determined for the lifter and its components. However, further analysis is needed to determine the optimum design dimensions. The design team also constructed a model of the design which demonstrates its features and operating principles.

  6. A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting

    NASA Astrophysics Data System (ADS)

    Kim, T.; Joo, K.; Seo, J.; Heo, J. H.

    2016-12-01

    Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach in time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied, with the advantage of discovering relevant features in nonlinear relations among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, an ensemble of decision trees using a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 for the Chungju dam in South Korea were used for modeling and forecasting. To evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied. The root mean squared error and mean absolute error of the two models were compared.
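    A minimal sketch of such a comparison on a synthetic monthly series is shown below: a seasonal ARIMA model fitted with statsmodels versus a Random Forest trained on lagged values, both scored by RMSE and MAE for one-step-ahead forecasts. The model orders, lag count, split, and data are illustrative assumptions, not the study's configuration.

    ```python
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error, mean_absolute_error

    rng = np.random.default_rng(0)
    months = np.arange(360)
    y = 50 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 360)   # synthetic monthly inflow
    train, test = y[:348], y[348:]

    # Stochastic model: seasonal ARIMA
    sarima = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)
    sarima_fc = sarima.forecast(steps=len(test))

    # Machine-learning model: Random Forest on 12 lagged values
    def lag_matrix(series, n_lags=12):
        X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
        return X, series[n_lags:]

    X_train, y_train = lag_matrix(train)
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

    history = list(train)
    rf_fc = []
    for t in range(len(test)):   # one-step-ahead forecasting with observed values fed back in
        rf_fc.append(rf.predict(np.array(history[-12:]).reshape(1, -1))[0])
        history.append(test[t])

    for name, fc in [("SARIMA", sarima_fc), ("RandomForest", np.array(rf_fc))]:
        rmse = np.sqrt(mean_squared_error(test, fc))
        mae = mean_absolute_error(test, fc)
        print(f"{name}: RMSE={rmse:.2f}, MAE={mae:.2f}")
    ```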

  7. Investigation of Combined Motor/Magnetic Bearings for Flywheel Energy Storage Systems

    NASA Technical Reports Server (NTRS)

    Hofmann, Heath

    2003-01-01

    Dr. Hofmann's work in the summer of 2003 consisted of two separate projects. In the first part of the summer, Dr. Hofmann prepared and collected information regarding rotor losses in synchronous machines; in particular, machines with low rotor losses operating in vacuum and supported by magnetic bearings, such as the motor/generator for flywheel energy storage systems. This work culminated in a presentation at NASA Glenn Research Center on this topic. In the second part, Dr. Hofmann investigated an approach to flywheel energy storage where the phases of the flywheel motor/generator are connected in parallel with the phases of an induction machine driving a mechanical actuator. With this approach, additional power electronics for driving the flywheel unit are not required. Simulations of the connection of a flywheel energy storage system to a model of an electromechanical actuator testbed at NASA Glenn were performed that validated the proposed approach. A proof-of-concept experiment using the D1 flywheel unit at NASA Glenn and a Sundstrand induction machine connected to a dynamometer was successfully conducted.

  8. Knowledge-based vision and simple visual machines.

    PubMed Central

    Cliff, D; Noble, J

    1997-01-01

    The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684

  9. Display-And-Alarm Circuit For Accelerometer

    NASA Technical Reports Server (NTRS)

    Bozeman, Richard J., Jr.

    1995-01-01

    Compact accelerometer assembly consists of commercial accelerometer retrofit with display-and-alarm circuit. Provides simple means for technician attending machine to monitor vibrations. Also simplifies automatic safety shutdown by providing local alarm or shutdown signal when vibration exceeds preset level.

  10. OT calibration and service maintenance manual.

    DOT National Transportation Integrated Search

    2012-01-01

    The machine conditions, as well as the values of the calibration and control parameters, may determine the quality of each test result obtained. In order to keep consistency and accuracy, the conditions, performance and measurements of an OT must be...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, Wayne; Borders, Tammie

    INL successfully developed a proof of concept for "Software Defined Anything" by emulating the laboratory's business applications that run on Virtual Machines. The work INL conducted demonstrates to industry how this methodology can be used to improve security, automate and repeat processes, and improve consistency.

  12. An fMRI and effective connectivity study investigating miss errors during advice utilization from human and machine agents.

    PubMed

    Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; de Visser, Ewart; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank

    2017-10-01

    As society becomes more reliant on machines and automation, understanding how people utilize advice is a necessary endeavor. Our objective was to reveal the underlying neural associations during advice utilization from expert human and machine agents with fMRI and multivariate Granger causality analysis. During an X-ray luggage-screening task, participants accepted or rejected good or bad advice from either the human or machine agent framed as experts with manipulated reliability (high miss rate). We showed that the machine-agent group decreased their advice utilization compared to the human-agent group and these differences in behaviors during advice utilization could be accounted for by high expectations of reliable advice and changes in attention allocation due to miss errors. Brain areas involved with the salience and mentalizing networks, as well as sensory processing involved with attention, were recruited during the task and the advice utilization network consisted of attentional modulation of sensory information with the lingual gyrus as the driver during the decision phase and the fusiform gyrus as the driver during the feedback phase. Our findings expand on the existing literature by showing that misses degrade advice utilization, which is represented in a neural network involving salience detection and self-processing with perceptual integration.

  13. Field precision machining technology of target chamber in ICF lasers

    NASA Astrophysics Data System (ADS)

    Xu, Yuanli; Wu, Wenkai; Shi, Sucun; Duan, Lin; Chen, Gang; Wang, Baoxu; Song, Yugang; Liu, Huilin; Zhu, Mingzhi

    2016-10-01

    In ICF lasers, many independent laser beams are required to be positioned on target with a very high degree of accuracy during a shot. The target chamber provides a precision platform and datum reference for the final optics assembly and the target collimation and location system. The target chamber consists of a shell with welded flanges, a reinforced concrete pedestal, and a lateral support structure. The field precision machining technology for the target chamber in ICF lasers has been developed based on ShenGuangIII (SGIII). The same center of the target chamber is adopted in the process of design, fabrication, and alignment. The technologies of beam collimation and datum reference transformation are developed for the fabrication, positioning, and adjustment of the target chamber. A supporting and rotating mechanism and a special drilling machine are developed to bore the port holes. An adjustment mechanism is designed to accurately position the target chamber. In order to ensure the collimation requirements of beam leading and focusing and of target positioning, custom-machined spacers are used to accurately correct the alignment error of the ports. Finally, this paper describes the chamber center, orientation, and centering alignment error measurements of SGIII. The measurements show that the field precision machining of the SGIII target chamber meets its design requirements. This information can be applied to similar systems.

  14. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    PubMed

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role to improve the productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. Presentation of the proposed model application is provided by a numerical example based on the collection of data by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating the machine tool in the uncertain environment.
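    For orientation, the sketch below shows the crisp AHP step that the fuzzy linguistic preference relation is designed to replace: deriving attribute weights from a pairwise comparison matrix via its principal eigenvector and checking the consistency ratio. The matrix values are invented, and the fuzzy extension and the COPRAS ranking stage are not reproduced.

    ```python
    import numpy as np

    # Invented pairwise comparison matrix for 3 machine-tool attributes (Saaty 1-9 scale)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # Priority weights from the principal eigenvector
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio (CR) using Saaty's random index for n = 3
    n = A.shape[0]
    lambda_max = eigvals.real[k]
    ci = (lambda_max - n) / (n - 1)
    cr = ci / 0.58          # RI = 0.58 for a 3x3 matrix
    print("weights:", np.round(weights, 3), " CR:", round(cr, 3))
    ```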

  15. Assessment of genetic and nongenetic interactions for the prediction of depressive symptomatology: an analysis of the Wisconsin Longitudinal Study using machine learning algorithms.

    PubMed

    Roetker, Nicholas S; Page, C David; Yonker, James A; Chang, Vicky; Roan, Carol L; Herd, Pamela; Hauser, Taissa S; Hauser, Robert M; Atwood, Craig S

    2013-10-01

    We examined depression within a multidimensional framework consisting of genetic, environmental, and sociobehavioral factors and, using machine learning algorithms, explored interactions among these factors that might better explain the etiology of depressive symptoms. We measured current depressive symptoms using the Center for Epidemiologic Studies Depression Scale (n = 6378 participants in the Wisconsin Longitudinal Study). Genetic factors were 78 single nucleotide polymorphisms (SNPs); environmental factors were 13 stressful life events (SLEs) plus a composite proportion-of-SLEs index; and sociobehavioral factors were 18 personality, intelligence, and other health or behavioral measures. We performed traditional SNP associations via logistic regression likelihood ratio testing and explored interactions with support vector machines and Bayesian networks. After correction for multiple testing, we found no significant single genotypic associations with depressive symptoms. Machine learning algorithms showed no evidence of interactions. Naïve Bayes produced the best models in both subsets and included only environmental and sociobehavioral factors. We found no single or interactive associations with genetic factors and depressive symptoms. Various environmental and sociobehavioral factors were more predictive of depressive symptoms, yet their impacts were independent of one another. A genome-wide analysis of genetic alterations using machine learning methodologies will provide a framework for identifying genetic-environmental-sociobehavioral interactions in depressive symptoms.

  16. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
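    A minimal Gaussian process regression sketch of the motion-synthesis idea, mapping independent variables (for example relative load and squat depth) to a joint-angle trajectory, is given below using scikit-learn as a stand-in for the paper's probabilistic synthesis algorithm. All inputs, kernel settings, and trajectory shapes are invented.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    # Independent variables for recorded squat trials: [relative load, squat depth (deg of knee flexion)]
    X = rng.uniform([0.2, 60.0], [0.8, 120.0], size=(40, 2))
    t = np.linspace(0.0, 1.0, 50)                      # normalized squat cycle time
    # Invented knee-flexion trajectories that depend on depth and load
    Y = (X[:, 1:2] * np.sin(np.pi * t)[None, :]
         + 5.0 * X[:, 0:1] + rng.normal(0, 1.0, (40, 50)))

    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=[0.2, 20.0]) + WhiteKernel(1.0),
                                   normalize_y=True).fit(X, Y)

    # Synthesize a new squat motion for given load / depth values
    angles = gpr.predict(np.array([[0.5, 100.0]]))
    print(angles.shape)   # one trajectory of 50 time samples
    ```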

  17. The environmental impact of the Glostavent® anesthetic machine.

    PubMed

    Eltringham, Roger J; Neighbour, Robert C

    2015-06-01

    Because anesthetic machines have become more complex and more expensive, they have become less suitable for use in the many isolated hospitals in the poorest countries in the world. In these situations, they are frequently unable to function at all because of interruptions in the supply of oxygen or electricity and the absence of skilled technicians for maintenance and servicing. Despite these disadvantages, these machines are still delivered in large numbers, thereby expending precious resources without any benefit to patients. The Glostavent was introduced primarily to enable an anesthetic service to be delivered in these difficult circumstances. It is smaller and less complex than standard anesthetic machines and much less expensive to produce. It combines a drawover anesthetic system with an oxygen concentrator and a gas-driven ventilator. It greatly reduces the need for the purchase and transport of cylinders of compressed gases, reduces the impact on the environment, and enables considerable savings. Cylinder oxygen is expensive to produce and difficult to transport over long distances on poor roads. Consequently, the supply may run out. However, when using the Glostavent, oxygen is normally produced at a fraction of the cost of cylinders by the oxygen concentrator, which is an integral part of the Glostavent. This enables great savings in the purchase and transport cost of oxygen cylinders. If the electricity fails and the oxygen concentrator ceases to function, oxygen from a reserve cylinder automatically provides the pressure to drive the ventilator and oxygen for the breathing circuit. Consequently, economy is achieved because the ventilator has been designed to minimize the amount of driving gas required to one-seventh of the patient's tidal volume. Additional economies are achieved by completely eliminating spillage of oxygen from the breathing system and by recycling the driving gas into the breathing system to increase the Fraction of Inspired Oxygen (FIO2) at no extra cost. Savings also are accrued when using the drawover breathing system as the need for nitrous oxide, compressed air, and soda lime are eliminated. The Glostavent enables the administration of safe anesthesia to be continued when standard machines are unable to function and can do so with minimal harm to the environment.

  18. Predicting Survival From Large Echocardiography and Electronic Health Record Datasets: Optimization With Machine Learning.

    PubMed

    Samad, Manar D; Ulloa, Alvaro; Wehner, Gregory J; Jing, Linyuan; Hartzel, Dustin; Good, Christopher W; Williams, Brent A; Haggerty, Christopher M; Fornwalt, Brandon K

    2018-06-09

    The goal of this study was to use machine learning to more accurately predict survival after echocardiography. Predicting patient outcomes (e.g., survival) following echocardiography is primarily based on ejection fraction (EF) and comorbidities. However, there may be significant predictive information within additional echocardiography-derived measurements combined with clinical electronic health record data. Mortality was studied in 171,510 unselected patients who underwent 331,317 echocardiograms in a large regional health system. We investigated the predictive performance of nonlinear machine learning models compared with that of linear logistic regression models using 3 different inputs: 1) clinical variables, including 90 cardiovascular-relevant International Classification of Diseases, Tenth Revision, codes, and age, sex, height, weight, heart rate, blood pressures, low-density lipoprotein, high-density lipoprotein, and smoking; 2) clinical variables plus physician-reported EF; and 3) clinical variables and EF, plus 57 additional echocardiographic measurements. Missing data were imputed using the multivariate imputation by chained equations (MICE) algorithm. We compared the models against each other and against baseline clinical scoring systems using the mean area under the curve (AUC) over 10 cross-validation folds and across 10 survival durations (6 to 60 months). Machine learning models achieved significantly higher prediction accuracy (all AUC >0.82) over common clinical risk scores (AUC = 0.61 to 0.79), with the nonlinear random forest models outperforming logistic regression (p < 0.01). The random forest model including all echocardiographic measurements yielded the highest prediction accuracy (p < 0.01 across all models and survival durations). Only 10 variables were needed to achieve 96% of the maximum prediction accuracy, with 6 of these variables being derived from echocardiography. Tricuspid regurgitation velocity was more predictive of survival than LVEF. In a subset of studies with complete data for the top 10 variables, multivariate imputation by chained equations yielded slightly reduced predictive accuracies (difference in AUC of 0.003) compared with the original data. Machine learning can fully utilize large combinations of disparate input variables to predict survival after echocardiography with superior accuracy. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
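
    A minimal sketch of the kind of comparison described, assuming synthetic stand-in data: a nonlinear random forest and a linear logistic regression scored by mean ROC AUC over 10 cross-validation folds with scikit-learn. Feature counts are chosen only to echo the abstract's variable groups.

    ```python
    # Nonlinear random forest vs. linear logistic regression, scored by mean AUC
    # over 10 cross-validation folds on synthetic stand-in data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # ~100 clinical variables + EF + 57 echo measurements, mimicked by 158 synthetic features
    X, y = make_classification(n_samples=2000, n_features=158, n_informative=20,
                               weights=[0.9, 0.1], random_state=0)

    rf = RandomForestClassifier(n_estimators=300, random_state=0)
    lr = LogisticRegression(max_iter=5000)

    auc_rf = cross_val_score(rf, X, y, cv=10, scoring="roc_auc").mean()
    auc_lr = cross_val_score(lr, X, y, cv=10, scoring="roc_auc").mean()
    print(f"random forest AUC={auc_rf:.3f}  logistic regression AUC={auc_lr:.3f}")
    ```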

  19. Centre of mass determination based on an optical weighing machine using fiber Bragg gratings

    NASA Astrophysics Data System (ADS)

    Oliveira, Rui; Roriz, Paulo; Marques, Manuel B.; Frazão, Orlando

    2015-09-01

    The purpose of the present work was to construct a weighing machine based on fiber Bragg gratings (FBGs) for locating the 2D coordinates of the center of gravity (COG) of objects with complex geometry and density distribution. The apparatus consisted of a rigid equilateral triangular platform mounted on three supports at its vertices, two of them having cantilevers instrumented with FBGs. As an example, two femur bone models, one with and one without a hip stem prosthesis, are used to discuss the change in the COG caused by the implantation of the prosthesis.
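
    The underlying statics is a simple force balance: with the platform in equilibrium, the COG coordinates are the reaction-force-weighted average of the support coordinates, x_cog = Σ F_i x_i / Σ F_i (and likewise for y). The snippet below illustrates this with made-up support positions and readings; it is not the calibration used by the authors.

    ```python
    # Centre of gravity as the force-weighted average of the support coordinates.
    # Support positions and reaction forces are made-up numbers.
    import numpy as np

    supports = np.array([[0.0, 0.0],
                         [0.6, 0.0],
                         [0.3, 0.52]])      # platform support points (m)
    reactions = np.array([4.1, 3.2, 2.7])   # vertical reaction forces (N), e.g. inferred from FBG strain

    cog = (reactions[:, None] * supports).sum(axis=0) / reactions.sum()
    print(f"COG at x={cog[0]:.3f} m, y={cog[1]:.3f} m, total weight={reactions.sum():.2f} N")
    ```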

  20. Reliability and quality assurance on the MOD 2 wind system

    NASA Technical Reports Server (NTRS)

    Mason, W. E. B.; Jones, B. G.

    1981-01-01

    The Safety, Reliability, and Quality Assurance (R&QA) approach developed for the largest wind turbine generator, the Mod 2, is described. The R&QA approach assures that the machine is not hazardous to the public or to the operating personnel, is operated unattended on a utility grid, demonstrates reliable operation, and helps establish the quality assurance and maintainability requirements for future wind turbine projects. The significant guideline consisted of a failure modes and effects analysis (FMEA) during the design phase, hardware inspections during parts fabrication, and three simple documents to control activities during machine construction and operation.

  1. Machine learning prediction for classification of outcomes in local minimisation

    NASA Astrophysics Data System (ADS)

    Das, Ritankar; Wales, David J.

    2017-01-01

    Machine learning schemes are employed to predict which local minimum will result from local energy minimisation of random starting configurations for a triatomic cluster. The input data consists of structural information at one or more of the configurations in optimisation sequences that converge to one of four distinct local minima. The ability to make reliable predictions, in terms of the energy or other properties of interest, could save significant computational resources in sampling procedures that involve systematic geometry optimisation. Results are compared for two energy minimisation schemes, and for neural network and quadratic functions of the inputs.

  2. Experience with modified aerospace reliability and quality assurance method for wind turbines

    NASA Technical Reports Server (NTRS)

    Klein, W. E.

    1982-01-01

    The SR&QA approach assures that the machine is not hazardous to the public or operating personnel, can operate unattended on a utility grid, demonstrates reliable operation, and helps establish the quality assurance and maintainability requirements for future wind turbine projects. The approach consisted of modified failure modes and effects analysis (FMEA) during the design phase, minimal hardware inspection during parts fabrication, and three simple documents to control activities during machine construction and operation. Five years' experience shows that this low-cost approach works well enough that it should be considered by others for similar projects.

  3. Tool wear modeling using abductive networks

    NASA Astrophysics Data System (ADS)

    Masory, Oren

    1992-09-01

    A tool wear model based on Abductive Networks, which consists of a network of 'polynomial' nodes, is described. The model relates the cutting parameters, components of the cutting force, and machining time to flank wear. Thus, real-time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.
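
    A minimal sketch of a single 'polynomial' node of the GMDH-style kind such abductive networks use: a quadratic polynomial in two inputs fitted by least squares. The inputs (cutting force component, machining time), the synthetic wear data, and the node form are assumptions for illustration only.

    ```python
    # One quadratic 'polynomial' node fitted by least squares: flank wear as a
    # function of a cutting force component and machining time (synthetic data).
    import numpy as np

    rng = np.random.default_rng(1)
    force = rng.uniform(100, 400, 80)   # cutting force component (N)
    time = rng.uniform(0, 30, 80)       # machining time (min)
    wear = 0.002 * time + 1e-6 * force * time + 0.01 * rng.standard_normal(80)  # flank wear (mm)

    # Node output: w = a0 + a1*F + a2*t + a3*F^2 + a4*t^2 + a5*F*t
    A = np.column_stack([np.ones_like(force), force, time, force**2, time**2, force * time])
    coeffs, *_ = np.linalg.lstsq(A, wear, rcond=None)
    rmse = np.sqrt(np.mean((A @ coeffs - wear) ** 2))
    print("coefficients:", coeffs.round(6), "RMSE (mm):", rmse.round(4))
    ```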

  4. Development of the Occupational Physical Assessment Test (OPAT) for Combat Arms Soldiers

    DTIC Science & Technology

    2015-10-01

    Direct Fire ... 11 Prepare Dismounted TOW Firing Position; 12 Engage Targets with a Caliber .50 M2 Machine Gun; 13 Lay a 120mm Mortar ... (Lift and Hold Round, Place in Tube); 17 Mount M2 .50 Cal Machine Gun Receiver on an Abrams Tank; 18 Stow Ammunition on an Abrams Tank (Load 120mm ... ammo on an Abrams, transfer ammo on a FAASV, and load the main gun). TESTING OVERVIEW: Testing consisted of the two to six criterion tasks and up to

  5. A Combination Therapy of JO-I and Chemotherapy in Ovarian Cancer Models

    DTIC Science & Technology

    2013-10-01

    which consists of a 3PAR storage backend and is sharing data via a highly available NetApp storage gateway and 2 high throughput commodity storage...Environment is configured as self-service Enterprise cloud and currently hosts more than 700 virtual machines. The network infrastructure consists of...technology infrastructure and information system applications designed to integrate, automate, and standardize operations. These systems fuse state of

  6. Oakton Community College Comprehensive Annual Financial Report, Fiscal Year Ended June 30, 1996.

    ERIC Educational Resources Information Center

    Hilquist, David E.

    Consisting primarily of tables, this report provides financial data on Oakton Community College in Illinois for the fiscal year ending on June 30, 1996. This comprehensive annual financial report consists of an introductory section, financial section, statistical section, and special reports section. The introductory section includes a transmittal…

  7. Development of an Electric Motor Powered Low Cost Coconut Deshelling Machine

    NASA Astrophysics Data System (ADS)

    Mondal, Imdadul Hoque; Prasanna Kumar, G. V.

    2016-06-01

    An electric motor powered coconut deshelling machine was developed in line with the commercially available unit, but with slight modifications. The machine worked on the principle that the coconut shell can be made to fail under shear and compressive forces. It consisted of a toothed wheel, a deshelling rod, an electric motor, and a compound chain drive. A bevelled 16-tooth sprocket with 18 mm pitch was used as the toothed wheel. A mild steel round bar of 18 mm diameter was used as the deshelling rod. The sharp-edged tip of the deshelling rod was inserted below the shell to apply shear force on the shell, and the fruit was tilted toward the rotary toothed wheel to apply the compressive force on the shell. The speed of rotation of the toothed wheel was set at 34 ± 2 rpm. The output capacity of the machine was found to be 24 coconuts/h with 95 % of the total time effectively used for deshelling. The labour requirement was found to be 43 man-h/1000 nuts. About 13 % of the kernels got scraped and about 7 % got sliced during the operation. The developed coconut deshelling machine was recommended for a minimum annual use of 200 h, or the deshelling of 4700 coconuts per year. The cost of operation for 200 h of annual use was found to be about ₹47/h. The developed machine was found to be simple, easy to operate, energy efficient, safe, and to reduce the drudgery involved in deshelling by conventional methods.

  8. Advanced Propulsion Power Distribution System for Next Generation Electric/Hybrid Vehicle. Phase 1; Preliminary System Studies

    NASA Technical Reports Server (NTRS)

    Bose, Bimal K.; Kim, Min-Huei

    1995-01-01

    The report summarizes the work performed to satisfy the above project objective. First, different energy storage devices, such as the battery, flywheel, and ultracapacitor, are reviewed and compared, establishing the superiority of the battery. Then, the possible power sources, such as the IC engine, diesel engine, gas turbine, and fuel cell, are reviewed and compared, and the superiority of the IC engine is established. Different types of machines for the drive motor/engine generator, such as the induction machine, PM synchronous machine, and switched reluctance machine, are compared, and the induction machine is established as the superior candidate. A similar comparison was made for power converters and devices. The Insulated Gate Bipolar Transistor (IGBT) appears to be the superior device, although the MOS-Controlled Thyristor (MCT) shows future promise. Different types of candidate distribution systems with the possible combinations of power and energy sources are discussed, and the most viable system, consisting of battery, IC engine, and induction machine, is identified. Then, the HFAC system is compared with the DC system, establishing the superiority of the former. Detailed component sizing calculations for the HFAC and DC systems reinforce the superiority of the former. A preliminary control strategy has been developed for the candidate HFAC system. Finally, a modeling and simulation study was performed to validate the system performance. The study demonstrates the superiority of the HFAC distribution system for the next-generation electric/hybrid vehicle.

  9. A comparison of temporal artery thermometers with internal blood monitors to measure body temperature during hemodialysis.

    PubMed

    Lunney, Meaghan; Tonelli, Bronwyn; Lewis, Rachel; Wiebe, Natasha; Thomas, Chandra; MacRae, Jennifer; Tonelli, Marcello

    2018-06-14

    Thermometers that measure core (internal) body temperature are the gold standard for monitoring temperature. Although most modern hemodialysis machines are equipped with an internal blood monitor that measures core body temperature, current practice is to use peripheral thermometers. A better understanding of how peripheral thermometers compare with the dialysis machine thermometer may help guide practice. The study followed a prospective cross-sectional design. Hemodialysis patients were recruited from 2 sites in Calgary, Alberta (April - June 2017). Body temperatures were obtained from peripheral (temporal artery) and dialysis machine thermometers concurrently. Paired t-tests, Bland-Altman plots, and quantile-quantile plots were used to compare measurements from the two devices and to explore potential factors affecting temperature in hemodialysis patients. The mean body temperature of 94 hemodialysis patients measured using the temporal artery thermometer (36.7 °C) was significantly different from that measured by the dialysis machine thermometer (36.4 °C); p < 0.001. The mean difference (0.27 °C) appeared to be consistent across the range of average temperatures (35.8-37.3 °C). Temperature measured by the temporal artery thermometer was statistically and clinically higher than that measured by the dialysis machine thermometer. Using the dialysis machine to monitor body temperature may result in more accurate readings and is likely to reduce the purchasing and maintenance costs associated with manual temperature readings, as well as easing the workload for dialysis staff.
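
    A small sketch of the paired comparison described (paired t-test plus Bland-Altman bias and limits of agreement), using simulated readings with a built-in offset of about 0.27 °C purely to illustrate the calculation; it is not the study data.

    ```python
    # Paired t-test and Bland-Altman bias / limits of agreement between two
    # thermometers, on simulated readings with a deliberate ~0.27 °C offset.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    dialysis = rng.normal(36.4, 0.3, 94)                    # dialysis machine readings (°C)
    temporal = dialysis + 0.27 + rng.normal(0, 0.15, 94)    # temporal artery readings (°C)

    t_stat, p_value = stats.ttest_rel(temporal, dialysis)
    diff = temporal - dialysis
    bias = diff.mean()
    loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
    print(f"paired t-test p={p_value:.2g}, bias={bias:.2f} °C, "
          f"95% limits of agreement=({loa[0]:.2f}, {loa[1]:.2f}) °C")
    ```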

  10. Factors influencing equipment selection in electron beam processing

    NASA Astrophysics Data System (ADS)

    Barnard, J. W.

    2003-08-01

    During the eighties and nineties accelerator manufacturers dramatically increased the beam power available for high-energy equipment. This effort was directed primarily at meeting the demands of the sterilization industry. During this era, the perception that bigger (higher power, higher energy) was always better prevailed since the operating and capital costs of accelerators did not increase with power and energy as fast as the throughput. High power was needed to maintain per unit costs low for treatment. This philosophy runs counter to certain present-day realities of the sterilization business as well as conditions influencing accelerator selection in other electron beam applications. Recent experience in machine selection is described and factors affecting choice are presented.

  11. National Test Facility civilian agency use of supercomputers not feasible

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-01

    Based on interviews with civilian agencies cited in the House report (DOE, DoEd, HHS, FEMA, NOAA), none would be able to make effective use of NTF's excess supercomputing capabilities. These agencies stated they could not use the resources primarily because (1) NTF's supercomputers are older machines whose performance and costs cannot match those of more advanced computers available from other sources and (2) some agencies have not yet developed applications requiring supercomputer capabilities or do not have funding to support such activities. In addition, future support for the hardware and software at NTF is uncertain, making any investment by an outside user risky.

  12. Pixels, people, perception, pet peeves, and possibilities: a look at displays

    NASA Astrophysics Data System (ADS)

    Task, H. Lee

    2007-04-01

    This year marks the 35th anniversary of the Visually Coupled Systems symposium held at Brooks Air Force Base, San Antonio, Texas in November of 1972. This paper uses the proceedings of the 1972 VCS symposium as a guide to address several topics associated primarily with helmet-mounted displays, systems integration and the human-machine interface. Specific topics addressed include monocular and binocular helmet-mounted displays (HMDs), visor projection HMDs, color HMDs, system integration with aircraft windscreens, visual interface issues and others. In addition, this paper also addresses a few mysteries and irritations (pet peeves) collected over the past 35+ years of experience in the display and display-related areas.

  13. RAPID and HTML5's potential

    NASA Technical Reports Server (NTRS)

    Torosyan, David

    2012-01-01

    Just as important as the engineering that goes into building a robot is the method of interaction, or how human users will use the machine. As part of the Human-System Interactions group (Conductor) at JPL, I explored using a web interface to interact with ATHLETE, a prototype lunar rover. I investigated the usefulness of HTML5 and JavaScript as a telemetry viewer as well as the feasibility of having a rover communicate with a web server. To test my ideas, I built a mobile-compatible website designed primarily for an Android tablet. The website took input from ATHLETE engineers, and upon its completion I conducted a user test to assess its effectiveness.

  14. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  15. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  16. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  17. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  18. IPCS user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGoldrick, P.R.

    1980-12-11

    The Interprocess Communications System (IPCS) was written to provide a virtual machine upon which the Supervisory Control and Diagnostic System (SCDS) for the Mirror Fusion Test Facility (MFTF) could be built. The hardware upon which the IPCS runs consists of nine minicomputers sharing some common memory.

  19. Self-Service Fare Collection on the San Diego Trolley

    DOT National Transportation Integrated Search

    1984-05-01

    The San Diego Trolley (owner by the Metropolitan Transit Development Board) began operations in July 1981 using self-service fare collection (SSFC). Passengers must have proof of payment consisting of a single-ride ticket bought at a vending machine ...

  20. Batch Scheduling for Hybrid Assembly Differentiation Flow Shop to Minimize Total Actual Flow Time

    NASA Astrophysics Data System (ADS)

    Maulidya, R.; Suprayogi; Wangsaputra, R.; Halim, A. H.

    2018-03-01

    A hybrid assembly differentiation flow shop is a three-stage flow shop consisting of Machining, Assembly and Differentiation Stages and producing different types of products. In the machining stage, parts are processed in batches on different (unrelated) machines. In the assembly stage, the different parts are assembled into an assembly product. Finally, the assembled products are further processed into different types of final products in the differentiation stage. In this paper, we develop a batch scheduling model for a hybrid assembly differentiation flow shop to minimize the total actual flow time, defined as the total time parts spend on the shop floor from their arrival times until their due dates. We also propose a heuristic algorithm for solving the problem. The proposed algorithm is tested using a set of hypothetical data. The solution shows that the algorithm can solve the problem effectively.

  1. Robust iterative learning contouring controller with disturbance observer for machine tool feed drives.

    PubMed

    Simba, Kenneth Renny; Bui, Ba Dinh; Msukwa, Mathew Renny; Uchiyama, Naoki

    2018-04-01

    In feed drive systems, particularly machine tools, the contour error is more significant than the individual axial tracking errors from the viewpoint of enhancing precision in manufacturing and production systems. The contour error must be within the permissible tolerance of given products. In machining complex or sharp-corner products, large contour errors occur mainly owing to discontinuous trajectories and the existence of nonlinear uncertainties. Therefore, it is indispensable to design robust controllers that can enhance the tracking ability of feed drive systems. In this study, an iterative learning contouring controller consisting of a classical Proportional-Derivative (PD) controller and a disturbance observer is proposed. The proposed controller was evaluated experimentally by using a typical sharp-corner trajectory, and its performance was compared with that of conventional controllers. The results revealed that the maximum contour error can be reduced by about 37% on average. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
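
    The sketch below illustrates the general idea of an iterative learning controller layered on a PD feedback loop: the feedforward input is updated from trial to trial using the previous trial's tracking error. The first-order plant, the gains, and the P-type update law are illustrative assumptions, not the controller or machine-tool model from the paper.

    ```python
    # P-type iterative learning control on top of a PD feedback loop for a simple
    # first-order plant; all gains and the plant are illustrative assumptions.
    import numpy as np

    dt, T = 0.01, 2.0
    t = np.arange(0, T, dt)
    ref = np.sin(np.pi * t) ** 2           # reference trajectory for one axis

    kp, kd, gamma = 8.0, 0.5, 0.6          # PD gains and ILC learning gain
    a, b = 5.0, 4.0                        # plant: x_dot = -a*x + b*u

    u_ff = np.zeros_like(t)                # feedforward input learned across trials
    for trial in range(20):
        x, prev_e = 0.0, 0.0
        err = np.zeros_like(t)
        for k in range(len(t)):
            e = ref[k] - x
            err[k] = e
            u = u_ff[k] + kp * e + kd * (e - prev_e) / dt   # feedback + learned feedforward
            x += dt * (-a * x + b * u)                      # plant update (Euler step)
            prev_e = e
        u_ff[:-1] += gamma * err[1:]       # learning update: anticipate next trial's error
    print(f"max tracking error after {trial + 1} trials: {np.abs(err).max():.4f}")
    ```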

  2. Simulation of dynamic processes when machining transition surfaces of stepped shafts

    NASA Astrophysics Data System (ADS)

    Maksarov, V. V.; Krasnyy, V. A.; Viushin, R. V.

    2018-03-01

    The paper addresses the characteristics of stepped surfaces of parts categorized as "solids of revolution". It is noted that during transition modes, when the tool switches to machining an end surface, the load intensity over the cross-section of the cut layer varies, which leads to changes in cutting force, the onset of vibrations, an increase in surface layer roughness, a decrease in dimensional precision, and increased wear of the tool's cutting edge. This work proposes a method that consists of developing a CNC program output code that allows one to process complex forms of stepped shafts with only one machine setup. The authors developed and justified a mathematical model of the technological system for mechanical processing, which accounts for the resolution of tool movement during transition processes, to assess the dynamic stability of the system while manufacturing stepped surfaces of "solid of revolution" parts.

  3. Energy landscapes for a machine-learning prediction of patient discharge

    NASA Astrophysics Data System (ADS)

    Das, Ritankar; Wales, David J.

    2016-06-01

    The energy landscapes framework is applied to a configuration space generated by training the parameters of a neural network. In this study the input data consists of time series for a collection of vital signs monitored for hospital patients, and the outcomes are patient discharge or continued hospitalisation. Using machine learning as a predictive diagnostic tool to identify patterns in large quantities of electronic health record data in real time is a very attractive approach for supporting clinical decisions, which have the potential to improve patient outcomes and reduce waiting times for discharge. Here we report some preliminary analysis to show how machine learning might be applied. In particular, we visualize the fitting landscape in terms of locally optimal neural networks and the connections between them in parameter space. We anticipate that these results, and analogues of thermodynamic properties for molecular systems, may help in the future design of improved predictive tools.

  4. Machine vision methods for use in grain variety discrimination and quality analysis

    NASA Astrophysics Data System (ADS)

    Winter, Philip W.; Sokhansanj, Shahab; Wood, Hugh C.

    1996-12-01

    Decreasing cost of computer technology has made it feasible to incorporate machine vision technology into the agriculture industry. The biggest attraction to using a machine vision system is the computer's ability to be completely consistent and objective. One use is in the variety discrimination and quality inspection of grains. Algorithms have been developed using Fourier descriptors and neural networks for use in variety discrimination of barley seeds. RGB and morphology features have been used in the quality analysis of lentils, and probability distribution functions and L,a,b color values for borage dockage testing. These methods have been shown to be very accurate and have a high potential for agriculture. This paper presents the techniques used and results obtained from projects including: a lentil quality discriminator, a barley variety classifier, a borage dockage tester, a popcorn quality analyzer, and a pistachio nut grading system.

  5. Design description of the Schuchuli Village photovoltaic power system

    NASA Technical Reports Server (NTRS)

    Ratajczak, A. F.; Vasicek, R. W.; Delombard, R.

    1981-01-01

    A stand-alone photovoltaic (PV) power system for the village of Schuchuli (Gunsight), Arizona, on the Papago Indian Reservation is a limited-energy, all 120 V (d.c.) system to which loads cannot be arbitrarily added. It consists of a 3.5 kW (peak) PV array, 2380 ampere-hours of battery storage, an electrical equipment building, a 120 V (d.c.) electrical distribution network, and equipment and automatic controls that provide power for pumping water into an existing water system, operating 15 refrigerators, a clothes washing machine, and a sewing machine, and lighting each of the homes and communal buildings. A solar hot water heater supplies hot water for the washing machine and communal laundry. Automatic control systems provide voltage control by limiting the number of PV strings supplying power during system operation and battery charging, and load management that operates high-priority loads at the expense of low-priority loads as the main battery becomes depleted.

  6. Automated discrimination of dementia spectrum disorders using extreme learning machine and structural T1 MRI features.

    PubMed

    Jongin Kim; Boreom Lee

    2017-07-01

    The classification of neuroimaging data for the diagnosis of Alzheimer's Disease (AD) is one of the main research goals of the neuroscience and clinical fields. In this study, we applied an extreme learning machine (ELM) classifier to discriminate AD and mild cognitive impairment (MCI) from normal controls (NC). We compared the performance of ELM with that of a linear kernel support vector machine (SVM) for 718 structural MRI images from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. The data consisted of normal controls, MCI converters (MCI-C), MCI non-converters (MCI-NC), and AD patients. We employed the SVM-based recursive feature elimination (RFE-SVM) algorithm to find the optimal subset of features. In this study, we found that the RFE-SVM feature selection approach in combination with ELM shows superior classification accuracy to that of the linear kernel SVM for structural T1 MRI data.
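
    A minimal sketch of SVM-based recursive feature elimination followed by classification, in the spirit of the pipeline described; here the final classifier is a linear SVM rather than an ELM, and the synthetic features stand in for the ADNI structural MRI measurements.

    ```python
    # SVM-based recursive feature elimination followed by a linear SVM classifier,
    # evaluated with 5-fold cross validation on synthetic stand-in features.
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    X, y = make_classification(n_samples=718, n_features=200, n_informative=15, random_state=0)

    pipeline = make_pipeline(
        RFE(LinearSVC(dual=False, max_iter=10000), n_features_to_select=30, step=10),
        LinearSVC(dual=False, max_iter=10000),
    )
    acc = cross_val_score(pipeline, X, y, cv=5).mean()
    print(f"5-fold accuracy with RFE-selected features: {acc:.3f}")
    ```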

  7. Machine Learning for Treatment Assignment: Improving Individualized Risk Attribution

    PubMed Central

    Weiss, Jeremy; Kuusisto, Finn; Boyd, Kendrick; Liu, Jie; Page, David

    2015-01-01

    Clinical studies model the average treatment effect (ATE), but apply this population-level effect to future individuals. Due to recent developments of machine learning algorithms with useful statistical guarantees, we argue instead for modeling the individualized treatment effect (ITE), which has better applicability to new patients. We compare ATE-estimation using randomized and observational analysis methods against ITE-estimation using machine learning, and describe how the ITE theoretically generalizes to new population distributions, whereas the ATE may not. On a synthetic data set of statin use and myocardial infarction (MI), we show that a learned ITE model improves true ITE estimation and outperforms the ATE. We additionally argue that ITE models should be learned with a consistent, nonparametric algorithm from unweighted examples and show experiments in favor of our argument using our synthetic data model and a real data set of D-penicillamine use for primary biliary cirrhosis. PMID:26958271

  8. Machine Learning for Treatment Assignment: Improving Individualized Risk Attribution.

    PubMed

    Weiss, Jeremy; Kuusisto, Finn; Boyd, Kendrick; Liu, Jie; Page, David

    2015-01-01

    Clinical studies model the average treatment effect (ATE), but apply this population-level effect to future individuals. Due to recent developments of machine learning algorithms with useful statistical guarantees, we argue instead for modeling the individualized treatment effect (ITE), which has better applicability to new patients. We compare ATE-estimation using randomized and observational analysis methods against ITE-estimation using machine learning, and describe how the ITE theoretically generalizes to new population distributions, whereas the ATE may not. On a synthetic data set of statin use and myocardial infarction (MI), we show that a learned ITE model improves true ITE estimation and outperforms the ATE. We additionally argue that ITE models should be learned with a consistent, nonparametric algorithm from unweighted examples and show experiments in favor of our argument using our synthetic data model and a real data set of D-penicillamine use for primary biliary cirrhosis.
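
    One common way to estimate individualized treatment effects is a "T-learner": fit separate outcome models for treated and control groups and take the difference of their predicted risks for each patient. The sketch below shows that generic construction on synthetic data; it is not the specific algorithm or dataset used by the authors.

    ```python
    # "T-learner" estimate of individualized treatment effects: separate outcome
    # models for treated and control, differenced per patient. Data are synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 5000
    X = rng.normal(size=(n, 5))                      # covariates (e.g., risk factors)
    treated = rng.integers(0, 2, n)                  # hypothetical statin use indicator
    base_risk = 1 / (1 + np.exp(-(X[:, 0] - 1)))
    effect = -0.15 * (X[:, 1] > 0)                   # treatment helps only a subgroup
    y = (rng.random(n) < np.clip(base_risk + treated * effect, 0, 1)).astype(int)  # MI outcome

    m1 = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[treated == 1], y[treated == 1])
    m0 = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[treated == 0], y[treated == 0])

    ite = m1.predict_proba(X)[:, 1] - m0.predict_proba(X)[:, 1]   # per-patient effect estimate
    ate = y[treated == 1].mean() - y[treated == 0].mean()         # naive population-level effect
    print(f"ATE={ate:.3f}, ITE range=({ite.min():.3f}, {ite.max():.3f})")
    ```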

  9. Development of a classification method for a crack on a pavement surface images using machine learning

    NASA Astrophysics Data System (ADS)

    Hizukuri, Akiyoshi; Nagata, Takeshi

    2017-03-01

    The purpose of this study is to develop a classification method for cracks in pavement surface images using machine learning, with the aim of reducing maintenance costs. Our database consists of 3500 pavement surface images. This includes 800 crack and 2700 normal pavement surface images. The pavement surface images are first decomposed into several sub-images using a discrete wavelet transform (DWT) decomposition. We then calculate a wavelet sub-band histogram from each of the sub-images at each level. A support vector machine (SVM) applied to the computed wavelet sub-band histograms is employed to distinguish between crack and normal pavement surface images. The accuracies of the proposed classification method are 85.3% for crack and 84.4% for normal pavement images. The proposed classification method achieved high performance. Therefore, the proposed method would be useful in maintenance inspection.
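
    A minimal sketch of the feature pipeline described: a wavelet decomposition of each image, histograms of the sub-band coefficients as features, and an SVM classifier. A one-level 2-D Haar transform is implemented directly in NumPy, and the synthetic images with a dark stripe stand in for real crack photographs.

    ```python
    # Wavelet sub-band histogram features plus an SVM classifier on synthetic
    # stand-in images; a one-level 2-D Haar transform replaces a full DWT library.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def haar_dwt2(img):
        """One-level 2-D Haar transform: returns LL, LH, HL, HH sub-bands."""
        a = (img[0::2, :] + img[1::2, :]) / 2.0
        d = (img[0::2, :] - img[1::2, :]) / 2.0
        LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
        LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
        HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
        HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
        return LL, LH, HL, HH

    def subband_histogram_features(img, bins=16):
        feats = []
        for band in haar_dwt2(img):
            hist, _ = np.histogram(band, bins=bins, range=(-1, 1), density=True)
            feats.append(hist)
        return np.concatenate(feats)

    rng = np.random.default_rng(0)
    images = rng.random((200, 64, 64))
    labels = rng.integers(0, 2, 200)          # 1 = crack, 0 = normal (synthetic labels)
    images[labels == 1, 30:34, :] -= 0.8      # crude dark "crack" stripe in positive images

    X = np.array([subband_histogram_features(im) for im in images])
    print("mean CV accuracy:", cross_val_score(SVC(), X, labels, cv=5).mean().round(3))
    ```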

  10. Nutrition environment measures survey-vending: development, dissemination, and reliability.

    PubMed

    Voss, Carol; Klein, Susan; Glanz, Karen; Clawson, Margaret

    2012-07-01

    Researchers determined a need to develop an instrument to assess the vending machine environment that was comparably reliable and valid to other Nutrition Environment Measures Survey tools and that would provide consistent and comparable data for businesses, schools, and communities. Tool development, reliability testing, and dissemination of the Nutrition Environment Measures Survey-Vending (NEMS-V) involved a collaboration of students, professionals, and community leaders. Interrater reliability testing showed high levels of agreement among trained raters on the products and evaluations of products. NEMS-V can benefit public health partners implementing policy and environmental change initiatives as a part of their community wellness activities. The vending machine project will support a policy calling for state facilities to provide a minimum of 30% of foods and beverages in vending machines as healthy options, based on NEMS-V criteria, which will be used as a model for other businesses.

  11. Diamond Smoothing Tools

    NASA Technical Reports Server (NTRS)

    Voronov, Oleg

    2007-01-01

    Diamond smoothing tools have been proposed for use in conjunction with diamond cutting tools that are used in many finish-machining operations. Diamond machining (including finishing) is often used, for example, in fabrication of precise metal mirrors. A diamond smoothing tool according to the proposal would have a smooth spherical surface. For a given finish machining operation, the smoothing tool would be mounted next to the cutting tool. The smoothing tool would slide on the machined surface left behind by the cutting tool, plastically deforming the surface material and thereby reducing the roughness of the surface, closing microcracks and otherwise generally reducing or eliminating microscopic surface and subsurface defects, and increasing the microhardness of the surface layer. It has been estimated that if smoothing tools of this type were used in conjunction with cutting tools on sufficiently precise lathes, it would be possible to reduce the roughness of machined surfaces to as little as 3 nm. A tool according to the proposal would consist of a smoothing insert in a metal holder. The smoothing insert would be made from a diamond/metal functionally graded composite rod preform, which, in turn, would be made by sintering together a bulk single-crystal or polycrystalline diamond, a diamond powder, and a metallic alloy at high pressure. To form the spherical smoothing tip, the diamond end of the preform would be subjected to flat grinding, conical grinding, spherical grinding using diamond wheels, and finally spherical polishing and/or buffing using diamond powders. If the diamond were a single crystal, then it would be crystallographically oriented, relative to the machining motion, to minimize its wear and maximize its hardness. Spherically polished diamonds could also be useful for purposes other than smoothing in finish machining: They would likely also be suitable for use as heat-resistant, wear-resistant, unlubricated sliding-fit bearing inserts.

  12. Normal contour error measurement on-machine and compensation method for polishing complex surface by MRF

    NASA Astrophysics Data System (ADS)

    Chen, Hua; Chen, Jihong; Wang, Baorui; Zheng, Yongcheng

    2016-10-01

    The magnetorheological finishing (MRF) process, based on the dwell time method with constant normal spacing for flexible polishing, introduces a normal contour error when fine-polishing complex surfaces such as aspheric surfaces. This error changes the ribbon's shape and the consistency of the removal characteristics in MRF. A novel method is put forward that continuously scans the normal spacing between the workpiece and a laser range finder to measure the normal contour errors along the machining track while polishing a complex surface. The normal contour errors were measured dynamically, allowing the workpiece's clamping precision, the multi-axis machining NC program, and the dynamic performance of the MRF machine to be verified and checked for the MRF process. An on-machine unit for measuring the normal contour errors of complex surfaces was designed. Using the measurement unit's results as feedback to adjust the feed-forward control and multi-axis machining parameters, an optimized servo control method was presented to compensate for the normal contour errors. An experiment polishing a 180 mm × 180 mm aspherical fused-silica workpiece by MRF was set up to validate the method. The results show that the normal contour error was controlled to less than 10 μm, and the PV value of the polished surface accuracy was improved from 0.95λ to 0.09λ under the same process parameters. The technology has been applied in the PKC600-Q1 MRF machine developed by the China Academy of Engineering Physics since 2014 and is being used in national large-scale optical engineering projects for processing ultra-precision optical parts.

  13. Fluidics and heat generation of Alcon Infiniti and Legacy, Bausch & Lomb Millennium, and advanced medical optics sovereign phacoemulsification systems.

    PubMed

    Floyd, Michael S; Valentine, Jeremy R; Olson, Randall J

    2006-09-01

    To study heat generation, vacuum, and flow characteristics of the Alcon Infiniti and Bausch & Lomb Millennium with results compared with the Alcon Legacy and advanced medical optics (AMO) Sovereign machines previously studied. Experimental study. Heat generation with continuous ultrasound was determined with and without a 200-g weight. Flow and vacuum were determined from 12 to 40-ml/min in 2-ml/min steps. The impact of a STAAR Cruise Control was also tested. Millennium created the most heat/20% of power (5.67 +/- 0.51 degrees C unweighted and 6.80 +/- 0.80 degrees C weighted), followed by Sovereign (4.59 +/- 0.70 degrees C unweighted and 5.65 +/- 0.72 degrees C weighted), Infiniti (2.79 +/- 0.62 degrees C unweighted and 3.96 +/- 0.31 degrees C weighted), and Legacy (1.99 +/- 0.49 degrees C unweighted and 4.27 +/- 0.76 degrees C weighted; P < .0001 for all comparisons between machines except Infiniti vs Legacy, both weighted). Flow studies revealed that Millennium Peristaltic was 17% less than indicated (P < .0001 to all other machines), and all other machines were within 3.5% of indicated. Cruise Control decreased flow by 4.1% (P < .0001 for same machine without it). Millennium Venturi had the greatest vacuum (81% more than the least Sovereign; P < .0001), and Cruise Control increased vacuum in a peristaltic machine 35% more than the Venturi system (P < .0001). Percent power is not consistent in regard to heat generation, however, flow was accurate for all machines except Millennium Peristaltic. Restriction with Cruise Control elevates unoccluded vacuum to levels greater than the Venturi system tested.

  14. Detection of longitudinal visual field progression in glaucoma using machine learning.

    PubMed

    Yousefi, Siamak; Kiwaki, Taichi; Zheng, Yuhui; Suigara, Hiroki; Asaoka, Ryo; Murata, Hiroshi; Lemij, Hans; Yamanishi, Kenji

    2018-06-16

    Global indices of standard automated perimetry are insensitive to localized losses, while point-wise indices are sensitive but highly variable. Region-wise indices sit in between. This study introduces a machine-learning-based index for glaucoma progression detection that outperforms global, region-wise, and point-wise indices. Development and comparison of a prognostic index. Visual fields from 2085 eyes of 1214 subjects were used to identify glaucoma progression patterns using machine learning. Visual fields from 133 eyes of 71 glaucoma patients were collected 10 times over 10 weeks to provide a no-change, test-retest dataset. The parameters of all methods were identified using visual field sequences in the test-retest dataset to meet fixed 95% specificity. An independent dataset of 270 eyes of 136 glaucoma patients and survival analysis were utilized to compare methods. The time to detect progression in 25% of the eyes in the longitudinal dataset using global mean deviation (MD) was 5.2 years (95% confidence interval, 4.1 - 6.5 years); 4.5 years (4.0 - 5.5) using region-wise, 3.9 years (3.5 - 4.6) using point-wise, and 3.5 years (3.1 - 4.0) using machine learning analysis. The time until 25% of eyes showed subsequently confirmed progression after two additional visits were included was 6.6 years (5.6 - 7.4 years), 5.7 years (4.8 - 6.7), 5.6 years (4.7 - 6.5), and 5.1 years (4.5 - 6.0) for global, region-wise, point-wise, and machine learning analyses, respectively. Machine learning analysis consistently detects progressing eyes earlier than the other methods, with or without confirmation visits. In particular, machine learning detects more slowly progressing eyes than the other methods. Copyright © 2018 Elsevier Inc. All rights reserved.
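
    The survival-analysis summary used above (the time until 25% of eyes have progressed) can be sketched with a basic Kaplan-Meier estimate; the simulated event and censoring times below are placeholders, not the study data.

    ```python
    # Basic Kaplan-Meier estimate of the time at which 25% of eyes have progressed,
    # on simulated event/censoring times (placeholders for the study data).
    import numpy as np

    rng = np.random.default_rng(0)
    time_to_event = rng.exponential(8.0, 270)        # years until progression is flagged
    censored = rng.random(270) < 0.3                 # eyes never flagged / lost to follow-up
    observed = np.where(censored, rng.uniform(0, 10, 270), time_to_event)
    event = ~censored

    def km_quantile(times, events, prob=0.25):
        """Smallest time at which the Kaplan-Meier progression probability reaches prob."""
        order = np.argsort(times)
        times, events = times[order], events[order]
        surv, at_risk = 1.0, len(times)
        for t, e in zip(times, events):
            if e:
                surv *= 1.0 - 1.0 / at_risk
            at_risk -= 1
            if 1.0 - surv >= prob:
                return t
        return np.inf

    print(f"time until 25% of eyes progressed: {km_quantile(observed, event):.1f} years")
    ```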

  15. TH-B-BRC-00: How to Identify and Resolve Potential Clinical Errors Before They Impact Patients Treatment: Lessons Learned

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    Radiation treatment consists of a chain of events influenced by the quality of machine operation, beam data commissioning, machine calibration, patient specific data, simulation, treatment planning, imaging and treatment delivery. There is always a chance that the clinical medical physicist may make or fail to detect an error in one of the events that may impact the patient's treatment. In the clinical scenario, errors may be systematic and, without peer review, may have a low detectability because they are not part of routine QA procedures. During treatment, there might be errors on the machine that need attention. External reviews of some of the treatment delivery components by independent reviewers, like IROC, can detect errors, but may not be timely. The goal of this session is to help junior clinical physicists identify potential errors, as well as to present a quality assurance approach for performing a root cause analysis to find and eliminate an error and to continually monitor for errors. A compilation of potential errors will be presented, with examples of the thought process required to spot the error and determine the root cause. Examples may include unusual machine operation, erratic electrometer readings, consistently lower electron output, variation in photon output, body parts inadvertently left in the beam, unusual treatment plans, poor normalization, hot spots, etc. Awareness of the possibility and detection of error in any link of the treatment process chain will help improve the safe and accurate delivery of radiation to patients. Four experts will discuss how to identify errors in four areas of clinical treatment. D. Followill, NIH grant CA 180803.

  16. TH-B-BRC-01: How to Identify and Resolve Potential Clinical Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, I.

    2016-06-15

    Radiation treatment consists of a chain of events influenced by the quality of machine operation, beam data commissioning, machine calibration, patient specific data, simulation, treatment planning, imaging and treatment delivery. There is always a chance that the clinical medical physicist may make or fail to detect an error in one of the events that may impact the patient's treatment. In the clinical scenario, errors may be systematic and, without peer review, may have a low detectability because they are not part of routine QA procedures. During treatment, there might be errors on the machine that need attention. External reviews of some of the treatment delivery components by independent reviewers, like IROC, can detect errors, but may not be timely. The goal of this session is to help junior clinical physicists identify potential errors, as well as to present a quality assurance approach for performing a root cause analysis to find and eliminate an error and to continually monitor for errors. A compilation of potential errors will be presented, with examples of the thought process required to spot the error and determine the root cause. Examples may include unusual machine operation, erratic electrometer readings, consistently lower electron output, variation in photon output, body parts inadvertently left in the beam, unusual treatment plans, poor normalization, hot spots, etc. Awareness of the possibility and detection of error in any link of the treatment process chain will help improve the safe and accurate delivery of radiation to patients. Four experts will discuss how to identify errors in four areas of clinical treatment. D. Followill, NIH grant CA 180803.

  17. A machine-learned analysis of human gene polymorphisms modulating persisting pain points at major roles of neuroimmune processes.

    PubMed

    Kringel, Dario; Lippmann, Catharina; Parnham, Michael J; Kalso, Eija; Ultsch, Alfred; Lötsch, Jörn

    2018-06-19

    Human genetic research has implicated functional variants of more than one hundred genes in the modulation of persisting pain. Artificial intelligence and machine learning techniques may combine this knowledge with results of genetic research gathered in any context, which permits the identification of the key biological processes involved in chronic sensitization to pain. Based on published evidence, a set of 110 genes carrying variants reported to be associated with modulation of the clinical phenotype of persisting pain in eight different clinical settings was submitted to unsupervised machine-learning aimed at functional clustering. Subsequently, a mathematically supported subset of genes, comprising those most consistently involved in persisting pain, was analyzed by means of computational functional genomics in the Gene Ontology knowledgebase. Clustering of genes with evidence for a modulation of persisting pain elucidated a functionally heterogeneous set. The situation cleared when the focus was narrowed to a genetic modulation consistently observed throughout several clinical settings. On this basis, two groups of biological processes, the immune system and nitric oxide signaling, emerged as major players in sensitization to persisting pain, which is biologically highly plausible and in agreement with other lines of pain research. The present computational functional genomics-based approach provided a computational systems-biology perspective on chronic sensitization to pain. Human genetic control of persisting pain points to the immune system as a source of potential future targets for drugs directed against persisting pain. Contemporary machine-learned methods provide innovative approaches to knowledge discovery from previous evidence. This article is protected by copyright. All rights reserved.

  18. A discrepancy within primate spatial vision and its bearing on the definition of edge detection processes in machine vision

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.

    1990-01-01

    The visual perception of form information is considered to be based on the functioning of simple and complex neurons in the primate striate cortex. However, a review of the physiological data on these brain cells cannot be harmonized with either the perceptual spatial frequency performance of primates or the performance which is necessary for form perception in humans. This discrepancy together with recent interest in cortical-like and perceptual-like processing in image coding and machine vision prompted a series of image processing experiments intended to provide some definition of the selection of image operators. The experiments were aimed at determining operators which could be used to detect edges in a computational manner consistent with the visual perception of structure in images. Fundamental issues were the selection of size (peak spatial frequency) and circular versus oriented operators (or some combination). In a previous study, circular difference-of-Gaussian (DOG) operators, with peak spatial frequency responses at about 11 and 33 cyc/deg were found to capture the primary structural information in images. Here larger scale circular DOG operators were explored and led to severe loss of image structure and introduced spatial dislocations (due to blur) in structure which is not consistent with visual perception. Orientation sensitive operators (akin to one class of simple cortical neurons) introduced ambiguities of edge extent regardless of the scale of the operator. For machine vision schemes which are functionally similar to natural vision form perception, two circularly symmetric very high spatial frequency channels appear to be necessary and sufficient for a wide range of natural images. Such a machine vision scheme is most similar to the physiological performance of the primate lateral geniculate nucleus rather than the striate cortex.
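
    A circular difference-of-Gaussians (DOG) operator of the kind discussed can be built from two Gaussian blurs at different scales, with the peak spatial frequency set by the two standard deviations. The sketch below uses SciPy on a synthetic step-edge image; the scales are illustrative and not tuned to the 11 and 33 cyc/deg channels mentioned in the text.

    ```python
    # Circular difference-of-Gaussians (DOG) operator applied to a synthetic
    # step-edge image; the two sigmas set the operator's peak spatial frequency.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    image = np.zeros((128, 128))
    image[:, 64:] = 1.0                              # vertical step edge
    image += 0.05 * rng.standard_normal(image.shape)

    def dog(img, sigma_center, sigma_surround):
        return gaussian_filter(img, sigma_center) - gaussian_filter(img, sigma_surround)

    response = dog(image, sigma_center=1.0, sigma_surround=1.6)
    edges = np.abs(response) > 0.1                   # crude edge map from the DOG response
    print("edge pixels found:", int(edges.sum()))
    ```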

  19. Improving machine learning reproducibility in genetic association studies with proportional instance cross validation (PICV).

    PubMed

    Piette, Elizabeth R; Moore, Jason H

    2018-01-01

    Machine learning methods and conventions are increasingly employed for the analysis of large, complex biomedical data sets, including genome-wide association studies (GWAS). Reproducibility of machine learning analyses of GWAS can be hampered by biological and statistical factors, particularly so for the investigation of non-additive genetic interactions. Application of traditional cross validation to a GWAS data set may result in poor consistency between the training and testing data set splits due to an imbalance of the interaction genotypes relative to the data as a whole. We propose a new cross validation method, proportional instance cross validation (PICV), that preserves the original distribution of an independent variable when splitting the data set into training and testing partitions. We apply PICV to simulated GWAS data with epistatic interactions of varying minor allele frequencies and prevalences and compare performance to that of a traditional cross validation procedure in which individuals are randomly allocated to training and testing partitions. Sensitivity and positive predictive value are significantly improved across all tested scenarios for PICV compared to traditional cross validation. We also apply PICV to GWAS data from a study of primary open-angle glaucoma to investigate a previously-reported interaction, which fails to significantly replicate; PICV however improves the consistency of testing and training results. Application of traditional machine learning procedures to biomedical data may require modifications to better suit intrinsic characteristics of the data, such as the potential for highly imbalanced genotype distributions in the case of epistasis detection. The reproducibility of genetic interaction findings can be improved by considering this variable imbalance in cross validation implementation, such as with PICV. This approach may be extended to problems in other domains in which imbalanced variable distributions are a concern.
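
    The core idea, preserving the distribution of a key variable in every training/testing split, can be sketched with a stratified splitter; scikit-learn's StratifiedKFold is used below as a stand-in, and the published PICV procedure may differ in detail from this illustration.

    ```python
    # Comparison of fold-to-fold imbalance of a rare genotype under plain K-fold
    # splitting vs. a split that preserves the genotype distribution (stratified).
    import numpy as np
    from sklearn.model_selection import KFold, StratifiedKFold

    rng = np.random.default_rng(0)
    genotype = rng.choice([0, 1, 2], size=1000, p=[0.70, 0.25, 0.05])   # 2 = rare interaction genotype

    def worst_imbalance(splitter):
        """Largest deviation of the rare-genotype fraction across the test folds."""
        overall = np.mean(genotype == 2)
        fracs = [np.mean(genotype[test] == 2) for _, test in splitter.split(genotype, genotype)]
        return max(abs(f - overall) for f in fracs)

    print("plain KFold imbalance:       ", round(worst_imbalance(KFold(5, shuffle=True, random_state=0)), 4))
    print("proportional split imbalance:", round(worst_imbalance(StratifiedKFold(5, shuffle=True, random_state=0)), 4))
    ```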

  20. Literature classification for semi-automated updating of biological knowledgebases

    PubMed Central

    2013-01-01

    Background As the output of biological assays increase in resolution and volume, the body of specialized biological data, such as functional annotations of gene and protein sequences, enables extraction of higher-level knowledge needed for practical application in bioinformatics. Whereas common types of biological data, such as sequence data, are extensively stored in biological databases, functional annotations, such as immunological epitopes, are found primarily in semi-structured formats or free text embedded in primary scientific literature. Results We defined and applied a machine learning approach for literature classification to support updating of TANTIGEN, a knowledgebase of tumor T-cell antigens. Abstracts from PubMed were downloaded and classified as either "relevant" or "irrelevant" for database update. Training and five-fold cross-validation of a k-NN classifier on 310 abstracts yielded classification accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining and machine learning. The addition of such data will aid in the transition of biological databases to knowledgebases. PMID:24564403
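
    A minimal sketch of the abstract-classification setup described: text features and a k-NN classifier evaluated with five-fold cross validation. TF-IDF is assumed as the text representation, and the toy abstracts below merely stand in for the 310 labelled PubMed abstracts.

    ```python
    # TF-IDF text features and a k-NN classifier evaluated with five-fold cross
    # validation; the toy abstracts are placeholders for the labelled PubMed set.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    abstracts = [
        "tumor antigen epitope recognized by cytotoxic T cells",
        "HLA restricted T-cell epitope mapped in a melanoma antigen",
        "crystal structure of a bacterial membrane transporter",
        "soil microbiome diversity across agricultural sites",
    ] * 25
    labels = [1, 1, 0, 0] * 25        # 1 = relevant for database update, 0 = irrelevant

    model = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=3))
    print("5-fold accuracy:", cross_val_score(model, abstracts, labels, cv=5).mean().round(3))
    ```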

  1. Convolutional Neural Network Based on Extreme Learning Machine for Maritime Ships Recognition in Infrared Images.

    PubMed

    Khellal, Atmane; Ma, Hongbin; Fei, Qing

    2018-05-09

    The success of Deep Learning models, notably convolutional neural networks (CNNs), makes them a favored solution for object recognition systems in both visible and infrared domains. However, the lack of training data in maritime ship research leads to poor performance due to the problem of overfitting. In addition, the back-propagation algorithm used to train CNNs is very slow and requires tuning many hyperparameters. To overcome these weaknesses, we introduce a new approach fully based on the Extreme Learning Machine (ELM) to learn useful CNN features and perform a fast and accurate classification, which is suitable for infrared-based recognition systems. The proposed approach combines an ELM-based learning algorithm to train the CNN for discriminative feature extraction and an ELM-based ensemble for classification. The experimental results on the VAIS dataset, which is the largest dataset of maritime ships, confirm that the proposed approach outperforms the state-of-the-art models in terms of generalization performance and training speed. For instance, the proposed model is up to 950 times faster than the traditional back-propagation-based training of convolutional neural networks, primarily for low-level feature extraction.
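
    A minimal NumPy sketch of the extreme learning machine idea referred to above: a random, untrained hidden layer followed by output weights solved in closed form by regularized least squares. The toy features stand in for CNN activations; the paper's CNN training and ensemble are not reproduced here.

    ```python
    # Extreme learning machine classifier: random hidden layer, output weights by
    # ridge-regularized least squares. Toy features stand in for CNN activations.
    import numpy as np

    rng = np.random.default_rng(0)

    def elm_train(X, y_onehot, n_hidden=200, reg=1e-2):
        W = rng.standard_normal((X.shape[1], n_hidden))      # random input weights (never trained)
        b = rng.standard_normal(n_hidden)
        H = np.tanh(X @ W + b)                                # random feature mapping
        # Output weights: ridge-regularized least squares, solved in one shot
        beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y_onehot)
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

    # Toy two-class data standing in for CNN features of ship images
    X = np.vstack([rng.normal(0, 1, (300, 50)), rng.normal(1.5, 1, (300, 50))])
    y = np.repeat([0, 1], 300)
    Y = np.eye(2)[y]

    W, b, beta = elm_train(X, Y)
    print("training accuracy:", (elm_predict(X, W, b, beta) == y).mean())
    ```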

  2. Wire array K-shell sources on the SPHINX generator

    NASA Astrophysics Data System (ADS)

    D'Almeida, Thierry; Lassalle, Francis; Grunenwald, Julien; Maury, Patrick; Zucchini, Frédéric; Niasse, Nicolas; Chittenden, Jeremy

    2014-10-01

    The SPHINX machine is a LTD based Z-pinch driver operated by the CEA Gramat (France) and primarily used for studying K-shell radiation effects. We present the results of experiments carried out with single and nested large diameter aluminium wire array loads driven by a current of ~5 MA in ~800 ns. The dynamic of the implosion is studied with filtered X-UV time-integrated pin-hole cameras. The plasma electron temperature and the characteristics of the sources are estimated with time and spatially dependent spectrographs and PCDs. It is shown that Al K-shell yields (>1 keV) up to 27 kJ are obtained for a total radiation of ~ 230 kJ. These results are compared with simulations performed using the latest implementation of the non-LTE DCA code Spk in the 3D Eulerian MHD framework Gorgon developed at Imperial College. Filtered synthetic bolometers and PCD signals, time-dependent spatially integrated spectra and X-UV images are produced and show a good agreement with the experimental data. The capabilities of a prospective SPHINX II machine (20 MA ~ 800 ns) are also assessed for a wider variety of sources (Ti, Cu and W).

  3. [Evaluation of orthodontic friction using a tribometer with alternating movement].

    PubMed

    Pernier, C M; Jablonska-Mazanek, E D; Ponsonnet, L; Grosgogeat, B

    2005-12-01

    It is essential for orthodontists to control the complex phenomenon of friction. The in vitro techniques, usually dynamometers or tensile testing machines, used to measure the frictional resistance between arch wires and brackets are linear and unidirectional and can be criticised because tooth movements, such as tipping and uprighting, as well as everyday oral activities, primarily chewing, are not uni-dimensional but more closely resemble the small-amplitude oscillatory phenomena known as fretting. We therefore decided to develop a fretting machine with alternating rather than linear movements, better suited to evaluating the frictional behaviour of orthodontic bracket-wire combinations. Once we had completed construction of this device, we proceeded to measure the frictional resistance between one stainless steel bracket (MicroArch, GAC) and five wires currently used in orthodontics (two nickel-titanium shape memory alloys: Neo Sentalloy and Neo Sentalloy with Ionguard, both GAC; three titanium-molybdenum alloys: TMA and Low Friction TMA, Ormco, and Resolve, GAC). We were able to rank the wires according to their coefficient of friction, demonstrating the inefficacy of ion implantation and quantifying the increase in the coefficient of friction which occurs when Resolve wires are placed in the oral environment for approximately one year.

  4. Prediction of microRNA target genes using an efficient genetic algorithm-based decision tree.

    PubMed

    Rabiee-Ghahfarrokhi, Behzad; Rafiei, Fariba; Niknafs, Ali Akbar; Zamani, Behzad

    2015-01-01

    MicroRNAs (miRNAs) are small, non-coding RNA molecules that regulate gene expression in almost all plants and animals. They play an important role in key processes, such as proliferation, apoptosis, and pathogen-host interactions. Nevertheless, the mechanisms by which miRNAs act are not fully understood. The first step toward unraveling the function of a particular miRNA is the identification of its direct targets. This step has proven to be quite challenging in animals, primarily because of incomplete complementarity between miRNAs and their target mRNAs. In recent years, the use of machine-learning techniques has greatly improved the prediction of miRNA targets, avoiding the need for costly and time-consuming experiments to identify miRNA targets experimentally. Among the most important machine-learning algorithms are decision trees, which classify data based on extracted rules. In the present work, we used a genetic algorithm in combination with a C4.5 decision tree for the prediction of miRNA targets. We applied our proposed method to a validated human dataset and achieved a classification accuracy of nearly 93.9%, which could be attributed to the selection of the best rules.
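
    The record combines a genetic algorithm with a C4.5 decision tree. The sketch below is a hedged illustration of that idea using scikit-learn's CART-style DecisionTreeClassifier instead of C4.5, with a tiny genetic algorithm that evolves feature-subset bitmasks scored by cross-validated tree accuracy; the data, population size, and mutation rate are placeholders, not the study's settings.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((300, 20))                  # stand-in miRNA/target-site features
        y = rng.integers(0, 2, 300)                # stand-in target vs. non-target labels

        def fitness(mask):
            if not mask.any():
                return 0.0
            tree = DecisionTreeClassifier(max_depth=5, random_state=0)
            return cross_val_score(tree, X[:, mask], y, cv=5).mean()

        # Tiny genetic algorithm over feature-subset bitmasks
        pop = rng.random((30, X.shape[1])) < 0.5
        for gen in range(20):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[-10:]]              # keep the fittest
            children = []
            while len(children) < len(pop) - len(parents):
                a, b = parents[rng.integers(len(parents), size=2)]
                cut = rng.integers(1, X.shape[1])
                child = np.concatenate([a[:cut], b[cut:]])       # one-point crossover
                child ^= rng.random(X.shape[1]) < 0.05           # bit-flip mutation
                children.append(child)
            pop = np.vstack([parents, np.array(children)])

        best_mask = pop[np.argmax([fitness(ind) for ind in pop])]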

  5. Semi-supervised and unsupervised extreme learning machines.

    PubMed

    Huang, Gao; Song, Shiji; Gupta, Jatinder N D; Wu, Cheng

    2014-12-01

    Extreme learning machines (ELMs) have proven to be efficient and effective learning mechanisms for pattern classification and regression. However, ELMs are primarily applied to supervised learning problems. Only a few existing research papers have used ELMs to explore unlabeled data. In this paper, we extend ELMs for both semi-supervised and unsupervised tasks based on the manifold regularization, thus greatly expanding the applicability of ELMs. The key advantages of the proposed algorithms are as follows: 1) both the semi-supervised ELM (SS-ELM) and the unsupervised ELM (US-ELM) exhibit learning capability and computational efficiency of ELMs; 2) both algorithms naturally handle multiclass classification or multicluster clustering; and 3) both algorithms are inductive and can handle unseen data at test time directly. Moreover, it is shown in this paper that all the supervised, semi-supervised, and unsupervised ELMs can actually be put into a unified framework. This provides new perspectives for understanding the mechanism of random feature mapping, which is the key concept in ELM theory. Empirical study on a wide range of data sets demonstrates that the proposed algorithms are competitive with the state-of-the-art semi-supervised or unsupervised learning algorithms in terms of accuracy and efficiency.
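
    A minimal sketch in the spirit of the semi-supervised ELM described above: a standard random ELM hidden layer, plus output weights obtained from a least-squares problem with a graph-Laplacian (manifold) penalty over labeled and unlabeled points. The kNN graph construction, trade-off constants, and data are illustrative assumptions rather than the paper's exact formulation.

        import numpy as np
        from sklearn.neighbors import kneighbors_graph

        rng = np.random.default_rng(0)
        X = rng.random((120, 10))                 # 20 labeled + 100 unlabeled points
        y = rng.integers(0, 3, 20)                # labels for the first 20 points only

        n_hidden, lam = 200, 0.1
        W, b = rng.standard_normal((10, n_hidden)), rng.standard_normal(n_hidden)
        H = np.tanh(X @ W + b)                    # ELM hidden layer for all points

        # Graph Laplacian L = D - A over a kNN graph (labeled and unlabeled points)
        A = kneighbors_graph(X, n_neighbors=10, mode='connectivity', include_self=False)
        A = 0.5 * (A + A.T).toarray()             # symmetrize
        L = np.diag(A.sum(axis=1)) - A

        # Row weights: positive on labeled rows, zero on unlabeled rows
        C = np.zeros(len(X)); C[:20] = 1.0
        Y = np.zeros((len(X), 3)); Y[np.arange(20), y] = 1.0   # one-hot targets

        # beta = (I + H'CH + lam * H'LH)^-1 H'CY  (manifold-regularized least squares)
        HtC = H.T * C
        beta = np.linalg.solve(np.eye(n_hidden) + HtC @ H + lam * H.T @ L @ H, HtC @ Y)
        labels = (H @ beta).argmax(axis=1)        # predictions for all 120 points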

  6. Knowledge engineering for PACES, the particle accelerator control expert system

    NASA Astrophysics Data System (ADS)

    Lind, P. C.; Poehlman, W. F. S.; Stark, J. W.; Cousins, T.

    1992-04-01

    The KN-3000 used at Defense Research Establishment Ottawa is a Van de Graaff particle accelerator employed primarily to produce monoenergetic neutrons for calibrating radiation detectors. To provide training and assistance for new operators, it was decided to develop an expert system for accelerator operation. Knowledge engineering aspects of the expert system are reviewed. Two important issues are involved: the need to encapsulate expert knowledge into the system in a form that facilitates automatic accelerator operation and to partition the system so that time-consuming inferencing is minimized in favor of faster, more algorithmic control. It is seen that accelerator control will require fast, narrow-minded decision making for rapid fine tuning, but slower and broader reasoning for machine startup, shutdown, fault diagnosis, and correction. It is also important to render the knowledge base in a form conducive to operator training. A promising form of the expert system involves a hybrid system in which high level reasoning is performed on the host machine that interacts with the user, while an embedded controller employs neural networks for fast but limited adjustment of accelerator performance. This partitioning of duty facilitates a hierarchical chain of command yielding an effective mixture of speed and reasoning ability.


  7. Prediction of microRNA target genes using an efficient genetic algorithm-based decision tree

    PubMed Central

    Rabiee-Ghahfarrokhi, Behzad; Rafiei, Fariba; Niknafs, Ali Akbar; Zamani, Behzad

    2015-01-01

    MicroRNAs (miRNAs) are small, non-coding RNA molecules that regulate gene expression in almost all plants and animals. They play an important role in key processes, such as proliferation, apoptosis, and pathogen–host interactions. Nevertheless, the mechanisms by which miRNAs act are not fully understood. The first step toward unraveling the function of a particular miRNA is the identification of its direct targets. This step has proven to be quite challenging in animals, primarily because of incomplete complementarity between miRNAs and their target mRNAs. In recent years, the use of machine-learning techniques has greatly improved the prediction of miRNA targets, avoiding the need for costly and time-consuming experiments to identify miRNA targets experimentally. Among the most important machine-learning algorithms are decision trees, which classify data based on extracted rules. In the present work, we used a genetic algorithm in combination with a C4.5 decision tree for the prediction of miRNA targets. We applied our proposed method to a validated human dataset and achieved a classification accuracy of nearly 93.9%, which could be attributed to the selection of the best rules. PMID:26649272

  8. Slow Slip and Earthquake Nucleation in Meter-Scale Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Mclaskey, G.

    2017-12-01

    The initiation of dynamic rupture is thought to be preceded by a quasistatic nucleation phase. Observations of recent earthquakes sometimes support this by illuminating slow slip and foreshocks in the vicinity of the eventual hypocenter. I describe laboratory earthquake experiments conducted on two large-scale loading machines at Cornell University that provide insight into the way earthquake nucleation varies with normal stress, healing time, and loading rate. The larger of the two machines accommodates a 3 m long granite sample, and when loaded to 7 MPa stress levels, we observe dynamic rupture events that are preceded by a measurable nucleation zone with dimensions on the order of 1 m. The smaller machine accommodates a 0.76 m sample that is roughly the same size as the nucleation zone. On this machine, small variations in nucleation properties result in measurable differences in slip events, and we generate both dynamic rupture events (>0.1 m/s slip rates) and slow slip events (0.001 to 30 mm/s slip rates). Slow events occur when instability cannot fully nucleate before reaching the sample ends. Dynamic events occur after long healing times or abrupt increases in loading rate, which suggests that these factors shrink the spatial and temporal extents of the nucleation zone. Arrays of slip, strain, and ground motion sensors installed on the sample allow us to quantify seismic coupling and study details of premonitory slip and afterslip. The slow slip events we observe are primarily aseismic (less than 1% of the seismic coupling of faster events) and produce swarms of very small M -6 to M -8 events. These mechanical and seismic interactions suggest that faults with transitional behavior (where creep, small earthquakes, and tremor are often observed) could become seismically coupled if loaded rapidly, either by a slow slip front or dynamic rupture of an earthquake that nucleated elsewhere.

  9. Machine learning approaches to investigate the impact of PCBs on the transcriptome of the common bottlenose dolphin (Tursiops truncatus).

    PubMed

    Mancia, Annalaura; Ryan, James C; Van Dolah, Frances M; Kucklick, John R; Rowles, Teresa K; Wells, Randall S; Rosel, Patricia E; Hohn, Aleta A; Schwacke, Lori H

    2014-09-01

    As top-level predators, common bottlenose dolphins (Tursiops truncatus) are particularly sensitive to chemical and biological contaminants that accumulate and biomagnify in the marine food chain. This work investigates the potential use of microarray technology and gene expression profile analysis to screen common bottlenose dolphins for exposure to environmental contaminants through the immunological and/or endocrine perturbations associated with these agents. A dolphin microarray representing 24,418 unigene sequences was used to analyze blood samples collected from 47 dolphins during capture-release health assessments from five different US coastal locations (Beaufort, NC, Sarasota Bay, FL, Saint Joseph Bay, FL, Sapelo Island, GA and Brunswick, GA). Organohalogen contaminants including pesticides, polychlorinated biphenyl congeners (PCBs) and polybrominated diphenyl ether congeners were determined in blubber biopsy samples from the same animals. A subset of samples (n = 10, males; n = 8, females) with the highest and the lowest measured values of PCBs in their blubber was used as strata to determine the differential gene expression of the exposure extremes through machine learning classification algorithms. A set of genes associated primarily with nuclear and DNA stability, cell division and apoptosis regulation, intra- and extra-cellular traffic, and immune response activation was selected by the algorithm for identifying the two exposure extremes. In order to test the hypothesis that these gene expression patterns reflect PCB exposure, we next investigated the blood transcriptomes of the remaining dolphin samples using machine-learning approaches, including K-nn and Support Vector Machines classifiers. Using the derived gene sets, the algorithms worked very well (100% success rate) at classifying dolphins according to the contaminant load accumulated in their blubber. These results suggest that gene expression profile analysis may provide a valuable means to screen for indicators of chemical exposure. Copyright © 2014 Elsevier Ltd. All rights reserved.
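
    A hedged sketch of the kind of classification step the record describes: K-nn and SVM classifiers applied to a derived gene set to separate high from low blubber-PCB animals. The gene count, sample split, and cross-validation scheme are placeholders, not the study's actual design.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((18, 50))            # expression of a derived 50-gene set, 18 dolphins
        y = np.repeat([0, 1], 9)            # low vs. high blubber PCB load (9 each, placeholder)

        for name, clf in [("K-nn", KNeighborsClassifier(n_neighbors=3)),
                          ("SVM", SVC(kernel="linear", C=1.0))]:
            model = make_pipeline(StandardScaler(), clf)
            acc = cross_val_score(model, X, y, cv=3).mean()
            print(f"{name}: mean CV accuracy = {acc:.2f}")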

  10. Atom redistribution and multilayer structure in NiTi shape memory alloy induced by high energy proton irradiation

    NASA Astrophysics Data System (ADS)

    Wang, Haizhen; Yi, Xiaoyang; Zhu, Yingying; Yin, Yongkui; Gao, Yuan; Cai, Wei; Gao, Zhiyong

    2017-10-01

    The element distribution and surface microstructure in NiTi shape memory alloys exposed to 3 MeV proton irradiation were investigated. Redistribution of the alloying element and a clearly visible multilayer structure consisting of three layers were observed on the surface of NiTi shape memory alloys after proton irradiation. The outermost layer consists primarily of a columnar-like TiH2 phase with a tetragonal structure, and the internal layer is primarily comprised of a bcc austenite phase. In addition, the Ti2Ni phase, with an fcc structure, serves as the transition layer between the outermost and internal layer. The above-mentioned phenomenon is attributed to the preferential sputtering of high energy protons and segregation induced by irradiation.

  11. 7 CFR 1469.3 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... grasses and legumes; summer fallow; typically cropped wet areas, such as rice fields, rotated to wildlife... machine harvested. The crop may be grasses, legumes, or a combination of both. Incidental forest land... mixture, or a grass-legume mixture. Management usually consists of cultural treatments: fertilization...

  12. MSUSTAT.

    ERIC Educational Resources Information Center

    Mauriello, David

    1984-01-01

    Reviews an interactive statistical analysis package (designed to run on 8- and 16-bit machines that utilize CP/M 80 and MS-DOS operating systems), considering its features and uses, documentation, operation, and performance. The package consists of 40 general purpose statistical procedures derived from the classic textbook "Statistical…

  13. Career Directions: HVACR Technician

    ERIC Educational Resources Information Center

    Moore, Pam

    2005-01-01

    Heating/ventilation/air conditioning/refrigeration (HVACR) technicians (also known as "heating and cooling technicians") are the people who install, maintain, test and repair the machines that control temperature, circulation, moisture and purity of air in residential, commercial and industrial buildings. These systems consist of a variety of…

  14. MCAID--A Generalized Text Driver.

    ERIC Educational Resources Information Center

    Ahmed, K.; Dickinson, C. J.

    MCAID is a relatively machine-independent technique for writing computer-aided instructional material consisting of descriptive text, multiple choice questions, and the ability to call compiled subroutines to perform extensive calculations. It was specially developed to incorporate test-authoring around complex mathematical models to explore a…

  15. View northwest of building 19 area used for pattern shop ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View northwest of building 19 area used for pattern shop storage (foreground), building 17 section of structure on left. This structure consists of two formerly separate buildings. - Naval Base Philadelphia-Philadelphia Naval Shipyard, Machine Shops, League Island, Philadelphia, Philadelphia County, PA

  16. Tactical Mission Command (TMC)

    DTIC Science & Technology

    2016-03-01

    capabilities to Army commanders and their staffs, consisting primarily of a user-customizable Common Operating Picture (COP) enabled with real-time... COP viewer and data management capability. It is a collaborative visualization and planning application that also provides a common map display... COP): Display the COP consisting of the following: (1) Friendly forces determined by the commander, including subordinate and supporting units at

  17. [Computer Program PEDAGE -- MARKTF-M5-F4.

    ERIC Educational Resources Information Center

    Toronto Univ. (Ontario). Dept. of Geology.

    The computer program MARKTF-M5, written in FORTRAN IV, scores tests (consisting of true-or-false statements about concepts or facts) by comparing the list of true or false values prepared by the instructor with those from the students. The output consists of information to the supervisor about the performance of the students, primarily for his…

  18. A vibro-haptic human-machine interface for structural health monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mascarenas, David; Plont, Crystal; Brown, Christina

    The structural health monitoring (SHM) community’s goal has been to endow physical systems with a nervous system not unlike those commonly found in living organisms. Typically the SHM community has attempted to do this by instrumenting structures with a variety of sensors, and then applying various signal processing and classification procedures to the data in order to detect the presence of damage, the location of damage, the severity of damage, and to estimate the remaining useful life of the structure. This procedure has had some success, but we are still a long way from achieving the performance of nervous systems found in biology. This is primarily because contemporary classification algorithms do not have the performance required. In many cases expert judgment is superior to automated classification. This work introduces a new paradigm. We propose interfacing the human nervous system to the distributed sensor network located on the structure and developing new techniques to enable human-machine cooperation. Results from the field of sensory substitution suggest this should be possible. This study investigates a vibro-haptic human-machine interface for SHM. The investigation was performed using a surrogate three-story structure. The structure features three nonlinearity-inducing bumpers to simulate damage. Accelerometers are placed on each floor to measure the response of the structure to a harmonic base excitation. The accelerometer measurements are preprocessed, and the preprocessed data is then encoded as a vibro-tactile stimulus. Human subjects were then subjected to the vibro-tactile stimulus and asked to characterize the damage in the structure.

  19. A vibro-haptic human-machine interface for structural health monitoring

    DOE PAGES

    Mascarenas, David; Plont, Crystal; Brown, Christina; ...

    2014-11-01

    The structural health monitoring (SHM) community’s goal has been to endow physical systems with a nervous system not unlike those commonly found in living organisms. Typically the SHM community has attempted to do this by instrumenting structures with a variety of sensors, and then applying various signal processing and classification procedures to the data in order to detect the presence of damage, the location of damage, the severity of damage, and to estimate the remaining useful life of the structure. This procedure has had some success, but we are still a long way from achieving the performance of nervous systems found in biology. This is primarily because contemporary classification algorithms do not have the performance required. In many cases expert judgment is superior to automated classification. This work introduces a new paradigm. We propose interfacing the human nervous system to the distributed sensor network located on the structure and developing new techniques to enable human-machine cooperation. Results from the field of sensory substitution suggest this should be possible. This study investigates a vibro-haptic human-machine interface for SHM. The investigation was performed using a surrogate three-story structure. The structure features three nonlinearity-inducing bumpers to simulate damage. Accelerometers are placed on each floor to measure the response of the structure to a harmonic base excitation. The accelerometer measurements are preprocessed, and the preprocessed data is then encoded as a vibro-tactile stimulus. Human subjects were then subjected to the vibro-tactile stimulus and asked to characterize the damage in the structure.

  20. Neural networks and applications tutorial

    NASA Astrophysics Data System (ADS)

    Guyon, I.

    1991-09-01

    The importance of neural networks has grown dramatically during this decade. While only a few years ago they were primarily of academic interest, now dozens of companies and many universities are investigating the potential use of these systems, and products are beginning to appear. The idea of building a machine whose architecture is inspired by that of the brain has roots which go far back in history. Nowadays, technological advances in computers and the availability of custom integrated circuits permit simulations of hundreds or even thousands of neurons. In conjunction, the growing interest in learning machines, non-linear dynamics and parallel computation has spurred renewed interest in artificial neural networks. Many tentative applications have been proposed, including decision systems (associative memories, classifiers, data compressors and optimizers), or parametric models for signal processing purposes (system identification, automatic control, noise canceling, etc.). While they do not always outperform standard methods, neural network approaches are already used in some real world applications for pattern recognition and signal processing tasks. The tutorial is divided into six lectures, which were presented at the Third Graduate Summer Course on Computational Physics (September 3-7, 1990) on Parallel Architectures and Applications, organized by the European Physical Society: (1) Introduction: machine learning and biological computation. (2) Adaptive artificial neurons (perceptron, ADALINE, sigmoid units, etc.): learning rules and implementations. (3) Neural network systems: architectures, learning algorithms. (4) Applications: pattern recognition, signal processing, etc. (5) Elements of learning theory: how to build networks which generalize. (6) A case study: a neural network for on-line recognition of handwritten alphanumeric characters.
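
    As a concrete example of the "adaptive artificial neurons" and learning rules covered in lecture 2, here is a minimal perceptron learning-rule sketch on a toy linearly separable problem; the data and learning rate are made up for illustration.

        import numpy as np

        # Toy linearly separable data: class +1 if x0 + x1 > 1, else -1
        rng = np.random.default_rng(0)
        X = rng.random((100, 2))
        y = np.where(X.sum(axis=1) > 1.0, 1, -1)

        w, b, eta = np.zeros(2), 0.0, 0.1
        for epoch in range(50):
            errors = 0
            for xi, target in zip(X, y):
                if target * (w @ xi + b) <= 0:      # misclassified: apply perceptron update
                    w += eta * target * xi
                    b += eta * target
                    errors += 1
            if errors == 0:                         # converged on separable data
                break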

  1. Disrupted white matter connectivity underlying developmental dyslexia: A machine learning approach.

    PubMed

    Cui, Zaixu; Xia, Zhichao; Su, Mengmeng; Shu, Hua; Gong, Gaolang

    2016-04-01

    Developmental dyslexia has been hypothesized to result from multiple causes and exhibit multiple manifestations, implying a distributed multidimensional effect on human brain. The disruption of specific white-matter (WM) tracts/regions has been observed in dyslexic children. However, it remains unknown if developmental dyslexia affects the human brain WM in a multidimensional manner. Being a natural tool for evaluating this hypothesis, the multivariate machine learning approach was applied in this study to compare 28 school-aged dyslexic children with 33 age-matched controls. Structural magnetic resonance imaging (MRI) and diffusion tensor imaging were acquired to extract five multitype WM features at a regional level: white matter volume, fractional anisotropy, mean diffusivity, axial diffusivity, and radial diffusivity. A linear support vector machine (LSVM) classifier achieved an accuracy of 83.61% using these MRI features to distinguish dyslexic children from controls. Notably, the most discriminative features that contributed to the classification were primarily associated with WM regions within the putative reading network/system (e.g., the superior longitudinal fasciculus, inferior fronto-occipital fasciculus, thalamocortical projections, and corpus callosum), the limbic system (e.g., the cingulum and fornix), and the motor system (e.g., the cerebellar peduncle, corona radiata, and corticospinal tract). These results were well replicated using a logistic regression classifier. These findings provided direct evidence supporting a multidimensional effect of developmental dyslexia on WM connectivity of human brain, and highlighted the involvement of WM tracts/regions beyond the well-recognized reading system in dyslexia. Finally, the discriminating results demonstrated a potential of WM neuroimaging features as imaging markers for identifying dyslexic individuals. © 2016 Wiley Periodicals, Inc.
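
    A hedged sketch of a comparable analysis: a linear SVM trained on region-wise multitype WM features, with cross-validated accuracy and features ranked by the absolute weights of the linear model. The region count and feature values are placeholders; only the group sizes (28 dyslexic, 33 control) follow the record.

        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_regions = 50                                   # placeholder number of WM regions
        # Five feature types per region: volume, FA, MD, axial and radial diffusivity
        X = rng.random((61, n_regions * 5))
        y = np.r_[np.ones(28), np.zeros(33)]             # dyslexic vs. control

        svm = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=5000))
        print("CV accuracy:", cross_val_score(svm, X, y, cv=5).mean())

        svm.fit(X, y)
        weights = np.abs(svm.named_steps["linearsvc"].coef_).ravel()
        top_features = np.argsort(weights)[::-1][:10]    # most discriminative feature indices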

  2. Galaxy Zoo and SPARCFIRE: constraints on spiral arm formation mechanisms from spiral arm number and pitch angles

    NASA Astrophysics Data System (ADS)

    Hart, Ross E.; Bamford, Steven P.; Hayes, Wayne B.; Cardamone, Carolin N.; Keel, William C.; Kruk, Sandor J.; Lintott, Chris J.; Masters, Karen L.; Simmons, Brooke D.; Smethurst, Rebecca J.

    2017-12-01

    In this paper, we study the morphological properties of spiral galaxies, including measurements of spiral arm number and pitch angle. Using Galaxy Zoo 2, a stellar mass-complete sample of 6222 SDSS spiral galaxies is selected. We use the machine vision algorithm SPARCFIRE to identify spiral arm features and measure their associated geometries. A support vector machine classifier is employed to identify reliable spiral features, with which we are able to estimate pitch angles for half of our sample. We use these machine measurements to calibrate visual estimates of arm tightness, and hence estimate pitch angles for our entire sample. The properties of spiral arms are compared with respect to various galaxy properties. The star formation properties of galaxies vary significantly with arm number, but not pitch angle. We find that galaxies hosting strong bars have spiral arms substantially (4°-6°) looser than unbarred galaxies. Accounting for this, spiral arms associated with many-armed structures are looser (by 2°) than those in two-armed galaxies. In contrast to this average trend, galaxies with greater bulge-to-total stellar mass ratios display both fewer and looser spiral arms. This effect is primarily driven by the galaxy disc, such that galaxies with more massive discs contain more spiral arms with tighter pitch angles. This implies that galaxy central mass concentration is not the dominant cause of pitch angle and arm number variations between galaxies, which in turn suggests that not all spiral arms are governed by classical density waves or modal theories.

  3. EPA's Research Report on Turfgrass Allowance

    EPA Pesticide Factsheets

    The purpose of this report is to present and explain data indicating that residential landscapes consisting primarily of turfgrass use considerably more water than landscapes with a mixture of other plants.

  4. Aquilla Lake, Brazos River Basin, Texas, Pre-Impoundment Environmental Study: Supplement to Design Memorandum Number 9, Master Plan (in Response to: 40CFR 1505.3),

    DTIC Science & Technology

    1983-06-01

    Fragment of a species list: ...phaeacantha; White Prairie Rose, Rosa foliolosa; Bur Oak, Quercus macrocarpa; Slippery Elm, Ulmus rubra; Elbow-Bush, Forestiera pubescens; Southern Black-haw, Viburnum... Fragment of the list of plates: Plate 1, above, a cedar elm woodland scene ( -5), herbaceous component consists primarily of Canada...; Plate 2, above, view of a pecan parkland (T3-2), herbaceous and shrub components composed primarily of Smilax, June 1980; below, a mesquite/cedar elm...

  5. Calibrating random forests for probability estimation.

    PubMed

    Dankowski, Theresa; Ziegler, Andreas

    2016-09-30

    Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
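
    The record's second strategy translates the forest's terminal nodes into logistic regression models for re-calibration. The sketch below shows a simplified variant in the same spirit, re-calibrating the forest's predicted probabilities with a logistic regression fitted on data from the new center; it is not the exact terminal-node construction, and all data are placeholders.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X_old, y_old = rng.random((500, 8)), rng.integers(0, 2, 500)   # original center
        X_new, y_new = rng.random((200, 8)), rng.integers(0, 2, 200)   # new center / time point

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_old, y_old)

        # Re-calibrate: logistic regression on the logit of the forest's probability,
        # fitted to outcomes observed at the new center
        p = np.clip(rf.predict_proba(X_new)[:, 1], 1e-6, 1 - 1e-6)
        logit = np.log(p / (1 - p)).reshape(-1, 1)
        recal = LogisticRegression().fit(logit, y_new)

        def recalibrated_proba(X):
            q = np.clip(rf.predict_proba(X)[:, 1], 1e-6, 1 - 1e-6)
            return recal.predict_proba(np.log(q / (1 - q)).reshape(-1, 1))[:, 1]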

  6. QA in Radiation Therapy: The RPC Perspective

    NASA Astrophysics Data System (ADS)

    Ibbott, G. S.

    2010-11-01

    The Radiological Physics Center (RPC) is charged with assuring the consistent delivery of radiation doses to patients on NCI-sponsored clinical trials. To accomplish this, the RPC conducts annual mailed audits of machine calibration, dosimetry audit visits to institutions, reviews of treatment records, and credentialing procedures requiring the irradiation of anthropomorphic phantoms. Through these measurements, the RPC has gained an understanding of the level of quality assurance practiced in this cohort of institutions, and a database of measurements of beam characteristics of a large number of treatment machines. The results of irradiations of phantoms have yielded insight into the delivery of advanced technology treatment procedures.

  7. Optimization of Support Vector Machine (SVM) for Object Classification

    NASA Technical Reports Server (NTRS)

    Scholten, Matthew; Dhingra, Neil; Lu, Thomas T.; Chao, Tien-Hsin

    2012-01-01

    The Support Vector Machine (SVM) is a powerful algorithm, useful in classifying data into species. The SVMs implemented in this research were used as classifiers for the final stage in a Multistage Automatic Target Recognition (ATR) system. A single-kernel SVM known as SVMlight, and a modified version known as an SVM with K-Means Clustering, were used. These SVM algorithms were tested as classifiers under varying conditions. Image noise levels varied, and the orientation of the targets changed. The classifiers were then optimized to demonstrate their maximum potential as classifiers. Results demonstrate the reliability of SVM as a method for classification. From trial to trial, SVM produces consistent results.

  8. Building environment analysis based on temperature and humidity for smart energy systems.

    PubMed

    Yun, Jaeseok; Won, Kwang-Ho

    2012-10-01

    In this paper, we propose a new HVAC (heating, ventilation, and air conditioning) control strategy as part of the smart energy system that can balance occupant comfort against building energy consumption using ubiquitous sensing and machine learning technology. We have developed ZigBee-based wireless sensor nodes and collected realistic temperature and humidity data during one month from a laboratory environment. With the collected data, we have established a building environment model using machine learning algorithms, which can be used to assess occupant comfort level. We expect the proposed HVAC control strategy will be able to provide occupants with a consistently comfortable working or home environment.
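
    A hedged sketch of the kind of comfort model the record describes: a classifier trained on temperature and humidity readings to estimate an occupant comfort level before adjusting the HVAC. The sensor values and comfort labels below are synthetic placeholders, not the collected ZigBee data.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        temp = rng.uniform(16, 32, 1000)            # degC from wireless sensor nodes
        hum = rng.uniform(20, 80, 1000)             # percent relative humidity
        # Synthetic comfort labels: comfortable roughly within 21-26 degC and 30-60 % RH
        comfort = ((temp > 21) & (temp < 26) & (hum > 30) & (hum < 60)).astype(int)

        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(np.column_stack([temp, hum]), comfort)

        # Query the model before adjusting the HVAC setpoint
        print(model.predict_proba([[27.5, 55.0]])[:, 1])   # probability the state is comfortable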

  9. Quadcopter control using a BCI

    NASA Astrophysics Data System (ADS)

    Rosca, S.; Leba, M.; Ionica, A.; Gamulescu, O.

    2018-01-01

    The paper presents how two ubiquitous elements of the present day can be interconnected. On the one hand, drones, which are increasingly present and integrated into more and more fields of activity, beyond the military applications they originated from, moving towards entertainment, real estate, delivery and so on. On the other hand, unconventional man-machine interfaces, which are rich topics to explore now and in the future. Of these, we chose the brain-computer interface (BCI), which allows human-machine interaction without requiring any moving elements. The research consists of mathematical modeling and numerical simulation of a drone and a BCI. An application using a Parrot mini-drone and an Emotiv Insight BCI is then presented.

  10. Motor-response learning at a process control panel by an autonomous robot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spelt, P.F.; de Saussure, G.; Lyness, E.

    1988-01-01

    The Center for Engineering Systems Advanced Research (CESAR) was founded at Oak Ridge National Laboratory (ORNL) by the Department of Energy's Office of Energy Research/Division of Engineering and Geoscience (DOE-OER/DEG) to conduct basic research in the area of intelligent machines. Therefore, researchers at the CESAR Laboratory are engaged in a variety of research activities in the field of machine learning. In this paper, we describe our approach to a class of machine learning which involves motor response acquisition using feedback from trial-and-error learning. Our formulation is being experimentally validated using an autonomous robot learning tasks of control panel monitoring and manipulation to effect process control. The CLIPS Expert System and the associated knowledge base used by the robot in the learning process, which reside in a hypercube computer aboard the robot, are described in detail. Benchmark testing of the learning process on a robot/control panel simulation system consisting of two intercommunicating computers is presented, along with results of sample problems used to train and test the expert system. These data illustrate machine learning and the resulting performance improvement in the robot for problems similar to, but not identical with, those on which the robot was trained. Conclusions are drawn concerning the learning problems, and implications for future work on machine learning for autonomous robots are discussed. 16 refs., 4 figs., 1 tab.

  11. Interlock system for machine protection of the KOMAC 100-MeV proton linac

    NASA Astrophysics Data System (ADS)

    Song, Young-Gi

    2015-02-01

    The 100-MeV proton linear accelerator of the Korea Multi-purpose Accelerator Complex (KOMAC) has been developed, and the beam service started this year after beam commissioning was completed. To protect very sensitive and essential equipment during machine operation, a machine interlock system is required, and such a system has been implemented. The purpose of the interlock system is to shut off the beam when the radio-frequency (RF) systems or the ion source are unstable or a beam loss occurs. The interlock signals of the KOMAC linac include a variety of sources, such as beam loss, RF and high-voltage converter modulator faults, the fast closing valves of the vacuum window at the beam lines, and so on. The system consists of a hardware-based interlock system using analog circuits and a software-based interlock system using an industrial programmable logic controller (PLC). The hardware-based interlock system has been fabricated, and the timing requirement has been satisfied, with response times within 10 µs. The software interlock logic implemented in the PLC has been connected to the Experimental Physics and Industrial Control System (EPICS) framework to integrate a variety of interlock signals and to control the machine components when an interlock occurs. This paper describes the design and construction of the machine interlock system for the KOMAC 100-MeV linac.
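
    A schematic Python sketch of the software-side interlock logic described above: fault signals (beam loss, RF, modulator, fast vacuum valves) are aggregated into a single latched beam-permit decision. The signal names and latching behaviour are illustrative assumptions, not the actual PLC/EPICS implementation.

        from dataclasses import dataclass, field

        @dataclass
        class InterlockSystem:
            """Latching interlock: any asserted fault removes the beam permit until reset."""
            faults: dict = field(default_factory=lambda: {
                "beam_loss": False, "rf_fault": False,
                "modulator_fault": False, "vacuum_fast_valve": False})
            latched: bool = False

            def update(self, **signals) -> bool:
                self.faults.update(signals)
                if any(self.faults.values()):
                    self.latched = True            # shut off the beam and hold it off
                return not self.latched            # True = beam permit

            def reset(self):
                if not any(self.faults.values()):  # only clear after all faults recover
                    self.latched = False

        ilk = InterlockSystem()
        print(ilk.update(rf_fault=True))           # False: beam inhibited
        print(ilk.update(rf_fault=False))          # still False until operator reset
        ilk.reset(); print(ilk.update())           # True: permit restored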

  12. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation

    PubMed Central

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing have forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic preference relation is integrated into the AHP to handle imprecise and vague information and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating machine tools in an uncertain environment. PMID:26368541
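
    A hedged sketch of the decision pipeline in crisp (non-fuzzy) form: AHP attribute weights from the geometric mean of a pairwise comparison matrix, followed by COPRAS ranking of candidate machine tools. The comparison matrix, decision matrix, and criterion types are made-up placeholders; the paper itself uses fuzzy linguistic preference relations and fuzzy COPRAS.

        import numpy as np

        # Pairwise comparison matrix for 3 attributes (e.g. cost, precision, flexibility)
        P = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        w = np.prod(P, axis=1) ** (1 / P.shape[1])       # geometric-mean AHP weights
        w /= w.sum()

        # Decision matrix: rows = candidate machine tools, columns = attributes (made-up scores)
        X = np.array([[52000, 8.5, 7.0],
                      [47000, 7.9, 8.2],
                      [61000, 9.1, 6.5]], dtype=float)
        cost = np.array([True, False, False])            # first attribute is a cost criterion

        D = w * (X / X.sum(axis=0))                      # weighted normalized matrix
        S_plus = D[:, ~cost].sum(axis=1)                 # benefit part
        S_minus = D[:, cost].sum(axis=1)                 # cost part
        Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
        ranking = np.argsort(Q)[::-1]                    # best machine tool first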

  13. Development of a Crush and Mix Machine for Composite Brick Fabrication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sothea, Kruy; Fazli, Nik; Hamdi, M.

    2011-01-17

    Currently, people are more and more concerned about environmental protection. Municipal solid wastes (MSW) have a harmful effect on the environment and on human health. In addition, the amount of municipal solid waste is increasing due to economic development and population density, especially in developing countries, and only a small percentage is recycled. To address this problem, a composite brick forming machine was designed and developed to make bricks from a combination of MSW and mortar. The machine consists of two independent parts: a crusher and mixer part, and a molding part. This paper explores the design of the crusher and mixer part. The crusher is able to cut MSW such as wood, paper and plastic into small pieces. There are two mixers; one is used for making mortar and the other for making slurry. FEA analyses were carried out to verify the strength of the critical parts of the crusher, ensuring that the crusher can run properly with high efficiency. Testing of the crusher shows that it has high performance in cutting MSW. The mixers also work well with high efficiency. The results of composite brick testing show that the machine performs well. This crush and mix machine is an innovation that is portable and economical, using MSW as a replacement for sand.

  14. Development of a Crush and Mix Machine for Composite Brick Fabrication

    NASA Astrophysics Data System (ADS)

    Sothea, Kruy; Fazli, Nik; Hamdi, M.; Aoyama, Hideki

    2011-01-01

    Currently, people are more and more concerned about environmental protection. Municipal solid wastes (MSW) have a harmful effect on the environment and on human health. In addition, the amount of municipal solid waste is increasing due to economic development and population density, especially in developing countries, and only a small percentage is recycled. To address this problem, a composite brick forming machine was designed and developed to make bricks from a combination of MSW and mortar. The machine consists of two independent parts: a crusher and mixer part, and a molding part. This paper explores the design of the crusher and mixer part. The crusher is able to cut MSW such as wood, paper and plastic into small pieces. There are two mixers; one is used for making mortar and the other for making slurry. FEA analyses were carried out to verify the strength of the critical parts of the crusher, ensuring that the crusher can run properly with high efficiency. Testing of the crusher shows that it has high performance in cutting MSW. The mixers also work well with high efficiency. The results of composite brick testing show that the machine performs well. This crush and mix machine is an innovation that is portable and economical, using MSW as a replacement for sand.

  15. Assessment of Genetic and Nongenetic Interactions for the Prediction of Depressive Symptomatology: An Analysis of the Wisconsin Longitudinal Study Using Machine Learning Algorithms

    PubMed Central

    Roetker, Nicholas S.; Yonker, James A.; Chang, Vicky; Roan, Carol L.; Herd, Pamela; Hauser, Taissa S.; Hauser, Robert M.

    2013-01-01

    Objectives. We examined depression within a multidimensional framework consisting of genetic, environmental, and sociobehavioral factors and, using machine learning algorithms, explored interactions among these factors that might better explain the etiology of depressive symptoms. Methods. We measured current depressive symptoms using the Center for Epidemiologic Studies Depression Scale (n = 6378 participants in the Wisconsin Longitudinal Study). Genetic factors were 78 single nucleotide polymorphisms (SNPs); environmental factors—13 stressful life events (SLEs), plus a composite proportion of SLEs index; and sociobehavioral factors—18 personality, intelligence, and other health or behavioral measures. We performed traditional SNP associations via logistic regression likelihood ratio testing and explored interactions with support vector machines and Bayesian networks. Results. After correction for multiple testing, we found no significant single genotypic associations with depressive symptoms. Machine learning algorithms showed no evidence of interactions. Naïve Bayes produced the best models in both subsets and included only environmental and sociobehavioral factors. Conclusions. We found no single or interactive associations with genetic factors and depressive symptoms. Various environmental and sociobehavioral factors were more predictive of depressive symptoms, yet their impacts were independent of one another. A genome-wide analysis of genetic alterations using machine learning methodologies will provide a framework for identifying genetic–environmental–sociobehavioral interactions in depressive symptoms. PMID:23927508

  16. System technology for laser-assisted milling with tool integrated optics

    NASA Astrophysics Data System (ADS)

    Hermani, Jan-Patrick; Emonts, Michael; Brecher, Christian

    2013-02-01

    High strength metal alloys and ceramics offer huge potential for increased efficiency (e.g. in engine components for aerospace or components for gas turbines). However, mass application is still hampered by cost- and time-consuming end-machining due to long processing times and high tool wear. Laser-induced heating shortly before machining can reduce the material strength and improve machinability significantly. The Fraunhofer IPT has developed and successfully realized a new approach for laser-assisted milling with spindle- and tool-integrated, co-rotating optics. The novel optical system inside the tool consists of one deflection prism to position the laser spot in front of the cutting insert and one focusing lens. Using a fiber laser with high beam quality, the laser spot diameter can be precisely adjusted to the chip size. A highly dynamic adaptation of the laser power signal according to the engagement condition of the cutting tool was realized so that already machined workpiece material is not irradiated. During tool engagement the laser power is controlled in proportion to the current material removal rate, which has to be calculated continuously. The required geometric values are generated by a CAD/CAM program and converted into a laser power signal by a real-time controller. The developed milling tool with integrated optics and the algorithm for laser power control enable multi-axis laser-assisted machining of complex parts.
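
    A hedged sketch of the power-control idea described above: laser power commanded in proportion to the current material removal rate, gated on tool engagement and clamped to the source limits. The proportionality constant and limits are placeholder assumptions, not Fraunhofer IPT parameters.

        def laser_power(removal_rate_mm3_s: float,
                        engaged: bool,
                        gain_w_per_mm3_s: float = 12.0,  # assumed gain, W per (mm^3/s)
                        p_min: float = 0.0,
                        p_max: float = 500.0) -> float:
            """Laser power command [W] proportional to the current material removal rate."""
            if not engaged or removal_rate_mm3_s <= 0.0:
                return 0.0                               # never irradiate already machined material
            return min(max(gain_w_per_mm3_s * removal_rate_mm3_s, p_min), p_max)

        # Example: removal rates along the tool path, precomputed from CAD/CAM geometry
        for mrr in [0.0, 5.0, 20.0, 60.0]:
            print(mrr, laser_power(mrr, engaged=mrr > 0.0))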

  17. Improved Saturated Hydraulic Conductivity Pedotransfer Functions Using Machine Learning Methods

    NASA Astrophysics Data System (ADS)

    Araya, S. N.; Ghezzehei, T. A.

    2017-12-01

    Saturated hydraulic conductivity (Ks) is one of the fundamental hydraulic properties of soils. Its measurement, however, is cumbersome, and pedotransfer functions (PTFs) are often used to estimate it instead. Despite much progress over the years, generic PTFs that estimate hydraulic conductivity generally do not perform well. We develop significantly improved PTFs by applying state-of-the-art machine learning techniques coupled with high-performance computing on a large database of over 20,000 soils (the USKSAT and Florida Soil Characterization databases). We compared the performance of four machine learning algorithms (k-nearest neighbors, gradient boosted model, support vector machine, and relevance vector machine) and evaluated the relative importance of several soil properties in explaining Ks. An attempt is also made to better account for soil structural properties; we evaluated the importance of variables derived from transformations of soil water retention characteristics and other soil properties. The gradient boosted models gave the best performance, with root mean square errors less than 0.7 and mean errors on the order of 0.01 on a log scale of Ks [cm/h]. The effective particle size, D10, was found to be the single most important predictor. Other important predictors included percent clay, bulk density, percent organic carbon, coefficient of uniformity, and values derived from water retention characteristics. Model performance was consistently better for Ks values greater than 10 cm/h. This study maximizes the extraction of information from a large database to develop generic machine-learning-based PTFs to estimate Ks. The study also evaluates the importance of various soil properties and their transformations in explaining Ks.
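
    A hedged sketch of a gradient-boosted PTF of the kind described: predicting log10(Ks) from basic soil properties. The predictor list mirrors variables named in the record (D10, clay, bulk density, organic carbon, coefficient of uniformity), but the data here are random placeholders rather than the USKSAT or Florida databases.

        import numpy as np
        import pandas as pd
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_squared_error

        rng = np.random.default_rng(0)
        n = 2000
        soils = pd.DataFrame({
            "d10_mm": rng.lognormal(-3, 1, n),           # effective particle size
            "clay_pct": rng.uniform(0, 60, n),
            "bulk_density": rng.uniform(1.1, 1.8, n),
            "org_carbon_pct": rng.uniform(0, 5, n),
            "coef_uniformity": rng.uniform(1, 20, n),
        })
        log_ks = rng.normal(0, 1, n)                     # placeholder log10(Ks [cm/h]) targets

        X_tr, X_te, y_tr, y_te = train_test_split(soils, log_ks, test_size=0.2, random_state=0)
        gbm = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05, max_depth=4)
        gbm.fit(X_tr, y_tr)

        rmse = mean_squared_error(y_te, gbm.predict(X_te)) ** 0.5
        print("RMSE (log10 Ks):", round(rmse, 3))
        print(dict(zip(soils.columns, gbm.feature_importances_.round(3))))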

  18. A Concept for Optimizing Behavioural Effectiveness & Efficiency

    NASA Astrophysics Data System (ADS)

    Barca, Jan Carlo; Rumantir, Grace; Li, Raymond

    Both humans and machines exhibit strengths and weaknesses that can be enhanced by merging the two entities. This research aims to provide a broader understanding of how closer interactions between these two entities can facilitate more optimal goal-directed performance through the use of artificial extensions of the human body. Such extensions may assist us in adapting to and manipulating our environments in a more effective way than any system known today. To demonstrate this concept, we have developed a simulation in which a semi-interactive virtual spider can be navigated through an environment consisting of several obstacles and a virtual predator capable of killing the spider. The virtual spider can be navigated using three different control systems that assist in optimising overall goal-directed performance. The first two control systems use an onscreen button interface and a touch sensor, respectively, to facilitate human navigation of the spider. The third control system provides autonomous navigation through machine intelligence embedded in the spider, enabling the spider to navigate and react to changes in its local environment. The results of this study indicate that machines should be allowed to override human control in order to maximise the benefits of collaboration between man and machine. This research further indicates that the development of strong machine intelligence, sensor systems that engage all human senses, extra-sensory input systems, physical remote manipulators, multiple intelligent extensions of the human body, as well as a tighter symbiosis between man and machine, can support an upgrade of the human form.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, W.P.; Burkhardt, H.L.; Alsager, M.

    Surveys were conducted of 200 machines owned by 194 dentists. The survey consisted of: a visual check of the condition of the machine; determination of whether adequate aluminum filtration and a lead washer were installed; and a check of the condition of the tube housing shielding with a radiation survey meter. The exposures received by the dentist and his assistant while in their normal positions were also measured. Recommended changes in operating technique were: operation at a higher voltage and lower amperage; use of high-speed film; and underexposure and overdevelopment of the film. Before washers and filters were changed, it was found that the equivalent filtration in 161 machines was less than 2.5 mm of aluminum, 19 machines had 2.5 mm, and 20 had an unknown amount. Before the program was initiated the beam diameter of 100 machines was 2.75 in.; 71 were larger than 2.75 in., and in 29 cases, unknown. Emulsion rating of film used initially was: 120 dentists used fast, 67 intermediate, 6 slow, and 7 films of unknown speed. The operating kilovoltage varied widely: 11 operated at 50-3 kV, 117 at 65-7, 29 at 70-2, 15 at 80-90, and 18 at an unknown voltage. It was found that 11 machines still in use were manufactured before 1935, 23 were manufactured from 1935 to 1944, 44 from 1955 to 1960, and 20 had an unknown date of manufacture. Although in most instances radiation levels did not approach the maximum allowable limits, it was recommended that all unnecessary radiation exposure be eliminated. (H.H.D.)

  20. Machine learning on brain MRI data for differential diagnosis of Parkinson's disease and Progressive Supranuclear Palsy.

    PubMed

    Salvatore, C; Cerasa, A; Castiglioni, I; Gallivanone, F; Augimeri, A; Lopez, M; Arabia, G; Morelli, M; Gilardi, M C; Quattrone, A

    2014-01-30

    Supervised machine learning has been proposed as a revolutionary approach for identifying sensitive medical image biomarkers (or combination of them) allowing for automatic diagnosis of individual subjects. The aim of this work was to assess the feasibility of a supervised machine learning algorithm for the assisted diagnosis of patients with clinically diagnosed Parkinson's disease (PD) and Progressive Supranuclear Palsy (PSP). Morphological T1-weighted Magnetic Resonance Images (MRIs) of PD patients (28), PSP patients (28) and healthy control subjects (28) were used by a supervised machine learning algorithm based on the combination of Principal Components Analysis as feature extraction technique and on Support Vector Machines as classification algorithm. The algorithm was able to obtain voxel-based morphological biomarkers of PD and PSP. The algorithm allowed individual diagnosis of PD versus controls, PSP versus controls and PSP versus PD with an Accuracy, Specificity and Sensitivity>90%. Voxels influencing classification between PD and PSP patients involved midbrain, pons, corpus callosum and thalamus, four critical regions known to be strongly involved in the pathophysiological mechanisms of PSP. Classification accuracy of individual PSP patients was consistent with previous manual morphological metrics and with other supervised machine learning application to MRI data, whereas accuracy in the detection of individual PD patients was significantly higher with our classification method. The algorithm provides excellent discrimination of PD patients from PSP patients at an individual level, thus encouraging the application of computer-based diagnosis in clinical practice. Copyright © 2013 Elsevier B.V. All rights reserved.
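
    A hedged sketch of the PCA-plus-SVM pipeline the record describes, evaluated with leave-one-out cross-validation on stand-in data; the voxel features are random placeholders, and only the group sizes (28 per group) follow the record.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score, LeaveOneOut

        rng = np.random.default_rng(0)
        X = rng.random((56, 5000))            # stand-in voxel features: 28 PD + 28 PSP subjects
        y = np.r_[np.zeros(28), np.ones(28)]  # 0 = PD, 1 = PSP

        # PCA for feature extraction, linear SVM for classification, leave-one-out evaluation
        clf = make_pipeline(PCA(n_components=20), SVC(kernel="linear", C=1.0))
        acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
        print("Leave-one-out accuracy:", acc)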
