Process Capability of High Speed Micro End-Milling of Inconel 718 with Minimum Quantity Lubrication
NASA Astrophysics Data System (ADS)
Rahman, Mohamed Abd; Yeakub Ali, Mohammad; Rahman Shah Rosli, Abdul; Banu, Asfana
2017-03-01
The demand for micro-parts is expected to grow, and micro-machining has been shown to be a viable manufacturing process for producing these products. These micro-products may be produced from hard-to-machine materials such as superalloys with little or no metal cutting fluid to reduce machining cost and the drawbacks associated with health and the environment. This project aims to investigate the capability of the micro end-milling process for Inconel 718 with minimum quantity lubrication (MQL). A Microtools DT-110 multi-process micro machine was used to machine 10 micro-channels with MQL and 10 more under dry conditions while maintaining the same machining parameters. The width of the micro-channels was measured using a digital microscope and used to determine the process capability indices Cp and Cpk. QI Macros SPC for Excel was used to analyze the resulting machining data. The results indicated that the micro end-milling process for Inconel 718 was not capable under either MQL or dry cutting conditions, as indicated by Cp values of less than 1.0. However, the use of MQL helped the process to be more stable and capable, and the results showed that process variation was greatly reduced by using MQL in micro end-milling of Inconel 718.
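As background for the capability indices reported above, the sketch below shows how Cp and Cpk are conventionally computed from a set of measured channel widths; the specification limits and sample values are hypothetical, not data from the study.

```python
# Hedged sketch: conventional Cp/Cpk calculation from measured widths.
# The tolerance band and the sample values below are hypothetical.
import statistics

def process_capability(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)              # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                 # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # capability including centering
    return cp, cpk

# Hypothetical micro-channel widths (mm) against a 0.50 +/- 0.01 mm tolerance
widths = [0.498, 0.503, 0.505, 0.496, 0.501, 0.507, 0.494, 0.502, 0.499, 0.504]
cp, cpk = process_capability(widths, lsl=0.49, usl=0.51)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # Cp < 1.0 would indicate an incapable process
```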
Process capability improvement through DMAIC for aluminum alloy wheel machining
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa; Babu, B. Surendra
2017-07-01
This paper first enlists the generic problems of alloy wheel machining and subsequently details the process improvement of the identified critical-to-quality machining characteristic of the A356 aluminum alloy wheel machining process. The causal factors are traced using the Ishikawa diagram and prioritization of corrective actions is done through process failure modes and effects analysis. Process monitoring charts are employed for improving the process capability index of the process at the industrial benchmark of the four sigma level, which corresponds to a value of 1.33. The procedure adopted for improving the process capability levels is the define-measure-analyze-improve-control (DMAIC) approach. By following the DMAIC approach, the Cp, Cpk and Cpm showed signs of improvement from initial values of 0.66, -0.24 and 0.27 to final values of 4.19, 3.24 and 1.41, respectively.
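For reference, the Taguchi index Cpm cited above differs from Cp in that it penalizes deviation of the process mean from the target value; a standard textbook definition (not taken from the paper itself) is

$$C_{pm} = \frac{\mathrm{USL} - \mathrm{LSL}}{6\sqrt{\sigma^{2} + (\mu - T)^{2}}}$$

where μ is the process mean, σ the process standard deviation, and T the target value of the characteristic.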
Process-based tolerance assessment of connecting rod machining process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.
2016-06-01
Process tolerancing based on process capability studies is an optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of the identified process characteristics of the connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and the process capability studies of the connecting rod component. Finally, a process tolerancing comparison has been carried out using tolerance capability expert software.
Bidding-based autonomous process planning and scheduling
NASA Astrophysics Data System (ADS)
Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.
1995-08-01
Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a way that is consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
Fatigue Life Variability in Large Aluminum Forgings with Residual Stress
2011-07-01
A detailed finite element analysis of the forge/quench/coldwork/machine process was performed in order to predict the bulk residual stresses in a fictitious aluminum bulkhead. The work continues to develop the capability for computational simulation of the forge, quench, cold work, and machining processes.
Linear- and Repetitive Feature Detection Within Remotely Sensed Imagery
2017-04-01
The methods are applicable to Python or other programming languages with image-processing capabilities. The first methodology uses classification machine learning on remotely sensed images that are in panchromatic or true-color formats. Image-processing techniques, including Hough transforms, machine learning, data fusion, and context-based processing, are employed.
Proposed algorithm to improve job shop production scheduling using ant colony optimization method
NASA Astrophysics Data System (ADS)
Pakpahan, Eka KA; Kristina, Sonna; Setiawan, Ari
2017-12-01
This paper deals with the determination of a job shop production schedule in an automated environment. In this particular environment, machines and the material handling system are integrated and controlled by a computer center where schedules are created and then used to dictate the movement of parts and the operations at each machine. This setting is usually designed to allow an unmanned production process for a specified time interval. We consider here parts with various operation requirements. Each operation requires specific cutting tools. These parts are to be scheduled on machines each having identical capability, meaning that each machine is equipped with a similar set of cutting tools and is therefore capable of processing any operation. The availability of a particular machine to process a particular operation is determined by the remaining life time of its cutting tools. We propose an algorithm based on the ant colony optimization method, implemented in MATLAB, to generate a production schedule which minimizes the total processing time of the parts (makespan). We tested the algorithm on data provided by a real industry, and the process shows a very short computation time. This contributes greatly to the flexibility and timeliness targeted in an automated environment.
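To make the scheduling idea concrete, below is a minimal ant colony optimization sketch for sequencing jobs on identical machines to minimize makespan. It omits the cutting-tool life constraint described in the abstract, is written in Python rather than the authors' MATLAB implementation, and all processing times, parameter values, and function names are hypothetical.

```python
# Hedged ACO sketch: ants build job sequences guided by pheromone on
# (position, job) pairs; jobs are assigned in sequence order to the
# earliest-available identical machine, and the best makespan is reinforced.
import random

def makespan(sequence, proc_times, n_machines):
    """Assign jobs in the given order to the earliest-free machine."""
    loads = [0.0] * n_machines
    for job in sequence:
        m = loads.index(min(loads))          # earliest available machine
        loads[m] += proc_times[job]
    return max(loads)

def aco_schedule(proc_times, n_machines, n_ants=20, n_iter=100,
                 alpha=1.0, evaporation=0.1):
    n_jobs = len(proc_times)
    tau = [[1.0] * n_jobs for _ in range(n_jobs)]   # pheromone[position][job]
    best_seq, best_ms = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            remaining = list(range(n_jobs))
            seq = []
            for pos in range(n_jobs):
                weights = [tau[pos][j] ** alpha for j in remaining]
                job = random.choices(remaining, weights=weights)[0]
                seq.append(job)
                remaining.remove(job)
            ms = makespan(seq, proc_times, n_machines)
            if ms < best_ms:
                best_seq, best_ms = seq, ms
        # evaporate, then reinforce the best-so-far sequence
        for pos in range(n_jobs):
            for j in range(n_jobs):
                tau[pos][j] *= (1.0 - evaporation)
        for pos, job in enumerate(best_seq):
            tau[pos][job] += 1.0 / best_ms
    return best_seq, best_ms

if __name__ == "__main__":
    times = [4, 7, 2, 9, 3, 5, 8, 6]      # hypothetical processing times
    seq, ms = aco_schedule(times, n_machines=3)
    print("sequence:", seq, "makespan:", ms)
```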
Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining
Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin
2016-01-01
Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing. PMID:27854322
Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining.
Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin
2016-11-16
Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.
Rapid Prototyping: State of the Art Review
2003-10-23
Materials include H13 tool steel, CP Ti, Ti-6Al-4V titanium, tungsten, copper, aluminum, and nickel. The company's LENS 750 and LENS 850 machines (both $440,000 to $640,000) are capable of producing parts in 316 stainless steel and H13 tool steel. The Arcam EBM S12 model sells for $500,000 and is capable of processing two materials, one of which is H13 tool steel.
NASA Astrophysics Data System (ADS)
Lary, D. J.
2013-12-01
A BigData case study is described in which multiple datasets from several satellites, high-resolution global meteorological data, social media, and in-situ observations are combined using machine learning on a distributed cluster with an automated workflow. The global particulate dataset is relevant to global public health studies and would not be possible to produce without the use of the multiple big datasets, in-situ data, and machine learning. To greatly reduce the development time and enhance the functionality, a high-level language capable of parallel processing (Matlab) has been used. Key considerations for the system are high-speed access due to the large data volume, persistence of the large data volumes, and a precise process time scheduling capability.
NASA Technical Reports Server (NTRS)
Byman, J. E.
1985-01-01
A brief history of aircraft production techniques is given. A flexible machining cell is then described. It is a computer controlled system capable of performing 4-axis machining, part cleaning, dimensional inspection and materials handling functions in an unmanned environment. The cell was designed to: allow processing of similar and dissimilar parts in random order without disrupting production; allow serial (one-shipset-at-a-time) manufacturing; reduce work-in-process inventory; maximize machine utilization through remote set-up; and maximize throughput while minimizing labor.
NASA Technical Reports Server (NTRS)
Sampson, Paul G.; Sny, Linda C.
1992-01-01
The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).
Traceability of On-Machine Tool Measurement: A Review.
Mutilba, Unai; Gomez-Acedo, Eneko; Kortaberria, Gorka; Olarra, Aitor; Yagüe-Fabra, Jose A
2017-07-11
Nowadays, errors during the manufacturing process of high value components are not acceptable in driving industries such as energy and transportation. Sectors such as aerospace, automotive, shipbuilding, nuclear power, large science facilities or wind power need complex and accurate components that demand close measurements and fast feedback into their manufacturing processes. New measuring technologies are already available in machine tools, including integrated touch probes and fast interface capabilities. They provide the possibility to measure the workpiece in-machine during or after its manufacture, maintaining the original setup of the workpiece and avoiding interruption of the manufacturing process to transport the workpiece to a measuring position. However, the traceability of the measurement process on a machine tool is not yet ensured, and measurement data are still not reliable enough for process control or product validation. The scientific objective is to determine the uncertainty of a machine tool measurement and, therefore, convert it into a machine-integrated traceable measuring process. For that purpose, an error budget should consider error sources such as the machine tool, the components under measurement and the interactions between them. This paper reviews all those uncertainty sources, focusing mainly on those related to the machine tool, either in the process of geometric error assessment of the machine or in the technology employed to probe the measurand.
Development of a low energy micro sheet forming machine
NASA Astrophysics Data System (ADS)
Razali, A. R.; Ann, C. T.; Shariff, H. M.; Kasim, N. I.; Musa, M. A.; Ahmad, A. F.
2017-10-01
It is expected that with the miniaturization of the materials being processed, energy consumption is also `miniaturized' proportionally. The focus of this study was to design a low energy micro-sheet-forming machine for thin sheet metal applications and to fabricate a low direct current powered micro-sheet-forming machine. A prototype of a low energy system for a micro-sheet-forming machine, which includes mechanical and electronic elements, was developed. The machine was tested for its performance in terms of natural frequency, punching forces, punching speed and capability, and energy consumption (single punch and frequency-time based). Based on the experiments, the machine can perform 600 strokes per minute and the process is unaffected by the machine's natural frequency. It was also found that sub-joule energy was required for a single stroke of the punching/blanking process. Carbon steel shim up to 100 microns thick was successfully tested and punched. It is concluded that a low power forming machine is feasible to develop and can be used to replace high powered machinery to form micro-products/parts.
NASA Astrophysics Data System (ADS)
Lingadurai, K.; Nagasivamuni, B.; Muthu Kamatchi, M.; Palavesam, J.
2012-06-01
Wire electrical discharge machining (WEDM) is a specialized thermal machining process capable of accurately machining parts of hard materials with complex shapes. Parts having sharp edges that are difficult to machine by mainstream machining processes can be easily machined by the WEDM process. A Design of Experiments (DOE) approach has been reported in this work for stainless steel AISI grade 304, which is used in cryogenic vessels, evaporators, hospital surgical equipment, marine equipment, fasteners, nuclear vessels, feed water tubing, valves, refrigeration equipment, etc., and is machined by WEDM with a brass wire electrode. The DOE method is used to formulate the experimental layout, to analyze the effect of each parameter on the machining characteristics, and to predict the optimal choice for each WEDM parameter such as voltage, pulse ON, pulse OFF and wire feed. It is found that these parameters have a significant influence on machining characteristics such as metal removal rate (MRR), kerf width and surface roughness (SR). The analysis of the DOE reveals that, in general, the pulse ON time significantly affects the kerf width and the wire feed rate affects the SR, while the input voltage mainly affects the MRR.
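As an illustration of the DOE analysis described above, the sketch below estimates main effects from a two-level full-factorial layout over the four WEDM parameters; the design matrix and response values are synthetic placeholders, not the paper's measured data.

```python
# Hedged sketch of a main-effects calculation for a two-level factorial DOE.
# Factor names follow the abstract; the MRR responses below are made up.
import itertools
import statistics

factors = ["voltage", "pulse_on", "pulse_off", "wire_feed"]
runs = []
for levels in itertools.product([-1, 1], repeat=len(factors)):
    # Synthetic response: assumes pulse-on time dominates MRR for illustration only
    mrr = 5 + 1.2 * levels[1] + 0.4 * levels[0] + 0.1 * levels[2] - 0.2 * levels[3]
    runs.append((dict(zip(factors, levels)), mrr))

for f in factors:
    high = statistics.mean(r for cfg, r in runs if cfg[f] == 1)
    low = statistics.mean(r for cfg, r in runs if cfg[f] == -1)
    print(f"main effect of {f} on MRR: {high - low:+.2f}")
```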
Defense Logistics Standard Systems Functional Requirements.
1987-03-01
Artificial Intelligence - the development of a machine capability to perform functions normally associated with human intelligence, such as learning and adapting. On-line, Interactive Access - integrating user input and machine output in a dynamic, real-time, give-and-take process is considered the optimum mode.
Additive Manufacturing in Production: A Study Case Applying Technical Requirements
NASA Astrophysics Data System (ADS)
Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni
Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry and process parameters. The variability of these parameters affects manufacturing drastically, and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end-use applications and enable the technology for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of the manufacturing process variables as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, the laser-based stereolithography system, was able to fulfil the macro- and micro-level geometrical requirements simultaneously, but its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize AM machine technical capabilities and stimulate pre-normative initiatives for the technology for end-use applications.
Gökhan Demir, Ali; Previtali, Barbara
2014-06-01
Magnesium alloys constitute an interesting solution for cardiovascular stents due to their biocompatibility and biodegradability in the human body. Laser microcutting is the industrially accepted method for stent manufacturing. However, the laser-material interaction should be well investigated to control the quality characteristics of the microcutting process, which concern the surface roughness, chemical composition, and microstructure of the final device. Despite the recent developments in industrial laser systems, a universal laser source that can be manipulated flexibly in terms of process parameters is far from reality. Therefore, comparative studies are required to demonstrate processing capabilities. In particular, the laser pulse duration is a key factor determining the processing regime. This work approaches the laser microcutting of AZ31 Mg alloy from the perspective of a comparative study to evaluate the machining capabilities in continuous wave (CW), ns- and fs-pulsed regimes. Three industrial grade machining systems were compared to reach a benchmark in machining quality, productivity, and ease of postprocessing. The results confirmed that, moving toward the ultrashort pulse domain, the machining quality increases, but the need for postprocessing remains. The real advantage of ultrashort pulsed machining was the ease of postprocessing and the maintenance of geometrical integrity of the stent mesh after chemical etching. As a result, the overall production cycle time was shortest for the fs-pulsed laser system, despite the fact that the CW laser system provided the highest cutting speed.
Specification and preliminary design of an array processor
NASA Technical Reports Server (NTRS)
Slotnick, D. L.; Graham, M. L.
1975-01-01
The design of a computer suited to the class of problems typified by the general circulation of the atmosphere was investigated. A fundamental goal was that the resulting machine should have roughly 100 times the computing capability of an IBM 360/95 computer. A second requirement was that the machine should be programmable in a higher level language similar to FORTRAN. Moreover, the new machine would have to be compatible with the IBM 360/95 since the IBM machine would continue to be used for pre- and post-processing. A third constraint was that the cost of the new machine was to be significantly less than that of other extant machines of similar computing capability, such as the ILLIAC IV and CDC STAR. A final constraint was that it should be feasible to fabricate a complete system and put it in operation by early 1978. Although these objectives were generally met, considerable work remains to be done on the routing system.
NASA Astrophysics Data System (ADS)
Mebrahitom, A.; Rizuan, D.; Azmir, M.; Nassif, M.
2016-02-01
High speed milling is one of the recent technologies used to produce mould inserts due to the need for a high surface finish. It is a faster machining process which uses a small side step and a small down step combined with a very high spindle speed and feed rate. In order to effectively use the HSM capabilities, optimizing the tool path strategies and machining parameters is an important issue. In this paper, six different tool path strategies have been investigated with respect to the surface finish and machining time of rectangular cavities in ESR Stavax material. The CAD/CAM machining module of CATIA V5 was used for process planning of the pocket milling of the cavities.
Traceability of On-Machine Tool Measurement: A Review
Gomez-Acedo, Eneko; Kortaberria, Gorka; Olarra, Aitor
2017-01-01
Nowadays, errors during the manufacturing process of high value components are not acceptable in driving industries such as energy and transportation. Sectors such as aerospace, automotive, shipbuilding, nuclear power, large science facilities or wind power need complex and accurate components that demand close measurements and fast feedback into their manufacturing processes. New measuring technologies are already available in machine tools, including integrated touch probes and fast interface capabilities. They provide the possibility to measure the workpiece in-machine during or after its manufacture, maintaining the original setup of the workpiece and avoiding interruption of the manufacturing process to transport the workpiece to a measuring position. However, the traceability of the measurement process on a machine tool is not yet ensured, and measurement data are still not reliable enough for process control or product validation. The scientific objective is to determine the uncertainty of a machine tool measurement and, therefore, convert it into a machine-integrated traceable measuring process. For that purpose, an error budget should consider error sources such as the machine tool, the components under measurement and the interactions between them. This paper reviews all those uncertainty sources, focusing mainly on those related to the machine tool, either in the process of geometric error assessment of the machine or in the technology employed to probe the measurand. PMID:28696358
Statistical Capability Study of a Helical Grinding Machine Producing Screw Rotors
NASA Astrophysics Data System (ADS)
Holmes, C. S.; Headley, M.; Hart, P. W.
2017-08-01
Screw compressors depend for their efficiency and reliability on the accuracy of the rotors, and therefore on the machinery used in their production. The machinery has evolved over more than half a century in response to customer demands for production accuracy, efficiency, and flexibility, and is now at a high level on all three criteria. Production equipment and processes must be capable of maintaining accuracy over a production run, and this must be assessed statistically under strictly controlled conditions. This paper gives numerical data from such a study of an innovative machine tool and shows that it is possible to meet the demanding statistical capability requirements.
Reverse engineering of wörner type drilling machine structure.
NASA Astrophysics Data System (ADS)
Wibowo, A.; Belly, I.; llhamsyah, R.; Indrawanto; Yuwana, Y.
2018-03-01
A product design sometimes needs to be modified based on the conditions of the production facilities and existing resource capabilities without reducing the functional aspects of the product itself. This paper describes the reverse engineering process for the main structure of a Wörner type drilling machine to obtain a machine structure design that can be made with resources of limited capability using simple processes. Structural, functional, and work-mechanism analyses were performed to understand the function and role of each basic component. The drilling machine was dismantled and each basic component measured to obtain geometry and size data for each component. Geometric models of the structure components and the machine assembly were built to facilitate the simulation process and machine performance analysis, which refers to the ISO standard for drilling machines. A tolerance stack-up analysis was also performed to determine the types and values of geometrical and dimensional tolerances, which could affect the ease with which the components can be manufactured and assembled.
NASA Astrophysics Data System (ADS)
Razdan, Vikram; Bateman, Richard
2015-05-01
This study investigates the use of a Smartphone and its camera vision capabilities in Engineering metrology and flaw detection, with a view to developing a low cost alternative to Machine vision systems, which are out of range for small scale manufacturers. A Smartphone has to provide a similar level of accuracy as Machine Vision devices like Smart cameras. The objective set out was to develop an App on an Android Smartphone, incorporating advanced Computer vision algorithms written in Java code. The App could then be used for recording measurements of Twist Drill bits and hole geometry, and analysing the results for accuracy. A detailed literature review was carried out for an in-depth study of Machine vision systems and their capabilities, including a comparison between the HTC One X Android Smartphone and the Teledyne Dalsa BOA Smart camera. A review of the existing metrology Apps in the market was also undertaken. In addition, the drilling operation was evaluated to establish key measurement parameters of a twist Drill bit, especially flank wear and diameter. The methodology covers software development of the Android App, including the use of image processing algorithms like Gaussian Blur, Sobel and Canny available from the OpenCV software library, as well as designing and developing the experimental set-up for carrying out the measurements. The results obtained from the experimental set-up were analysed for the geometry of Twist Drill bits and holes, including diametrical measurements and flaw detection. The results show that Smartphones like the HTC One X have the processing power and the camera capability to carry out metrological tasks, although the dimensional accuracy achievable from the Smartphone App is below the level provided by Machine vision devices like Smart cameras. A Smartphone with mechanical attachments, capable of image processing and having a reasonable level of accuracy in dimensional measurement, has the potential to become a handy low-cost Machine vision system for small scale manufacturers, especially in field metrology and flaw detection.
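A minimal sketch of the blur-plus-edge-detection stage described above, written with the Python OpenCV bindings rather than the Android/Java OpenCV API used in the study; the image file name, kernel size, and Canny thresholds are hypothetical placeholders.

```python
# Hedged sketch of the Gaussian blur + Canny edge pipeline for a drill-bit image.
# File name, kernel size and thresholds are placeholders, not values from the study.
import cv2
import numpy as np

img = cv2.imread("drill_bit.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(img, (5, 5), 0)    # suppress sensor noise before edge detection
edges = cv2.Canny(blurred, 50, 150)           # binary edge map of the bit silhouette

# Rough diameter estimate in pixels from the horizontal extent of the edge map;
# converting pixels to millimetres would require a calibration target (not shown).
cols = np.where(edges > 0)[1]
diameter_px = int(cols.max() - cols.min()) if cols.size else 0
print("estimated diameter (pixels):", diameter_px)
```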
Repurposing mainstream CNC machine tools for laser-based additive manufacturing
NASA Astrophysics Data System (ADS)
Jones, Jason B.
2016-04-01
The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads using the tool changer also enables automated change over between different types of laser processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.
Farley Three-Dimensional-Braiding Machine
NASA Technical Reports Server (NTRS)
Farley, Gary L.
1991-01-01
Process and device known as Farley three-dimensional-braiding machine conceived to fabricate dry continuous fiber-reinforced preforms of complex three-dimensional shapes for subsequent processing into composite structures. Robotic fiber supply dispenses yarn as it traverses braiding surface. Combines many attributes of weaving and braiding processes with other attributes and capabilities. Other applications include decorative cloths, rugs, and other domestic textiles. Concept could lead to large variety of fiber layups and to entirely new products as well as new fiber-reinforcing applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mou, J.I.; King, C.
The focus of this study is to develop a sensor-fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. A deterministic modeling technique was used to derive models for machine performance assessment and enhancement. A sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.
Study of the Productivity and Surface Quality of Hybrid EDM
NASA Astrophysics Data System (ADS)
Wankhade, Sandeepkumar Haribhau; Sharma, Sunil Bansilal
2016-01-01
The development of new, advanced engineering materials and the need for precise prototypes and low-volume production have made electric discharge machining (EDM) an important manufacturing process to meet such demands. It is capable of machining geometrically complex components of hard, difficult-to-machine materials that require precision, such as heat treated tool steels, composites, super alloys, ceramics, carbides, etc. Conversely, its low MRR limits its productivity. Abrasive jet machine (AJM) tools are quick to set up, offer quick turn-around on the machine, and can make parts out of virtually any material. They do not heat the material, hence there is no heat affected zone, and they can make intricate shapes easily. The main advantages are flexibility, low heat production and the ability to machine hard and brittle materials. The main disadvantages are that the process produces a tapered cut and poses health hazards due to dry abrasives. To overcome these limitations and exploit the best of each of the above processes, an attempt has been made to hybridize the AJM and EDM processes. Appropriate abrasives are routed with compressed air through a hollow electrode to construct the hybrid process, i.e., abrasive jet electric discharge machining (AJEDM); the high speed abrasives impinge on the machined surface to remove the recast layer caused by the EDM process. The main process parameters were varied to explore their effects, and experimental results show that AJEDM enhances the machining efficiency with a better surface finish and hence can meet the requirements of modern manufacturing applications.
Review on CNC-Rapid Prototyping
NASA Astrophysics Data System (ADS)
Z, M. Nafis O.; Y, Nafrizuan M.; A, Munira M.; J, Kartina
2012-09-01
This article reviews developments of Computerized Numerical Control (CNC) technology in the rapid prototyping process. Rapid prototyping (RP) can be classified into three major groups: subtractive, additive and virtual. CNC rapid prototyping is grouped under the subtractive category, which involves material removal from a workpiece that is larger than the final part. Richard Wysk established the use of CNC machines for rapid prototyping using sets of 2½-D tool paths from various orientations about a rotary axis to machine parts without refixturing. Since then, there have been a few developments of this process, mainly aimed at optimizing the operation and increasing the process capabilities to stand equal with the common additive types of RP. These developments include the integration of machining and deposition processes (hybrid RP), adaptation of RP to conventional machines, and optimization of the CNC rapid prototyping process based on controlled parameters. The article ends by concluding that the CNC rapid prototyping research area has vast space for improvement, as in the conventional machining processes. Further developments and findings will enhance the usage of this method and minimize the limitations of the current approach to building a prototype.
Accurate Micro-Tool Manufacturing by Iterative Pulsed-Laser Ablation
NASA Astrophysics Data System (ADS)
Warhanek, Maximilian; Mayr, Josef; Dörig, Christian; Wegener, Konrad
2017-12-01
Iterative processing solutions, including multiple cycles of material removal and measurement, are capable of achieving higher geometric accuracy by compensating for most deviations manifesting directly on the workpiece. The remaining error sources are the measurement uncertainty and the repeatability of the material-removal process, including clamping errors. Due to the absence of processing forces, process fluids and wear, pulsed-laser ablation offers high repeatability and can be realized directly on a measuring machine. This work takes advantage of this possibility by implementing an iterative, laser-based correction process for profile deviations registered directly on an optical measurement machine. In this way, efficient iterative processing is enabled that is precise, applicable to all tool materials including diamond, and free of clamping errors. The concept is proven by a prototypical implementation on an industrial tool measurement machine and a nanosecond fibre laser. A number of measurements are performed on both the machine and the processed workpieces. Results show production deviations within a 2 μm diameter tolerance.
Flotation machine and process for removing impurities from coals
Szymocha, K.; Ignasiak, B.; Pawlak, W.; Kulik, C.; Lebowitz, H.E.
1995-12-05
The present invention is directed to a type of flotation machine that combines three separate operations in a single unit. The flotation machine is a hydraulic separator that is capable of reducing the pyrite and other mineral matter content of a coal. When the hydraulic separator is used with a flotation system, the pyrite and certain other mineral particles that may have been entrained by hydrodynamic forces associated with conventional flotation machines and/or by the attachment forces associated with the formation of microagglomerates are washed and separated from the coal. 4 figs.
Flotation machine and process for removing impurities from coals
Szymocha, Kazimierz; Ignasiak, Boleslaw; Pawlak, Wanda; Kulik, Conrad; Lebowitz, Howard E.
1995-01-01
The present invention is directed to a type of flotation machine that combines three separate operations in a single unit. The flotation machine is a hydraulic separator that is capable of reducing the pyrite and other mineral matter content of a coal. When the hydraulic separator is used with a flotation system, the pyrite and certain other mineral particles that may have been entrained by hydrodynamic forces associated with conventional flotation machines and/or by the attachment forces associated with the formation of microagglomerates are washed and separated from the coal.
Flotation machine and process for removing impurities from coals
Szymocha, K.; Ignasiak, B.; Pawlak, W.; Kulik, C.; Lebowitz, H.E.
1997-02-11
The present invention is directed to a type of flotation machine that combines three separate operations in a single unit. The flotation machine is a hydraulic separator that is capable of reducing the pyrite and other mineral matter content of a coal. When the hydraulic separator is used with a flotation system, the pyrite and certain other mineral particles that may have been entrained by hydrodynamic forces associated with conventional flotation machines and/or by the attachment forces associated with the formation of microagglomerates are washed and separated from the coal. 4 figs.
Flotation machine and process for removing impurities from coals
Szymocha, Kazimierz; Ignasiak, Boleslaw; Pawlak, Wanda; Kulik, Conrad; Lebowitz, Howard E.
1997-01-01
The present invention is directed to a type of flotation machine that combines three separate operations in a single unit. The flotation machine is a hydraulic separator that is capable of reducing the pyrite and other mineral matter content of a coal. When the hydraulic separator is used with a flotation system, the pyrite and certain other mineral particles that may have been entrained by hydrodynamic forces associated with conventional flotation machines and/or by the attachment forces associated with the formation of microagglomerates are washed and separated from the coal.
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
NASA Astrophysics Data System (ADS)
Mansor, A. F.; Zakaria, M. S.; Azmi, A. I.; Khalil, A. N. M.; Musa, N. A.
2017-10-01
Cutting fluids play a very important role in machining applications in order to increase tool life, improve surface finish and reduce energy consumption. Compared to petrochemical and synthetic based cutting fluids, vegetable oil based lubricants are safe for operators, environmentally friendly and are becoming more popular in industrial applications. This research paper aims to determine the advantage, in terms of energy consumption during the machining process, of using a vegetable oil (coconut oil) with added nanoparticles (CuO) as the lubricant. The energy was measured for each run of a 2-level factorial experimental layout. The results illustrate that the nanoparticle-enhanced lubricant is capable of reducing energy consumption during the machining process.
Heat-Assisted Machining for Material Removal Improvement
NASA Astrophysics Data System (ADS)
Mohd Hadzley, A. B.; Hafiz, S. Muhammad; Azahar, W.; Izamshah, R.; Mohd Shahir, K.; Abu, A.
2015-09-01
Heat assisted machining (HAM) is a process where an intense heat source is used to locally soften the workpiece material before it is machined by a high speed cutting tool. In this paper, an HAM machine is developed by modifying a small CNC machine with the addition of a special jig to hold the heat source in front of the machine spindle. A preliminary experiment to evaluate the capability of the HAM machine to produce groove formation in a slotting process was conducted. A block of AISI D2 tool steel of 100 mm (width) × 100 mm (length) × 20 mm (height) was cut by plasma heating with different settings of arc current, feed rate and air pressure. Their effect was analyzed based on the distance of cut (DOC). Experimental results demonstrated that the most significant factor contributing to the DOC is the arc current, followed by the feed rate and air pressure. HAM improves the slotting process of AISI D2 by increasing the distance of cut due to the initial cutting groove formed by thermal melting and pressurized air from the heat source.
Inertia Compensation While Scanning Screw Threads on Coordinate Measuring Machines
NASA Astrophysics Data System (ADS)
Kosarevsky, Sergey; Latypov, Viktor
2010-01-01
The use of scanning coordinate-measuring machines for the inspection of screw threads has become common practice nowadays. Compared to touch trigger probing, scanning capabilities allow the measuring process to be sped up while still maintaining high accuracy. However, in some cases accuracy depends drastically on the scanning speed. In this paper a compensation method is proposed that reduces the influence of the inertia of the probing system while scanning screw threads on coordinate-measuring machines.
The role of soft computing in intelligent machines.
de Silva, Clarence W
2003-08-15
An intelligent machine relies on computational intelligence in generating its intelligent behaviour. This requires a knowledge system in which representation and processing of knowledge are central functions. Approximation is a 'soft' concept, and the capability to approximate for the purposes of comparison, pattern recognition, reasoning, and decision making is a manifestation of intelligence. This paper examines the use of soft computing in intelligent machines. Soft computing is an important branch of computational intelligence, where fuzzy logic, probability theory, neural networks, and genetic algorithms are synergistically used to mimic the reasoning and decision making of a human. This paper explores several important characteristics and capabilities of machines that exhibit intelligent behaviour. Approaches that are useful in the development of an intelligent machine are introduced. The paper presents a general structure for an intelligent machine, giving particular emphasis to its primary components, such as sensors, actuators, controllers, and the communication backbone, and their interaction. The role of soft computing within the overall system is discussed. Common techniques and approaches that will be useful in the development of an intelligent machine are introduced, and the main steps in the development of an intelligent machine for practical use are given. An industrial machine, which employs the concepts of soft computing in its operation, is presented, and one aspect of intelligent tuning, which is incorporated into the machine, is illustrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmidt, Derek William; Cardenas, Tana; Doss, Forrest W.
The High Energy Density Physics program at Los Alamos National Laboratory (LANL) has had a multiyear campaign to verify the predictive capability of the interface evolution of shock propagation through different profiles machined into the face of a plastic package with an iodine-doped plastic center region. These experiments varied the machined surface from a simple sine wave to a double sine wave and finally to a multitude of different profiles with power spectrum ranges and shapes to verify LANL's simulation capability. The MultiMode-A profiles had a band-pass flat region of the power spectrum, while the MultiMode-B profile had two band-pass flat regions. Another profile of interest was the 1-Peak profile, a band-pass concept with a spike to one side of the power spectrum. All these profiles were machined in flat and tilted orientations of 30 and 60 deg. Tailor-made machining profiles, supplied by experimental physicists, were compared to actual machined surfaces, and Fourier power spectra were compared to assess the reproducibility of the machining process over the frequency ranges that physicists require.
Schmidt, Derek William; Cardenas, Tana; Doss, Forrest W.; ...
2018-01-15
The High Energy Density Physics program at Los Alamos National Laboratory (LANL) has had a multiyear campaign to verify the predictive capability of the interface evolution of shock propagation through different profiles machined into the face of a plastic package with an iodine-doped plastic center region. These experiments varied the machined surface from a simple sine wave to a double sine wave and finally to a multitude of different profiles with power spectrum ranges and shapes to verify LANL's simulation capability. The MultiMode-A profiles had a band-pass flat region of the power spectrum, while the MultiMode-B profile had two band-pass flat regions. Another profile of interest was the 1-Peak profile, a band-pass concept with a spike to one side of the power spectrum. All these profiles were machined in flat and tilted orientations of 30 and 60 deg. Tailor-made machining profiles, supplied by experimental physicists, were compared to actual machined surfaces, and Fourier power spectra were compared to assess the reproducibility of the machining process over the frequency ranges that physicists require.
Unorganized machines for seasonal streamflow series forecasting.
Siqueira, Hugo; Boccato, Levy; Attux, Romis; Lyra, Christiano
2014-05-01
Modern unorganized machines--extreme learning machines and echo state networks--provide an elegant balance between processing capability and mathematical simplicity, circumventing the difficulties associated with the conventional training approaches of feedforward/recurrent neural networks (FNNs/RNNs). This work performs a detailed investigation of the applicability of unorganized architectures to the problem of seasonal streamflow series forecasting, considering scenarios associated with four Brazilian hydroelectric plants and four distinct prediction horizons. Experimental results indicate the pertinence of these models to the focused task.
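For illustration, a minimal extreme learning machine sketch of the kind discussed above: hidden-layer weights are drawn at random and only the output weights are fit by least squares. The streamflow series, lag count, and hidden-layer size below are hypothetical placeholders, not the Brazilian plant data used in the paper.

```python
# Hedged ELM sketch for one-step-ahead forecasting from lagged values.
import numpy as np

def train_elm(X, y, n_hidden=30, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (kept fixed)
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights by least squares
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Hypothetical seasonal series stands in for a monthly streamflow record
series = np.sin(np.linspace(0, 20, 240)) + 0.1 * np.random.randn(240)
lags = 12
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]
model = train_elm(X[:-24], y[:-24])               # hold out the last 24 points
rmse = np.sqrt(np.mean((predict_elm(model, X[-24:]) - y[-24:]) ** 2))
print("test RMSE:", rmse)
```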
Modelling of human-machine interaction in equipment design of manufacturing cells
NASA Astrophysics Data System (ADS)
Cochran, David S.; Arinez, Jorge F.; Collins, Micah T.; Bi, Zhuming
2017-08-01
This paper proposes a systematic approach to model human-machine interactions (HMIs) in supervisory control of machining operations; it characterises the coexistence of machines and humans for an enterprise to balance the goals of automation/productivity and flexibility/agility. In the proposed HMI model, an operator is associated with a set of behavioural roles as a supervisor for multiple, semi-automated manufacturing processes. The model is innovative in the sense that (1) it represents an HMI based on its functions for process control but provides the flexibility for ongoing improvements in the execution of manufacturing processes; (2) it provides a computational tool to define functional requirements for an operator in HMIs. The proposed model can be used to design production systems at different levels of an enterprise architecture, particularly at the machine level in a production system where operators interact with semi-automation to accomplish the goal of 'autonomation' - automation that augments the capabilities of human beings.
Multispectral Image Processing for Plants
NASA Technical Reports Server (NTRS)
Miles, Gaines E.
1991-01-01
The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Robert; McConnell, Elizabeth
Machining methods across many industries generally require multiple operations to machine and process advanced materials, features with micron precision, and complex shapes. The resulting multiple machining platforms can significantly affect manufacturing cycle time and the precision of the final parts, with a resultant increase in cost and energy consumption. Ultrafast lasers represent a transformative and disruptive technology that removes material with micron precision in a single-step manufacturing process. Such precision results from athermal ablation without modification or damage to the remaining material, which is the key differentiator between ultrafast laser technologies and traditional laser technologies or mechanical processes. Athermal ablation without modification or damage to the material eliminates post-processing or multiple manufacturing steps. Combined with the appropriate technology to control the motion of the workpiece, ultrafast lasers are excellent candidates to provide breakthrough machining capability for difficult-to-machine materials. At the project onset in early 2012, the project team recognized that substantial effort was necessary to improve the application of ultrafast laser and precise motion control technologies (for micromachining difficult-to-machine materials) to further the aggregate throughput and yield improvements over conventional machining methods. The project described in this report advanced these leading-edge technologies through the development and verification of two platforms: a hybrid enhanced laser chassis and a multi-application testbed.
Data Processing for High School Students
ERIC Educational Resources Information Center
Spiegelberg, Emma Jo
1974-01-01
Data processing should be taught at the high school level so students may develop a general understanding and appreciation of the capabilities and the limitations of automated data processing systems. Card machines, wiring, logic, flowcharting, and COBOL programming are to be taught, with behavioral objectives for each section listed. (SC)
Performance study of a data flow architecture
NASA Technical Reports Server (NTRS)
Adams, George
1985-01-01
Teams of scientists studied data flow concepts, static data flow machine architecture, and the VAL language. Each team mapped its application onto the machine and coded it in VAL. The principal findings of the study were: (1) Five of the seven applications used the full power of the target machine. The galactic simulation and multigrid fluid flow teams found that a significantly smaller version of the machine (16 processing elements) would suffice. (2) A number of machine design parameters including processing element (PE) function unit numbers, array memory size and bandwidth, and routing network capability were found to be crucial for optimal machine performance. (3) The study participants readily acquired VAL programming skills. (4) Participants learned that application-based performance evaluation is a sound method of evaluating new computer architectures, even those that are not fully specified. During the course of the study, participants developed models for using computers to solve numerical problems and for evaluating new architectures. These models form the bases for future evaluation studies.
ERIC Educational Resources Information Center
GLOVER, J.H.
The chief objective of this study of speed-skill acquisition was to find a mathematical model capable of simple graphic interpretation for industrial training and production scheduling at the shop floor level. Studies of middle skill development in machine and vehicle assembly, aircraft production, spoolmaking and the machining of parts confirmed…
Running VisIt Software on the Peregrine System | High-Performance Computing
VisIt features a robust remote visualization capability. VisIt can be started on a local machine and used to visualize data on a remote compute cluster. The remote machine must be able to send data back to the local machine, and the VisIt module must be loaded as part of this process; to enable remote visualization, the 'module load' command is used.
Oresko, Joseph J; Duschl, Heather; Cheng, Allen C
2010-05-01
Cardiovascular disease (CVD) is the single leading cause of global mortality and is projected to remain so. Cardiac arrhythmia is a very common type of CVD and may indicate an increased risk of stroke or sudden cardiac death. The ECG is the most widely adopted clinical tool to diagnose and assess the risk of arrhythmia. ECGs measure and display the electrical activity of the heart from the body surface. During patients' hospital visits, however, arrhythmias may not be detected on standard resting ECG machines, since the condition may not be present at that moment in time. While Holter-based portable monitoring solutions offer 24-48 h ECG recording, they lack the capability of providing any real-time feedback for the thousands of heart beats they record, which must be tediously analyzed offline. In this paper, we seek to unite the portability of Holter monitors and the real-time processing capability of state-of-the-art resting ECG machines to provide an assistive diagnosis solution using smartphones. Specifically, we developed two smartphone-based wearable CVD-detection platforms capable of performing real-time ECG acquisition and display, feature extraction, and beat classification. Furthermore, the same statistical summaries available on resting ECG machines are provided.
On the Conditioning of Machine-Learning-Assisted Turbulence Modeling
NASA Astrophysics Data System (ADS)
Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng
2017-11-01
Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model the Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By improving the prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
Practical Framework: Implementing OEE Method in Manufacturing Process Environment
NASA Astrophysics Data System (ADS)
Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.
2016-02-01
A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, the machine is usually unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure machine performance, and the reliable results it produces can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; however, the how factor has not yet been revealed, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring their machine performance and later improve it.
Laser processing of ceramics for microelectronics manufacturing
NASA Astrophysics Data System (ADS)
Sposili, Robert S.; Bovatsek, James; Patel, Rajesh
2017-03-01
Ceramic materials are used extensively in the microelectronics, semiconductor, and LED lighting industries because of their electrically insulating and thermally conductive properties, as well as for their high-temperature-service capabilities. However, their brittleness presents significant challenges for conventional machining processes. In this paper we report on a series of experiments that demonstrate and characterize the efficacy of pulsed nanosecond UV and green lasers in machining ceramics commonly used in microelectronics manufacturing, such as aluminum oxide (alumina) and aluminum nitride. With a series of laser pocket milling experiments, fundamental volume ablation rate and ablation efficiency data were generated. In addition, techniques for various industrial machining processes, such as shallow scribing and deep scribing, were developed and demonstrated. We demonstrate that lasers with higher average powers offer higher processing rates with the one exception of deep scribes in aluminum nitride, where a lower average power but higher pulse energy source outperformed a higher average power laser.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian; Brightwell, Ronald B.; Grant, Ryan
This report presents a specification for the Portals 4 network programming interface. Portals 4 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4 is well suited to massively parallel processing and embedded systems. Portals 4 represents an adaption of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.
The Portals 4.0 network programming interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian W.; Brightwell, Ronald Brian; Pedretti, Kevin
2012-11-01
This report presents a specification for the Portals 4.0 network programming interface. Portals 4.0 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4.0 is well suited to massively parallel processing and embedded systems. Portals 4.0 represents an adaption of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4.0 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.
Modeling of solid-state and excimer laser processes for 3D micromachining
NASA Astrophysics Data System (ADS)
Holmes, Andrew S.; Onischenko, Alexander I.; George, David S.; Pedder, James E.
2005-04-01
An efficient simulation method has recently been developed for multi-pulse ablation processes. This is based on pulse-by-pulse propagation of the machined surface according to one of several phenomenological models for the laser-material interaction. The technique allows quantitative predictions to be made about the surface shapes of complex machined parts, given only a minimal set of input data for parameter calibration. In the case of direct-write machining of polymers or glasses with ns-duration pulses, this data set can typically be limited to the surface profiles of a small number of standard test patterns. The use of phenomenological models for the laser-material interaction, calibrated by experimental feedback, allows fast simulation, and can achieve a high degree of accuracy for certain combinations of material, laser and geometry. In this paper, the capabilities and limitations of the approach are discussed, and recent results are presented for structures machined in SU8 photoresist.
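To make the pulse-by-pulse surface propagation idea concrete, the following sketch steps a one-dimensional surface through repeated pulses using a Beer-Lambert-type phenomenological removal law. The depth law, threshold fluence, beam parameters and pulse count are illustrative assumptions, not the calibrated models or data of the paper.

```python
# Minimal sketch of pulse-by-pulse surface propagation for multi-pulse
# laser ablation. The logarithmic depth law and all parameter values
# below are illustrative assumptions, not data from the paper.
import numpy as np

alpha = 5.0        # effective absorption coefficient, 1/um (assumed)
F_th = 0.5         # ablation threshold fluence, J/cm^2 (assumed)
F0 = 2.0           # peak fluence of the Gaussian spot, J/cm^2 (assumed)
w0 = 10.0          # 1/e^2 beam radius, um (assumed)

x = np.linspace(-30, 30, 601)   # lateral coordinate, um
z = np.zeros_like(x)            # machined surface depth, um

def depth_per_pulse(fluence):
    """Phenomenological removal: logarithmic above threshold, zero below."""
    d = np.zeros_like(fluence)
    above = fluence > F_th
    d[above] = (1.0 / alpha) * np.log(fluence[above] / F_th)
    return d

n_pulses = 50
for _ in range(n_pulses):
    fluence = F0 * np.exp(-2.0 * x**2 / w0**2)  # static Gaussian spot
    z += depth_per_pulse(fluence)               # propagate surface one pulse

print(f"max depth after {n_pulses} pulses: {z.max():.2f} um")
```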
50 Years of Army Computing From ENIAC to MSRC
2000-09-01
processing capability. The scientific visualization program was started in 1984 to provide tools and expertise to help researchers graphically… and materials, forces modeling, nanoelectronics, electromagnetics and acoustics, signal/image processing, and simulation and modeling. The ARL… mechanical and electrical calculating equipment, punch card data processing equipment, analog computers, and early digital machines. Before beginning, we…
Laser Materials Processing Final Report CRADA No. TC-1526-98
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crane, J.; Lehane, C. J.
2017-09-08
This CRADA project was a joint effort between Lawrence Livermore National Laboratory (LLNL) and United Technologies Corporation (UTC)/Pratt & Whitney (P&W) to demonstrate process capability for drilling holes in turbine airfoils using LLNL-developed femtosecond laser machining technology. The basis for this development was the ability of femtosecond lasers to drill precision holes in a variety of materials with little or no collateral damage. The ultimate objective was to develop a laser machine tool consisting of an extremely advanced femtosecond laser subsystem, to be developed by LLNL on a best-effort basis, and a drilling station for turbine blades and vanes, to be developed by P&W. In addition, P&W was responsible for commercializing the system. The goal of the so-called Advanced Laser Drilling (ALD) system was to drill specified complex hole shapes in turbine blades and vanes with a high degree of precision and repeatability while simultaneously being capable of very high speed processing.
NASA Astrophysics Data System (ADS)
Kumar, Harish
The present paper discusses the procedure for evaluating the best measurement capability of a force calibration machine. The best measurement capability is evaluated by comparing the force calibration machine against force standard machines through precision force transfer standards. The force transfer standards are calibrated first by the force standard machine and then by the force calibration machine using a similar procedure. Results are reported and discussed for a force calibration machine of 200 kN capacity, using force transfer standards of 20 kN, 50 kN and 200 kN nominal capacity. Significant variations are found in the uncertainty of force realization by the force calibration machine under the proposed method in comparison to the method adopted earlier.
Real-time face and gesture analysis for human-robot interaction
NASA Astrophysics Data System (ADS)
Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd
2010-05-01
Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, or gestures. Facial expressions and gestures are among the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions and of hand and head gestures are of great importance. We present a system that is tackling these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head- and facial-related features, low-level image features regarding the human hand (optical flow, Hu moments) are also stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, on the other hand, classical decision trees or more sophisticated support vector machines are used for the classification process. The obtained classification results are again handed over to the RTDB, where other processes (like a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
Library Information-Processing System
NASA Technical Reports Server (NTRS)
1985-01-01
System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.
Design and development of linked data from the National Map
Usery, E. Lynn; Varanka, Dalia E.
2012-01-01
The development of linked data on the World-Wide Web provides the opportunity for the U.S. Geological Survey (USGS) to supply its extensive volumes of geospatial data, information, and knowledge in a machine interpretable form and reach users and applications that heretofore have been unavailable. To pilot a process to take advantage of this opportunity, the USGS is developing an ontology for The National Map and converting selected data from nine research test areas to a Semantic Web format to support machine processing and linked data access. In a case study, the USGS has developed initial methods for legacy vector and raster formatted geometry, attributes, and spatial relationships to be accessed in a linked data environment maintaining the capability to generate graphic or image output from semantic queries. The description of an initial USGS approach to developing ontology, linked data, and initial query capability from The National Map databases is presented.
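As a rough illustration of how such linked data can be queried by a machine process, the sketch below loads a tiny RDF graph with rdflib and runs a SPARQL query over it. The usgs: vocabulary, feature names and relationships are hypothetical placeholders, not the actual National Map ontology or endpoints.

```python
# Hedged sketch of machine-processable linked geospatial data: load a tiny
# Turtle graph and answer a semantic query with SPARQL. The vocabulary and
# features are invented placeholders for illustration only.
from rdflib import Graph

turtle = """
@prefix usgs: <http://example.org/usgs/tnm#> .
usgs:stream42 a usgs:Stream ;
    usgs:hasName "Clear Creek" ;
    usgs:flowsInto usgs:river7 .
usgs:river7 a usgs:River ;
    usgs:hasName "Example River" .
"""

g = Graph()
g.parse(data=turtle, format="turtle")

query = """
PREFIX usgs: <http://example.org/usgs/tnm#>
SELECT ?name WHERE {
    ?s a usgs:Stream ;
       usgs:flowsInto ?r ;
       usgs:hasName ?name .
}
"""
for row in g.query(query):
    print(row.name)   # stream features answering the semantic query
```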
Concurrent Image Processing Executive (CIPE). Volume 1: Design overview
NASA Technical Reports Server (NTRS)
Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.
1990-01-01
The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.
Retrofit concept for small safety related stationary machines
NASA Astrophysics Data System (ADS)
Epple, S.; Jalba, C. K.; Muminovic, A.; Jung, R.
2017-05-01
More and more old machines face the problem that their control electronics reach the intended end of their lifecycle while the mechanics themselves, and the process capability, are still in very good condition. This article shows an example of a reactive ion etcher originally built in 1988, which was refitted with a new control concept. The original control unit had been repaired several times based on the manufacturer's obsolescence management. At the start of the retrofit project, the integrated circuits were no longer available for further repair of the original control unit. Safety, repeatability and stability of the process were greatly improved.
The Efficacy of Machine Learning Programs for Navy Manpower Analysis
1993-03-01
This thesis investigated the efficacy of two machine learning programs for Navy manpower analysis. Two machine learning programs, AIM and IXL, were … to generate models from the two commercial machine learning programs. Using a held-out subset of the data, the capabilities of the three models were … partial effects. The author recommended further investigation of AIM's capabilities, and testing in an operational environment. … Machine learning, AIM, IXL.
Communication Studies of DMP and SMP Machines
NASA Technical Reports Server (NTRS)
Sohn, Andrew; Biswas, Rupak; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
Understanding the interplay between machines and problems is key to obtaining high performance on parallel machines. This paper investigates the interplay between programming paradigms and communication capabilities of parallel machines. In particular, we explicate the communication capabilities of the IBM SP-2 distributed-memory multiprocessor and the SGI PowerCHALLENGEarray symmetric multiprocessor. Two benchmark problems of bitonic sorting and Fast Fourier Transform are selected for experiments. Communication-efficient algorithms are developed to exploit the overlapping capabilities of the machines. Programs are written in Message-Passing Interface for portability and identical codes are used for both machines. Various data sizes and message sizes are used to test the machines' communication capabilities. Experimental results indicate that the communication performance of the multiprocessors is consistent with the size of messages. The SP-2 is sensitive to message size but yields a much higher communication overlapping because of the communication co-processor. The PowerCHALLENGEarray is not highly sensitive to message size and yields a low communication overlapping. Bitonic sorting yields lower performance compared to FFT due to a smaller computation-to-communication ratio.
A review of supervised machine learning applied to ageing research.
Fabris, Fabio; Magalhães, João Pedro de; Freitas, Alex A
2017-04-01
Broadly speaking, supervised machine learning is the computational task of learning correlations between variables in annotated data (the training set), and using this information to create a predictive model capable of inferring annotations for new data, whose annotations are not known. Ageing is a complex process that affects nearly all animal species. This process can be studied at several levels of abstraction, in different organisms and with different objectives in mind. Not surprisingly, the diversity of the supervised machine learning algorithms applied to answer biological questions reflects the complexities of the underlying ageing processes being studied. Many works using supervised machine learning to study the ageing process have been recently published, so it is timely to review these works, to discuss their main findings and weaknesses. In summary, the main findings of the reviewed papers are: the link between specific types of DNA repair and ageing; ageing-related proteins tend to be highly connected and seem to play a central role in molecular pathways; ageing/longevity is linked with autophagy and apoptosis, nutrient receptor genes, and copper and iron ion transport. Additionally, several biomarkers of ageing were found by machine learning. Despite some interesting machine learning results, we also identified a weakness of current works on this topic: only one of the reviewed papers has corroborated the computational results of machine learning algorithms through wet-lab experiments. In conclusion, supervised machine learning has contributed to advance our knowledge and has provided novel insights on ageing, yet future work should have a greater emphasis in validating the predictions.
The portals 4.0.1 network programming interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian W.; Brightwell, Ronald Brian; Pedretti, Kevin
2013-04-01
This report presents a specification for the Portals 4.0 network programming interface. Portals 4.0 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4.0 is well suited to massively parallel processing and embedded systems. Portals 4.0 represents an adaption of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4.0 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.
Method of Optimizing the Construction of Machining, Assembly and Control Devices
NASA Astrophysics Data System (ADS)
Iordache, D. M.; Costea, A.; Niţu, E. L.; Rizea, A. D.; Babă, A.
2017-10-01
Industry dynamics, driven by economic and social requirements, must generate more interest in technological optimization capable of ensuring a steady development of advanced technical means to equip machining processes. For these reasons, the development of tools, devices, work equipment and control, as well as the modernization of machine tools, is the certain solution to modernize production systems that require considerable time and effort. This type of approach is also related to our theoretical, experimental and industrial applications of recent years, presented in this paper, which have as main objectives the elaboration and use of mathematical models, new calculation methods, optimization algorithms, new processing and control methods, as well as some structures for the construction and configuration of technological equipment with a high level of performance and substantially reduced costs.
Material Processing Laser Systems In Production
NASA Astrophysics Data System (ADS)
Taeusch, David R.
1988-11-01
The laser processing system is now a respected, productive machine tool in the manufacturing industries. Systems in use today are proving their cost effectiveness and capabilities of processing quality parts. Several types of industrial lasers are described and their applications are discussed, with emphasis being placed on the production environment and methods of protection required for optical equipment against this normally hostile environment.
Perspectives on Machine Learning for Classification of Schizotypy Using fMRI Data.
Madsen, Kristoffer H; Krohne, Laerke G; Cai, Xin-Lu; Wang, Yi; Chan, Raymond C K
2018-03-15
Functional magnetic resonance imaging is capable of estimating functional activation and connectivity in the human brain, and lately there has been increased interest in the use of these functional modalities combined with machine learning for identification of psychiatric traits. While these methods bear great potential for early diagnosis and better understanding of disease processes, there is a wide range of processing choices and pitfalls that may severely hamper interpretation and generalization performance unless carefully considered. In this perspective article, we aim to motivate the use of machine learning in schizotypy research. To this end, we describe common data processing steps while commenting on best practices and procedures. First, we introduce the important role of schizotypy to motivate the importance of reliable classification, and summarize the existing machine learning literature on schizotypy. Then, we describe procedures for extraction of features based on fMRI data, including statistical parametric mapping, parcellation, complex network analysis, and decomposition methods, as well as classification with a special focus on support vector classification and deep learning. We provide more detailed descriptions and software as supplementary material. Finally, we present current challenges in machine learning for classification of schizotypy and comment on future trends and perspectives.
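A minimal, hedged sketch of the kind of cross-validated support vector classification discussed above is shown below; the feature matrix stands in for fMRI-derived features (e.g., a vectorized connectivity matrix per subject), and all sizes and labels are synthetic assumptions rather than the authors' pipeline.

```python
# Illustrative sketch (not the authors' pipeline): cross-validated support
# vector classification of subjects from fMRI-derived features. Data here
# are random placeholders, so accuracy hovers around chance.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
n_subjects, n_features = 60, 300                 # assumed sizes for the demo
X = rng.normal(size=(n_subjects, n_features))    # placeholder feature matrix
y = rng.integers(0, 2, size=n_subjects)          # placeholder group labels

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)       # accuracy per fold
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```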
Toward Usable Interactive Analytics: Coupling Cognition and Computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; North, Chris; Chang, Remco
Interactive analytics provide users a myriad of computational means to aid in extracting meaningful information from large and complex datasets. Much prior work focuses either on advancing the capabilities of machine-centric approaches by the data mining and machine learning communities, or human-driven methods by the visualization and CHI communities. However, these methods do not yet support a true human-machine symbiotic relationship where users and machines work together collaboratively and adapt to each other to advance an interactive analytic process. In this paper we discuss some of the inherent issues, outlining what we believe are the steps toward usable interactive analytics that will ultimately increase the effectiveness for both humans and computers to produce insights.
Controlled English to facilitate human/machine analytical processing
NASA Astrophysics Data System (ADS)
Braines, Dave; Mott, David; Laws, Simon; de Mel, Geeth; Pham, Tien
2013-06-01
Controlled English (CE) is a human-readable information representation format that is implemented using a restricted subset of the English language, but which is unambiguous and directly accessible by simple machine processes. We have been researching the capabilities of CE in a number of contexts, and exploring the degree to which a flexible and more human-friendly information representation format could aid the intelligence analyst in a multi-agent collaborative operational environment; especially in cases where the agents are a mixture of other human users and machine processes aimed at assisting the human users. CE itself is built upon a formal logic basis, but allows users to easily specify models for a domain of interest in a human-friendly language. In our research we have been developing an experimental component known as the "CE Store" in which CE information can be quickly and flexibly processed and shared between human and machine agents. The CE Store environment contains a number of specialized machine agents for common processing tasks and also supports execution of logical inference rules that can be defined in the same CE language. This paper outlines the basic architecture of this approach, discusses some of the example machine agents that have been developed, and provides some typical examples of the CE language and the way in which it has been used to support complex analytical tasks on synthetic data sources. We highlight the fusion of human and machine processing supported through the use of the CE language and CE Store environment, and show this environment with examples of highly dynamic extensions to the model(s) and integration between different user-defined models in a collaborative setting.
Power transfer for rotating medical machine.
Sofia, A; Tavilla, A C; Gardenghi, R; Nicolis, D; Stefanini, I
2016-08-01
Biological tissues often need to be treated inside a biomedical centrifuge, even during the centrifugation step, without interrupting the process. In this paper an advantageous energy transfer method capable of providing sufficient electric power to the rotating, active part is presented.
NASA Astrophysics Data System (ADS)
Yu, Jianbo
2015-12-01
Prognostics is an efficient way to achieve zero-downtime performance, maximum productivity and proactive maintenance of machines. Prognostics intends to assess and predict the time evolution of machine health degradation so that machine failures can be predicted and prevented. A novel prognostics system is developed based on a data-model-fusion scheme using Bayesian inference-based self-organizing maps (SOM) and an integration of logistic regression (LR) and high-order particle filtering (HOPF). In this prognostics system, a baseline SOM is constructed to model the data distribution space of the healthy machine, under the assumption that predictable fault patterns are not available. A Bayesian inference-based probability (BIP) derived from the baseline SOM is developed as a quantitative indication of machine health degradation. The BIP offers a failure probability for the monitored machine, which has an intuitive interpretation related to the health degradation state. Based on historic BIPs, the constructed LR and its modeling noise constitute a high-order Markov process (HOMP) to describe machine health propagation. HOPF is used to solve the HOMP estimation to predict the evolution of the machine health in the form of a probability density function (PDF). An on-line model update scheme is developed to quickly adapt the Markov process to changes in machine health dynamics. The experimental results on a bearing test-bed illustrate the potential applications of the proposed system as an effective and simple tool for machine health prognostics.
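The sketch below illustrates the general idea of scoring machine health against a model trained on healthy data only. A k-means codebook stands in for the baseline SOM, and a nearest-prototype distance stands in for the Bayesian inference-based probability; both substitutions, and all data, are assumptions made purely for illustration.

```python
# Minimal sketch of a data-driven health indicator in the spirit of the
# paper: learn a codebook of "healthy" feature vectors, then score new
# observations by their distance to the nearest prototype. K-means stands
# in for the baseline SOM; the distance stands in for the BIP indicator.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, size=(500, 8))     # placeholder healthy features
degraded = rng.normal(1.5, 1.2, size=(100, 8))    # placeholder degraded features

codebook = KMeans(n_clusters=16, n_init=10, random_state=1).fit(healthy)

def health_score(x):
    """Distance of each sample to its nearest healthy prototype."""
    d = np.linalg.norm(x[:, None, :] - codebook.cluster_centers_[None], axis=2)
    return d.min(axis=1)

print("median score, healthy :", np.median(health_score(healthy)).round(2))
print("median score, degraded:", np.median(health_score(degraded)).round(2))
```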
Design and Analysis of a Sensor System for Cutting Force Measurement in Machining Processes
Liang, Qiaokang; Zhang, Dan; Coppola, Gianmarc; Mao, Jianxu; Sun, Wei; Wang, Yaonan; Ge, Yunjian
2016-01-01
Multi-component force sensors have infiltrated a wide variety of automation products since the 1970s. However, one seldom finds full-component sensor systems available in the market for cutting force measurement in machine processes. In this paper, a new six-component sensor system with a compact monolithic elastic element (EE) is designed and developed to detect the tangential cutting forces Fx, Fy and Fz (i.e., forces along x-, y-, and z-axis) as well as the cutting moments Mx, My and Mz (i.e., moments about x-, y-, and z-axis) simultaneously. Optimal structural parameters of the EE are carefully designed via simulation-driven optimization. Moreover, a prototype sensor system is fabricated, which is applied to a 5-axis parallel kinematic machining center. Calibration experimental results demonstrate that the system is capable of measuring cutting forces and moments with good linearity while minimizing coupling error. Both the Finite Element Analysis (FEA) and calibration experimental studies validate the high performance of the proposed sensor system that is expected to be adopted into machining processes. PMID:26751451
UPEML Version 2. 0: A machine-portable CDC Update emulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehlhorn, T.A.; Young, M.F.
1987-05-01
UPEML is a machine-portable CDC Update emulation program. UPEML is written in ANSI standard Fortran-77 and is relatively simple and compact. It is capable of emulating a significant subset of the standard CDC Update functions, including program library creation and subsequent modification. Machine portability is an essential attribute of UPEML. UPEML was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX 11/780 and the IBM 3081. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both COS and CTSS operating systems, on APOLLO workstations, and on the HP-9000. Version 2.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the compile file. Further enhancements include checks for overlapping corrections, processing of nested calls to common decks, and reads and addfiles from alternate input files.
Function allocation for humans and automation in the context of team dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey C. Joe; John O'Hara; Jacques Hugo
Within Human Factors Engineering, a decision-making process called function allocation (FA) is used during the design life cycle of complex systems to distribute the system functions, often identified through a functional requirements analysis, to all human and automated machine agents (or teammates) involved in controlling the system. Most FA methods make allocation decisions primarily by comparing the capabilities of humans and automation, but then also by considering secondary factors such as cost, regulations, and the health and safety of workers. The primary analysis of the strengths and weaknesses of humans and machines, however, is almost always considered in terms of individual human or machine capabilities. Yet, FA is fundamentally about teamwork in that the goal of the FA decision-making process is to determine the optimal allocations of functions among agents. Given this framing of FA, and the increasing use and sophistication of automation, there are two related social psychological issues that current FA methods need to address more thoroughly. First, many principles for effective human teamwork are not considered as central decision points or in the iterative hypothesis and testing phase in most FA methods, even though it is clear that social factors have numerous positive and negative effects on individual and team capabilities. Second, social psychological factors affect team performance and can be difficult to translate to automated agents, and most FA methods currently do not account for this effect. The implications of these issues are discussed.
An adaptive process-based cloud infrastructure for space situational awareness applications
NASA Astrophysics Data System (ADS)
Liu, Bingwei; Chen, Yu; Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik; Rubin, Bruce
2014-06-01
Space situational awareness (SSA) and defense space control capabilities are top priorities for groups that own or operate man-made spacecraft. Also, with the growing amount of space debris, there is an increased demand for contextual understanding that necessitates the capability of collecting and processing a vast amount of sensor data. Cloud computing, which features scalable and flexible storage and computing services, has been recognized as an ideal candidate that can meet the large-data contextual challenges posed by SSA. Cloud computing consists of physical service providers and middleware virtual machines together with infrastructure, platform, and software as a service (IaaS, PaaS, SaaS) models. However, the typical Virtual Machine (VM) abstraction is on a per-operating-system basis, which is too low-level and limits the flexibility of a mission application architecture. In response to this technical challenge, a novel adaptive process-based cloud infrastructure for SSA applications is proposed in this paper. In addition, the design rationale and a prototype are examined in detail. The SSA Cloud (SSAC) conceptual capability will potentially support space situation monitoring and tracking, object identification, and threat assessment. Lastly, the benefits of more granular and flexible allocation of cloud computing resources are illustrated for data processing and implementation considerations within a representative SSA system environment. We show that container-based virtualization performs better than hypervisor-based virtualization technology in an SSA scenario.
Time to B. cereus about hot chocolate.
Nelms, P K; Larson, O; Barnes-Josiah, D
1997-01-01
OBJECTIVE: To determine the cause of illnesses experienced by employees of a Minneapolis manufacturing plant after drinking hot chocolate bought from a vending machine and to explore the prevalence of similar vending machine-related illnesses. METHODS: The authors inspected the vending machines at the manufacturing plant where employees reported illnesses and at other locations in the city where hot chocolate beverages were sold in machines. Tests were performed on dry mix, water, and beverage samples and on machine parts. RESULTS: Laboratory analyses confirmed the presence of B. cereus in dispensed beverages at a concentration capable of causing illness (170,000 count/gm). In citywide testing of vending machines dispensing hot chocolate, 7 of the 39 licensed machines were found to be contaminated, with two contaminated machines having B. cereus levels capable of causing illness. CONCLUSIONS: Hot chocolate sold in vending machines may contain organisms capable of producing toxins that under favorable conditions, can induce illness. Such illnesses are likely to be underreported. Even low concentrations of B. cereus may be dangerous for vulnerable populations such as the aged or immunosuppressed. Periodic testing of vending machines is thus warranted. The relationship between cleaning practices and B. cereus contamination is an issue for further study. PMID:9160059
Ion beam figuring of highly steep mirrors with a 5-axis hybrid machine tool
NASA Astrophysics Data System (ADS)
Yin, Xiaolin; Tang, Wa; Hu, Haixiang; Zeng, Xuefeng; Wang, Dekang; Xue, Donglin; Zhang, Feng; Deng, Weijie; Zhang, Xuejun
2018-02-01
Ion beam figuring (IBF) is an advanced and deterministic method for optical mirror surface processing. The removal function of IBF varies with the incident angle of the ion beam. Therefore, for curved surfaces, especially highly steep ones, the ion beam source (IBS) should be equipped with 5-axis machining capability so that material is removed along the normal direction of the mirror surface, ensuring the stability of the removal function. Based on the 3-RPS parallel mechanism and a two-dimensional displacement platform, a new type of 5-axis hybrid machine tool for IBF is presented. The figuring process of a highly steep fused silica spherical mirror using the hybrid machine tool is introduced. The R/# of the mirror is 0.96 and the aperture is 104 mm. The figuring result shows that the PV value of the mirror surface error converged from 121.1 nm to 32.3 nm, and the RMS value from 23.6 nm to 3.4 nm.
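As a toy illustration of why the tool keeps the beam normal to the surface, the sketch below scales a Gaussian removal function by the cosine of the incidence angle; this cosine scaling and all parameter values are simplifying assumptions for illustration, not the measured removal functions of the paper.

```python
# Hedged sketch: a Gaussian beam removal function whose volumetric removal
# rate is assumed to scale with the cosine of the incidence angle, showing
# how off-normal incidence changes the removal function.
import numpy as np

def removal_function(r, theta_deg, peak_rate=1.0, sigma=2.0):
    """Removal rate at radius r (mm) from beam centre for incidence theta."""
    theta = np.radians(theta_deg)
    return peak_rate * np.cos(theta) * np.exp(-r**2 / (2.0 * sigma**2))

r = np.linspace(0.0, 6.0, 7)
for theta in (0.0, 20.0, 40.0):
    profile = removal_function(r, theta)
    print(f"theta = {theta:4.1f} deg, peak rate = {profile[0]:.2f} (a.u.)")
```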
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is, however, computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization, the look-up table generation, and its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has very recently become the natural alternative. Emulators are statistical constructs able to approximate the RTM at a fraction of the computational cost, providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the careful design of an acquisition function that favours sampling in low-density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators in toy examples, leaf- and canopy-level PROSPECT and PROSAIL RTMs, and the construction of an optimal look-up table for atmospheric correction based on MODTRAN5.
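A minimal sketch of RTM emulation with a Gaussian process is shown below. The "RTM" is a cheap stand-in function and the training points are drawn at random rather than with the AGAPE acquisition strategy, so everything beyond the general GP-regression pattern is an assumption.

```python
# Hedged sketch of emulating an expensive radiative transfer model (RTM)
# with a Gaussian process. The fake_rtm function is a stand-in for
# PROSAIL/MODTRAN-style calls; parameter ranges are invented.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def fake_rtm(params):
    """Stand-in for a costly RTM: maps (LAI, chlorophyll) to a reflectance."""
    lai, chl = params[:, 0], params[:, 1]
    return np.exp(-0.4 * lai) + 0.002 * chl

rng = np.random.default_rng(2)
X_train = rng.uniform([0.0, 10.0], [6.0, 80.0], size=(40, 2))  # LAI, chlorophyll
y_train = fake_rtm(X_train)                                    # "expensive" calls

kernel = ConstantKernel() * RBF(length_scale=[1.0, 10.0])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

X_new = rng.uniform([0.0, 10.0], [6.0, 80.0], size=(5, 2))
mean, std = gp.predict(X_new, return_std=True)   # fast surrogate + uncertainty
print(np.c_[mean, std].round(4))
```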
NASA Astrophysics Data System (ADS)
Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.
2015-03-01
Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch using resist shrinkage patterns; these patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely, actionable decisions such as rework, scrap, or feeding the predicted information (or information derived from it) forward or back to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
The Physics and Physical Chemistry of Molecular Machines.
Astumian, R Dean; Mukherjee, Shayantani; Warshel, Arieh
2016-06-17
The concept of a "power stroke"-a free-energy releasing conformational change-appears in almost every textbook that deals with the molecular details of muscle, the flagellar rotor, and many other biomolecular machines. Here, it is shown by using the constraints of microscopic reversibility that the power stroke model is incorrect as an explanation of how chemical energy is used by a molecular machine to do mechanical work. Instead, chemically driven molecular machines operating under thermodynamic constraints imposed by the reactant and product concentrations in the bulk function as information ratchets in which the directionality and stopping torque or stopping force are controlled entirely by the gating of the chemical reaction that provides the fuel for the machine. The gating of the chemical free energy occurs through chemical state dependent conformational changes of the molecular machine that, in turn, are capable of generating directional mechanical motions. In strong contrast to this general conclusion for molecular machines driven by catalysis of a chemical reaction, a power stroke may be (and often is) an essential component for a molecular machine driven by external modulation of pH or redox potential or by light. This difference between optical and chemical driving properties arises from the fundamental symmetry difference between the physics of optical processes, governed by the Bose-Einstein relations, and the constraints of microscopic reversibility for thermally activated processes. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fine pitch thermosonic wire bonding: analysis of state-of-the-art manufacturing capability
NASA Astrophysics Data System (ADS)
Cavasin, Daniel
1995-09-01
A comprehensive process characterization was performed at the Motorola plastic package assembly site in Selangor, Malaysia, to document the current fine pitch wire bond process capability, using state-of-the-art equipment, in an actual manufacturing environment. Two machines, representing the latest technology from two separate manufacturers, were operated one shift per day for five days, bonding a 132-lead Plastic Quad Flat Pack. Using a test device specifically designed for fine pitch wire bonding, the bonding programs were alternated between 107 micrometers and 92 micrometers pad pitch, running each pitch for a total of 1600 units per machine. Wire, capillary type, and related materials were standardized and commercially available. A video metrology measurement system, with a demonstrated six sigma repeatability band width of 0.51 micrometers, was utilized to measure the bonded units for bond dimensions and placement. Standard Quality Assurance (QA) metrics were also performed. Results indicate that state-of-the-art thermosonic wire bonding can achieve acceptable assembly yields at these fine pad pitches.
Linear positioning laser calibration setup of CNC machine tools
NASA Astrophysics Data System (ADS)
Sui, Xiulin; Yang, Congjing
2002-10-01
The linear positioning laser calibration setup for CNC machine tools is capable of executing machine tool laser calibration and backlash compensation. Using this setup, hole locations on CNC machine tools will be correct, and machine tool geometry will be evaluated and adjusted. Machine tool laser calibration and backlash compensation is a simple and straightforward process. The first step is to find the stroke limits of the axis and bring the laser head into correct alignment. The second step is to move the machine axis to the other extreme; the laser head is then aligned using rotation and elevation adjustments. Finally, the machine is moved to the start position and the final alignment is verified. The stroke of the machine and the machine compensation interval dictate the amount of data required for each axis; these factors determine the amount of time required for a thorough compensation of the linear positioning accuracy. The Laser Calibrator System monitors the material temperature and the air density, which takes into consideration machine thermal growth and laser beam frequency. This linear positioning laser calibration setup can be used on CNC machine tools, CNC lathes, horizontal centers and vertical machining centers.
Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya
2014-09-01
The aim was to evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using statistical process control (SPC) methodology. The authors investigated the consistency checks of dose per monitor unit (D/MU) and range in proton beams to see whether they were within the tolerance levels of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
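For reference, the process capability indices used in this kind of QA analysis can be computed directly from a sample of daily measurements. The sketch below does so for a hypothetical set of D/MU deviations against the ±2% tolerance; all numbers are invented for illustration.

```python
# Minimal sketch of the capability indices behind an SPC-based QA check,
# applied to hypothetical daily D/MU deviations (in percent) against the
# customized +/-2% tolerance mentioned in the abstract.
import numpy as np

dmu_dev = np.array([0.4, -0.3, 0.8, 0.1, -0.6, 0.5, 0.2, -0.1, 0.7, 0.0])  # % deviation
usl, lsl = 2.0, -2.0                    # customized tolerance level, percent

mu, sigma = dmu_dev.mean(), dmu_dev.std(ddof=1)
cp = (usl - lsl) / (6.0 * sigma)
cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # >= 1.33 corresponds to four sigma capability
```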
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are starting to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and easy-to-access characteristics. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting the geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O of virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
Low Cost Process for Manufacture of Oxide Dispersion Strengthened (ODS) Turbine Nozzle Components.
1979-12-01
LOW COST PROCESS FOR MANUFACTURE OF OXIDE DISPERSION STRENGTHENED (ODS) TURBINE NOZZLE COMPONENTS -- General Electric Company, Aircraft Engine Group … machining processes for low pressure turbine (LPT) vanes, high pressure turbine (HPT) vanes, and HPT band segments for the F101 engine. The primary intent … for aircraft turbine nozzle components. These processes were shown capable of maintaining required microstructures and properties for the vane and …
NASA Astrophysics Data System (ADS)
Budi Harja, Herman; Prakosa, Tri; Raharno, Sri; Yuwana Martawirya, Yatna; Nurhadi, Indra; Setyo Nogroho, Alamsyah
2018-03-01
The production characteristics of the job-shop industry, in which products have wide variety but small volumes, mean that every machine tool is shared to conduct production processes under dynamic load. This dynamic operating condition directly affects machine tool component reliability. Hence, the maintenance schedule for every component should be determined based on the actual usage of machine tool components. This paper describes a study on the development of a monitoring system for obtaining information about each CNC machine tool component's usage in real time, using component grouping based on operation phase. A special device has been developed for monitoring machine tool component usage by utilizing usage phase activity data taken from certain electronic components within the CNC machine. The components are the adaptor, servo driver and spindle driver, as well as some additional components such as a microcontroller and relays. The obtained data are utilized for detecting machine utilization phases such as the power-on state, machine-ready state or spindle-running state. Experimental results have shown that the developed CNC machine tool monitoring system is capable of obtaining phase information of machine tool usage as well as its duration, and of displaying the information in the user interface application.
Confessions of a robot lobotomist
NASA Technical Reports Server (NTRS)
Gottshall, R. Marc
1994-01-01
Since their inception, numerically controlled (NC) machining methods have been used throughout the aerospace industry to mill, drill, and turn complex shapes by sequentially stepping through motion programs. However, the recent demand for more precision, faster feeds, exotic sensors, and branching execution has existing computer numerical control (CNC) and distributed numerical control (DNC) systems running at maximum controller capacity. Typical disadvantages of current CNCs include fixed memory capacities, limited communication ports, and the use of multiple control languages. The need to tailor CNCs to meet specific applications, whether it be expanded memory, additional communications, or integrated vision, often requires replacing the original controller supplied with the commercial machine tool with a more powerful and capable system. This paper briefly describes the process and equipment requirements for new controllers and their evolutionary implementation in an aerospace environment. The process of controller retrofit with currently available machines is examined, along with several case studies and their computational and architectural implications.
Computational Foundations of Natural Intelligence
van Gerven, Marcel
2017-01-01
New developments in AI and neuroscience are revitalizing the quest to understand natural intelligence, offering insight into how to equip machines with human-like capabilities. This paper reviews some of the computational principles relevant for understanding natural intelligence and, ultimately, achieving strong AI. After reviewing basic principles, a variety of computational modeling approaches is discussed. Subsequently, I concentrate on the use of artificial neural networks as a framework for modeling cognitive processes. This paper ends by outlining some of the challenges that remain to fulfill the promise of machines that show human-like intelligence. PMID:29375355
Temperature Measurement and Numerical Prediction in Machining Inconel 718.
Díaz-Álvarez, José; Tapetado, Alberto; Vázquez, Carmen; Miguélez, Henar
2017-06-30
Thermal issues are critical when machining Ni-based superalloy components designed for high temperature applications. The low thermal conductivity and extreme strain hardening of this family of materials results in elevated temperatures around the cutting area. This elevated temperature could lead to machining-induced damage such as phase changes and residual stresses, resulting in reduced service life of the component. Measurement of temperature during machining is crucial in order to control the cutting process, avoiding workpiece damage. On the other hand, the development of predictive tools based on numerical models helps in the definition of machining processes and the obtainment of difficult to measure parameters such as the penetration of the heated layer. However, the validation of numerical models strongly depends on the accurate measurement of physical parameters such as temperature, ensuring the calibration of the model. This paper focuses on the measurement and prediction of temperature during the machining of Ni-based superalloys. The temperature sensor was based on a fiber-optic two-color pyrometer developed for localized temperature measurements in turning of Inconel 718. The sensor is capable of measuring temperature in the range of 250 to 1200 °C. Temperature evolution is recorded in a lathe at different feed rates and cutting speeds. Measurements were used to calibrate a simplified numerical model for prediction of temperature fields during turning.
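The ratio principle behind a two-color pyrometer can be sketched as follows: under the Wien approximation, the ratio of the signals at two wavelengths fixes the temperature nearly independently of emissivity. The wavelengths, the graybody assumption and the example ratios below are illustrative assumptions, not the sensor's actual calibration.

```python
# Hedged sketch of two-color (ratio) pyrometry under the Wien approximation:
# invert the channel ratio for temperature. Wavelengths, emissivity ratio and
# example ratios are assumptions for the demo, not calibration data.
import numpy as np

C2 = 1.4388e-2                      # second radiation constant, m*K
lam1, lam2 = 1.3e-6, 1.55e-6        # channel wavelengths, m (assumed)
eps_ratio = 1.0                     # eps(lam1)/eps(lam2), graybody assumption

def ratio_to_temperature(ratio):
    """Invert the Wien-approximation two-color ratio for temperature in K."""
    num = C2 * (1.0 / lam1 - 1.0 / lam2)
    den = np.log(eps_ratio) + 5.0 * np.log(lam2 / lam1) - np.log(ratio)
    return num / den

for r in (0.3, 0.5, 0.7):           # example measured channel ratios
    print(f"ratio {r:.1f} -> {ratio_to_temperature(r) - 273.15:7.1f} degC")
```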
NASA Astrophysics Data System (ADS)
Paksi, A. B. N.; Ma'ruf, A.
2016-02-01
In general, both machines and human resources are needed for processing a job on the production floor. However, most classical scheduling problems have ignored the possible constraint caused by the availability of workers and have considered only machines as a limited resource. In addition, along with production technology development, routing flexibility appears as a consequence of high product variety and medium demand for each product. Routing flexibility arises from the capability of machines that offer more than one machining process. This paper presents a method to address a scheduling problem constrained by both machines and workers, considering routing flexibility. Scheduling in a dual-resource constrained shop is categorized as an NP-hard problem that needs long computational time. A meta-heuristic approach based on a Genetic Algorithm is used due to its practical implementation in industry. The developed Genetic Algorithm uses an indirect chromosome representation and a procedure to transform the chromosome into a Gantt chart. Genetic operators, namely selection, elitism, crossover, and mutation, are developed to search for the best fitness value until a steady-state condition is achieved. A case study in a manufacturing SME is used to minimize tardiness as the objective function. The algorithm has shown a 25.6% reduction in tardiness, equal to 43.5 hours.
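A greatly simplified sketch of the genetic-algorithm machinery described above follows: a permutation chromosome on a single machine with selection, elitism, order crossover and swap mutation, minimizing total tardiness. The dual-resource (machine plus worker) and routing-flexibility aspects of the paper are deliberately omitted, and the job data are made up.

```python
# Greatly simplified GA sketch: permutation chromosome of jobs on one
# machine, minimizing total tardiness. Job data and GA settings are assumed.
import random

random.seed(0)
proc = [4, 7, 2, 5, 6, 3, 8, 1]            # processing times (assumed)
due = [10, 20, 6, 18, 25, 9, 30, 5]        # due dates (assumed)
N = len(proc)

def tardiness(order):
    t, total = 0, 0
    for j in order:
        t += proc[j]
        total += max(0, t - due[j])
    return total

def order_crossover(a, b):
    i, j = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[i:j] = a[i:j]                    # keep a slice of parent a
    fill = [g for g in b if g not in child]
    for k in range(N):
        if child[k] is None:
            child[k] = fill.pop(0)         # fill the rest in parent b's order
    return child

def mutate(order, p=0.2):
    if random.random() < p:
        i, j = random.sample(range(N), 2)
        order[i], order[j] = order[j], order[i]
    return order

pop = [random.sample(range(N), N) for _ in range(30)]
for _ in range(200):                       # generations
    pop.sort(key=tardiness)
    elite = pop[:2]                        # elitism
    parents = pop[:15]                     # truncation selection
    children = [mutate(order_crossover(random.choice(parents),
                                       random.choice(parents)))
                for _ in range(len(pop) - len(elite))]
    pop = elite + children

best = min(pop, key=tardiness)
print("best sequence:", best, "total tardiness:", tardiness(best))
```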
Micro Slot Generation by μ-ED Milling
NASA Astrophysics Data System (ADS)
Dave, H. K.; Mayanak, M. K.; Rajpurohit, S. R.; Mathai, V. J.
2016-08-01
Micro electro discharge machining is one of the most widely used advanced micro machining techniques owing to its capability to fabricate micro features on any electrically conductive material irrespective of its material properties. Despite its wide acceptability, the process is adversely affected by issues such as wear of the tool electrode, which results in the generation of inaccurate features. Micro ED milling, a process variant in which the tool electrode is simultaneously rotated and scanned during machining, is reported to have high process efficiency for the generation of complicated 3D shapes and features with relatively low electrode wear intensity. In the present study, an attempt has been made to study the effect of two process parameters, viz. capacitance and scanning speed of the tool electrode, on the end wear of the tool electrode and the overcut of micro slots generated by micro ED milling. The experiments were conducted on Al 1100 alloy with a tungsten electrode having a diameter of 300 μm. Results suggest that tool electrode wear and the overcut of the generated micro features are highly influenced by the level of capacitance employed during machining. For the parameter ranges employed in the present study, however, no significant effect of scanning speed was observed on either response.
A high sensitivity wear debris sensor using ferrite cores for online oil condition monitoring
NASA Astrophysics Data System (ADS)
Zhu, Xiaoliang; Zhong, Chong; Zhe, Jiang
2017-07-01
Detecting wear debris and measuring the increasing number of wear debris particles in lubrication oil can indicate abnormal machine wear well ahead of machine failure, and thus are indispensable for online machine health monitoring. A portable wear debris sensor with ferrite cores for online monitoring is presented. The sensor detects wear debris by measuring the inductance change of two planar coils wound around a pair of ferrite cores that make the magnetic flux denser and more uniform in the sensing channel, thereby improving the sensitivity of the sensor. Static testing results showed this wear debris sensor is capable of detecting 11 µm and 50 µm ferrous debris in 1 mm and 7 mm diameter fluidic pipes, respectively; such a high sensitivity has not been achieved before. Furthermore, a synchronized sampling method was also applied to reduce the data size and realize real-time data processing. Dynamic testing results demonstrated that the sensor is capable of detecting wear debris in real time at a high throughput of 750 ml min-1; the measured debris concentration is in good agreement with the actual concentration.
CESAR research in intelligent machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weisbin, C.R.
1986-01-01
The Center for Engineering Systems Advanced Research (CESAR) was established in 1983 as a national center for multidisciplinary, long-range research and development in machine intelligence and advanced control theory for energy-related applications. Intelligent machines of interest here are artificially created operational systems that are capable of autonomous decision making and action. The initial emphasis for research is remote operations, with specific application to dexterous manipulation in unstructured dangerous environments where explosives, toxic chemicals, or radioactivity may be present, or in other environments with significant risk such as coal mining or oceanographic missions. Potential benefits include reduced risk to man in hazardous situations, machine replication of scarce expertise, minimization of human error due to fear or fatigue, and enhanced capability using high resolution sensors and powerful computers. A CESAR goal is to explore the interface between the advanced teleoperation capability of today, and the autonomous machines of the future.
Liu, Xunying; Zhang, Chao; Woodland, Phil; Fonteneau, Elisabeth
2017-01-01
There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental ‘machine states’, generated as the ASR analysis progresses over time, to the incremental ‘brain states’, measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain. PMID:28945744
Application of high-performance computing to numerical simulation of human movement
NASA Technical Reports Server (NTRS)
Anderson, F. C.; Ziegler, J. M.; Pandy, M. G.; Whalen, R. T.
1995-01-01
We have examined the feasibility of using massively-parallel and vector-processing supercomputers to solve large-scale optimization problems for human movement. Specifically, we compared the computational expense of determining the optimal controls for the single support phase of gait using a conventional serial machine (SGI Iris 4D25), a MIMD parallel machine (Intel iPSC/860), and a parallel-vector-processing machine (Cray Y-MP 8/864). With the human body modeled as a 14 degree-of-freedom linkage actuated by 46 musculotendinous units, computation of the optimal controls for gait could take up to 3 months of CPU time on the Iris. Both the Cray and the Intel are able to reduce this time to practical levels. The optimal solution for gait can be found with about 77 hours of CPU on the Cray and with about 88 hours of CPU on the Intel. Although the overall speeds of the Cray and the Intel were found to be similar, the unique capabilities of each machine are better suited to different portions of the computational algorithm used. The Intel was best suited to computing the derivatives of the performance criterion and the constraints whereas the Cray was best suited to parameter optimization of the controls. These results suggest that the ideal computer architecture for solving very large-scale optimal control problems is a hybrid system in which a vector-processing machine is integrated into the communication network of a MIMD parallel machine.
Low cost fabrication development for oxide dispersion strengthened alloy vanes
NASA Technical Reports Server (NTRS)
Perkins, R. J.; Bailey, P. G.
1978-01-01
Viable processes were developed for secondary working of oxide dispersion strengthened (ODS) alloys to near-net shapes (NNS) for aircraft turbine vanes. These processes were shown to be capable of producing the required microstructure and properties for vane applications. Material cost savings of 40 to 50% are projected for the NNS process over the current procedures, which involve machining from rectangular bar. Additional machining cost savings are projected. Of three secondary working processes evaluated, directional forging and plate bending were determined to be viable NNS processes for ODS vanes. Directional forging was deemed most applicable to high pressure turbine (HPT) vanes with their large thickness variations, while plate bending was determined to be most cost effective for low pressure turbine (LPT) vanes because of their limited thickness variations. Since the F101 LPT vane was selected for study in this program, development of plate bending was carried through to the establishment of a preliminary process. Preparation of ODS alloy plate for bending was found to be a straightforward process using currently available bar stock, provided that the capability for reheating between roll passes is available. Advanced ODS-NiCrAl and ODS-FeCrAl alloys were utilized in this program. The workability of all alloys was adequate for directional forging and plate bending, but only the ODS-FeCrAl had adequate workability for shaped preform extrusion.
Failure detection in high-performance clusters and computers using chaotic map computations
Rao, Nageswara S.
2015-09-01
A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
ANN-PSO Integrated Optimization Methodology for Intelligent Control of MMC Machining
NASA Astrophysics Data System (ADS)
Chandrasekaran, Muthumari; Tamang, Santosh
2017-08-01
Metal Matrix Composites (MMC) show improved properties in comparison with non-reinforced alloys and have found increased application in the automotive and aerospace industries. The selection of optimum machining parameters to produce components of desired surface roughness is of great concern considering the quality and economy of the manufacturing process. In this study, a surface roughness prediction model for turning Al-SiCp MMC is developed using an Artificial Neural Network (ANN). Three turning parameters, viz. spindle speed (N), feed rate (f) and depth of cut (d), were considered as input neurons and surface roughness was the output neuron. An ANN architecture of 3-5-1 is found to be optimum, and the model predicts with an average percentage error of 7.72%. The Particle Swarm Optimization (PSO) technique is used for optimizing the parameters to minimize machining time. The innovative aspect of this work is the development of an integrated ANN-PSO optimization method for intelligent control of the MMC machining process applicable to manufacturing industries. The robustness of the method shows its superiority for obtaining optimum cutting parameters satisfying the desired surface roughness. The method has better convergence capability with a minimum number of iterations.
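As an illustration of the optimization stage described above, the following is a minimal particle swarm optimization sketch in Python. The surrogate roughness function, the machining-time expression, and the parameter bounds are assumptions for demonstration and do not come from the paper, where a trained ANN plays the surrogate role.

```python
import numpy as np

# Hypothetical surrogate for the trained ANN: predicts surface roughness Ra (um)
# from spindle speed N (rpm), feed f (mm/rev) and depth of cut d (mm). Assumed form.
def predicted_ra(x):
    N, f, d = x
    return 2.5 - 0.001 * N + 12.0 * f + 0.8 * d

def machining_time(x):
    N, f, _ = x
    return 1.0e4 / (N * f)            # proportional to 1/(speed * feed) for a fixed cut length

def objective(x, ra_max=1.6):
    # Penalize solutions whose predicted roughness exceeds the target Ra.
    penalty = 1.0e3 * max(0.0, predicted_ra(x) - ra_max)
    return machining_time(x) + penalty

lo = np.array([500.0, 0.05, 0.2])     # assumed lower bounds on (N, f, d)
hi = np.array([3000.0, 0.30, 1.5])    # assumed upper bounds on (N, f, d)

rng = np.random.default_rng(0)
n, iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform(lo, hi, size=(n, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 3)), rng.random((n, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best (N, f, d):", gbest, "machining time:", machining_time(gbest))
```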
NASA Astrophysics Data System (ADS)
Majumder, Himadri; Maity, Kalipada
2018-03-01
Shape memory alloys have a unique capability to return to their original shape after physical deformation by applying heat or a thermo-mechanical or magnetic load. In this experimental investigation, desirability function analysis (DFA), a multi-attribute decision-making method, was utilized to find the optimum input parameter setting during wire electrical discharge machining (WEDM) of Ni-Ti shape memory alloy. Four critical machining parameters, namely pulse on time (TON), pulse off time (TOFF), wire feed (WF) and wire tension (WT), were taken as machining inputs for the experiments to optimize three interconnected responses, namely cutting speed, kerf width, and surface roughness. The input parameter combination TON = 120 μs, TOFF = 55 μs, WF = 3 m/min and WT = 8 kg-F was found to produce the optimum results. The optimum process parameters for each desired response were also attained using Taguchi's signal-to-noise ratio. A confirmation test was done to validate the optimum machining parameter combination, which affirmed that DFA is a competent approach for selecting optimum input parameters for the desired response quality in WEDM of Ni-Ti shape memory alloy.
Shared direct memory access on the Explorer 2-LX
NASA Technical Reports Server (NTRS)
Musgrave, Jeffrey L.
1990-01-01
Advances in Expert System technology and Artificial Intelligence have provided a framework for applying automated Intelligence to the solution of problems which were generally perceived as intractable using more classical approaches. As a result, hybrid architectures and parallel processing capability have become more common in computing environments. The Texas Instruments Explorer II-LX is an example of a machine which combines a symbolic processing environment, and a computationally oriented environment in a single chassis for integrated problem solutions. This user's manual is an attempt to make these capabilities more accessible to a wider range of engineers and programmers with problems well suited to solution in such an environment.
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML)
Lechevalier, D.; Ak, R.; Ferguson, M.; Law, K. H.; Lee, Y.-T. T.; Rachuri, S.
2017-01-01
This paper describes Gaussian process regression (GPR) models presented in the Predictive Model Markup Language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models that take the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distributions for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input-output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid employment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain. PMID:29202125
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).
Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S
2017-01-01
This paper describes Gaussian process regression (GPR) models presented in the Predictive Model Markup Language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models that take the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distributions for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input-output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid employment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.
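The PMML encoding itself is not shown in the abstracts above; for orientation only, here is a minimal scikit-learn sketch of fitting a GPR model and obtaining the predictive mean with approximate 95% confidence bounds. The data and kernel choice are assumptions, not the papers' PMML workflow.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic manufacturing-style data: tool wear vs. cutting time (illustrative only).
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 10.0, size=(40, 1))
y = 0.3 * X.ravel() + 0.2 * np.sin(X.ravel()) + rng.normal(0.0, 0.05, 40)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.linspace(0.0, 10.0, 5).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)      # predictive mean and uncertainty
lower, upper = mean - 1.96 * std, mean + 1.96 * std   # ~95% confidence bounds
print(np.c_[X_new.ravel(), mean, lower, upper])
```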
Systematics for checking geometric errors in CNC lathes
NASA Astrophysics Data System (ADS)
Araújo, R. P.; Rolim, T. L.
2015-10-01
Non-idealities present in machine tools directly compromise both the geometry and the dimensions of machined parts, generating distortions in the project. Given the competitive scenario among different companies, it is necessary to have knowledge of the geometric behavior of these machines in order to establish their processing capability, avoiding waste of time and materials as well as satisfying customer requirements. Although geometric tests are important and necessary to verify the correct use of the machine, thereby preventing future damage, most users do not apply such tests to their machines for lack of knowledge or lack of proper motivation, basically due to two factors: the long testing time and the high testing costs. This work proposes a systematics for checking straightness and perpendicularity errors in CNC lathes demanding little time and cost with high metrological reliability, to be used on the factory floors of small and medium-size businesses to ensure the quality of their products and make them competitive.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho
Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
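For reference, the capability indices used above follow the standard definitions Cp = (USL − LSL)/(6σ) and Cpk = min(USL − µ, µ − LSL)/(3σ). Below is a minimal sketch with hypothetical D/MU readings and the ±2% tolerance stated in the abstract; the sample values are invented for illustration.

```python
import numpy as np

# Hypothetical daily D/MU consistency readings, as percent deviation from baseline.
dmu_dev = np.array([0.3, -0.5, 0.8, 0.1, -1.1, 0.6, -0.2, 0.4, -0.7, 0.9])

lsl, usl = -2.0, 2.0                      # tolerance of +/-2% from the abstract
mu, sigma = dmu_dev.mean(), dmu_dev.std(ddof=1)

cp = (usl - lsl) / (6.0 * sigma)
cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # Cp >= 1.33 is a common capability requirement
```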
Man/Machine Interaction Dynamics And Performance (MMIDAP) capability
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
The creation of an ability to study interaction dynamics between a machine and its human operator can be approached from a myriad of directions. The Man/Machine Interaction Dynamics and Performance (MMIDAP) project seeks to create an ability to study the consequences of machine design alternatives relative to the performance of both machine and operator. The class of machines to which this study is directed includes those that require the intelligent physical exertions of a human operator. While Goddard's Flight Telerobotics program was expected to be a major user, basic engineering design and biomedical applications reach far beyond telerobotics. Ongoing efforts of the GSFC and its university and small-business collaborators to integrate both human performance and musculoskeletal databases with the analysis capabilities necessary to enable the study of dynamic actions, reactions, and performance of coupled machine/operator systems are outlined.
NASA Technical Reports Server (NTRS)
Garin, John; Matteo, Joseph; Jennings, Von Ayre
1988-01-01
The capability for a single operator to simultaneously control complex remote multi-degree-of-freedom robotic arms and associated dexterous end effectors is being developed. An optimal solution within the realm of current technology can be achieved by recognizing that: (1) machines/computer systems are more effective than humans when the task is routine and specified, and (2) humans process complex data sets and deal with the unpredictable better than machines. These observations lead naturally to a philosophy in which the human's role becomes a higher-level function associated with planning, teaching, initiating, monitoring, and intervening when the machine gets into trouble, while the machine performs the codifiable tasks with deliberate efficiency. This concept forms the basis for the integration of man and telerobotics, i.e., robotics with the operator in the control loop. The concept of integrating the human in the loop and maximizing the feed-forward and feedback data flow is referred to as telepresence.
DESIGN AND EVALUATION OF INDIVIDUAL ELEMENTS OF THE INTERFACE FOR AN AGRICULTURAL MACHINE.
Rakhra, Aadesh K; Mann, Danny D
2018-01-29
If a user-centered approach is not used to design information displays, the quantity and quality of information presented to the user may not match the needs of the user, or it may exceed the capability of the human operator for processing and using that information. The result may be an excessive mental workload and reduced situation awareness of the operator, which can negatively affect the machine performance and operational outcomes. The increasing use of technology in agricultural machines may expose the human operator to excessive and undesirable information if the operator's information needs and information processing capabilities are ignored. In this study, a user-centered approach was used to design specific interface elements for an agricultural air seeder. Designs of the interface elements were evaluated in a laboratory environment by developing high-fidelity prototypes. Evaluations of the user interface elements yielded significant improvement in situation awareness (up to 11%; overall mean difference = 5.0 (4.8%), 95% CI (6.4728, 3.5939), p < 0.0001). Mental workload was reduced by up to 19.7% (overall mean difference = -5.2 (-7.9%), n = 30, α = 0.05). Study participants rated the overall performance of the newly designed user-centered interface elements higher in comparison to the previous designs (overall mean difference = 27.3 (189.8%), 99% CI (35.150, 19.384), p < 0.0001). Copyright © by the American Society of Agricultural Engineers.
Learning and Optimization of Cognitive Capabilities. Final Project Report.
ERIC Educational Resources Information Center
Lumsdaine, A.A.; And Others
The work of a three-year series of experimental studies of human cognition is summarized in this report. Problem solving and learning in man-machine interaction were investigated, as well as relevant variables and processes. The work included four separate projects: (1) computer-aided problem solving, (2) computer-aided instruction techniques, (3)…
Harnessing Disordered-Ensemble Quantum Dynamics for Machine Learning
NASA Astrophysics Data System (ADS)
Fujii, Keisuke; Nakajima, Kohei
2017-08-01
The quantum computer has amazing potential for fast information processing. However, the realization of a digital quantum computer is still a challenging problem requiring highly accurate controls and key application strategies. Here we propose a platform, quantum reservoir computing, to solve these issues successfully by exploiting the natural quantum dynamics of ensemble systems, which are ubiquitous in laboratories nowadays, for machine learning. This framework enables ensemble quantum systems to universally emulate nonlinear dynamical systems, including classical chaos. A number of numerical experiments show that quantum systems consisting of 5-7 qubits possess computational capabilities comparable to those of conventional recurrent neural networks of 100-500 nodes. This discovery opens up a paradigm for information processing with artificial intelligence powered by quantum physics.
Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics
NASA Astrophysics Data System (ADS)
Chen, Yu-Zhong; Lai, Ying-Cheng
2018-03-01
Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named the sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real-world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulate its dynamical behavior with high precision.
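The SDBM construction itself relies on compressive sensing and clustering as described above; as a simplified stand-in (not the authors' algorithm), the sketch below recovers the incoming links of each node of a small binary Glauber-type network using L1-regularized logistic regression. The network size, coupling statistics, and thresholds are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
N, T = 12, 4000

# Ground-truth sparse coupling matrix (no self-loops).
W = np.where(rng.random((N, N)) < 0.15, rng.normal(0.0, 1.0, (N, N)), 0.0)
np.fill_diagonal(W, 0.0)

# Simulate binary (0/1) Glauber-type dynamics: P(s_i(t+1)=1) = sigmoid(sum_j W_ij s_j(t)).
S = np.zeros((T, N), dtype=int)
S[0] = rng.integers(0, 2, N)
for t in range(T - 1):
    p = 1.0 / (1.0 + np.exp(-(S[t] @ W.T)))
    S[t + 1] = (rng.random(N) < p).astype(int)

# Reconstruct each node's incoming weights from the time series (sparse logistic fit).
W_hat = np.zeros_like(W)
for i in range(N):
    clf = LogisticRegression(penalty="l1", C=1.0, solver="liblinear")
    clf.fit(S[:-1], S[1:, i])
    W_hat[i] = clf.coef_[0]

# Compare the recovered link pattern with the ground truth (threshold is arbitrary).
links_true = np.abs(W) > 1e-9
links_hat = np.abs(W_hat) > 0.2
print("link recovery accuracy:", (links_true == links_hat).mean())
```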
Development of a plan for automating integrated circuit processing
NASA Technical Reports Server (NTRS)
1971-01-01
The operations analysis and equipment evaluations pertinent to the design of an automated production facility capable of manufacturing beam-lead CMOS integrated circuits are reported. The overall plan shows approximate cost of major equipment, production rate and performance capability, flexibility, and special maintenance requirements. Direct computer control is compared with supervisory-mode operations. The plan is limited to wafer processing operations from the starting wafer to the finished beam-lead die after separation etching. The work already accomplished in implementing various automation schemes, and the type of equipment which can be found for instant automation are described. The plan is general, so that small shops or large production units can perhaps benefit. Examples of major types of automated processing machines are shown to illustrate the general concepts of automated wafer processing.
Using machine learning to emulate human hearing for predictive maintenance of equipment
NASA Astrophysics Data System (ADS)
Verma, Dinesh; Bent, Graham
2017-05-01
At the current time, interfaces between humans and machines use only a limited subset of the senses that humans are capable of. The interaction between humans and computers can become much more intuitive and effective if we are able to use more senses and create other modes of communication between them. New machine learning technologies can make this type of interaction become a reality. In this paper, we present a framework for holistic communication between humans and machines that uses all of the senses, and discuss how a subset of this capability can allow machines to talk to humans to indicate their health for various tasks such as predictive maintenance.
UltraNet Target Parameters. Chapter 1
NASA Technical Reports Server (NTRS)
Kislitzin, Katherine T.; Blaylock, Bruce T. (Technical Monitor)
1992-01-01
The UltraNet is a high-speed network capable of rates up to one gigabit per second. It is a hub-based network with four optical fiber links connecting each hub. Each link can carry up to 256 megabits of data, and the hub backplane is capable of one gigabit aggregate throughput. Host connections to the hub may be fiber, coax, or channel based. Bus-based machines have adapter boards that connect to transceivers in the hub, while channel-based machines use a personality module in the hub. One way that the UltraNet achieves its high transfer rates is by off-loading the protocol processing from the hosts to special-purpose protocol engines in the UltraNet hubs. In addition, every hub has a PC connected to it by StarLAN for network management purposes. Although there is hub-resident and PC-resident UltraNet software, this document treats only the host-resident UltraNet software.
NASA Astrophysics Data System (ADS)
Kozhina, T. D.; Kurochkin, A. V.
2016-04-01
The paper highlights the results of investigative tests of GTE compressor Ti-alloy blades obtained by electrochemical machining (ECM) with oscillating tool-electrodes, carried out in order to define the optimal parameters of the ECM process that ensure the blade quality parameters specified in the design documentation while providing maximal performance. New technological methods are suggested based on the results of the tests; in particular, the application of vibrating tool-electrodes and the employment of locating elements made of high-strength materials significantly extend the capabilities of this method.
Solar prediction and intelligent machines
NASA Technical Reports Server (NTRS)
Johnson, Gordon G.
1987-01-01
The solar prediction program is aimed at reducing or eliminating the need to thoroughly understand the previously developed process while still being able to produce a prediction. Substantial progress was made in identifying the procedures to be coded as well as in testing some of the presently coded work. Another project involves work on developing ideas and software that should result in a machine capable of learning as well as carrying on an intelligent conversation over a wide range of topics. The underlying idea is to use primitive ideas and construct higher-order ideas from these, which can then be easily related one to another.
Temperature Measurement and Numerical Prediction in Machining Inconel 718
Tapetado, Alberto; Vázquez, Carmen; Miguélez, Henar
2017-01-01
Thermal issues are critical when machining Ni-based superalloy components designed for high-temperature applications. The low thermal conductivity and extreme strain hardening of this family of materials result in elevated temperatures around the cutting area. This elevated temperature could lead to machining-induced damage such as phase changes and residual stresses, resulting in reduced service life of the component. Measurement of temperature during machining is crucial in order to control the cutting process and avoid workpiece damage. On the other hand, the development of predictive tools based on numerical models helps in the definition of machining processes and the obtainment of difficult-to-measure parameters such as the penetration of the heated layer. However, the validation of numerical models strongly depends on the accurate measurement of physical parameters such as temperature, ensuring the calibration of the model. This paper focuses on the measurement and prediction of temperature during the machining of Ni-based superalloys. The temperature sensor was based on a fiber-optic two-color pyrometer developed for localized temperature measurements in turning of Inconel 718. The sensor is capable of measuring temperature in the range of 250 to 1200 °C. Temperature evolution is recorded in a lathe at different feed rates and cutting speeds. Measurements were used to calibrate a simplified numerical model for prediction of temperature fields during turning. PMID:28665312
Teleoperators - Manual/automatic system requirements.
NASA Technical Reports Server (NTRS)
Janow, C.; Malone, T. B.
1973-01-01
The teleoperator is defined as a remotely controlled, cybernetic, man-machine system designed to extend and augment man's sensory, manipulative, and cognitive capabilities. The teleoperator system incorporates man's decision-making, adaptive intelligence without requiring his physical presence. The man and the machine work as a team, each contributing unique and significant capabilities, and each depending on the other to achieve a common goal. Some of the more significant requirements associated with the development of teleoperator systems technology for space, industry, and medicine are examined. Emphasis is placed on the requirement to more effectively use the man and the machine in any man-machine system.
Concurrent Image Processing Executive (CIPE)
NASA Technical Reports Server (NTRS)
Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.
1988-01-01
The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high-performance science analysis workstation, are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local-memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented.
[Artificial intelligence to assist clinical diagnosis in medicine].
Lugo-Reyes, Saúl Oswaldo; Maldonado-Colín, Guadalupe; Murata, Chiharu
2014-01-01
Medicine is one of the fields of knowledge that would most benefit from a closer interaction with computer science and mathematics, by optimizing complex, imperfect processes such as differential diagnosis; this is the domain of machine learning, a branch of artificial intelligence that builds and studies systems capable of learning from a set of training data in order to optimize classification and prediction processes. In Mexico during the last few years, progress has been made on the implementation of electronic clinical records, so that the National Institutes of Health have already accumulated a wealth of stored data. For those data to become knowledge, they need to be processed and analyzed through complex statistical methods, as is already being done in other countries, employing case-based reasoning, artificial neural networks, Bayesian classifiers, multivariate logistic regression, or support vector machines, among other methodologies, to assist the clinical diagnosis of acute appendicitis, breast cancer, and chronic liver disease, among a wide array of maladies. In this review we sift through concepts, antecedents, current examples, and methodologies of machine learning-assisted clinical diagnosis.
Fabrication in Space - What Materials are Needed?
NASA Technical Reports Server (NTRS)
Good, J
2007-01-01
In order to sustain life on the moon, and especially on Mars, the inhabitants must be self-sufficient. As on Earth, electronic and mechanical systems will break down and must be repaired. It is not realistic to "send" parts to the moon or Mars in an effort to replace failed ones or have spares for all components. It will be important to have spares on hand and even better would be to have the capability to fabricate parts in situ. The In Situ Fabrication and Repair (ISFR) team is working to develop the Arcam Electron Beam Melting (EBM) machine as the manufacturing process that will have the capability to produce repair parts, as well as new designs, and tooling on the lunar surface and eventually on Mars. What materials will be available for the inhabitants to use? What materials would be most useful? The EBM process is versatile and can handle a multitude of materials. These include titanium, stainless steels, aluminums, inconels, and copper alloys. Research has shown what parts have failed during past space missions and this data has been compiled and assessed. The EBM machine is fully capable of processing these materials of choice. Additionally, the long-term goal is to use the lunar regolith as a viable feedstock. Preliminary work has been performed to assess the feasibility of using raw lunar regolith as a material source or use a binder combined with the regolith to achieve a good melt.
The human role in space (THURIS) applications study. Final briefing
NASA Technical Reports Server (NTRS)
Maybee, George W.
1987-01-01
The THURIS (The Human Role in Space) application is an iterative process involving successive assessments of man/machine mixes in terms of performance, cost and technology to arrive at an optimum man/machine mode for the mission application. The process begins with user inputs which define the mission in terms of an event sequence and performance time requirements. The desired initial operational capability date is also an input requirement. THURIS terms and definitions (e.g., generic activities) are applied to the input data converting it into a form which can be analyzed using the THURIS cost model outputs. The cost model produces tabular and graphical outputs for determining the relative cost-effectiveness of a given man/machine mode and generic activity. A technology database is provided to enable assessment of support equipment availability for selected man/machine modes. If technology gaps exist for an application, the database contains information supportive of further investigation into the relevant technologies. The present study concentrated on testing and enhancing the THURIS cost model and subordinate data files and developing a technology database which interfaces directly with the user via technology readiness displays. This effort has resulted in a more powerful, easy-to-use applications system for optimization of man/machine roles. Volume 1 is an executive summary.
An evolutionary sensor approach for self-organizing production chains
NASA Astrophysics Data System (ADS)
Mocan, M.; Gillich, E. V.; Mituletu, I. C.; Korka, Z. I.
2018-01-01
Industry 4.0 is the current great step in industrial progress. The convergence of industrial equipment with the power of advanced computing and analysis, low-cost sensing, and new connecting technologies is presumed to bring unexpected advancements in automation, flexibility, and efficiency. In this context, sensors provide information regarding three essential areas: the number of processed elements, the quality of production, and the condition of tools and equipment. To obtain this valuable information, the data resulting from a sensor first has to be processed and afterward used by the different stakeholders. If machines are linked together, this information can be employed to organize the production chain with little or no human intervention. We describe here the implementation of a sensor in a milling machine that is part of a simple production chain, capable of providing information regarding the number of manufactured pieces. This information is used by the other machines in the production chain in order to define the type and number of pieces to be manufactured by them and/or to set optimal parameters for their working regime. Secondly, the information obtained by monitoring the dynamic behavior of the machine and the manufactured piece is used to evaluate product quality. This information is used to warn about the need for maintenance and is transmitted to the specialized department. It is also transmitted to the central unit in order to reorganize production by involving other machines or by reconsidering the manufacturing regime of the existing machines. Special attention is given to analyzing and classifying the signals acquired via an optical sensor from simulated processes.
NASA Astrophysics Data System (ADS)
Masterenko, Dmitry A.; Metel, Alexander S.
2018-03-01
The process capability indices Cp and Cpk are widely used in modern quality management as statistical measures of the ability of a process to produce output X within specification limits. The customer's requirement to ensure Cp ≥ 1.33 is often applied in contracts. Estimates of the capability indices may be calculated from estimates of the mean µ and the variability 6σ; for this, the quality characteristic should be measured on a sample of pieces, which in turn requires advanced measuring devices and well-qualified staff. On the other hand, quality inspection by attributes, carried out with limit gauges (go/no-go), is much simpler and has a higher performance, but it does not give the numerical values of the quality characteristic. The described method allows estimating the mean and the variability of the process on the basis of the results of limit-gauge inspection with a certain lower limit LCL and upper limit UCL, which separate the pieces into three groups: X < LCL (n1 pieces), LCL ≤ X < UCL (n2 pieces), and X ≥ UCL (n3 pieces). So-called Pittman-type estimates, developed by the author, are functions of n1, n2, and n3 and allow calculation of estimates of µ and σ. Thus, Cp and Cpk may also be estimated without precise measurements. The estimates can be used in quality inspection of lots of pieces as well as in monitoring and control of the manufacturing process. This is very important for improving the quality of articles in the machining industry through tolerance control.
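The Pittman-type estimators themselves are not reproduced in the abstract; as a rough illustration of the same idea, µ and σ can also be recovered from the three gauge counts by maximum likelihood under a normality assumption, after which Cp and Cpk follow. The limits and counts below are made up for demonstration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

LCL, UCL = 9.95, 10.05          # limit-gauge bounds (assumed values)
LSL, USL = 9.94, 10.06          # specification limits (assumed values)
n1, n2, n3 = 6, 188, 6          # counts below LCL, in between, and above UCL

def neg_log_lik(params):
    # Multinomial likelihood of the three counts under a Normal(mu, sigma) characteristic.
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    p1 = norm.cdf(LCL, mu, sigma)
    p3 = norm.sf(UCL, mu, sigma)
    p2 = 1.0 - p1 - p3
    eps = 1e-12
    return -(n1 * np.log(p1 + eps) + n2 * np.log(p2 + eps) + n3 * np.log(p3 + eps))

res = minimize(neg_log_lik, x0=[(LCL + UCL) / 2.0, np.log((UCL - LCL) / 4.0)],
               method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])

cp = (USL - LSL) / (6.0 * sigma_hat)
cpk = min(USL - mu_hat, mu_hat - LSL) / (3.0 * sigma_hat)
print(f"mu = {mu_hat:.4f}, sigma = {sigma_hat:.4f}, Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```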
30 CFR 56.14115 - Stationary grinding machines.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Stationary grinding machines. 56.14115 Section... Equipment Safety Devices and Maintenance Requirements § 56.14115 Stationary grinding machines. Stationary grinding machines, other than special bit grinders, shall be equipped with— (a) Peripheral hoods capable of...
30 CFR 56.14115 - Stationary grinding machines.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Stationary grinding machines. 56.14115 Section... Equipment Safety Devices and Maintenance Requirements § 56.14115 Stationary grinding machines. Stationary grinding machines, other than special bit grinders, shall be equipped with— (a) Peripheral hoods capable of...
NASA Astrophysics Data System (ADS)
Bui, Van-Hung; Gilles, Patrick; Cohen, Guillaume; Rubio, Walter
2018-05-01
The use of titanium alloys in the aeronautical and high-technology domains is widespread. High strength and low mass are two outstanding characteristics of titanium alloys which permit the production of parts for these domains. As with other hard materials, it is challenging to generate 3D surfaces (e.g. pockets) using conventional cutting methods. Abrasive Water Jet Machining (AWJM) technology is capable of cutting any kind of material and seems to be a good solution for such titanium materials, offering low specific force, low deformation of parts and low thermal shock. Applying this technology to generate 3D surfaces requires adopting a modelling approach. However, a general methodology results in complex models, owing to the many parameters of the machining process, and relies on numerous experiments. This study introduces an extended geometry model of an elementary pass when changing the firing angle during machining of Ti-6Al-4V titanium alloy with a given machine configuration. Several experiments are conducted to observe the influence of the major kinematic operating parameters, i.e. jet inclination angle (α) (perpendicular to the feed direction) and traverse speed (Vf). The material exposure time and the erosion capability of the abrasive particles are directly affected by variations of the traverse speed (Vf) and firing angle (α). These variations lead to different erosion rates along the kerf profile, characterized by the depth and width of cut. A comparison demonstrated the efficiency of the proposed model for the depth and width of elementary passes. Based on knowledge of the influence of both firing angle and traverse speed on the elementary pass shape, the proposed model allows the simulation of the AWJM process to be developed and paves the way for milling flat-bottom pockets and complex 3D shapes.
Kolb, Brian; Lentz, Levi C.; Kolpak, Alexie M.
2017-04-26
Modern ab initio methods have rapidly increased our understanding of solid state materials properties, chemical reactions, and the quantum interactions between atoms. However, poor scaling often renders direct ab initio calculations intractable for large or complex systems. There are two obvious avenues through which to remedy this problem: (i) develop new, less expensive methods to calculate system properties, or (ii) make existing methods faster. This paper describes an open source framework designed to pursue both of these avenues. PROPhet (short for PROPerty Prophet) utilizes machine learning techniques to find complex, non-linear mappings between sets of material or system properties. The result is a single code capable of learning analytical potentials, non-linear density functionals, and other structure-property or property-property relationships. These capabilities enable highly accurate mesoscopic simulations, facilitate computation of expensive properties, and enable the development of predictive models for systematic materials design and optimization. Here, this work explores the coupling of machine learning to ab initio methods through means both familiar (e.g., the creation of various potentials and energy functionals) and less familiar (e.g., the creation of density functionals for arbitrary properties), serving both to demonstrate PROPhet’s ability to create exciting post-processing analysis tools and to open the door to improving ab initio methods themselves with these powerful machine learning techniques.
Next Generation Loading System for Detonators and Primers
Designed, fabricated and installed next generation tooling to provide additional manufacturing capabilities for new detonators and other small...prototype munitions on automated, semi-automated and manual machines. Led design effort, procured and installed a primary explosive Drying Oven for a pilot...facility. Designed, fabricated and installed a Primary Explosives Waste Treatment System in a pilot environmental processing facility. Designed
Makinde, O A; Mpofu, K; Vrabic, R; Ramatsetse, B I
2017-01-01
The development of a robotic-driven maintenance solution capable of automatically maintaining the reconfigurable vibrating screen (RVS) machine when utilized in dangerous and hazardous underground mining environments has called for the design of a multifunctional robotic end-effector capable of carrying out all the maintenance tasks on the RVS machine. In view of this, the paper presents a bio-inspired approach which unfolds the design of a novel multifunctional robotic end-effector embedded with mechanical and control mechanisms capable of automatically maintaining the RVS machine. To achieve this, therblig and morphological methodologies (which classify the motions as well as the actions required by the robotic end-effector in carrying out RVS machine maintenance tasks), derived from a detailed analogy of how a human being (i.e., a machine maintenance manager) would carry out different maintenance tasks on the RVS machine, were used to obtain the maintenance objective functions or goals of the multifunctional robotic end-effector as well as the maintenance activity constraints of the RVS machine that must be adhered to by the multifunctional robotic end-effector during machine maintenance. The results of the therblig and morphological analyses of five (5) different maintenance tasks capture and classify one hundred and thirty-four (134) repetitive motions and fifty-four (54) functions required in automating the maintenance tasks of the RVS machine. Based on these findings, a worm-gear mechanism embedded with fingers extruded with hexagonal-shaped heads, capable of carrying out the "gripping and ungrasping" and "loosening and bolting" functions of the robotic end-effector, and an electric cylinder actuator module, capable of carrying out the "unpinning and hammering" functions of the robotic end-effector, were integrated together to produce the customized multifunctional robotic end-effector capable of automatically maintaining the RVS machine. The axial forces, normal forces and total load acting on the teeth of the worm-gear module of the multifunctional robotic end-effector during the gripping of worn-out or new RVS machine subsystems, which are 978.547, 1245.06 and 1016.406 N, respectively, were satisfactory. The nominal bending and torsional stresses acting on the shoulder of the socket module of the multifunctional robotic end-effector during the loosening and tightening of bolts, which are 1450.72 and 179.523 MPa, respectively, were satisfactory. The hammering and unpinning forces applied by the electric cylinder actuator module of the multifunctional robotic end-effector during the unpinning and hammering of screen panel pins out of and into the screen panels were satisfactory.
High productivity mould robotic milling in Al-5083
NASA Astrophysics Data System (ADS)
Urresti, Iker; Arrazola, Pedro Jose; Ørskov, Klaus Bonde; Pelegay, Jose Angel
2018-05-01
Until very recent years, industrial serial robots were usually limited to welding, handling or spray-painting operations. However, some industries have already realized their important capabilities in terms of flexibility, working space, adaptability and cost. Hence, they are currently being seriously considered for carrying out certain metal machining tasks. Robot-based machining is therefore presented as a cost-saving and flexible manufacturing alternative compared to conventional CNC machines, especially for roughing or even pre-roughing of large parts. Nevertheless, there are still some drawbacks, usually referred to as low rigidity, accuracy and repeatability. Thus, process productivity is usually sacrificed, yielding low Material Removal Rates (MRR) and consequently a lack of competitiveness. In this paper, different techniques to obtain increased productivity are presented, through an appropriate selection of cutting strategies and parameters, which is essential for it. During this research, rough milling tests in Al-5083 are presented in which High Feed Milling (HFM) is implemented as a productive cutting strategy and the experimental modal analysis known as tap-testing is used for the suitable choice of cutting conditions. Competitive productivity rates are achieved while process stability is checked through cutting force measurements in order to prove the effectiveness of the experimental modal analysis for robotic machining.
Man-machine interactive imaging and data processing using high-speed digital mass storage
NASA Technical Reports Server (NTRS)
Alsberg, H.; Nathan, R.
1975-01-01
The role of vision in teleoperation has been recognized as an important element in the man-machine control loop. In most applications of remote manipulation, direct vision cannot be used. To overcome this handicap, the human operator's control capabilities are augmented by a television system. This medium provides a practical and useful link between workspace and the control station from which the operator perform his tasks. Human performance deteriorates when the images are degraded as a result of instrumental and transmission limitations. Image enhancement is used to bring out selected qualities in a picture to increase the perception of the observer. A general purpose digital computer, an extensive special purpose software system is used to perform an almost unlimited repertoire of processing operations.
Analysis of acoustic emission during abrasive waterjet machining of sheet metals
NASA Astrophysics Data System (ADS)
Mokhtar, Nazrin; Gebremariam, MA; Zohari, H.; Azhari, Azmir
2018-04-01
The present paper reports on the analysis of acoustic emission (AE) produced during the abrasive waterjet (AWJ) machining process. The paper focuses on the relationship between AE and the surface quality of sheet metals. The changes in the acoustic emission signals, characterized by the power spectral density (PSD) estimated via the covariance method, are discussed in relation to the surface quality of the cut. Tests were made using two materials for comparison, namely aluminium 6061 and stainless steel 304, at five different feed rates. The acoustic emission data were captured in LabVIEW and later processed using MATLAB software. The results show that the AE spectra correlate with the different feed rates and surface qualities. It can be concluded that AE is capable of monitoring changes in feed rate and surface quality.
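The paper computes the PSD with the covariance method in MATLAB; the sketch below substitutes the closely related Yule-Walker (autocorrelation-method) autoregressive spectral estimate, applied to a synthetic AE-like signal, purely as an illustration of parametric PSD estimation. The signal, model order, and sampling rate are assumptions.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def ar_psd(x, order=8, nfft=1024, fs=1.0):
    """Autoregressive PSD estimate via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0..order].
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Solve the Toeplitz system R a = r[1:] for the AR coefficients.
    a = solve_toeplitz((r[:-1], r[:-1]), r[1:])
    sigma2 = r[0] - np.dot(a, r[1:])          # driving-noise variance
    freqs = np.linspace(0.0, fs / 2.0, nfft)
    z = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(1, order + 1)))
    denom = np.abs(1.0 - z @ a) ** 2
    return freqs, sigma2 / (fs * denom)

# Synthetic AE-like signal: a resonance at 0.1*fs buried in noise (illustrative only).
rng = np.random.default_rng(3)
t = np.arange(8192)
sig = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.normal(size=t.size)
freqs, psd = ar_psd(sig, order=12, fs=1.0)
print("peak frequency (normalized):", freqs[np.argmax(psd)])
```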
Understanding overlay signatures using machine learning on non-lithography context information
NASA Astrophysics Data System (ADS)
Overcast, Marshall; Mellegaard, Corey; Daniel, David; Habets, Boris; Erley, Georg; Guhlemann, Steffen; Thrun, Xaver; Buhl, Stefan; Tottewitz, Steven
2018-03-01
Overlay errors between two layers can be caused by non-lithography processes. While these errors can be compensated by the run-to-run system, such process and tool signatures are not always stable. In order to monitor the impact of non-lithography context on overlay at regular intervals, a systematic approach is needed. Using various machine learning techniques, significant context parameters that relate to deviating overlay signatures are automatically identified. Once the most influential context parameters are found, a run-to-run simulation is performed to see how much improvement can be obtained. The resulting analysis shows good potential for reducing the influence of hidden context parameters on overlay performance. Non-lithographic contexts are significant contributors, and their automatic detection and classification will enable the overlay roadmap, given the corresponding control capabilities.
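The specific machine learning techniques are not detailed in the abstract; one simple way to rank non-lithography context parameters by their influence on an overlay metric is a random-forest importance analysis, sketched below on a hypothetical context table. The tool names, values, and effects are invented for illustration and are not the authors' data or pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical per-lot context table: categorical tool IDs and a measured overlay error (nm).
rng = np.random.default_rng(4)
n = 500
df = pd.DataFrame({
    "etch_chamber": rng.choice(["A", "B", "C"], n),
    "cmp_tool": rng.choice(["P1", "P2"], n),
    "furnace_slot": rng.integers(1, 25, n),
})
# Synthetic target: etch chamber C and CMP tool P2 shift overlay (illustrative only).
overlay = (3.0 * (df["etch_chamber"] == "C") + 2.0 * (df["cmp_tool"] == "P2")
           + rng.normal(0.0, 1.0, n))

X = pd.get_dummies(df, columns=["etch_chamber", "cmp_tool"])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, overlay)

# Rank context parameters by their learned influence on the overlay metric.
ranking = sorted(zip(X.columns, model.feature_importances_), key=lambda p: -p[1])
for name, imp in ranking:
    print(f"{name:25s} {imp:.3f}")
```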
NASA Technical Reports Server (NTRS)
Nettles, A. T.; Tucker, D. S.; Patterson, W. J.; Franklin, S. W.; Gordon, G. H.; Hart, L.; Hodge, A. J.; Lance, D. G.; Russel, S. S.
1991-01-01
A test run was performed on IM6/3501-6 carbon-epoxy in which the material was processed, machined into specimens, and tested for damage tolerance capabilities. Nondestructive test data played a major role in this element of composite characterization. A time chart was produced showing the time the composite material spent within each Branch or Division in order to identify those areas which produce a long turnaround time. Instrumented drop weight testing was performed on the specimens, with nondestructive evaluation being performed before and after the impacts. Destructive testing in the form of cross-sectional photomicrography and compression-after-impact testing was also used. Results show that the processing and machining steps need to be performed more rapidly if data on composite materials are to be collected within a reasonable timeframe. The results of the damage tolerance testing showed that IM6/3501-6 is a brittle material that is very susceptible to impact damage.
Considerations on the construction of a Powder Bed Fusion platform for Additive Manufacturing
NASA Astrophysics Data System (ADS)
Andersen, Sebastian Aagaard; Nielsen, Karl-Emil; Pedersen, David Bue; Nielsen, Jakob Skov
As the demand for moulds and other tools becomes increasingly specific and complex, an additive manufacturing approach to production is making its way to the industry through laser based consolidation of metal powder particles by a method known as powder bed fusion. This paper concerns a variety of design choices facilitating the development of an experimental powder bed fusion machine tool, capable of manufacturing metal parts with strength matching that of conventionally manufactured parts and a complexity surpassing that of subtractive processes. To understand the different mechanisms acting within such an experimental machine tool, a fully open and customizable rig is constructed. Emphasizing modularity in the rig allows interchanging of lasers, scanner systems, optical elements, powder deposition, layer height, temperature, atmosphere, and powder type. Process control is achieved through a custom-made software platform, which extends into a graphical user interface that eases adjustment of process parameters and job file generation.
Characterization of the Temperature Capabilities of Advanced Disk Alloy ME3
NASA Technical Reports Server (NTRS)
Gabb, Timothy P.; Telesman, Jack; Kantzos, Peter T.; OConnor, Kenneth
2002-01-01
The successful development of an advanced powder metallurgy disk alloy, ME3, was initiated in the NASA High Speed Research/Enabling Propulsion Materials (HSR/EPM) Compressor/Turbine Disk program in cooperation with General Electric Engine Company and Pratt & Whitney Aircraft Engines. This alloy was designed using statistical screening and optimization of composition and processing variables to have extended durability at 1200 F in large disks. Disks of this alloy were produced at the conclusion of the program using a realistic scaled-up disk shape and processing to enable demonstration of these properties. The objective of the Ultra-Efficient Engine Technologies disk program was to assess the mechanical properties of these ME3 disks as functions of temperature in order to estimate the maximum temperature capabilities of this advanced alloy. These disks were sectioned, machined into specimens, and extensively tested. Additional sub-scale disks and blanks were processed and selectively tested to explore the effects of several processing variations on mechanical properties. Results indicate the baseline ME3 alloy and process can produce 1300 to 1350 F temperature capabilities, dependent on detailed disk and engine design property requirements.
Nano Mechanical Machining Using AFM Probe
NASA Astrophysics Data System (ADS)
Mostofa, Md. Golam
Complex miniaturized components with high form accuracy will play key roles in the future development of many products, as they provide portability, disposability, lower material consumption in production, low power consumption during operation, lower sample requirements for testing, and higher heat transfer due to their very high surface-to-volume ratio. Given the high market demand for such micro and nano featured components, different manufacturing methods have been developed for their fabrication. Some of the common technologies in micro/nano fabrication are photolithography, electron beam lithography, X-ray lithography and other semiconductor processing techniques. Although these methods are capable of fabricating micro/nano structures with a resolution of less than a few nanometers, some of the shortcomings associated with them, such as high production costs for customized products and limited material choices, necessitate the development of other fabrication techniques. Micro/nano mechanical machining, such as atomic force microscope (AFM) probe based nano fabrication, has therefore been used to overcome some of the major restrictions of the traditional processes. This technique removes material from the workpiece by engaging a micro/nano size cutting tool (i.e. an AFM probe) and is applicable to a wider range of materials compared to the photolithographic process. In spite of the unique benefits of nano mechanical machining, there are also some challenges with this technique as the scale is reduced, such as size effects, burr formation, chip adhesion, fragility of tools and tool wear. Moreover, AFM based machining does not have any rotational movement, which makes fabrication of 3D features more difficult. Thus, vibration-assisted machining is introduced into AFM probe based nano mechanical machining to overcome the limitations associated with the conventional AFM probe based scratching method. Vibration-assisted machining reduces the cutting forces and burr formation through intermittent cutting. Combining AFM probe based machining with vibration-assisted machining enhances nano mechanical machining processes by improving accuracy, productivity and surface finish. In this study, several scratching tests are performed with a single crystal diamond AFM probe to investigate the cutting characteristics and model the ploughing cutting forces. Calibration of the probe for lateral force measurements, which is essential, is also extended through the force balance method. Furthermore, a vibration-assisted machining system is developed and applied to fabricate different materials to overcome some of the limitations of AFM probe based single point nano mechanical machining. The novelty of this study includes the application of vibration-assisted AFM probe based nano scale machining to fabricate micro/nano scale features, calibration of an AFM by considering different factors, and the investigation of the nano scale material removal process from a different perspective.
Technologies for developing an advanced intelligent ATM with self-defence capabilities
NASA Astrophysics Data System (ADS)
Sako, Hiroshi
2010-01-01
We have developed several technologies for protecting automated teller machines. These technologies are based mainly on pattern recognition and are used to implement various self-defence functions. They include (i) banknote recognition and information retrieval for preventing machines from accepting counterfeit and damaged banknotes and for retrieving information about detected counterfeits from a relational database, (ii) form processing and character recognition for preventing machines from accepting remittance forms with missing due dates and/or insufficient payment, (iii) person identification to prevent machines from transacting with non-customers, and (iv) object recognition to guard machines against foreign objects such as spy cams that might be surreptitiously attached to them and to protect users against someone attempting to peek at their user information such as their personal identification number. The person identification technology has been implemented in most ATMs in Japan, and field tests have demonstrated that the banknote recognition technology can recognise more than 200 types of banknote from 30 different countries. We are developing an "advanced intelligent ATM" that incorporates all of these technologies.
Rapid performance modeling and parameter regression of geodynamic models
NASA Astrophysics Data System (ADS)
Brown, J.; Duplyakin, D.
2016-12-01
Geodynamic models run in a parallel environment have many parameters with complicated effects on performance and scientifically-relevant functionals. Manually choosing an efficient machine configuration and mapping out the parameter space requires a great deal of expert knowledge and time-consuming experiments. We propose an active learning technique based on Gaussian Process Regression to automatically select experiments to map out the performance landscape with respect to scientific and machine parameters. The resulting performance model is then used to select optimal experiments for improving the accuracy of a reduced order model per unit of computational cost. We present the framework and evaluate its quality and capability using popular lithospheric dynamics models.
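A minimal sketch of the kind of variance-driven active learning the abstract describes is shown below, using scikit-learn's Gaussian Process Regression. The candidate machine parameters (MPI ranks, block size) and the stand-in runtime function are hypothetical, not the authors' setup.

```python
# Sketch of active learning with Gaussian Process Regression: fit a GP to the
# experiments run so far, then query the configuration with the largest
# predictive uncertainty. Parameters and runtime model are hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def run_experiment(x):
    # Placeholder for actually running and timing the geodynamic model.
    ranks, block_size = x
    return 1.0 / ranks + 0.01 * abs(block_size - 64) + np.random.normal(0, 0.01)

# Candidate machine configurations: (MPI ranks, block size).
candidates = np.array([(r, b) for r in (8, 16, 32, 64) for b in range(16, 129, 16)],
                      dtype=float)

X = candidates[:3].tolist()          # a few seed experiments
y = [run_experiment(x) for x in X]

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(10):                  # active-learning loop
    gp.fit(np.array(X), np.array(y))
    mean, std = gp.predict(candidates, return_std=True)
    nxt = candidates[np.argmax(std)]  # query the most uncertain configuration
    X.append(nxt.tolist())
    y.append(run_experiment(nxt))

print("Best observed configuration:", X[int(np.argmin(y))])
```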
Laboratory process control using natural language commands from a personal computer
NASA Technical Reports Server (NTRS)
Will, Herbert A.; Mackin, Michael A.
1989-01-01
PC software is described which provides flexible natural language process control capability with an IBM PC or compatible machine. Hardware requirements include the PC, and suitable hardware interfaces to all controlled devices. Software required includes the Microsoft Disk Operating System (MS-DOS) operating system, a PC-based FORTRAN-77 compiler, and user-written device drivers. Instructions for use of the software are given as well as a description of an application of the system.
Machine learning for outcome prediction of acute ischemic stroke post intra-arterial therapy.
Asadi, Hamed; Dowling, Richard; Yan, Bernard; Mitchell, Peter
2014-01-01
Stroke is a major cause of death and disability. Accurately predicting stroke outcome from a set of predictive variables may identify high-risk patients and guide treatment approaches, leading to decreased morbidity. Logistic regression models allow for the identification and validation of predictive variables. However, advanced machine learning algorithms offer an alternative, in particular for large-scale multi-institutional data, with the advantage of easily incorporating newly available data to improve prediction performance. Our aim was to design and compare different machine learning methods capable of predicting the outcome of endovascular intervention in acute anterior circulation ischaemic stroke. We conducted a retrospective study of a prospectively collected database of acute ischaemic stroke treated by endovascular intervention. Using SPSS®, MATLAB®, and Rapidminer®, classical statistics as well as artificial neural network and support vector algorithms were applied to design a supervised machine capable of classifying these predictors into potential good and poor outcomes. These algorithms were trained, validated and tested using randomly divided data. We included 107 consecutive acute anterior circulation ischaemic stroke patients treated by endovascular technique. Sixty-six were male, and the mean age was 65.3 years. All the available demographic, procedural and clinical factors were included in the models. The final confusion matrix of the neural network demonstrated an overall congruency of ∼80% between the target and output classes, with favourable receiver operating characteristics. However, after optimisation, the support vector machine had a relatively better performance, with a root mean squared error of 2.064 (SD: ± 0.408). We showed promising accuracy of outcome prediction using supervised machine learning algorithms, with potential for incorporation of larger multicenter datasets, likely further improving prediction. Finally, we propose that a robust machine learning system can potentially optimise the selection process for endovascular versus medical treatment in the management of acute stroke.
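The study used SPSS, MATLAB and RapidMiner; purely as an illustration of the same train/validate/test workflow, a hedged scikit-learn analogue with synthetic stand-in features is sketched below. The features and labels are hypothetical and are not the authors' data or pipeline.

```python
# Illustrative sketch (not the authors' pipeline): train and evaluate a support
# vector classifier for good/poor outcome on synthetic stand-in data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(1)
n = 107                                   # cohort size quoted in the abstract
X = rng.normal(size=(n, 6))               # hypothetical demographic/procedural features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0).astype(int)  # good vs poor

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X_tr, y_tr)

print(confusion_matrix(y_te, clf.predict(X_te)))          # target vs output classes
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```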
Controlling corrosion rate of Magnesium alloy using powder mixed electrical discharge machining
NASA Astrophysics Data System (ADS)
Razak, M. A.; Rani, A. M. A.; Saad, N. M.; Littlefair, G.; Aliyu, A. A.
2018-04-01
Biomedical implants can be divided into permanent and temporary applications. The required service life of a temporary implant differs between children and adults because of their different bone healing rates. Magnesium and its alloys are suitable for biodegradable implant applications. Nevertheless, it is difficult to control the degradation rate of magnesium alloy to suit both children and adults. The powder mixed electrical discharge machining (PM-EDM) method, a modified EDM process, has a high capability to improve EDM process efficiency and machined surface quality. The objective of this paper is to establish a formula to control the degradation rate of magnesium alloy using the PM-EDM method. It is hypothesized that different corrosion rates of the machined surface can be obtained with different combinations of PM-EDM operation inputs. PM-EDM experiments are conducted using an open-loop PM-EDM system, and in-vitro corrosion tests are carried out on the machined surface of each specimen. Four operation inputs are investigated in this study: zinc powder concentration, peak current, pulse on-time and pulse off-time. The results indicate that zinc powder concentration significantly affects the response, with 2 g/l of zinc powder concentration giving the lowest corrosion rate. The high localized temperature at the cutting zone in the spark erosion process causes some of the zinc particles to become deposited on the machined surface, hence improving the surface characteristics. The suspended zinc particles in the dielectric fluid also improve the sparking efficiency and the uniformity of spark distribution. From the statistical analysis, a formula was developed to control the corrosion rate of magnesium alloy within the range from 0.000183 mm/year to 0.001528 mm/year.
Oweiss, Karim G
2006-07-01
This paper suggests a new approach for data compression during extracutaneous transmission of neural signals recorded by a high-density microelectrode array in the cortex. The approach is based on exploiting the temporal and spatial characteristics of the neural recordings in order to strip the redundancy and infer the useful information early in the data stream. The proposed signal processing algorithms augment current filtering and amplification capability and may be a viable replacement for on-chip spike detection and sorting currently employed to remedy the bandwidth limitations. Temporal processing is devised by exploiting the sparseness capabilities of the discrete wavelet transform, while spatial processing exploits the reduction in the number of physical channels through quasi-periodic eigendecomposition of the data covariance matrix. Our results demonstrate that substantial improvements are obtained in terms of lower transmission bandwidth, reduced latency and optimized processor utilization. We also demonstrate the improvements qualitatively in terms of superior denoising capabilities and higher fidelity of the obtained signals.
49 CFR 214.507 - Required safety equipment for new on-track roadway maintenance machines.
Code of Federal Regulations, 2010 CFR
2010-10-01
... windshield wipers are incompatible with the windshield material; (5) A machine braking system capable of... Roadway Maintenance Machines and Hi-Rail Vehicles § 214.507 Required safety equipment for new on-track...
Improving the reliability of inverter-based welding machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiedermayer, M.
1997-02-01
Although inverter-based welding power sources have been available since the late 1980s, many people hesitated to purchase them because of reliability issues. Unfortunately, their hesitancy had a basis, until now. Recent improvements give some inverters a reliability level that approaches that of traditional, transformer-based industrial welding machines, which have a failure rate of about 1%. Acceptance of inverter-based welding machines is important because, for many welding applications, they provide capabilities that solid-state, transformer-based machines cannot deliver. These advantages include enhanced pulsed gas metal arc welding (GMAW-P), lightweight portability, an ultrastable arc, and energy efficiency--all while producing highly aesthetic weld beads and delivering multiprocess capabilities.
Biomimetic machine vision system.
Harman, William M; Barrett, Steven F; Wright, Cameron H G; Wilcox, Michael
2005-01-01
Real-time application of digital imaging for use in machine vision systems has proven to be prohibitive when used within control systems that employ low-power single processors without compromising the scope of vision or resolution of captured images. Development of a real-time analog machine vision system is the focus of research taking place at the University of Wyoming. This new vision system is based upon the biological vision system of the common house fly. Development of a single sensor is accomplished, representing a single facet of the fly's eye. This new sensor is then incorporated into an array of sensors capable of detecting objects and tracking motion in 2-D space. This system "preprocesses" incoming image data, resulting in minimal data processing to determine the location of a target object. Due to the nature of the sensors in the array, hyperacuity is achieved, thereby eliminating resolution issues found in digital vision systems. In this paper, we will discuss the biological traits of the fly eye and the specific traits that led to the development of this machine vision system. We will also discuss the process of developing an analog based sensor that mimics the characteristics of interest in the biological vision system. This paper will conclude with a discussion of how an array of these sensors can be applied toward solving real-world machine vision issues.
Mamdani-Fuzzy Modeling Approach for Quality Prediction of Non-Linear Laser Lathing Process
NASA Astrophysics Data System (ADS)
Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.
2018-03-01
Lathing is a process for fashioning stock materials into desired cylindrical shapes, usually performed on a traditional lathe machine. However, recent rapid advancements in engineering materials and precision demands pose a great challenge to the traditional method. The main drawback of the conventional lathe is its mechanical contact, which leads to undesirable tool wear, a heat affected zone, and poor finishing and dimensional accuracy, especially taper quality, in machining of stock with a high length-to-diameter ratio. Therefore, a novel approach has been devised to transform a 2D flatbed CO2 laser cutting machine into a 3D laser lathing capability as an alternative solution. Three significant design parameters were selected for this experiment, namely cutting speed, spinning speed, and depth of cut. A total of 24 experiments were performed, with eight (8) sequential runs replicated three (3) times. The experimental results were then used to establish a Mamdani-Fuzzy predictive model, which yields an accuracy of more than 95%. Thus, the proposed Mamdani-Fuzzy modelling approach is found to be very suitable and practical for quality prediction of the non-linear laser lathing process for cylindrical stocks of 10mm diameter.
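To make the Mamdani inference step concrete, the sketch below shows a bare-bones one-input, one-output Mamdani model with triangular membership functions, min implication and centroid defuzzification. All membership parameters and rules are hypothetical; the authors' model uses the three design parameters above and rules fitted to their 24 experiments.

```python
# Minimal Mamdani fuzzy inference: triangular membership functions, min
# implication, max aggregation, centroid defuzzification. Membership
# breakpoints below are hypothetical illustrations only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on universe x."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

quality = np.linspace(0, 10, 501)                 # output universe of discourse
poor_q = tri(quality, 0, 2, 5)                    # output fuzzy sets
good_q = tri(quality, 4, 8, 10)

def predict_quality(cutting_speed):
    # Input fuzzy sets (hypothetical breakpoints, mm/s).
    slow = tri(np.array([cutting_speed]), 0, 5, 15)[0]
    fast = tri(np.array([cutting_speed]), 10, 25, 40)[0]
    # Rules: IF slow THEN good quality; IF fast THEN poor quality.
    aggregated = np.maximum(np.minimum(slow, good_q),
                            np.minimum(fast, poor_q))
    # Centroid defuzzification.
    return np.sum(aggregated * quality) / (np.sum(aggregated) + 1e-12)

print(predict_quality(8.0))    # slower cut -> higher predicted quality
print(predict_quality(30.0))
```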
Geological applications of machine learning on hyperspectral remote sensing data
NASA Astrophysics Data System (ADS)
Tse, C. H.; Li, Yi-liang; Lam, Edmund Y.
2015-02-01
The CRISM imaging spectrometer orbiting Mars has been producing a vast amount of data in the visible to infrared wavelengths in the form of hyperspectral data cubes. These data, compared with those obtained from previous remote sensing techniques, yield an unprecedented level of detailed spectral resolution in addition to an ever increasing level of spatial information. A major challenge brought about by the data is the burden of processing and interpreting these datasets and extracting the relevant information from them. This research aims at approaching the challenge by exploring machine learning methods, especially unsupervised learning, to achieve cluster density estimation and classification, and ultimately devising an efficient means leading to identification of minerals. A set of software tools has been constructed in Python to access and experiment with CRISM hyperspectral cubes selected from two specific Mars locations. A machine learning pipeline is proposed and unsupervised learning methods were applied to pre-processed datasets. The resulting data clusters are compared with the published ASTER spectral library and browse data products from the Planetary Data System (PDS). The result demonstrated that this approach is capable of processing the huge amount of hyperspectral data and potentially providing guidance to scientists for more detailed studies.
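A minimal sketch of the unsupervised step in such a pipeline is shown below: each pixel spectrum of a cube is treated as a feature vector and clustered, and the cluster centers can then be compared against a spectral library. The cube here is synthetic; a real workflow would load and pre-process a CRISM cube first.

```python
# Minimal sketch: unsupervised clustering of pixel spectra from a hyperspectral
# cube, in the spirit of the pipeline described above. Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rows, cols, bands = 64, 64, 120
cube = np.random.rand(rows, cols, bands).astype(np.float32)  # stand-in cube

pixels = cube.reshape(-1, bands)                 # (n_pixels, n_bands) spectra
kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(pixels)

cluster_map = kmeans.labels_.reshape(rows, cols)  # per-pixel cluster labels
mean_spectra = kmeans.cluster_centers_            # candidate endmember spectra
print(cluster_map.shape, mean_spectra.shape)
# mean_spectra could then be compared against a spectral library (e.g. ASTER).
```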
Additive Manufacturing Design Considerations for Liquid Engine Components
NASA Technical Reports Server (NTRS)
Whitten, Dave; Hissam, Andy; Baker, Kevin; Rice, Darron
2014-01-01
The Marshall Space Flight Center's Propulsion Systems Department has gained significant experience in the last year designing, building, and testing liquid engine components using additive manufacturing. The department has developed valve, duct, turbo-machinery, and combustion device components using this technology. Many valuable lessons were learned during this process. These lessons will be the focus of this presentation. We will present criteria for selecting part candidates for additive manufacturing. Some part characteristics are 'tailor made' for this process. Selecting the right parts for the process is the first step to maximizing productivity gains. We will also present specific lessons we learned about feature geometry that can and cannot be produced using additive manufacturing machines. Most liquid engine components were made using a two-step process. The base part was made using additive manufacturing and then traditional machining processes were used to produce the final part. The presentation will describe design accommodations needed to make the base part and lessons we learned about which features could be built directly and which require the final machine process. Tolerance capabilities, surface finish, and material thickness allowances will also be covered. Additive Manufacturing can produce internal passages that cannot be made using traditional approaches. It can also eliminate a significant amount of manpower by reducing part count and leveraging model-based design and analysis techniques. Information will be shared about performance enhancements and design efficiencies we experienced for certain categories of engine parts.
Maximum-Likelihood Estimation With a Contracting-Grid Search Algorithm
Hesterman, Jacob Y.; Caucci, Luca; Kupinski, Matthew A.; Barrett, Harrison H.; Furenlid, Lars R.
2010-01-01
A fast search algorithm capable of operating in multi-dimensional spaces is introduced. As a sample application, we demonstrate its utility in the 2D and 3D maximum-likelihood position-estimation problem that arises in the processing of PMT signals to derive interaction locations in compact gamma cameras. We demonstrate that the algorithm can be parallelized in pipelines, and thereby efficiently implemented in specialized hardware, such as field-programmable gate arrays (FPGAs). A 2D implementation of the algorithm is achieved on Cell/BE processors, resulting in processing speeds above one million events per second, which is a 20× increase in speed over a conventional desktop machine. Graphics processing units (GPUs) are used for a 3D application of the algorithm, resulting in processing speeds of nearly 250,000 events per second, which is a 250× increase in speed over a conventional desktop machine. These implementations indicate the viability of the algorithm for use in real-time imaging applications. PMID:20824155
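A common formulation of a contracting-grid search evaluates the objective on a coarse grid, recenters on the best point, shrinks the grid, and repeats. The sketch below illustrates that idea in 2D with a stand-in Gaussian-spot log-likelihood; it is an assumed, simplified illustration rather than the authors' hardware implementation.

```python
# Sketch of a 2D contracting-grid search: evaluate the objective on a small
# grid, recentre on the best point, shrink, and repeat. The Gaussian-spot
# log-likelihood is a stand-in for the PMT-signal likelihood.
import numpy as np

def log_likelihood(xy, true_xy=np.array([1.7, -0.4]), sigma=0.5):
    return -np.sum((xy - true_xy) ** 2) / (2 * sigma ** 2)

def contracting_grid_search(objective, centre, half_width, grid=5, iters=8, shrink=0.5):
    centre = np.asarray(centre, dtype=float)
    for _ in range(iters):
        offsets = np.linspace(-half_width, half_width, grid)
        # Evaluate the objective over the grid x grid candidate points.
        candidates = np.array([[centre[0] + dx, centre[1] + dy]
                               for dx in offsets for dy in offsets])
        values = np.array([objective(c) for c in candidates])
        centre = candidates[np.argmax(values)]   # recentre on the best point
        half_width *= shrink                     # contract the grid
    return centre

print(contracting_grid_search(log_likelihood, centre=[0.0, 0.0], half_width=4.0))
```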
Performance Optimization Control of ECH using Fuzzy Inference Application
NASA Astrophysics Data System (ADS)
Dubey, Abhay Kumar
Electro-chemical honing (ECH) is a hybrid electrolytic precision micro-finishing technology that, by combining the physico-chemical actions of electro-chemical machining and conventional honing processes, provides controlled functional surface generation and fast material removal capabilities in a single operation. Process multi-performance optimization has become vital for utilizing the full potential of manufacturing processes to meet the challenging requirements being placed on the surface quality, size, tolerances and production rate of engineering components in this globally competitive scenario. This paper presents a strategy that integrates Taguchi matrix experimental design, analysis of variance and a fuzzy inference system (FIS) to formulate a robust practical multi-performance optimization methodology for complex manufacturing processes like ECH, which involve several control variables. Two methodologies, one using genetic algorithm tuning of the FIS (GA-tuned FIS) and another using an adaptive network based fuzzy inference system (ANFIS), have been evaluated for a multi-performance optimization case study of ECH. The actual experimental results confirm their potential for a wide range of machining conditions employed in ECH.
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
Automatic welding systems for large ship hulls
NASA Astrophysics Data System (ADS)
Arregi, B.; Granados, S.; Hascoet, JY.; Hamilton, K.; Alonso, M.; Ares, E.
2012-04-01
Welding processes represent about 40% of the total production time in shipbuilding. Although most of the indoor welding work is automated, outdoor operations still require the involvement of numerous operators. Automating hull welding operations is a priority in large shipyards. The objective of the present work is to develop a comprehensive welding system capable of working with several welding layers in an automated way. There are several difficulties in automating seam tracking for the welding process. The proposed solution is the development of a welding machine capable of moving autonomously along the welding seam, controlling both the position of the torch and the welding parameters to adjust the thickness of the weld bead to the actual gap between the hull plates.
Machine learning for the New York City power grid.
Rudin, Cynthia; Waltz, David; Anderson, Roger N; Boulanger, Albert; Salleb-Aouissi, Ansaf; Chow, Maggie; Dutta, Haimonti; Gross, Philip N; Huang, Bert; Ierome, Steve; Isaac, Delfina F; Kressner, Arthur; Passonneau, Rebecca J; Radeva, Axinia; Wu, Leon
2012-02-01
Power companies can benefit from the use of knowledge discovery methods and statistical machine learning for preventive maintenance. We introduce a general process for transforming historical electrical grid data into models that aim to predict the risk of failures for components and systems. These models can be used directly by power companies to assist with prioritization of maintenance and repair work. Specialized versions of this process are used to produce 1) feeder failure rankings, 2) cable, joint, terminator, and transformer rankings, 3) feeder Mean Time Between Failure (MTBF) estimates, and 4) manhole events vulnerability rankings. The process in its most general form can handle diverse, noisy sources that are historical (static), semi-real-time, or real-time, incorporates state-of-the-art machine learning algorithms for prioritization (supervised ranking or MTBF), and includes an evaluation of results via cross-validation and blind test. Above and beyond the ranked lists and MTBF estimates are business management interfaces that allow the prediction capability to be integrated directly into corporate planning and decision support; such interfaces rely on several important properties of our general modeling approach: that machine learning features are meaningful to domain experts, that the processing of data is transparent, and that prediction results are accurate enough to support sound decision making. We discuss the challenges in working with historical electrical grid data that were not designed for predictive purposes. The “rawness” of these data contrasts with the accuracy of the statistical models that can be obtained from the process; these models are sufficiently accurate to assist in maintaining New York City’s electrical grid.
A VHDL Core for Intrinsic Evolution of Discrete Time Filters with Signal Feedback
NASA Technical Reports Server (NTRS)
Gwaltney, David A.; Dutton, Kenneth
2005-01-01
The design of an Evolvable Machine VHDL Core is presented, representing a discrete-time processing structure capable of supporting control system applications. This VHDL Core is implemented in an FPGA and is interfaced with an evolutionary algorithm implemented in firmware on a Digital Signal Processor (DSP) to create an evolvable system platform. The salient features of this architecture are presented. The capability to implement IIR filter structures is presented along with the results of the intrinsic evolution of a filter. The robustness of the evolved filter design is tested and its unique characteristics are described.
Simulation on turning aspheric surface method via oscillating feed
NASA Astrophysics Data System (ADS)
Kong, Fanxing; Li, Zengqiang; Sun, Tao
2014-08-01
It is quite difficult to manufacture optical components that combine a high gradient ellipsoid and a hyperboloid while meeting high machined-surface requirements. To solve this problem, we present a turning and forming method based on the oscillating feed of an R-θ layout lathe and analyze the machining of the ellipsoid segment and the hyperboloid segment separately through oscillating feed. The parameters of each trajectory during processing are calculated, yielding displacement, velocity, acceleration and other quantities. The simulation results show that this rotary turning method is capable of ensuring that the cutter stays on the equidistant line of the meridian cross-section curve of the workpiece while machining a high gradient aspheric surface, which helps obtain a high quality surface. The method also provides a new approach and a theoretical basis for manufacturing high quality aspheric surfaces and for extending the functionality of the available twin-spindle lathe.
Freeform Optics: current challenges for future serial production
NASA Astrophysics Data System (ADS)
Schindler, C.; Köhler, T.; Roth, E.
2017-10-01
One of the major recent developments in the optics industry is the commercial manufacturing of freeform surfaces for mid- and high-performance optical systems. Removing the limitation of rotational symmetry enables completely new optical design solutions, but it also creates completely new challenges for the manufacturer. Adapting serial production from rotationally symmetric to freeform optics cannot be done just by extending machine capabilities and software for every process step. New solutions for conventional optics production, or completely new process chains, are necessary.
Strain-free polished channel-cut crystal monochromators: a new approach and results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasman, Elina; Montgomery, Jonathan; Huang, XianRong
The use of channel-cut crystal monochromators has been traditionally limited to applications that can tolerate the rough surface quality from wet etching without polishing. We have previously presented and discussed the motivation for producing channel cut crystals with strain-free polished surfaces [1]. Afterwards, we have undertaken an effort to design and implement an automated machine for polishing channel-cut crystals. The initial effort led to inefficient results. Since then, we conceptualized, designed, and implemented a new version of the channel-cut polishing machine, now called C-CHiRP (Channel-Cut High Resolution Polisher), also known as CCPM V2.0. The new machine design no longer utilizes Figure-8 motion that mimics manual polishing. Instead, the polishing is achieved by a combination of rotary and linear functions of two coordinated motion systems. Here we present the new design of C-CHiRP, its capabilities and features. Multiple channel-cut crystals polished using the C-CHiRP have been deployed into several beamlines at the Advanced Photon Source (APS). We present the measurements of surface finish, flatness, as well as topography results obtained at 1-BM of APS, as compared with results typically achieved when polishing flat-surface monochromator crystals using conventional polishing processes. Limitations of the current machine design, capabilities and considerations for strain-free polishing of highly complex crystals are also discussed, together with an outlook for future developments and improvements.
DoD Autonomy Roadmap: Autonomy Community of Interest
2015-03-24
Initiative 27 Exploiting Priming Effects Team (Navy) Develop machine perception relatable to the manner in which a human perceives the ...and trust among the team members; understanding of each member’s tasks, intentions, capabilities, and progress; and ensuring effective and timely...learning capabilities to greatly reduce the need for human interventions, while enabling effective teaming with the warfighter Machine Perception
High resolution image processing on low-cost microcomputers
NASA Technical Reports Server (NTRS)
Miller, R. L.
1993-01-01
Recent advances in microcomputer technology have resulted in systems that rival the speed, storage, and display capabilities of traditionally larger machines. Low-cost microcomputers can provide a powerful environment for image processing. A new software program which offers sophisticated image display and analysis on IBM-based systems is presented. Designed specifically for a microcomputer, this program provides a wide-range of functions normally found only on dedicated graphics systems, and therefore can provide most students, universities and research groups with an affordable computer platform for processing digital images. The processing of AVHRR images within this environment is presented as an example.
Development and fabrication of a solar cell junction processing system
NASA Technical Reports Server (NTRS)
1984-01-01
A processing system capable of producing solar cell junctions by ion implantation followed by pulsed electron beam annealing was developed and constructed. The machine was to be capable of processing 4-inch diameter single-crystal wafers at a rate of 10^7 wafers per year. A microcomputer-controlled pulsed electron beam annealer with a vacuum interlocked wafer transport system was designed, built and demonstrated to produce solar cell junctions on 4-inch wafers with an AM1 efficiency of 12%. Experiments showed that a non-mass-analyzed (NMA) ion beam could implant 10 keV phosphorus dopant to form solar cell junctions which were equivalent to mass-analyzed implants. An NMA ion implanter, compatible with the pulsed electron beam annealer and wafer transport system, was designed in detail but was not built because of program termination.
Thermal-mechanical modeling of laser ablation hybrid machining
NASA Astrophysics Data System (ADS)
Matin, Mohammad Kaiser
2001-08-01
Hard, brittle and wear-resistant materials like ceramics pose a problem when being machined using conventional machining processes. Machining ceramics even with a diamond cutting tool is very difficult and costly. Near net-shape processes, like laser evaporation, produce micro-cracks that require extra finishing. Thus it is anticipated that ceramic machining will have to continue to be explored with newly developed techniques before ceramic materials become commonplace. This numerical investigation results from simulations of the thermal and mechanical modeling of simultaneous material removal from hard-to-machine materials using both laser ablation and conventional tool cutting, utilizing the finite element method. The model is formulated using a two-dimensional, planar, computational domain. The process simulation, acronymed LAHM (Laser Ablation Hybrid Machining), uses laser energy for two purposes. The first purpose is to remove the material by ablation. The second purpose is to heat the unremoved material that lies below the ablated material in order to "soften" it. The softened material is then simultaneously removed by conventional machining processes. The complete solution determines the temperature distribution and stress contours within the material and tracks the moving boundary that occurs due to material ablation. The temperature distribution is used to determine the distance below the phase change surface where sufficient "softening" has occurred, so that a cutting tool may be used to remove additional material. The model incorporated for tracking the ablative surface does not assume an isothermal melt phase (e.g. Stefan problem) for laser ablation. Both surface absorption and volume absorption of laser energy as functions of depth have been considered in the models. LAHM, from the thermal and mechanical point of view, is a complex machining process involving large deformations at high strain rates, thermal effects of the laser, removal of material and contact between workpiece and tool. The theoretical formulation associated with LAHM for solving the thermal-mechanical problem using the finite element method is presented. The thermal formulation is incorporated in user defined subroutines called by ABAQUS/Standard. The mechanical portion is modeled using ABAQUS/Explicit's general capabilities for modeling interactions involving contact and separation. The results obtained from the FEA simulations showed that the cutting force decreases considerably in both the LAHM surface absorption (LAHM-SA) and LAHM volume absorption (LAHM-VA) models relative to the LAM model. It was observed that the HAZ can be expanded or narrowed depending on the laser speed and power. The cutting force is minimal at the last extent of the HAZ. In both models the laser ablates material, thus reducing material stiffness as well as relaxing the thermal stress. The stress values obtained showed compressive yield stress just below the ablated surface and chip. Failure occurs by conventional cutting where the tensile stress exceeds the tensile strength of the material at that temperature. In this hybrid machining process the advantages of both individual machining processes were realized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-06-01
Following a planning period during which the Lawrence Livermore Laboratory and the Department of Defense managing sponsor, the USAF Materials Laboratory, agreed on work statements, the Department of Defense Tri-Service Precision Machine-Tool Program began in February 1978. Milestones scheduled for the first quarter have been met. Tasks and manpower requirements for two basic projects, precision-machining commercialization (PMC) and a machine-tool task force (MTTF), were defined. Progress by PMC includes: (1) documentation of existing precision machine-tool technology by initiation and compilation of a bibliography containing several hundred entries; (2) identification of the problems and needs of precision turning-machine builders and of precision turning-machine users interested in developing high-precision machining capability; and (3) organization of the schedule and content of the first seminar, to be held in October 1978, which will bring together representatives from the machine-tool and optics communities to address the problems and begin the process of high-precision machining commercialization. Progress by MTTF includes: (1) planning for the organization of a team effort of approximately 60 to 80 international experts to contribute in various ways to project objectives, namely, to summarize state-of-the-art cutting-machine-tool technology and to identify areas where future R and D should prove technically and economically profitable; (2) preparation of a comprehensive plan to achieve those objectives; and (3) preliminary arrangements for a plenary session, also in October, when the task force will meet to formalize the details for implementing the plan.
NASA Astrophysics Data System (ADS)
Haag, Sebastian; Bernhardt, Henning; Rübenach, Olaf; Haverkamp, Tobias; Müller, Tobias; Zontar, Daniel; Brecher, Christian
2015-02-01
In many applications for high-power diode lasers, the production of beam-shaping and homogenizing optical systems experiences rising volumes and dynamic market demands. The automation of assembly processes on flexible and reconfigurable machines can contribute to a more responsive and scalable production. The paper presents a flexible mounting device designed for the challenging assembly of side-tab based optical systems. It provides design elements for precisely referencing and fixating two optical elements in a well-defined geometric relation. Side tabs are presented to the machine, allowing the application of glue, and a rotating mechanism allows their attachment to the optical elements. The device can be adjusted to fit different form factors, and it can be used in high-volume assembly machines. The paper shows the utilization of the device for a collimation module consisting of a fast-axis and a slow-axis collimation lens. Results regarding the repeatability and process capability of bonding side-tab assemblies, as well as estimates from 3D simulation for overall performance indicators such as cycle time and throughput, are discussed.
High Temperature Thermoplastic Additive Manufacturing Using Low-Cost, Open-Source Hardware
NASA Technical Reports Server (NTRS)
Gardner, John M.; Stelter, Christopher J.; Yashin, Edward A.; Siochi, Emilie J.
2016-01-01
Additive manufacturing (or 3D printing) via Fused Filament Fabrication (FFF), also known as Fused Deposition Modeling (FDM), is a process where material is placed in specific locations layer-by-layer to create a complete part. Printers designed for FFF build parts by extruding a thermoplastic filament from a nozzle in a predetermined path. Originally developed for commercial printers, 3D printing via FFF has become accessible to a much larger community of users since the introduction of Reprap printers. These low-cost, desktop machines are typically used to print prototype parts or novelty items. As the adoption of desktop sized 3D printers broadens, there is increased demand for these machines to produce functional parts that can withstand harsher conditions such as high temperature and mechanical loads. Materials meeting these requirements tend to possess better mechanical properties and higher glass transition temperatures (Tg), thus requiring printers with high temperature printing capability. This report outlines the problems and solutions, and includes a detailed description of the machine design, printing parameters, and processes specific to high temperature thermoplastic 3D printing.
Defect detection and classification of machined surfaces under multiple illuminant directions
NASA Astrophysics Data System (ADS)
Liao, Yi; Weng, Xin; Swonger, C. W.; Ni, Jun
2010-08-01
Continuous improvement of product quality is crucial to the successful and competitive automotive manufacturing industry in the 21st century. The presence of surface porosity located on flat machined surfaces such as cylinder heads/blocks and transmission cases may allow leaks of coolant, oil, or combustion gas between critical mating surfaces, thus causing damage to the engine or transmission. Therefore 100% inline inspection plays an important role in improving product quality. Although image processing and machine vision techniques have been applied to machined surface inspection and much improved over the past 20 years, in today's automotive industry surface porosity inspection is still done by skilled humans, which is costly, tedious, time consuming and not capable of reliably detecting small defects. In our study, an automated defect detection and classification system for flat machined surfaces has been designed and constructed. In this paper, the importance of the illuminant direction in a machine vision system was first emphasized, and then the surface defect inspection system under multiple directional illuminations was designed and constructed. After that, image processing algorithms were developed to detect and classify five types of 2D or 3D surface defects (pore, 2D blemish, residue dirt, scratch, and gouge). The steps of image processing include: (1) image acquisition and contrast enhancement; (2) defect segmentation and feature extraction; (3) defect classification. An artificial machined surface and an actual automotive part (a cylinder head surface) were tested and, as a result, microscopic surface defects can be accurately detected and assigned to a surface defect class. The cycle time of this system can be sufficiently fast that implementation of 100% inline inspection is feasible. The field of view of this system is 150 mm × 225 mm, and surfaces larger than the field of view can be stitched together in software.
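As an illustration of the three image-processing steps listed above, the sketch below runs contrast enhancement, segmentation and per-blob feature extraction with OpenCV (4.x API assumed) on a synthetic grayscale image; the thresholds, feature set and the final classifier are hypothetical and not the authors' tuned pipeline.

```python
# Illustrative sketch of steps (1)-(3): contrast enhancement, segmentation and
# feature extraction, then classification. Thresholds, features and the image
# are hypothetical stand-ins; assumes OpenCV 4.x.
import cv2
import numpy as np

def extract_defect_features(gray):
    # (1) Contrast enhancement.
    enhanced = cv2.equalizeHist(gray)
    # (2) Segment dark candidate defects and extract per-blob features.
    _, mask = cv2.threshold(enhanced, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    features = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < 5:                                  # ignore tiny noise blobs
            continue
        perimeter = cv2.arcLength(c, True)
        (_, _), (w, h), _ = cv2.minAreaRect(c)
        aspect = max(w, h) / (min(w, h) + 1e-6)
        circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-6)
        features.append([area, aspect, circularity])
    return np.array(features)

# Synthetic stand-in for a machined surface image with one pore-like defect.
gray = np.full((200, 200), 200, dtype=np.uint8)
cv2.circle(gray, (100, 100), 6, 30, -1)

feats = extract_defect_features(gray)
print(feats)
# (3) Classification: feats would be passed to a classifier trained on labelled
# pores, blemishes, dirt, scratches and gouges (e.g. an sklearn SVC).
```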
Aono, Masashi; Kim, Song-Ju; Hara, Masahiko; Munakata, Toshinori
2014-03-01
The true slime mold Physarum polycephalum, a single-celled amoeboid organism, is capable of efficiently allocating a constant amount of intracellular resource to its pseudopod-like branches that best fit the environment where dynamic light stimuli are applied. Inspired by the resource allocation process, the authors formulated a concurrent search algorithm, called the Tug-of-War (TOW) model, for maximizing the profit in the multi-armed Bandit Problem (BP). A player (gambler) of the BP should decide as quickly and accurately as possible which slot machine to invest in out of the N machines and faces an "exploration-exploitation dilemma." The dilemma is a trade-off between the speed and accuracy of the decision making that are conflicted objectives. The TOW model maintains a constant intracellular resource volume while collecting environmental information by concurrently expanding and shrinking its branches. The conservation law entails a nonlocal correlation among the branches, i.e., volume increment in one branch is immediately compensated by volume decrement(s) in the other branch(es). Owing to this nonlocal correlation, the TOW model can efficiently manage the dilemma. In this study, we extend the TOW model to apply it to a stretched variant of BP, the Extended Bandit Problem (EBP), which is a problem of selecting the best M-tuple of the N machines. We demonstrate that the extended TOW model exhibits better performances for 2-tuple-3-machine and 2-tuple-4-machine instances of EBP compared with the extended versions of well-known algorithms for BP, the ϵ-Greedy and SoftMax algorithms, particularly in terms of its short-term decision-making capability that is essential for the survival of the amoeba in a hostile environment. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
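The Tug-of-War dynamics themselves are not reproduced here; as a point of reference for the Extended Bandit Problem described above, the sketch below shows a minimal extended epsilon-greedy baseline (one of the comparison algorithms named in the abstract) that selects an M-tuple of the N machines at each step. The Bernoulli reward probabilities are hypothetical, not the authors' instances.

```python
# Minimal extended epsilon-greedy baseline for the Extended Bandit Problem:
# with probability eps explore a random M-tuple, otherwise exploit the M arms
# with the highest empirical means. Arm probabilities are illustrative only.
import numpy as np

def extended_epsilon_greedy(probs, M=2, eps=0.1, steps=5000, seed=0):
    rng = np.random.default_rng(seed)
    N = len(probs)
    counts = np.zeros(N)
    sums = np.zeros(N)
    total = 0.0
    for _ in range(steps):
        if rng.random() < eps:
            chosen = rng.choice(N, size=M, replace=False)      # explore
        else:
            means = sums / np.maximum(counts, 1)
            chosen = np.argsort(means)[-M:]                    # exploit top-M
        rewards = rng.random(M) < np.asarray(probs)[chosen]    # Bernoulli payouts
        counts[chosen] += 1
        sums[chosen] += rewards
        total += rewards.sum()
    return total / steps                                       # mean reward per step

# A 2-tuple-4-machine instance.
print(extended_epsilon_greedy([0.2, 0.4, 0.6, 0.8], M=2))
```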
NASA Astrophysics Data System (ADS)
Xia, D.; Xia, Z.
2017-12-01
The ability for the excitation system to adjust quickly plays a very important role in maintaining the normal operation of superconducting machines and power systems. However, the eddy currents in the electromagnetic shield of superconducting machines hinder the exciting magnetic field change and weaken the adjustment capability of the excitation system. To analyze this problem, a finite element calculation model for the transient electromagnetic field with moving parts is established. The effects of three different electromagnetic shields on the exciting magnetic field are analyzed using finite element method. The results show that the electromagnetic shield hinders the field changes significantly, the better its conductivity, the greater the effect on the superconducting machine excitation.
The human role in space: Technology, economics and optimization
NASA Technical Reports Server (NTRS)
Hall, S. B. (Editor)
1985-01-01
Man-machine interactions in space are explored in detail. The role and the degree of direct involvement of humans that will be required in future space missions are investigated. An attempt is made to establish valid criteria for allocating functional activities between humans and machines and to provide insight into the technological requirements, economics, and benefits of the human presence in space. Six basic categories of man-machine interactions are considered: manual, supported, augmented, teleoperated, supervised, and independent. Appendices are included which provide human capability data, project analyses, activity timeline profiles and data sheets for 37 generic activities, support equipment and human capabilities required in these activities, and cumulative costs as a function of activity for seven man-machine modes.
An analysis of a digital variant of the Trail Making Test using machine learning techniques.
Dahmen, Jessamyn; Cook, Diane; Fellows, Robert; Schmitter-Edgecombe, Maureen
2017-01-01
The goal of this work is to develop a digital version of a standard cognitive assessment, the Trail Making Test (TMT), and assess its utility. This paper introduces a novel digital version of the TMT and introduces a machine learning based approach to assess its capabilities. Using digital Trail Making Test (dTMT) data collected from (N = 54) older adult participants as feature sets, we use machine learning techniques to analyze the utility of the dTMT and evaluate the insights provided by the digital features. Predicted TMT scores correlate well with clinical digital test scores (r = 0.98) and paper time to completion scores (r = 0.65). Predicted TICS exhibited a small correlation with clinically derived TICS scores (r = 0.12 Part A, r = 0.10 Part B). Predicted FAB scores exhibited a small correlation with clinically derived FAB scores (r = 0.13 Part A, r = 0.29 for Part B). Digitally derived features were also used to predict diagnosis (AUC of 0.65). Our findings indicate that the dTMT is capable of measuring the same aspects of cognition as the paper-based TMT. Furthermore, the dTMT's additional data may be able to help monitor other cognitive processes not captured by the paper-based TMT alone.
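The workflow behind the quoted correlations, predicting a clinical score from digital test features and reporting Pearson's r, can be sketched as below. The feature names and data are synthetic stand-ins, and the regressor choice is an assumption rather than the authors' method.

```python
# Illustrative sketch (synthetic data): predict a paper-based TMT completion
# time from digital TMT features and report the Pearson correlation, mirroring
# the kind of r values quoted above. Feature names are hypothetical.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 54                                            # participants, as in the abstract
X = np.column_stack([
    rng.normal(30, 8, n),     # e.g. total pen-down time (s)
    rng.normal(1.5, 0.4, n),  # e.g. mean pause between circles (s)
    rng.integers(0, 4, n),    # e.g. number of errors
])
paper_tmt_time = 1.2 * X[:, 0] + 5 * X[:, 2] + rng.normal(0, 5, n)

pred = cross_val_predict(RandomForestRegressor(random_state=0), X, paper_tmt_time, cv=5)
r, _ = pearsonr(pred, paper_tmt_time)
print(f"cross-validated Pearson r = {r:.2f}")
```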
Development of metrology for freeform optics in reflection mode
NASA Astrophysics Data System (ADS)
Burada, Dali R.; Pant, Kamal K.; Mishra, Vinod; Bichra, Mohamed; Khan, Gufran S.; Sinzinger, Stefan; Shakher, Chandra
2017-06-01
The increased range of manufacturable freeform surfaces offered by the new fabrication techniques is giving opportunities to incorporate them in optical systems. However, the success of these fabrication techniques depends on the capabilities of metrology procedures and a feedback mechanism to CNC machines for optimizing the manufacturing process. Therefore, a precise and in-situ metrology technique for freeform optics is in demand. Though all the techniques available for aspheres have been extended to freeform surfaces by researchers, none has yet been incorporated into the manufacturing machine for in-situ measurement. The most obvious reason is the complexity of the optical setups to be integrated into the manufacturing platforms. The Shack-Hartmann sensor offers the potential to be incorporated into the machine environment due to its vibration insensitivity, compactness and 3D shape measurement capability from slope data. In the present work, a measurement scheme is reported in which a scanning Shack-Hartmann sensor is employed as a metrology tool for the measurement of freeform surfaces in reflection mode. Simulation studies are conducted to analyze the stitching accuracy in the presence of various misalignment errors. The proposed scheme is experimentally verified on a freeform surface with a cubic phase profile.
Vineyard, Craig M.; Verzi, Stephen J.; James, Conrad D.; ...
2015-08-10
Despite technological advances making computing devices faster, smaller, and more prevalent in today's age, data generation and collection has outpaced data processing capabilities. Simply having more compute platforms does not provide a means of addressing challenging problems in the big data era. Rather, alternative processing approaches are needed and the application of machine learning to big data is hugely important. The MapReduce programming paradigm is an alternative to conventional supercomputing approaches, and requires problem decompositions with less stringent data-passing constraints. Instead, MapReduce relies upon defining a means of partitioning the desired problem so that subsets may be computed independently and recombined to yield the net desired result. However, not all machine learning algorithms are amenable to such an approach. Game-theoretic algorithms are often innately distributed, consisting of local interactions between players without requiring a central authority, and are iterative by nature rather than requiring extensive retraining. Effectively, a game-theoretic approach to machine learning is well suited for the MapReduce paradigm and provides a novel, alternative perspective on addressing the big data problem. In this paper we present a variant of our Support Vector Machine (SVM) Game classifier which may be used in a distributed manner, and show an illustrative example of applying this algorithm.
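The sketch below illustrates only the generic map/reduce decomposition pattern the abstract refers to, training independent SVMs on data partitions and combining their votes; it is not the authors' game-theoretic SVM Game classifier, and the data are synthetic.

```python
# Sketch of the MapReduce-style pattern: fit per-partition SVMs (map) and
# combine their votes (reduce). Illustrates the decomposition only; NOT the
# authors' game-theoretic SVM Game algorithm. Data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=3000, n_features=10, random_state=0)
X_train, y_train, X_test, y_test = X[:2400], y[:2400], X[2400:], y[2400:]

def map_phase(partitions):
    # Each partition is fit independently; in a real MapReduce job these
    # would run on separate workers.
    return [SVC(kernel="rbf").fit(Xp, yp) for Xp, yp in partitions]

def reduce_phase(models, X_query):
    # Majority vote over the per-partition models.
    votes = np.stack([m.predict(X_query) for m in models])
    return (votes.mean(axis=0) >= 0.5).astype(int)

parts = [(X_train[i::4], y_train[i::4]) for i in range(4)]   # 4 partitions
models = map_phase(parts)
accuracy = (reduce_phase(models, X_test) == y_test).mean()
print(f"ensemble accuracy: {accuracy:.3f}")
```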
Comparison of fMRI data analysis by SPM99 on different operating systems.
Shinagawa, Hideo; Honda, Ei-ichi; Ono, Takashi; Kurabayashi, Tohru; Ohyama, Kimie
2004-09-01
The hardware chosen for fMRI data analysis may depend on the platform already present in the laboratory or the supporting software. In this study, we ran SPM99 software on multiple platforms to examine whether we could analyze fMRI data by SPM99, and to compare their differences and limitations in processing fMRI data, which can be attributed to hardware capabilities. Six normal right-handed volunteers participated in a study of hand-grasping to obtain fMRI data. Each subject performed a run that consisted of 98 images. The run was measured using a gradient echo-type echo planar imaging sequence on a 1.5T apparatus with a head coil. We used several personal computer (PC), Unix and Linux machines to analyze the fMRI data. There were no differences in the results obtained on several PC, Unix and Linux machines. The only limitations in processing large amounts of the fMRI data were found using PC machines. This suggests that the results obtained with different machines were not affected by differences in hardware components, such as the CPU, memory and hard drive. Rather, it is likely that the limitations in analyzing a huge amount of the fMRI data were due to differences in the operating system (OS).
NASA Astrophysics Data System (ADS)
Timoney, Padraig; Kagalwala, Taher; Reis, Edward; Lazkani, Houssam; Hurley, Jonathan; Liu, Haibo; Kang, Charles; Isbester, Paul; Yellai, Naren; Shifrin, Michael; Etzioni, Yoav
2018-03-01
In recent years, the combination of device scaling, complex 3D device architecture and tightening process tolerances has strained the capabilities of optical metrology tools to meet process needs. Two main categories of approaches have been taken to address the evolving process needs. In the first category, new hardware configurations are developed to provide more spectral sensitivity. Most of this category of work will enable next-generation optical metrology tools to maintain pace with next-generation process needs. In the second category, new innovative algorithms have been pursued to increase the value of the existing measurement signal. These algorithms aim to boost sensitivity to the measurement parameter of interest, while reducing the impact of other factors that contribute to signal variability but are not influenced by the process of interest. This paper will evaluate the suitability of machine learning to address high volume manufacturing metrology requirements in both front end of line (FEOL) and back end of line (BEOL) sectors at advanced technology nodes. In the FEOL sector, initial feasibility of predicting the inline fin CD values using machine learning has been demonstrated. In this study, OCD spectra were acquired after an etch process that occurs earlier in the process flow than where the inline CD is measured. The fin hard mask etch process is known to impact the downstream inline CD value. Figure 1 shows the correlation of predicted CD vs downstream inline CD measurement obtained after the training of the machine learning algorithm. For BEOL, machine learning is shown to provide an additional source of information in prediction of electrical resistance from structures that are not compatible with direct copper height measurement. Figure 2 compares the trench height correlation to electrical resistance (Rs) and the correlation of predicted Rs to the e-test Rs value for a far back end of line (FBEOL) metallization level across 3 products. In the case of product C, it is found that the predicted Rs correlation to the e-test value is significantly improved utilizing spectra acquired at the e-test structure. This paper will explore the considerations required to enable use of machine learning derived metrology output for improved process monitoring and control. Further results from the FEOL and BEOL sectors will be presented, together with further discussion on future proliferation of machine learning based metrology solutions in high volume manufacturing.
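As a hedged illustration of the spectra-to-parameter prediction described above, the sketch below fits a partial least squares regressor that maps measured spectra to downstream inline CD values and reports the prediction correlation. The synthetic spectra, the number of latent components and the model choice are assumptions; the paper does not disclose its actual algorithm.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    # Hypothetical training set: one OCD spectrum per site, paired with the
    # downstream inline fin CD measured for the same site.
    rng = np.random.default_rng(0)
    n_sites, n_wavelengths = 200, 150
    spectra = rng.normal(size=(n_sites, n_wavelengths))
    cd_values = spectra[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=n_sites)

    X_tr, X_te, y_tr, y_te = train_test_split(spectra, cd_values, test_size=0.3, random_state=0)
    model = PLSRegression(n_components=8)   # component count would be set by cross-validation
    model.fit(X_tr, y_tr)
    y_pred = model.predict(X_te).ravel()

    # Correlation of predicted vs. reference CD, analogous to the reported Figure 1 comparison.
    print("r =", np.corrcoef(y_pred, y_te)[0, 1])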
A High Performance Micro Channel Interface for Real-Time Industrial Image Processing
Thomas H. Drayer; Joseph G. Tront; Richard W. Conners
1995-01-01
Data collection and transfer devices are critical to the performance of any machine vision system. The interface described in this paper collects image data from a color line scan camera and transfers the data obtained into the system memory of a Micro Channel-based host computer. A maximum data transfer rate of 20 Mbytes/sec can be achieved using the DMA capabilities...
Machine learning based Intelligent cognitive network using fog computing
NASA Astrophysics Data System (ADS)
Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik
2017-05-01
In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall spectrum resource allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, the approach further strengthens system security by reducing the communication burden on the network.
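The capability-dependent selection of features and models can be pictured with a small sketch; the node profiles, thresholds, pipeline names and summary statistics below are illustrative assumptions, not the paper's design.

    import numpy as np

    def select_pipeline(cpu_cores, mem_gb):
        # Richer fog nodes run heavier features and models; constrained nodes fall back
        # to cheap detectors (all names are hypothetical).
        if cpu_cores >= 8 and mem_gb >= 16:
            return {"features": "full_spectrogram", "model": "deep_net"}
        if cpu_cores >= 4:
            return {"features": "cyclostationary_stats", "model": "svm_rbf"}
        return {"features": "energy_stats", "model": "threshold_detector"}

    def summarize(samples):
        # Periodic summary sent to the cloud: far smaller than the raw signal.
        x = np.asarray(samples, dtype=float)
        return {"mean_power": float(np.mean(x ** 2)),
                "peak_power": float(np.max(x ** 2)),
                "n_samples": int(x.size)}

    print(select_pipeline(cpu_cores=4, mem_gb=8))
    print(summarize(np.sin(np.linspace(0, 10, 1000))))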
NASA Astrophysics Data System (ADS)
Mahapatra, Prasant Kumar; Sethi, Spardha; Kumar, Amod
2015-10-01
In conventional tool positioning technique, sensors embedded in the motion stages provide the accurate tool position information. In this paper, a machine vision based system and image processing technique for motion measurement of lathe tool from two-dimensional sequential images captured using charge coupled device camera having a resolution of 250 microns has been described. An algorithm was developed to calculate the observed distance travelled by the tool from the captured images. As expected, error was observed in the value of the distance traversed by the tool calculated from these images. Optimization of errors due to machine vision system, calibration, environmental factors, etc. in lathe tool movement was carried out using two soft computing techniques, namely, artificial immune system (AIS) and particle swarm optimization (PSO). The results show better capability of AIS over PSO.
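As a rough illustration of how PSO can be applied to such an error-minimization problem, the sketch below runs a bare-bones particle swarm over a two-parameter correction model (scale and offset). The error model, bounds and coefficients are assumptions, not the authors' setup.

    import numpy as np

    def measurement_error(params, measured=10.37, reference=10.00):
        # Hypothetical objective: squared residual between the corrected vision
        # measurement and a reference distance.
        scale, offset = params
        return (scale * measured + offset - reference) ** 2

    rng = np.random.default_rng(1)
    n_particles, n_iters, dim = 20, 100, 2
    pos = rng.uniform(-1.0, 2.0, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([measurement_error(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients

    for _ in range(n_iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([measurement_error(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()

    print("best (scale, offset):", gbest, "residual:", measurement_error(gbest))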
A distributed version of the NASA Engine Performance Program
NASA Technical Reports Server (NTRS)
Cours, Jeffrey T.; Curlett, Brian P.
1993-01-01
Distributed NEPP, a version of the NASA Engine Performance Program, uses the original NEPP code but executes it in a distributed computer environment. Multiple workstations connected by a network increase the program's speed and, more importantly, the complexity of the cases it can handle in a reasonable time. Distributed NEPP uses the public domain software package, called Parallel Virtual Machine, allowing it to execute on clusters of machines containing many different architectures. It includes the capability to link with other computers, allowing them to process NEPP jobs in parallel. This paper discusses the design issues and granularity considerations that entered into programming Distributed NEPP and presents the results of timing runs.
Herrero, Héctor; Outón, Jose Luis; Puerto, Mildred; Sallé, Damien; López de Ipiña, Karmele
2017-01-01
This paper presents a state machine-based architecture, which enhances the flexibility and reusability of industrial robots, more concretely dual-arm multisensor robots. The proposed architecture, in addition to allowing absolute control of the execution, eases the programming of new applications by increasing the reusability of the developed modules. Through an easy-to-use graphical user interface, operators are able to create, modify, reuse and maintain industrial processes, increasing the flexibility of the cell. Moreover, the proposed approach is applied in a real use case in order to demonstrate its capabilities and feasibility in industrial environments. A comparative analysis is presented for evaluating the presented approach versus traditional robot programming techniques. PMID:28561750
NASA Technical Reports Server (NTRS)
Gryphon, Coranth D.; Miller, Mark D.
1991-01-01
PCLIPS (Parallel CLIPS) is a set of extensions to the C Language Integrated Production System (CLIPS) expert system language. PCLIPS is intended to provide an environment for the development of more complex, extensive expert systems. Multiple CLIPS expert systems are now capable of running simultaneously on separate processors, or separate machines, thus dramatically increasing the scope of solvable tasks within the expert systems. As a tool for parallel processing, PCLIPS allows for an expert system to add to its fact-base information generated by other expert systems, thus allowing systems to assist each other in solving a complex problem. This allows individual expert systems to be more compact and efficient, and thus run faster or on smaller machines.
NASA Technical Reports Server (NTRS)
Coker, B. L.; Kind, T. C.; Smith, W. F., Jr.; Weber, N. V.
1981-01-01
Created for analyzing and processing digital data such as that collected by multispectral scanners or digitized from maps, ELAS is designed for ease of user operation and includes its own FORTRAN operating monitor and an expandable set of application modules which are FORTRAN overlays. On those machines that do not support FORTRAN overlaying, the modules exist as subprograms. The subsystem can be implemented on most 16-bit or 32-bit machines and is capable of, but not limited to, operating on low-cost minicomputer systems. The recommended hardware configuration for ELAS and a representative listing of some operating and application modules are presented.
Zheng, Shuai; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A
2017-01-01
Background Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to take user feedbacks for improving the extraction algorithm in real time. Objective Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. Methods A clinical information extraction system IDEAL-X has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedbacks to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Results Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports—each combines history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. Conclusions IDEAL-X adopts a unique online machine learning–based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. PMID:28487265
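The learn-predict-correct loop that IDEAL-X describes can be sketched with an incremental model that is updated one document at a time from user feedback. The toy sentences, labels and the SGD classifier below are assumptions for illustration; the abstract does not specify the underlying learning model.

    from sklearn.feature_extraction.text import HashingVectorizer
    from sklearn.linear_model import SGDClassifier

    # Toy stream of report sentences with a label saying whether the sentence
    # contains the target value (a real system would extract the value itself).
    documents = [
        ("left ventricular ejection fraction is 55%", 1),
        ("patient denies chest pain", 0),
        ("ef estimated at 40 percent", 1),
        ("no prior cardiac history", 0),
    ] * 10

    vectorizer = HashingVectorizer(n_features=2 ** 12, alternate_sign=False)
    model = SGDClassifier()
    classes = [0, 1]

    for text, user_label in documents:
        X = vectorizer.transform([text])
        if hasattr(model, "coef_"):            # model has already seen at least one update
            suggestion = int(model.predict(X)[0])
        else:
            suggestion = 0                     # no model yet: default suggestion
        # The user confirms or corrects the suggestion; that feedback immediately
        # updates the model before the next document is processed.
        model.partial_fit(X, [user_label], classes=classes)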
View north of inside machine shop 36; shop floor accommodates ...
View north of inside machine shop 36; shop floor accommodates lathes capable of machining a cylinder 60 inches in diameter and 75 feet long; other equipment includes horizontal and vertical jig borers, hydraulic tube straighteners and other equipment for precision machining of large ship components. - Naval Base Philadelphia-Philadelphia Naval Shipyard, Structure Shop, League Island, Philadelphia, Philadelphia County, PA
Archiving Software Systems: Approaches to Preserve Computational Capabilities
NASA Astrophysics Data System (ADS)
King, T. A.
2014-12-01
A great deal of effort is made to preserve scientific data, not only because data is knowledge, but also because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating systems and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This future forward dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and capability to reproduce the processing and analysis used to generate past scientific results.
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
A Senior Project-Based Multiphase Motor Drive System Development
ERIC Educational Resources Information Center
Abdel-Khalik, Ayman S.; Massoud, Ahmed M.; Ahmed, Shehab
2016-01-01
Adjustable-speed drives based on multiphase motors are of significant interest for safety-critical applications that necessitate wide fault-tolerant capabilities and high system reliability. Although multiphase machines are based on the same conceptual theory as three-phase machines, most undergraduate electrical machines and electric drives…
High Power Laser Processing Of Materials
NASA Astrophysics Data System (ADS)
Martyr, D. R.; Holt, T.
1987-09-01
The first practical demonstration of a laser device was in 1960 and, in the following years, the high power carbon dioxide laser has matured as an industrial machine tool. Modern carbon dioxide gas lasers can be used for cutting, welding, heat treatment, drilling, scribing and marking. Since their invention over 25 years ago they are now becoming recognised as highly reliable devices capable of achieving huge savings in production costs in many situations. This paper introduces the basic laser processing techniques of cutting, welding and heat treatment as they apply to the most common engineering materials. Typical processing speeds achieved with a wide range of laser powers are reported. Accuracies achievable and fit-up tolerances required are presented. Methods of integrating lasers with machine tools are described and their suitability in a wide range of manufacturing industries is described by reference to recent installations. Examples from small batch manufacturing, high volume production using dedicated laser welding equipment, and high volume manufacturing using 'flexible' automated laser welding equipment are described. Future applications of laser processing are suggested by reference to current process developments.
An incremental anomaly detection model for virtual machines.
Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu
2017-01-01
The Self-Organizing Map (SOM) algorithm, as an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Besides, Cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which makes the algorithm present low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate the detection time by taking into account the large scale and high dynamic features of virtual machines on cloud platform. To demonstrate the effectiveness, experiments on a common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platform.
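A minimal SOM training loop with a weighted Euclidean distance, in the spirit of the WED idea above, is sketched below. The grid size, learning schedule, feature weights and synthetic metrics are assumptions; this is not the IISOM implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    n_features, grid_h, grid_w = 4, 5, 5
    data = rng.random((500, n_features))                 # stand-in VM performance metrics
    feat_w = np.array([2.0, 1.0, 1.0, 0.5])              # assumed per-metric importance

    codebook = rng.random((grid_h * grid_w, n_features))
    coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)], dtype=float)

    n_epochs, lr0, sigma0 = 20, 0.5, 2.0
    for epoch in range(n_epochs):
        lr = lr0 * (1 - epoch / n_epochs)
        sigma = sigma0 * (1 - epoch / n_epochs) + 0.5
        for x in data:
            # Best matching unit under the weighted Euclidean distance.
            d = np.sqrt((((codebook - x) ** 2) * feat_w).sum(axis=1))
            bmu = int(np.argmin(d))
            # Gaussian neighborhood update around the BMU on the map grid.
            g2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-g2 / (2 * sigma ** 2))
            codebook += lr * h[:, None] * (x - codebook)

    # Anomaly score of a new observation: weighted distance to the nearest codebook vector.
    new_obs = rng.random(n_features)
    score = np.min(np.sqrt((((codebook - new_obs) ** 2) * feat_w).sum(axis=1)))
    print("anomaly score:", score)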
Grazing Incidence Optics Technology
NASA Technical Reports Server (NTRS)
Ramsey, Brian; Smith, W. Scott; Gubarev, Mikhail; McCracken, Jeff
2015-01-01
This project is to demonstrate the capability to directly fabricate lightweight, high-resolution, grazing-incidence x-ray optics using a commercially available robotic polishing machine. Typical x-ray optics production at NASA Marshall Space Flight Center (MSFC) uses a replication process in which metal mirrors are electroformed on to figured and polished mandrels from which they are later removed. The attraction of this process is that multiple copies can be made from a single master. The drawback is that the replication process limits the angular resolution that can be attained. By directly fabricating each shell, errors inherent in the replication process are removed. The principal challenge now becomes how to support the mirror shell during all aspects of fabrication, including the necessary metrology to converge on the required mirror performance specifications. This program makes use of a Zeeko seven-axis computer-controlled polishing machine (see fig. 1) and supporting fabrication, metrology, and test equipment at MSFC. The overall development plan calls for proof-of-concept demonstration with relatively thick mirror shells (5-6 mm, fig. 2) which are straightforward to support and then a transition to much thinner shells (2-3 mm), which are an order of magnitude thinner than those used for Chandra. Both glass and metal substrates are being investigated. Currently, a thick glass shell is being figured. This has enabled experience to be gained with programming and operating the polishing machine without worrying about shell distortions or breakage. It has also allowed time for more complex support mechanisms for figuring/ polishing and metrology to be designed for the more challenging thinner shells. These are now in fabrication. Figure 1: Zeeko polishing machine.
An Extreme Learning Machine-Based Neuromorphic Tactile Sensing System for Texture Recognition.
Rasouli, Mahdi; Chen, Yi; Basu, Arindam; Kukreja, Sunil L; Thakor, Nitish V
2018-04-01
Despite significant advances in computational algorithms and development of tactile sensors, artificial tactile sensing is strikingly less efficient and capable than the human tactile perception. Inspired by efficiency of biological systems, we aim to develop a neuromorphic system for tactile pattern recognition. We particularly target texture recognition as it is one of the most necessary and challenging tasks for artificial sensory systems. Our system consists of a piezoresistive fabric material as the sensor to emulate skin, an interface that produces spike patterns to mimic neural signals from mechanoreceptors, and an extreme learning machine (ELM) chip to analyze spiking activity. Benefiting from intrinsic advantages of biologically inspired event-driven systems and massively parallel and energy-efficient processing capabilities of the ELM chip, the proposed architecture offers a fast and energy-efficient alternative for processing tactile information. Moreover, it provides the opportunity for the development of low-cost tactile modules for large-area applications by integration of sensors and processing circuits. We demonstrate the recognition capability of our system in a texture discrimination task, where it achieves a classification accuracy of 92% for categorization of ten graded textures. Our results confirm that there exists a tradeoff between response time and classification accuracy (and information transfer rate). A faster decision can be achieved at early time steps or by using a shorter time window. This, however, results in deterioration of the classification accuracy and information transfer rate. We further observe that there exists a tradeoff between the classification accuracy and the input spike rate (and thus energy consumption). Our work substantiates the importance of development of efficient sparse codes for encoding sensory data to improve the energy efficiency. These results have a significance for a wide range of wearable, robotic, prosthetic, and industrial applications.
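The extreme learning machine at the heart of such a system reduces to a fixed random hidden layer plus a single least-squares solve for the output weights, as in the sketch below. The random feature vectors, layer size and activation are assumptions; the actual chip operates on spike-derived features.

    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_features, n_hidden, n_classes = 300, 32, 128, 10
    X = rng.random((n_samples, n_features))              # stand-in spike-count features
    y = rng.integers(0, n_classes, n_samples)            # texture labels
    Y = np.eye(n_classes)[y]                             # one-hot targets

    # Hidden layer: random projection plus nonlinearity, never trained.
    W_in = rng.normal(size=(n_features, n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W_in + b)

    # Output weights: one pseudoinverse solve, which is what makes ELM training fast.
    W_out = np.linalg.pinv(H) @ Y

    pred = np.argmax(np.tanh(X @ W_in + b) @ W_out, axis=1)
    print("training accuracy:", (pred == y).mean())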
NASA Technical Reports Server (NTRS)
Udomkesmalee, Suraphol; Padgett, Curtis; Zhu, David; Lung, Gerald; Howard, Ayanna
2000-01-01
A three-dimensional microelectronic device (3DANN-R) capable of performing general image convolution at a speed of 10^12 operations/second (ops) in a volume of less than 1.5 cubic centimeters has been successfully built under the BMDO/JPL VIGILANTE program. 3DANN-R was developed in partnership with Irvine Sensors Corp., Costa Mesa, California. 3DANN-R is a sugar-cube-sized, low power image convolution engine that in its core computation circuitry is capable of performing 64 image convolutions with large (64x64) windows at video frame rates. This paper explores potential applications of 3DANN-R such as target recognition, SAR and hyperspectral data processing, and general machine vision using real data, and discusses technical challenges for providing deployable systems for BMDO surveillance and interceptor programs.
Food equipment manufacturer takes a slice out of its scrap rate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, D.; Hannahs, J.; Carter, M.
1996-09-01
The PMI Food Equipment Group began manufacturing circular slicer knives for its commercial Hobart line of slicers in the early 1930s. The company manufactures the only cast knife in the food industry. The cast knives offer superior edge retention and overall corrosion resistance. The slicer knives are cast in PMI's foundry. The casting process sometimes produces shrinkage voids or gas bubbles in the knife blank. Surface discontinuities often do not appear until rough cutting or final machining, i.e., after several hours of value-added manufacturing. Knife blanks with these discontinuities were scrapped and sent back to the foundry for remelting. To scrap the knives at that point meant the cost for casting plus the value-added machining added up to a considerable amount. Weld repair allows the recovery of casting and machining expenses equal to a significant percentage of the total manufacturing cost of slicer knives. Repair costs include welding, grinding, shipping, surface finishing and material handling. Other good applications for this GMAW-P process include repair of jet engine components, rotating process industry equipment, and hardfacing of cutting tools and dies. In addition, dissimilar metals and any material that is heat treated to develop its properties, such as precision investment castings, are excellent applications. The low resultant distortion, elimination of postweld heat treatment and non-line-of-sight welding capability solve thin wall, limited access and precision machined component repair challenges.
NASA Technical Reports Server (NTRS)
Bajis, Katie
1993-01-01
The characteristics and capabilities of existing machine translation systems were examined and procurement recommendations were developed. Four systems, SYSTRAN, GLOBALINK, PC TRANSLATOR, and STYLUS, were determined to meet the NASA requirements for a machine translation system. Initially, four language pairs were selected for implementation. These are Russian-English, French-English, German-English, and Japanese-English.
System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The interactive information processing for the mesoscale analysis and space sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A Research Computer System consisting of three primary computers was developed (HP-1000F, Perkin-Elmer 3250, and Harris/6) which provides a wide range of capabilities for processing and displaying interactively large volumes of remote sensing data. The development of a MASS data base management and analysis system on the HP-1000F computer and extending these capabilities by integration with the Perkin-Elmer and Harris/6 computers using the MSFC's Apple III microcomputer workstations is described. The objectives are: to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.
NASA Astrophysics Data System (ADS)
Gora, Wojciech S.; Tian, Yingtao; Cabo, Aldara Pan; Ardron, Marcus; Maier, Robert R. J.; Prangnell, Philip; Weston, Nicholas J.; Hand, Duncan P.
Additive manufacturing (AM) offers the possibility of creating a complex free form object as a single element, which is not possible using traditional mechanical machining. Unfortunately the typically rough surface finish of additively manufactured parts is unsuitable for many applications. As a result AM parts must be post-processed; typically mechanically machined and/or polished using either chemical or mechanical techniques (both of which have their limitations). Laser based polishing is based on remelting of a very thin surface layer and it offers potential as a highly repeatable, higher speed process capable of selective area polishing, and without any waste problems (no abrasives or liquids). In this paper an in-depth investigation of CW laser polishing of titanium and cobalt chrome AM elements is presented. The impact of different scanning strategies, laser parameters and initial surface condition on the achieved surface finish is evaluated.
Pegorini, Vinicius; Karam, Leandro Zen; Pitta, Christiano Santos Rocha; Cardoso, Rafael; da Silva, Jean Carlos Cardozo; Kalinowski, Hypolito José; Ribeiro, Richardson; Bertotti, Fábio Luiz; Assmann, Tangriani Simioni
2015-11-11
Pattern classification of ingestive behavior in grazing animals has extreme importance in studies related to animal nutrition, growth and health. In this paper, a system to classify chewing patterns of ruminants in in vivo experiments is developed. The proposal is based on data collected by optical fiber Bragg grating sensors (FBG) that are processed by machine learning techniques. The FBG sensors measure the biomechanical strain during jaw movements, and a decision tree is responsible for the classification of the associated chewing pattern. In this study, patterns associated with food intake of dietary supplement, hay and ryegrass were considered. Additionally, two other important events for ingestive behavior were monitored: rumination and idleness. Experimental results show that the proposed approach for pattern classification is capable of differentiating the five patterns involved in the chewing process with an overall accuracy of 94%.
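A compact sketch of the decision-tree stage is given below; the feature columns, labels and window counts are assumptions, since the real features are derived from the FBG strain waveforms.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    classes = ["supplement", "hay", "ryegrass", "rumination", "idleness"]

    # Stand-in features per chewing window, e.g. peak strain, chew rate, burst duration.
    X = rng.random((250, 3))
    y = rng.integers(0, len(classes), 250)

    clf = DecisionTreeClassifier(max_depth=5, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())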
Behind the scenes: A medical natural language processing project.
Wu, Joy T; Dernoncourt, Franck; Gehrmann, Sebastian; Tyler, Patrick D; Moseley, Edward T; Carlson, Eric T; Grant, David W; Li, Yeran; Welt, Jonathan; Celi, Leo Anthony
2018-04-01
Advancement of Artificial Intelligence (AI) capabilities in medicine can help address many pressing problems in healthcare. However, AI research endeavors in healthcare may not be clinically relevant, may have unrealistic expectations, or may not be explicit enough about their limitations. A diverse and well-functioning multidisciplinary team (MDT) can help identify appropriate and achievable AI research agendas in healthcare, and advance medical AI technologies by developing AI algorithms as well as addressing the shortage of appropriately labeled datasets for machine learning. In this paper, our team of engineers, clinicians and machine learning experts share their experience and lessons learned from their two-year-long collaboration on a natural language processing (NLP) research project. We highlight specific challenges encountered in cross-disciplinary teamwork, dataset creation for NLP research, and expectation setting for current medical AI technologies. Copyright © 2017. Published by Elsevier B.V.
A new milling machine for computer-aided, in-office restorations.
Kurbad, Andreas
Chairside computer-aided design/computer-aided manufacturing (CAD/CAM) technology requires an effective technical basis to obtain dental restorations with optimal marginal accuracy, esthetics, and longevity in as short a timeframe as possible. This article describes a compact, 5-axis milling machine based on an innovative milling technology (5XT - five-axis turn-milling technique), which is capable of achieving high-precision milling results within a very short processing time. Furthermore, the device's compact dimensioning and state-of-the-art mode of operation facilitate its use in the dental office. This model is also an option to be considered for use in smaller dental laboratories, especially as the open input format enables it to be quickly and simply integrated into digital processing systems already in use. The possibility of using ceramic and polymer materials with varying properties enables the manufacture of restorations covering all conceivable indications in the field of fixed dental prosthetics.
Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2
NASA Technical Reports Server (NTRS)
Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)
2000-01-01
A new MDO method, BLISS, and two different variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization problem into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, response surface construction and updates, are all ideally suited for concurrent processing. Needless to mention, such algorithms that can effectively exploit the concurrent processing capabilities of the compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
Catching errors with patient-specific pretreatment machine log file analysis.
Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa
2013-01-01
A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that the machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
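At its core, this kind of log-file QA reduces to a tolerance comparison between planned and delivered machine parameters. The sketch below assumes the leaf positions have already been parsed into arrays and uses an arbitrary 1 mm tolerance; it does not reproduce the Dynalog file format or the clinical analysis.

    import numpy as np

    # Hypothetical parsed data: planned and delivered leaf positions (mm),
    # shape = (n_control_points, n_leaves), from the plan and the machine log file.
    planned = np.random.default_rng(0).uniform(-50, 50, size=(100, 120))
    delivered = planned + np.random.default_rng(1).normal(scale=0.2, size=planned.shape)

    tolerance_mm = 1.0
    deviation = np.abs(delivered - planned)
    flagged = np.argwhere(deviation > tolerance_mm)

    print(f"max leaf deviation: {deviation.max():.2f} mm")
    if flagged.size:
        print(f"{len(flagged)} leaf/control-point pairs exceed {tolerance_mm} mm -> investigate")
    else:
        print("delivery within tolerance")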
A parallel Jacobson-Oksman optimization algorithm. [parallel processing (computers)
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Markos, A. T.
1975-01-01
A gradient-dependent optimization technique which exploits the vector-streaming or parallel-computing capabilities of some modern computers is presented. The algorithm, derived by assuming that the function to be minimized is homogeneous, is a modification of the Jacobson-Oksman serial minimization method. In addition to describing the algorithm, conditions insuring the convergence of the iterates of the algorithm and the results of numerical experiments on a group of sample test functions are presented. The results of these experiments indicate that this algorithm will solve optimization problems in less computing time than conventional serial methods on machines having vector-streaming or parallel-computing capabilities.
Crew systems: integrating human and technical subsystems for the exploration of space.
Connors, M M; Harrison, A A; Summit, J
1994-07-01
Space exploration missions will require combining human and technical subsystems into overall "crew systems" capable of performing under the rigorous conditions of outer space. This report describes substantive and conceptual relationships among humans, intelligent machines, and communication systems, and explores how these components may be combined to complement and strengthen one another. We identify key research issues in the combination of humans and technology and examine the role of individual differences, group processes, and environmental conditions. We conclude that a crew system is, in effect, a social cyborg, a living system consisting of multiple individuals whose capabilities are extended by advanced technology.
The Development and Calculation of an Energy-saving Plant for Obtaining Water from Atmospheric Air
NASA Astrophysics Data System (ADS)
Uglanov, D. A.; Zheleznyak, K. E.; Chertykovsev, P. A.
2018-01-01
The article shows the calculation of the characteristics of an energy-efficient water generator from atmospheric air. This installation, the atmospheric water generator, is a unique mechanism which produces safe drinking water by extracting it from air. Existing atmospheric generators produce safe drinking water by condensation at an air humidity of at least 35%, can deliver up to 25 liters of water per day, and run on electricity. To increase the volume of water produced by the generator per day, the authors propose using the following refrigerating machines instead of the condenser in the installation scheme: the vapor compression refrigerating machine (VCRM), the thermoelectric refrigerating machine (TRM) and the Stirling-cycle refrigerating machine (SRM). The paper describes calculation methods for each of the refrigerating systems. Calculation of technical-and-economic indexes for the atmospheric water generator was carried out and the optimum system with the maximum volume of water received per day was selected. The atmospheric water generator considered in the article will work from an autonomous solar power station.
2014-12-01
chemical etching; EDM, electrical discharge machine; EID, enterprise identifier; EOSS, Engineering Operational Sequencing System; F, Fahrenheit ... Center in Corona, California, released a DoN IUID Marking Guide, which made recommendations on how to mark legacy items. It provides technical ... uploaded into the IUID registry managed by the Naval Surface Warfare Center (NSWC) in Corona, California. There is no set amount of information
Online learning control using adaptive critic designs with sparse kernel machines.
Xu, Xin; Hou, Zhongsheng; Lian, Chuanqiang; He, Haibo
2013-05-01
In the past decade, adaptive critic designs (ACDs), including heuristic dynamic programming (HDP), dual heuristic programming (DHP), and their action-dependent ones, have been widely studied to realize online learning control of dynamical systems. However, because neural networks with manually designed features are commonly used to deal with continuous state and action spaces, the generalization capability and learning efficiency of previous ACDs still need to be improved. In this paper, a novel framework of ACDs with sparse kernel machines is presented by integrating kernel methods into the critic of ACDs. To improve the generalization capability as well as the computational efficiency of kernel machines, a sparsification method based on the approximately linear dependence analysis is used. Using the sparse kernel machines, two kernel-based ACD algorithms, that is, kernel HDP (KHDP) and kernel DHP (KDHP), are proposed and their performance is analyzed both theoretically and empirically. Because of the representation learning and generalization capability of sparse kernel machines, KHDP and KDHP can obtain much better performance than previous HDP and DHP with manually designed neural networks. Simulation and experimental results of two nonlinear control problems, that is, a continuous-action inverted pendulum problem and a ball and plate control problem, demonstrate the effectiveness of the proposed kernel ACD methods.
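The approximately linear dependence (ALD) test used for sparsification admits a short sketch: a new state is added to the kernel dictionary only if its kernel features cannot be well approximated by those already stored. The RBF kernel, threshold and synthetic states below are assumptions.

    import numpy as np

    def rbf(a, b, gamma=1.0):
        return np.exp(-gamma * np.sum((a - b) ** 2))

    def ald_update(dictionary, x, nu=0.01, gamma=1.0):
        # Keep x only if it is not approximately linearly dependent on the dictionary.
        if not dictionary:
            return dictionary + [x]
        K = np.array([[rbf(u, v, gamma) for v in dictionary] for u in dictionary])
        k = np.array([rbf(u, x, gamma) for u in dictionary])
        alpha = np.linalg.solve(K + 1e-8 * np.eye(len(dictionary)), k)
        delta = rbf(x, x, gamma) - k @ alpha      # squared approximation error
        return dictionary + [x] if delta > nu else dictionary

    rng = np.random.default_rng(0)
    states = rng.random((200, 2))                 # e.g. pendulum angle and angular velocity
    dictionary = []
    for s in states:
        dictionary = ald_update(dictionary, s)
    print("dictionary size after sparsification:", len(dictionary))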
Machine learning algorithms for mode-of-action classification in toxicity assessment.
Zhang, Yile; Wong, Yau Shu; Deng, Jian; Anton, Cristina; Gabos, Stephan; Zhang, Weiping; Huang, Dorothy Yu; Jin, Can
2016-01-01
Real Time Cell Analysis (RTCA) technology is used to monitor cellular changes continuously over the entire exposure period. Combined with different testing concentrations, the profiles have potential for probing the mode of action (MOA) of the tested substances. In this paper, we present machine learning approaches for MOA assessment. Computational tools based on artificial neural network (ANN) and support vector machine (SVM) are developed to analyze the time-concentration response curves (TCRCs) of human cell lines responding to tested chemicals. The techniques are capable of learning data from given TCRCs with known MOA information and then making MOA classification for the unknown toxicity. A novel data processing step based on wavelet transform is introduced to extract important features from the original TCRC data. From the dose response curves, a time interval leading to a higher classification success rate can be selected as input to enhance the performance of the machine learning algorithm. This is particularly helpful when handling cases with limited and imbalanced data. The validation of the proposed method is demonstrated by the supervised learning algorithm applied to the exposure data of the HepG2 cell line to 63 chemicals with 11 concentrations in each test case. Classification success rates in the range of 85% to 95% are obtained using SVM for MOA classification with two to four clusters. Wavelet transform is capable of capturing important features of TCRCs for MOA classification. The proposed SVM scheme incorporated with wavelet transform has great potential for large-scale MOA classification and high-throughput chemical screening.
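A sketch of the wavelet-feature plus SVM pipeline is shown below, using PyWavelets for the transform. The wavelet family, summary statistics and the synthetic response curves are assumptions, not the study's exact feature set.

    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def wavelet_features(curve, wavelet="db4", level=3):
        # Summarize each band of decomposition coefficients with simple statistics.
        coeffs = pywt.wavedec(curve, wavelet, level=level)
        return np.concatenate([[c.mean(), c.std(), np.abs(c).max()] for c in coeffs])

    rng = np.random.default_rng(0)
    curves = rng.random((120, 64))          # stand-in time-concentration response curves
    labels = rng.integers(0, 2, 120)        # stand-in MOA cluster labels

    X = np.array([wavelet_features(c) for c in curves])
    clf = SVC(kernel="rbf", C=1.0)
    print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())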
Cognitive Architectures and Autonomy: A Comparative Review
NASA Astrophysics Data System (ADS)
Thórisson, Kristinn; Helgasson, Helgi
2012-05-01
One of the original goals of artificial intelligence (AI) research was to create machines with very general cognitive capabilities and a relatively high level of autonomy. It has taken the field longer than many had expected to achieve even a fraction of this goal; the community has focused on building specific, targeted cognitive processes in isolation, and as of yet no system exists that integrates a broad range of capabilities or presents a general solution to autonomous acquisition of a large set of skills. Among the reasons for this are the highly limited machine learning and adaptation techniques available, and the inherent complexity of integrating numerous cognitive and learning capabilities in a coherent architecture. In this paper we review selected systems and architectures built expressly to address integrated skills. We highlight principles and features of these systems that seem promising for creating generally intelligent systems with some level of autonomy, and discuss them in the context of the development of future cognitive architectures. Autonomy is a key property for any system to be considered generally intelligent, in our view; we use this concept as an organizing principle for comparing the reviewed systems. Features that remain largely unaddressed in present research, but seem nevertheless necessary for such efforts to succeed, are also discussed.
Transient Simulation of Speed-No Load Conditions With An Open-Source Based C++ Code
NASA Astrophysics Data System (ADS)
Casartelli, E.; Mangani, L.; Romanelli, G.; Staubli, T.
2014-03-01
Modern reversible pump-turbines can start in turbine operation very quickly, i.e. within a few minutes. Unfortunately no clear design rules for runners with a stable start-up are available, so that certain machines can present unstable characteristics which lead to oscillations in the hydraulic system during synchronization. The so-called S-shape, i.e. the unstable characteristic in turbine brake operation, is defined by the change of sign of the slope of the head curve. In order to assess and understand this kind of instabilities with CFD, fast and reliable methods are needed. Using a 360-degree model including the complete machine from spiral casing to draft tube, the capabilities of a newly developed in-house tool are presented. An ad-hoc simulation is performed from no-load conditions into the S-shape in transient mode and using moving-mesh capabilities, thus being able to capture the opening process of the wicket gates, for example during start-up. Besides the presentation of the computational methodology, various phenomena encountered are analyzed and discussed, comparing them with measured and previously computed data, in order to show the capabilities of the developed procedure. Insight into detected phenomena is also given for global data like frequencies of vortical structures and local flow patterns.
ERIC Educational Resources Information Center
Wu, Dan; He, Daqing
2012-01-01
Purpose: This paper seeks to examine the further integration of machine translation technologies with cross language information access in providing web users the capabilities of accessing information beyond language barriers. Machine translation and cross language information access are related technologies, and yet they have their own unique…
NASA Astrophysics Data System (ADS)
Sigurdson, J.; Tagerud, J.
1986-05-01
A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from data bases. Diagrams, statistics, bibliography are included.
Production of Energy Efficient Preform Structures (PEEPS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. John A. Baumann
2012-06-08
Due to its low density, good structural characteristics, excellent fabrication properties, and attractive appearance, aluminum metal and its alloys continue to be widely utilized. The transportation industry continues to be the largest consumer of aluminum products, with aerospace as the principal driver for this use. Boeing has long been the largest single company consumer of heat-treated aluminum in the U.S. The extensive use of aluminum to build aircraft and launch vehicles has been sustained, despite the growing reliance on more structurally efficient carbon fiber reinforced composite materials. The trend in the aerospace industry over the past several decades has been to rely extensively on large, complex, thin-walled, monolithic machined structural components, which are fabricated from heavy billets and thick plate using high speed machining. The use of these high buy-to-fly ratio starting product forms, while currently cost effective, is energy inefficient, with a high environmental impact. The widespread implementation of Solid State Joining (SSJ) technologies, to produce lower buy-to-fly ratio starting forms, tailored to each specific application, offers the potential for a more sustainable manufacturing strategy, which would consume less energy, require less material, and reduce material and manufacturing costs. One objective of this project was to project the energy benefits of using SSJ techniques to produce high-performance aluminum structures if implemented in the production of the world fleet of commercial aircraft. A further objective was to produce an energy consumption prediction model, capable of calculating the total energy consumption, solid waste burden, acidification potential, and CO2 burden in producing a starting product form - whether by conventional or SSJ processes - and machining that to a final part configuration. The model needed to be capable of computing and comparing, on an individual part/geometry basis, multiple possible manufacturing pathways, to identify the best balance of energy consumption and environmental impact. This model has been created and populated with energy consumption data for individual SSJ processes and process platforms. Technology feasibility case studies were executed, to validate the model, and confirm the ability to create lower buy-to-fly ratio preforms and machine these to final configuration aircraft components. This model can now be used as a tool to select manufacturing pathways that offer significant energy savings and, when coupled with a cost model, drive implementation of the SSJ processes.
Digitalization in roll forming manufacturing
NASA Astrophysics Data System (ADS)
Sedlmaier, A.; Dietl, T.; Ferreira, P.
2017-09-01
Roll formed profiles are used in automotive chassis production as building blocks for the body-in-white. The ability to produce profiles with discontinuous cross sections, both in width and in depth, allows weight savings in the final automotive chassis through the use of load optimized cross sections. This has been the target of the 3D Roll Forming process. A machine concept is presented where a new forming concept for roll formed parts in combination with advanced robotics allowing freely positioned roll forming tooling in 3D space enables the production of complex shapes by roll forming. This is a step forward into the digitalization of roll forming manufacturing by making the process flexible and capable of rapid prototyping and production of small series of parts. Moreover, data collection in a large scale through the control system and integrated sensors lead to an increased understanding of the process and provide the basis to develop self-optimizing roll forming machines, increasing the productivity, quality and predictability of the roll-forming process. The first parts successfully manufactured with this new forming concept are presented.
Intelligent fault-tolerant controllers
NASA Technical Reports Server (NTRS)
Huang, Chien Y.
1987-01-01
A system with fault tolerant controls is one that can detect, isolate, and estimate failures and perform necessary control reconfiguration based on this new information. Artificial intelligence (AI) is concerned with semantic processing, and it has evolved to include the topics of expert systems and machine learning. This research represents an attempt to apply AI to fault tolerant controls, hence, the name intelligent fault tolerant control (IFTC). A generic solution to the problem is sought, providing a system based on logic in addition to analytical tools, and offering machine learning capabilities. The advantages are that redundant system specific algorithms are no longer needed, that reasonableness is used to quickly choose the correct control strategy, and that the system can adapt to new situations by learning about its effects on system dynamics.
A self-learning camera for the validation of highly variable and pseudorandom patterns
NASA Astrophysics Data System (ADS)
Kelley, Michael
2004-05-01
Reliable and productive manufacturing operations have depended on people to quickly detect and solve problems whenever they appear. Over the last 20 years, more and more manufacturing operations have embraced machine vision systems to increase productivity, reliability and cost-effectiveness, including reducing the number of human operators required. Although machine vision technology has long been capable of solving simple problems, it has still not been broadly implemented. The reason is that until now, no machine vision system has been designed to meet the unique demands of complicated pattern recognition. The ZiCAM family was specifically developed to be the first practical hardware to meet these needs. To be able to address non-traditional applications, the machine vision industry must include smart camera technology that meets its users' demands for lower costs, better performance and the ability to address applications of irregular lighting, patterns and color. The next-generation smart cameras will need to evolve as a fundamentally different kind of sensor, with new technology that behaves like a human but performs like a computer. Neural network based systems, coupled with self-taught, n-space, non-linear modeling, promise to be the enabler of the next generation of machine vision equipment. Image processing technology is now available that enables a system to match an operator's subjectivity. A Zero-Instruction-Set-Computer (ZISC) powered smart camera allows high-speed fuzzy-logic processing, without the need for computer programming. This can address applications of validating highly variable and pseudo-random patterns. A hardware-based implementation of a neural network, the Zero-Instruction-Set-Computer, enables a vision system to "think" and "inspect" like a human, with the speed and reliability of a machine.
Zheng, Shuai; Lu, James J; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A; Wang, Fusheng
2017-05-09
Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to incorporate user feedback for improving the extraction algorithm in real time. Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, and is thus highly adaptable. ©Shuai Zheng, James J Lu, Nima Ghasemzadeh, Salim S Hayek, Arshed A Quyyumi, Fusheng Wang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 09.05.2017.
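To illustrate the online-learning loop described above (one document at a time, user corrections fed back to the model, batch processing once accuracy is acceptable), the following sketch uses a generic incremental text classifier. It is not the IDEAL-X implementation; the label set, vectorizer, and threshold logic are assumptions for illustration only.

```python
# A minimal sketch of an online-learning extraction loop with user feedback.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**16)
model = SGDClassifier()                   # incremental linear classifier
classes = ["normal", "abnormal"]          # hypothetical label set
recent_correct = []

def process_document(text, get_user_feedback, threshold=0.95, window=20):
    """Predict, collect the user's correction, update the model, and report
    whether recent accuracy is high enough to switch to batch processing."""
    X = vectorizer.transform([text])
    predicted = model.predict(X)[0] if hasattr(model, "coef_") else classes[0]
    label = get_user_feedback(text, predicted)       # user confirms or corrects
    model.partial_fit(X, [label], classes=classes)   # real-time model update
    recent_correct.append(label == predicted)
    recent = recent_correct[-window:]
    return len(recent) == window and sum(recent) / len(recent) >= threshold
```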
An Analysis of a Digital Variant of the Trail Making Test Using Machine Learning Techniques
Dahmen, Jessamyn; Cook, Diane; Fellows, Robert; Schmitter-Edgecombe, Maureen
2017-01-01
BACKGROUND The goal of this work is to develop a digital version of a standard cognitive assessment, the Trail Making Test (TMT), and assess its utility. OBJECTIVE This paper introduces a novel digital version of the TMT and a machine learning based approach to assess its capabilities. METHODS Using digital Trail Making Test (dTMT) data collected from (N=54) older adult participants as feature sets, we use machine learning techniques to analyze the utility of the dTMT and evaluate the insights provided by the digital features. RESULTS Predicted TMT scores correlate well with clinical digital test scores (r=0.98) and paper time to completion scores (r=0.65). Predicted TICS exhibited a small correlation with clinically-derived TICS scores (r=0.12 for Part A, r=0.10 for Part B). Predicted FAB scores exhibited a small correlation with clinically-derived FAB scores (r=0.13 for Part A, r=0.29 for Part B). Digitally-derived features were also used to predict diagnosis (AUC of 0.65). CONCLUSION Our findings indicate that the dTMT is capable of measuring the same aspects of cognition as the paper-based TMT. Furthermore, the dTMT’s additional data may be able to help monitor other cognitive processes not captured by the paper-based TMT alone. PMID:27886019
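As a hedged illustration of the kind of analysis reported (predicting a clinical test score from digital features and quantifying the correlation), the sketch below uses synthetic data and an off-the-shelf regressor; the feature count, model choice, and data are assumptions, not the study's.

```python
# Synthetic example: predict a paper-based completion time from digital features
# and report the correlation between predictions and observations.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(54, 12))                     # 54 participants, 12 digital features (assumed)
y = X[:, 0] * 10 + rng.normal(scale=2, size=54)   # synthetic "paper TMT time"

pred = cross_val_predict(RandomForestRegressor(random_state=0), X, y, cv=5)
r, _ = pearsonr(pred, y)
print(f"correlation between predicted and observed scores: r = {r:.2f}")
```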
Live interactive computer music performance practice
NASA Astrophysics Data System (ADS)
Wessel, David
2002-05-01
A live-performance musical instrument can be assembled around current laptop computer technology. One adds a controller such as a keyboard or other gestural input device, a sound diffusion system, some form of connectivity processor(s) providing for audio I/O and gestural controller input, and reactive real-time native signal processing software. A system consisting of a hand gesture controller; software for gesture analysis and mapping, machine listening, composition, and sound synthesis; and a controllable radiation pattern loudspeaker is described. Interactivity begins in the set up wherein the speaker-room combination is tuned with an LMS procedure. This system was designed for improvisation. It is argued that software suitable for carrying out an improvised musical dialog with another performer poses special challenges. The processes underlying the generation of musical material must be very adaptable, capable of rapid changes in musical direction. Machine listening techniques are used to help the performer adapt to new contexts. Machine learning can play an important role in the development of such systems. In the end, as with any musical instrument, human skill is essential. Practice is required not only for the development of musically appropriate human motor programs but for the adaptation of the computer-based instrument as well.
Electrical test prediction using hybrid metrology and machine learning
NASA Astrophysics Data System (ADS)
Breton, Mary; Chao, Robin; Muthinti, Gangadhara Raja; de la Peña, Abraham A.; Simon, Jacques; Cepler, Aron J.; Sendelbach, Matthew; Gaudiello, John; Emans, Susan; Shifrin, Michael; Etzioni, Yoav; Urenski, Ronen; Lee, Wei Ti
2017-03-01
Electrical test measurement in the back-end of line (BEOL) is crucial for wafer and die sorting as well as comparing intended process splits. Any in-line, nondestructive technique in the process flow to accurately predict these measurements can significantly improve mean-time-to-detect (MTTD) of defects and improve cycle times for yield and process learning. Measuring after BEOL metallization is commonly done for process control and learning, particularly with scatterometry (also called OCD (Optical Critical Dimension)), which can solve for multiple profile parameters such as metal line height or sidewall angle and does so within patterned regions. This gives scatterometry an advantage over inline microscopy-based techniques, which provide top-down information, since such techniques can be insensitive to sidewall variations hidden under the metal fill of the trench. But when faced with correlation to electrical test measurements that are specific to the BEOL processing, both techniques face the additional challenge of sampling. Microscopy-based techniques are sampling-limited by their small probe size, while scatterometry is traditionally limited (for microprocessors) to scribe targets that mimic device ground rules but are not necessarily designed to be electrically testable. A solution to this sampling challenge lies in a fast reference-based machine learning capability that allows for OCD measurement directly of the electrically-testable structures, even when they are not OCD-compatible. By incorporating such direct OCD measurements, correlation to, and therefore prediction of, resistance of BEOL electrical test structures is significantly improved. Improvements in prediction capability for multiple types of in-die electrically-testable device structures are demonstrated. To further improve the quality of the prediction of the electrical resistance measurements, hybrid metrology using the OCD measurements as well as X-ray metrology (XRF) is used. Hybrid metrology is the practice of combining information from multiple sources in order to enable or improve the measurement of one or more critical parameters. Here, the XRF measurements are used to detect subtle changes in barrier layer composition and thickness that can have second-order effects on the electrical resistance of the test structures. By accounting for such effects with the aid of the X-ray-based measurements, further improvement in the OCD correlation to electrical test measurements is achieved. Using both types of solution, incorporation of fast reference-based machine learning on non-OCD-compatible test structures and hybrid metrology combining OCD with XRF technology, improvement in BEOL cycle-time learning could be accomplished through improved prediction capability.
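The hybrid-metrology idea, combining OCD profile parameters with XRF barrier-layer measurements to better predict electrical resistance, can be sketched as a simple regression comparison. The data and feature names below are synthetic placeholders, not the measurements used in the paper.

```python
# Illustrative comparison of resistance prediction from OCD alone vs. OCD + XRF features.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
ocd = rng.normal(size=(n, 3))      # e.g., line height, width, sidewall angle (assumed)
xrf = rng.normal(size=(n, 2))      # e.g., barrier thickness, composition (assumed)
resistance = 5 - 2 * ocd[:, 1] + 0.5 * xrf[:, 0] + rng.normal(scale=0.2, size=n)

r2_ocd = cross_val_score(LinearRegression(), ocd, resistance, cv=5, scoring="r2").mean()
r2_hybrid = cross_val_score(LinearRegression(), np.hstack([ocd, xrf]), resistance,
                            cv=5, scoring="r2").mean()
print(f"OCD only R^2: {r2_ocd:.2f}, OCD + XRF R^2: {r2_hybrid:.2f}")
```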
A novel methodology for in-process monitoring of flow forming
NASA Astrophysics Data System (ADS)
Appleby, Andrew; Conway, Alastair; Ion, William
2017-10-01
Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
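A minimal sketch of what such sound-signature monitoring might look like is given below: broadband spectrogram energy is compared against a running baseline, and frames that depart sharply are flagged as candidate failure events. The thresholding scheme and parameters are assumptions, not the authors' method.

```python
# Flag anomalous frames in a microphone recording using a robust energy threshold.
import numpy as np
from scipy.signal import spectrogram

def detect_events(audio, fs, threshold=5.0):
    f, t, Sxx = spectrogram(audio, fs=fs, nperseg=2048)
    energy = Sxx.sum(axis=0)                        # broadband energy per time frame
    baseline = np.median(energy)
    mad = np.median(np.abs(energy - baseline)) + 1e-12
    flags = (energy - baseline) / mad > threshold   # robust z-score style test
    return t[flags]                                 # times of suspected failure events

# Example with a synthetic 10 s recording at 48 kHz containing a burst at ~6 s
fs = 48000
sig = np.random.randn(10 * fs) * 0.1
sig[6 * fs:6 * fs + 2000] += 3.0
print(detect_events(sig, fs))
```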
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S.
2013-12-01
The cloud is proving to be a uniquely promising platform for scientific computing. Our experience with processing satellite data using Amazon Web Services highlights several opportunities for enhanced performance, flexibility, and cost effectiveness in the cloud relative to traditional computing -- for example: - Direct readout from a polar-orbiting satellite such as the Suomi National Polar-Orbiting Partnership (S-NPP) requires bursts of processing a few times a day, separated by quiet periods when the satellite is out of receiving range. In the cloud, by starting and stopping virtual machines in minutes, we can marshal significant computing resources quickly when needed, but not pay for them when not needed. To take advantage of this capability, we are automating a data-driven approach to the management of cloud computing resources, in which new data availability triggers the creation of new virtual machines (of variable size and processing power) which last only until the processing workflow is complete. - 'Spot instances' are virtual machines that run as long as one's asking price is higher than the provider's variable spot price. Spot instances can greatly reduce the cost of computing -- for software systems that are engineered to withstand unpredictable interruptions in service (as occurs when a spot price exceeds the asking price). We are implementing an approach to workflow management that allows data processing workflows to resume with minimal delays after temporary spot price spikes. This will allow systems to take full advantage of variably-priced 'utility computing.' - Thanks to virtual machine images, we can easily launch multiple, identical machines differentiated only by 'user data' containing individualized instructions (e.g., to fetch particular datasets or to perform certain workflows or algorithms). This is particularly useful when (as is the case with S-NPP data) we need to launch many very similar machines to process an unpredictable number of data files concurrently. Our experience shows the viability and flexibility of this approach to workflow management for scientific data processing. - Finally, cloud computing is a promising platform for distributed volunteer ('interstitial') computing, via mechanisms such as the Berkeley Open Infrastructure for Network Computing (BOINC) popularized with the SETI@Home project and others such as ClimatePrediction.net and NASA's Climate@Home. Interstitial computing faces significant challenges as commodity computing shifts from (always on) desktop computers towards smartphones and tablets (untethered and running on scarce battery power); but cloud computing offers significant slack capacity. This capacity includes virtual machines with unused RAM or underused CPUs; virtual storage volumes allocated (& paid for) but not full; and virtual machines that are paid up for the current hour but whose work is complete. We are devising ways to facilitate the reuse of these resources (i.e., cloud-based interstitial computing) for satellite data processing and related analyses. We will present our findings and research directions on these and related topics.
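One way to realize the interruption-tolerant workflow idea sketched above is to checkpoint per-granule processing state, so that a worker restarted after a spot-price interruption resumes with minimal rework. The example below is cloud-agnostic and illustrative; the stage names and file layout are assumptions, not the authors' system.

```python
# Hedged sketch of a checkpointed, resumable per-granule processing workflow.
import json, os

CHECKPOINT = "workflow_state.json"
STEPS = ["download", "calibrate", "grid", "publish"]   # assumed pipeline stages

def load_state():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {}

def save_state(state):
    with open(CHECKPOINT, "w") as f:
        json.dump(state, f)

def process_granule(granule_id, run_step):
    """run_step(granule_id, step) is a user-supplied function doing the real work."""
    state = load_state()
    done = state.get(granule_id, [])
    for step in STEPS:
        if step in done:
            continue                      # already completed before the interruption
        run_step(granule_id, step)
        done.append(step)
        state[granule_id] = done
        save_state(state)                 # checkpoint after every step
```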
Compact Microscope Imaging System with Intelligent Controls
NASA Technical Reports Server (NTRS)
McDowell, Mark
2004-01-01
The figure presents selected views of a compact microscope imaging system (CMIS) that includes a miniature video microscope, a Cartesian robot (a computer-controlled three-dimensional translation stage), and machine-vision and control subsystems. The CMIS was built from commercial off-the-shelf instrumentation, computer hardware and software, and custom machine-vision software. The machine-vision and control subsystems include adaptive neural networks that afford a measure of artificial intelligence. The CMIS can perform several automated tasks with accuracy and repeatability, tasks that heretofore have required the full attention of human technicians using relatively bulky conventional microscopes. In addition, the automation and control capabilities of the system inherently include a capability for remote control. Unlike human technicians, the CMIS is not at risk of becoming fatigued or distracted: theoretically, it can perform continuously at the level of the best human technicians. In its capabilities for remote control and for relieving human technicians of tedious routine tasks, the CMIS is expected to be especially useful in biomedical research, materials science, inspection of parts on industrial production lines, and space science. The CMIS can automatically focus on and scan a microscope sample, find areas of interest, record the resulting images, and analyze images from multiple samples simultaneously. Automatic focusing is an iterative process: The translation stage is used to move the microscope along its optical axis in a succession of coarse, medium, and fine steps. A fast Fourier transform (FFT) of the image is computed at each step, and the FFT is analyzed for its spatial-frequency content. The microscope position that results in the greatest dispersal of FFT content toward high spatial frequencies (indicating that the image shows the greatest amount of detail) is deemed to be the focal position.
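The autofocus procedure described, scoring each stage position by how much of the image's FFT energy lies at high spatial frequencies, can be sketched as follows. The cutoff fraction and search strategy are illustrative assumptions, not the CMIS implementation.

```python
# Focus metric based on the fraction of spectral energy at high spatial frequencies.
import numpy as np

def focus_score(image, cutoff_fraction=0.25):
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = spectrum.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    radius = np.hypot(yy / (h / 2), xx / (w / 2))
    high = spectrum[radius > cutoff_fraction].sum()
    return high / spectrum.sum()          # larger value -> more fine detail -> sharper

def autofocus(capture_at, positions):
    """capture_at(z) returns an image at stage position z; returns the sharpest z."""
    return max(positions, key=lambda z: focus_score(capture_at(z)))
```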
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... tooling, but should include "all property, i.e., special test equipment, ground support equipment, machine tools and machines and other intangibles to maintain capability." Response: DoD is fully...
Xie, Hong-Bo; Huang, Hu; Wu, Jianhua; Liu, Lei
2015-02-01
We present a multiclass fuzzy relevance vector machine (FRVM) learning mechanism and evaluate its performance to classify multiple hand motions using surface electromyographic (sEMG) signals. The relevance vector machine (RVM) is a sparse Bayesian kernel method which avoids some limitations of the support vector machine (SVM). However, RVM still suffers the difficulty of possible unclassifiable regions in multiclass problems. We propose two fuzzy membership function-based FRVM algorithms to solve such problems, based on experiments conducted on seven healthy subjects and two amputees with six hand motions. Two feature sets, namely, AR model coefficients and root mean square value (AR-RMS), and wavelet transform (WT) features, are extracted from the recorded sEMG signals. Fuzzy support vector machine (FSVM) analysis was also conducted for a broad comparison in terms of accuracy, sparsity, training and testing time, as well as the effect of training sample sizes. FRVM yielded comparable classification accuracy with dramatically fewer support vectors in comparison with FSVM. Furthermore, the processing delay of FRVM was much less than that of FSVM, whilst the training time of FSVM was much shorter than that of FRVM. The results indicate that the FRVM classifier trained using sufficient samples can achieve comparable generalization capability as FSVM with significant sparsity in multi-channel sEMG classification, which is more suitable for sEMG-based real-time control applications.
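For readers unfamiliar with the AR-RMS feature set, the sketch below extracts fourth-order AR coefficients (via the Yule-Walker equations) plus the RMS value from one sEMG analysis window; the model order and window length are assumptions rather than the paper's exact settings.

```python
# Hypothetical AR-RMS feature extraction for one sEMG analysis window.
import numpy as np

def ar_rms_features(window, order=4):
    """Return AR model coefficients plus the RMS value of an sEMG window."""
    x = window - np.mean(window)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    # Solve the Yule-Walker equations R a = r[1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    ar_coeffs = np.linalg.solve(R, r[1:])
    rms = np.sqrt(np.mean(window ** 2))
    return np.concatenate([ar_coeffs, [rms]])

# Example: one 200-sample window of simulated sEMG
features = ar_rms_features(np.random.randn(200))
print(features)
```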
Programmable phase plate for tool modification in laser machining applications
Thompson Jr., Charles A.; Kartz, Michael W.; Brase, James M.; Pennington, Deanna; Perry, Michael D.
2004-04-06
A system for laser machining includes a laser source for propagating a laser beam toward a target location, and a spatial light modulator having individual controllable elements capable of modifying a phase profile of the laser beam to produce a corresponding irradiance pattern on the target location. The system also includes a controller operably connected to the spatial light modulator for controlling the individual controllable elements. By controlling the individual controllable elements, the phase profile of the laser beam may be modified into a desired phase profile so as to produce a corresponding desired irradiance pattern on the target location capable of performing a machining operation on the target location.
A Study of Multifunctional Document Centers that Are Accessible to People Who Are Visually Impaired
ERIC Educational Resources Information Center
Huffman, Lee A.; Uslan, Mark M.; Burton, Darren M.; Eghtesadi, Caesar
2009-01-01
The capabilities of modern photocopy machines have advanced beyond the simple duplication of documents. In addition to the standard functions of copying, collating, and stapling, such machines can be a part of telecommunication networks and provide printing, scanning, faxing, and e-mailing functions. No longer just copy machines, these devices are…
49 CFR 214.507 - Required safety equipment for new on-track roadway maintenance machines.
Code of Federal Regulations, 2013 CFR
2013-10-01
...) A seat for each operator, except as provided in paragraph (b) of this section; (2) A safe and secure position with handholds, handrails, or a secure seat for each roadway worker transported on the machine... windshield wipers are incompatible with the windshield material; (5) A machine braking system capable of...
49 CFR 214.507 - Required safety equipment for new on-track roadway maintenance machines.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) A seat for each operator, except as provided in paragraph (b) of this section; (2) A safe and secure position with handholds, handrails, or a secure seat for each roadway worker transported on the machine... windshield wipers are incompatible with the windshield material; (5) A machine braking system capable of...
49 CFR 214.507 - Required safety equipment for new on-track roadway maintenance machines.
Code of Federal Regulations, 2012 CFR
2012-10-01
...) A seat for each operator, except as provided in paragraph (b) of this section; (2) A safe and secure position with handholds, handrails, or a secure seat for each roadway worker transported on the machine... windshield wipers are incompatible with the windshield material; (5) A machine braking system capable of...
Stirling machine operating experience
NASA Technical Reports Server (NTRS)
Ross, Brad; Dudenhoefer, James E.
1991-01-01
Numerous Stirling machines have been built and operated, but the operating experience of these machines is not well known. It is important to examine this operating experience in detail, because it largely substantiates the claim that Stirling machines are capable of reliable and lengthy lives. The amount of data that exists is impressive, considering that many of the machines that have been built are developmental machines intended to show proof of concept, and were not expected to operate for any lengthy period of time. Some Stirling machines (typically free-piston machines) achieve long life through non-contact bearings, while other Stirling machines (typically kinematic) have achieved long operating lives through regular seal and bearing replacements. In addition to engine and system testing, life testing of critical components is also considered.
Method and apparatus for characterizing and enhancing the dynamic performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F
2013-12-17
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include dynamic one axis positional accuracy of the machine tool, dynamic cross-axis stability of the machine tool, and dynamic multi-axis positional accuracy of the machine tool.
NASA Technical Reports Server (NTRS)
Vickers, John H.; Pelham, Larry I.
1993-01-01
Automated fiber placement is a manufacturing process used for producing complex composite structures. It represents a notable advance in the state of the art of automated composite manufacturing technology. The fiber placement capability was established at the Marshall Space Flight Center's (MSFC) Productivity Enhancement Complex in 1992 in collaboration with Thiokol Corporation to provide materials and processes research and development, and to fabricate components for many of the Center's programs. The Fiber Placement System (FPX) was developed as a distinct solution to problems inherent to other automated composite manufacturing systems. This equipment provides unique capabilities to build composite parts in complex 3-D shapes with concave and other asymmetrical configurations. Components with complex geometries and localized reinforcements usually require labor intensive efforts resulting in expensive, less reproducible components; the fiber placement system has the features necessary to overcome these conditions. The mechanical systems of the equipment have the motion characteristics of a filament winder and the fiber lay-up attributes of a tape laying machine, with the additional capabilities of differential tow payout speeds, compaction, and cut-restart to selectively place the correct number of fibers where the design dictates. This capability will produce a repeatable process resulting in lower cost and improved quality and reliability.
NASA Astrophysics Data System (ADS)
Walker, J. I.; Blodgett, D. L.; Suftin, I.; Kunicki, T.
2013-12-01
High-resolution data for use in environmental modeling is increasingly becoming available at broad spatial and temporal scales. Downscaled climate projections, remotely sensed landscape parameters, and land-use/land-cover projections are examples of datasets that may exceed an individual investigation's data management and analysis capacity. To allow projects on limited budgets to work with many of these data sets, the burden of working with them must be reduced. The approach being pursued at the U.S. Geological Survey Center for Integrated Data Analytics uses standard self-describing web services that allow machine to machine data access and manipulation. These techniques have been implemented and deployed in production level server-based Web Processing Services that can be accessed from a web application or scripted workflow. Data publication techniques that allow machine-interpretation of large collections of data have also been implemented for numerous datasets at U.S. Geological Survey data centers as well as partner agencies and academic institutions. Discovery of data services is accomplished using a method in which a machine-generated metadata record holds content--derived from the data's source web service--that is intended for human interpretation as well as machine interpretation. A distributed search application has been developed that demonstrates the utility of a decentralized search of data-owner metadata catalogs from multiple agencies. The integrated but decentralized system of metadata, data, and server-based processing capabilities will be presented. The design, utility, and value of these solutions will be illustrated with applied science examples and success stories. Datasets such as the EPA's Integrated Climate and Land Use Scenarios, USGS/NASA MODIS derived land cover attributes, and downscaled climate projections from several sources are examples of data this system includes. These and other datasets have been published as standard, self-describing, web services that provide the ability to inspect and subset the data. This presentation will demonstrate this file-to-web service concept and how it can be used from script-based workflows or web applications.
NASA Technical Reports Server (NTRS)
Niccum, R. J.
1972-01-01
A series of candidate materials for use in large balloons was tested and their tensile and shear strength capabilities were compared. The tests were done in a cold box at -68 C (-90 F). Some of these materials were fabricated on a special machine called the flying thread loom. This machine laminates various patterns of polyester yarn to a thin polyester film. The results show that the shear strength of materials changes with the angle selected for the transverse yarns, and substantial increases in biaxial load carrying capabilities, compared to materials formerly used, are possible. The loom capabilities and the test methods are discussed.
Principles of control automation of soil compacting machine operating mechanism
NASA Astrophysics Data System (ADS)
Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly
2018-03-01
The relevance of high-quality compaction of soil bases in the erection of embankments and foundations for buildings and structures is presented. The quality of compaction of gravel and sandy soils provides the bearing capability and, accordingly, the strength and durability of the constructed buildings. It has been established that compaction quality depends on many external factors, such as surface roughness and soil moisture, as well as the granulometry, chemical composition, and degree of elasticity of the originally filled soil. The analysis of technological processes for compacting soil bases in foreign and domestic sources showed that such an important problem as continuous monitoring of the actual degree of soil compaction during machine operation can be solved only with modern means of automation. An effective vibrodynamic method of compacting gravel and sand material for building structure foundations of various applications was justified and proposed. A method of continuous monitoring of soil compaction by measuring the amplitudes and frequencies of harmonic oscillations on the compacted surface was defined, which allowed the basic elements of the monitoring system for the soil compacting machine's operating mechanisms to be determined: an accelerometer, a bandpass filter, a vibration harmonic analyzer, and an on-board microcontroller. Adjustable parameters were established to improve the degree of soil compaction and the performance of the soil compacting machine, and the dependences of the adjustable parameters on the overall index, the degree of soil compaction, were determined experimentally. A structural scheme of automatic control of the soil compacting machine's operating mechanism and its operation algorithm have been developed.
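The measurement chain described (accelerometer, bandpass filter, harmonic amplitude estimate) might be prototyped as below; the filter band, sampling rate, and any mapping to a compaction index are assumptions for illustration only.

```python
# Estimate the dominant harmonic frequency and amplitude from accelerometer data.
import numpy as np
from scipy.signal import butter, filtfilt

def harmonic_amplitude(accel, fs, band=(20.0, 60.0)):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    filtered = filtfilt(b, a, accel)
    spectrum = np.abs(np.fft.rfft(filtered)) / len(filtered)
    freqs = np.fft.rfftfreq(len(filtered), d=1 / fs)
    peak = np.argmax(spectrum)
    return freqs[peak], 2 * spectrum[peak]   # dominant harmonic frequency and amplitude

# Example: 1 s of simulated accelerometer data with a 35 Hz drum harmonic
fs = 1000
t = np.arange(fs) / fs
signal = 0.8 * np.sin(2 * np.pi * 35 * t) + 0.2 * np.random.randn(fs)
print(harmonic_amplitude(signal, fs))
```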
NASA Astrophysics Data System (ADS)
Rajabifar, Bahram; Kim, Sanha; Slinker, Keith; Ehlert, Gregory J.; Hart, A. John; Maschmann, Matthew R.
2015-10-01
We demonstrate that vertically aligned carbon nanotubes (CNTs) can be precisely machined in a low pressure water vapor ambient using the electron beam of an environmental scanning electron microscope. The electron beam locally damages the irradiated regions of the CNT forest and also dissociates the water vapor molecules into reactive species including hydroxyl radicals. These species then locally oxidize the damaged region of the CNTs. The technique offers material removal capabilities ranging from selected CNTs to hundreds of cubic microns. We study how the material removal rate is influenced by the acceleration voltage, beam current, dwell time, operating pressure, and CNT orientation. Milled cuts with depths between 0 and 100 microns are generated, corresponding to a material removal rate of up to 20.1 μm³/min. The technique produces little carbon residue and does not disturb the native morphology of the CNT network. Finally, we demonstrate direct machining of pyramidal surfaces and re-entrant cuts to create freestanding geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajabifar, Bahram; Maschmann, Matthew R., E-mail: MaschmannM@missouri.edu; Kim, Sanha
2015-10-05
We demonstrate that vertically aligned carbon nanotubes (CNTs) can be precisely machined in a low pressure water vapor ambient using the electron beam of an environmental scanning electron microscope. The electron beam locally damages the irradiated regions of the CNT forest and also dissociates the water vapor molecules into reactive species including hydroxyl radicals. These species then locally oxidize the damaged region of the CNTs. The technique offers material removal capabilities ranging from selected CNTs to hundreds of cubic microns. We study how the material removal rate is influenced by the acceleration voltage, beam current, dwell time, operating pressure, and CNT orientation. Milled cuts with depths between 0 and 100 microns are generated, corresponding to a material removal rate of up to 20.1 μm³/min. The technique produces little carbon residue and does not disturb the native morphology of the CNT network. Finally, we demonstrate direct machining of pyramidal surfaces and re-entrant cuts to create freestanding geometries.
1988-05-01
Equipment class listing (fragment): Shearing Machines (WR/MMI DG); 3446 Forging Machinery and Hammers (WR/MMI DG); 3447 Wire and Metal Ribbon Forming Machines (WR/MMI DG); 3448 Riveting Machines (WR/MMI DG); 3449 Miscellaneous Secondary Metal Forming & Cutting Machinery (WR/MMI DG); 3450 Machine Tools, Portable (WR/MMI DG); 3455 Cutting Tools for ... Secondary Metalworking Machinery (WR/MMI DG, WR); 3465 Production Jigs, Fixtures and Templates (WR/MMI DG, WR); 3470 Machine Shop Sets, Kits, and Outfits (WR/MMI DG)
Modeling Geomagnetic Variations using a Machine Learning Framework
NASA Astrophysics Data System (ADS)
Cheung, C. M. M.; Handmer, C.; Kosar, B.; Gerules, G.; Poduval, B.; Mackintosh, G.; Munoz-Jaramillo, A.; Bobra, M.; Hernandez, T.; McGranaghan, R. M.
2017-12-01
We present a framework for data-driven modeling of Heliophysics time series data. The Solar Terrestrial Interaction Neural net Generator (STING) is an open source python module built on top of state-of-the-art statistical learning frameworks (traditional machine learning methods as well as deep learning). To showcase the capability of STING, we deploy it for the problem of predicting the temporal variation of geomagnetic fields. The data used includes solar wind measurements from the OMNI database and geomagnetic field data taken by magnetometers at US Geological Survey observatories. We examine the predictive capability of different machine learning techniques (recurrent neural networks, support vector machines) for a range of forecasting times (minutes to 12 hours). STING is designed to be extensible to other types of data. We show how STING can be used on large sets of data from different sensors/observatories and adapted to tackle other problems in Heliophysics.
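A minimal, hedged example of the forecasting setup STING addresses is given below: lagged solar-wind feature vectors are used to train a support vector regressor to predict a geomagnetic perturbation some lead time ahead. The data are synthetic and the lag structure is an assumption; STING itself is not shown.

```python
# Toy time-series forecasting setup: lagged solar-wind features -> SVR prediction.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
solar_wind = rng.normal(size=(5000, 3))           # e.g., Bz, speed, density (assumed)
target = np.roll(solar_wind[:, 0], -60) + 0.1 * rng.normal(size=5000)  # toy 60-step-ahead target

lags = 10
X = np.hstack([solar_wind[i:len(solar_wind) - lags + i] for i in range(lags)])
y = target[lags:]
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

model = SVR(C=1.0).fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))
```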
Electron Beam Freeform Fabrication for Cost Effective Near-Net Shape Manufacturing
NASA Technical Reports Server (NTRS)
Taminger, Karen M.; Hafley, Robert A.
2006-01-01
Manufacturing of structural metal parts directly from computer aided design (CAD) data has been investigated by numerous researchers over the past decade. Researchers at NASA Langley Research Center are developing a new solid freeform fabrication process, electron beam freeform fabrication (EBF3), as a rapid metal deposition process that works efficiently with a variety of weldable alloys. EBF3 deposits of 2219 aluminium and Ti-6Al-4V have exhibited a range of grain morphologies depending upon the deposition parameters. These materials have exhibited excellent tensile properties comparable to typical handbook data for wrought plate product after post-processing heat treatments. The EBF3 process is capable of bulk metal deposition at deposition rates in excess of 2500 cm3/hr (150 in3/hr) or finer detail at lower deposition rates, depending upon the desired application. This process offers the potential for rapidly adding structural details to simpler cast or forged structures rather than the conventional approach of machining large volumes of chips to produce a monolithic metallic structure. Selective addition of metal onto simpler blanks of material can have a significant effect on lead time reduction and lower material and machining costs.
Electron Beam Freeform Fabrication (EBF3) for Cost Effective Near-Net Shape Manufacturing
NASA Technical Reports Server (NTRS)
Taminger, Karen M.; Hafley, Robert A.
2006-01-01
Manufacturing of structural metal parts directly from computer aided design (CAD) data has been investigated by numerous researchers over the past decade. Researchers at NASA Langley Research Center are developing a new solid freeform fabrication process, electron beam freeform fabrication (EBF3), as a rapid metal deposition process that works efficiently with a variety of weldable alloys. EBF3 deposits of 2219 aluminium and Ti-6Al-4V have exhibited a range of grain morphologies depending upon the deposition parameters. These materials have exhibited excellent tensile properties comparable to typical handbook data for wrought plate product after post-processing heat treatments. The EBF3 process is capable of bulk metal deposition at deposition rates in excess of 2500 cubic centimeters per hour (150 in3/hr) or finer detail at lower deposition rates, depending upon the desired application. This process offers the potential for rapidly adding structural details to simpler cast or forged structures rather than the conventional approach of machining large volumes of chips to produce a monolithic metallic structure. Selective addition of metal onto simpler blanks of material can have a significant effect on lead time reduction and lower material and machining costs.
In-vivo determination of chewing patterns using FBG and artificial neural networks
NASA Astrophysics Data System (ADS)
Pegorini, Vinicius; Zen Karam, Leandro; Rocha Pitta, Christiano S.; Ribeiro, Richardson; Simioni Assmann, Tangriani; Cardozo da Silva, Jean Carlos; Bertotti, Fábio L.; Kalinowski, Hypolito J.; Cardoso, Rafael
2015-09-01
This paper reports the process of pattern classification of the chewing process of ruminants. We propose a simplified signal processing scheme for optical fiber Bragg grating (FBG) sensors based on machine learning techniques. The FBG sensors measure the biomechanical forces during jaw movements and an artificial neural network is responsible for the classification of the associated chewing pattern. In this study, three patterns associated with dietary supplement, hay, and ryegrass were considered. Additionally, two other important events for ingestive behavior studies were monitored: rumination and idle periods. Experimental results show that the proposed approach for pattern classification has been capable of differentiating the materials involved in the chewing process with a small classification error.
Thermal and Mechanical Property Characterization of the Advanced Disk Alloy LSHR
NASA Technical Reports Server (NTRS)
Gabb, Timothy P.; Gayda, John; Telesman, Jack; Kantzos, Peter T.
2005-01-01
A low solvus, high refractory (LSHR) powder metallurgy disk alloy was recently designed using experimental screening and statistical modeling of composition and processing variables on sub-scale disks to have versatile processing-property capabilities for advanced disk applications. The objective of the present study was to produce a scaled-up disk and apply varied heat treat processes to enable full-scale demonstration of LSHR properties. Scaled-up disks were produced, heat treated, sectioned, and then machined into specimens for mechanical testing. Results indicate the LSHR alloy can be processed to produce fine and coarse grain microstructures with differing combinations of strength and time-dependent mechanical properties, for application at temperatures exceeding 1300 F.
Increasing Realism in Virtual Marksmanship Simulators
2012-12-01
Acronym list fragment: M16, 5.56 mm service rifle; M2, .50-caliber machine gun; M240, 7.62 mm machine gun; M9, 9 mm Beretta; MPI, Mean Point of Impact; NHQC, Navy Handgun... (Corps 14 Concepts in Programs, 2008, p. 214). The ISMT has the capability to use a wide variety of weapons, including the .50-caliber machine gun (M2), 9... The ISMT has the unique capability to "provide immediate feedback to the instructor and trainee on weapon trigger pull, cant position, barrel
NASA Technical Reports Server (NTRS)
Riedel, Joseph E.; Grasso, Christopher A.
2012-01-01
VML (Virtual Machine Language) is an advanced computing environment that allows spacecraft to operate using mechanisms ranging from simple, time-oriented sequencing to advanced, multicomponent reactive systems. VML has developed in four evolutionary stages. VML 0 is a core execution capability providing multi-threaded command execution, integer data types, and rudimentary branching. VML 1 added named parameterized procedures, extensive polymorphism, data typing, branching, looping, issuance of commands using run-time parameters, and named global variables. VML 2 added for loops, data verification, telemetry reaction, and an open flight adaptation architecture. VML 2.1 contains major advances in control flow capabilities for executable state machines. On the resource requirements front, VML 2.1 features a reduced memory footprint in order to fit more capability into modestly sized flight processors, and endian-neutral data access for compatibility with Intel little-endian processors. Sequence packaging has been improved with object-oriented programming constructs and the use of implicit (rather than explicit) time tags on statements. Sequence event detection has been significantly enhanced with multi-variable waiting, which allows a sequence to detect and react to conditions defined by complex expressions with multiple global variables. This multi-variable waiting serves as the basis for implementing parallel rule checking, which in turn, makes possible executable state machines. The new state machine feature in VML 2.1 allows the creation of sophisticated autonomous reactive systems without the need to develop expensive flight software. Users specify named states and transitions, along with the truth conditions required, before taking transitions. Transitions with the same signal name allow separate state machines to coordinate actions: the conditions distributed across all state machines necessary to arm a particular signal are evaluated, and once found true, that signal is raised. The selected signal then causes all identically named transitions in all present state machines to be taken simultaneously. VML 2.1 has relevance to all potential space missions, both manned and unmanned. It was under consideration for use on Orion.
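The coordination mechanism described, transitions sharing a signal name that fire together only once the conditions distributed across all state machines are true, can be sketched in ordinary code (Python here, not VML). The machine names, states, and conditions are illustrative only.

```python
# Sketch of coordinated state machines driven by named signals.
class StateMachine:
    def __init__(self, name, state, transitions):
        # transitions: {(state, signal): (condition_fn, next_state)}
        self.name, self.state, self.transitions = name, state, transitions

    def armed(self, signal, env):
        key = (self.state, signal)
        return key in self.transitions and self.transitions[key][0](env)

    def take(self, signal):
        self.state = self.transitions[(self.state, signal)][1]

def step(machines, signals, env):
    for signal in signals:
        # A signal is raised only when every machine that names it is ready to take it
        involved = [m for m in machines if any(s == signal for (_, s) in m.transitions)]
        if involved and all(m.armed(signal, env) for m in involved):
            for m in involved:
                m.take(signal)

heater = StateMachine("heater", "off", {("off", "warmup"): (lambda e: e["power_ok"], "on")})
camera = StateMachine("camera", "idle", {("idle", "warmup"): (lambda e: e["temp"] > 0, "ready")})
step([heater, camera], ["warmup"], {"power_ok": True, "temp": 5})
print(heater.state, camera.state)   # both transition together: on ready
```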
A Computer-Controlled Laser Bore Scanner
NASA Astrophysics Data System (ADS)
Cheng, Charles C.
1980-08-01
This paper describes the design and engineering of a laser scanning system for production applications. The laser scanning techniques, the timing control, the logic design of the pattern recognition subsystem, the digital computer servo control for the loading and unloading of parts, and the laser probe rotation and its synchronization will be discussed. The laser inspection machine is designed to automatically inspect the surface of precision-bored holes, such as those in automobile master cylinders, without contacting the machined surface. Although the controls are relatively sophisticated, operation of the laser inspection machine is simple. A laser light beam from a commercially available gas laser, directed through a probe, scans the entire surface of the bore. Reflected light, picked up through optics by photoelectric sensors, generates signals that are fed to a mini-computer for processing. A pattern recognition program in the computer determines acceptance or rejection of the part being inspected. The system's acceptance specifications are adjustable and are set to the user's established tolerances. However, the computer-controlled laser system is capable of characterizing surface finishes from 10 to 75 rms, and voids or flaws from 0.0005 to 0.020 inch. Following the successful demonstration with an engineering prototype, the described laser machine has proved its capability to consistently ensure high-quality master brake cylinders. It thus provides a safety improvement for the automotive braking system. Flawless, smooth cylinder bores eliminate premature wearing of the rubber seals, resulting in a longer-lasting master brake cylinder and a safer and more reliable automobile. The results obtained from use of this system, which has been in operation about a year for replacement of a tedious, manual operation on one of the high-volume lines at the Bendix Hydraulics Division, have been very satisfactory.
Laser Machining of Melt Infiltrated Ceramic Matrix Composite
NASA Technical Reports Server (NTRS)
Jarmon, D. C.; Ojard, G.; Brewer, D.
2012-01-01
As interest grows in considering the use of ceramic matrix composites for critical components, the effects of different machining techniques, and the resulting machined surfaces, on strength need to be understood. This work presents the characterization of a Melt Infiltrated SiC/SiC composite material system machined by different methods. While a range of machining approaches were initially considered, only diamond grinding and laser machining were investigated on a series of tensile coupons. The coupons were tested for residual tensile strength, after a stressed steam exposure cycle. The data clearly differentiated the laser machined coupons as having better capability for the samples tested. These results, along with micro-structural characterization, will be presented.
Laser cutting: industrial relevance, process optimization, and laser safety
NASA Astrophysics Data System (ADS)
Haferkamp, Heinz; Goede, Martin; von Busse, Alexander; Thuerk, Oliver
1998-09-01
Compared to other technologically relevant laser machining processes, laser cutting is up to now the most frequently used application. With respect to the large amount of possible fields of application and the variety of different materials that can be machined, this technology has reached a stable position within the world market of material processing. The achievable machining quality in laser beam cutting is influenced by various laser and process parameters. Process integrated quality techniques have to be applied to ensure high-quality products and a cost effective use of the laser manufacturing plant. Therefore, rugged and versatile online process monitoring techniques at an affordable price would be desirable. Methods for the characterization of single plant components (e.g. laser source and optical path) have to be substituted by an omnivalent control system, capable of process data acquisition and analysis as well as the automatic adaptation of machining and laser parameters to changes in process and ambient conditions. At the Laser Zentrum Hannover eV, locally highly resolved thermographic measurements of the temperature distribution within the processing zone using cost effective measuring devices are performed. Characteristic values for cutting quality and plunge control as well as for the optimization of the surface roughness at the cutting edges can be deduced from the spatial distribution of the temperature field and the measured temperature gradients. Main influencing parameters on the temperature characteristic within the cutting zone are the laser beam intensity and pulse duration in pulsed operation mode. For continuous operation mode, the temperature distribution is mainly determined by the laser output power related to the cutting velocity. With higher cutting velocities, temperatures at the cutting front increase, reaching their maximum at the optimum cutting velocity. Here absorption of the incident laser radiation is drastically increased due to the angle between the normal of the cutting front and the laser beam axis. Besides process optimization and control, further work is focused on the characterization of particulate and gaseous laser-generated air contaminants and adequate safety precautions such as exhaust and filter systems.
Molecular-Sized DNA or RNA Sequencing Machine | NCI Technology Transfer Center | TTC
The National Cancer Institute's Gene Regulation and Chromosome Biology Laboratory is seeking statements of capability or interest from parties interested in collaborative research to co-develop a molecular-sized DNA or RNA sequencing machine.
Implementing Molecular Dynamics for Hybrid High Performance Computers - 1. Short Range Forces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, W Michael; Wang, Peng; Plimpton, Steven J
The use of accelerators such as general-purpose graphics processing units (GPGPUs) has become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high performance computers, machines with more than one type of floating-point processor, are now becoming more prevalent due to these advantages. In this work, we discuss several important issues in porting a large molecular dynamics code for use on parallel hybrid machines - 1) choosing a hybrid parallel decomposition that works on central processing units (CPUs) with distributed memory and accelerator cores with shared memory, 2) minimizing the amount of code that must be ported for efficient acceleration, 3) utilizing the available processing power from both many-core CPUs and accelerators, and 4) choosing a programming model for acceleration. We present our solution to each of these issues for short-range force calculation in the molecular dynamics package LAMMPS. We describe algorithms for efficient short range force calculation on hybrid high performance machines. We describe a new approach for dynamic load balancing of work between CPU and accelerator cores. We describe the Geryon library that allows a single code to compile with both CUDA and OpenCL for use on a variety of accelerators. Finally, we present results on a parallel test cluster containing 32 Fermi GPGPUs and 180 CPU cores.
Quantum-enhanced feature selection with forward selection and backward elimination
NASA Astrophysics Data System (ADS)
He, Zhimin; Li, Lvzhou; Huang, Zhiming; Situ, Haozhen
2018-07-01
Feature selection is a well-known preprocessing technique in machine learning, which can remove irrelevant features to improve the generalization capability of a classifier and reduce training and inference time. However, feature selection is time-consuming, particularly for applications that have thousands of features, such as image retrieval, text mining, and microarray data analysis. It is crucial to accelerate the feature selection process. We propose a quantum version of wrapper-based feature selection, which converts classical feature selection to its quantum counterpart. It is valuable for machine learning on a quantum computer. In this paper, we focus on two popular kinds of feature selection methods, i.e., wrapper-based forward selection and backward elimination. The proposed feature selection algorithm can quadratically accelerate the classical one.
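For context, the classical wrapper-based forward selection that the proposed algorithm accelerates can be written as a greedy loop over cross-validated scores, as below; the stopping rule and estimator are assumptions, and the quantum counterpart is not sketched.

```python
# Classical wrapper-based forward selection: greedily add the feature that most
# improves cross-validated accuracy, stopping when no candidate improves the score.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def forward_selection(X, y, estimator, max_features=None):
    n_features = X.shape[1]
    selected, best_scores = [], []
    while len(selected) < (max_features or n_features):
        remaining = [f for f in range(n_features) if f not in selected]
        scores = {f: cross_val_score(estimator, X[:, selected + [f]], y, cv=5).mean()
                  for f in remaining}
        best = max(scores, key=scores.get)
        if best_scores and scores[best] <= best_scores[-1]:
            break
        selected.append(best)
        best_scores.append(scores[best])
    return selected, best_scores

# Example on synthetic data with two informative features
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8)); y = (X[:, 2] + X[:, 5] > 0).astype(int)
print(forward_selection(X, y, LogisticRegression(), max_features=4))
```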
Identification Of Cells With A Compact Microscope Imaging System With Intelligent Controls
NASA Technical Reports Server (NTRS)
McDowell, Mark (Inventor)
2006-01-01
A Microscope Imaging System (CMIS) with intelligent controls is disclosed that provides techniques for scanning, identifying, detecting and tracking microscopic changes in selected characteristics or features of various surfaces including, but not limited to, cells, spheres, and manufactured products subject to difficult-to-see imperfections. The practice of the present invention provides applications that include colloidal hard spheres experiments, biological cell detection for patch clamping, cell movement and tracking, as well as defect identification in products, such as semiconductor devices, where surface damage can be significant, but difficult to detect. The CMIS system is a machine vision system, which combines intelligent image processing with remote control capabilities and provides the ability to autofocus on a microscope sample, automatically scan an image, and perform machine vision analysis on multiple samples simultaneously.
Tracking of Cells with a Compact Microscope Imaging System with Intelligent Controls
NASA Technical Reports Server (NTRS)
McDowell, Mark (Inventor)
2007-01-01
A Microscope Imaging System (CMIS) with intelligent controls is disclosed that provides techniques for scanning, identifying, detecting and tracking microscopic changes in selected characteristics or features of various surfaces including, but not limited to, cells, spheres, and manufactured products subject to difficult-to-see imperfections. The practice of the present invention provides applications that include colloidal hard spheres experiments, biological cell detection for patch clamping, cell movement and tracking, as well as defect identification in products, such as semiconductor devices, where surface damage can be significant, but difficult to detect. The CMIS system is a machine vision system, which combines intelligent image processing with remote control capabilities and provides the ability to autofocus on a microscope sample, automatically scan an image, and perform machine vision analysis on multiple samples simultaneously.
Tracking of cells with a compact microscope imaging system with intelligent controls
NASA Technical Reports Server (NTRS)
McDowell, Mark (Inventor)
2007-01-01
A Microscope Imaging System (CMIS) with intelligent controls is disclosed that provides techniques for scanning, identifying, detecting and tracking microscopic changes in selected characteristics or features of various surfaces including, but not limited to, cells, spheres, and manufactured products subject to difficult-to-see imperfections. The practice of the present invention provides applications that include colloidal hard spheres experiments, biological cell detection for patch clamping, cell movement and tracking, as well as defect identification in products, such as semiconductor devices, where surface damage can be significant, but difficult to detect. The CMIS system is a machine vision system, which combines intelligent image processing with remote control capabilities and provides the ability to auto-focus on a microscope sample, automatically scan an image, and perform machine vision analysis on multiple samples simultaneously.
Operation of a Cartesian Robotic System in a Compact Microscope with Intelligent Controls
NASA Technical Reports Server (NTRS)
McDowell, Mark (Inventor)
2006-01-01
A Microscope Imaging System (CMIS) with intelligent controls is disclosed that provides techniques for scanning, identifying, detecting and tracking microscopic changes in selected characteristics or features of various surfaces including, but not limited to, cells, spheres, and manufactured products subject to difficult-to-see imperfections. The practice of the present invention provides applications that include colloidal hard spheres experiments, biological cell detection for patch clamping, cell movement and tracking, as well as defect identification in products, such as semiconductor devices, where surface damage can be significant, but difficult to detect. The CMIS system is a machine vision system, which combines intelligent image processing with remote control capabilities and provides the ability to autofocus on a microscope sample, automatically scan an image, and perform machine vision analysis on multiple samples simultaneously.
Wearable Technology in Medicine: Machine-to-Machine (M2M) Communication in Distributed Systems.
Schmucker, Michael; Yildirim, Kemal; Igel, Christoph; Haag, Martin
2016-01-01
Smart wearables are capable of supporting physicians during various processes in medical emergencies. Nevertheless, it is almost impossible to operate several computers without neglecting a patient's treatment. Thus, it is necessary to set up a distributed network consisting of two or more computers to exchange data or initiate remote procedure calls (RPC). If it is not possible to create flawless connections between those devices, it is not possible to transfer medically relevant data to the most suitable device or to control one device with another. This paper shows how wearables can be paired and what problems occur when trying to pair several wearables. Furthermore, the scenarios that are possible in the context of emergency medicine/paramedicine are described.
Method and apparatus for characterizing and enhancing the functional performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F; Smith, Kevin Scott; Assaid, Thomas S; McFarland, Justin T; Tursky, David A; Woody, Bethany; Adams, David
2013-04-30
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include workpiece surface finish, and the ability to generate chips of the desired length.
Self-assembled software and method of overriding software execution
Bouchard, Ann M.; Osbourn, Gordon C.
2013-01-08
A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.
Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †
Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio
2017-01-01
Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities are targeted. PMID:28287448
NASA Technical Reports Server (NTRS)
1975-01-01
The NASA structural analysis (NASTRAN) computer program is operational on three series of third generation computers. The problems and difficulties involved in adapting NASTRAN to a fourth generation computer, namely the Control Data STAR-100, are discussed. The salient features which distinguish the Control Data STAR-100 from third generation computers are hardware vector processing capability and virtual memory. A feasible method is presented for transferring NASTRAN to the Control Data STAR-100 system while retaining much of the machine-independent code. Basic matrix operations are noted for optimization for vector processing.
Cell and module formation research area
NASA Technical Reports Server (NTRS)
Bickler, D. B.
1982-01-01
Metallization is discussed. The influence of hydrogen on the firing of base-metal pastes in reducing atmospheres is reported. A method for the optimization of metallization patterns is presented. A process sequence involving an AR coating and a thick-film metallization system capable of penetrating the AR coating during firing is reported. The design and construction of the NMA implantation machine are reported. Implanted back-surface fields and NMA primary (front) junctions are discussed. The use of glass beads, a wave-soldering device, and ion milling is reported. Processing through module fabrication and the environmental testing of the module design are reported. Metallization patterns obtained by mathematical optimization are assessed.
Experimental Investigation and Optimization of Response Variables in WEDM of Inconel - 718
NASA Astrophysics Data System (ADS)
Karidkar, S. S.; Dabade, U. A.
2016-02-01
Effective utilisation of Wire Electrical Discharge Machining (WEDM) technology is a challenge for modern manufacturing industries. Day by day, new materials with high strengths and capabilities are being developed to fulfil customers' needs. Inconel - 718 is one such material, extensively used in aerospace applications such as gas turbines, rocket motors, and spacecraft, as well as in nuclear reactors and pumps. This paper deals with the experimental investigation of optimal machining parameters in WEDM for Surface Roughness, Kerf Width and Dimensional Deviation using a design of experiments (DoE) approach, the Taguchi L9 orthogonal array. With the peak current kept constant at 70 A, the effects of the other process parameters on the above response variables were analysed. The obtained experimental results were statistically analysed using Minitab-16 software. Analysis of Variance (ANOVA) shows pulse on time as the most influential parameter, followed by wire tension, whereas spark gap set voltage is observed to be a non-influencing parameter. The multi-objective optimization technique Grey Relational Analysis (GRA) shows optimal machining parameters of pulse on time 108 machine units, spark gap set voltage 50 V and wire tension 12 g for the response variables considered in the experimental analysis.
Vann, Charles S.
1999-01-01
This small, non-contact optical sensor increases the capability and flexibility of computer controlled machines by detecting its relative position to a workpiece in all six degrees of freedom (DOF). At a fraction of the cost, it is over 200 times faster and up to 25 times more accurate than competing 3-DOF sensors. Applications range from flexible manufacturing to a 6-DOF mouse for computers. Until now, highly agile and accurate machines have been limited by their inability to adjust to changes in their tasks. By enabling them to sense all six degrees of position, these machines can now adapt to new and complicated tasks without human intervention or delay--simplifying production, reducing costs, and enhancing the value and capability of flexible manufacturing.
NASA Astrophysics Data System (ADS)
Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.
2017-12-01
Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. The tool is available at https://github.com/USGS-Astrogeology/PySAT_Point_Spectra_GUI. [1] Clegg, S.M., et al. (2017) Spectrochim Acta B. 129, 64-85. [2] Gaddis, L. et al. (2017) 3rd Planetary Data Workshop, #1986. [3] http://scikit-learn.org/ [4] Anderson, R.B., et al. (2017) Spectrochim. Acta B. 129, 49-57.
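The abstract above leans on scikit-learn for the regression step. As an illustration only (not PySAT code), the sketch below compares PLS and LASSO regressors on synthetic spectra; the sample counts, channel counts and coefficients are invented stand-ins for LIBS data.

```python
# Minimal sketch (not PySAT itself): comparing PLS and LASSO regression
# on synthetic "spectra", in the spirit of the workflow described above.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_samples, n_channels = 200, 500            # hypothetical LIBS-like data
X = rng.normal(size=(n_samples, n_channels))
true_coef = np.zeros(n_channels)
true_coef[::25] = rng.normal(size=20)       # a few informative channels
y = X @ true_coef + 0.1 * rng.normal(size=n_samples)   # "concentration"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, model in [("PLS", PLSRegression(n_components=10)),
                    ("LASSO", LassoCV(cv=5))]:
    model.fit(X_train, y_train)
    rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.3f}")
```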
Intelligence-Augmented Rat Cyborgs in Maze Solving.
Yu, Yipeng; Pan, Gang; Gong, Yongyue; Xu, Kedi; Zheng, Nenggan; Hua, Weidong; Zheng, Xiaoxiang; Wu, Zhaohui
2016-01-01
Cyborg intelligence is an emerging kind of intelligence paradigm. It aims to deeply integrate machine intelligence with biological intelligence by connecting machines and living beings via neural interfaces, enhancing strength by combining the biological cognition capability with the machine computational capability. Cyborg intelligence is considered to be a new way to augment living beings with machine intelligence. In this paper, we build rat cyborgs to demonstrate how they can expedite the maze escape task with integration of machine intelligence. We compare the performance of maze solving by computer, by individual rats, and by computer-aided rats (i.e. rat cyborgs). They were asked to find their way from a constant entrance to a constant exit in fourteen diverse mazes. Performance of maze solving was measured by steps, coverage rates, and time spent. The experimental results with six rats and their intelligence-augmented rat cyborgs show that rat cyborgs have the best performance in escaping from mazes. These results provide a proof-of-principle demonstration for cyborg intelligence. In addition, our novel cyborg intelligent system (rat cyborg) has great potential in various applications, such as search and rescue in complex terrains.
Ponce, Hiram; Martínez-Villaseñor, María de Lourdes; Miralles-Pechuán, Luis
2016-07-05
Human activity recognition has gained more interest in several research communities given that understanding user activities and behavior helps to deliver proactive and personalized services. There are many examples of health systems improved by human activity recognition. Nevertheless, the human activity recognition classification process is not an easy task. Different types of noise in wearable sensor data frequently hamper the human activity recognition classification process. In order to develop a successful activity recognition system, it is necessary to use stable and robust machine learning techniques capable of dealing with noisy data. In this paper, we present the artificial hydrocarbon networks (AHN) technique to the human activity recognition community. Our novel artificial hydrocarbon networks approach is suitable for physical activity recognition, tolerant of noise from corrupted sensor data, and robust against various issues in the sensor data. We proved that the AHN classifier is very competitive for physical activity recognition and is very robust in comparison with other well-known machine learning methods.
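Artificial hydrocarbon networks are not part of common machine learning libraries, so the sketch below substitutes a conventional ensemble classifier purely to illustrate the kind of noisy-sensor activity-recognition evaluation the abstract describes; all data, feature counts and class labels are synthetic assumptions.

```python
# Illustrative sketch only: a stand-in classifier evaluated on deliberately
# noise-corrupted, windowed sensor features with cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_windows, n_features = 600, 40                 # hypothetical windowed sensor features
X = rng.normal(size=(n_windows, n_features))
y = np.digitize(X[:, 0] + X[:, 1], bins=[-1.0, 0.0, 1.0])   # four toy activity classes
X_noisy = X + 0.5 * rng.normal(size=X.shape)    # simulate corrupted sensor data

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X_noisy, y, cv=5)
print("mean CV accuracy on noisy features:", scores.mean())
```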
Integrated flexible manufacturing program for manufacturing automation and rapid prototyping
NASA Technical Reports Server (NTRS)
Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.
1993-01-01
The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.
Pyramidal neurovision architecture for vision machines
NASA Astrophysics Data System (ADS)
Gupta, Madan M.; Knopf, George K.
1993-08-01
The vision system employed by an intelligent robot must be active; active in the sense that it must be capable of selectively acquiring the minimal amount of relevant information for a given task. An efficient active vision system architecture that is based loosely upon the parallel-hierarchical (pyramidal) structure of the biological visual pathway is presented in this paper. Although the computational architecture of the proposed pyramidal neuro-vision system is far less sophisticated than the architecture of the biological visual pathway, it does retain some essential features such as the converging multilayered structure of its biological counterpart. In terms of visual information processing, the neuro-vision system is constructed from a hierarchy of several interactive computational levels, whereupon each level contains one or more nonlinear parallel processors. Computationally efficient vision machines can be developed by utilizing both the parallel and serial information processing techniques within the pyramidal computing architecture. A computer simulation of a pyramidal vision system for active scene surveillance is presented.
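As a rough illustration of the converging multilayered idea described above, the sketch below builds a simple image pyramid by 2x2 block averaging; the paper's neuro-vision levels would instead apply nonlinear parallel processors at each level, and the input frame here is random data.

```python
# Minimal pyramid sketch, assuming a simple 2x2 block-average reduction.
import numpy as np

def reduce_2x2(img):
    """Halve resolution by averaging non-overlapping 2x2 blocks."""
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def build_pyramid(img, levels=4):
    pyramid = [img]
    for _ in range(levels - 1):
        pyramid.append(reduce_2x2(pyramid[-1]))
    return pyramid

image = np.random.default_rng(2).random((256, 256))   # stand-in for a camera frame
for level, layer in enumerate(build_pyramid(image)):
    # Coarse levels could drive attention; fine levels refine the result.
    print(f"level {level}: {layer.shape}, mean intensity {layer.mean():.3f}")
```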
NASA Technical Reports Server (NTRS)
Landgrebe, D.
1974-01-01
A broad study is described to evaluate a set of machine analysis and processing techniques applied to ERTS-1 data. Based on the analysis results in urban land use analysis and soil association mapping, together with previously reported results in general earth surface feature identification and crop species classification, a profile of the general applicability of this procedure is beginning to emerge. Put in the hands of a user who knows well the information needed from the data and is also familiar with the region to be analyzed, these methods appear capable of generating significantly useful information. When supported by preprocessing techniques such as geometric correction and temporal registration capabilities, final products readily usable by user agencies appear possible. In parallel with application, through further research, there is much potential for further development of these techniques, both with regard to providing higher performance and in new situations not yet studied.
Acuña, Gonzalo; Ramirez, Cristian; Curilem, Millaray
2014-01-01
The lack of sensors for some relevant state variables in fermentation processes can be addressed by developing appropriate software sensors. In this work, NARX-ANN, NARMAX-ANN, NARX-SVM and NARMAX-SVM models are compared when acting as software sensors of biomass concentration for a solid substrate cultivation (SSC) process. Results show that NARMAX-SVM outperforms the other models, with an SMAPE index under 9 for 20% amplitude noise. In addition, NARMAX models perform better than NARX models under the same noise conditions because of their better predictive capabilities, as they include prediction errors as inputs. In the case of perturbation of the initial conditions of the autoregressive variable, NARX models exhibited better convergence capabilities. This work also confirms that a difficult-to-measure variable, like biomass concentration, can be estimated on-line from easy-to-measure variables like CO₂ and O₂ using an adequate software sensor based on computational intelligence techniques.
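A minimal sketch of a NARX-style software sensor of the kind compared above: the hard-to-measure state is regressed on lagged values of itself and of an easy-to-measure input using an SVM, and a NARMAX variant would additionally feed back lagged prediction errors. The toy dynamics, lag count and SVR settings are assumptions, not the paper's configuration.

```python
# Sketch of a NARX-SVM style software sensor on synthetic process data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
T = 300
u = rng.random(T)                 # easy-to-measure input (e.g. CO2 evolution rate)
y = np.zeros(T)                   # hard-to-measure state (e.g. biomass concentration)
for t in range(1, T):             # toy first-order process dynamics
    y[t] = 0.9 * y[t - 1] + 0.5 * u[t - 1] + 0.01 * rng.normal()

lags = 3
X = np.array([np.concatenate([y[t - lags:t], u[t - lags:t]]) for t in range(lags, T)])
target = y[lags:]                 # one-step-ahead target

model = SVR(kernel="rbf", C=10.0).fit(X[:200], target[:200])
print("one-step-ahead R^2 on held-out samples:", model.score(X[200:], target[200:]))
```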
Redman, Joseph S; Natarajan, Yamini; Hou, Jason K; Wang, Jingqi; Hanif, Muzammil; Feng, Hua; Kramer, Jennifer R; Desiderio, Roxanne; Xu, Hua; El-Serag, Hashem B; Kanwal, Fasiha
2017-10-01
Natural language processing is a powerful technique of machine learning capable of maximizing data extraction from complex electronic medical records. We utilized this technique to develop algorithms capable of "reading" full-text radiology reports to accurately identify the presence of fatty liver disease. Abdominal ultrasound, computerized tomography, and magnetic resonance imaging reports were retrieved from the Veterans Affairs Corporate Data Warehouse from a random national sample of 652 patients. Radiographic fatty liver disease was determined by manual review by two physicians and verified with an expert radiologist. A split validation method was utilized for algorithm development. For all three imaging modalities, the algorithms could identify fatty liver disease with >90% recall and precision, with F-measures >90%. These algorithms could be used to rapidly screen patient records to establish a large cohort to facilitate epidemiological and clinical studies and examine the clinical course and outcomes of patients with radiographic hepatic steatosis.
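The abstract does not publish its algorithm, so the sketch below is only a generic illustration of mapping radiology-report text to a fatty-liver label with a bag-of-words model; the example reports, labels and pipeline choices are invented.

```python
# Not the authors' algorithm: a minimal text-classification sketch.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reports = [
    "liver demonstrates increased echogenicity consistent with hepatic steatosis",
    "diffuse fatty infiltration of the liver is noted",
    "the liver is normal in size and echotexture without focal lesion",
    "no evidence of steatosis or focal hepatic abnormality",
]
labels = [1, 1, 0, 0]   # 1 = radiographic fatty liver present (invented labels)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(reports, labels)
print(clf.predict(["mild hepatic steatosis without focal lesion"]))
```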
Spin dynamics modeling in the AGS based on a stepwise ray-tracing method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dutheil, Yann
The AGS provides a polarized proton beam to RHIC. The beam is accelerated in the AGS from Gγ = 4.5 to Gγ = 45.5, and the polarization transmission is critical to the RHIC spin program. In recent years, various systems were implemented to improve the AGS polarization transmission. These upgrades include the double partial snakes configuration and the tune jumps system. However, 100% polarization transmission through the AGS acceleration cycle is not yet reached. The current efficiency of the polarization transmission is estimated to be around 85% in typical running conditions. Understanding the sources of depolarization in the AGS is critical to improve the AGS polarized proton performance. The complexity of beam and spin dynamics, which is in part due to the specialized Siberian snake magnets, drove a strong interest in original simulation methods. For that, the Zgoubi code, capable of direct particle and spin tracking through field maps, was used here to model the AGS. A model of the AGS using the Zgoubi code was developed and interfaced with the current system through a simple command, the AgsFromSnapRampCmd. Interfacing with the machine control system allows for fast modeling using actual machine parameters. Those developments allowed the model to realistically reproduce the optics of the AGS along the acceleration ramp. Additional developments on the Zgoubi code, as well as on post-processing and pre-processing tools, granted long-term multiturn beam tracking capabilities: the tracking of realistic beams along the complete AGS acceleration cycle. Beam multiturn tracking simulations in the AGS, using realistic beam and machine parameters, provided a unique insight into the mechanisms behind the evolution of the beam emittance and polarization during the acceleration cycle. Post-processing software was developed to allow the representation of the relevant quantities from the Zgoubi simulation data. The Zgoubi simulations proved particularly useful for better understanding the polarization losses through horizontal intrinsic spin resonances. The Zgoubi model as well as the tools developed were also used for some direct applications. For instance, some beam experiment simulations allowed an accurate estimation of the expected polarization gains from machine changes. In particular, the simulations involving the tune jumps system provided an accurate estimation of polarization gains and the optimum settings that would improve the performance of the AGS.
A semi-automated process for the production of custom-made shoes
NASA Technical Reports Server (NTRS)
Farmer, Franklin H.
1991-01-01
A more efficient, cost-effective and timely way of designing and manufacturing custom footwear is needed. A potential solution to this problem lies in the use of computer-aided design and manufacturing (CAD/CAM) techniques in the production of custom shoes. A prototype computer-based system was developed; the system is primarily a software entity which directs and controls a 3-D scanner, a lathe or milling machine, and a pattern-cutting machine to produce the shoe last and the components to be assembled into a shoe. The steps in this process are: (1) scan the surface of the foot to obtain a 3-D image; (2) thin the foot surface data and create a tiled wire model of the foot; (3) interactively modify the wire model of the foot to produce a model of the shoe last; (4) machine the last; (5) scan the surface of the last and verify that it correctly represents the last model; (6) design cutting patterns for shoe uppers; (7) cut uppers; (8) machine an inverse mold for the shoe innersole/sole combination; (9) mold the innersole/sole; and (10) assemble the shoe. For all its capabilities, this system still requires the direction and assistance of skilled operators and shoemakers to assemble the shoes. Currently, the system is running on a SUN3/260 workstation with a TAAC application accelerator. The software elements of the system are written in either Fortran or C and run under a UNIX operating system.
Method and system for rendering and interacting with an adaptable computing environment
Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM
2012-06-12
An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.
Blood Irradiator Interactive Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howington, John; Potter, Charles; DeGroff, Tavias
The “Blood Irradiator Interactive Tool” compares a typical Cs-137 Blood Irradiator with the capabilities of an average X-ray Irradiator. It is designed to inform the user about the potential capabilities that an average X-ray Irradiator could offer them. Specifically, the tool compares the number of blood bags that can be irradiated by the users' machine with that of the average X-ray capability. It also forecasts the amount of blood that can be irradiated on a yearly basis for both the users' machine and an average X-ray device. The average X-ray capabilities are taken from the three X-ray devices currently on the market: the RS 3400 Rad Source X-ray Blood Irradiator and both the 2.0 L and 3.5 L versions of the Best Theratronics Raycell MK2.
The microstructure of aluminum A5083 butt joint by friction stir welding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jasri, M. A. H. M.; Afendi, M.; Ismail, A.
This study presents the microstructure of an aluminum A5083 butt joint surface after it has been joined by the friction stir welding (FSW) process. The FSW process is a unique welding method because it does not change the chemical properties of the welded metals. In this study, a MILKO 37 milling machine was modified to run the FSW process on a 4 mm plate aluminum A5083 butt joint. For the experiment, travel speed and tool rotational speed, chosen within the capability of the machine, were varied to run the FSW process. The concentrated heat from the tool to the aluminum plate changes the plate from a solid to a plastic state. The two aluminum plates merge into one plate in the plastic state and return to solid as the concentrated heat gradually moves away. After that, the surface and cross section of the welded aluminum were investigated with a microscope at 400x magnification. The welding defect in the FSW aluminum was identified. Then, the result was compared to the American Welding Society (AWS) FSW standard to decide whether the plate can be accepted or rejected.
NASA Astrophysics Data System (ADS)
Paradis, Daniel; Lefebvre, René; Gloaguen, Erwan; Rivera, Alfonso
2015-01-01
The spatial heterogeneity of hydraulic conductivity (K) exerts a major control on groundwater flow and solute transport. The heterogeneous spatial distribution of K can be imaged using indirect geophysical data as long as reliable relations exist to link geophysical data to K. This paper presents a nonparametric learning machine approach to predict aquifer K from cone penetrometer tests (CPT) coupled with a soil moisture and resistivity probe (SMR) using relevance vector machines (RVMs). The learning machine approach is demonstrated with an application to a heterogeneous unconsolidated littoral aquifer in a 12 km2 subwatershed, where relations between K and multiparameters CPT/SMR soundings appear complex. Our approach involved fuzzy clustering to define hydrofacies (HF) on the basis of CPT/SMR and K data prior to the training of RVMs for HFs recognition and K prediction on the basis of CPT/SMR data alone. The learning machine was built from a colocated training data set representative of the study area that includes K data from slug tests and CPT/SMR data up-scaled at a common vertical resolution of 15 cm with K data. After training, the predictive capabilities of the learning machine were assessed through cross validation with data withheld from the training data set and with K data from flowmeter tests not used during the training process. Results show that HF and K predictions from the learning machine are consistent with hydraulic tests. The combined use of CPT/SMR data and RVM-based learning machine proved to be powerful and efficient for the characterization of high-resolution K heterogeneity for unconsolidated aquifers.
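scikit-learn does not ship a relevance vector machine, so the sketch below uses ARDRegression, a sparse Bayesian linear model, as a rough stand-in to show the flavour of predicting (log) hydraulic conductivity from CPT/SMR-like features; the feature count, coefficients and noise level are synthetic assumptions.

```python
# Stand-in sketch (not an RVM): sparse Bayesian regression of log K on
# synthetic multiparameter sounding features.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n, p = 300, 6                              # hypothetical CPT/SMR channels
X = rng.normal(size=(n, p))
log_K = 1.5 * X[:, 0] - 0.8 * X[:, 2] + 0.2 * rng.normal(size=n)   # toy log10(K)

X_tr, X_te, y_tr, y_te = train_test_split(X, log_K, random_state=0)
model = ARDRegression().fit(X_tr, y_tr)
print("R^2 on held-out soundings:", model.score(X_te, y_te))
```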
Operations management tools to be applied for textile
NASA Astrophysics Data System (ADS)
Maralcan, A.; Ilhan, I.
2017-10-01
In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost and utilization are illustrated first. The effect of the bottleneck on the results of a business is especially emphasized. In the next section, tools for productivity measurement, namely the KPI (Key Performance Indicator) tree, OEE (Overall Equipment Effectiveness) and takt time, are introduced and exemplified. The KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to customer demand. The KPI tree is studied through the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, quality tools, six sigma, control charts and jidoka are introduced. Six sigma is a tool to measure process capability and, in turn, the probability of a defect. The control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop and alert) is to alert people that there is a problem in the process.
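A worked example of the OEE and takt-time calculations named above, using the standard definitions; the shift length, downtime, cycle time, piece counts and demand figures are assumed for illustration.

```python
# OEE = availability x performance x quality; takt time = available time / demand.
planned_time_min = 480          # one shift (assumed)
downtime_min = 60               # unplanned stops (assumed)
ideal_cycle_time_min = 0.5      # per piece, e.g. per batch on the stenter (assumed)
total_pieces = 700
good_pieces = 665
daily_demand = 650              # customer demand per shift (assumed)

availability = (planned_time_min - downtime_min) / planned_time_min
performance = (ideal_cycle_time_min * total_pieces) / (planned_time_min - downtime_min)
quality = good_pieces / total_pieces
oee = availability * performance * quality

takt_time_min = planned_time_min / daily_demand   # available time per unit demanded

print(f"OEE = {oee:.2%}, takt time = {takt_time_min:.2f} min/piece")
```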
Machining of AISI D2 Tool Steel with Multiple Hole Electrodes by EDM Process
NASA Astrophysics Data System (ADS)
Prasad Prathipati, R.; Devuri, Venkateswarlu; Cheepu, Muralimohan; Gudimetla, Kondaiah; Uzwal Kiran, R.
2018-03-01
In recent years, with advancing technology, the demand for machining of newly developed materials is increasing. Conventional machining processes are not adequate to meet the accuracy required for these materials. Electrical discharge machining, a non-conventional process, is one of the most efficient machining processes and is widely used to machine high-accuracy products in various industries. The optimum selection of process parameters is very important in machining processes such as electrical discharge machining, as they determine the surface quality and dimensional precision of the obtained parts, even though the time consumed is higher when machining large features. In this work, D2 high carbon and chromium tool steel has been machined using electrical discharge machining with the multiple hole electrode technique. D2 steel has several applications such as forming dies, extrusion dies and thread rolling. But machining this tool steel is very hard because of its hard alloying elements of V, Cr and Mo, which enhance its strength and wear properties. However, machining is possible using the electrical discharge machining process, and the present study implemented a new technique to reduce the machining time using a multiple hole copper electrode. In this technique, while machining with the multiple hole electrode, fin-like projections are obtained, which can be removed easily by chipping. Then the finishing is done using a solid electrode. The machining time is reduced by around 50% when using the multiple hole electrode technique for electrical discharge machining.
Interaction with Machine Improvisation
NASA Astrophysics Data System (ADS)
Assayag, Gerard; Bloch, George; Cont, Arshia; Dubnov, Shlomo
We describe two multi-agent architectures for improvisation-oriented musician-machine interaction systems that learn in real time from human performers. The improvisation kernel is based on sequence modeling and statistical learning. We present two frameworks of interaction with this kernel. In the first, the stylistic interaction is guided by a human operator in front of an interactive computer environment. In the second framework, the stylistic interaction is delegated to machine intelligence and therefore knowledge propagation and decisions are handled by the computer alone. The first framework involves a hybrid architecture using two popular composition/performance environments, Max and OpenMusic, that are put to work and communicate together, each one handling the process at a different time/memory scale. The second framework shares the same representational schemes with the first but uses an Active Learning architecture based on collaborative, competitive and memory-based learning to handle stylistic interactions. Both systems are capable of processing real-time audio/video as well as MIDI. After discussing the general cognitive background of improvisation practices, the statistical modelling tools and the concurrent agent architecture are presented. Then, an Active Learning scheme is described and considered in terms of using different improvisation regimes for improvisation planning. Finally, we provide more details about the different system implementations and describe several performances with the system.
AHaH computing-from metastable switches to attractors to machine learning.
Nugent, Michael Alexander; Molter, Timothy Wesley
2014-01-01
Modern computing architecture based on the separation of memory and processing leads to a well known problem called the von Neumann bottleneck, a restrictive limit on the data bandwidth between CPU and RAM. This paper introduces a new approach to computing we call AHaH computing where memory and processing are combined. The idea is based on the attractor dynamics of volatile dissipative electronics inspired by biological systems, presenting an attractive alternative architecture that is able to adapt, self-repair, and learn from interactions with the environment. We envision that both von Neumann and AHaH computing architectures will operate together on the same machine, but that the AHaH computing processor may reduce the power consumption and processing time for certain adaptive learning tasks by orders of magnitude. The paper begins by drawing a connection between the properties of volatility, thermodynamics, and Anti-Hebbian and Hebbian (AHaH) plasticity. We show how AHaH synaptic plasticity leads to attractor states that extract the independent components of applied data streams and how they form a computationally complete set of logic functions. After introducing a general memristive device model based on collections of metastable switches, we show how adaptive synaptic weights can be formed from differential pairs of incremental memristors. We also disclose how arrays of synaptic weights can be used to build a neural node circuit operating AHaH plasticity. By configuring the attractor states of the AHaH node in different ways, high level machine learning functions are demonstrated. This includes unsupervised clustering, supervised and unsupervised classification, complex signal prediction, unsupervised robotic actuation and combinatorial optimization of procedures-all key capabilities of biological nervous systems and modern machine learning algorithms with real world application.
LTCC Thick Film Process Characterization
Girardi, M. A.; Peterson, K. A.; Vianco, P. T.
2016-05-01
Low temperature cofired ceramic (LTCC) technology has proven itself in military/space electronics, wireless communication, microsystems, medical and automotive electronics, and sensors. The use of LTCC for high frequency applications is appealing due to its low losses, design flexibility, and packaging and integration capability. We summarize the LTCC thick film process, including some unconventional process steps such as feature machining in the unfired state and thin film definition of outer layer conductors. The LTCC thick film process was characterized to optimize process yields by focusing on these factors: 1) print location, 2) print thickness, 3) drying of tapes and panels, 4) shrinkage upon firing, and 5) via topography. Statistical methods were used to analyze critical process and product characteristics in pursuit of that optimization goal.
Data Mining at NASA: From Theory to Applications
NASA Technical Reports Server (NTRS)
Srivastava, Ashok N.
2009-01-01
This slide presentation, prepared for Knowledge Discovery and Data Mining 2009, demonstrates the data mining/machine learning capabilities of NASA Ames and the Intelligent Data Understanding (IDU) group, encompassing work done recently in the group by various group members. The IDU group develops novel algorithms to detect, classify, and predict events in large data streams for scientific and engineering systems.
Big Data: Next-Generation Machines for Big Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hack, James J.; Papka, Michael E.
Addressing the scientific grand challenges identified by the US Department of Energy’s (DOE’s) Office of Science’s programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE’s most powerful leadership-class machines are set to arrive in 2017 and 2018: the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory–industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.
The 4-D approach to visual control of autonomous systems
NASA Technical Reports Server (NTRS)
Dickmanns, Ernst D.
1994-01-01
Development of a 4-D approach to dynamic machine vision is described. Core elements of this method are spatio-temporal models oriented towards objects and laws of perspective projection in a forward mode. Integration of multi-sensory measurement data was achieved through spatio-temporal models as invariants for object recognition. Situation assessment and long term predictions were allowed through maintenance of a symbolic 4-D image of processes involving objects. Behavioral capabilities were easily realized by state feedback and feed-forward control.
Travelogue--a newcomer encounters statistics and the computer.
Bruce, Peter
2011-11-01
Computer-intensive methods have revolutionized statistics, giving rise to new areas of analysis and expertise in predictive analytics, image processing, pattern recognition, machine learning, genomic analysis, and more. Interest naturally centers on the new capabilities the computer allows the analyst to bring to the table. This article, instead, focuses on the account of how computer-based resampling methods, with their relative simplicity and transparency, enticed one individual, untutored in statistics or mathematics, on a long journey into learning statistics, then teaching it, then starting an education institution.
Li, Ji; Hu, Guoqing; Zhou, Yonghong; Zou, Chong; Peng, Wei; Alam SM, Jahangir
2017-01-01
As a high performance-cost ratio solution for differential pressure measurement, piezo-resistive differential pressure sensors are widely used in engineering processes. However, their performance is severely affected by the environmental temperature and the static pressure applied to them. In order to modify the non-linear measuring characteristics of the piezo-resistive differential pressure sensor, compensation actions should synthetically consider these two aspects. Advantages such as nonlinear approximation capability, highly desirable generalization ability and computational efficiency make the kernel extreme learning machine (KELM) a practical approach for this critical task. Since the KELM model is intrinsically sensitive to the regularization parameter and the kernel parameter, a searching scheme combining the coupled simulated annealing (CSA) algorithm and the Nelder-Mead simplex algorithm is adopted to find an optimal KELM parameter set. A calibration experiment at different working pressure levels was conducted within the temperature range to assess the proposed method. In comparison with other compensation models such as the back-propagation neural network (BP), radial basis function neural network (RBF), particle swarm optimization optimized support vector machine (PSO-SVM), particle swarm optimization optimized least squares support vector machine (PSO-LSSVM) and extreme learning machine (ELM), the compensation results show that the presented compensation algorithm exhibits a more satisfactory performance with respect to temperature compensation and synthetic compensation problems.
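A KELM with an RBF kernel behaves much like kernel ridge regression, so the sketch below uses KernelRidge as a stand-in and a plain Nelder-Mead search (the paper couples it with simulated annealing) to tune the regularization and kernel parameters; the sensor data and drift model are synthetic assumptions.

```python
# Stand-in sketch: kernel ridge regression tuned by Nelder-Mead on synthetic
# differential-pressure readings affected by temperature and static pressure.
import numpy as np
from scipy.optimize import minimize
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 400
raw = rng.uniform(0, 1, n)              # raw differential-pressure reading
temp = rng.uniform(-20, 60, n)          # ambient temperature, degrees C
static_p = rng.uniform(0, 10, n)        # static line pressure, MPa
true_dp = raw + 0.02 * temp / 60 + 0.03 * static_p / 10   # toy drift model
X = np.column_stack([raw, temp, static_p])

def neg_cv_r2(log_params):
    alpha, gamma = np.exp(log_params)   # keep both parameters positive
    model = make_pipeline(StandardScaler(),
                          KernelRidge(kernel="rbf", alpha=alpha, gamma=gamma))
    return -cross_val_score(model, X, true_dp, cv=5).mean()

res = minimize(neg_cv_r2, x0=np.log([1.0, 0.1]), method="Nelder-Mead")
print("best (alpha, gamma):", np.exp(res.x), "  CV R^2:", -res.fun)
```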
Quantitative assessment of the enamel machinability in tooth preparation with dental diamond burs.
Song, Xiao-Fei; Jin, Chen-Xin; Yin, Ling
2015-01-01
Enamel cutting using dental handpieces is a critical process in tooth preparation for dental restorations and treatment, but the machinability of enamel is poorly understood. This paper reports on the first quantitative assessment of the enamel machinability using computer-assisted numerical control, high-speed data acquisition, and force sensing systems. The enamel machinability, in terms of cutting forces, force ratio, cutting torque, cutting speed and specific cutting energy, was characterized in relation to enamel surface orientation, specific material removal rate and diamond bur grit size. The results show that enamel surface orientation, specific material removal rate and diamond bur grit size critically affected the enamel cutting capability. Cutting buccal/lingual surfaces resulted in significantly higher tangential and normal forces, torques and specific energy (p<0.05) but lower cutting speeds than occlusal surfaces (p<0.05). Increasing the material removal rate for high cutting efficiency using coarse burs yielded remarkable rises in cutting forces and torque (p<0.05) but significant reductions in cutting speed and specific cutting energy (p<0.05). In particular, great variations in cutting forces, torques and specific energy were observed at the specific material removal rate of 3 mm³/min/mm using coarse burs, indicating the cutting limit. This work provides fundamental data and a scientific understanding of the enamel machinability for clinical dental practice.
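For readers unfamiliar with the specific-cutting-energy quantity characterized above, the sketch below works through its standard definition (cutting power divided by volumetric removal rate); the force, speed and bur-width values are assumed, not the paper's measurements.

```python
# Specific cutting energy = cutting power / volumetric material removal rate.
Ft_N = 1.2                    # tangential cutting force, N (assumed)
v_mps = 3.5                   # cutting speed at the bur surface, m/s (assumed)
mrr_mm3_per_min_per_mm = 3.0  # specific material removal rate, as quoted above
bur_width_mm = 1.0            # engaged bur width (assumed)

power_W = Ft_N * v_mps                                        # cutting power, W
mrr_mm3_per_s = mrr_mm3_per_min_per_mm * bur_width_mm / 60.0  # volumetric rate
specific_energy_J_per_mm3 = power_W / mrr_mm3_per_s

print(f"specific cutting energy = {specific_energy_J_per_mm3:.1f} J/mm^3")
```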
The JPL Library Information Retrieval System
ERIC Educational Resources Information Center
Walsh, Josephine
1975-01-01
The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. (Author)
Trajectories of the ribosome as a Brownian nanomachine
Dashti, Ali; Schwander, Peter; Langlois, Robert; ...
2014-11-24
A Brownian machine is a tiny device that, buffeted by the random motions of molecules in its environment, is capable of exploiting these thermal motions for many of the conformational changes in its work cycle. Such machines are now thought to be ubiquitous, with the ribosome, a molecular machine responsible for protein synthesis, increasingly regarded as prototypical. We present a new analytical approach capable of determining the free-energy landscape and the continuous trajectories of molecular machines from a large number of snapshots obtained by cryogenic electron microscopy. We demonstrate this approach in the context of experimental cryogenic electron microscope images of a large ensemble of nontranslating ribosomes purified from yeast cells. The free-energy landscape is seen to contain a closed path of low energy, along which the ribosome exhibits conformational changes known to be associated with the elongation cycle. This approach allows model-free quantitative analysis of the degrees of freedom and the energy landscape underlying continuous conformational changes in nanomachines, including those important for biological function.
Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider
NASA Astrophysics Data System (ADS)
Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.
2010-03-01
In this paper, a critical modification to an affordable polymer-based split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and the microinjection molding process. Specifically, the original and the modified design were numerically simulated using the commercial finite element method software ANSYS CFX to assist the re-design of the micromixers. The simulation results show that both designs are capable of performing mixing, while the modified design has a much improved performance. Mixing experiments with two different fluids were carried out using the original and the modified mixers and again showed significantly improved mixing uniformity for the latter. The measured mixing coefficient for the original design was 0.11, and for the improved design it was 0.065. The developed manufacturing process, based on ultraprecision machining and microinjection molding for device fabrication, has the advantages of high dimensional precision, low cost and manufacturing flexibility.
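Assuming the mixing coefficient quoted above is defined as the standard deviation of the normalized concentration over a cross-section (0 meaning perfectly mixed), the sketch below shows how such a value could be computed from intensity samples; the sample distributions are synthetic stand-ins for dye-intensity measurements.

```python
# Sketch of a mixing-coefficient calculation on synthetic cross-section data.
import numpy as np

def mixing_coefficient(concentration):
    """Standard deviation of a concentration field already normalized to [0, 1]."""
    return float(np.std(concentration))

rng = np.random.default_rng(6)
poorly_mixed = np.clip(np.concatenate([rng.normal(0.2, 0.02, 500),
                                        rng.normal(0.8, 0.02, 500)]), 0, 1)
well_mixed = np.clip(rng.normal(0.5, 0.05, 1000), 0, 1)

print("segregated stream:", round(mixing_coefficient(poorly_mixed), 3))
print("well-mixed stream:", round(mixing_coefficient(well_mixed), 3))
```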
A light-stimulated synaptic device based on graphene hybrid phototransistor
NASA Astrophysics Data System (ADS)
Qin, Shuchao; Wang, Fengqiu; Liu, Yujie; Wan, Qing; Wang, Xinran; Xu, Yongbing; Shi, Yi; Wang, Xiaomu; Zhang, Rong
2017-09-01
Neuromorphic chips refer to an unconventional computing architecture that is modelled on biological brains. They are increasingly employed for processing sensory data for machine vision, context cognition, and decision making. Despite rapid advances, neuromorphic computing has remained largely an electronic technology, making it a challenge to access the superior computing features provided by photons, or to directly process vision data that has increasing importance to artificial intelligence. Here we report a novel light-stimulated synaptic device based on a graphene-carbon nanotube hybrid phototransistor. Significantly, the device can respond to optical stimuli in a highly neuron-like fashion and exhibits flexible tuning of both short- and long-term plasticity. These features combined with the spatiotemporal processability make our device a capable counterpart to today’s electrically-driven artificial synapses, with superior reconfigurable capabilities. In addition, our device allows for generic optical spike processing, which provides a foundation for more sophisticated computing. The silicon-compatible, multifunctional photosensitive synapse opens up a new opportunity for neural networks enabled by photonics and extends current neuromorphic systems in terms of system complexities and functionalities.
Techniques and potential capabilities of multi-resolutional information (knowledge) processing
NASA Technical Reports Server (NTRS)
Meystel, A.
1989-01-01
A concept of nested hierarchical (multi-resolutional, pyramidal) information (knowledge) processing is introduced for a variety of systems including data and/or knowledge bases, vision, control, and manufacturing systems, industrial automated robots, and (self-programmed) autonomous intelligent machines. A set of practical recommendations is presented using a case study of a multiresolutional object representation. It is demonstrated here that any intelligent module transforms (sometimes, irreversibly) the knowledge it deals with, and this transformation affects the subsequent computation processes, e.g., those of decision and control. Several types of knowledge transformation are reviewed. Definite conditions are analyzed, satisfaction of which is required for the organization and processing of redundant information (knowledge) in multi-resolutional systems. Providing a definite degree of redundancy is one of these conditions.
Human Classification Based on Gestural Motions by Using Components of PCA
NASA Astrophysics Data System (ADS)
Aziz, Azri A.; Wan, Khairunizam; Za'aba, S. K.; B, Shahriman A.; Adnan, Nazrul H.; H, Asyekin; R, Zuradzman M.
2013-12-01
Lately, the study of human capabilities with the aim of integrating them into machines has become a popular topic of discussion. Humans are blessed with special abilities: they can hear, see, sense, speak, think and understand each other. Giving such abilities to machines to improve human life is the researchers' aim for a better quality of life in the future. This research concentrated on human gestures, specifically arm motions, for distinguishing individuals, which led to the development of a hand gesture database. We try to differentiate human physical characteristics based on hand gestures represented by arm trajectories. Subjects were selected with different body sizes, and the acquired data then underwent a resampling process. The results discuss the classification of humans based on arm trajectories by using Principal Component Analysis (PCA).
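A minimal sketch of the pipeline described above, assuming arm trajectories are resampled to a fixed length, flattened into feature vectors, reduced with PCA and then fed to a simple classifier for subject identification; the trajectories, subject count and classifier choice are invented for illustration.

```python
# PCA-based subject identification on synthetic, fixed-length arm trajectories.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_subjects, reps, n_points = 5, 20, 50
X, y = [], []
for subject in range(n_subjects):
    offset = subject * 0.3                       # each subject moves slightly differently
    for _ in range(reps):
        t = np.linspace(0, 1, n_points)
        traj = np.sin(2 * np.pi * t + offset) + 0.05 * rng.normal(size=n_points)
        X.append(traj)                           # flattened 1-D trajectory
        y.append(subject)
X, y = np.array(X), np.array(y)

model = make_pipeline(PCA(n_components=5), KNeighborsClassifier(n_neighbors=3))
print("subject-identification CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```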
Space Spider - A concept for fabrication of large structures
NASA Technical Reports Server (NTRS)
Britton, W. R.; Johnston, J. D.
1978-01-01
The Space Spider concept for the automated fabrication of large space structures involves a specialized machine which roll-forms thin gauge material such as aluminum and develops continuous spiral structures with radial struts to sizes of 600-1,000 feet in diameter by 15 feet deep. This concept allows the machine and raw material to be integrated using the Orbiter capabilities, then boosting the rigid system to geosynchronous equatorial orbit (GEO) without high sensitivity to acceleration forces. As a teleoperator controlled device having repetitive operations, the fabrication process can be monitored and verified from a ground-based station without astronaut involvement in GEO. The resultant structure will be useful as an intermediate size platform or as a structural element to be used with other elements such as the space-fabricated beams or composite nested tubes.
NASA Astrophysics Data System (ADS)
Sun, Wenjie; Liu, Fan; Ma, Ziqi; Li, Chenghai; Zhou, Jinxiong
2017-01-01
Synergistically combining the muscle-like actuation of soft materials with the load-carrying and locomotive capability of hard mechanical components results in hybrid soft machines that can exhibit specific functions. Here, we describe the design, fabrication, modeling and experiment of a hybrid soft machine enabled by marrying a unidirectionally actuated dielectric elastomer (DE) membrane-spring system and ratchet wheels. Subjected to an applied voltage of 8.2 kV at a ramping rate of 820 V/s, the hybrid machine prototype exhibits monotonic uniaxial locomotion with an average velocity of 0.5 mm/s. The underlying physics and working mechanisms of the soft machine are verified and elucidated by finite element simulation.
Mechanization for Optimal Landscape Reclamation
NASA Astrophysics Data System (ADS)
Vondráčková, Terezie; Voštová, Věra; Kraus, Michal
2017-12-01
Reclamation is a method of ultimate utilization of land adversely affected by mining or other industrial activity. The paper explains the types of reclamation and the term “optimal reclamation”. It covers the technological options of the long-lasting process of mine dump reclamation, starting with the removal of overlying rocks, their transport and backfilling, up to the follow-up remodelling of the mine dump terrain, as well as the technological units and equipment for stripping flow division and the stripping flow solution with respect to optimal reclamation. We recommend that the application of logistic chains and mining simulation with follow-up reclamation to open-pit mines be used for the implementation of optimal reclamation. In addition to a database of local heterogeneities of the stripped soil and reclaimed land, the flow of earths should be resolved in a manner allowing the most suitable soil substrate to be created for the restoration of agricultural and forest land on mine dumps. The methodology under development addresses a number of problems, including the geological survey of overlying rocks, the extraction of strippings, their transport and backfilling in specified locations, and the follow-up deployment of goal-directed reclamation. It will make it possible to reduce the financial resources needed for the complex process chain by utilizing GIS, GPS and DGPS technologies, logistic tools and synergistic effects. When selecting machines for the transport, moving and spreading of earths, various points of view and aspects must be taken into account, such as the kind of earth to be handled by the respective construction machine, the kind of work activities to be performed, the machine’s capacity, the option to control the machine’s implement, economic aspects and clients’ requirements. All these points of view must be considered in the decision-making process so that the selected machine is capable of executing the required activity and the use of an unsuitable machine, which would result in delays and increased project costs, is eliminated. Therefore, reclamation always includes extensive earth-moving work that restores the required relief of the land being reclaimed. Using the earth-moving machine capacity, the kind of soil in mine dumps, the kind of work activity performed and the machine design, a SW application has been developed that allows the most suitable machine for the respective work technology to be selected with a view to preparing the land intended for reclamation.
NASA Astrophysics Data System (ADS)
Rose, K.; Bauer, J.; Baker, D.; Barkhurst, A.; Bean, A.; DiGiulio, J.; Jones, K.; Jones, T.; Justman, D.; Miller, R., III; Romeo, L.; Sabbatino, M.; Tong, A.
2017-12-01
As spatial datasets are increasingly accessible through open, online systems, the opportunity to use these resources to address a range of Earth system questions grows. Simultaneously, there is a need for better infrastructure and tools to find and utilize these resources. We will present examples of advanced online computing capabilities, hosted in the U.S. DOE's Energy Data eXchange (EDX), that address these needs for earth-energy research and development. In one study, the computing team developed a custom machine-learning, big-data computing tool designed to parse the web and return priority datasets to appropriate servers in order to develop an open-source global oil and gas infrastructure database. The results of this spatial smart search approach were validated against expert-driven, manual search results which had required a team of seven spatial scientists three months to produce. The custom machine learning tool parsed online, open systems, including zip files, ftp sites and other web-hosted resources, in a matter of days. The resulting resources were integrated into a geodatabase now hosted for open access via EDX. Beyond identifying and accessing authoritative, open spatial data resources, there is also a need for more efficient tools to ingest data and to perform and visualize multi-variate spatial data analyses. Within the EDX framework, there is a growing suite of processing, analytical and visualization capabilities that allow multi-user teams to work more efficiently in private, virtual workspaces. An example of these capabilities is a set of five custom spatio-temporal models and data tools that form NETL's Offshore Risk Modeling suite, which can be used to quantify oil spill risks and impacts. Coupling the data and advanced functions from EDX with these advanced spatio-temporal models has culminated in an integrated web-based decision-support tool. This platform has capabilities to identify and combine data across scales and disciplines, evaluate potential environmental, social, and economic impacts, highlight knowledge or technology gaps, and reduce uncertainty for a range of 'what if' scenarios relevant to oil spill prevention efforts. These examples illustrate EDX's growing capabilities for advanced spatial data search and analysis to support geo-data science needs.
Paz, Concepción; Conde, Marcos; Porteiro, Jacobo; Concheiro, Miguel
2017-01-01
This work introduces the use of machine vision in the massive bubble recognition process, which supports the validation of boiling models involving bubble dynamics, as well as nucleation frequency, active site density and bubble size. The two algorithms presented are meant to be run on quite standard images of the bubbling process, recorded in general-purpose boiling facilities. The recognition routines are easily adaptable to other facilities if a minimum number of precautions are taken in the setup and in the treatment of the information. Both the side and front projections of the subcooled flow-boiling phenomenon over a plain plate are covered. Once all of the intended bubbles have been located in space and time, the proper post-processing of the recorded data becomes capable of tracking each of the recognized bubbles, sketching their trajectories and size evolution, locating the nucleation sites, computing their diameters, and so on. After validating the algorithms' output against the human eye and data from other researchers, machine vision systems have been demonstrated to be a very valuable option for successfully performing the recognition process, even though the optical analysis of bubbles has not been set as the main goal of the experimental facility. PMID:28632158
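To make the bubble-recognition step above concrete, the following is a minimal sketch of one way roughly circular bubbles could be located in a grayscale frame using OpenCV's Hough circle transform. The parameter values, the frame source, and the function name are illustrative assumptions, not the authors' published routines.

```python
# Minimal sketch: detect roughly circular bubbles in a grayscale frame with OpenCV.
# Blur size, Hough thresholds, and the radius range are illustrative only.
import cv2
import numpy as np

def detect_bubbles(frame_gray):
    """Return (x, y, r) tuples for candidate bubbles in one 8-bit grayscale frame."""
    blurred = cv2.medianBlur(frame_gray, 5)          # suppress sensor noise
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=10,
        param1=100, param2=20, minRadius=2, maxRadius=40)
    if circles is None:
        return []
    return [(int(x), int(y), int(r)) for x, y, r in np.round(circles[0]).tolist()]

# Hypothetical usage: count candidates per frame to approximate nucleation activity.
# frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in sorted(glob.glob("frames/*.png"))]
# counts = [len(detect_bubbles(f)) for f in frames]
```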
Robot Lies in Health Care: When Is Deception Morally Permissible?
Matthias, Andreas
2015-06-01
Autonomous robots are increasingly interacting with users who have limited knowledge of robotics and are likely to have an erroneous mental model of the robot's workings, capabilities, and internal structure. The robot's real capabilities may diverge from this mental model to the extent that one might accuse the robot's manufacturer of deceiving the user, especially in cases where the user naturally tends to ascribe exaggerated capabilities to the machine (e.g. conversational systems in elder-care contexts, or toy robots in child care). This poses the question, whether misleading or even actively deceiving the user of an autonomous artifact about the capabilities of the machine is morally bad and why. By analyzing trust, autonomy, and the erosion of trust in communicative acts as consequences of deceptive robot behavior, we formulate four criteria that must be fulfilled in order for robot deception to be morally permissible, and in some cases even morally indicated.
Human-machine interface hardware: The next decade
NASA Technical Reports Server (NTRS)
Marcus, Elizabeth A.
1991-01-01
In order to understand where human-machine interface hardware is headed, it is important to understand where we are today, how we got there, and what our goals for the future are. As computers become faster and more capable and programs become more sophisticated, it becomes apparent that the interface hardware is the key to an exciting future in computing. How can a user effectively interact with and control a seemingly limitless array of parameters? Today, the answer is most often a limitless array of controls. The link between these controls and human sensory-motor capabilities does not utilize existing human capabilities to their full extent. Interface hardware for teleoperation and virtual environments is now facing a crossroad in design. Therefore, we as developers need to explore how the combination of interface hardware, human capabilities, and user experience can be blended to get the best performance today and in the future.
Information processing via physical soft body
Nakajima, Kohei; Hauser, Helmut; Li, Tao; Pfeifer, Rolf
2015-01-01
Soft machines have recently gained prominence due to their inherent softness and the resulting safety and resilience in applications. However, these machines also have disadvantages, as they respond with complex body dynamics when stimulated. These dynamics exhibit a variety of properties, including nonlinearity, memory, and potentially infinitely many degrees of freedom, which are often difficult to control. Here, we demonstrate that these seemingly undesirable properties can in fact be assets that can be exploited for real-time computation. Using body dynamics generated from a soft silicone arm, we show that they can be employed to emulate desired nonlinear dynamical systems. First, by using benchmark tasks, we demonstrate that the nonlinearity and memory within the body dynamics can increase the computational performance. Second, we characterize our system’s computational capability by comparing its task performance with a standard machine learning technique and identify its range of validity and limitation. Our results suggest that soft bodies are not only impressive in their deformability and flexibility but can also be potentially used as computational resources on top and for free. PMID:26014748
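The abstract above describes the physical reservoir computing idea: the body's dynamics act as fixed nonlinear features and only a linear readout is trained. The sketch below illustrates that readout step with a randomly generated stand-in for the soft-arm sensor streams and a NARMA-style target; the reservoir, target recursion, and ridge parameter are all illustrative assumptions, not the authors' experimental setup.

```python
# Illustrative reservoir-computing readout: fixed nonlinear "body" states plus a
# trained linear (ridge-regression) readout emulating a nonlinear target system.
import numpy as np

rng = np.random.default_rng(0)
T = 2000
u = rng.uniform(0.0, 0.5, T)                  # random input stream driving the "body"

# Stand-in for the soft-arm sensor readings: a small random recurrent map.
n_states = 20
W_in = rng.normal(0, 1, n_states)
W = rng.normal(0, 1, (n_states, n_states)) * 0.1
x = np.zeros((T, n_states))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + W_in * u[t])

# NARMA-2-like target the readout should emulate (coefficients illustrative).
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.4 * y[t - 1] + 0.4 * y[t - 1] * y[t - 2] + 0.6 * u[t - 1] ** 3 + 0.1

# Ridge-regression readout trained on the first half, evaluated on the second half.
split = T // 2
X_train, X_test, y_train, y_test = x[:split], x[split:], y[:split], y[split:]
ridge = 1e-6
W_out = np.linalg.solve(X_train.T @ X_train + ridge * np.eye(n_states),
                        X_train.T @ y_train)
y_pred = X_test @ W_out
nmse = np.mean((y_pred - y_test) ** 2) / np.var(y_test)
print(f"test NMSE: {nmse:.3f}")
```

The point of the design is that only `W_out` is learned; everything upstream of it would, in the physical system, be provided "for free" by the body.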
Integrating Machine Learning into Space Operations
NASA Astrophysics Data System (ADS)
Kelly, K. G.
There are significant challenges with managing activities in space, which for the scope of this paper are primarily the identification of objects in orbit, maintaining accurate estimates of the orbits of those objects, detecting changes to those orbits, warning of possible collisions between objects, and detecting anomalous behavior. The challenges come from the large amounts of data to be processed, which are often incomplete and noisy; limitations on the ability to influence objects in space; and the overall strategic importance of space to national interests. The focus of this paper is on defining an approach to leverage the improved capabilities that are possible using state-of-the-art machine learning in a way that empowers operations personnel without sacrificing the security and mission assurance associated with manual operations performed by trained personnel. There has been significant research in the development of algorithms and techniques for applying machine learning in this domain, but deploying new techniques into such a mission-critical domain is difficult and time consuming. Establishing a common framework could improve the efficiency with which new techniques are integrated into operations and the overall effectiveness at providing improvements.
The scope of additive manufacturing in cryogenics, component design, and applications
NASA Astrophysics Data System (ADS)
Stautner, W.; Vanapalli, S.; Weiss, K.-P.; Chen, R.; Amm, K.; Budesheim, E.; Ricci, J.
2017-12-01
Additive manufacturing techniques using composites or metals are rapidly gaining momentum in cryogenic applications. Small or large, complex structural components are now no longer limited to mere design studies but can now move into the production stream thanks to new machines on the market that allow for light-weight, cost optimized designs with short turnaround times. The potential for cost reductions from bulk materials machined to tight tolerances has become obvious. Furthermore, additive manufacturing opens doors and design space for cryogenic components that to date did not exist or were not possible in the past, using bulk materials along with elaborate and expensive machining processes, e.g. micromachining. The cryogenic engineer now faces the challenge to design toward those new additive manufacturing capabilities. Additionally, re-thinking designs toward cost optimization and fast implementation also requires detailed knowledge of mechanical and thermal properties at cryogenic temperatures. In the following we compile the information available to date and show a possible roadmap for additive manufacturing applications of parts and components typically used in cryogenic engineering designs.
NASA Astrophysics Data System (ADS)
Nugraheni, L.; Budayasa, I. K.; Suwarsono, S. T.
2018-01-01
The study was designed to examine the profile of metacognition of a vocational high school student of the Machine Technology program who had high ability and a field-independent cognitive style in mathematical problem solving. The design of this study was exploratory research with a qualitative approach. This research was conducted at the Machine Technology program of a vocational senior high school. The results revealed that the high-ability student with a field-independent cognitive style practiced metacognition well. This involved the three types of metacognition activities, consisting of planning, monitoring, and evaluating, at metacognition level 2 (aware use), 3 (strategic use), or 4 (reflective use) in mathematical problem solving. The metacognition practices conducted by the subject were never at metacognition level 1 (tacit use). This indicated that the participant was already aware, capable of choosing strategies, and able to reflect on their own thinking before, during, or after the process of solving mathematical problems. That is very necessary for vocational high school students of the Machine Technology program.
Seelbach, C
1995-01-01
The Colloquium on Human-Machine Communication by Voice highlighted the global technical community's focus on the problems and promise of voice-processing technology, particularly, speech recognition and speech synthesis. Clearly, there are many areas in both the research and development of these technologies that can be advanced significantly. However, it is also true that there are many applications of these technologies that are capable of commercialization now. Early successful commercialization of new technology is vital to ensure continuing interest in its development. This paper addresses efforts to commercialize speech technologies in two markets: telecommunications and aids for the handicapped. PMID:7479814
Reconfigurable Mobile System - Ground, sea and air applications
NASA Astrophysics Data System (ADS)
Lamonica, Gary L.; Sturges, James W.
1990-11-01
The Reconfigurable Mobile System (RMS) is a highly mobile data-processing unit for military users requiring real-time access to data gathered by airborne (and other) reconnaissance platforms. RMS combines high-performance computation and image processing workstations with resources for command/control/communications in a single, lightweight shelter. RMS is composed of off-the-shelf components and is easily reconfigurable to land-vehicle or shipboard versions. Mission planning, which addresses an airborne sensor platform's sensor coverage, considers aircraft/sensor capabilities in conjunction with weather, terrain, and threat scenarios. RMS's man-machine interface concept facilitates user familiarization and features icon-based function selection and windowing.
Metric Use in the Tool Industry. A Status Report and a Test of Assessment Methodology.
1982-04-20
…Weights and Measures); CIM - Computer-Integrated Manufacturing; CNC - Computer Numerical Control; DOD - Department of Defense; DODISS - DOD Index of… numerically-controlled (CNC) machines that have an inch-millimeter selection switch and a corresponding dual readout scale. … The use of both metric… satisfactorily met the demands of both domestic and foreign customers for metric machine tools by providing either metric-capable machines or NC and CNC…
A tubular flux-switching permanent magnet machine
NASA Astrophysics Data System (ADS)
Wang, J.; Wang, W.; Clark, R.; Atallah, K.; Howe, D.
2008-04-01
The paper describes a novel tubular, three-phase permanent magnet brushless machine, which combines salient features from both switched reluctance and permanent magnet machine technologies. It has no end windings and zero net radial force and offers a high power density and peak force capability, as well as the potential for low manufacturing cost. It is, therefore, eminently suitable for a variety of applications, ranging from free-piston energy converters to active vehicle suspensions.
Human factors - Man-machine symbiosis in space
NASA Technical Reports Server (NTRS)
Brown, Jeri W.
1987-01-01
The relation between man and machine in space is studied. Early spaceflight and the goal of establishing a permanent space presence are described. The need to consider the physiological, psychological, and social integration of humans for each space mission is examined. Human factors must also be considered in the design of spacecraft. The effective utilization of man and machine capabilities, and research in anthropometry and biomechanics aimed at determining the limitations of spacecrews are discussed.
3-D laser patterning process utilizing horizontal and vertical patterning
Malba, Vincent; Bernhardt, Anthony F.
2000-01-01
A process which vastly improves the 3-D patterning capability of laser pantography (computer-controlled laser direct-write patterning). The process uses commercially available electrodeposited photoresist (EDPR) to pattern 3-D surfaces. The EDPR covers the surface of a metal layer conformally, coating the vertical as well as horizontal surfaces. A laser pantograph then patterns the EDPR, which is subsequently developed in a standard, commercially available developer, leaving patterned trench areas in the EDPR. The metal layer thereunder is now exposed in the trench areas and masked in others, and thereafter can be etched to form the desired pattern (subtractive process), or can be plated with metal, followed by resist stripping and removal of the remaining field metal (additive process). This improved laser pantograph process is simpler, faster, more manufacturable, and requires no micro-machining.
Fantastic Journey through Minds and Machines.
ERIC Educational Resources Information Center
Muir, Michael
Intended for learners with a basic familiarity with the Logo programming language, this manual is designed to introduce them to artificial intelligence and enhance their programming capabilities. Nine chapters discuss the following features of Logo: (1) MAZE.MASTER, a look at robots and how sensors make machines aware of their environment; (2)…
Control system software, simulation, and robotic applications
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.
NASA Astrophysics Data System (ADS)
Eilbert, Richard F.; Krug, Kristoph D.
1993-04-01
The Vivid Rapid Explosives Detection System is a true dual-energy x-ray machine employing precision x-ray data acquisition in combination with unique algorithms and massive computation capability. Data from the system's 960 detectors are digitally stored and processed by powerful supermicro-computers organized as an expandable array of parallel processors. The algorithms operate on the dual-energy attenuation image data to recognize and define objects in the milieu of the baggage contents. Each object is then systematically examined for a match to a specific effective atomic number, density, and mass threshold. Material properties are determined by comparing the relative attenuations of the 75 kVp and 150 kVp beams and electronically separating the object from its local background. Other heuristic algorithms search for specific configurations and provide additional information. The machine automatically detects explosive materials and identifies bomb components in luggage with high specificity and throughput; the x-ray dose is comparable to that of current airport x-ray machines. The machine can also be configured to find heroin, cocaine, and US currency by selecting appropriate settings on-site. Since January 1992, production units have been operationally deployed at U.S. and European airports for improved screening of checked baggage.
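As a rough illustration of the dual-energy principle mentioned above (comparing the relative attenuations of the low- and high-energy beams), the sketch below computes a per-pixel log-attenuation ratio, which tends to increase with effective atomic number, and flags pixels in an assumed band. The thresholds, function names, and the use of a simple ratio are illustrative assumptions, not the machine's proprietary algorithms.

```python
# Illustrative dual-energy material discrimination: per-pixel log-attenuation ratio.
import numpy as np

def dual_energy_ratio(i_low, i_high, i0_low, i0_high, eps=1e-6):
    """Ratio of low- to high-energy log attenuations; it rises with effective Z
    because photoelectric absorption dominates at the lower beam energy."""
    mu_low = -np.log(np.clip(i_low / i0_low, eps, 1.0))
    mu_high = -np.log(np.clip(i_high / i0_high, eps, 1.0))
    return mu_low / np.maximum(mu_high, eps)

def flag_suspect(i_low, i_high, i0_low, i0_high, band=(1.2, 1.6), min_pixels=500):
    """Boolean mask of pixels whose ratio falls in an assumed target band, kept only
    if enough pixels are flagged to suggest a sizable contiguous object."""
    r = dual_energy_ratio(i_low, i_high, i0_low, i0_high)
    mask = (r > band[0]) & (r < band[1])
    return mask if mask.sum() >= min_pixels else np.zeros_like(mask)
```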
Deterministic ion beam material adding technology for high-precision optical surfaces.
Liao, Wenlin; Dai, Yifan; Xie, Xuhui; Zhou, Lin
2013-02-20
Although ion beam figuring (IBF) provides a highly deterministic method for the precision figuring of optical components, several problems still need to be addressed, such as the limited correcting capability for mid-to-high spatial frequency surface errors and low machining efficiency for pit defects on surfaces. We propose a figuring method named deterministic ion beam material adding (IBA) technology to solve those problems in IBF. The current deterministic optical figuring mechanism, which is dedicated to removing local protuberances on optical surfaces, is enriched and developed by the IBA technology. Compared with IBF, this method can realize the uniform convergence of surface errors, where the particle transferring effect generated in the IBA process can effectively correct the mid-to-high spatial frequency errors. In addition, IBA can rapidly correct the pit defects on the surface and greatly improve the machining efficiency of the figuring process. The verification experiments are accomplished on our experimental installation to validate the feasibility of the IBA method. First, a fused silica sample with a rectangular pit defect is figured by using IBA. Through two iterations within only 47.5 min, this highly steep pit is effectively corrected, and the surface error is improved from the original 24.69 nm root mean square (RMS) to the final 3.68 nm RMS. Then another experiment is carried out to demonstrate the correcting capability of IBA for mid-to-high spatial frequency surface errors, and the final results indicate that the surface accuracy and surface quality can be simultaneously improved.
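The convergence figure quoted above (24.69 nm RMS improved to 3.68 nm RMS) is the root-mean-square of the residual surface error map. The snippet below shows that metric on a synthetic map containing a rectangular pit; the map and its dimensions are purely illustrative, not the authors' measurement data.

```python
# Minimal sketch of the RMS surface-error metric used to report figuring convergence.
import numpy as np

def rms_error(height_map_nm):
    """RMS of a surface error map in nm, with the mean (piston) term removed."""
    residual = height_map_nm - np.mean(height_map_nm)
    return float(np.sqrt(np.mean(residual ** 2)))

# Example with a synthetic map containing a rectangular pit defect (values illustrative).
surface = np.zeros((200, 200))
surface[80:120, 90:140] -= 60.0          # a 60 nm deep pit
print(f"RMS before correction: {rms_error(surface):.2f} nm")
```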
NASA Technical Reports Server (NTRS)
Gabb, Tim; Gayda, John; Telesman, Jack
2001-01-01
The advanced powder metallurgy disk alloy ME3 was designed using statistical screening and optimization of composition and processing variables in the NASA HSR/EPM disk program to have extended durability at 1150 to 1250 °F in large disks. Scaled-up disks of this alloy were produced at the conclusion of this program to demonstrate these properties in realistic disk shapes. The objective of the UEET disk program was to assess the mechanical properties of these ME3 disks as functions of temperature, in order to estimate the maximum temperature capabilities of this advanced alloy. Scaled-up disks processed in the HSR/EPM Compressor/Turbine Disk program were sectioned, machined into specimens, and tested in tensile, creep, fatigue, and fatigue crack growth tests by NASA Glenn Research Center, in cooperation with General Electric Engine Company and Pratt & Whitney Aircraft Engines. Additional sub-scale disks and blanks were processed and tested to explore the effects of several processing variations on mechanical properties. Scaled-up disks of an advanced regional disk alloy, Alloy 10, were used to evaluate dual microstructure heat treatments. This allowed demonstration of an improved balance of properties in disks with higher strength and fatigue resistance in the bores and higher creep and dwell fatigue crack growth resistance in the rims. Results indicate the baseline ME3 alloy and process has 1300 to 1350 °F temperature capabilities, dependent on detailed disk and engine design property requirements. Chemistry and process enhancements show promise for further increasing temperature capabilities.
Design and Analysis of Linear Fault-Tolerant Permanent-Magnet Vernier Machines
Xu, Liang; Liu, Guohai; Du, Yi; Liu, Hu
2014-01-01
This paper proposes a new linear fault-tolerant permanent-magnet (PM) vernier (LFTPMV) machine, which can offer high thrust by using the magnetic gear effect. Both PMs and windings of the proposed machine are on short mover, while the long stator is only manufactured from iron. Hence, the proposed machine is very suitable for long stroke system applications. The key of this machine is that the magnetizer splits the two movers with modular and complementary structures. Hence, the proposed machine offers improved symmetrical and sinusoidal back electromotive force waveform and reduced detent force. Furthermore, owing to the complementary structure, the proposed machine possesses favorable fault-tolerant capability, namely, independent phases. In particular, differing from the existing fault-tolerant machines, the proposed machine offers fault tolerance without sacrificing thrust density. This is because neither fault-tolerant teeth nor the flux-barriers are adopted. The electromagnetic characteristics of the proposed machine are analyzed using the time-stepping finite-element method, which verifies the effectiveness of the theoretical analysis. PMID:24982959
Design and analysis of linear fault-tolerant permanent-magnet vernier machines.
Xu, Liang; Ji, Jinghua; Liu, Guohai; Du, Yi; Liu, Hu
2014-01-01
This paper proposes a new linear fault-tolerant permanent-magnet (PM) vernier (LFTPMV) machine, which can offer high thrust by using the magnetic gear effect. Both PMs and windings of the proposed machine are on short mover, while the long stator is only manufactured from iron. Hence, the proposed machine is very suitable for long stroke system applications. The key of this machine is that the magnetizer splits the two movers with modular and complementary structures. Hence, the proposed machine offers improved symmetrical and sinusoidal back electromotive force waveform and reduced detent force. Furthermore, owing to the complementary structure, the proposed machine possesses favorable fault-tolerant capability, namely, independent phases. In particular, differing from the existing fault-tolerant machines, the proposed machine offers fault tolerance without sacrificing thrust density. This is because neither fault-tolerant teeth nor the flux-barriers are adopted. The electromagnetic characteristics of the proposed machine are analyzed using the time-stepping finite-element method, which verifies the effectiveness of the theoretical analysis.
NASA Technical Reports Server (NTRS)
Carnahan, Richard S., Jr.; Corey, Stephen M.; Snow, John B.
1989-01-01
Applications of rapid prototyping and Artificial Intelligence techniques to problems associated with Space Station-era information management systems are described. In particular, the work is centered on issues related to: (1) intelligent man-machine interfaces applied to scientific data user support, and (2) the requirement that intelligent information management systems (IIMS) be able to efficiently process metadata updates concerning the types of data handled. The advanced IIMS represents functional capabilities driven almost entirely by the needs of potential users. The volume of scientific data projected to be generated in the Space Station era is likely to be significantly greater than the data currently processed and analyzed. Information about scientific data must be presented clearly, concisely, and with support features to allow users at all levels of expertise efficient and cost-effective data access. Additionally, mechanisms for allowing more efficient IIMS metadata update processes must be addressed. The work reported covers the following IIMS design aspects: IIMS data and metadata modeling, including the automatic updating of IIMS-contained metadata; IIMS user-system interface considerations, including significant problems associated with remote access, user profiles, and on-line tutorial capabilities; and development of an IIMS query and browse facility, including the capability to deal with spatial information. A working prototype has been developed and is being enhanced.
NASA Astrophysics Data System (ADS)
Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.
2017-12-01
With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise from processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also present some of our findings from applying machine learning and data analytics to the processed SAR data streams, as well as lessons learned on how to ease the SAR community onto interfacing with these cloud-based SAR science data systems.
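For readers unfamiliar with the interferogram and coherence products named above, the following is a hedged sketch of the textbook windowed estimator applied to two co-registered single-look complex (SLC) images; it is not the ARIA/ISCE implementation, and the window size and array shapes are assumptions.

```python
# Illustrative interferogram phase and coherence from two co-registered complex SLCs.
import numpy as np
from scipy.ndimage import uniform_filter

def interferogram_and_coherence(slc1, slc2, window=5):
    """Return (phase, coherence) arrays for two complex SLC arrays of equal shape."""
    ifg = slc1 * np.conj(slc2)
    # uniform_filter does not accept complex input, so filter real/imag parts separately.
    num = uniform_filter(ifg.real, window) + 1j * uniform_filter(ifg.imag, window)
    p1 = uniform_filter(np.abs(slc1) ** 2, window)
    p2 = uniform_filter(np.abs(slc2) ** 2, window)
    coherence = np.abs(num) / np.sqrt(np.maximum(p1 * p2, 1e-12))
    phase = np.angle(ifg)
    return phase, coherence
```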
UPEML: a machine-portable CDC Update emulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehlhorn, T.A.; Young, M.F.
1984-12-01
UPEML is a machine-portable CDC Update emulation program. UPEML is written in ANSI standard Fortran-77 and is relatively simple and compact. It is capable of emulating a significant subset of the standard CDC Update functions including program library creation and subsequent modification. Machine-portability is an essential attribute of UPEML. It was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX 11/780 and the IBM 3081.
Survivability Design of Ground Systems for Area Defense Operation in an Urban Scenario
2014-09-01
…similar to the M2 Bradley AFV. The .50 caliber heavy machine gun technical platoon and mounted anti-tank guided missile (ATGM) technical platoon… 7.62 machine gun. The infantry-carrying BMP-2 has applique steel armor, but its lack of inherent ERA like the T-90 makes it vulnerable against most… vehicle modified to provide an offensive capability. Characterized by its high mobility, technical vehicles are often equipped with heavy machine guns…
Laser and Surface Processes of NiTi Shape Memory Elements for Micro-actuation
NASA Astrophysics Data System (ADS)
Nespoli, Adelaide; Biffi, Carlo Alberto; Previtali, Barbara; Villa, Elena; Tuissi, Ausonio
2014-04-01
In the current field of microtechnology for actuation, shape memory alloys (SMAs) are considered one of the best candidates for the production of mini/micro devices thanks to their high power-to-weight ratio and hence their capability of generating high mechanical performance in very limited spaces. At the microscale, the most suitable conformation of an SMA actuator is a planar wavy-formed arrangement, i.e., the snake-like shape, which allows high strokes, considerable forces, and devices of very small size. This uncommon and complex geometry becomes more difficult to realize when the actuator dimensions are scaled down to micrometric values. In this work, micro snake-like actuators are laser machined using a nanosecond pulsed fiber laser, starting from a 120-μm-thick NiTi sheet. Chemical and electrochemical surface polishes are also investigated for the removal of the thermal damage of the laser process. Calorimetric and thermo-mechanical tests are performed to assess the NiTi microdevice performance after each step of the working process. It is shown that laser machining has to be followed by post-processes in order to obtain a micro-actuator with good thermo-mechanical properties.
Automated Solar Module Assembly Line
NASA Technical Reports Server (NTRS)
Bycer, M.
1979-01-01
The gathering of information that led to the design approach of the machine and a summary of the findings in the areas of study, along with a description of each station of the machine, are discussed. The machine is a cell stringing and string applique machine which is flexible in design, capable of handling a variety of cells and assembling strings of cells which can then be placed in a matrix up to 4 ft x 2 ft in series or parallel arrangement. The target machine cycle is 5 seconds per cell. This machine is primarily adapted to 100 mm round cells with one or two tabs between cells. It places finished strings of up to twelve cells in a matrix of up to six such strings arranged in series or in parallel.
A T-Type Capacitive Sensor Capable of Measuring 5-DOF Error Motions of Precision Spindles
Xiang, Kui; Qiu, Rongbo; Mei, Deqing; Chen, Zichen
2017-01-01
The precision spindle is a core component of high-precision machine tools, and the accurate measurement of its error motions is important for improving its rotation accuracy as well as the work performance of the machine. This paper presents a T-type capacitive sensor (T-type CS) with an integrated structure. The proposed sensor can measure the 5-degree-of-freedom (5-DOF) error motions of a spindle in situ and simultaneously by integrating electrode groups in the cylindrical bore of the stator and on the outer end face of its flange, respectively. Simulation analysis and experimental results show that the sensing electrode groups with a differential measurement configuration have near-linear output for the different types of rotor displacements. Moreover, the additional capacitance generated by fringe effects has been reduced by about 90% with the sensing electrode groups fabricated using flexible printed circuit board (FPCB) and related processing technologies. The improved signal processing circuit has also doubled the measurement performance and brings the measured differential output capacitance to as much as 93% of the theoretical value. PMID:28846631
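The near-linear output of the differential configuration mentioned above can be illustrated with a parallel-plate approximation: an opposed electrode pair yields capacitances that move in opposite directions as the rotor displaces, and their normalized difference equals the displacement divided by the nominal gap. The geometry values and function name below are hypothetical, not the sensor's actual dimensions.

```python
# Illustrative differential capacitive displacement readout (parallel-plate model).
EPS0 = 8.854e-12          # vacuum permittivity, F/m

def differential_displacement(c1, c2, nominal_gap_m):
    """Estimate rotor displacement d from an opposed electrode pair.
    With C1 = eps*A/(g - d) and C2 = eps*A/(g + d), (C1 - C2)/(C1 + C2) = d/g."""
    return nominal_gap_m * (c1 - c2) / (c1 + c2)

# Example: 50 mm^2 electrodes, 0.2 mm nominal gap, rotor offset of 5 um.
area, gap, d = 50e-6, 0.2e-3, 5e-6
c1 = EPS0 * area / (gap - d)
c2 = EPS0 * area / (gap + d)
print(f"recovered displacement: {differential_displacement(c1, c2, gap) * 1e6:.2f} um")
```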
Distributed decision support for the 21st century mission space
NASA Astrophysics Data System (ADS)
McQuay, William K.
2002-07-01
The past decade has produced significant changes in the conduct of military operations: increased humanitarian missions, asymmetric warfare, reliance on coalitions and allies, stringent rules of engagement, concern about casualties, and the need for sustained air operations. Future mission commanders will need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Integral to this process is creating situational assessment: understanding the mission space, simulation to analyze alternative futures, current capabilities, planning assessments, course-of-action assessments, and a common operational picture that keeps everyone on the same sheet of paper. Decision support tools in a distributed collaborative environment offer the capability of decomposing these complex multitask processes and distributing them over a dynamic set of execution assets. Decision support technologies can semi-automate activities that have a reasonably well-defined process, such as planning an operation, and provide machine-level interfaces to refine the myriad of information that is not currently fused. The marriage of information and simulation technologies provides the mission commander with a collaborative virtual environment for planning and decision support.
New tool holder design for cryogenic machining of Ti6Al4V
NASA Astrophysics Data System (ADS)
Bellin, Marco; Sartori, Stefano; Ghiotti, Andrea; Bruschi, Stefania
2017-10-01
The renewed demand for increasing the machinability of the Ti6Al4V titanium alloy to produce biomedical and aerospace parts working at high temperature has recently led to the application of low-temperature coolants instead of conventional cutting fluids to increase both the tool life and the machined surface integrity. In particular, liquid nitrogen directed to the tool rake face has shown a great capability of reducing the temperature at the chip-tool interface, as well as the chemical interaction between the tool coating and the titanium being machined, therefore limiting the tool crater wear and, at the same time, improving the chip breakability. Furthermore, nitrogen is a safe, non-harmful, non-corrosive, odorless, recyclable, non-polluting and abundant gas, characteristics that further qualify it as an environmentally friendly coolant for machining processes. However, the behavior of the system composed of the tool and the tool holder when exposed to cryogenic temperatures may represent a critical issue in obtaining components within the required geometrical tolerances. On this basis, the paper presents the design of an innovative tool holder installed on a CNC lathe, which includes the cryogenic coolant provision system and which is able to hinder possible part distortions due to the liquid nitrogen adduction by stabilizing its dimensions through the use of heating cartridges and appropriate sensors to monitor the temperature evolution of the tool holder.
Basic design principles of colorimetric vision systems
NASA Astrophysics Data System (ADS)
Mumzhiu, Alex M.
1998-10-01
Color measurement is an important part of overall production quality control in the textile, coating, plastics, food, paper and other industries. The color measurement instruments used for production quality control, such as colorimeters and spectrophotometers, have many limitations. In many applications they cannot be used for a variety of reasons and have to be replaced with human operators. Machine vision has great potential for color measurement. The components for color machine vision systems, such as broadcast-quality 3-CCD cameras, fast and inexpensive PCI frame grabbers, and sophisticated image processing software packages, are available. However, the machine vision industry has only started to approach the color domain. The few color machine vision systems on the market, produced by the largest machine vision manufacturers, have very limited capabilities. A lack of understanding that a vision-based color measurement system could fail if it ignores the basic principles of colorimetry is the main reason for the slow progress of color vision systems. The purpose of this paper is to clarify how color measurement principles have to be applied to vision systems and how the electro-optical design features of colorimeters have to be modified in order to implement them in vision systems. The subject of this presentation far exceeds the limitations of a journal paper, so only the most important aspects will be discussed. An overview of the major areas of application for colorimetric vision systems will be given. Finally, the reasons why some customers are happy with their vision systems and some are not will be analyzed.
Requirements for fault-tolerant factoring on an atom-optics quantum computer.
Devitt, Simon J; Stephens, Ashley M; Munro, William J; Nemoto, Kae
2013-01-01
Quantum information processing and its associated technologies have reached a pivotal stage in their development, with many experiments having established the basic building blocks. Moving forward, the challenge is to scale up to larger machines capable of performing computational tasks not possible today. This raises questions that need to be urgently addressed, such as what resources these machines will consume and how large will they be. Here we estimate the resources required to execute Shor's factoring algorithm on an atom-optics quantum computer architecture. We determine the runtime and size of the computer as a function of the problem size and physical error rate. Our results suggest that once the physical error rate is low enough to allow quantum error correction, optimization to reduce resources and increase performance will come mostly from integrating algorithms and circuits within the error correction environment, rather than from improving the physical hardware.
LeMoyne, Robert; Mastroianni, Timothy
2016-08-01
Natural gait consists of synchronous and rhythmic patterns for both the lower and upper limbs. People with hemiplegia can experience reduced arm swing, which can negatively impact the quality of gait. Wearable and wireless sensors, such as a smartphone, have demonstrated the ability to quantify various features of gait. With a software application, the smartphone (iPhone) can function as a wireless gyroscope platform capable of conveying a gyroscope signal recording as an email attachment by wireless connectivity to the Internet. The gyroscope signal recordings of the affected hemiplegic arm with reduced arm swing and of the unaffected arm are post-processed into a feature set for machine learning. Using a multilayer perceptron neural network, a considerable degree of classification accuracy is attained in distinguishing between the affected hemiplegic arm with reduced arm swing and the unaffected arm.
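The classification step described above can be sketched as follows: summary features are extracted from each gyroscope recording and fed to a multilayer perceptron. The feature set, synthetic data, and network size are illustrative assumptions; the authors' actual feature extraction and data are not reproduced here.

```python
# Hedged sketch: hand-crafted gyroscope features classified by a multilayer perceptron.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def gyro_features(signal):
    """Simple summary features of one single-axis gyroscope recording."""
    return [np.mean(signal), np.std(signal), np.max(signal) - np.min(signal),
            np.mean(np.abs(np.diff(signal)))]

# Synthetic stand-in data: "affected" recordings with reduced swing amplitude.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 1000)
recordings, labels = [], []
for i in range(40):
    amp = 0.4 if i % 2 else 1.0               # reduced amplitude ~ affected arm
    sig = amp * np.sin(2 * np.pi * 1.0 * t) + 0.05 * rng.normal(size=t.size)
    recordings.append(gyro_features(sig))
    labels.append(i % 2)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
scores = cross_val_score(clf, np.array(recordings), np.array(labels), cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```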
In situ surface roughness measurement using a laser scattering method
NASA Astrophysics Data System (ADS)
Tay, C. J.; Wang, S. H.; Quan, C.; Shang, H. M.
2003-03-01
In this paper, the design and development of an optical probe for in situ measurement of surface roughness are discussed. Based on the light scattering principle, the probe, which consists of a laser diode, a measuring lens and a linear photodiode array, is designed to capture the scattered light from a test surface over a relatively large scattering angle ϕ (=28°). This capability increases the measuring range and enhances the repeatability of the results. The coaxial arrangement, which incorporates a dual laser beam and a constant compressed air stream, renders the proposed system insensitive to movement or vibration of the test surface as well as to surface conditions. Tests were conducted on workpieces mounted on a turning machine operating at different cutting speeds. Test specimens which underwent different machining processes and had different surface finishes were also studied. The results obtained demonstrate the feasibility of surface roughness measurement using the proposed method.
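As background to how scattered-light data can be turned into a roughness number, the sketch below uses the common total-integrated-scatter (TIS) relation for optically smooth surfaces, sigma ≈ (λ / (4π cos θ)) · sqrt(I_diffuse / I_total). The probe described above relates its captured scattering distribution to roughness through its own design and calibration; the relation and the wavelength default here are shown only for illustration.

```python
# Illustrative RMS roughness estimate from the total-integrated-scatter relation.
import numpy as np

def rms_roughness_from_tis(i_diffuse, i_total, wavelength_nm=650.0, incidence_deg=0.0):
    """Estimate RMS roughness (nm) from diffuse and total reflected intensities."""
    tis = i_diffuse / i_total
    return wavelength_nm * np.sqrt(tis) / (4.0 * np.pi * np.cos(np.radians(incidence_deg)))

# Example: 2% of the reflected light scattered diffusely at normal incidence.
print(f"sigma ~ {rms_roughness_from_tis(0.02, 1.0):.1f} nm")
```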
Accelerometry-based classification of human activities using Markov modeling.
Mannini, Andrea; Sabatini, Angelo Maria
2011-01-01
Accelerometers are a popular choice as body-motion sensors: the reason lies partly in their capability of extracting information that is useful for automatically inferring the physical activity in which the human subject is involved, besides their role in feeding estimators of biomechanical parameters. Automatic classification of human physical activities is highly attractive for pervasive computing systems, where contextual awareness may ease human-machine interaction, and in biomedicine, where wearable sensor systems are proposed for long-term monitoring. This paper is concerned with the machine learning algorithms needed to perform the classification task. Hidden Markov Model (HMM) classifiers are studied by contrasting them with Gaussian Mixture Model (GMM) classifiers. HMMs incorporate the statistical information available on movement dynamics into the classification process, without discarding the time history of previous outcomes as GMMs do. An example of the benefits of the obtained statistical leverage is illustrated and discussed by analyzing two datasets of accelerometer time series.
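To make the HMM-versus-GMM contrast above concrete, the sketch below fits one generative model per activity and classifies a test sequence by maximum log-likelihood: the HMM uses the ordering of the accelerometer samples, while the GMM ignores it. The model sizes, the hmmlearn/scikit-learn choice, and the synthetic data are illustrative assumptions, not the paper's experimental setup.

```python
# Hedged sketch: per-class Gaussian HMMs vs. per-class GMMs for activity classification.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.mixture import GaussianMixture

def fit_per_class(sequences, model_factory):
    """Fit one generative model per activity; sequences maps label -> (N, d) array."""
    return {label: model_factory().fit(X) for label, X in sequences.items()}

def classify(models, X):
    """Assign the label whose model gives the highest log-likelihood for sequence X."""
    return max(models, key=lambda label: models[label].score(X))

# Synthetic accelerometer-like training sequences for two activities.
rng = np.random.default_rng(0)
train = {"walk": rng.normal(0.0, 1.0, (300, 3)),
         "rest": rng.normal(0.0, 0.1, (300, 3))}

hmms = fit_per_class(train, lambda: GaussianHMM(n_components=3, n_iter=20))
gmms = fit_per_class(train, lambda: GaussianMixture(n_components=3, max_iter=20))

test = rng.normal(0.0, 0.1, (100, 3))          # looks like "rest"
print("HMM decision:", classify(hmms, test))
print("GMM decision:", classify(gmms, test))
```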
Development of assembly and joint concepts for erectable space structures
NASA Technical Reports Server (NTRS)
Jacquemin, G. G.; Bluck, R. M.; Grotbeck, G. H.; Johnson, R. R.
1980-01-01
The technology associated with the on-orbit assembly of tetrahedral truss platforms erected from graphite epoxy tapered columns is examined. Associated with the assembly process is the design and fabrication of nine-member node joints. Two such joints demonstrating somewhat different technology were designed and fabricated. Two methods of automatic assembly using the node designs were investigated, and the time of assembly of tetrahedral truss structures up to 1 square km in size was estimated. The effect of column and node joint packaging on the Space Shuttle cargo bay is examined. A brief discussion is included of operating cost considerations and the selection of energy sources. Consideration was given to the design of assembly machines from 5 m to 20 m. The smaller machines, mounted on the Space Shuttle, are deployable and restowable. They provide a means of demonstrating the capabilities of the concept and of erecting small specialized platforms on relatively short notice.
NASA Technical Reports Server (NTRS)
Simpson, M. L.; Sayler, G. S.; Fleming, J. T.; Applegate, B.
2001-01-01
The ability to manipulate systems on the molecular scale naturally leads to speculation about the rational design of molecular-scale machines. Cells might be the ultimate molecular-scale machines and our ability to engineer them is relatively advanced when compared with our ability to control the synthesis and direct the assembly of man-made materials. Indeed, engineered whole cells deployed in biosensors can be considered one of the practical successes of molecular-scale devices. However, these devices explore only a small portion of cellular functionality. Individual cells or self-organized groups of cells perform extremely complex functions that include sensing, communication, navigation, cooperation and even fabrication of synthetic nanoscopic materials. In natural systems, these capabilities are controlled by complex genetic regulatory circuits, which are only partially understood and not readily accessible for use in engineered systems. Here, we focus on efforts to mimic the functionality of man-made information-processing systems within whole cells.
NASA Astrophysics Data System (ADS)
Sui, Yi; Zheng, Ping; Cheng, Luming; Wang, Weinan; Liu, Jiaqi
2017-05-01
A single-phase axially-magnetized permanent-magnet (PM) oscillating machine, which can be integrated with a free-piston Stirling engine to generate electric power, is investigated for miniature aerospace power sources. The machine structure, operating principle and detent force characteristic are elaborately studied. With the sinusoidal speed characteristic of the mover considered, the proposed machine is designed by 2D finite-element analysis (FEA), and some main structural parameters, such as the air gap diameter, dimensions of the PMs, pole pitches of both stator and mover, and the pole-pitch combinations, are optimized to improve both the power density and force capability. Compared with three-phase PM linear machines, the proposed single-phase machine features less PM use, simple control and low controller cost. The power density of the proposed machine is higher than that of the three-phase radially-magnetized PM linear machine, but lower than that of the three-phase axially-magnetized PM linear machine.
Towards a generalized energy prediction model for machine tools
Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H.; Dornfeld, David A.; Helu, Moneer; Rachuri, Sudarsan
2017-01-01
Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process. PMID:28652687
Towards a generalized energy prediction model for machine tools.
Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H; Dornfeld, David A; Helu, Moneer; Rachuri, Sudarsan
2017-04-01
Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process.
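The two records above describe the same data-driven modeling step: a Gaussian Process regressor mapping machining parameters to energy consumption, with its predictive standard deviation used as an uncertainty interval. The sketch below illustrates that idea with scikit-learn; the features, kernel choice, synthetic data, and units are illustrative assumptions, not the authors' Mori Seiki NVD1500 dataset.

```python
# Hedged sketch: Gaussian Process regression for machining energy with uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic training data: [feed (mm/min), spindle speed (rpm), depth of cut (mm)].
rng = np.random.default_rng(42)
X = rng.uniform([100, 1000, 0.2], [800, 8000, 2.0], size=(60, 3))
energy = 0.002 * X[:, 0] + 0.0005 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 0.2, 60)

kernel = RBF(length_scale=[300, 3000, 1.0]) + WhiteKernel(noise_level=0.05)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, energy)

# Predict energy (and an uncertainty interval) for a candidate operation.
candidate = np.array([[400, 4000, 1.0]])
mean, std = gp.predict(candidate, return_std=True)
print(f"predicted energy: {mean[0]:.2f} +/- {2 * std[0]:.2f} (arbitrary units)")
```

During process planning, a model of this kind could be queried over a grid of candidate parameters and the lowest-energy setting that still meets quality constraints selected, which is the optimization use the abstracts point to.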
3D Printing In Zero-G ISS Technology Demonstration
NASA Technical Reports Server (NTRS)
Werkheiser, Niki; Cooper, Kenneth; Edmunson, Jennifer; Dunn, Jason; Snyder, Michael
2014-01-01
The National Aeronautics and Space Administration (NASA) has a long term strategy to fabricate components and equipment on-demand for manned missions to the Moon, Mars, and beyond. To support this strategy, NASA and Made in Space, Inc. are developing the 3D Printing In Zero-G payload as a Technology Demonstration for the International Space Station (ISS). The 3D Printing In Zero-G experiment ('3D Print') will be the first machine to perform 3D printing in space. The greater the distance from Earth and the longer the mission duration, the more difficult resupply becomes; this requires a change from the current spares, maintenance, repair, and hardware design model that has been used on the International Space Station (ISS) up until now. Given the extension of the ISS Program, which will inevitably result in replacement parts being required, the ISS is an ideal platform to begin changing the current model for resupply and repair to one that is more suitable for all exploration missions. 3D Printing, more formally known as Additive Manufacturing, is the method of building parts/objects/tools layer-by-layer. The 3D Print experiment will use extrusion-based additive manufacturing, which involves building an object out of plastic deposited by a wire-feed via an extruder head. Parts can be printed from data files loaded on the device at launch, as well as additional files uplinked to the device while on-orbit. The plastic extrusion additive manufacturing process is a low-energy, low-mass solution to many common needs on board the ISS. The 3D Print payload will serve as the ideal first step to proving that process in space. It is unreasonable to expect NASA to launch large blocks of material from which parts or tools can be traditionally machined, and even more unreasonable to fly up multiple drill bits that would be required to machine parts from aerospace-grade materials such as titanium 6-4 alloy and Inconel. The technology to produce parts on demand, in space, offers unique design options that are not possible through traditional manufacturing methods while offering cost-effective, high-precision, low-unit on-demand manufacturing. Thus, Additive Manufacturing capabilities are the foundation of an advanced manufacturing in space roadmap. The 3D Printing In Zero-G experiment will demonstrate the capability of utilizing Additive Manufacturing technology in space. This will serve as the enabling first step to realizing an additive manufacturing, print-on-demand "machine shop" for long-duration missions and sustaining human exploration of other planets, where there is extremely limited ability and availability of Earth-based logistics support. Simply put, Additive Manufacturing in space is a critical enabling technology for NASA. It will provide the capability to produce hardware on-demand, directly lowering cost and decreasing risk by having the exact part or tool needed in the time it takes to print. This capability will also provide the much-needed solution to the cost, volume, and up-mass constraints that prohibit launching everything needed for long-duration or long-distance missions from Earth, including spare parts and replacement systems. A successful mission for the 3D Printing In Zero-G payload is the first step to demonstrate the capability of printing on orbit. The data gathered and lessons learned from this demonstration will be applied to the next generation of additive manufacturing technology on orbit. 
It is expected that Additive Manufacturing technology will quickly become a critical part of any mission's infrastructure.
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward
2016-01-01
In 2014, a team of researchers, engineers and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid- and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.
NASA Technical Reports Server (NTRS)
Nosenchuck, D. M.; Littman, M. G.
1986-01-01
The Navier-Stokes computer (NSC) has been developed for solving problems in fluid mechanics involving complex flow simulations that require more speed and capacity than provided by current and proposed Class VI supercomputers. The machine is a parallel processing supercomputer with several new architectural elements which can be programmed to address a wide range of problems meeting the following criteria: (1) the problem is numerically intensive, and (2) the code makes use of long vectors. A simulation of two-dimensional nonsteady viscous flows is presented to illustrate the architecture, programming, and some of the capabilities of the NSC.
NASA Astrophysics Data System (ADS)
Bonfanti, C. E.; Stewart, J.; Lee, Y. J.; Govett, M.; Trailovic, L.; Etherton, B.
2017-12-01
One of the National Oceanic and Atmospheric Administration (NOAA) goals is to provide timely and reliable weather forecasts to support important decisions when and where people need them for safety, emergencies, and planning of day-to-day activities. Satellite data are essential as initial conditions in Numerical Weather Prediction (NWP) models for areas lacking in-situ observations, such as spans of the ocean or remote areas of land. Currently only about 7% of the total received satellite data is selected for use and, of that, an even smaller percentage is ever assimilated into NWP models. With machine learning, the computational and time costs needed for satellite data selection can be greatly reduced. We study various machine learning approaches to process orders of magnitude more satellite data in significantly less time, allowing for a greater quantity and more intelligent selection of data to be used for assimilation purposes. Given the satellite launches planned in the upcoming years, machine learning can be applied for better selection of Regions of Interest (ROI) in the orders of magnitude more satellite data that will be received. This paper discusses the background of machine learning methods as applied to weather forecasting and the challenges of creating a "labeled dataset" for training and testing purposes. In the training stage of supervised machine learning, labeled data are important to identify an ROI as either true or false so that the model knows which signatures in the satellite data to identify. The authors have selected cyclones, including tropical cyclones and mid-latitude lows, as the ROI for their machine learning purposes and created a labeled dataset of true or false ROI from Global Forecast System (GFS) reanalysis data. A dataset like this does not yet exist, and given the need for a high quantity of samples, it was decided that this was best done with automation. This was accomplished by developing a program, similar to the National Centers for Environmental Prediction (NCEP) tropical cyclone tracker by Marchok, that identifies cyclones based on their physical characteristics. We will discuss the methods and challenges of creating this dataset and its use for our current supervised machine learning model, as well as for future work on events such as convection initiation.
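The automated labeling idea above can be illustrated with a much simpler stand-in than the NCEP-style tracker the authors adapted: scan a sea-level pressure field from reanalysis for closed lows and record them as candidate cyclone regions of interest. The thresholds, window size, and function name below are illustrative assumptions only.

```python
# Hedged sketch: flag closed sea-level-pressure minima as candidate cyclone ROIs.
import numpy as np
from scipy.ndimage import minimum_filter

def cyclone_candidates(mslp_hpa, lat, lon, window=9, max_center=1000.0, min_depth=2.0):
    """Return (lat, lon, center pressure) for grid points that look like closed lows.
    mslp_hpa is a 2D (lat, lon) field; lat and lon are 1D coordinate arrays."""
    local_min = mslp_hpa == minimum_filter(mslp_hpa, size=window)
    candidates = []
    for i, j in zip(*np.where(local_min)):
        center = mslp_hpa[i, j]
        i0, i1 = max(i - window, 0), i + window + 1
        j0, j1 = max(j - window, 0), j + window + 1
        depth = mslp_hpa[i0:i1, j0:j1].mean() - center   # how far below its surroundings
        if center < max_center and depth > min_depth:
            candidates.append((float(lat[i]), float(lon[j]), float(center)))
    return candidates
```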
Metrology for the manufacturing of freeform optics
NASA Astrophysics Data System (ADS)
Blalock, Todd; Myer, Brian; Ferralli, Ian; Brunelle, Matt; Lynch, Tim
2017-10-01
Recently, the use of freeform surfaces has become a realization for optical designers. These non-symmetrical optical surfaces have allowed unique solutions to optical design problems. The implementation of freeform optical surfaces has been limited by manufacturing capability and quality; however, over the past several years freeform fabrication processes have improved in both capability and precision. As with any manufacturing, proper metrology is required to monitor and verify the process. Typical optics metrology such as interferometry has its challenges and limitations with the unique shapes of freeform optics. Two contact methods for freeform metrology are presented: a Leitz coordinate measurement machine (CMM) with an uncertainty of +/- 0.5 μm and a high resolution profilometer (Panasonic UA3P) with a measurement uncertainty of +/- 0.05 μm. We are also developing a non-contact high resolution technique based on the fringe reflection technique known as deflectometry. This fast non-contact metrology has the potential to compete with the accuracies of the contact methods while acquiring data in seconds rather than minutes or hours.
Advances in Machine Technology.
Clark, William R; Villa, Gianluca; Neri, Mauro; Ronco, Claudio
2018-01-01
Continuous renal replacement therapy (CRRT) machines have evolved over the past 40 years into devices specifically designed for the critically ill. In this chapter, a brief history of this evolution is first provided, with emphasis on the manner in which changes have been made to address the specific needs of the critically ill patient with acute kidney injury. Subsequently, specific examples of technology developments for CRRT machines are discussed, including the user interface, pumps, pressure monitoring, safety features, and anticoagulation capabilities. © 2018 S. Karger AG, Basel.
2013-11-01
machine learning techniques used in BBAC to make predictions about the intent of actors establishing TCP connections and issuing HTTP requests. We discuss pragmatic challenges and solutions we encountered in implementing and evaluating BBAC, discussing (a) the general concepts underlying BBAC, (b) challenges we have encountered in identifying suitable datasets, (c) mitigation strategies to cope...and describe current plans for transitioning BBAC capabilities into the Department of Defense together with lessons learned for the machine learning
Critical Technology Assessment of Five Axis Simultaneous Control Machine Tools
2009-07-01
assessment, BIS specifically examined: • The application of Export Control Classification Numbers (ECCN) 2B001.b.2 and 2B001.c.2 controls and related...availability of certain five axis simultaneous control mills, mill/turns, and machining centers controlled by ECCN 2B001.b.2 (but not grinders controlled by... ECCN 2B001.c.2) exists to China and Taiwan, which both have an indigenous capability to produce five axis simultaneous control machine tools with
2015-07-01
annex. Self-defense testing was limited to structural test firing from each machine gun mount and an ammunition resupply drill. Robust self...provided in the classified annex. Self-defense testing was limited to structural test firing from each machine gun mount and a single...Caliber Machine Gun Mount Structural Test Fire November 2014 San Diego, Offshore Ship Weapons Range Operating Independently Section Three
Machine translation project alternatives analysis
NASA Technical Reports Server (NTRS)
Bajis, Catherine J.; Bedford, Denise A. D.
1993-01-01
The Machine Translation Project consists of several components, two of which, the Project Plan and the Requirements Analysis, have already been delivered. The Project Plan details the overall rationale, objectives and time-table for the project as a whole. The Requirements Analysis compares a number of available machine translation systems, their capabilities, possible configurations, and costs. The Alternatives Analysis has resulted in a number of conclusions and recommendations to the NASA STI program concerning the acquisition of specific MT systems and related hardware and software.
NASA Astrophysics Data System (ADS)
Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad
2017-11-01
Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the application of modern technology and by the machining parameters. Modern technology can be applied through CNC machining; one of the machining processes that can be performed on a CNC machine is turning. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to select machining parameters that minimize both the processing time and the environmental impact. This research developed a multi-objective optimization model to minimize the processing time and environmental impact in a CNC turning process, yielding optimal values of the decision variables cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
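As a rough illustration of the kind of model described, the following sketch combines the standard single-pass turning time formula with a notional energy-based environmental score in a weighted-sum objective over cutting speed and feed rate; the workpiece dimensions, power figure, and weights are assumptions, and the paper's actual eco-indicator 99 conversion and OptQuest solution are not reproduced here.

```python
# Minimal sketch (assumptions, not the paper's model): turning time from the
# standard formula t = pi*D*L / (1000*v*f), plus a notional environmental
# score proportional to machining energy, combined as a weighted sum.
import math

def turning_time_min(D_mm, L_mm, v_m_min, f_mm_rev):
    """Machining time for one straight turning pass."""
    return math.pi * D_mm * L_mm / (1000.0 * v_m_min * f_mm_rev)

def objective(v, f, D=50.0, L=120.0, power_kW=4.0, w_time=0.5, w_env=0.5):
    t = turning_time_min(D, L, v, f)
    # Hypothetical environmental burden: proportional to energy (kWh);
    # a real study would convert this via eco-indicator 99 points.
    env = power_kW * t / 60.0
    return w_time * t + w_env * env

# Crude grid search over cutting speed (m/min) and feed (mm/rev).
candidates = [(v, f) for v in range(100, 301, 25) for f in (0.1, 0.2, 0.3)]
best = min(candidates, key=lambda p: objective(*p))
print("best (v, f):", best, "objective:", round(objective(*best), 4))
```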
Means and method of balancing multi-cylinder reciprocating machines
Corey, John A.; Walsh, Michael M.
1985-01-01
A virtual balancing axis arrangement is described for multi-cylinder reciprocating piston machines that effectively balances out imbalanced forces and minimizes residual imbalance moments acting on the crankshaft of such machines without requiring the use of additional parallel-arrayed balancing shafts or complex and expensive gear arrangements. The novel virtual balancing axis arrangement can be designed into multi-cylinder reciprocating piston and crankshaft machines to substantially reduce vibrations induced during operation, with only a minimal number of additional component parts. Some of the required component parts may already be available from parts required for the operation of auxiliary equipment, such as the oil and water pumps used in certain types of reciprocating piston and crankshaft machines, so that with appropriate location and dimensioning in accordance with the teachings of the invention, the virtual balancing axis arrangement can be built into the machine at little or no additional cost.
Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro
2018-05-09
Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.
Creating Situational Awareness in Spacecraft Operations with the Machine Learning Approach
NASA Astrophysics Data System (ADS)
Li, Z.
2016-09-01
This paper presents a machine learning approach for the situational awareness capability in spacecraft operations. There are two types of time dependent data patterns for spacecraft datasets: the absolute time pattern (ATP) and the relative time pattern (RTP). The machine learning captures the data patterns of the satellite datasets through data training during normal operations, represented by a time dependent trend. The data monitoring compares the values of the incoming data with the predictions of the machine learning algorithm, which can detect any meaningful changes to a dataset above the noise level. If the difference between the value of incoming telemetry and the machine learning prediction is larger than a threshold defined by the standard deviation of the dataset, it could indicate a potential anomaly that needs special attention. The application of the machine-learning approach to the Advanced Himawari Imager (AHI) on the Japanese Himawari spacecraft series is presented; the AHI has the same configuration as the Advanced Baseline Imager (ABI) on the Geostationary Operational Environmental Satellite (GOES)-R series. The time dependent trends generated by the data-training algorithm are in excellent agreement with the datasets. The standard deviation in the time dependent trend provides a metric for measuring data quality, which is particularly useful in evaluating detector quality for both AHI and ABI, which have multiple detectors in each channel. The machine-learning approach creates the situational awareness capability, enables engineers to handle a data volume that would have been impossible with the existing approach, and leads to significant advances toward more dynamic, proactive, and autonomous spacecraft operations.
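A minimal sketch of the monitoring step described above follows, assuming the trend has already been learned; the trend shape, noise level, and the choice of a four-standard-deviation threshold are illustrative assumptions rather than the paper's values.

```python
# Minimal sketch of residual-based telemetry monitoring (assumed details):
# compare incoming telemetry with a trend learned from normal operations and
# flag points whose residual exceeds k standard deviations.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1000)
trend = 20.0 + 0.01 * t + 2.0 * np.sin(2 * np.pi * t / 100)  # learned trend (stand-in)
telemetry = trend + rng.normal(scale=0.3, size=t.size)
telemetry[700] += 5.0  # injected anomaly for illustration

residual = telemetry - trend
sigma = residual[:500].std()          # noise level estimated from a training span
k = 4.0                               # threshold in standard deviations (assumption)
anomalies = np.where(np.abs(residual) > k * sigma)[0]
print("flagged samples:", anomalies)
```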
Does human cognition allow Human Factors (HF) certification of advanced aircrew systems?
NASA Technical Reports Server (NTRS)
Macleod, Iain S.; Taylor, Robert M.
1994-01-01
This paper has examined the requirements of HF specification and certification within advanced or complex aircrew systems. It suggests reasons for current inadequacies in the use of HF in the design process, gives some examples in support, and suggests an avenue towards the improvement of the HF certification process. The importance of human cognition to the operation and performance of advanced aircrew systems has been stressed. Many of the shortfalls of advanced aircrew systems must be attributed to over-automated designs that show little consideration of either the mental limits or the cognitive capabilities of the human system component. Traditional approaches to system design and HF certification are set within an overly physicalistic foundation. Traditionally, it was also assumed that physicalistic system functions could be attributed to either the human or the machine on a one-to-one basis, and any problems associated with the parallel need to promote human understanding alongside system operation and direction were generally accommodated in practice by the natural flexibility and adaptability of human skills. The consideration of the human component of a complex system is seen as being based primarily on manifestations of human behavior, to the almost total exclusion of any appreciation of unobservable human mental and cognitive processes. The argument of this paper is that the considered functionality of any complex human-machine system must contain functions that are purely human and purely cognitive. Human-machine system reliability ultimately depends on human reliability and dependability and, therefore, on the form and frequency of the cognitive processes that must be conducted to support system performance. The greater the demand placed by an advanced aircraft system on the human component's basic knowledge processes or cognition, rather than on skill, the more insidious the effects the human may have on that system. This paper discusses one example of an attempt to devise an improved method of specification and certification for an advanced aircrew system, that of the RN Merlin helicopter. The method is recognized to have limitations in practice, mainly associated with the late production of the system specification relative to the system development process. The need for a careful appreciation of the capabilities and support needs of human cognition within the design process of a complex man-machine system has been argued, especially in relation to the concept of system functionality. Unlike the physicalistic Fitts list, a new classification of system functionality is proposed, namely: (1) equipment - related to system equipment; (2) cognitive - related to human cognition; and (3) associated - a necessary combination of equipment and cognitive functions. This paper has not proposed a method for a fuller consideration of cognition within systems design, but has suggested the need for such a method and indicated an avenue towards its development. Finally, the HF certification of advanced aircrew systems is seen as possible only in a qualified sense until the important functions of human cognition are considered within the system design process. (This paper contains the opinions of its authors and does not necessarily reflect the standpoint of their respective organizations.)
Knowledge-based machine vision systems for space station automation
NASA Technical Reports Server (NTRS)
Ranganath, Heggere S.; Chipman, Laure J.
1989-01-01
Computer vision techniques which have the potential for use on the space station and related applications are assessed. A knowledge-based vision system (expert vision system) and the development of a demonstration system for it are described. This system implements some of the capabilities that would be necessary in a machine vision system for the robot arm of the laboratory module in the space station. A Perceptics 9200e image processor, on a host VAXstation, was used to develop the demonstration system. In order to use realistic test images, photographs of actual space shuttle simulator panels were used. The system's capabilities of scene identification and scene matching are discussed.
INFORM: An interactive data collection and display program with debugging capability
NASA Technical Reports Server (NTRS)
Cwynar, D. S.
1980-01-01
A computer program was developed to aid assembly language programmers of mini and micro computers in solving the man-machine communication problems that exist when scaled integers are involved. In addition to producing displays of quasi-steady-state values, INFORM provides an interactive mode for debugging programs, making program patches, and modifying the displays. Auxiliary routines SAMPLE and DATAO add dynamic data acquisition and high speed dynamic display capability to the program. Programming information and flow charts to aid in implementing INFORM on various machines, together with descriptions of all supportive software, are provided. Program modifications to satisfy individual users' needs are considered.
TheHiveDB image data management and analysis framework.
Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew
2014-01-06
The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via theHiveDB activity system algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publically available Alzheimer Disease Neuroimaging Initiative.
Advancements in Binder Systems for Solid Freeform Fabrication
NASA Technical Reports Server (NTRS)
Cooper, Ken; Munafo, Paul (Technical Monitor)
2002-01-01
This paper will present recent developments in advanced material binder systems for solid freeform fabrication (SFF) technologies. The advantage of SFF is the capability to custom fabricate complex geometries directly from computer aided design data in a layer-by-layer fashion, eliminating the need for traditional fixturing and tooling. Binders allow for the low temperature processing of 'green' structural materials, whether metal, ceramic, or composite, in traditional rapid prototyping machines. The greatest obstacle comes when green parts must then go through a sintering or burnout process to remove the binders and fully densify the parent material without damaging or distorting the original part geometry. Critical issues and up-to-date assessments of various material systems will be delivered.
Advances in deep-UV processing using cluster tools
NASA Astrophysics Data System (ADS)
Escher, Gary C.; Tepolt, Gary; Mohondro, Robert D.
1993-09-01
Deep-UV laser lithography has shown the capability of supporting the manufacture of multiple generations of integrated circuits (ICs) due to its wide process latitude and depth of focus (DOF) for 0.2 micrometer to 0.5 micrometer feature sizes. This capability has been attained through improvements in deep-UV wide field lens technology, excimer lasers, steppers, and chemically amplified positive deep-UV resists. Chemically amplified deep-UV resists are required for 248 nm lithography due to the poor absorption and sensitivity of conventional novolac resists. The acid catalysis processes of the new resists require control of the thermal history and environmental conditions of the lithographic process. Work is currently underway at several resist vendors to reduce the need for these controls, but practical manufacturing solutions exist today. One of these solutions is the integration of steppers and resist tracks into a 'cluster tool' or 'Lithocell' to ensure a consistent thermal profile for the resist process and reduce the time the resist is exposed to atmospheric contamination. The work here reports processing and system integration results with a Machine Technology, Inc. (MTI) post-exposure bake (PEB) track interfaced with an advanced GCA XLS 7800 deep-UV stepper [31 mm diameter, variable NA (0.35 - 0.53) and variable sigma (0.3 - 0.74)].
Hybrid micromachining using a nanosecond pulsed laser and micro EDM
NASA Astrophysics Data System (ADS)
Kim, Sanha; Kim, Bo Hyun; Chung, Do Kwan; Shin, Hong Shik; Chu, Chong Nam
2010-01-01
Micro electrical discharge machining (micro EDM) is a well-known precision machining process that achieves microstructures of excellent quality in any conductive material. However, slow machining speed and high tool wear are the main drawbacks of this process. Although the use of deionized water instead of kerosene as the dielectric fluid can reduce tool wear and increase machining speed, the material removal rate (MRR) is still low. In contrast, laser ablation using a nanosecond pulsed laser is a fast, wear-free machining process but produces micro features of rather low quality. Therefore, the integration of these two processes can overcome their respective disadvantages. This paper reports a hybrid process combining a nanosecond pulsed laser and micro EDM for micromachining. A novel hybrid micromachining system that combines the two discrete machining processes is introduced. Then, the feasibility and characteristics of the hybrid machining process are investigated in comparison with conventional EDM and laser ablation. It is verified experimentally that the machining time can be effectively reduced in both EDM drilling and milling by rapid laser pre-machining prior to micro EDM. Finally, some examples of complicated 3D microstructures fabricated by the hybrid process are shown.
Effect of the Machining Processes on Low Cycle Fatigue Behavior of a Powder Metallurgy Disk
NASA Technical Reports Server (NTRS)
Telesman, J.; Kantzos, P.; Gabb, T. P.; Ghosn, L. J.
2010-01-01
A study has been performed to investigate the effect of various machining processes on the fatigue life of configured low cycle fatigue specimens machined from the NASA-developed LSHR P/M nickel-based disk alloy. Two configured specimen geometries were employed in the study. To evaluate the broach machining process, a double notch geometry was used with both notches machined using broach tooling. EDM-machined notched specimens of the same configuration were tested for comparison purposes. The honing finishing process was evaluated using a center hole specimen geometry, with comparison testing again done using EDM-machined specimens of the same geometry. The effect of these machining processes on the resulting surface roughness, residual stress distribution, and microstructural damage was characterized and used in an attempt to explain the low cycle fatigue results.
Symbolic Computation Using Cellular Automata-Based Hyperdimensional Computing.
Yilmaz, Ozgur
2015-12-01
This letter introduces a novel framework of reservoir computing that is capable of both connectionist machine intelligence and symbolic computation. A cellular automaton is used as the reservoir of dynamical systems. Input is randomly projected onto the initial conditions of automaton cells, and nonlinear computation is performed on the input via application of a rule in the automaton for a period of time. The evolution of the automaton creates a space-time volume of the automaton state space, and it is used as the reservoir. The proposed framework is shown to be capable of long-term memory, and it requires orders of magnitude less computation compared to echo state networks. As the focus of the letter, we suggest that binary reservoir feature vectors can be combined using Boolean operations as in hyperdimensional computing, paving a direct way for concept building and symbolic processing. To demonstrate the capability of the proposed system, we make analogies directly on image data by asking, What is the automobile of air?
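The sketch below illustrates the reservoir idea in its simplest form, assuming an elementary rule-90 automaton with periodic boundaries and a random binary projection of the input; the letter's actual framework (rule choice, recurrence, and the hyperdimensional combination of feature vectors) is richer than this.

```python
# Minimal sketch of a cellular-automaton reservoir (assumptions: elementary
# rule 90, random binary input projection, periodic boundaries). Only the
# state-space expansion into a space-time feature vector is shown.
import numpy as np

def ca_reservoir_features(x_bits, width=128, steps=32, seed=0):
    rng = np.random.default_rng(seed)
    # Randomly project input bits onto the initial cell configuration.
    positions = rng.choice(width, size=x_bits.size, replace=False)
    state = np.zeros(width, dtype=np.uint8)
    state[positions] = x_bits
    history = [state.copy()]
    for _ in range(steps):
        # Rule 90: next cell = XOR of left and right neighbours (periodic).
        state = np.roll(state, 1) ^ np.roll(state, -1)
        history.append(state.copy())
    # The flattened space-time volume serves as the reservoir feature vector.
    return np.concatenate(history)

x = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
print(ca_reservoir_features(x).shape)  # (width * (steps + 1),)
```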
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Mitchell T.; Johnson, Seth R.; Prokopenko, Andrey V.
With the development of a Fortran interface to Trilinos, ForTrilinos, modelers using modern Fortran will be able to give their codes the capability to use solvers and other capabilities on exascale machines via a straightforward infrastructure that accesses Trilinos. This document outlines what ForTrilinos does and explains briefly how it works. We show that it provides general access to packages via an entry point and uses an XML file from Fortran code. With the first release, ForTrilinos will enable Teuchos to take XML parameter lists from Fortran code and set up data structures. It will provide access to linear solvers and eigensolvers. Several examples are provided to illustrate the capabilities in practice. We explain what the user should already have with their code and what Trilinos provides and returns to the Fortran code. We provide information about the build process for ForTrilinos, with a practical example. In future releases, nonlinear solvers, time iteration, advanced preconditioning techniques, and inversion of control (IoC), to enable callbacks to Fortran routines, will be available.
Kernel Temporal Differences for Neural Decoding
Bae, Jihye; Sanchez Giraldo, Luis G.; Pohlmeyer, Eric A.; Francis, Joseph T.; Sanchez, Justin C.; Príncipe, José C.
2015-01-01
We study the feasibility and capability of the kernel temporal difference (KTD)(λ) algorithm for neural decoding. KTD(λ) is an online, kernel-based learning algorithm that was introduced to estimate value functions in reinforcement learning. This algorithm combines kernel-based representations with the temporal difference approach to learning. One of our key observations is that by using strictly positive definite kernels, the algorithm's convergence can be guaranteed for policy evaluation. The algorithm's nonlinear functional approximation capabilities are shown in both simulations of policy evaluation and neural decoding problems (policy improvement). KTD can handle high-dimensional neural states containing spatial-temporal information at a reasonable computational complexity, allowing real-time applications. When the algorithm seeks a proper mapping between a monkey's neural states and the desired positions of a computer cursor or a robot arm, in both open-loop and closed-loop experiments, it can effectively learn the neural-state-to-action mapping. Finally, a visualization of the coadaptation process between the decoder and the subject shows the algorithm's capabilities in reinforcement learning brain-machine interfaces. PMID:25866504
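For orientation, here is a highly simplified kernelized TD(0) value-function update on a toy problem; it omits eligibility traces, dictionary sparsification, and the policy-improvement machinery of the actual KTD(λ) algorithm, and the kernel, step size, and reward are assumptions.

```python
# Minimal sketch (assumption: simplified kernel TD(0), no eligibility traces
# or dictionary sparsification) of the kernel-based value-function update.
import numpy as np

class KernelTD0:
    def __init__(self, gamma=0.9, eta=0.2, bandwidth=1.0):
        self.gamma, self.eta, self.bw = gamma, eta, bandwidth
        self.centers, self.weights = [], []

    def _k(self, a, b):
        # Gaussian (strictly positive definite) kernel.
        return np.exp(-np.sum((a - b) ** 2) / (2 * self.bw ** 2))

    def value(self, x):
        return sum(w * self._k(c, x) for c, w in zip(self.centers, self.weights))

    def update(self, x, r, x_next):
        delta = r + self.gamma * self.value(x_next) - self.value(x)
        # Grow the kernel expansion with the current state, scaled by the TD error.
        self.centers.append(np.asarray(x, dtype=float))
        self.weights.append(self.eta * delta)
        return delta

td = KernelTD0()
rng = np.random.default_rng(0)
x = rng.normal(size=2)
for _ in range(200):
    x_next = x + rng.normal(scale=0.1, size=2)
    r = -np.linalg.norm(x_next)          # toy reward: stay near the origin
    td.update(x, r, x_next)
    x = x_next
print("estimated V(origin):", round(td.value(np.zeros(2)), 3))
```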
Investigations on high speed machining of EN-353 steel alloy under different machining environments
NASA Astrophysics Data System (ADS)
Venkata Vishnu, A.; Jamaleswara Kumar, P.
2018-03-01
The addition of nanoparticles to conventional cutting fluids enhances their cooling capability; in the present paper an attempt is made by adding nano-sized particles to conventional cutting fluids. The Taguchi robust design methodology is employed to study the performance characteristics of different turning parameters, i.e. cutting speed, feed rate, depth of cut, and type of tool, under different machining environments: dry machining, machining with lubricant SAE 40, and machining with a mixture of nano-sized boric acid particles and base fluid SAE 40. A series of turning operations was performed using an L27 (3^13) orthogonal array, considering high cutting speeds and the other machining parameters, to measure hardness. The results are compared among the different machining environments, and it is concluded that there is considerable improvement in machining performance using lubricant SAE 40 and the SAE 40 + boric acid mixture compared with dry machining. The ANOVA suggests that the selected parameters and their interactions are significant and that cutting speed has the most significant effect on hardness.
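The analysis step can be illustrated with a small sketch that computes signal-to-noise ratios per run and averages them by factor level; the larger-the-better S/N form and the hardness readings below are assumptions for illustration, not the paper's data.

```python
# Minimal sketch (assumption: larger-the-better S/N ratio, synthetic data) of
# the Taguchi analysis step: compute S/N per run and average by factor level.
import numpy as np

def sn_larger_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical: three repeated hardness readings per run, and the level of one
# factor (machining environment: 0 dry, 1 SAE 40, 2 SAE 40 + boric acid).
runs = [
    ([52, 54, 53], 0), ([55, 56, 54], 1), ([58, 57, 59], 2),
    ([51, 53, 52], 0), ([56, 55, 57], 1), ([59, 58, 60], 2),
]
levels = {}
for readings, level in runs:
    levels.setdefault(level, []).append(sn_larger_better(readings))
for level, sn in sorted(levels.items()):
    print(f"environment level {level}: mean S/N = {np.mean(sn):.2f} dB")
```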
3D hierarchical spatial representation and memory of multimodal sensory data
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Dow, Paul A.; Huber, David J.
2009-04-01
This paper describes an efficient method and system for representing, processing and understanding multi-modal sensory data. More specifically, it describes a computational method and system for how to process and remember multiple locations in multimodal sensory space (e.g., visual, auditory, somatosensory, etc.). The multimodal representation and memory is based on a biologically-inspired hierarchy of spatial representations implemented with novel analogues of real representations used in the human brain. The novelty of the work is in the computationally efficient and robust spatial representation of 3D locations in multimodal sensory space as well as an associated working memory for storage and recall of these representations at the desired level for goal-oriented action. We describe (1) A simple and efficient method for human-like hierarchical spatial representations of sensory data and how to associate, integrate and convert between these representations (head-centered coordinate system, body-centered coordinate, etc.); (2) a robust method for training and learning a mapping of points in multimodal sensory space (e.g., camera-visible object positions, location of auditory sources, etc.) to the above hierarchical spatial representations; and (3) a specification and implementation of a hierarchical spatial working memory based on the above for storage and recall at the desired level for goal-oriented action(s). This work is most useful for any machine or human-machine application that requires processing of multimodal sensory inputs, making sense of it from a spatial perspective (e.g., where is the sensory information coming from with respect to the machine and its parts) and then taking some goal-oriented action based on this spatial understanding. A multi-level spatial representation hierarchy means that heterogeneous sensory inputs (e.g., visual, auditory, somatosensory, etc.) can map onto the hierarchy at different levels. When controlling various machine/robot degrees of freedom, the desired movements and action can be computed from these different levels in the hierarchy. The most basic embodiment of this machine could be a pan-tilt camera system, an array of microphones, a machine with arm/hand like structure or/and a robot with some or all of the above capabilities. We describe the approach, system and present preliminary results on a real-robotic platform.
2011-07-29
squad Armament: M60 7.62mm machine gun, MK19 40mm, M2 .50 cal machine gun 61 "Spartan Scout Unmanned Surface Vehicle (USV)," Defense Industry...1) RQ-8B Fire Scout helicopter (VTUAV) a) EO/IR/LD sensor and datalink relay 2) MH-60R/S helicopters a) GAU 16/19 machine gun b) AGM-114 Hellfire...60R helicopter carries a .50 caliber GAU 16/A machine gun, a crew-served, recoil operated, belt-fed, air cooled, percussion fired weapon, with a rate of fire
Clock Agreement Among Parallel Supercomputer Nodes
Jones, Terry R.; Koenig, Gregory A.
2014-04-30
This dataset presents measurements that quantify the clock synchronization time-agreement characteristics among several high performance computers including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18000 nodes in the case of the Titan machine). Time-agreement is commonly utilized by parallel programming applications and tools, distributed programming application and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.
32 CFR 655.10 - Use of radiation sources by non-Army entities on Army land (AR 385-11).
Code of Federal Regulations, 2010 CFR
2010-07-01
... radioisotope; or (5) A machine-produced ionizing-radiation source capable of producing an area, accessible to... NARM and machine-produced ionizing radiation sources, the applicant has an appropriate State... 32 National Defense 4 2010-07-01 2010-07-01 true Use of radiation sources by non-Army entities on...
Conformal Predictions in Multimedia Pattern Recognition
ERIC Educational Resources Information Center
Nallure Balasubramanian, Vineeth
2010-01-01
The fields of pattern recognition and machine learning are on a fundamental quest to design systems that can learn the way humans do. One important aspect of human intelligence that has so far not been given sufficient attention is the capability of humans to express when they are certain about a decision, or when they are not. Machine learning…
49 CFR 214.525 - Towing with on-track roadway maintenance machines or hi-rail vehicles.
Code of Federal Regulations, 2010 CFR
2010-10-01
... towing would cause the machine or hi-rail vehicle to exceed the capabilities of its braking system. In determining the limit of the braking system, the employer must consider the track grade (slope), as well as... or hi-rail vehicles. 214.525 Section 214.525 Transportation Other Regulations Relating to...
49 CFR 214.525 - Towing with on-track roadway maintenance machines or hi-rail vehicles.
Code of Federal Regulations, 2014 CFR
2014-10-01
... other coupling device that provides a safe and secure attachment. (b) An on-track roadway maintenance... towing would cause the machine or hi-rail vehicle to exceed the capabilities of its braking system. In determining the limit of the braking system, the employer must consider the track grade (slope), as well as...
49 CFR 214.525 - Towing with on-track roadway maintenance machines or hi-rail vehicles.
Code of Federal Regulations, 2013 CFR
2013-10-01
... other coupling device that provides a safe and secure attachment. (b) An on-track roadway maintenance... towing would cause the machine or hi-rail vehicle to exceed the capabilities of its braking system. In determining the limit of the braking system, the employer must consider the track grade (slope), as well as...
DOT National Transportation Integrated Search
1973-01-01
Experiments were performed with the British Klarcrete machine on the Emporia bypass (I-95) in Greensville County to determine if it was capable of removing the top layer of roadway to a depth of between 1/8 and 1/4 inch, and in so doing expose a fresh...
ERIC Educational Resources Information Center
CRAWFORD, ROBERT C.; KEISLAR, EVAN R.
THE MAJOR PROBLEM OF THIS INVESTIGATION WAS TO DETERMINE TO WHAT EXTENT FIRST-GRADE PUPILS ARE CAPABLE OF LEARNING ALGEBRAIC STRUCTURES THROUGH PROGRAMED INSTRUCTION. IN THE EXPERIMENT APPROXIMATELY 130 FIRST-GRADERS WERE INSTRUCTED THROUGH AUDIOVISUAL TEACHING MACHINES FOR APPROXIMATELY 15 WEEKS. AT THE END OF THE PROGRAM, THE CHILDREN WERE…
NASA Astrophysics Data System (ADS)
Sui, Yi; Zheng, Ping; Tong, Chengde; Yu, Bin; Zhu, Shaohong; Zhu, Jianguo
2015-05-01
This paper describes a tubular dual-stator flux-switching permanent-magnet (PM) linear generator for a free-piston energy converter. The operating principle, topology, and design considerations of the machine are investigated. Taking into account the motion characteristic of the free-piston Stirling engine, a tubular dual-stator PM linear generator is designed by the finite element method. Some major structural parameters, such as the outer and inner radii of the mover, PM thickness, mover tooth width, and tooth width of the outer and inner stators, are optimized to improve machine performance measures such as thrust capability and power density. In comparison with conventional single-stator PM machines, such as the moving-magnet linear machine and the flux-switching linear machine, the proposed dual-stator flux-switching PM machine shows advantages in higher mass power density, higher volume power density, and a lighter mover.
A tubular hybrid Halbach/axially-magnetized permanent-magnet linear machine
NASA Astrophysics Data System (ADS)
Sui, Yi; Liu, Yong; Cheng, Luming; Liu, Jiaqi; Zheng, Ping
2017-05-01
A single-phase tubular permanent-magnet linear machine (PMLM) with hybrid Halbach/axially-magnetized PM arrays is proposed for free-piston Stirling power generation system. Machine topology and operating principle are elaborately illustrated. With the sinusoidal speed characteristic of the free-piston Stirling engine considered, the proposed machine is designed and calculated by finite-element analysis (FEA). The main structural parameters, such as outer radius of the mover, radial length of both the axially-magnetized PMs and ferromagnetic poles, axial length of both the middle and end radially-magnetized PMs, etc., are optimized to improve both the force capability and power density. Compared with the conventional PMLMs, the proposed machine features high mass and volume power density, and has the advantages of simple control and low converter cost. The proposed machine topology is applicable to tubular PMLMs with any phases.
Research on the EDM Technology for Micro-holes at Complex Spatial Locations
NASA Astrophysics Data System (ADS)
Y Liu, J.; Guo, J. M.; Sun, D. J.; Cai, Y. H.; Ding, L. T.; Jiang, H.
2017-12-01
To meet the demands of machining micro-holes at complex spatial locations, several key technical problems are addressed, including the development of the micro electrical discharge machining (micro-EDM) power supply system, the design of the host structure, and the machining process techniques. Through the development of a low-voltage power supply circuit, a high-voltage circuit, a micro and precision machining circuit, and a clearance detection system, a narrow-pulse, high-frequency, six-axis EDM power supply system is developed to meet the demands of micro-hole discharge machining. By combining CAD structural design, CAE simulation analysis, modal testing, ODS (Operational Deflection Shapes) testing, and theoretical analysis, the host structure and key axes of the machine tool are optimized to meet the positioning demands of the micro-holes. A special deionized water filtration system is developed to ensure that the machining process remains stable. The machining equipment and process techniques developed in this paper are verified by developing the micro-hole processing flow and testing it on the real machine tool. The final test results show that the efficient micro-EDM pulse power supply system, machine tool host system, deionized water filtration system, and processing method developed in this paper meet the demands of machining micro-holes at complex spatial locations.
Process Control and Development for Ultrasonic Additive Manufacturing with Embedded Fibers
NASA Astrophysics Data System (ADS)
Hehr, Adam J.
Ultrasonic additive manufacturing (UAM) is a recent additive manufacturing technology which combines ultrasonic metal welding, CNC machining, and mechanized foil layering to create large gapless near net-shape metallic parts. The process has been attracting much attention lately due to its low formation temperature, the capability to join dissimilar metals, and the ability to create complex design features not possible with traditional subtractive processes alone. These process attributes enable light-weighting of structures and components in an unprecedented way. However, UAM is currently limited to niche areas due to the lack of quality tracking and inadequate scientific understanding of the process. As a result, this thesis work is focused on improving both component quality tracking and process understanding through the use of average electrical power input to the welder. Additionally, the understanding and application space of embedding fibers into metals using UAM is investigated, with particular focus on NiTi shape memory alloy fibers.
A strategy to apply machine learning to small datasets in materials science
NASA Astrophysics Data System (ADS)
Zhang, Ying; Ling, Chen
2018-12-01
There is growing interest in applying machine learning techniques in materials science research. However, although it is recognized that materials datasets are typically smaller, and sometimes more diverse, than those in other fields, the influence of the availability of materials data on the training of machine learning models has not yet been studied, which prevents the establishment of accurate predictive rules using small materials datasets. Here we analyzed the fundamental interplay between the availability of materials data and the predictive capability of machine learning models. Rather than affecting the model precision directly, the effect of data size is mediated by the degrees of freedom (DoF) of the model, resulting in a phenomenon of association between precision and DoF. The appearance of the precision-DoF association signals the issue of underfitting and is characterized by a large prediction bias, which consequently restricts accurate prediction in unknown domains. We propose incorporating a crude estimation of the property into the feature space when establishing ML models on small materials datasets, which increases the accuracy of prediction without the cost of higher DoF. In three case studies, predicting the band gap of binary semiconductors, lattice thermal conductivity, and the elastic properties of zeolites, the integration of the crude estimation effectively boosted the predictive capability of machine learning models to state-of-the-art levels, demonstrating the generality of the proposed strategy for constructing accurate machine learning models from small materials datasets.
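A minimal sketch of the strategy on synthetic data is shown below: the same regressor is scored with and without a crude estimate appended to the feature vector; the data, the crude-estimate rule, and the model are all stand-ins, not the materials cases studied in the paper.

```python
# Minimal sketch of the proposed strategy (synthetic data, hypothetical names):
# augment the feature vector with a crude estimate of the target property so a
# small dataset can support an accurate, low-bias model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 60                                   # deliberately small dataset
X = rng.normal(size=(n, 5))
crude = X[:, 0] + 0.3 * X[:, 1]          # stand-in crude estimate (e.g., an empirical rule)
y = crude + 0.2 * np.sin(3 * X[:, 2]) + rng.normal(scale=0.05, size=n)

model = GradientBoostingRegressor(random_state=0)
score_plain = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
score_aug = cross_val_score(model, np.column_stack([X, crude]), y, cv=5, scoring="r2").mean()
print(f"R2 without crude estimate: {score_plain:.3f}, with: {score_aug:.3f}")
```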
The JPL Library information retrieval system
NASA Technical Reports Server (NTRS)
Walsh, J.
1975-01-01
The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. Programs and documentation are available to the public through NASA's computer software dissemination program.
Electron Beam Freeform Fabrication: A Rapid Metal Deposition Process
NASA Technical Reports Server (NTRS)
Taminger, Karen M. B.; Hafley, Robert A.
2003-01-01
Manufacturing of structural metal parts directly from computer aided design (CAD) data has been investigated by numerous researchers over the past decade. Researchers at NASA Langley Research Center are developing a new solid freeform fabrication process, electron beam freeform fabrication (EBF), as a rapid metal deposition process that works efficiently with a variety of weldable alloys. The EBF process introduces metal wire feedstock into a molten pool that is created and sustained using a focused electron beam in a vacuum environment. Thus far, this technique has been demonstrated on aluminum and titanium alloys of interest for aerospace structural applications; nickel and ferrous based alloys are also planned. Deposits resulting from 2219 aluminum demonstrations have exhibited a range of grain morphologies depending upon the deposition parameters. These materials have exhibited excellent tensile properties, comparable to typical handbook data for wrought plate product, after post-processing heat treatments. The EBF process is capable of bulk metal deposition at deposition rates in excess of 2500 cubic centimeters per hour (150 cubic inches per hour), or finer detail at lower deposition rates, depending upon the desired application. This process offers the potential for rapidly adding structural details to simpler cast or forged structures rather than the conventional approach of machining large volumes of chips to produce a monolithic metallic structure. Selective addition of metal onto simpler blanks of material can have a significant effect on lead time reduction and lower material and machining costs.
The HARNESS Workbench: Unified and Adaptive Access to Diverse HPC Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sunderam, Vaidy S.
2012-03-20
The primary goal of the Harness WorkBench (HWB) project is to investigate innovative software environments that will help enhance the overall productivity of applications science on diverse HPC platforms. Two complementary frameworks were designed: one, a virtualized command toolkit for application building, deployment, and execution that provides a common view across diverse HPC systems, in particular the DOE leadership computing platforms (Cray, IBM, SGI, and clusters); and two, a unified runtime environment that consolidates access to runtime services via an adaptive framework for execution-time and post-processing activities. A prototype of the first was developed based on the concept of a 'system-call virtual machine' (SCVM), to enhance portability of the HPC application deployment process across heterogeneous high-end machines. The SCVM approach to portable builds is based on the insertion of toolkit-interpretable directives into original application build scripts. Modifications resulting from these directives preserve the semantics of the original build instruction flow. The execution of the build script is controlled by our toolkit, which intercepts build script commands in a manner transparent to the end-user. We have applied this approach to a scientific production code (GAMESS-US) on the Cray XT5 machine. The second facet, termed Unibus, aims to facilitate provisioning and aggregation of multifaceted resources from the resource providers' and end-users' perspectives. To achieve that, Unibus proposes a Capability Model and mediators (resource drivers) to virtualize access to diverse resources, and soft and successive conditioning to enable automatic and user-transparent resource provisioning. A proof of concept implementation has demonstrated the viability of this approach on high end machines, grid systems, and computing clouds.
Chang, Ni-Bin; Bai, Kaixu; Chen, Chi-Farn
2017-10-01
Monitoring water quality changes in lakes, reservoirs, estuaries, and coastal waters is critical in response to the needs of sustainable development. This study develops a remote sensing-based multiscale modeling system that integrates multi-sensor satellite data merging and image reconstruction algorithms in support of feature extraction with machine learning, leading to automated, continuous water quality monitoring in environmentally sensitive regions. This new Earth observation platform, termed "cross-mission data merging and image reconstruction with machine learning" (CDMIM), is capable of merging multiple satellite imageries to provide daily water quality monitoring through a series of image processing, enhancement, reconstruction, and data mining/machine learning techniques. Two existing key algorithms, the Spectral Information Adaptation and Synthesis Scheme (SIASS) and SMart Information Reconstruction (SMIR), are highlighted to support feature extraction and content-based mapping. Whereas SIASS can support various efforts to merge images collected from cross-mission satellite sensors, SMIR can overcome data gaps by reconstructing the information of value-missing pixels caused by impacts such as cloud obstruction. Practical implementation of CDMIM was assessed by predicting seasonal water quality in terms of the concentrations of nutrients and chlorophyll-a, as well as water clarity, in Lake Nicaragua, providing synergistic efforts to better monitor the aquatic environment and offering insightful lake watershed management strategies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cadena, Natalia L; Cue-Sampedro, Rodrigo; Siller, Héctor R; Arizmendi-Morquecho, Ana M; Rivera-Solorio, Carlos I; Di-Nardo, Santiago
2013-05-24
The manufacture of medical and aerospace components made of titanium alloys and other difficult-to-cut materials requires the parallel development of high performance cutting tools coated with materials capable of enhanced tribological and resistance properties. In this matter, a thin nanocomposite film made out of AlCrN (aluminum-chromium-nitride) was studied in this research, showing experimental work in the deposition process and its characterization. A heat-treated monolayer coating, competitive with other coatings in the machining of titanium alloys, was analyzed. Different analysis and characterizations were performed on the manufactured coating by scanning electron microscopy and energy-dispersive X-ray spectroscopy (SEM-EDXS), and X-ray diffraction (XRD). Furthermore, the mechanical behavior of the coating was evaluated through hardness test and tribology with pin-on-disk to quantify friction coefficient and wear rate. Finally, machinability tests using coated tungsten carbide cutting tools were executed in order to determine its performance through wear resistance, which is a key issue of cutting tools in high-end cutting at elevated temperatures. It was demonstrated that the specimen (with lower friction coefficient than previous research) is more efficient in machinability tests in Ti6Al4V alloys. Furthermore, the heat-treated monolayer coating presented better performance in comparison with a conventional monolayer of AlCrN coating.
The research on construction and application of machining process knowledge base
NASA Astrophysics Data System (ADS)
Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai
2018-03-01
In order to realize the application of knowledge in machining process design, and from the perspective of knowledge use in computer aided process planning (CAPP), a hierarchical structure of knowledge classification is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed in a structured form by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge models, and the application flow of process design based on the knowledge base are given, and the main steps of the machine tool selection decision are carried out as an application using the knowledge base.
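As an illustration of the production-rule representation, the sketch below encodes two hypothetical machining rules and a one-step forward-chaining query; the rule contents, attribute names, and thresholds are invented for the example and are not taken from the paper's knowledge base.

```python
# Minimal sketch (hypothetical rules and attribute names) of representing
# machining process knowledge as production rules that a CAPP step can query.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # IF-part over feature attributes
    action: dict                        # THEN-part: recommended process data

RULES: List[Rule] = [
    Rule("finish-bore-small-hole",
         lambda f: f["type"] == "hole" and f["diameter"] <= 12 and f["tolerance_grade"] <= 7,
         {"operation": "reaming", "machine": "drilling center"}),
    Rule("rough-mill-plane",
         lambda f: f["type"] == "plane" and f["tolerance_grade"] > 9,
         {"operation": "face milling", "machine": "vertical machining center"}),
]

def infer(feature: dict) -> List[dict]:
    """Forward-chain once: return the actions of all rules whose IF-part holds."""
    return [r.action for r in RULES if r.condition(feature)]

print(infer({"type": "hole", "diameter": 10, "tolerance_grade": 7}))
```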
High-speed ultrafast laser machining with tertiary beam positioning (Conference Presentation)
NASA Astrophysics Data System (ADS)
Yang, Chuan; Zhang, Haibin
2017-03-01
For an industrial laser application, high process throughput and low average cost of ownership are critical to commercial success. Benefiting from high peak power, nonlinear absorption, and small achievable spot size, ultrafast lasers offer the advantages of a minimal heat affected zone, great taper and sidewall quality, and a small-via capability that exceeds the limits of their predecessors in via drilling for electronic packaging. In the past decade, ultrafast lasers have both grown in power and come down in cost. For example, disk and fiber technology have recently shown stable operation in the 50 W to 200 W range, mostly at high repetition rates (beyond 500 kHz) that help avoid detrimental nonlinear effects. However, effectively and efficiently scaling the throughput with the fast-growing power capability of ultrafast lasers, while keeping the beneficial laser-material interactions, is very challenging, mainly because of the bottleneck imposed by inertia-related acceleration limits and servo gain bandwidth when only stages and galvanometers are used. On the other hand, inertia-free scanning solutions such as acousto-optic and electro-optic deflectors have small scan fields and are therefore not suitable for large-panel processing. Our recent system developments combine stages, galvanometers, and AODs into a coordinated tertiary architecture for high-bandwidth and, at the same time, large-field beam positioning. Synchronized three-level movements allow extremely fast local speeds and continuous motion over the whole stage travel range. We present via drilling results from such an ultrafast system with up to 3 MHz pulse-to-pulse random access, enabling high quality, low cost ultrafast machining with emerging high average power laser sources.
Hidden physics models: Machine learning of nonlinear partial differential equations
NASA Astrophysics Data System (ADS)
Raissi, Maziar; Karniadakis, George Em
2018-03-01
While there is currently a lot of enthusiasm about "big data", useful data is usually "small" and expensive to acquire. In this paper, we present a new paradigm of learning partial differential equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problem of learning, system identification, or data-driven discovery of partial differential equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, that enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and time dependent linear fractional equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data.
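Since the framework rests on Gaussian process regression, the following sketch shows plain GP regression on a small toy dataset to illustrate the probabilistic function inference involved; it does not implement the hidden-physics kernels derived from the governing PDEs.

```python
# Minimal sketch: Gaussian process regression as the probabilistic backbone
# described above, on a toy 1-D function. This is generic GP regression, not
# the authors' hidden-physics formulation with PDE-constrained kernels.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 10, size=(15, 1))          # deliberately small dataset
y_train = np.sin(X_train).ravel() + rng.normal(scale=0.05, size=15)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

X_test = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)
for x, m, s in zip(X_test.ravel(), mean, std):
    print(f"x={x:4.1f}  prediction={m:+.3f} +/- {2*s:.3f}")
```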
Gasparyan, Diana
2016-12-01
There is a problem associated with contemporary studies in the philosophy of mind, which focus on the identification and convergence of human and machine intelligence: the problem of machine emulation of sense. In the present study, this problem is analyzed using concepts from the structural and post-structural approaches, which have been almost entirely overlooked by contemporary philosophy of mind. If we refer to the basic definitions of "sign" and "meaning" found in structuralism and post-structuralism, we see a fundamental difference between the capabilities of a machine and those of the human brain engaged in processing a sign. This research exemplifies and provides additional evidence to support distinctions between the syntactic and semantic aspects of intelligence, an issue widely discussed by adepts of contemporary philosophy of mind. The research demonstrates that some aspects of the ideas proposed in relation to semantics and semiosis in structuralism and post-structuralism are similar to those we find in contemporary analytical studies related to the theory and philosophy of artificial intelligence. The concluding part of the paper offers an interpretation of the problem of the formalization of sense, connected to its metaphysical (transcendental) properties.
Real-time model-based vision system for object acquisition and tracking
NASA Technical Reports Server (NTRS)
Wilcox, Brian; Gennery, Donald B.; Bon, Bruce; Litwin, Todd
1987-01-01
A machine vision system is described which is designed to acquire and track polyhedral objects moving and rotating in space by means of two or more cameras, programmable image-processing hardware, and a general-purpose computer for high-level functions. The image-processing hardware is capable of performing a large variety of operations on images and on image-like arrays of data. Acquisition utilizes image locations and velocities of the features extracted by the image-processing hardware to determine the three-dimensional position, orientation, velocity, and angular velocity of the object. Tracking correlates edges detected in the current image with edge locations predicted from an internal model of the object and its motion, continually updating velocity information to predict where edges should appear in future frames. With some 10 frames processed per second, real-time tracking is possible.
Analyzing microtomography data with Python and the scikit-image library.
Gouillart, Emmanuelle; Nunez-Iglesias, Juan; van der Walt, Stéfan
2017-01-01
The exploration and processing of images is a vital aspect of the scientific workflows of many X-ray imaging modalities. Users require tools that combine interactivity, versatility, and performance. scikit-image is an open-source image processing toolkit for the Python language that supports a large variety of file formats and is compatible with 2D and 3D images. The toolkit exposes a simple programming interface, with thematic modules grouping functions according to their purpose, such as image restoration, segmentation, and measurements. scikit-image users benefit from a rich scientific Python ecosystem that contains many powerful libraries for tasks such as visualization or machine learning. scikit-image combines a gentle learning curve, versatile image processing capabilities, and the scalable performance required for the high-throughput analysis of X-ray imaging data.
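A minimal example of the kind of workflow described, applied to a synthetic 3D volume; the functions used (filters.gaussian, filters.threshold_otsu, morphology.remove_small_objects, measure.label, measure.regionprops) are part of scikit-image's public API, while the data are made up.

```python
import numpy as np
from skimage import filters, measure, morphology

# Synthetic 3D "tomography" volume: a few bright spheres on a noisy background.
rng = np.random.default_rng(1)
volume = rng.normal(0.1, 0.05, size=(64, 64, 64))
zz, yy, xx = np.ogrid[:64, :64, :64]
for cz, cy, cx in [(20, 20, 20), (40, 45, 30), (50, 15, 50)]:
    volume[(zz - cz)**2 + (yy - cy)**2 + (xx - cx)**2 < 6**2] += 1.0

# Denoising, thresholding, clean-up, labelling and measurement.
smooth = filters.gaussian(volume, sigma=1)
binary = smooth > filters.threshold_otsu(smooth)
binary = morphology.remove_small_objects(binary, min_size=50)
labels = measure.label(binary)
for region in measure.regionprops(labels):
    print(f"object {region.label}: volume = {region.area} voxels, "
          f"centroid = {tuple(round(c, 1) for c in region.centroid)}")
```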
NASA Astrophysics Data System (ADS)
Liu, Shuang; Liu, Fei; Hu, Shaohua; Yin, Zhenbiao
The major power information of the main transmission system in machine tools (MTSMT) during the machining process includes the effective output power (i.e., cutting power), the input power, the power loss in the mechanical transmission system, and the main motor power loss. This information is easy to obtain in the lab but difficult to evaluate during a manufacturing process. To solve this problem, a separation method is proposed here to extract the MTSMT power information during the machining process. In this method, the energy flow and the mathematical models of the major MTSMT power information during the machining process are set up first. Based on these mathematical models and on basic data tables obtained from experiments, the above-mentioned power information can be separated during machining just by measuring the real-time total input power of the spindle motor. The operating procedure of this method is also given.
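The power balance described above suggests a relation of the form P_input = P_cutting + P_transmission_loss + P_motor_loss. The sketch below shows how cutting power could be separated from a measured spindle input power under that balance; the loss models and coefficients are illustrative placeholders, not the paper's experimentally derived data tables.

```python
def separate_cutting_power(p_input_w, spindle_speed_rpm,
                           motor_loss_coeffs=(120.0, 0.02),
                           transmission_loss_per_krpm_w=35.0):
    """Estimate cutting power from the measured spindle-motor input power.

    Assumes the illustrative balance
        P_input = P_cutting + P_transmission_loss + P_motor_loss,
    where the loss terms would, in practice, come from no-load experiments
    (the paper's basic data tables). The coefficients used here are made up.
    """
    a0, a1 = motor_loss_coeffs
    p_motor_loss = a0 + a1 * p_input_w          # constant + load-dependent part
    p_transmission_loss = transmission_loss_per_krpm_w * spindle_speed_rpm / 1000.0
    p_cutting = p_input_w - p_motor_loss - p_transmission_loss
    return max(p_cutting, 0.0), p_motor_loss, p_transmission_loss

# Example: 2.4 kW measured at the spindle motor while running at 3000 rpm.
print(separate_cutting_power(2400.0, 3000.0))
```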
Fast in-situ tool inspection based on inverse fringe projection and compact sensor heads
NASA Astrophysics Data System (ADS)
Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard
2016-11-01
Inspection of machine elements is an important task in production processes in order to ensure the quality of produced parts and to gather feedback for the continuous improvement process. A new measuring system is presented, which is capable of performing the inspection of critical tool geometries, such as gearing elements, inside the forming machine. To meet the constraints on sensor head size and inspection time imposed by the limited space inside the machine and the cycle time of the process, the measuring device employs a combination of endoscopy techniques with the fringe projection principle. Compact gradient index lenses enable a compact design of the sensor head, which is connected to a CMOS camera and a flexible micro-mirror based projector via flexible fiber bundles. Using common fringe projection patterns, the system achieves measuring times of less than five seconds. To further reduce the time required for inspection, the generation of inverse fringe projection patterns has been implemented for the system. Inverse fringe projection speeds up the inspection process by employing object-adapted patterns, which enable the detection of geometry deviations in a single image. Two different approaches to generate object adapted patterns are presented. The first approach uses a reference measurement of a manufactured tool master to generate the inverse pattern. The second approach is based on a virtual master geometry in the form of a CAD file and a ray-tracing model of the measuring system. Virtual modeling of the measuring device and inspection setup allows for geometric tolerancing for free-form surfaces by the tool designer in the CAD-file. A new approach is presented, which uses virtual tolerance specifications and additional simulation steps to enable fast checking of metric tolerances. Following the description of the pattern generation process, the image processing steps required for inspection are demonstrated on captures of gearing geometries.
A Man-Machine System for Contemporary Counseling Practice: Diagnosis and Prediction.
ERIC Educational Resources Information Center
Roach, Arthur J.
This paper looks at present and future capabilities for diagnosis and prediction in computer-based guidance efforts and reviews the problems and potentials which will accompany the implementation of such capabilities. In addition to necessary procedural refinement in prediction, future developments in computer-based educational and career…
Distributed communications and control network for robotic mining
NASA Technical Reports Server (NTRS)
Schiffbauer, William H.
1989-01-01
The application of robotics to coal mining machines is one approach pursued to increase productivity while providing enhanced safety for the coal miner. Toward that end, a network composed of microcontrollers, computers, expert systems, real-time operating systems, and a variety of programming languages is being integrated to act as the backbone for intelligent machine operation. Actual mining machines, including a few customized ones, have been given telerobotic semiautonomous capabilities by applying the described network. Control devices, intelligent sensors, and computers onboard these machines are showing promise of achieving improved mining productivity and safety benefits. Current research using these machines involves navigation, multiple machine interaction, machine diagnostics, mineral detection, and graphical machine representation. Guidance sensors and systems employed include sonar, laser rangers, gyroscopes, magnetometers, clinometers, and accelerometers. Information on the network of hardware/software and its implementation on mining machines is presented. Anticipated coal production operations using the network are discussed. A parallel is also drawn between the direction of present-day underground coal mining research and how the lunar soil (regolith) may be mined. A conceptual lunar mining operation that employs a distributed communication and control network is detailed.
Damage Precursor Identification via Microstructure-Sensitive Nondestructive Evaluation
NASA Astrophysics Data System (ADS)
Wisner, Brian John
Damage in materials is a complex and stochastic process bridging several time and length scales. This dissertation focuses on investigating the damage process in a particular class of precipitate-hardened aluminum alloys which is widely used in automotive and aerospace applications. Most emphasis in the literature has been given either to their ductility for manufacturing purposes or to their fracture for performance considerations. In this dissertation, emphasis is placed on using nondestructive evaluation (NDE) combined with mechanical testing and characterization methods applied at the scale where damage incubation and initiation occur. Specifically, a novel setup built inside a Scanning Electron Microscope (SEM) and retrofitted with characterization and NDE capabilities was developed with the goal of tracking the early stages of the damage process in this type of material. The characterization capabilities include Electron Backscatter Diffraction (EBSD), Energy Dispersive Spectroscopy (EDS), X-ray micro-computed tomography (μ-CT), and nanoindentation, as well as microscopy via the Secondary Electron (SE) and Backscatter Electron (BSE) detectors. The mechanical testing inside the SEM was achieved with an appropriate stage that fits within its chamber and is capable of applying both axial and bending monotonic and cyclic loads. The NDE capabilities, beyond the microscopy and μ-CT, include the methods of Acoustic Emission and Digital Image Correlation (DIC). This setup was used to identify damage precursors in this material system and their evolution over time and space. The experimental results were analyzed with a custom signal processing scheme that involves both feature-based analyses and a machine learning method to relate the recorded microstructural data to damage in this material. Extensions of the presented approach to include information from computational methods, as well as its applicability to other material systems, are discussed.
NASA Astrophysics Data System (ADS)
Gates, W. G.
1982-05-01
Bendix product applications require the capability of fabricating heavy gage, high strength materials. Five commercial sources have been identified that have the capability of spin forming metal thicknesses greater than 9.5 mm, and four equipment manufacturers produce machines with this capability. Twelve assemblies selected as candidates for spin forming applications require spin forming of titanium, 250 maraging steel, 17-4 PH stainless steel, Nitronic 40 steel, 304 L stainless steel, and 6061 aluminum. Twelve parts have been cold spin formed from a 250 maraging steel machined preform with an 8.1 mm wall thickness, and six have been hot spin formed directly from 31.8-mm-thick flat plate. Thirty-three Ti-6Al-4V titanium alloy parts and 26 17-4 PH stainless steel parts have been hot spin formed directly from 31.8-mm-thick plate. Hot spin forming directly from plate has demonstrated the feasibility and favorable economics of this fabrication technique for Bendix applications.
Westinghouse programs in pulsed homopolar power supplies
NASA Technical Reports Server (NTRS)
Litz, D. C.; Mullan, E.
1984-01-01
This document details Westinghouse's ongoing study of homopolar machines since 1929, with the major effort occurring from the early 1970s to the present. The effort has enabled Westinghouse to develop expertise in the technology required for the design, fabrication, and testing of such machines. This includes electrical design, electromagnetic analysis, current collection, mechanical design, advanced cooling, stress analysis, transient rotor performance, bearing analysis, and seal technology. Westinghouse is using this capability to explore the use of homopolar machines as pulsed power supplies for future systems in both military and commercial applications.
Cellular Neural Network for Real Time Image Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vagliasindi, G.; Arena, P.; Fortuna, L.
2008-03-12
Since their introduction in 1988, Cellular Nonlinear Networks (CNNs) have found a key role as image processing instruments. Thanks to their structure, they are capable of processing individual pixels in parallel, providing fast image processing capabilities that have been applied to a wide range of fields, among which is nuclear fusion. In recent years, indeed, visible and infrared video cameras have become more and more important in tokamak fusion experiments for the twofold aim of understanding the physics and monitoring the safety of the operation. Examining the output of these cameras in real time can provide significant information for plasma control and the safety of the machines. The potentiality of CNNs can be exploited to this aim. To demonstrate the feasibility of the approach, CNN image processing has been applied to several tasks at both the Frascati Tokamak Upgrade (FTU) and the Joint European Torus (JET).
Wave scheduling - Decentralized scheduling of task forces in multicomputers
NASA Technical Reports Server (NTRS)
Van Tilborg, A. M.; Wittie, L. D.
1984-01-01
Decentralized operating systems that control large multicomputers need techniques to schedule competing parallel programs called task forces. Wave scheduling is a probabilistic technique that uses a hierarchical distributed virtual machine to schedule task forces by recursively subdividing and issuing wavefront-like commands to processing elements capable of executing individual tasks. Wave scheduling is highly resistant to processing element failures because it uses many distributed schedulers that dynamically assign scheduling responsibilities among themselves. The scheduling technique is trivially extensible as more processing elements join the host multicomputer. A simple model of scheduling cost is used by every scheduler node to distribute scheduling activity and minimize wasted processing capacity by using perceived workload to vary decentralized scheduling rules. At low to moderate levels of network activity, wave scheduling is only slightly less efficient than a central scheduler in its ability to direct processing elements to accomplish useful work.
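A toy sketch of the recursive-subdivision idea: a scheduler node splits a task force among its child schedulers until individual tasks reach idle processing elements. The node structure, names, and fallback behavior here are invented for illustration and are not the paper's protocol.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list = field(default_factory=list)   # empty list => processing element
    busy: bool = False

def wave_schedule(node, tasks):
    """Recursively subdivide a task force down the scheduler hierarchy.

    Returns (assignments, leftover_tasks); in this toy version leftover tasks
    are simply handed back instead of being retried elsewhere.
    """
    if not tasks:
        return [], []
    if not node.children:                 # leaf node: a processing element
        if not node.busy:
            node.busy = True
            return [(node.name, tasks[0])], tasks[1:]
        return [], tasks
    assignments, remaining = [], list(tasks)
    # Issue a "wavefront" of commands: pass the remaining work to each child in turn.
    for child in node.children:
        share, remaining = wave_schedule(child, remaining)
        assignments.extend(share)
    return assignments, remaining

pes = [Node(f"PE{i}") for i in range(4)]
root = Node("root", [Node("S0", pes[:2]), Node("S1", pes[2:])])
done, left = wave_schedule(root, ["t1", "t2", "t3"])
print(done, left)
```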
A Parameter Communication Optimization Strategy for Distributed Machine Learning in Sensors.
Zhang, Jilin; Tu, Hangdi; Ren, Yongjian; Wan, Jian; Zhou, Li; Li, Mingwei; Wang, Jue; Yu, Lifeng; Zhao, Chang; Zhang, Lei
2017-09-21
In order to utilize the distributed characteristic of sensors, distributed machine learning has become the mainstream approach, but the differing computing capabilities of sensors and network delays greatly influence the accuracy and the convergence rate of the machine learning model. Our paper describes a reasonable parameter communication optimization strategy to balance the training overhead and the communication overhead. We extend the fault tolerance of iterative-convergent machine learning algorithms and propose Dynamic Finite Fault Tolerance (DFFT). Based on DFFT, we implement a parameter communication optimization strategy for distributed machine learning, named the Dynamic Synchronous Parallel Strategy (DSP), which uses a performance monitoring model to dynamically adjust the parameter synchronization strategy between worker nodes and the Parameter Server (PS). This strategy makes full use of the computing power of each sensor, ensures the accuracy of the machine learning model, and avoids the situation in which model training is disturbed by tasks unrelated to the sensors.
Knowledge-based load leveling and task allocation in human-machine systems
NASA Technical Reports Server (NTRS)
Chignell, M. H.; Hancock, P. A.
1986-01-01
Conventional human-machine systems use task allocation policies which are based on the premise of a flexible human operator. This individual is most often required to compensate for and augment the capabilities of the machine. The development of artificial intelligence and improved technologies have allowed for a wider range of task allocation strategies. In response to these issues a Knowledge Based Adaptive Mechanism (KBAM) is proposed for assigning tasks to human and machine in real time, using a load leveling policy. This mechanism employs an online workload assessment and compensation system which is responsive to variations in load through an intelligent interface. This interface consists of a loading strategy reasoner which has access to information about the current status of the human-machine system as well as a database of admissible human/machine loading strategies. Difficulties standing in the way of successful implementation of the load leveling strategy are examined.
Intelligent earthquake data processing for global adjoint tomography
NASA Astrophysics Data System (ADS)
Chen, Y.; Hill, J.; Li, T.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Tromp, J.
2016-12-01
Due to the increased computational capability afforded by modern and future computing architectures, the seismology community is demanding a more comprehensive understanding of the full waveform information from recorded earthquake seismograms. Global waveform tomography is a complex workflow that matches observed seismic data with synthesized seismograms by iteratively updating the earth model parameters based on the adjoint state method. This methodology allows us to compute a very accurate model of the earth's interior. The synthetic data are simulated by solving the wave equation over the entire globe using a spectral-element method. In order to ensure the inversion accuracy and stability, both the synthesized and observed seismograms must be carefully pre-processed. Because the scale of the inversion problem is extremely large and there is a very large volume of data to be both read and written, an efficient and reliable pre-processing workflow must be developed. We are investigating intelligent algorithms based on a machine-learning (ML) framework that will automatically tune parameters for the data processing chain. One straightforward application of ML in data processing is to classify all possible misfit calculation windows into usable and unusable ones, based on ML models such as neural networks, support vector machines, or principal component analysis. The intelligent earthquake data processing framework will enable the seismology community to compute global waveform tomography using seismic data from an arbitrarily large number of earthquake events in the fastest, most efficient way.
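A minimal sketch of the window-classification idea mentioned above, using a support vector machine from scikit-learn; the per-window features (correlation, signal-to-noise ratio, amplitude ratio) and the synthetic labels are illustrative assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-window features: cross-correlation coefficient, SNR and
# amplitude ratio between observed and synthetic seismograms; labels mark
# windows judged usable (1) or unusable (0).
rng = np.random.default_rng(42)
good = np.column_stack([rng.uniform(0.7, 1.0, 200),   # high correlation
                        rng.uniform(3.0, 10.0, 200),  # high SNR
                        rng.uniform(0.8, 1.2, 200)])  # amplitude ratio near 1
bad = np.column_stack([rng.uniform(0.0, 0.6, 200),
                       rng.uniform(0.5, 3.0, 200),
                       rng.uniform(0.2, 2.5, 200)])
X = np.vstack([good, bad])
y = np.r_[np.ones(200), np.zeros(200)]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.predict([[0.9, 6.0, 1.05], [0.3, 1.0, 2.0]]))  # expected: usable, unusable
```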
Optimization of turning process through the analytic flank wear modelling
NASA Astrophysics Data System (ADS)
Del Prete, A.; Franchi, R.; De Lorenzis, D.
2018-05-01
In the present work, the approach used for the optimization of the process capabilities for the machining of Oil & Gas components is described. These components are machined by turning stainless steel castings. For this purpose, a proper Design of Experiments (DOE) plan has been designed and executed; as an output of the experimentation, data about tool wear have been collected. The DOE was designed starting from the cutting speed and feed values recommended by the tool manufacturer; the depth of cut was kept constant. Wear data were obtained by observing the tool flank wear under an optical microscope, with data acquisition carried out at regular intervals of working time. Through statistical and regression analysis of the data, analytical models of the flank wear and the tool life have been obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, under a constraint on a defined flank wear level. The technique used to solve the optimization problem is Multi Objective Particle Swarm Optimization (MOPS). The optimization results, validated by the execution of a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.
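A hedged sketch of the kind of model-based search described: an assumed empirical flank-wear law of the form VB = k v^a f^b t^c is inverted to give tool life, and a simple grid search (standing in for MOPSO) looks for the speed and feed that minimize cutting time subject to a wear-based tool-life constraint. All coefficients, bounds, and part dimensions are illustrative.

```python
import numpy as np

def flank_wear(v, f, t, k=1e-5, a=1.6, b=0.5, c=0.8):
    """Illustrative empirical flank-wear model VB = k * v^a * f^b * t^c (mm)."""
    return k * v**a * f**b * t**c

def tool_life(v, f, vb_max=0.3, k=1e-5, a=1.6, b=0.5, c=0.8):
    """Cutting time (min) at which the wear model reaches the allowed flank wear."""
    return (vb_max / (k * v**a * f**b)) ** (1.0 / c)

# Simple grid search in place of MOPSO: minimise cutting time per part while
# keeping predicted tool life above that time (i.e. at least one part per edge).
diameter_mm, length_mm, passes = 80.0, 500.0, 3
best = None
for v in np.linspace(150, 350, 21):            # cutting speed, m/min
    for f in np.linspace(0.10, 0.40, 16):      # feed, mm/rev
        n = 1000.0 * v / (np.pi * diameter_mm)         # spindle speed, rev/min
        time_per_part = passes * length_mm / (f * n)   # min
        if tool_life(v, f) >= time_per_part:
            if best is None or time_per_part < best[0]:
                best = (round(time_per_part, 2), v, f)
print("best (time [min], v [m/min], f [mm/rev]):", best)
```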
A new approach to build VPLS with auto-discovery mechanism
NASA Astrophysics Data System (ADS)
Dong, Ximing; Yu, Shaohua
2005-11-01
VPLS is the key technology implemented to provide Layer 2 bridge-like services, connecting dispersed locations to work as a switched LAN over an MPLS backbone. However, implementing VPLS requires creating a complex matrix of services and locations that quickly becomes difficult to configure and maintain. To address this complexity, this paper proposes a new approach to automate the configuration and maintenance of VPLS networks: a node-discovery process in which each router advertises its VPLS-enabled status and capabilities to all other routers. Our approach can be summarized in four steps. (1) Discover other PE nodes with VPLS capabilities and create the list of VPLS-capable PE routers. We introduce a finite state machine with four states to illustrate how a VPLS peer is discovered and how peer relations are kept alive. (2) Build MPLS LSP tunnels to all the PE routers in the list, according to the advertised VPLS protocol capabilities. (3) Use the lists to create targeted LDP sessions for VPLS service discovery. (4) Assign VC labels. The PE edge routers exchange messages to define VC labels and bind them to each established pseudowire. The suggested auto-discovery mechanism is sensitive to any service provider topology change and customer service modification. The dynamic process of FIB building and MAC address learning and withdrawal is also covered as a result of VPLS auto-discovery. The suggested mechanism can be implemented as a software module and could be seamlessly integrated with currently deployed Metro Ethernet routing and switching platforms.
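To make the finite-state-machine idea concrete, here is a toy four-state peer machine; the state names, events, and transitions are invented for illustration and need not match the paper's definitions.

```python
# Toy peer-discovery state machine with four illustrative states; the actual
# state names and transitions in the paper may differ.
TRANSITIONS = {
    ("IDLE",          "advertise_received"):  "DISCOVERED",
    ("DISCOVERED",    "capabilities_match"):  "SESSION_SETUP",
    ("SESSION_SETUP", "tunnel_established"):  "ESTABLISHED",
    ("ESTABLISHED",   "keepalive_received"):  "ESTABLISHED",
    ("ESTABLISHED",   "keepalive_timeout"):   "IDLE",
}

class VplsPeer:
    def __init__(self, router_id):
        self.router_id = router_id
        self.state = "IDLE"

    def handle(self, event):
        # Unknown events leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

peer = VplsPeer("10.0.0.2")
for ev in ["advertise_received", "capabilities_match",
           "tunnel_established", "keepalive_received", "keepalive_timeout"]:
    print(ev, "->", peer.handle(ev))
```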
Specimen coordinate automated measuring machine/fiducial automated measuring machine
Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.
1991-01-01
The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.
On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process
NASA Astrophysics Data System (ADS)
Hongzhi, Zhao; Jian, Zhang
2018-03-01
The paper presents an approach to intelligent design and planning of the process route for gun breech machining, addressing several problems such as the complexity of the gun breech machining process, the tedium of route design, and the long lead time of the traditional, hard-to-manage process route. Based on the gun breech machining process, an intelligent process route design and planning system is developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module analyzes the gun breech machining process and summarizes breech process knowledge to build the knowledge base and inference engine, from which the gun breech process route is output automatically. Building on the intelligent design module, the final process route is created, edited, and managed in the process route planning module.
Migrating EO/IR sensors to cloud-based infrastructure as service architectures
NASA Astrophysics Data System (ADS)
Berglie, Stephen T.; Webster, Steven; May, Christopher M.
2014-06-01
The Night Vision Image Generator (NVIG), a product of US Army RDECOM CERDEC NVESD, is a visualization tool used widely throughout Army simulation environments to provide fully attributed, synthesized, full motion video using physics-based sensor and environmental effects. The NVIG relies heavily on contemporary hardware-based acceleration and GPU processing techniques, which push the envelope of both enterprise and commodity-level hypervisor support for providing virtual machines with direct access to hardware resources. The NVIG has successfully been integrated into fully virtual environments where system architectures leverage cloud-based technologies to various extents in order to streamline infrastructure and service management. This paper details the challenges presented to engineers seeking to migrate GPU-bound processes, such as the NVIG, to virtual machines and, ultimately, cloud-based IAS architectures. In addition, it presents the path that led to success for the NVIG. A brief overview of cloud-based infrastructure management tool sets is provided, and several virtual desktop solutions are outlined. A distinction is made between general-purpose virtual desktop technologies and technologies that expose GPU-specific capabilities, including direct rendering and hardware-based video encoding. Candidate hypervisor/virtual machine configurations that nominally satisfy the virtualized hardware-level GPU requirements of the NVIG are presented, and each is subsequently reviewed in light of its implications for higher-level cloud management techniques. Implementation details are included from the hardware level, through the operating system, to the 3D graphics APIs required by the NVIG and similar GPU-bound tools.
Research in speech communication.
Flanagan, J
1995-10-24
Advances in digital speech processing are now supporting application and deployment of a variety of speech technologies for human/machine communication. In fact, new businesses are rapidly forming about these technologies. But these capabilities are of little use unless society can afford them. Happily, explosive advances in microelectronics over the past two decades have assured affordable access to this sophistication as well as to the underlying computing technology. The research challenges in speech processing remain in the traditionally identified areas of recognition, synthesis, and coding. These three areas have typically been addressed individually, often with significant isolation among the efforts. But they are all facets of the same fundamental issue--how to represent and quantify the information in the speech signal. This implies deeper understanding of the physics of speech production, the constraints that the conventions of language impose, and the mechanism for information processing in the auditory system. In ongoing research, therefore, we seek more accurate models of speech generation, better computational formulations of language, and realistic perceptual guides for speech processing--along with ways to coalesce the fundamental issues of recognition, synthesis, and coding. Successful solution will yield the long-sought dictation machine, high-quality synthesis from text, and the ultimate in low bit-rate transmission of speech. It will also open the door to language-translating telephony, where the synthetic foreign translation can be in the voice of the originating talker.
Design, fabrication, and operation of a test rig for high-speed tapered-roller bearings
NASA Technical Reports Server (NTRS)
Signer, H. R.
1974-01-01
A tapered-roller bearing test machine was designed, fabricated and successfully operated at speeds to 20,000 rpm. Infinitely variable radial loads to 26,690 N (6,000 lbs.) and thrust loads to 53,380 N (12,000 lbs.) can be applied to test bearings. The machine instrumentation proved to have the accuracy and reliability required for parametric bearing performance testing and has the capability of monitoring all programmed test parameters at continuous operation during life testing. This system automatically shuts down a test if any important test parameter deviates from the programmed conditions, or if a bearing failure occurs. A lubrication system was developed as an integral part of the machine, capable of lubricating test bearings by external jets and by means of passages feeding through the spindle and bearing rings into the critical internal bearing surfaces. In addition, provisions were made for controlled oil cooling of inner and outer rings to effect the type of bearing thermal management that is required when testing at high speeds.
Tool path strategy and cutting process monitoring in intelligent machining
NASA Astrophysics Data System (ADS)
Chen, Ming; Wang, Chengdong; An, Qinglong; Ming, Weiwei
2018-06-01
Intelligent machining is a current focus in advanced manufacturing technology, and is characterized by high accuracy and efficiency. A central technology of intelligent machining—the cutting process online monitoring and optimization—is urgently needed for mass production. In this research, the cutting process online monitoring and optimization in jet engine impeller machining, cranio-maxillofacial surgery, and hydraulic servo valve deburring are introduced as examples of intelligent machining. Results show that intelligent tool path optimization and cutting process online monitoring are efficient techniques for improving the efficiency, quality, and reliability of machining.
Parameter optimization of electrochemical machining process using black hole algorithm
NASA Astrophysics Data System (ADS)
Singh, Dinesh; Shukla, Rajkamal
2017-12-01
Advanced machining processes are significant as higher accuracy in machined components is required in the manufacturing industries. Parameter optimization of machining processes gives optimum control to achieve the desired goals. In this paper, the electrochemical machining (ECM) process is considered and its performance is evaluated using the black hole algorithm (BHA). BHA is based on the fundamental idea of black hole theory and has few operating parameters to tune. The two performance parameters, material removal rate (MRR) and overcut (OC), are considered separately to obtain optimum machining parameter settings using BHA. The variations of the process parameters with respect to the performance parameters are reported for a better and more effective understanding of the considered process using a single objective at a time. The results obtained using BHA are found to be better when compared with the results of other metaheuristic algorithms, such as the genetic algorithm (GA), artificial bee colony (ABC), and biogeography-based optimization (BBO), attempted by previous researchers.
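A minimal single-objective sketch of the black hole algorithm: the best star becomes the black hole, the remaining stars move toward it, and stars crossing the event horizon are respawned randomly. The objective function below is a placeholder; in the paper it would be an empirical MRR or overcut model of the ECM parameters.

```python
import numpy as np

def objective(x):
    """Placeholder cost: distance from an arbitrary 'ideal' parameter setting.
    In the paper this would be an empirical MRR or overcut model of the ECM
    process parameters (e.g. voltage, feed rate, electrolyte concentration)."""
    return np.sum((x - np.array([15.0, 0.6, 20.0]))**2)

def black_hole_algorithm(bounds, n_stars=25, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    stars = rng.uniform(lo, hi, size=(n_stars, len(lo)))
    for _ in range(n_iter):
        fitness = np.array([objective(s) for s in stars])
        bh_idx = int(np.argmin(fitness))            # best star becomes the black hole
        bh, bh_fit = stars[bh_idx].copy(), fitness[bh_idx]
        # Move every star towards the black hole.
        stars += rng.random(stars.shape) * (bh - stars)
        # Absorb stars that cross the event horizon and respawn them randomly.
        radius = bh_fit / (np.sum(fitness) + 1e-12)
        for i in range(n_stars):
            if i != bh_idx and np.linalg.norm(stars[i] - bh) < radius:
                stars[i] = rng.uniform(lo, hi)
        stars[bh_idx] = bh                          # keep the black hole in place
    fitness = np.array([objective(s) for s in stars])
    return stars[int(np.argmin(fitness))]

bounds = np.array([[10.0, 20.0], [0.2, 1.0], [10.0, 30.0]])   # illustrative ranges
print(black_hole_algorithm(bounds))
```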
Automatic inference of multicellular regulatory networks using informative priors.
Sun, Xiaoyun; Hong, Pengyu
2009-01-01
To fully understand the mechanisms governing animal development, computational models and algorithms are needed to enable quantitative studies of the underlying regulatory networks. We developed a mathematical model based on dynamic Bayesian networks to model multicellular regulatory networks that govern cell differentiation processes. A machine-learning method was developed to automatically infer such a model from heterogeneous data. We show that the model inference procedure can be greatly improved by incorporating interaction data across species. The proposed approach was applied to C. elegans vulval induction to reconstruct a model capable of simulating C. elegans vulval induction under 73 different genetic conditions.
ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox
NASA Technical Reports Server (NTRS)
Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan
2015-01-01
The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data generating systems or processes.
Leung, S C; Fung, W K; Wong, K H
1999-01-01
The relative bit density variation graphs of 207 specimen credit cards processed by 12 encoding machines were examined first visually, and then classified by means of hierarchical cluster analysis. Twenty-nine credit cards being treated as 'questioned' samples were tested by way of cluster analysis against 'controls' derived from known encoders. It was found that hierarchical cluster analysis provided a high accuracy of identification with all 29 'questioned' samples classified correctly. On the other hand, although visual comparison of jitter graphs was less discriminating, it was nevertheless capable of giving a reasonably accurate result.
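A small sketch of the clustering step under illustrative assumptions: each card is represented by a vector of relative bit density variations along the stripe, and the profiles are grouped by agglomerative (hierarchical) clustering with SciPy; the feature representation and encoder signatures are synthetic.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical "relative bit density variation" profiles: one vector per card,
# each value being the relative density in one segment along the magnetic stripe.
rng = np.random.default_rng(3)
encoder_signatures = rng.normal(0.0, 1.0, size=(3, 20))        # 3 encoders
cards = np.vstack([sig + 0.1 * rng.normal(size=(10, 20))       # 10 cards each
                   for sig in encoder_signatures])

# Agglomerative clustering of the profiles (average linkage on Euclidean distance).
Z = linkage(pdist(cards), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")
print(labels.reshape(3, 10))        # cards from the same encoder share a label
```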
Hypercluster - Parallel processing for computational mechanics
NASA Technical Reports Server (NTRS)
Blech, Richard A.
1988-01-01
An account is given of the development status, performance capabilities and implications for further development of NASA-Lewis' testbed 'hypercluster' parallel computer network, in which multiple processors communicate through a shared memory. Processors have local as well as shared memory; the hypercluster is expanded in the same manner as the hypercube, with processor clusters replacing the normal single processor node. The NASA-Lewis machine has three nodes with a vector personality and one node with a scalar personality. Each of the vector nodes uses four board-level vector processors, while the scalar node uses four general-purpose microcomputer boards.
Robotics control using isolated word recognition of voice input
NASA Technical Reports Server (NTRS)
Weiner, J. M.
1977-01-01
A speech input/output system is presented that can be used to communicate with a task oriented system. Human speech commands and synthesized voice output extend conventional information exchange capabilities between man and machine by utilizing audio input and output channels. The speech input facility is comprised of a hardware feature extractor and a microprocessor implemented isolated word or phrase recognition system. The recognizer offers a medium sized (100 commands), syntactically constrained vocabulary, and exhibits close to real time performance. The major portion of the recognition processing required is accomplished through software, minimizing the complexity of the hardware feature extractor.
NASA Astrophysics Data System (ADS)
Zan, Tao; Wang, Min; Hu, Jianzhong
2010-12-01
Multi-sensor machining status monitoring acquires and analyzes machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones by statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are discussed. An approach that integrates multi-sensor status monitoring and statistical process control, based on artificial intelligence, internet, and database techniques, is then put forward. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and the AE signal information from the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.
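As a minimal illustration of the statistical process control side, the sketch below computes individuals-chart control limits from a monitored quality characteristic and flags out-of-control readings; the data and the choice of an I-MR chart are assumptions, not the system's actual implementation.

```python
import numpy as np

def individuals_chart_limits(samples):
    """Control limits for an individuals (I) chart using the moving range.
    The 1.128 constant is the usual d2 value for a moving range of size 2."""
    samples = np.asarray(samples, dtype=float)
    centre = samples.mean()
    mr_bar = np.abs(np.diff(samples)).mean()
    sigma_hat = mr_bar / 1.128
    return centre - 3 * sigma_hat, centre, centre + 3 * sigma_hat

# Example: a monitored quality characteristic (e.g. ground surface roughness, µm).
readings = [0.42, 0.44, 0.41, 0.43, 0.45, 0.44, 0.58, 0.43, 0.42, 0.44]
lcl, cl, ucl = individuals_chart_limits(readings)
out_of_control = [(i, x) for i, x in enumerate(readings) if not lcl <= x <= ucl]
print(f"LCL={lcl:.3f}, CL={cl:.3f}, UCL={ucl:.3f}, flagged={out_of_control}")
```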
Sandino, Juan; Wooler, Adam; Gonzalez, Felipe
2017-09-24
The increased technological developments in Unmanned Aerial Vehicles (UAVs), combined with artificial intelligence and Machine Learning (ML) approaches, have opened the possibility of remote sensing of extensive areas of arid lands. In this paper, a novel approach towards the detection of termite mounds using a UAV, hyperspectral imagery, ML, and digital image processing is presented. A new pipeline process is proposed to detect termite mounds automatically and, consequently, to reduce detection times. For the classification stage, the outcomes of several ML classification algorithms were studied, and support vector machines were selected as the best approach for classifying pre-existing termite mounds in the imagery. Various test conditions were applied to the proposed algorithm, obtaining an overall accuracy of 68%. Images with satisfactory mound detection showed that the method is "resolution-dependent". Mounds were detected regardless of their rotation and position in the aerial image. However, image distortion reduced the number of detected mounds due to the inclusion of a shape analysis method in the object detection phase, and image resolution remains a determining factor in obtaining accurate results. Hyperspectral imagery demonstrated a better capability to classify a large set of materials than traditional segmentation methods applied to RGB images alone.
A New Mathematical Framework for Design Under Uncertainty
2016-05-05
blending multiple information sources via auto-regressive stochastic modeling. A computationally efficient machine learning framework is developed based on ...sion and machine learning approaches; see Fig. 1. This will lead to a comprehensive description of system performance with less uncertainty than in the ... Bayesian optimization of super-cavitating hydrofoils. The goal of this study is to demonstrate the capabilities of statistical learning and ...
University NanoSat Program: AggieSat3
2009-06-01
commercially available product for stereo machine vision developed by Point Grey Research. The current binocular BumbleBee2® system incorporates two ... and Fellow of the American Society of Mechanical Engineers (ASME) in 1997. She was awarded the 2007 J. Leland "Lee" Atwood Award from the ASEE ... AggieSat2 satellite programs. Additional experience gained in the area of drawing standards, machining capabilities, solid modeling, safety ...
Machinability of nickel based alloys using electrical discharge machining process
NASA Astrophysics Data System (ADS)
Khan, M. Adam; Gokul, A. K.; Bharani Dharan, M. P.; Jeevakarthikeyan, R. V. S.; Uthayakumar, M.; Thirumalai Kumaran, S.; Duraiselvam, M.
2018-04-01
High temperature materials such as nickel based alloys and austenitic steels are frequently used for manufacturing critical aero engine turbine components. Literature on conventional and unconventional machining of steels has been abundant over the past three decades. However, machining studies on superalloys remain a challenging task due to their inherent properties, and these materials are difficult to cut using conventional processes. This research therefore focuses on an unconventional machining process for nickel alloys. Inconel 718 and Monel 400 are the two candidate materials used for the electrical discharge machining (EDM) process. The investigation involves preparing a blind hole using a copper electrode of 6 mm diameter. The electrical parameters are varied to produce the plasma spark for the diffusion process, and the machining time is kept constant so that the experimental results for both materials can be compared. The influence of the process parameters on the tool wear mechanism and material removal is considered in the proposed experimental design. During machining, the tool is prone to discharging more material due to the production of a high energy plasma spark and the eddy current effect. The surface morphology of the machined surface was observed with a high resolution FE-SEM. Fused electrode material was found as spherical clumps over the machined surface. Surface roughness was also measured from the surface profile using a profilometer. It is confirmed that there is no deviation and that the precise roundness of the drilled hole is maintained.
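Since the machining time was held constant, material removal for the two alloys can be compared through the material removal rate. A small helper using the usual weight-loss calculation is sketched below; the masses and times are illustrative, while the densities are standard handbook values for Inconel 718 and Monel 400.

```python
def material_removal_rate(mass_before_g, mass_after_g, density_g_cm3, time_min):
    """MRR in mm^3/min from workpiece mass loss (the usual weight-loss method)."""
    volume_mm3 = (mass_before_g - mass_after_g) / density_g_cm3 * 1000.0
    return volume_mm3 / time_min

# Illustrative numbers only; densities: Inconel 718 ~ 8.19 g/cm^3, Monel 400 ~ 8.80 g/cm^3.
print(material_removal_rate(152.430, 152.310, 8.19, 30.0))   # Inconel 718 blind hole
print(material_removal_rate(160.250, 160.110, 8.80, 30.0))   # Monel 400 blind hole
```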
NASA Astrophysics Data System (ADS)
Sharif, Safian; Sadiq, Ibrahim Ogu; Suhaimi, Mohd Azlan; Rahim, Shayfull Zamree Abd
2017-09-01
Pollution-related activities, in addition to the handling cost of conventional cutting fluid application in the metal cutting industry, have generated considerable concern over time. The desire for a green machining environment, which preserves the environment by reducing or eliminating machining-related pollution, reduces oil consumption, and protects machine operators without compromising machining efficiency, has led to a search for alternatives to conventional cutting fluid. Among the alternatives of dry machining, cryogenic cooling, high-pressure cooling, and near-dry or minimum quantity lubrication (MQL), MQL has shown remarkable performance in terms of cost, machining output, and the safety of the environment and machine operators. However, MQL under aggressive or very high speed machining poses certain restrictions, as the lubrication media cannot perform efficiently at elevated temperatures. To compensate for these shortcomings of the MQL technique, high thermal conductivity nanoparticles are introduced into cutting fluids for use in MQL lubrication. They have shown enhanced machining performance and a significant reduction of the load on the environment. The present work evaluates the application and performance of nanofluids in metal cutting through the MQL lubrication technique, highlighting their impacts and prospects as a lubrication strategy for sustainable green manufacturing. The enhanced performance of vegetable oil based nanofluids over mineral oil based nanofluids has been reported and is thus highlighted.
Process Monitoring Evaluation and Implementation for the Wood Abrasive Machining Process
Saloni, Daniel E.; Lemaster, Richard L.; Jackson, Steven D.
2010-01-01
Wood processing industries have continuously developed and improved technologies and processes to transform wood to obtain better final product quality and thus increase profits. Abrasive machining is one of the most important of these processes and therefore merits special attention and study. The objective of this work was to evaluate and demonstrate a process monitoring system for use in the abrasive machining of wood and wood based products. The system developed increases the life of the belt by detecting (using process monitoring sensors) and removing (by cleaning) the abrasive loading during the machining process. This study focused on abrasive belt machining processes and included substantial background work, which provided a solid base for understanding the behavior of the abrasive, and the different ways that the abrasive machining process can be monitored. In addition, the background research showed that abrasive belts can effectively be cleaned by the appropriate cleaning technique. The process monitoring system developed included acoustic emission sensors which tended to be sensitive to belt wear, as well as platen vibration, but not loading, and optical sensors which were sensitive to abrasive loading. PMID:22163477
New Single Piece Blast Hardware design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ulrich, Andri; Steinzig, Michael Louis; Aragon, Daniel Adrian
W, Q and PF engineers and machinists designed and fabricated, on the new Mazak i300, the first Single Piece Blast Hardware (unclassified design shown), reducing fabrication and inspection time by over 50%. The first DU single piece is completed and will be used for Hydro Test 3680. Past hydro tests used a two-piece assembly due to a lack of equipment capable of machining the complex saddle shape in a single piece. The i300 provides turning and 5-axis milling on one machine. The milling head on the i300 can machine past 90 degrees relative to the spindle axis. This makes it possible to machine the complex saddle surface on a single piece. Going to a single piece eliminates tolerance problems, such as tilting and eccentricity, that typically occurred when assembling the two pieces together.
Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z
2009-05-01
Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on their structure-derived structural and physicochemical properties. Increasing attention has been directed at these methods because of their capability in predicting compounds of diverse structures and complex structure-activity relationships without requiring the knowledge of target 3D structure. This article reviews current progresses in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility to improve the performance of machine learning methods in screening large libraries is discussed.
Compact Microscope Imaging System Developed
NASA Technical Reports Server (NTRS)
McDowell, Mark
2001-01-01
The Compact Microscope Imaging System (CMIS) is a diagnostic tool with intelligent controls for use in space, industrial, medical, and security applications. The CMIS can be used in situ with a minimum amount of user intervention. This system, which was developed at the NASA Glenn Research Center, can scan, find areas of interest, focus, and acquire images automatically. Large numbers of multiple cell experiments require microscopy for in situ observations; this is only feasible with compact microscope systems. CMIS is a miniature machine vision system that combines intelligent image processing with remote control capabilities. The software also has a user-friendly interface that can be used independently of the hardware for post-experiment analysis. CMIS has potential commercial uses in the automated online inspection of precision parts, medical imaging, security industry (examination of currency in automated teller machines and fingerprint identification in secure entry locks), environmental industry (automated examination of soil/water samples), biomedical field (automated blood/cell analysis), and microscopy community. CMIS will improve research in several ways: It will expand the capabilities of MSD experiments utilizing microscope technology. It may be used in lunar and Martian experiments (Rover Robot). Because of its reduced size, it will enable experiments that were not feasible previously. It may be incorporated into existing shuttle orbiter and space station experiments, including glove-box-sized experiments as well as ground-based experiments.
NASA Astrophysics Data System (ADS)
Wilson, S. A.; Jourdain, R. P.; Owens, S.
2010-09-01
The projected force-displacement capability of piezoelectric ceramic films in the 20-50 µm thickness range suggests that they are well suited to many micro-fluidic and micro-pneumatic applications. Furthermore when they are configured as bending actuators and operated at ~1 V µm⁻¹ they do not necessarily conform to the high-voltage, very low-displacement piezoelectric stereotype. Even so they are rarely found today in commercial micro-electromechanical devices, such as micro-pumps and micro-valves, and the main barriers to making them much more widely available would appear to be processing incompatibilities rather than commercial desirability. In particular, the issues associated with integration of these devices into MEMS at the production level are highly significant and they have perhaps received less attention in the mainstream than they deserve. This paper describes a fabrication route based on ultra-precision ceramic machining and full-wafer bonding for cost-effective batch scale production of thick film PZT bimorph micro-actuators and their integration with MEMS. The resulting actuators are pre-stressed (ceramic in compression) which gives them added performance, they are true bimorphs with bi-directional capability and they exhibit full bulk piezoelectric ceramic properties. The devices are designed to integrate with ancillary systems components using transfer-bonding techniques. The work forms part of the European Framework 6 Project 'Q2M—Quality to Micro'.
A distributed pipeline for DIDSON data processing
Li, Liling; Danner, Tyler; Eickholt, Jesse; McCann, Erin L.; Pangle, Kevin; Johnson, Nicholas
2018-01-01
Technological advances in the field of ecology allow data on ecological systems to be collected at high resolution, both temporally and spatially. Devices such as Dual-frequency Identification Sonar (DIDSON) can be deployed in aquatic environments for extended periods and easily generate several terabytes of underwater surveillance data which may need to be processed multiple times. Due to the large amount of data generated and need for flexibility in processing, a distributed pipeline was constructed for DIDSON data making use of the Hadoop ecosystem. The pipeline is capable of ingesting raw DIDSON data, transforming the acoustic data to images, filtering the images, detecting and extracting motion, and generating feature data for machine learning and classification. All of the tasks in the pipeline can be run in parallel and the framework allows for custom processing. Applications of the pipeline include monitoring migration times, determining the presence of a particular species, estimating population size and other fishery management tasks.
Challenges in Special Steel Making
NASA Astrophysics Data System (ADS)
Balachandran, G.
2018-02-01
Special bar quality (SBQ) steel is a long steel product for which an assured quality is delivered by the steel mill to its customer. The bars have enhanced tolerance to higher stress applications and are demanded for specialised component making. SBQ bars are sought by component-making processing units for operations such as closed die hot forging, hot extrusion, cold forging, machining, heat treatment, and welding. The final component quality achieved by the secondary processing units depends on the quality maintained at the steel maker's end as well as at the fabricator's end. Thus, quality control is ensured at every unit process stage. The various market segments catered to by SBQ steel are ever growing and are reviewed. Steel mills need adequate infrastructure and technological capability to make these higher quality steels. Some of the critical stages of SBQ processing and the critical quality maintenance parameters at the steel mill are brought out.
Optimal nonlinear information processing capacity in delay-based reservoir computers
NASA Astrophysics Data System (ADS)
Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo
2015-09-01
Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of the associated training scheme, but also for the problematic sensitivity of its performance to the architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge in the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scannings used so far in the literature.
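A software analogue of the reservoir computing scheme discussed (an echo-state-style network rather than a physical time-delay implementation) is sketched below; only the linear readout is trained, as is characteristic of the paradigm. The reservoir size, leak rate, spectral radius, and task are illustrative choices.

```python
import numpy as np

# Minimal echo-state-style reservoir learning one-step-ahead prediction of a
# noisy sine wave; only the linear readout is trained (ridge regression).
rng = np.random.default_rng(7)
n_res, leak, spectral_radius = 200, 0.3, 0.9

W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))   # scale reservoir weights

t = np.arange(2000)
u = np.sin(0.1 * t) + 0.05 * rng.normal(size=t.size)          # input signal
target = np.roll(u, -1)                                       # next-sample target

# Drive the reservoir and collect its states.
x = np.zeros(n_res)
states = np.zeros((t.size, n_res))
for k in range(t.size):
    x = (1 - leak) * x + leak * np.tanh(W_in * u[k] + W @ x)
    states[k] = x

washout = 100
A, y = states[washout:-2], target[washout:-2]                 # training data
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ y)

pred = states[-2] @ W_out                                     # predict the last sample
print(f"predicted: {pred:.3f}, true: {u[-1]:.3f}")
```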
CNC Machining Of The Complex Copper Electrodes
NASA Astrophysics Data System (ADS)
Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina
2015-07-01
This paper presents the machining process for complex copper electrodes. Machining complex shapes in copper is difficult because this material is soft and sticky. This research presents the main steps for processing these copper electrodes with high dimensional accuracy and good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for the accurate CNC equipment, using smart CAD/CAM software.
CNN universal machine as classificaton platform: an art-like clustering algorithm.
Bálya, David
2003-12-01
Fast and robust classification of feature vectors is a crucial task in a number of real-time systems. A cellular neural/nonlinear network universal machine (CNN-UM) can be very efficient as a feature detector. The next step is to post-process the results for object recognition. This paper shows how a robust classification scheme based on adaptive resonance theory (ART) can be mapped to the CNN-UM. Moreover, this mapping is general enough to include different types of feed-forward neural networks. The designed analogic CNN algorithm is capable of classifying the extracted feature vectors while keeping the advantages of ART networks, such as robust, plastic, and fault-tolerant behavior. An analogic algorithm is presented for unsupervised classification with tunable sensitivity and automatic new class creation. The algorithm is then extended to supervised classification. The presented binary feature vector classification is implemented on existing standard CNN-UM chips for fast classification. The experimental evaluation shows promising performance, with 100% accuracy on the training set.
Machine learning for real time remote detection
NASA Astrophysics Data System (ADS)
Labbé, Benjamin; Fournier, Jérôme; Henaff, Gilles; Bascle, Bénédicte; Canu, Stéphane
2010-10-01
Infrared systems are key to providing enhanced capability to military forces such as automatic control of threats and prevention of air, naval, and ground attacks. Key requirements for such a system to produce operational benefits are real-time processing as well as high efficiency in terms of detection and false alarm rate. These are serious issues since the system must deal with a large number of objects and categories to be recognized (small vehicles, armored vehicles, planes, buildings, etc.). Statistical learning based algorithms are promising candidates to meet these requirements when using selected discriminant features and a real-time implementation. This paper proposes a new decision architecture benefiting from recent advances in machine learning by using an effective method for level set estimation. While building the decision function, the proposed approach performs variable selection based on a discriminative criterion. Moreover, the use of level sets makes it possible to manage the rejection of unknown or ambiguous objects, thus preserving the false alarm rate. Experimental evidence reported on real-world infrared images demonstrates the validity of our approach.
An experimental study of the putative mechanism of a synthetic autonomous rotary DNA nanomotor
NASA Astrophysics Data System (ADS)
Dunn, K. E.; Leake, M. C.; Wollman, A. J. M.; Trefzer, M. A.; Johnson, S.; Tyrrell, A. M.
2017-03-01
DNA has been used to construct a wide variety of nanoscale molecular devices. Inspiration for such synthetic molecular machines is frequently drawn from protein motors, which are naturally occurring and ubiquitous. However, despite the fact that rotary motors such as ATP synthase and the bacterial flagellar motor play extremely important roles in nature, very few rotary devices have been constructed using DNA. This paper describes an experimental study of the putative mechanism of a rotary DNA nanomotor, which is based on strand displacement, the phenomenon that powers many synthetic linear DNA motors. Unlike other examples of rotary DNA machines, the device described here is designed to be capable of autonomous operation after it is triggered. The experimental results are consistent with operation of the motor as expected, and future work on an enhanced motor design may allow rotation to be observed at the single-molecule level. The rotary motor concept presented here has potential applications in molecular processing, DNA computing, biosensing and photonics.
Kusne, Aaron Gilad; Gao, Tieren; Mehta, Apurva; Ke, Liqin; Nguyen, Manh Cuong; Ho, Kai-Ming; Antropov, Vladimir; Wang, Cai-Zhuang; Kramer, Matthew J.; Long, Christian; Takeuchi, Ichiro
2014-01-01
Advanced materials characterization techniques with ever-growing data acquisition speed and storage capabilities represent a challenge in modern materials science, and new procedures to quickly assess and analyze the data are needed. Machine learning approaches are effective in reducing the complexity of data and rapidly homing in on the underlying trend in multi-dimensional data. Here, we show that by employing an algorithm called the mean shift theory to a large amount of diffraction data in high-throughput experimentation, one can streamline the process of delineating the structural evolution across compositional variations mapped on combinatorial libraries with minimal computational cost. Data collected at a synchrotron beamline are analyzed on the fly, and by integrating experimental data with the inorganic crystal structure database (ICSD), we can substantially enhance the accuracy in classifying the structural phases across ternary phase spaces. We have used this approach to identify a novel magnetic phase with enhanced magnetic anisotropy which is a candidate for rare-earth free permanent magnet. PMID:25220062
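As a rough, purely illustrative sketch of the mean-shift step described above, not the authors' on-the-fly synchrotron pipeline and using hypothetical diffraction feature vectors, scikit-learn's MeanShift can group similar patterns without specifying the number of phases in advance:

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

# Hypothetical rows = diffraction patterns (e.g. binned intensity vs. angle),
# one row per composition point on a combinatorial library.
rng = np.random.default_rng(1)
patterns = np.vstack([rng.normal(mu, 0.05, (40, 60)) for mu in (0.2, 0.5, 0.8)])

bandwidth = estimate_bandwidth(patterns, quantile=0.2)
labels = MeanShift(bandwidth=bandwidth).fit_predict(patterns)

# Each cluster label corresponds to a putative structural phase region.
print(np.unique(labels, return_counts=True))
```

In practice the cluster assignments would then be checked against reference entries (e.g. the ICSD), as the abstract describes, to name the phases.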
Apparatus to collect, classify, concentrate, and characterize gas-borne particles
Rader, Daniel J.; Torczynski, John R.; Wally, Karl; Brockmann, John E.
2002-01-01
An aerosol lab-on-a-chip (ALOC) integrates one or more of a variety of aerosol collection, classification, concentration (enrichment), and characterization processes onto a single substrate or layered stack of such substrates. By taking advantage of modern micro-machining capabilities, an entire suite of discrete laboratory aerosol handling and characterization techniques can be combined in a single portable device that can provide a wealth of data on the aerosol being sampled. The ALOC offers parallel characterization techniques, and the close proximity of the various characterization modules helps ensure that the same aerosol is available to all devices (dramatically reducing sampling and transport errors). Micro-machine fabrication of the ALOC significantly reduces unit costs relative to existing technology, and enables the fabrication of small, portable ALOC devices, as well as the potential for rugged design to allow operation in harsh environments. Miniaturization also offers the potential of working with smaller particle sizes and lower pressure drops (leading to reduction of power consumption).
Making medicine a business in Japan: Shimadzu Co. and the diffusion of radiology (1900-1960).
Donzé, Pierre-Yves
2010-01-01
This contribution focuses on the role of the firm Shimadzu in the marketing of X-ray machines in Japan during the first part of the 20th century, viewed from a business history perspective. It attempts to further understanding of the process of technology diffusion in medicine. In a global market controlled by American and German multinational enterprises, Japan appears to have been a particular country, where a domestic independent firm, Shimadzu, succeeded in establishing itself as a competitive company. This success is the result of a strategy based on both the internalisation of technological capabilities (recruitment of university graduate engineers, subcontracting of research and development activities) and an original communication policy towards the medical world. Finally, the specific structure of the Japanese medical market, composed of numerous and largely privatised small healthcare centres, facilitated the rapid diffusion of X-ray machines, a new technology which conferred a comparative advantage on its holders.
Dissolvable films of silk fibroin for ultrathin conformal bio-integrated electronics.
Kim, Dae-Hyeong; Viventi, Jonathan; Amsden, Jason J; Xiao, Jianliang; Vigeland, Leif; Kim, Yun-Soung; Blanco, Justin A; Panilaitis, Bruce; Frechette, Eric S; Contreras, Diego; Kaplan, David L; Omenetto, Fiorenzo G; Huang, Yonggang; Hwang, Keh-Chih; Zakin, Mitchell R; Litt, Brian; Rogers, John A
2010-06-01
Electronics that are capable of intimate, non-invasive integration with the soft, curvilinear surfaces of biological tissues offer important opportunities for diagnosing and treating disease and for improving brain/machine interfaces. This article describes a material strategy for a type of bio-interfaced system that relies on ultrathin electronics supported by bioresorbable substrates of silk fibroin. Mounting such devices on tissue and then allowing the silk to dissolve and resorb initiates a spontaneous, conformal wrapping process driven by capillary forces at the biotic/abiotic interface. Specialized mesh designs and ultrathin forms for the electronics ensure minimal stresses on the tissue and highly conformal coverage, even for complex curvilinear surfaces, as confirmed by experimental and theoretical studies. In vivo, neural mapping experiments on feline animal models illustrate one mode of use for this class of technology. These concepts provide new capabilities for implantable and surgical devices.
Adaptive control of nonlinear system using online error minimum neural networks.
Jia, Chao; Li, Xiaoli; Wang, Kang; Ding, Dawei
2016-11-01
In this paper, a new learning algorithm named OEM-ELM (Online Error Minimized-ELM) is proposed, based on the ELM (Extreme Learning Machine) neural network algorithm and the growth of its hidden-layer structure. The core ideas of the OEM-ELM algorithm are online learning, evaluation of network performance, and incremental addition of hidden nodes. It combines the advantages of OS-ELM and EM-ELM, improving identification capability while avoiding network redundancy. An adaptive controller based on the proposed OEM-ELM algorithm is set up, which has a stronger adaptive capability to changes in the environment. The adaptive control of a chemical process, the Continuous Stirred Tank Reactor (CSTR), is also given as an application. The simulation results show that, compared with the traditional ELM algorithm, the proposed algorithm can avoid network redundancy and greatly improve control performance. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
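For context only, the core ELM step that OEM-ELM builds on (a random hidden layer with analytically computed output weights) can be sketched in a few lines of Python; this is a generic batch ELM, not the online, growing OEM-ELM variant itself, and the network size and data are hypothetical.

```python
import numpy as np

def train_elm(X, T, n_hidden=20, rng=None):
    """Basic Extreme Learning Machine: random hidden layer, least-squares output weights."""
    rng = rng or np.random.default_rng(0)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                  # random hidden biases
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                   # analytic output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical plant data: identify y = f(x) from samples.
X = np.linspace(-1, 1, 200).reshape(-1, 1)
T = np.sin(3 * X)
W, b, beta = train_elm(X, T)
print(np.abs(predict_elm(X, W, b, beta) - T).max())
```

OEM-ELM, as described above, would additionally update these weights online as new samples arrive and add hidden nodes only when the evaluated network error warrants it.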
Dissolvable Films of Silk Fibroin for Ultrathin, Conformal Bio-Integrated Electronics
Kim, Dae-Hyeong; Viventi, Jonathan; Amsden, Jason J.; Xiao, Jianliang; Vigeland, Leif; Kim, Yun-Soung; Blanco, Justin A.; Panilaitis, Bruce; Frechette, Eric S.; Contreras, Diego; Kaplan, David L.; Omenetto, Fiorenzo G.; Huang, Yonggang; Hwang, Keh-Chih; Zakin, Mitchell R.; Litt, Brian; Rogers, John A.
2011-01-01
Electronics that are capable of intimate, non-invasive integration with the soft, curvilinear surfaces of biological tissues offer important opportunities for diagnosing and treating disease and for improving brain-machine interfaces. This paper describes a material strategy for a type of bio-interfaced system that relies on ultrathin electronics supported by bioresorbable substrates of silk fibroin. Mounting such devices on tissue and then allowing the silk to dissolve and resorb initiates a spontaneous, conformal wrapping process driven by capillary forces at the biotic/abiotic interface. Specialized mesh designs and ultrathin forms for the electronics ensure minimal stresses on the tissue and highly conformal coverage, even for complex curvilinear surfaces, as confirmed by experimental and theoretical studies. In vivo, neural mapping experiments on feline animal models illustrate one mode of use for this class of technology. These concepts provide new capabilities for implantable or surgical devices. PMID:20400953
Transitioning NWChem to the Next Generation of Manycore Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bylaska, Eric J.; Apra, Edoardo; Kowalski, Karol
The NorthWest Chemistry (NWChem) modeling software is a popular molecular chemistry simulation software that was designed from the start to work on massively parallel processing supercomputers [6, 28, 49]. It contains an umbrella of modules that today includes Self Consistent Field (SCF), second order Møller-Plesset perturbation theory (MP2), Coupled Cluster, multi-configuration self-consistent field (MCSCF), selected configuration interaction (CI), tensor contraction engine (TCE) many body methods, density functional theory (DFT), time-dependent density functional theory (TDDFT), real time time-dependent density functional theory, pseudopotential plane-wave density functional theory (PSPW), band structure (BAND), ab initio molecular dynamics, Car-Parrinello molecular dynamics, classical molecular dynamics (MD), QM/MM, AIMD/MM, GIAO NMR, COSMO, COSMO-SMD, and RISM solvation models, free energy simulations, reaction path optimization, parallel in time, among other capabilities [22]. Moreover, new capabilities continue to be added with each new release.
Man-machine interface and control of the shuttle digital flight system
NASA Technical Reports Server (NTRS)
Burghduff, R. D.; Lewis, J. L., Jr.
1985-01-01
The space shuttle main engine (SSME) presented new requirements in the design of controls for large pump fed liquid rocket engine systems. These requirements were the need for built in full mission support capability, and complexity and flexibility of function not previously needed in this type of application. An engine mounted programmable digital control system was developed to meet these requirements. The engine system and controller and their function are described. Design challenges encountered during the course of development included accommodation for a very severe engine environment, the implementation of redundancy and redundancy management to provide fail operational/fail safe capability, removal of heat from the package, and significant constraints on computer memory size and processing time. The flexibility offered by programmable control reshaped the approach to engine design and development and set the pattern for future controls development in these types of applications.
Introducing PLIA: Planetary Laboratory for Image Analysis
NASA Astrophysics Data System (ADS)
Peralta, J.; Hueso, R.; Barrado, N.; Sánchez-Lavega, A.
2005-08-01
We present a graphical software tool developed under IDL software to navigate, process and analyze planetary images. The software has a complete Graphical User Interface and is cross-platform. It can also run under the IDL Virtual Machine without the need to own an IDL license. The set of tools included allow image navigation (orientation, centring and automatic limb determination), dynamical and photometric atmospheric measurements (winds and cloud albedos), cylindrical and polar projections, as well as image treatment under several procedures. Being written in IDL, it is modular and easy to modify and grow for adding new capabilities. We show several examples of the software capabilities with Galileo-Venus observations: Image navigation, photometrical corrections, wind profiles obtained by cloud tracking, cylindrical projections and cloud photometric measurements. Acknowledgements: This work has been funded by Spanish MCYT PNAYA2003-03216, fondos FEDER and Grupos UPV 15946/2004. R. Hueso acknowledges a post-doc fellowship from Gobierno Vasco.
Nondestructive Evaluation Methodologies Developed for Certifying Composite Flywheels
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Konno, Kevin E.; Martin, Richard E.; Thompson, Richard
2001-01-01
Manufacturing readiness of composite rotors and certification of flywheels depend in part on the maturity of nondestructive evaluation (NDE) technology for process optimization and quality assurance, respectively. At the NASA Glenn Research Center, the capabilities and limitations of x-ray-computed tomography and radiography, as well as advanced ultrasonics were established on NDE ring and rotor standards with electrical discharge machining (EDM) notches and drilled holes. Also, intentionally seeded delamination, tow break, and insert of bagging material were introduced in hydroburst-rings to study the NDE detection capabilities of such anomalies and their effect on the damage tolerance and safe life margins of subscale rings and rotors. Examples of possible occurring flaws or anomalies in composite rings as detected by NDE and validated by destructive metallography are shown. The general NDE approach to ensure the quality of composite rotors and to help in the certification of flywheels is briefly outlined.
NASA Technical Reports Server (NTRS)
Birisan, Mihnea; Beling, Peter
2011-01-01
New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
NASA Technical Reports Server (NTRS)
Rubbert, P. E.
1978-01-01
The commercial airplane builder's viewpoint on the important issues involved in the development of improved computational aerodynamics tools, such as powerful computers optimized for fluid flow problems, is presented. The primary user of computational aerodynamics in a commercial aircraft company is the design engineer, who is concerned with solving practical engineering problems. From his viewpoint, the development of program interfaces and pre- and post-processing capability for new computational methods is just as important as the algorithms and machine architecture. As more and more details of the entire flow field are computed, the visibility of the output data becomes a major problem, which is then doubled when a design capability is added. The user must be able to see, understand, and interpret the results calculated. Enormous costs are expended because of the need to work with programs having only primitive user interfaces.
NASA Technical Reports Server (NTRS)
1981-01-01
The goals in this program for an advanced Czochralski growth process to produce low cost 150 kg silicon ingots from a single crucible for technology readiness are outlined. The aim is to provide a modified CG2000 crystal grower capable of pulling a minimum of five crystals, each of approximately 30 kg in weight and 150 mm diameter, from a single crucible with periodic melt replenishment. Crystals are to have: resistivity of 1 to 3 ohm cm, p-type; dislocation density below 10^6 per cm; orientation (100); after-growth yield of greater than 90%. Growth throughput is to exceed 2.5 kg per hour of machine operation using a radiation shield, with prototype equipment suitable for use as a production facility. The overall cost goal is $.70 per peak watt by 1986. To accomplish these goals, the modified CG2000 grower and development program includes: (1) increased automation with a microprocessor based control system; (2) sensor development which will increase the capability of the automatic control system, and provide technology transfer of the developed systems.
Conversion of LARSYS III.1 to an IBM 370 computer
NASA Technical Reports Server (NTRS)
Williams, G. N.; Leggett, J.; Hascall, G. A.
1975-01-01
A software system for processing multispectral aircraft or satellite data (LARSYS) was designed and written at the Laboratory for Applications of Remote Sensing at Purdue University. This system, being implemented on an IBM 360/67 computer utilizing the Cambridge Monitor System, is of an interactive nature. TAMU LARSYS maintains the essential capabilities of Purdue's LARSYS. The machine configuration for which it has been converted is an IBM-compatible Amdahl 470V/6 computer utilizing the time sharing option of the currently implemented OS/VS2 Operating System. Due to TSO limitations, the NASA-JSC deliverable TAMU LARSYS is comprised of two parts. Part one is a TSO Control Card Checker for LARSYS control cards, and part two is a batch version of LARSYS. Used together, they afford most of the capabilities of the original LARSYS III.1. Additionally, two programs have been written by TAMU to support LARSYS processing. The first is an ERTS-to-MIST conversion program used to convert ERTS data to the LARSYS input form, the MIST tape. The second is a system runtable code which maintains tape/file location information for the MIST data sets.
Two-dimensional nonsteady viscous flow simulation on the Navier-Stokes computer miniNode
NASA Technical Reports Server (NTRS)
Nosenchuck, Daniel M.; Littman, Michael G.; Flannery, William
1986-01-01
The needs of large-scale scientific computation are outpacing the growth in performance of mainframe supercomputers. In particular, problems in fluid mechanics involving complex flow simulations require far more speed and capacity than that provided by current and proposed Class VI supercomputers. To address this concern, the Navier-Stokes Computer (NSC) was developed. The NSC is a parallel-processing machine, comprised of individual Nodes, each comparable in performance to current supercomputers. The global architecture is that of a hypercube, and a 128-Node NSC has been designed. New architectural features, such as a reconfigurable many-function ALU pipeline and a multifunction memory-ALU switch, have provided the capability to efficiently implement a wide range of algorithms. Efficient algorithms typically involve numerically intensive tasks, which often include conditional operations. These operations may be efficiently implemented on the NSC without, in general, sacrificing vector-processing speed. To illustrate the architecture, programming, and several of the capabilities of the NSC, the simulation of two-dimensional, nonsteady viscous flows on a prototype Node, called the miniNode, is presented.
Machine Learning, deep learning and optimization in computer vision
NASA Astrophysics Data System (ADS)
Canu, Stéphane
2017-03-01
As quoted in the Large Scale Computer Vision Systems NIPS workshop, computer vision is a mature field with a long tradition of research, but recent advances in machine learning, deep learning, representation learning and optimization have provided models with new capabilities to better understand visual content. The presentation will go through these new developments in machine learning covering basic motivations, ideas, models and optimization in deep learning for computer vision, identifying challenges and opportunities. It will focus on issues related with large scale learning that is: high dimensional features, large variety of visual classes, and large number of examples.
Computer-aided design studies of the homopolar linear synchronous motor
NASA Astrophysics Data System (ADS)
Dawson, G. E.; Eastham, A. R.; Ong, R.
1984-09-01
The linear induction motor (LIM), as an urban transit drive, can provide good grade-climbing capabilities and propulsion/braking performance that is independent of steel wheel-rail adhesion. In view of its 10-12 mm airgap, the LIM is characterized by a low power factor-efficiency product of order 0.4. A synchronous machine offers high efficiency and controllable power factor. An assessment of the linear homopolar configuration of this machine is presented as an alternative to the LIM. Computer-aided design studies using the finite element technique have been conducted to identify a suitable machine design for urban transit propulsion.
Continuous performance measurement in flight systems. [sequential control model
NASA Technical Reports Server (NTRS)
Connelly, E. M.; Sloan, N. A.; Zeskind, R. M.
1975-01-01
The desired response of many man machine control systems can be formulated as a solution to an optimal control synthesis problem where the cost index is given and the resulting optimal trajectories correspond to the desired trajectories of the man machine system. Optimal control synthesis provides the reference criteria and the significance of error information required for performance measurement. The synthesis procedure described provides a continuous performance measure (CPM) which is independent of the mechanism generating the control action. Therefore, the technique provides a meaningful method for online evaluation of man's control capability in terms of total man machine performance.
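In generic terms (the abstract does not give the specific cost index), the synthesis problem referred to above takes the familiar form of minimizing an integral cost over the trajectory, and the continuous performance measure scores the operator's actual trajectory against the optimal one. A standard quadratic form, shown only as an assumed illustration, is:

```latex
J = \phi\big(x(T)\big)
  + \int_{0}^{T} \Big[ \big(x(t)-x^{*}(t)\big)^{\mathsf T} Q \,\big(x(t)-x^{*}(t)\big)
  + u(t)^{\mathsf T} R \, u(t) \Big] \, dt
```

where x*(t) is the optimal (reference) trajectory, u(t) the control input, and Q, R are weighting matrices. The online performance measure is then derived from the accumulated cost attributable to deviations from x*(t), independently of the mechanism that generated the control action, which is the property the abstract emphasizes.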
Proceedings of the 8th Annual Conference on Manual Control
NASA Technical Reports Server (NTRS)
Pew, R. W.
1972-01-01
The volume presents recent developments in the field of manual control theory and applications. The papers give analytical methods as well as examples of the important interplay between man and machine, such as how man controls and stabilizes machine dynamics, and how machines extend man's capability. Included in the broad range of subjects are procedures to evaluate and identify display systems, controllers, manipulators, human operators, aircraft, and non-flying vehicles. Of particular interest is the continuing trend of applying control theory to problems in medicine and psychology, as well as to problems in vehicle control.
Programmable Pulse-Position-Modulation Encoder
NASA Technical Reports Server (NTRS)
Zhu, David; Farr, William
2006-01-01
A programmable pulse-position-modulation (PPM) encoder has been designed for use in testing an optical communication link. The encoder includes a programmable state machine and an electronic code book that can be updated to accommodate different PPM coding schemes. The encoder includes a field-programmable gate array (FPGA) that is programmed to step through the stored state machine and code book and that drives a custom high-speed serializer circuit board that is capable of generating subnanosecond pulses. The stored state machine and code book can be updated by means of a simple text interface through the serial port of a personal computer.
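The mapping performed by such an encoder is simple to state: each k-bit symbol selects one of 2^k time slots in which a single pulse is emitted. A minimal software model of that mapping, purely illustrative and not the FPGA code book or serializer described above, is:

```python
def ppm_encode(bits, k=4):
    """Map each k-bit group to a frame of 2**k slots containing one pulse."""
    assert len(bits) % k == 0, "bit stream must be a whole number of symbols"
    frames = []
    for i in range(0, len(bits), k):
        symbol = int("".join(str(b) for b in bits[i:i + k]), 2)  # slot index
        frame = [0] * (2 ** k)
        frame[symbol] = 1                                        # single pulse per frame
        frames.extend(frame)
    return frames

# 8 bits -> two 16-slot PPM frames, one pulse in each frame.
print(ppm_encode([1, 0, 1, 1, 0, 0, 1, 0]))
```

A programmable code book, as in the encoder above, would simply replace the direct binary-to-slot mapping with a stored lookup table that can be rewritten for different PPM coding schemes.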
2013-01-01
the ACSW was meant to be the future replacement for both the M2 .50-Caliber machine gun and the MK19 grenade launcher. As with the OICW, the... of the legacy M2 machine gun, so that the 'requirement creep' – defined as the introduction of requirements after initial phases of development have... replace the M2 and M2A1 heavy machine guns, circled at the bottom of Figure 6.2. However, the OICW and ACSW systems were supposed to replace the M4
Pressure variation of developed lapping tool on surface roughness
NASA Astrophysics Data System (ADS)
Hussain, A. K.; Lee, K. Q.; Aung, L. M.; Abu, A.; Tan, L. K.; Kang, H. S.
2018-01-01
Improving surface roughness is one of the major concerns in the development of the lapping process, as high-precision machining is in great demand in manufacturing. This paper aims to investigate the performance of a newly designed lapping tool in terms of surface roughness. Polypropylene is used as the lapping tool head. The lapping tool is tested at different pressures to identify the optimum working pressure for the lapping process. The theoretical surface roughness is also calculated using the Vickers hardness. The present study shows that polypropylene is able to produce a smooth, good-quality surface. The optimum lapping pressure in the present study is found to be 45 MPa. By comparing the theoretical and experimental values, the present study shows that the newly designed lapping tool is capable of producing a finer surface roughness.
Solving the Software Legacy Problem with RISA
NASA Astrophysics Data System (ADS)
Ibarra, A.; Gabriel, C.
2012-09-01
Nowadays hardware and system infrastructure evolve on time scales much shorter than the typical duration of space astronomy missions. Data processing software capabilities have to evolve to preserve the scientific return during the entire experiment lifetime. Software preservation is a key issue that has to be tackled before the end of the project to keep the data usable over many years. We present RISA (Remote Interface to Science Analysis) as a solution to decouple data processing software and infrastructure life-cycles, using Java applications and web-service wrappers to existing software. This architecture employs embedded SAS in virtual machines, assuring a homogeneous job execution environment. We will also present the first studies to reactivate the data processing software of the EXOSAT mission, the first ESA X-ray astronomy mission launched in 1983, using the generic RISA approach.
Grau, Cai; Defourny, Noémie; Malicki, Julian; Dunscombe, Peter; Borras, Josep M; Coffey, Mary; Slotman, Ben; Bogusz, Marta; Gasparotto, Chiara; Lievens, Yolande; Kokobobo, Arianit; Sedlmayer, Felix; Slobina, Elena; Feyen, Karen; Hadjieva, Tatiana; Odrazka, Karel; Grau Eriksen, Jesper; Jaal, Jana; Bly, Ritva; Chauvet, Bruno; Willich, Normann; Polgar, Csaba; Johannsson, Jakob; Cunningham, Moya; Magrini, Stefano; Atkocius, Vydmantas; Untereiner, Michel; Pirotta, Martin; Karadjinovic, Vanja; Levernes, Sverre; Sladowski, Krystol; Lurdes Trigo, Maria; Šegedin, Barbara; Rodriguez, Aurora; Lagerlund, Magnus; Pastoors, Bert; Hoskin, Peter; Vaarkamp, Jaap; Cleries Soler, Ramon
2014-08-01
Documenting the distribution of radiotherapy departments and the availability of radiotherapy equipment in the European countries is an important part of HERO - the ESTRO Health Economics in Radiation Oncology project. HERO has the overall aim to develop a knowledge base of the provision of radiotherapy in Europe and build a model for health economic evaluation of radiation treatments at the European level. The aim of the current report is to describe the distribution of radiotherapy equipment in European countries. An 84-item questionnaire was sent out to European countries, principally through their national societies. The current report includes a detailed analysis of radiotherapy departments and equipment (questionnaire items 26-29), analyzed in relation to the annual number of treatment courses and the socio-economic status of the countries. The analysis is based on validated responses from 28 of the 40 European countries defined by the European Cancer Observatory (ECO). A large variation between countries was found for most parameters studied. There were 2192 linear accelerators, 96 dedicated stereotactic machines, and 77 cobalt machines reported in the 27 countries where this information was available. A total of 12 countries had at least one cobalt machine in use. There was a median of 0.5 simulator per MV unit (range 0.3-1.5) and 1.4 (range 0.4-4.4) simulators per department. Of the 874 simulators, a total of 654 (75%) were capable of 3D imaging (CT-scanner or CBCT-option). The number of MV machines (cobalt, linear accelerators, and dedicated stereotactic machines) per million inhabitants ranged from 1.4 to 9.5 (median 5.3) and the average number of MV machines per department from 0.9 to 8.2 (median 2.6). The average number of treatment courses per year per MV machine varied from 262 to 1061 (median 419). While 69% of MV units were capable of IMRT only 49% were equipped for image guidance (IGRT). There was a clear relation between socio-economic status, as measured by GNI per capita, and availability of radiotherapy equipment in the countries. In many low income countries in Southern and Central-Eastern Europe there was very limited access to radiotherapy and especially to equipment for IMRT or IGRT. The European average number of MV machines per million inhabitants and per department is now better in line with QUARTS recommendations from 2005, but the survey also showed a significant heterogeneity in the access to modern radiotherapy equipment in Europe. High income countries especially in Northern-Western Europe are well-served with radiotherapy resources, other countries are facing important shortages of both equipment in general and especially machines capable of delivering high precision conformal treatments (IMRT, IGRT). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
State machine analysis of sensor data from dynamic processes
Cook, William R.; Brabson, John M.; Deland, Sharon M.
2003-12-23
A state machine model analyzes sensor data from dynamic processes at a facility to identify the actual processes that were performed at the facility during a period of interest for the purpose of remote facility inspection. An inspector can further input the expected operations into the state machine model and compare the expected, or declared, processes to the actual processes to identify undeclared processes at the facility. The state machine analysis enables the generation of knowledge about the state of the facility at all levels, from location of physical objects to complex operational concepts. Therefore, the state machine method and apparatus may benefit any agency or business with sensored facilities that stores or manipulates expensive, dangerous, or controlled materials or information.
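As a toy illustration of the idea (the facility, sensors, transitions and event names below are hypothetical, not taken from the work above), a table-driven state machine can map a stream of sensor events to inferred process steps, which are then compared with the declared operations:

```python
# Hypothetical transition table: (current_state, sensor_event) -> next_state
TRANSITIONS = {
    ("idle", "door_open"): "loading",
    ("loading", "door_closed"): "sealed",
    ("sealed", "temperature_rise"): "processing",
    ("processing", "temperature_fall"): "idle",
}

def infer_processes(events, start="idle"):
    """Replay sensor events through the state machine and log the inferred states."""
    state, history = start, []
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)   # unknown events leave the state unchanged
        history.append(state)
    return history

declared = ["loading", "sealed", "processing", "idle"]
actual = infer_processes(["door_open", "door_closed", "temperature_rise", "temperature_fall"])
undeclared = [s for s in actual if s not in declared]
print(actual, "undeclared:", undeclared)
```

Any inferred state that never appears in the declared operations flags a potential undeclared process, which is the inspection use case described in the abstract.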
NASA Technical Reports Server (NTRS)
Miller, R. H.; Minsky, M. L.; Smith, D. B. S.
1982-01-01
Applications of automation, robotics, and machine intelligence systems (ARAMIS) to space activities and their related ground support functions are studied, so that informed decisions can be made on which aspects of ARAMIS to develop. The specific tasks which will be required by future space project tasks are identified and the relative merits of these options are evaluated. The ARAMIS options defined and researched span the range from fully human to fully machine, including a number of intermediate options (e.g., humans assisted by computers, and various levels of teleoperation). By including this spectrum, the study searches for the optimum mix of humans and machines for space project tasks.
Machining of bone: Analysis of cutting force and surface roughness by turning process.
Noordin, M Y; Jiawkok, N; Ndaruhadi, P Y M W; Kurniawan, D
2015-11-01
There are millions of orthopedic surgeries and dental implantation procedures performed every year globally. Most of them involve machining of bones and cartilage. However, theoretical and analytical study of bone machining is lagging behind its practice and implementation. This study views bone machining as a machining process with bovine bone as the workpiece material. The turning process, which forms the basis of the drilling process actually used in practice, was investigated experimentally. The focus is on evaluating the effects of three machining parameters, that is, cutting speed, feed, and depth of cut, on the machining responses, that is, the cutting forces and surface roughness resulting from the turning process. Response surface methodology was used to quantify the relation between the machining parameters and the machining responses. The turning process was performed at various cutting speeds (29-156 m/min), depths of cut (0.03-0.37 mm), and feeds (0.023-0.11 mm/rev). Empirical models of the resulting cutting force and surface roughness as functions of cutting speed, depth of cut, and feed were developed. Observation using the developed empirical models found that, within the range of machining parameters evaluated, the most influential machining parameter for cutting force is depth of cut, followed by feed and cutting speed. The lowest cutting force was obtained at the lowest cutting speed, lowest depth of cut, and highest feed setting. For surface roughness, feed is the most significant machining parameter, followed by cutting speed, while depth of cut showed no effect. The finest surface finish was obtained at the lowest cutting speed and feed setting. © IMechE 2015.
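Although the fitted coefficients are not given in the abstract, empirical models of this kind are typically second-order response surface polynomials in the machining parameters. A generic form, shown here only as an assumed illustration of the model structure and not the authors' fitted equation, is:

```latex
\hat{y} = \beta_0 + \sum_{i=1}^{3}\beta_i x_i + \sum_{i=1}^{3}\beta_{ii} x_i^{2}
        + \sum_{i<j}\beta_{ij} x_i x_j
```

where the predicted response (cutting force or surface roughness) is a function of x1, x2, x3, denoting cutting speed, feed, and depth of cut, and the beta coefficients are estimated by least squares from the experimental design.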
Computer vision for automatic inspection of agricultural produce
NASA Astrophysics Data System (ADS)
Molto, Enrique; Blasco, Jose; Benlloch, Jose V.
1999-01-01
Fruit and vegetables undergo different manipulations from the field to the final consumer. These are basically oriented towards cleaning and sorting the product into homogeneous categories. For this reason, several research projects aimed at fast, adequate produce sorting and quality control are currently under development around the world. Moreover, it is possible to find manual and semi-automatic commercial systems capable of reasonably performing these tasks. However, in many cases, their accuracy is incompatible with current European market demands, which are constantly increasing. IVIA, the Valencian Research Institute of Agriculture, located in Spain, has been involved in several European projects related to machine vision for real-time inspection of various agricultural products. This paper will focus on the work related to two products that have different requirements: fruit and olives. In the case of fruit, the Institute has developed a vision system capable of providing an assessment of the external quality of single fruit to a robot that also receives information from other sensors. The system uses four different views of each fruit and has been tested on peaches, apples and citrus. The processing time of each image is under 500 ms using a conventional PC. The system provides information about primary and secondary color, blemishes and their extent, and stem presence and position, which allows further automatic orientation of the fruit in the final box using a robotic manipulator. The work carried out on olives was devoted to the fast sorting of olives for consumption at table. A prototype has been developed to demonstrate the feasibility of a machine vision system capable of automatically sorting 2500 kg/h of olives using low-cost conventional hardware.
Machining of Fibre Reinforced Plastic Composite Materials.
Caggiano, Alessandra
2018-03-18
Fibre reinforced plastic composite materials are difficult to machine because of the anisotropy and inhomogeneity characterizing their microstructure and the abrasiveness of their reinforcement components. During machining, very rapid cutting tool wear development is experienced, and surface integrity damage is often produced in the machined parts. An accurate selection of the proper tool and machining conditions is therefore required, taking into account that the phenomena responsible for material removal in cutting of fibre reinforced plastic composite materials are fundamentally different from those of conventional metals and their alloys. To date, composite materials are increasingly used in several manufacturing sectors, such as the aerospace and automotive industries, and considerable research effort has been spent to improve their machining processes. In the present review, the key issues concerning the machining of fibre reinforced plastic composite materials are discussed with reference to the main recent research works in the field, considering both conventional and unconventional machining processes and reporting the more recent research achievements. For the different machining processes, the main results characterizing the recent research works and the trends for process developments are presented.
Machining of Fibre Reinforced Plastic Composite Materials
2018-01-01
Fibre reinforced plastic composite materials are difficult to machine because of the anisotropy and inhomogeneity characterizing their microstructure and the abrasiveness of their reinforcement components. During machining, very rapid cutting tool wear development is experienced, and surface integrity damage is often produced in the machined parts. An accurate selection of the proper tool and machining conditions is therefore required, taking into account that the phenomena responsible for material removal in cutting of fibre reinforced plastic composite materials are fundamentally different from those of conventional metals and their alloys. To date, composite materials are increasingly used in several manufacturing sectors, such as the aerospace and automotive industries, and considerable research effort has been spent to improve their machining processes. In the present review, the key issues concerning the machining of fibre reinforced plastic composite materials are discussed with reference to the main recent research works in the field, considering both conventional and unconventional machining processes and reporting the more recent research achievements. For the different machining processes, the main results characterizing the recent research works and the trends for process developments are presented. PMID:29562635
NASA Astrophysics Data System (ADS)
Mazlan, Mohamed Mubin Aizat; Sulaiman, Erwan; Husin, Zhafir Aizat; Othman, Syed Muhammad Naufal Syed; Khan, Faisal
2015-05-01
In hybrid excitation machines (HEMs), there are two main flux sources: the permanent magnet (PM) and the field excitation coil (FEC). These HEMs have better features when compared with the interior permanent magnet synchronous machines (IPMSM) used in conventional hybrid electric vehicles (HEVs). Since all flux sources, including the PM, FEC and armature coils, are located on the stator core, the rotor becomes a single-piece structure similar to that of a switched reluctance machine (SRM). The combined flux generated by the PM and FEC establishes the additional excitation flux required to produce a much higher motor torque. In addition, a variable DC FEC can control the flux capabilities of the motor, so the machine can be applied to high-speed motor drive systems. In this paper, comparisons of single-phase 8S-4P outer-rotor and inner-rotor hybrid excitation flux switching machines (HEFSM) are presented. Initially, the design procedures of the HEFSM, including part drawing, material and condition setting, and property setting, are explained. A flux comparison analysis is performed to investigate the flux capabilities at various current densities. Then the flux linkage of the PM with DC FEC is examined at various DC FEC current densities. Finally, torque performance is analyzed at various armature and FEC current densities for both designs. As a result, the outer-rotor HEFSM has a higher PM flux linkage with DC FEC and an approximately 10% higher average torque than the inner-rotor HEFSM.
2009-01-01
components or systems to prevent the unauthorised opening of the system, access to the internal workings or Intellectual Property. > Armoured vehicles. This... This is the ability to repair specialist alloys and composite materials, to develop new repair techniques and to undertake precision machining of... Selected ballistic munitions and explosives. This capability relates to the manufacture of some high usage munitions, ammunition components
Fluid machines: Expanding the limits, past and future
NASA Technical Reports Server (NTRS)
Hartmann, M. J.; Sandercock, D. M.
1985-01-01
During the 40 yr period from 1940 to 1980, the capabilities and operating limits of fluid machines were greatly extended. This was due to a research program, carried out to meet the needs of aerospace programs. Some of the events are reviewed. Overall advancements of all machinery components are discussed followed by a detailed examination of technology advancements in axial compressors and pumps. Future technology needs are suggested.
Implementation of a parallel unstructured Euler solver on the CM-5
NASA Technical Reports Server (NTRS)
Morano, Eric; Mavriplis, D. J.
1995-01-01
An efficient unstructured 3D Euler solver is parallelized on a Thinking Machines Corporation Connection Machine 5 (CM-5), a distributed-memory computer with vectoring capability. In this paper, the single instruction multiple data (SIMD) strategy is employed through the use of the CM Fortran language and the CMSSL scientific library. The performance of the CMSSL mesh partitioner is evaluated and the overall efficiency of the parallel flow solver is discussed.
Topics in programmable automation. [for materials handling, inspection, and assembly
NASA Technical Reports Server (NTRS)
Rosen, C. A.
1975-01-01
Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.
1988-04-30
Keywords: haptic hand, touch, vision, robot, object recognition, categorization. ...established that the haptic system has remarkable capabilities for object recognition. We define haptics as purposive touch. The basic tactual system... gathered ratings of the importance of dimensions for categorizing common objects by touch. Texture and hardness ratings strongly co-vary, which is
NASA Astrophysics Data System (ADS)
Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.
1989-03-01
The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task requiring the programming of a system based on the expectation of future, unconstrained events. Hence, the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including the neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system and, using a neurally based computing substrate, it can complete all necessary visual tasks in real time.
Classifying black and white spruce pollen using layered machine learning.
Punyasena, Surangi W; Tcheng, David K; Wesseln, Cassandra; Mueller, Pietra G
2012-11-01
Pollen is among the most ubiquitous of terrestrial fossils, preserving an extended record of vegetation change. However, this temporal continuity comes with a taxonomic tradeoff. Analytical methods that improve the taxonomic precision of pollen identifications would expand the research questions that could be addressed by pollen, in fields such as paleoecology, paleoclimatology, biostratigraphy, melissopalynology, and forensics. We developed a supervised, layered, instance-based machine-learning classification system that uses leave-one-out bias optimization and discriminates among small variations in pollen shape, size, and texture. We tested our system on black and white spruce, two paleoclimatically significant taxa in the North American Quaternary. We achieved > 93% grain-to-grain classification accuracies in a series of experiments with both fossil and reference material. More significantly, when applied to Quaternary samples, the learning system was able to replicate the count proportions of a human expert (R(2) = 0.78, P = 0.007), with one key difference - the machine achieved these ratios by including larger numbers of grains with low-confidence identifications. Our results demonstrate the capability of machine-learning systems to solve the most challenging palynological classification problem, the discrimination of congeneric species, extending the capabilities of the pollen analyst and improving the taxonomic resolution of the palynological record. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.
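As a loose, generic analogue of an instance-based classifier tuned by leave-one-out evaluation (not the authors' layered system, and with hypothetical shape, size and texture features in place of real pollen measurements), one might write:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(2)
# Hypothetical pollen-grain feature vectors (e.g. size, shape, texture measures).
X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(1, 1, (30, 5))])
y = np.array(["black_spruce"] * 30 + ["white_spruce"] * 30)

# Pick the neighbourhood size that maximises leave-one-out accuracy.
scores = {k: cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y,
                             cv=LeaveOneOut()).mean() for k in (1, 3, 5, 7)}
best_k = max(scores, key=scores.get)
print(best_k, scores[best_k])
```

The paper's layered system goes further, stacking classifiers and optimizing for the bias of the resulting count proportions, but the leave-one-out selection loop above captures the basic tuning idea.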
CNC water-jet machining and cutting center
NASA Astrophysics Data System (ADS)
Bartlett, D. C.
1991-09-01
Computer Numerical Control (CNC) water-jet machining was investigated to determine the potential applications and cost-effectiveness that would result by establishing this capability in the engineering shops of Allied-Signal Inc., Kansas City Division (KCD). Both conductive and nonconductive samples were machined at KCD on conventional machining equipment (a three-axis conversational programmed mill and a wire electrical discharge machine) and on two current-technology water-jet machines at outside vendors. These samples were then inspected, photographed, and evaluated. The current-technology water-jet machines were not as accurate as the conventional equipment. The resolution of the water-jet equipment was only +/- 0.005 inch, as compared to +/- 0.0002 inch for the conventional equipment. The principal use for CNC water-jet machining would be as follows: Contouring to near finished shape those items made from 300 and 400 series stainless steels, titanium, Inconel, aluminum, glass, or any material whose fabrication tolerance is less than the machine resolution of +/- 0.005 inch; and contouring to finished shape those items made from Kevlar, rubber, fiberglass, foam, aluminum, or any material whose fabrication specifications allow the use of a machine with +/- 0.005 inch tolerance. Additional applications are possible because there is minimal force generated on the material being cut and because the water-jet cuts without generating dust.
An intelligent CNC machine control system architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.J.; Loucks, C.S.
1996-10-01
Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.
Application of Machine Learning to Rotorcraft Health Monitoring
NASA Technical Reports Server (NTRS)
Cody, Tyler; Dempsey, Paula J.
2017-01-01
Machine learning is a powerful tool for data exploration and model building with large data sets. This project aimed to use machine learning techniques to explore the inherent structure of data from rotorcraft gear tests, relationships between features and damage states, and to build a system for predicting gear health for future rotorcraft transmission applications. Classical machine learning techniques are difficult, if not irresponsible to apply to time series data because many make the assumption of independence between samples. To overcome this, Hidden Markov Models were used to create a binary classifier for identifying scuffing transitions and Recurrent Neural Networks were used to leverage long distance relationships in predicting discrete damage states. When combined in a workflow, where the binary classifier acted as a filter for the fatigue monitor, the system was able to demonstrate accuracy in damage state prediction and scuffing identification. The time dependent nature of the data restricted data exploration to collecting and analyzing data from the model selection process. The limited amount of available data was unable to give useful information, and the division of training and testing sets tended to heavily influence the scores of the models across combinations of features and hyper-parameters. This work built a framework for tracking scuffing and fatigue on streaming data and demonstrates that machine learning has much to offer rotorcraft health monitoring by using Bayesian learning and deep learning methods to capture the time dependent nature of the data. Suggested future work is to implement the framework developed in this project using a larger variety of data sets to test the generalization capabilities of the models and allow for data exploration.
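A heavily simplified sketch of the first stage of such a workflow, using the third-party hmmlearn package as an assumed stand-in for the paper's Hidden Markov Model scuffing filter (the actual condition indicators and the Recurrent Neural Network fatigue stage are omitted), might look like:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(3)
# Hypothetical condition-indicator time series: a healthy segment then a scuffing segment.
healthy = rng.normal(0.0, 0.1, (200, 2))
scuffed = rng.normal(0.8, 0.2, (100, 2))
X = np.vstack([healthy, scuffed])

# Two hidden states, roughly (no scuffing, scuffing); state changes mark transitions.
model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=50).fit(X)
states = model.predict(X)
transitions = np.flatnonzero(np.diff(states)) + 1
print("state change indices:", transitions[:5])
```

In the combined workflow described above, segments flagged as scuffing by such a filter would then be passed to the sequence model that predicts discrete damage states.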
Diamond, James; Anderson, Neil H; Bartels, Peter H; Montironi, Rodolfo; Hamilton, Peter W
2004-09-01
Quantitative examination of prostate histology offers clues in the diagnostic classification of lesions and in the prediction of response to treatment and prognosis. To facilitate the collection of quantitative data, the development of machine vision systems is necessary. This study explored the use of imaging for identifying tissue abnormalities in prostate histology. Medium-power histological scenes were recorded from whole-mount radical prostatectomy sections at x 40 objective magnification and assessed by a pathologist as exhibiting stroma, normal tissue (nonneoplastic epithelial component), or prostatic carcinoma (PCa). A machine vision system was developed that divided the scenes into subregions of 100 x 100 pixels and subjected each to image-processing techniques. Analysis of morphological characteristics allowed the identification of normal tissue. Analysis of image texture demonstrated that Haralick feature 4 was the most suitable for discriminating stroma from PCa. Using these morphological and texture measurements, it was possible to define a classification scheme for each subregion. The machine vision system is designed to integrate these classification rules and generate digital maps of tissue composition from the classification of subregions; 79.3% of subregions were correctly classified. Established classification rates have demonstrated the validity of the methodology on small scenes; a logical extension was to apply the methodology to whole slide images via scanning technology. The machine vision system is capable of classifying these images. The machine vision system developed in this project facilitates the exploration of morphological and texture characteristics in quantifying tissue composition. It also illustrates the potential of quantitative methods to provide highly discriminatory information in the automated identification of prostatic lesions using computer vision.
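The texture analysis described above is based on grey-level co-occurrence statistics computed per 100 x 100 subregion. A minimal NumPy sketch of a co-occurrence matrix and one Haralick-style statistic (contrast) is shown below for illustration only; it is not the specific "Haralick feature 4" implementation or the tuning used in the study.

```python
import numpy as np

def glcm(patch, levels=8, dr=0, dc=1):
    """Normalised grey-level co-occurrence matrix for a single pixel offset."""
    q = (patch.astype(float) / max(patch.max(), 1) * (levels - 1)).astype(int)  # quantise
    m = np.zeros((levels, levels))
    rows, cols = q.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            m[q[r, c], q[r + dr, c + dc]] += 1     # count grey-level pairs
    return m / m.sum()

def contrast(p):
    """Haralick-style contrast: large when neighbouring grey levels differ strongly."""
    i, j = np.indices(p.shape)
    return float(np.sum(p * (i - j) ** 2))

# Hypothetical 100 x 100 sub-region of a grey-scale histology image.
patch = np.random.default_rng(4).integers(0, 256, (100, 100))
print(contrast(glcm(patch)))
```

A texture statistic of this kind, computed per subregion, is what allows stroma to be separated from carcinoma once morphology has ruled out normal tissue, as described in the abstract.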
MSWT-01, flood disaster water treatment solution from common ideas
NASA Astrophysics Data System (ADS)
Ananto, Gamawan; Setiawan, Albertus B.; Z, Darman M.
2013-06-01
Indonesia has many flood-prone areas that face clean water problems. Various solution programs are regularly initiated by the Government, through corporate CSR, and through sporadic community actions to provide clean water, each with its own advantages and disadvantages. One solution is easy to operate, for instance, but does not provide adequate capacity, whereas another has ideal performance but is more costly. This situation inspired the development of a water treatment machine that could serve as an alternative. Many methods can be chosen, whether simple, intermediate, or high technology, depending on the raw water input and the required output quality. MSWT, the Mobile Surface Water Treatment unit, is a design for raw water in flood areas, sized for a basic capacity of 1 m3 per hour. The water treatment design combines existing technologies and related literature. Using common ideas, the highlight is how to arrange such a modular process in a compact, elegant design, equipped with a mobile feature to make operation easier. In prototype-level trials, the machine was capable of producing clean water suitable for sanitation and for cooking/drinking purposes, even from a contaminated water input source. From the investment point of view, such a machine can also be treated as an asset that will be used from time to time when needed, rather than being built for a single project only.
Collaborative human-machine analysis using a controlled natural language
NASA Astrophysics Data System (ADS)
Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave
2015-05-01
A key aspect of an analyst's task in providing relevant information from data is reasoning about the implications of that data, in order to build a picture of the real-world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English used to represent an analyst's domain knowledge and reasoning in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".
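A minimal illustration of machine reasoning that returns conclusions together with their rationale, in the spirit of the CE reasoning described above but written as plain Python rather than actual ITA CE syntax (the facts, rules and names below are hypothetical), is:

```python
# Hypothetical facts and rules; "?x" is a variable bound to an entity name.
facts = {("v1", "type", "truck"), ("v1", "seen_near", "checkpoint")}
rules = [("r1",
          [("?x", "type", "truck"), ("?x", "seen_near", "checkpoint")],
          ("?x", "status", "of_interest"))]

def infer(facts, rules):
    """Forward-chain until no new conclusions, recording the rationale for each."""
    rationale = {}
    changed = True
    while changed:
        changed = False
        for name, premises, conclusion in rules:
            for entity in {f[0] for f in facts}:
                bound = [tuple(entity if t == "?x" else t for t in p) for p in premises]
                concl = tuple(entity if t == "?x" else t for t in conclusion)
                if all(p in facts for p in bound) and concl not in facts:
                    facts.add(concl)
                    rationale[concl] = (name, bound)   # which rule fired, from which premises
                    changed = True
    return facts, rationale

_, why = infer(set(facts), rules)
print(why)
```

Presenting the fired rule and its premises alongside each conclusion is the simplest form of the rationale that the abstract argues a collaborative human-machine system should expose.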