The Use of Computer Graphics in the Design Process.
ERIC Educational Resources Information Center
Palazzi, Maria
This master's thesis examines applications of computer technology to the field of industrial design and ways in which technology can transform the traditional process. Following a statement of the problem, the history and applications of the fields of computer graphics and industrial design are reviewed. The traditional industrial design process…
Computer and control applications in a vegetable processing plant
USDA-ARS's Scientific Manuscript database
There are many advantages to the use of computers and controls in the food industry. Software in the food industry takes two forms: general-purpose commercial computer software and software for specialized applications, such as drying and thermal processing of foods. Many applied simulation models for d...
The application of computer-aided technologies in automotive styling design
NASA Astrophysics Data System (ADS)
Zheng, Ze-feng; Zhang, Ji; Zheng, Ying
2012-04-01
In the automotive industry, outline (styling) design is its lifeblood and creative design is its soul. Computer-aided technology has been widely used in the automotive industry and has received increasing attention. This paper introduces the application of computer-aided technologies, including CAD, CAM and CAE, analyzes the process of automotive structural design, and describes development trends in computer-aided design.
Environmentalists and the Computer.
ERIC Educational Resources Information Center
Baron, Robert C.
1982-01-01
Reviews characteristics, applications, and limitations of computers, including word processing, data/record keeping, and scientific, industrial, and educational applications. Discusses misuse of computers and role of computers in environmental management. (JN)
Cloud Computing Boosts Business Intelligence of Telecommunication Industry
NASA Astrophysics Data System (ADS)
Xu, Meng; Gao, Dan; Deng, Chao; Luo, Zhiguo; Sun, Shaoling
Business intelligence has become an attractive topic in today's data-intensive applications, especially in the telecommunication industry. Meanwhile, cloud computing, which provides an IT supporting infrastructure with excellent scalability, large-scale storage, and high performance, has become an effective way to implement parallel data processing and data mining algorithms. BC-PDM (Big Cloud based Parallel Data Miner) is a new MapReduce-based parallel data mining platform developed by CMRI (China Mobile Research Institute) to fit the urgent requirements of business intelligence in the telecommunication industry. In this paper, the architecture, functionality and performance of BC-PDM are presented, together with the experimental evaluation and case studies of its applications. The evaluation result demonstrates both the usability and the cost-effectiveness of a cloud-computing-based business intelligence system in applications of the telecommunication industry.
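The abstract identifies MapReduce as the parallelization model underlying BC-PDM. The sketch below illustrates only the generic map/reduce pattern on synthetic call-detail records; it is not BC-PDM's API, and every name in it (map_phase, reduce_phase, the record fields) is an illustrative assumption.

# Minimal sketch of the MapReduce pattern (illustrative only): counts calls per
# subscriber from synthetic call-detail records.
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_phase(record):
    # Emit a partial count: one call attributed to the calling subscriber.
    caller, callee, duration = record
    return Counter({caller: 1})

def reduce_phase(left, right):
    # Merge partial counts produced by the mappers.
    return left + right

if __name__ == "__main__":
    records = [("alice", "bob", 42), ("bob", "carol", 10), ("alice", "carol", 7)]
    with Pool() as pool:
        partials = pool.map(map_phase, records)          # parallel "map"
    totals = reduce(reduce_phase, partials, Counter())   # "reduce"
    print(totals)                                        # Counter({'alice': 2, 'bob': 1})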
Parallel programming of industrial applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, M; Koniges, A; Simon, H
1998-07-21
In the introductory material, we overview the typical MPP environment for real application computing and the special tools available such as parallel debuggers and performance analyzers. Next, we draw from a series of real applications codes and discuss the specific challenges and problems that are encountered in parallelizing these individual applications. The application areas drawn from include biomedical sciences, materials processing and design, plasma and fluid dynamics, and others. We show how it was possible to get a particular application to run efficiently and what steps were necessary. Finally we end with a summary of the lessons learned from these applications and predictions for the future of industrial parallel computing. This tutorial is based on material from a forthcoming book entitled: "Industrial Strength Parallel Computing" to be published by Morgan Kaufmann Publishers (ISBN 1-55860-54).
Electronic Computer Aided Design. Its Application in FE.
ERIC Educational Resources Information Center
Further Education Unit, London (England).
A study was conducted at the Electronics Industrial Unit at the Dorset Institute of Higher Education to investigate the feasibility of incorporating computer-aided design (CAD) in electrical and electronic courses. The aim was to investigate the application of CAD to electrical and electronic systems; the extent to which industrial developments…
The computer-communication link for the innovative use of Space Station
NASA Technical Reports Server (NTRS)
Carroll, C. C.
1984-01-01
The potential capability of the Space Station computer-communication link is related to its innovative utilization for industrial applications. Conceptual computer network architectures are presented and their respective accommodation of innovative industrial projects is discussed. Achieving maximum system availability for industrialization is a possible design goal, one that would place the industrial community in an interactive mode with facilities in space. A worthy design goal would be to minimize the computer-communication management function and thereby optimize the system availability for industrial users. Quasi-autonomous modes and subnetworks are key design issues, since they would be the system elements directly affecting system performance for industrial use.
WinHPC System Software | High-Performance Computing | NREL
Describes the software applications, tools, and toolchains available on the WinHPC system for industrial applications, including the Intel compilers development tool and toolchain suite.
ERIC Educational Resources Information Center
1972
Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…
Koyama, Michihisa; Tsuboi, Hideyuki; Endou, Akira; Takaba, Hiromitsu; Kubo, Momoji; Del Carpio, Carlos A; Miyamoto, Akira
2007-02-01
Computational chemistry can provide fundamental knowledge regarding various aspects of materials. While its impact in scientific research is greatly increasing, its contributions to industrially important issues are far from satisfactory. In order to realize industrial innovation by computational chemistry, a new concept "combinatorial computational chemistry" has been proposed by introducing the concept of combinatorial chemistry to computational chemistry. This combinatorial computational chemistry approach enables theoretical high-throughput screening for materials design. In this manuscript, we review the successful applications of combinatorial computational chemistry to deNO(x) catalysts, Fischer-Tropsch catalysts, lanthanoid complex catalysts, and cathodes of the lithium ion secondary battery.
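The combinatorial, high-throughput screening loop described here is straightforward to sketch. The fragment below is a hedged illustration of enumerating and ranking candidate compositions; the dopant list and the scoring function are placeholders (a real screen would call an electronic-structure or molecular-dynamics code), and none of the names come from the reviewed work.

from itertools import combinations

CANDIDATE_DOPANTS = ["Cu", "Fe", "Co", "Ni", "Mn", "Zn"]   # hypothetical element library

def score_catalyst(dopants):
    # Placeholder surrogate objective; in a real screen this would launch a
    # quantum-chemistry or molecular-dynamics calculation and return, e.g.,
    # an adsorption or reaction energy for the doped surface.
    return abs(sum(ord(ch) for name in dopants for ch in name) - 400)

def screen(n_dopants=2, top_k=3):
    # Enumerate every dopant combination and rank by the surrogate score.
    scored = [(combo, score_catalyst(combo))
              for combo in combinations(CANDIDATE_DOPANTS, n_dopants)]
    return sorted(scored, key=lambda pair: pair[1])[:top_k]

print(screen())   # top-ranked candidate pairs under the placeholder score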
ERIC Educational Resources Information Center
Toong, Hoo-min D.; Gupta, Amar
1982-01-01
Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)
Parametric instabilities of rotor-support systems with application to industrial ventilators
NASA Technical Reports Server (NTRS)
Parszewski, Z.; Krodkiemski, T.; Marynowski, K.
1980-01-01
The interaction of rotor-support systems with parametric excitation is considered for both unequal principal shaft stiffnesses (generators) and offset-disc rotors (ventilators). Instability regions and types of instability are computed in the first case, and parametric resonances in the second case. Computed and experimental results are compared for laboratory machine models. A field case study of parametric vibrations in industrial ventilators is reported. Computed parametric resonances are confirmed in field measurements, and some industrial failures are explained. The dynamic influence and gyroscopic effect of supporting structures are also shown and computed.
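For readers unfamiliar with parametric excitation, the canonical single-degree-of-freedom model is the damped Mathieu equation (a standard textbook form, not necessarily the formulation used by the authors):

\[ \ddot{x} + 2\zeta\omega_0\,\dot{x} + \omega_0^2\left[1 + \varepsilon\cos(\Omega t)\right]x = 0, \]

where a stiffness modulation of depth \varepsilon at frequency \Omega (for example, from unequal principal shaft stiffnesses rotating with the shaft) can destabilize the response. The principal parametric resonance occurs near \Omega \approx 2\omega_0, with weaker instability regions near \Omega \approx 2\omega_0/n for integer n.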
Vision 2010: The Future of Higher Education Business and Learning Applications
ERIC Educational Resources Information Center
Carey, Patrick; Gleason, Bernard
2006-01-01
The global software industry is in the midst of a major evolutionary shift--one based on open computing--and this trend, like many transformative trends in technology, is being led by the IT staffs and academic computing faculty of the higher education industry. The elements of this open computing approach are open source, open standards, open…
Overview 1993: Computational applications
NASA Technical Reports Server (NTRS)
Benek, John A.
1993-01-01
Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.
NASA Astrophysics Data System (ADS)
Adnan, F. A.; Romlay, F. R. M.; Shafiq, M.
2018-04-01
Owing to the advent of Industry 4.0, the need to further evaluate the processes applied in additive manufacturing, particularly the computational process of slicing, is non-trivial. This paper evaluates a real-time slicing algorithm for slicing an STL-formatted computer-aided design (CAD) model. A line-plane intersection equation was applied to perform the slicing procedure at any given height. This algorithm was found to provide better computational time regardless of the number of facets in the STL model. The performance of the algorithm is evaluated by comparing the computational time for different geometries.
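As a concrete illustration of the line-plane intersection step, the sketch below slices one STL facet with a horizontal cutting plane z = h by linear interpolation along each crossing edge. The function name and the (ignored) degenerate cases are my own simplifications, not details from the paper.

def slice_triangle(v0, v1, v2, h):
    """Return the intersection points of triangle (v0, v1, v2) with the plane
    z = h. Vertices are (x, y, z) tuples; edges lying on or touching the plane
    are ignored in this simplified sketch."""
    points = []
    for a, b in ((v0, v1), (v1, v2), (v2, v0)):
        za, zb = a[2], b[2]
        if (za - h) * (zb - h) < 0:           # edge crosses the cutting plane
            t = (h - za) / (zb - za)          # parametric position along the edge
            points.append((a[0] + t * (b[0] - a[0]),
                           a[1] + t * (b[1] - a[1]),
                           h))
    return points

# One facet sliced at half height yields two points, i.e. one contour segment.
print(slice_triangle((0, 0, 0), (1, 0, 1), (0, 1, 1), 0.5))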
JPRS Report, Science & Technology, China.
1992-12-08
importance of the computer information industry to the development of the national economy and the people's standard of living. Forecasts call...past several years, and the application of computers has permeated every trade and industry, providing powerful...system and ample human talent; market potential is large; and it has potential for low-cost development. However, the scale of its industrial
Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds
NASA Astrophysics Data System (ADS)
Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano
Grid computing has evolved considerably over the past years, and its capabilities have found their way even into business products and are no longer relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific grid open source or industrial products, but rather it comprises a set of capabilities virtually within any kind of software to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active grid computing application field is the full virtualization of scientific instruments in order to increase their availability and decrease operating and maintenance costs. Computational and information grids allow real-world objects to be managed in a service-oriented way using widespread industry standards.
Influence of technology on magnetic tape storage device characteristics
NASA Technical Reports Server (NTRS)
Gniewek, John J.; Vogel, Stephen M.
1994-01-01
There are available today many data storage devices that serve the diverse application requirements of the consumer, professional entertainment, and computer data processing industries. Storage technologies include semiconductors, several varieties of optical disk, optical tape, magnetic disk, and many varieties of magnetic tape. In some cases, devices are developed with specific characteristics to meet specification requirements. In other cases, an existing storage device is modified and adapted to a different application. For magnetic tape storage devices, examples of the former case are 3480/3490 and QIC device types developed for the high end and low end segments of the data processing industry respectively, VHS, Beta, and 8 mm formats developed for consumer video applications, and D-1, D-2, D-3 formats developed for professional video applications. Examples of modified and adapted devices include 4 mm, 8 mm, 12.7 mm and 19 mm computer data storage devices derived from consumer and professional audio and video applications. With the conversion of the consumer and professional entertainment industries from analog to digital storage and signal processing, there have been increasing references to the 'convergence' of the computer data processing and entertainment industry technologies. There has yet to be seen, however, any evidence of convergence of data storage device types. There are several reasons for this. The diversity of application requirements results in varying degrees of importance for each of the tape storage characteristics.
Open Systems Architecture for Command, Control and Communications
1991-07-01
Initial manifestations of computer and communications standards emerged in the early seventies, largely...
Applications of aerospace technology in the electric power industry
NASA Technical Reports Server (NTRS)
Johnson, F. D.; Heins, C. F.
1974-01-01
Existing applications of NASA contributions to disciplines such as combustion engineering, mechanical engineering, materials science, quality assurance and computer control are outlined to illustrate how space technology is used in the electric power industry. Corporate strategies to acquire relevant space technology are described.
ATCA for Machines-- Advanced Telecommunications Computing Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, R.S.; /SLAC
2008-04-22
The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.
Close-range photogrammetry for aircraft quality control
NASA Astrophysics Data System (ADS)
Schwartz, D. S.
Close range photogrammetry is applicable to quality assurance inspections, design data acquisition, and test management support tasks, yielding significant cost avoidance and increased productivity. An understanding of mensuration parameters and their related accuracies is fundamental to the successful application of industrial close range photogrammetry. Attention is presently given to these parameters and to the use of computer modelling as an aid to the photogrammetric entrepreneur in industry. Suggested improvements to cameras and film readers for industrial applications are discussed.
The Application of Artificial Intelligence Principles to Teaching and Training
ERIC Educational Resources Information Center
Shaw, Keith
2008-01-01
This paper compares and contrasts the use of AI principles in industrial training with more normal computer-based training (CBT) approaches. A number of applications of CBT are illustrated (for example simulations, tutorial presentations, fault diagnosis, management games, industrial relations exercises) and compared with an alternative approach…
Computers and Data Processing. Subject Bibliography.
ERIC Educational Resources Information Center
United States Government Printing Office, Washington, DC.
This annotated bibliography of U.S. Government publications contains over 90 entries on topics including telecommunications standards, U.S. competitiveness in high technology industries, computer-related crimes, capacity management of information technology systems, the application of computer technology in the Soviet Union, computers and…
Rationale and Application of Tangential Scanning to Industrial Inspection of Hardwood Logs
Nand K. Gupta; Daniel L. Schmoldt; Bruce Isaacson
1998-01-01
Industrial computed tomography (CT) inspection of hardwood logs has some unique requirements not found in other CT applications. Sawmill operations demand that large volumes of wood be scanned quickly at high spatial resolution for extended duty cycles. Current CT scanning geometries and commercial systems have both technical and economic limitations. Tangential...
The development of the ICME supply-chain: Route to ICME implementation and sustainment
NASA Astrophysics Data System (ADS)
Furrer, David; Schirra, John
2011-04-01
Over the past twenty years, integrated computational materials engineering (ICME) has emerged as a key engineering field with great promise. Models simulating materials-related phenomena have been developed and are being validated for industrial application. The integration of computational methods into material, process and component design has been a challenge, however, in part due to the complexities in the development of an ICME "supply-chain" that supports, sustains and delivers this emerging technology. ICME touches many disciplines, which results in a requirement for many types of computational-based technology organizations to be involved to provide tools that can be rapidly developed, validated, deployed and maintained for industrial applications. The need for, and the current state of an ICME supply-chain along with development and future requirements for the continued pace of introduction of ICME into industrial design practices will be reviewed within this article.
A Fog Computing and Cloudlet Based Augmented Reality System for the Industry 4.0 Shipyard.
Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Vilar-Montesinos, Miguel
2018-06-02
Augmented Reality (AR) is one of the key technologies pointed out by Industry 4.0 as a tool for enhancing the next generation of automated and computerized factories. AR can also help shipbuilding operators, since they usually need to interact with information (e.g., product datasheets, instructions, maintenance procedures, quality control forms) that could be handled easily and more efficiently through AR devices. This is the reason why Navantia, one of the 10 largest shipbuilders in the world, is studying the application of AR (among other technologies) in different shipyard environments in a project called "Shipyard 4.0". This article presents Navantia's industrial AR (IAR) architecture, which is based on cloudlets and on the fog computing paradigm. Both technologies are ideal for supporting physically-distributed, low-latency and QoS-aware applications that decrease the network traffic and the computational load of traditional cloud computing systems. The proposed IAR communications architecture is evaluated in real-world scenarios with payload sizes according to demanding Microsoft HoloLens applications and when using a cloud, a cloudlet and a fog computing system. The results show that, in terms of response delay, the fog computing system is the fastest when transferring small payloads (less than 128 KB), while for larger file sizes, the cloudlet solution is faster than the others. Moreover, under high loads (with many concurrent IAR clients), the cloudlet in some cases is more than four times faster than the fog computing system in terms of response delay.
Technology advances and market forces: Their impact on high performance architectures
NASA Technical Reports Server (NTRS)
Best, D. R.
1978-01-01
Reasonable projections into future supercomputer architectures and technology require an analysis of the computer industry market environment, the current capabilities and trends within the component industry, and the research activities on computer architecture in the industrial and academic communities. Management, programmer, architect, and user must cooperate to increase the efficiency of supercomputer development efforts. Care must be taken to match the funding, compiler, architecture and application with greater attention to testability, maintainability, reliability, and usability than supercomputer development programs of the past.
Application of narrow-band television to industrial and commercial communications
NASA Technical Reports Server (NTRS)
Embrey, B. C., Jr.; Southworth, G. R.
1974-01-01
The development of narrow-band systems for use in space systems is presented. Applications of the technology to future spacecraft requirements are discussed along with narrow-band television's influence in stimulating development within the industry. The transferral of the technology into industrial and commercial communications is described. Major areas included are: (1) medicine; (2) education; (3) remote sensing for traffic control; and (4) weather observation. Applications in data processing, image enhancement, and information retrieval are provided by the combination of the TV camera and the computer.
Qu, Hai-bin; Cheng, Yi-yu; Wang, Yue-sheng
2003-10-01
Based on a review of engineering problems in developing a modern production industry for Traditional Chinese Medicine (TCM), the differences between the TCM production industries in China and abroad are pointed out. Accelerating the application and extension of high technology and computer-integrated manufacturing systems (CIMS) is suggested to promote technological advancement in the TCM industry.
Your Career in Computer Programming.
ERIC Educational Resources Information Center
Seligsohn, I. J.
This book offers the career-minded young reader insight into computers and computer programming, by describing the nature of the work, the actual workings of the machines, the language of computers, their history, and their far-reaching and increasing applications in business, industry, science, education, defense, and government. At the same time,…
Microprocessors: Laboratory Simulation of Industrial Control Applications.
ERIC Educational Resources Information Center
Gedeon, David V.
1981-01-01
Describes a course to make technical managers more aware of computer technology and how data loggers, programmable controllers, and larger computer systems interact in a hierarchical configuration of manufacturing process control. (SK)
A Compilation of Information on Computer Applications in Nutrition and Food Service.
ERIC Educational Resources Information Center
Casbergue, John P.
Compiled is information on the application of computer technology to nutrition and food service. It is designed to assist dieticians and nutritionists interested in applying electronic data processing to food service and related industries. The compilation is indexed by subject area. Included for each subject area are: (1) bibliographic references,…
Computer technology applications in industrial and organizational psychology.
Crespin, Timothy R; Austin, James T
2002-08-01
This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research: computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.
The application of CFD to the modelling of fires in complex geometries
NASA Astrophysics Data System (ADS)
Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.
The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.
Industrial applications of automated X-ray inspection
NASA Astrophysics Data System (ADS)
Shashishekhar, N.
2015-03-01
Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.
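One elementary building block of such image-processing pipelines is thresholding followed by blob analysis. The sketch below is a generic, hedged illustration using OpenCV; the threshold, the size limit, and the synthetic radiograph are assumptions, and the production systems described in the paper are application-specific and far more elaborate.

import numpy as np
import cv2

def find_defects(xray, intensity_thresh=60, min_area=25):
    # Defects (e.g. porosity in a casting) show up as locally dark regions.
    _, mask = cv2.threshold(xray, intensity_thresh, 255, cv2.THRESH_BINARY_INV)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # Label 0 is the background; keep components larger than min_area pixels.
    return [(tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
            for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area]

# Synthetic 8-bit radiograph: uniform material with one dark inclusion.
img = np.full((200, 200), 180, dtype=np.uint8)
cv2.circle(img, (120, 80), 6, 20, -1)
print(find_defects(img))   # one defect centred near (120, 80)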
NASA Astrophysics Data System (ADS)
Pizette, Patrick; Govender, Nicolin; Wilke, Daniel N.; Abriak, Nor-Edine
2017-06-01
The use of the Discrete Element Method (DEM) for industrial civil engineering applications is currently limited due to the computational demands when large numbers of particles are considered. The graphics processing unit (GPU), with its highly parallelized hardware architecture, shows potential to enable the solution of civil engineering problems using discrete granular approaches. We demonstrate in this study the practical utility of a validated GPU-enabled DEM modeling environment to simulate industrial-scale granular problems. As an illustration, the flow discharge of storage silos using 8 and 17 million particles is considered. DEM simulations have been performed to investigate the influence of particle size (equivalent size for the 20/40-mesh gravel) and induced shear stress for two hopper shapes. The preliminary results indicate that the shape of the hopper significantly influences the discharge rates for the same material. Specifically, this work shows that GPU-enabled DEM modeling environments can model industrial-scale problems on a single portable computer within a day for 30 seconds of process time.
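At the core of each DEM time step is a per-contact force evaluation; GPU codes such as the one used here evaluate millions of such contacts in parallel. The sketch below shows a generic linear spring-dashpot normal contact between two spheres in NumPy; the stiffness and damping constants and the function name are illustrative assumptions, not parameters from the study.

import numpy as np

def normal_contact_force(x_i, x_j, v_i, v_j, r_i, r_j, k_n=1e5, c_n=5.0):
    # Linear spring-dashpot normal contact between spheres i and j.
    branch = x_j - x_i
    dist = np.linalg.norm(branch)
    overlap = (r_i + r_j) - dist
    if overlap <= 0.0:
        return np.zeros(3)                 # particles not in contact
    n = branch / dist                      # unit normal, pointing from i to j
    v_rel_n = np.dot(v_j - v_i, n)         # relative normal velocity
    f_mag = k_n * overlap - c_n * v_rel_n  # spring (elastic) + dashpot (damping)
    return -f_mag * n                      # force acting on particle i

print(normal_contact_force(np.zeros(3), np.array([0.0, 0.0, 0.019]),
                           np.zeros(3), np.zeros(3), 0.01, 0.01))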
Computational sciences in the upstream oil and gas industry
Halsey, Thomas C.
2016-01-01
The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785
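For readers unfamiliar with full wavefield inversion, it is usually posed as a nonlinear least-squares problem (a standard textbook statement, not a formula quoted from this article):

\[ m^{\ast} = \arg\min_{m}\;\frac{1}{2}\sum_{s}\left\lVert d_{s}^{\mathrm{obs}} - F_{s}(m)\right\rVert_{2}^{2}, \]

where m is the subsurface model (for example, a velocity field), F_s is the forward wave-propagation operator for source s, and d_s^obs are the recorded seismograms. The gradient is typically obtained with the adjoint-state method, which is what makes petascale computing necessary at field scale.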
NASA Technical Reports Server (NTRS)
Befrui, Bizhan A.
1995-01-01
This viewgraph presentation discusses the following: STAR-CD computational features; STAR-CD turbulence models; common features of industrial complex flows; industry-specific CFD development requirements; applications and experiences of industrial complex flows, including flow in rotating disc cavities, diffusion hole film cooling, internal blade cooling, and external car aerodynamics; and conclusions on turbulence modeling needs.
A network-based distributed, media-rich computing and information environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, R.L.
1995-12-31
Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.
Some Experience with Interactive Computing in Teaching Introductory Statistics.
ERIC Educational Resources Information Center
Diegert, Carl
Students in two biostatistics courses at the Cornell Medical College and in a course in applications of computer science given in Cornell's School of Industrial Engineering were given access to an interactive package of computer programs enabling them to perform statistical analysis without the burden of hand computation. After a general…
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
ERIC Educational Resources Information Center
Lancaster, F. W.
1989-01-01
Describes various stages involved in the applications of electronic media to the publishing industry. Highlights include computer typesetting, or photocomposition; machine-readable databases; the distribution of publications in electronic form; computer conferencing and electronic mail; collaborative authorship; hypertext; hypermedia publications;…
Cogeneration technology alternatives study. Volume 6: Computer data
NASA Technical Reports Server (NTRS)
1980-01-01
The potential technical capabilities of energy conversion systems in the 1985 - 2000 time period were defined with emphasis on systems using coal, coal-derived fuels or alternate fuels. Industrial process data developed for the large energy consuming industries serve as a framework for the cogeneration applications. Ground rules for the study were established and other necessary equipment (balance-of-plant) was defined. This combination of technical information, energy conversion system data ground rules, industrial process information and balance-of-plant characteristics was analyzed to evaluate energy consumption, capital and operating costs and emissions. Data in the form of computer printouts developed for 3000 energy conversion system-industrial process combinations are presented.
Three-dimensional surface reconstruction for industrial computed tomography
NASA Technical Reports Server (NTRS)
Vannier, M. W.; Knapp, R. H.; Gayou, D. E.; Sammon, N. P.; Butterfield, R. L.; Larson, J. W.
1985-01-01
Modern high resolution medical computed tomography (CT) scanners can produce geometrically accurate sectional images of many types of industrial objects. Computer software has been developed to convert serial CT scans into a three-dimensional surface form, suitable for display on the scanner itself. This software, originally developed for imaging the skull, has been adapted for application to industrial CT scanning, where serial CT scans through an object of interest may be reconstructed to demonstrate spatial relationships in three dimensions that cannot be easily understood using the original slices. The methods of three-dimensional reconstruction and solid modeling are reviewed, and reconstruction in three dimensions from CT scans through familiar objects is demonstrated.
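A modern equivalent of the slices-to-surface step is iso-surface extraction by marching cubes. The sketch below uses scikit-image on a synthetic volume; the original software predates this algorithm, so this is only an analogous illustration, and the threshold and test volume are assumptions.

import numpy as np
from skimage import measure

# Stack of serial CT slices forming a 3-D volume; here a synthetic sphere of
# "dense" voxels stands in for the scanned object.
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(np.float32)

# Extract the iso-surface separating object from background.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(verts.shape, faces.shape)   # triangle mesh ready for display or export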
NASA Astrophysics Data System (ADS)
Xu, Jun; Cudel, Christophe; Kohler, Sophie; Fontaine, Stéphane; Haeberlé, Olivier; Klotz, Marie-Louise
2012-04-01
A fabric's smoothness is a key factor in determining the quality of finished textile products and has great influence on the functionality of industrial textiles and high-end textile products. With the popularization of the zero-defect industrial concept, identifying and measuring defective material in the early stage of production is of great interest to the industry. In the current market, many systems are able to achieve automatic monitoring and control of fabric, paper, and nonwoven material during the entire production process; however, online measurement of hairiness is still an open topic and highly desirable for industrial applications. We propose a computer vision approach to compute the epipole by using variable homography, which can be used to measure emergent fiber length on textile fabrics. The main challenges addressed in this paper are the application of variable homography to textile monitoring and measurement, as well as the accuracy of the estimation. We propose that a fibrous structure can be considered as a two-layer structure, and then we show how variable homography combined with epipolar geometry can estimate the length of the fiber defects. Simulations are carried out to show the effectiveness of this method. The true length of selected fibers is measured precisely using a digital optical microscope, and then the same fibers are tested by our method. Our experimental results suggest that smoothness monitored by variable homography is an accurate and robust method of quality control for important industrial fabrics.
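As background for the "variable homography" idea, the standard plane-induced homography between two calibrated views is (a textbook relation, not a formula quoted from the paper): for cameras with intrinsics K and K', relative pose (R, t), and a world plane with unit normal n at depth d from the first camera,

\[ H(d) = K'\left(R + \frac{t\,n^{\top}}{d}\right)K^{-1}. \]

Letting the plane depth d vary generates a one-parameter family of homographies; an imaged fiber tip is consistent with H(d) only at the layer height where it actually lies, which is what allows its emergent length above the fabric surface to be estimated.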
Status of emerging standards for data definitions and transfer in the petroleum industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winczewski, L.M.
1991-03-01
Leading-edge hardware and software to store, retrieve, process, analyze, visualize, and interpret geoscience and petroleum data are improving continuously. A babel of definitions and formats for common industry data items limits the overall effectiveness of these computer-aided exploration and production tools. Custom data conversion required to load applications causes delays and exposes data content to error and degradation. Emerging industry-wide standards for management of geoscience and petroleum-related data are poised to overcome long-standing internal barriers to the full exploitation of these high-tech hardware/software systems. Industry technical organizations, such as AAPG, SEG, and API, have been actively pursuing industry-wide standards for data transfer, data definitions, and data models. These standard-defining groups are non-fee and solicit active participation from the entire petroleum community. The status of the most active of these groups is presented here. Data transfer standards are being pursued within AAPG (AAPG-B Data Transfer Standard), API (DLIS, for log data) and SEG (SEG-DEF, for seismic data). Converging data definitions, models, and glossaries are coming from the Petroleum Industry Data Dictionary Group (PIDD) and from subcommittees of the AAPG Computer Applications Committee. The National Computer Graphics Association is promoting development of standards for transfer of geographically oriented data. The API Well-Number standard is undergoing revision.
National research and education network
NASA Technical Reports Server (NTRS)
Villasenor, Tony
1991-01-01
Some goals of this network are as follows: Extend U.S. technological leadership in high performance computing and computer communications; Provide wide dissemination and application of the technologies both to speed the pace of innovation and to serve the national economy, national security, education, and the global environment; and Spur gains in U.S. productivity and industrial competitiveness by making high performance computing and networking technologies an integral part of the design and production process. Strategies for achieving these goals are as follows: Support solutions to important scientific and technical challenges through a vigorous R and D effort; Reduce the uncertainties to industry for R and D and use of this technology through increased cooperation between government, industry, and universities and by the continued use of government and government funded facilities as a prototype user for early commercial HPCC products; and Support underlying research, network, and computational infrastructures on which U.S. high performance computing technology is based.
Overview of Computer Simulation Modeling Approaches and Methods
Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett
2005-01-01
The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...
Tenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion, part 1
NASA Technical Reports Server (NTRS)
Williams, R. W. (Compiler)
1992-01-01
Experimental and computational fluid dynamic activities in rocket propulsion were discussed. The workshop was an open meeting of government, industry, and academia. A broad range of topics was discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.
BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs
NASA Astrophysics Data System (ADS)
Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes
2017-06-01
Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulation is a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos and ball mills, as laboratory scale tests come at a significant cost. However, the computational time required to simulate an industrial scale simulation which consists of tens of millions of particles can take months to complete on large CPU clusters, making the Discrete Element Method (DEM) unfeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open-source GPU-based DEM code, BlazeDEM3D-GPU, that can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single or multiple GPUs.
A comprehensive overview of the applications of artificial life.
Kim, Kyung-Joong; Cho, Sung-Bae
2006-01-01
We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are shown by application studies. ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, the Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. In order to show the status of ALife application research, this review primarily features a survey of about 180 ALife application articles rather than a selected representation of a few articles. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but presently, many different applications in engineering areas are of interest.
CT Image Sequence Processing For Wood Defect Recognition
Dongping Zhu; R.W. Conners; Philip A. Araman
1991-01-01
The research reported in this paper explores a non-destructive testing application of x-ray computed tomography (CT) in the forest products industry. This application involves a computer vision system that uses CT to locate and identify internal defects in hardwood logs. The knowledge of log defects is critical in deciding whether to veneer or to saw up a log, and how...
Computational Electronics and Electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeFord, J.F.
The Computational Electronics and Electromagnetics thrust area is a focal point for computer modeling activities in electronics and electromagnetics in the Electronics Engineering Department of Lawrence Livermore National Laboratory (LLNL). Traditionally, they have focused their efforts in technical areas of importance to existing and developing LLNL programs, and this continues to form the basis for much of their research. A relatively new and increasingly important emphasis for the thrust area is the formation of partnerships with industry and the application of their simulation technology and expertise to the solution of problems faced by industry. The activities of the thrust area fall into three broad categories: (1) the development of theoretical and computational models of electronic and electromagnetic phenomena, (2) the development of useful and robust software tools based on these models, and (3) the application of these tools to programmatic and industrial problems. In FY-92, they worked on projects in all of the areas outlined above. The object of their work on numerical electromagnetic algorithms continues to be the improvement of time-domain algorithms for electromagnetic simulation on unstructured conforming grids. The thrust area is also investigating various technologies for conforming-grid mesh generation to simplify the application of their advanced field solvers to design problems involving complicated geometries. They are developing a major code suite based on the three-dimensional (3-D), conforming-grid, time-domain code DSI3D. They continue to maintain and distribute the 3-D, finite-difference time-domain (FDTD) code TSAR, which is installed at several dozen university, government, and industry sites.
2011 Computation Directorate Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L
2012-04-11
From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile, far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products.
In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. industry understand how supercomputing can benefit their business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global market place by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.
Tenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion, part 2
NASA Technical Reports Server (NTRS)
Williams, R. W. (Compiler)
1992-01-01
Presented here are 59 abstracts and presentations and three invited presentations given at the Tenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion held at the George C. Marshall Space Flight Center, April 28-30, 1992. The purpose of the workshop is to discuss experimental and computational fluid dynamic activities in rocket propulsion. The workshop is an open meeting for government, industry, and academia. A broad range of topics is discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.
Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion
NASA Technical Reports Server (NTRS)
Williams, R. W. (Compiler)
1993-01-01
Conference publication includes 79 abstracts and presentations and 3 invited presentations given at the Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion held at George C. Marshall Space Flight Center, April 20-22, 1993. The purpose of the workshop is to discuss experimental and computational fluid dynamic activities in rocket propulsion. The workshop is an open meeting for government, industry, and academia. A broad range of topics is discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.
Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion, Part 1
NASA Technical Reports Server (NTRS)
Williams, Robert W. (Compiler)
1993-01-01
Conference publication includes 79 abstracts and presentations given at the Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion held at the George C. Marshall Space Flight Center, April 20-22, 1993. The purpose of this workshop is to discuss experimental and computational fluid dynamic activities in rocket propulsion. The workshop is an open meeting for government, industry, and academia. A broad range of topics is discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.
Industrial machinery noise impact modeling, volume 1
NASA Astrophysics Data System (ADS)
Hansen, C. H.; Kugler, B. A.
1981-07-01
The development of a machinery noise computer model which may be used to assess the effect of occupational noise on the health and welfare of industrial workers is discussed. The purpose of the model is to provide EPA with the methodology to evaluate the personnel noise problem, to identify the equipment types responsible for the exposure and to assess the potential benefits of a given noise control action. Due to its flexibility in design and application, the model and supportive computer program can be used by other federal agencies, state governments, labor and industry as an aid in the development of noise abatement programs.
Two examples of industrial applications of shock physics research
NASA Astrophysics Data System (ADS)
Sanai, Mohsen
1996-05-01
An in-depth understanding of shock physics phenomena has led to many industrial applications. Two recent applications discussed in this paper are a method for assessing explosion safety in industrial plants and a bomb-resistant luggage container for widebody aircraft. Our explosion safety assessment is based on frequent use of computer simulation of postulated accidents to model in detail the detonation of energetic materials, the formation and propagation of the resulting airblast, and the projection of fragments of known material and mass. Using a general load-damage analysis technique referred to as the pressure-impulse (PI) method, we have developed a PC-based computer algorithm that includes a continually expanding library of PI load and damage curves, which can predict and graphically display common structural damage modes and the response of humans to postulated explosion accidents. A second commercial application of shock physics discussed here is a bomb-resistant luggage container for widebody aircraft that can protect the aircraft from a terrorist bomb hidden inside the luggage. This hardened luggage container (HLC) relies on blast management and debris containment provided by a flexible flow-through blanket woven from threads made with a strong lightweight material, such as Spectra or Kevlar. This mitigation blanket forms a continuous and seamless shell around the sides of the luggage container that are parallel to the aircraft axis, leaving the two ends of the container unprotected. When an explosion occurs, the mitigation blanket expands into a nearly circular shell that contains the flying debris while directing the flow into the adjacent containers. The HLC concept has been demonstrated through full-scale experiments conducted at SRI. We believe that these two examples represent a broad class of potential industrial hazard applications of the experimental, analytical, and computational tools possessed by the shock physics community.
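Pressure-impulse diagrams are commonly summarized by hyperbolic iso-damage curves. The sketch below is a hedged illustration of such a check; the asymptotes and curve constant (p0, i0, c) are placeholders, not values from the SRI algorithm or its curve library.

def exceeds_damage_threshold(peak_pressure, impulse, p0=6.9e3, i0=120.0, c=2.0e6):
    """Check a load point against a hyperbolic iso-damage curve
    (P - p0) * (I - i0) = c. peak_pressure in Pa, impulse in Pa*s."""
    if peak_pressure <= p0 or impulse <= i0:
        return False                     # below a pressure or impulse asymptote
    return (peak_pressure - p0) * (impulse - i0) >= c

# A postulated accident load point: 50 kPa peak overpressure, 400 Pa*s impulse.
print(exceeds_damage_threshold(50e3, 400.0))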
10 CFR 431.173 - Requirements applicable to all manufacturers.
Code of Federal Regulations, 2011 CFR
2011-01-01
... COMMERCIAL AND INDUSTRIAL EQUIPMENT Provisions for Commercial Heating, Ventilating, Air-Conditioning and... is based on engineering or statistical analysis, computer simulation or modeling, or other analytic... method or methods used; (B) The mathematical model, the engineering or statistical analysis, computer...
Automated computer grading of hardwood lumber
P. Klinkhachorn; J.P. Franklin; Charles W. McMillin; R.W. Conners; H.A. Huber
1988-01-01
This paper describes an improved computer program to grade hardwood lumber. The program was created as part of a system to automate various aspects of the hardwood manufacturing industry. It enhances previous efforts by considering both faces of the board and provides easy application of species dependent rules. The program can be readily interfaced with a computer...
ERIC Educational Resources Information Center
Branstad, Dennis K., Ed.
The 15 papers and summaries of presentations in this collection provide technical information and guidance offered by representatives from federal agencies and private industry. Topics discussed include physical security, risk assessment, software security, computer network security, and applications and implementation of the Data Encryption…
NASA Technical Reports Server (NTRS)
Huang, C. J.; Motard, R. L.
1978-01-01
The computing equipment in the engineering systems simulation laboratory of the Houston University Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research, in psychology, and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. Research supports programs in acoustics, energy technology, systems engineering, and environment management as well as aerospace engineering.
Industry involvement in IPAD through the Industry Technical Advisory Board
NASA Technical Reports Server (NTRS)
Swanson, W. E.
1980-01-01
In 1976 NASA awarded The Boeing Company a contract to develop IPAD (Integrated Programs for Aerospace-Vehicle Design). This contract included a requirement for Boeing to form an Industrial Technical Advisory Board (ITAB), with members representing major aerospace and computer companies. The purpose of this board was to guide the development of IPAD. The specific goal of IPAD is to increase United States aerospace industry productivity through the application of computers to manage engineering data. This goal clearly is attainable; in fact, IPAD's influence can reach beyond the aerospace industry to many businesses where product development is based on the design-building process. An enhanced IPAD, therefore, is a national asset of significance. The role of ITAB in guiding the development of this system is described.
Wireless Communications in Reverberant Environments
2015-01-01
Fragmentary excerpt; the recoverable content mentions the Secure Wireless Agent Testbed (SWAT), the Protocol Engineering Advanced Networking (PROTEAN) Research Group, and the Data Fusion Laboratory (DFL), together with bibliography entries on industrial wireless sensor networks (challenges and design principles) and on path-loss estimation for a wireless sensor network application in ships.
Lattice Boltzmann computation of creeping fluid flow in roll-coating applications
NASA Astrophysics Data System (ADS)
Rajan, Isac; Kesana, Balashanker; Perumal, D. Arumuga
2018-04-01
The Lattice Boltzmann Method (LBM) has advanced as a class of Computational Fluid Dynamics (CFD) methods used to solve complex fluid systems and heat transfer problems, and it has increasingly attracted the interest of researchers in computational physics for challenging problems of industrial and academic importance. In the current study, LBM is applied to simulate the creeping fluid flow phenomena commonly encountered in manufacturing technologies. In particular, we apply this method to simulate the fluid flow associated with the "meniscus roll coating" application. This prevalent industrial problem, encountered in polymer processing and thin-film coating applications, is modelled as a standard lid-driven cavity problem to which creeping flow analysis is applied. This incompressible viscous flow problem is studied at various speed ratios (the ratio of upper to lower lid speed) in two configurations of lid movement: parallel and anti-parallel wall motion. The flow exhibits interesting patterns which will aid the design of roll coaters.
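The paper's exact numerical setup is not given in the abstract; the following is a minimal sketch of the core D2Q9 collide-and-stream update that such a simulation builds on, with assumed parameter values and periodic streaming. The moving-lid (bounce-back) boundary treatment needed for the actual cavity problem is omitted for brevity.

    import numpy as np

    # D2Q9 lattice constants: discrete velocities c_a and weights w_a
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

    def equilibrium(rho, ux, uy):
        """Second-order equilibrium distribution f_a^eq(rho, u)."""
        cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
        usq = ux**2 + uy**2
        return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

    def collide_and_stream(f, tau):
        """One BGK collision step followed by streaming along each c_a."""
        rho = f.sum(axis=0)
        ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
        f = f - (f - equilibrium(rho, ux, uy)) / tau       # BGK relaxation
        for a in range(9):                                  # streaming (periodic here)
            f[a] = np.roll(np.roll(f[a], c[a, 0], axis=0), c[a, 1], axis=1)
        return f

    # Example: advance a quiescent 64x64 field; in the cavity problem the moving
    # lid would drive the flow through the (omitted) boundary condition.
    N, tau = 64, 0.8                       # tau sets viscosity nu = (tau - 0.5)/3
    f = equilibrium(np.ones((N, N)), np.zeros((N, N)), np.zeros((N, N)))
    for _ in range(100):
        f = collide_and_stream(f, tau)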
VRML Industry: Microcosms in the Making.
ERIC Educational Resources Information Center
Brown, Eric
1998-01-01
Discusses VRML (Virtual Reality Modeling Language) technology and some of its possible applications, including creating three-dimensional images on the Web, advertising, and data visualization in computer-assisted design and computer-assisted manufacturing (CAD/CAM). Future improvements are discussed, including streaming, database support, and…
PRACE - The European HPC Infrastructure
NASA Astrophysics Data System (ADS)
Stadelmeyer, Peter
2014-05-01
The mission of PRACE (Partnership for Advanced Computing in Europe) is to enable high impact scientific discovery and engineering research and development across all disciplines to enhance European competitiveness for the benefit of society. PRACE seeks to realize this mission by offering world class computing and data management resources and services through a peer review process. This talk gives a general overview about PRACE and the PRACE research infrastructure (RI). PRACE is established as an international not-for-profit association and the PRACE RI is a pan-European supercomputing infrastructure which offers access to computing and data management resources at partner sites distributed throughout Europe. Besides a short summary about the organization, history, and activities of PRACE, it is explained how scientists and researchers from academia and industry from around the world can access PRACE systems and which education and training activities are offered by PRACE. The overview also contains a selection of PRACE contributions to societal challenges and ongoing activities. Examples of the latter include, among others, petascaling, an application benchmark suite, best-practice guides for efficient use of key architectures, application enabling and scaling, new programming models, and industrial applications. The Partnership for Advanced Computing in Europe (PRACE) is an international non-profit association with its seat in Brussels. The PRACE Research Infrastructure provides a persistent world-class high performance computing service for scientists and researchers from academia and industry in Europe. The computer systems and their operations accessible through PRACE are provided by 4 PRACE members (BSC representing Spain, CINECA representing Italy, GCS representing Germany and GENCI representing France). The Implementation Phase of PRACE receives funding from the EU's Seventh Framework Programme (FP7/2007-2013) under grant agreements RI-261557, RI-283493 and RI-312763. For more information, see www.prace-ri.eu
1990-06-01
Fragmentary excerpt from the report documentation page and abstract; the recoverable content includes a caution that the computer programs developed in this research may not have been exercised for all cases of interest, and a note that previous applications of the encoding formats discussed were on industry-standard computers (PCs) over a 16-20 kHz channel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zheng; Ukida, H.; Ramuhalli, Pradeep
2010-06-05
Imaging- and vision-based techniques play an important role in industrial inspection. The sophistication of the techniques assures high-quality performance of the manufacturing process through precise positioning, online monitoring, and real-time classification. Advanced systems incorporating multiple imaging and/or vision modalities provide robust solutions to complex situations and problems in industrial applications. A diverse range of industries, including aerospace, automotive, electronics, pharmaceutical, biomedical, semiconductor, and food/beverage, etc., have benefited from recent advances in multi-modal imaging, data fusion, and computer vision technologies. Many of the open problems in this context are in the general area of image analysis methodologies (preferably in an automated fashion). This editorial article introduces a special issue of this journal highlighting recent advances and demonstrating the successful applications of integrated imaging and vision technologies in industrial inspection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, R.E.
1983-11-01
The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care and retailing are outlined. An analysis is given of research efforts in the US, Japan, UK, and Europe. Fifth-generation programming languages are detailed.
NASA Technical Reports Server (NTRS)
Williams, R. W. (Compiler)
1996-01-01
This conference publication includes various abstracts and presentations given at the 13th Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion and Launch Vehicle Technology held at the George C. Marshall Space Flight Center, April 25-27, 1995. The purpose of the workshop was to discuss experimental and computational fluid dynamic activities in rocket propulsion and launch vehicles. The workshop was an open meeting for government, industry, and academia. A broad range of topics was discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.
A Database Approach to Computer Integrated Manufacturing
1988-06-01
Fragmentary excerpt; the recoverable content mentions advanced application areas such as tactical weapons systems and industrial manufacturing systems, definitions of the functions most prevalent in the research, and the IGES specification [Ref. 9] and the Product Definition Data Interface (PDDI) [Ref. 10], with IGES considered an industry standard.
Molecular electronics: The technology of sixth generation computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarvis, M.T.; Miller, R.K.
1987-01-01
In February 1986, Japan began the 6th Generation project. At the 1987 Economic Summit in Venice, Prime Minister Yasuhiro Nakasone opened the project to world collaboration. A project director suggests that the 6th Generation "may just be a turning point for human society." The major rationale for building molecular electronic devices is to achieve advances in computational densities and speeds. Proposed chromophore chains for molecular-scale chips, for example, could be spaced closer than today's silicon elements by a factor of almost 100. This book describes the research and proposed designs for molecular electronic devices and computers. It examines specific potential applications and the relationship of molecular electronics to silicon technology, presents the first published survey of experts on research issues, applications, and forecasts of future developments, and also includes a market forecast. An interesting suggestion of the survey is that the chemical industry may become a significant factor in the computer industry as the sixth generation unfolds.
Electron tubes for industrial applications
NASA Astrophysics Data System (ADS)
Gellert, Bernd
1994-05-01
This report reviews research and development efforts of recent years on vacuum electron tubes, in particular power grid tubes for industrial applications. Physical and chemical effects that determine the performance of today's devices are discussed. Due to progress in the fundamental understanding of materials and newly developed processes, the reliability and reproducibility of power grid tubes could be improved considerably. Modern computer-controlled manufacturing methods ensure a high reproducibility of production, and continuous quality certification according to ISO 9001 guarantees future high quality standards. Some typical applications of these tubes are given as examples.
Discovery of the Kalman filter as a practical tool for aerospace and industry
NASA Technical Reports Server (NTRS)
Mcgee, L. A.; Schmidt, S. F.
1985-01-01
The sequence of events which led the researchers at Ames Research Center to the early discovery of the Kalman filter shortly after its introduction into the literature is recounted. The scientific breakthroughs and reformulations that were necessary to transform Kalman's work into a useful tool for a specific aerospace application are described. The resulting extended Kalman filter, as it is now known, is often still referred to simply as the Kalman filter. As the filter's use gained in popularity in the scientific community, the problems of implementation on small spaceborne and airborne computers led to a square-root formulation of the filter to overcome numerical difficulties associated with computer word length. The work that led to this new formulation is also discussed, including the first airborne computer implementation and flight test. Since then the applications of the extended and square-root formulations of the Kalman filter have grown rapidly throughout the aerospace industry.
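The abstract describes the filter's history without giving its equations; for background, here is a minimal sketch of one predict/update cycle of the basic linear Kalman filter, with illustrative (assumed) matrices. The extended and square-root formulations discussed above build on this cycle but are not shown here.

    import numpy as np

    def kalman_step(x, P, z, F, Q, H, R):
        """One predict/update cycle of a linear Kalman filter.
        x, P : prior state estimate and covariance
        z    : new measurement
        F, Q : state-transition model and process-noise covariance
        H, R : measurement model and measurement-noise covariance"""
        # Predict
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Illustrative use: constant-velocity model, noisy position measurements
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = 1e-4 * np.eye(2)
    R = np.array([[0.25]])
    x, P = np.zeros(2), np.eye(2)
    for z in [1.1, 2.0, 2.9, 4.2]:               # assumed noisy readings
        x, P = kalman_step(x, P, np.array([z]), F, Q, H, R)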
Second Computational Aeroacoustics (CAA) Workshop on Benchmark Problems
NASA Technical Reports Server (NTRS)
Tam, C. K. W. (Editor); Hardin, J. C. (Editor)
1997-01-01
The proceedings of the Second Computational Aeroacoustics (CAA) Workshop on Benchmark Problems held at Florida State University are the subject of this report. For this workshop, problems arising in typical industrial applications of CAA were chosen. Comparisons between numerical solutions and exact solutions are presented where possible.
The NASA NASTRAN structural analysis computer program - New content
NASA Technical Reports Server (NTRS)
Weidman, D. J.
1978-01-01
Capabilities of a NASA-developed structural analysis computer program, NASTRAN, are evaluated with reference to finite-element modelling. Applications include the automotive industry as well as aerospace. It is noted that the range of sub-programs within NASTRAN has expanded, while keeping user cost low.
Evaluation of general non-reflecting boundary conditions for industrial CFD applications
NASA Astrophysics Data System (ADS)
Basara, Branislav; Frolov, Sergei; Lidskii, Boris; Posvyanskii, Vladimir
2007-11-01
The importance of having proper boundary conditions for the calculation domain is a known issue in Computational Fluid Dynamics (CFD). In many situations, it is very difficult to define a correct boundary condition. The flow may enter and leave the computational domain at the same time and at the same boundary. In such circumstances, it is important that the numerical implementation of boundary conditions enforces certain physical constraints leading to correct results, which then ensures a better convergence rate. The aim of this paper is to evaluate recently proposed non-reflecting boundary conditions (Frolov et al., 2001, Advances in Chemical Propulsion) on industrial CFD applications. Derivation of the local non-reflecting boundary conditions at the open boundary is based on finding the solution of linearized Euler equations vanishing at infinity for both incompressible and compressible formulations. This is implemented into the in-house CFD package AVL FIRE and some numerical details will be presented as well. The key applications in this paper are from the automotive industry, e.g., external car aerodynamics, an intake port, etc. The results will show the benefits of using effective non-reflecting boundary conditions.
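The abstract does not reproduce the derivation. As background (and not necessarily the specific formulation of Frolov et al.), the simplest one-dimensional characteristic statement of a non-reflecting condition for the linearized Euler equations at a subsonic outflow boundary holds the incoming characteristic invariant fixed, so that no wave is reflected back into the domain:

\[
\frac{\partial}{\partial t}\bigl(p' - \rho_0 c_0\, u'\bigr) = 0 \quad \text{at the outflow boundary},
\]

where p' and u' are the pressure and velocity perturbations and \rho_0 and c_0 are the mean density and speed of sound; the outgoing invariant p' + \rho_0 c_0 u' is left to be determined by the interior solution.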
CSM Testbed Development and Large-Scale Structural Applications
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.
1989-01-01
A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Computational structural mechanics methods research using an evolving framework
NASA Technical Reports Server (NTRS)
Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.
1990-01-01
Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.
Surface Modeling, Grid Generation, and Related Issues in Computational Fluid Dynamic (CFD) Solutions
NASA Technical Reports Server (NTRS)
Choo, Yung K. (Compiler)
1995-01-01
The NASA Steering Committee for Surface Modeling and Grid Generation (SMAGG) sponsored a workshop on surface modeling, grid generation, and related issues in Computational Fluid Dynamics (CFD) solutions at Lewis Research Center, Cleveland, Ohio, May 9-11, 1995. The workshop provided a forum to identify industry needs, strengths, and weaknesses of the five grid technologies (patched structured, overset structured, Cartesian, unstructured, and hybrid), and to exchange thoughts about where each technology will be in 2 to 5 years. The workshop also provided opportunities for engineers and scientists to present new methods, approaches, and applications in SMAGG for CFD. This Conference Publication (CP) consists of papers on industry overview, NASA overview, five grid technologies, new methods/ approaches/applications, and software systems.
NASA Technical Reports Server (NTRS)
Williams, R. W. (Compiler)
1996-01-01
The purpose of the workshop was to discuss experimental and computational fluid dynamic activities in rocket propulsion and launch vehicles. The workshop was an open meeting for government, industry, and academia. A broad number of topics were discussed including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
Transfer of space technology to industry
NASA Technical Reports Server (NTRS)
Hamilton, J. T.
1974-01-01
Some of the most significant applications of the NASA aerospace technology transfer to industry and other government agencies are briefly outlined. The technology utilization program encompasses computer programs for structural problems, life support systems, fuel cell development, and rechargeable cardiac pacemakers as well as reliability and quality research for oil recovery operations and pollution control.
Application of SAE ARP4754A to Flight Critical Systems
NASA Technical Reports Server (NTRS)
Peterson, Eric M.
2015-01-01
This report documents applications of ARP4754A to the development of modern computer-based (i.e., digital electronics, software and network-based) aircraft systems. The study offers insight and educational value relative to the guidelines in ARP4754A and provides an assessment of the current state of the practice within industry and regulatory bodies relative to development assurance for complex and safety-critical computer-based aircraft systems.
Evaluating the Efficacy of the Cloud for Cluster Computation
NASA Technical Reports Server (NTRS)
Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom
2012-01-01
Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Compute Cloud (EC2) that promises improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
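A quick sanity check on the reported HPL figures (the measured result and efficiency come from the abstract; the per-core peak is inferred here, not stated in the paper):

    # Back-of-the-envelope check of the reported HPL efficiency
    cores = 240
    hpl_tflops = 2.0          # measured High-Performance Linpack result
    efficiency = 0.70         # fraction of theoretical peak reported

    peak_tflops = hpl_tflops / efficiency          # ~2.86 TFLOPS theoretical peak
    per_core_gflops = peak_tflops * 1e3 / cores    # ~11.9 GFLOPS per core (inferred)
    print(f"theoretical peak ~{peak_tflops:.2f} TFLOPS, "
          f"~{per_core_gflops:.1f} GFLOPS/core")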
The Computer's Debt to Science.
ERIC Educational Resources Information Center
Branscomb, Lewis M.
1984-01-01
Discusses discoveries and applications of science that have enabled the computer industry to introduce new technology each year and produce 25 percent more for the customer at constant cost. Potential limits to progress, disc storage technology, programming and end-user interface, and designing for ease of use are considered. Glossary is included.…
NASA CST aids U.S. industry. [computational structures technology
NASA Technical Reports Server (NTRS)
Housner, Jerry M.; Pinson, Larry D.
1993-01-01
The effect of NASA's Computational Structures Technology (CST) research on aerospace vehicle design and operation is discussed. The application of this research to a proposed version of a high-speed civil transport, to composite structures in aerospace, to the study of crack growth, and to resolving field problems is addressed.
SIAM Conference on Geometric Design and Computing. Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2002-03-11
The SIAM Conference on Geometric Design and Computing attracted 164 domestic and international researchers, from academia, industry, and government. It provided a stimulating forum in which to learn about the latest developments, to discuss exciting new research directions, and to forge stronger ties between theory and applications. Final Report
Brilliant gamma beams for industrial applications: new opportunities, new challenges
NASA Astrophysics Data System (ADS)
Iancu, V.; Suliman, G.; Turturica, G. V.; Iovea, M.; Daito, I.; Ohgaki, H.; Matei, C.; Ur, C. A.; Balabanski, D. L.
2016-10-01
The Nuclear Physics oriented pillar of the pan-European Extreme Light Infrastructure (ELI-NP) will host an ultra-bright, energy-tunable, and quasi-monochromatic gamma-ray beam system in the range of 0.2-19.5 MeV produced by the laser-Compton backscattering technique. The applied research program envisioned at ELI-NP aims to use nuclear resonance fluorescence (NRF) and computed tomography to provide new opportunities for industry and society. High-sensitivity NRF-based investigations can be successfully applied to safeguards applications and the management of radioactive wastes, as well as to uncharted fields like cultural heritage and medical imaging. Gamma-ray radioscopy and computed tomography performed at ELI-NP have the potential to achieve high resolution in industrial-sized objects, provided the detection challenges introduced by the unique characteristics of the gamma beam are overcome. Here we discuss the foreseen industrial applications that will benefit from the high quality and unique characteristics of the ELI-NP gamma beam and the challenges they present. We present the experimental setups proposed to be implemented for this goal, discuss their performance based on analytical calculations and numerical Monte Carlo simulations, and comment on constraints imposed by the limitations of current scintillator detectors. Several gamma-beam monitoring devices based on scintillator detectors are also discussed.
Real-Time Pattern Recognition - An Industrial Example
NASA Astrophysics Data System (ADS)
Fitton, Gary M.
1981-11-01
Rapid advancements in cost-effective sensors and microcomputers are now making practical the on-line implementation of pattern-recognition-based systems for a variety of industrial applications requiring high processing speeds. One major application area for real-time pattern recognition is the sorting of packaged/cartoned goods at high speed for automated warehousing and returned-goods cataloging. While many OCR and bar code readers are available to perform these functions, it is often impractical to use such codes (package too small, adverse esthetics, poor print quality), and an approach that recognizes an item by its graphic content alone is desirable. This paper describes a specific application within the tobacco industry: sorting returned cigarette goods by brand and size.
Application research of Ganglia in Hadoop monitoring and management
NASA Astrophysics Data System (ADS)
Li, Gang; Ding, Jing; Zhou, Lixia; Yang, Yi; Liu, Lei; Wang, Xiaolei
2017-03-01
Hadoop systems have many applications in the fields of big data and cloud computing. The storage and application test bench for the seismic network at the Earthquake Administration of Tianjin uses a Hadoop system, which is operated and monitored with the open-source software Ganglia. This paper reviews the functions, the installation and configuration process, and the operating and monitoring results of the Ganglia system applied to the Hadoop system. It also briefly introduces the idea and effect of monitoring the Hadoop system with the Nagios software. This experience is valuable for the industry in building monitoring systems for cloud computing platforms.
Evaluation of Service Level Agreement Approaches for Portfolio Management in the Financial Industry
NASA Astrophysics Data System (ADS)
Pontz, Tobias; Grauer, Manfred; Kuebert, Roland; Tenschert, Axel; Koller, Bastian
The idea of service-oriented Grid computing has the potential for a fundamental paradigm change and a new architectural alignment in the design of IT infrastructures. A wide range of technical approaches from scientific communities describes basic infrastructures and middleware for integrating Grid resources, so that Grid applications are by now technically realizable. However, Grid computing needs viable business models and enhanced infrastructures to move from academic to commercial application, and commercial use of these developments requires service level agreements. The approaches developed so far are primarily of academic interest and have mostly not been put into practice. Based on a business use case from the financial industry, five service level agreement approaches are evaluated in this paper. Based on the evaluation, a management architecture has been designed and implemented as a prototype.
The design of an irradiator for the continuous processing of liquid latex
NASA Astrophysics Data System (ADS)
Reuter, O.; Langley, R.; Zn, Wan Manshol Bin W.
1998-06-01
This paper presents a new design concept for a gamma irradiation plant for the continuous processing of pumpable liquids. Typical applications of such a plant include:
∗ the irradiation vulcanisation of natural latex rubber
∗ disinfection of municipal sewage sludge for agricultural use
∗ sterilisation of liquids in the pharmaceutical and cosmetics industries
∗ industrial processing of bulk liquids
The authors describe the design and operation of the latex irradiator now operating on a small production scale in Malaysia, together with proposed developments. The design allows irradiation processing to be carried out under an inert or other gaseous environment. A state-of-the-art computer control system ensures the fully automatic processing operation needed for industrial-scale processing.
Norton, Tomás; Sun, Da-Wen; Grant, Jim; Fallon, Richard; Dodd, Vincent
2007-09-01
The application of computational fluid dynamics (CFD) in the agricultural industry is becoming ever more important. Over the years, the versatility, accuracy and user-friendliness offered by CFD has led to its increased take-up by the agricultural engineering community. Now CFD is regularly employed to solve environmental problems of greenhouses and animal production facilities. However, due to a combination of increased computer efficacy and advanced numerical techniques, the realism of these simulations has only been enhanced in recent years. This study provides a state-of-the-art review of CFD, its current applications in the design of ventilation systems for agricultural production systems, and the outstanding challenging issues that confront CFD modellers. The current status of greenhouse CFD modelling was found to be at a higher standard than that of animal housing, owing to the incorporation of user-defined routines that simulate crop biological responses as a function of local environmental conditions. Nevertheless, the most recent animal housing simulations have addressed this issue and in turn have become more physically realistic.
Survey shows continued strong interest in UNIX applications for healthcare.
Dunbar, C
1993-03-01
As part of the general computer industry movement toward open systems, many are predicting UNIX will become the dominant host operating system of the late 1990s. To better understand this prediction within the healthcare setting, Computers in Healthcare surveyed our readership about their opinions of UNIX, its current use and its relative importance as an information services strategy. The upshot? CIH readers definitely want more systems on UNIX, more healthcare applications written for UNIX and more trained resource people to help them with faster installation and more useful applications.
Dynamics of gas-thrust bearings
NASA Technical Reports Server (NTRS)
Stiffler, A. K.; Tapia, R. R.
1978-01-01
Computer program calculates load coefficients, up to third harmonic, for hydrostatic gas thrust bearings. Program is useful in identification of industrial situations where gas-thrust bearings have potential applications.
Computer Technology for Industry
NASA Technical Reports Server (NTRS)
1979-01-01
In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.
The Sunrise project: An R&D project for a national information infrastructure prototype
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Juhnyoung
1995-02-01
Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure (NII) development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multimedia technologies, and data mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; and (3) To define a new way of collaboration between computer science and industrially relevant research.
NASA Astrophysics Data System (ADS)
Bian, Jun; Fu, Huijian; Shang, Qian; Zhou, Xiangyang; Ma, Qingguo
This paper analyzes the outstanding problems in current industrial production by reviewing the three stages of industrial engineering development. Based on investigations and interviews in enterprises, we propose the new idea of applying "computer video analysis technology" in new industrial engineering management software, and of adding a "loose coefficient" for each workstation to this software in order to arrange production that is both scientific and humane. Meanwhile, we suggest utilizing biofeedback technology to promote further research on "the rules of workers' physiological, psychological and emotional changes in production". This combination will push forward industrial engineering theory and help enterprises progress towards flexible social production; it is therefore of great value for theoretical innovation, social significance, and practical application.
Cloud Computing Security Issue: Survey
NASA Astrophysics Data System (ADS)
Kamal, Shailza; Kaur, Rajpreet
2011-12-01
Cloud computing has been a growing field in the IT industry since it was proposed by IBM in 2007. Other companies such as Google, Amazon, and Microsoft provide further cloud computing products. Cloud computing is Internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS and PaaS, and the services and resources are shared through virtualization, which runs multiple applications on the cloud. This discussion surveys the security challenges in cloud computing and describes some standards and protocols that show how security can be managed.
Multiple-User, Multitasking, Virtual-Memory Computer System
NASA Technical Reports Server (NTRS)
Generazio, Edward R.; Roth, Don J.; Stang, David B.
1993-01-01
Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.
Computer programs: Operational and mathematical, a compilation
NASA Technical Reports Server (NTRS)
1973-01-01
Several computer programs which are available through the NASA Technology Utilization Program are outlined. Presented are: (1) Computer operational programs which can be applied to resolve procedural problems swiftly and accurately. (2) Mathematical applications for the resolution of problems encountered in numerous industries. Although the functions which these programs perform are not new and similar programs are available in many large computer center libraries, this collection may be of use to centers with limited systems libraries and for instructional purposes for new computer operators.
NASA Astrophysics Data System (ADS)
Zheng, Pai; Wang, Honghui; Sang, Zhiqian; Zhong, Ray Y.; Liu, Yongkui; Liu, Chao; Mubarok, Khamdi; Yu, Shiqiang; Xu, Xun
2018-06-01
Information and communication technology is undergoing rapid development, and many disruptive technologies, such as cloud computing, Internet of Things, big data, and artificial intelligence, have emerged. These technologies are permeating the manufacturing industry and enable the fusion of physical and virtual worlds through cyber-physical systems (CPS), which mark the advent of the fourth stage of industrial production (i.e., Industry 4.0). The widespread application of CPS in manufacturing environments renders manufacturing systems increasingly smart. To advance research on the implementation of Industry 4.0, this study examines smart manufacturing systems for Industry 4.0. First, a conceptual framework of smart manufacturing systems for Industry 4.0 is presented. Second, demonstrative scenarios that pertain to smart design, smart machining, smart control, smart monitoring, and smart scheduling, are presented. Key technologies and their possible applications to Industry 4.0 smart manufacturing systems are reviewed based on these demonstrative scenarios. Finally, challenges and future perspectives are identified and discussed.
Spring 2006. Industry Study. Manufacturing Industry
2006-01-01
Fragmentary excerpt; the recoverable content states that the U.S. is today the global leader in manufacturing innovation and technology, refers to more than ninety percent of all annual U.S. patents as reported by the Department of Commerce, and lists topics including mobilization, innovation and technology, the manufacturing transformation, environmental balance, and international travel impressions.
NASA Astrophysics Data System (ADS)
Jimenez, Edward S.; Thompson, Kyle R.; Stohn, Adriana; Goodner, Ryan N.
2017-09-01
Sandia National Laboratories has recently developed the capability to acquire multi-channel radiographs for multiple research and development applications in industry and security. This capability allows x-ray radiographs or sinogram data to be acquired at up to 300 keV with up to 128 channels per pixel. This work investigates whether multiple quality metrics for computed tomography can actually benefit from binned projection data compared to traditionally acquired grayscale sinogram data. Features and metrics to be evaluated include the ability to distinguish between two different materials with similar absorption properties, artifact reduction, and signal-to-noise for both raw data and reconstructed volumetric data. The impact of this technology on non-destructive evaluation, national security, and industry is wide-ranging, and it has the potential to improve upon many inspection methods such as dual-energy methods, material identification, object segmentation, and computer vision on radiographs.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named as AS-II), which makes the application of FTA simpler, quicker, and cheaper; thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on the methodology we have developed a computer-automated tool. The details are presented in this paper.
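The probability-propagation idea behind FTA can be illustrated in a few lines. This is a generic sketch assuming independent basic events with hypothetical names and probabilities; it does not reproduce the AS-II algorithm described in the paper.

    # Minimal fault tree probability propagation (illustrative only)
    def or_gate(*p):
        """P(at least one input event occurs), independent inputs."""
        q = 1.0
        for pi in p:
            q *= (1.0 - pi)
        return 1.0 - q

    def and_gate(*p):
        """P(all input events occur), independent inputs."""
        q = 1.0
        for pi in p:
            q *= pi
        return q

    # Hypothetical tree: top event "release" =
    #   (pump seal fails AND alarm fails) OR (pipe rupture)
    p_seal, p_alarm, p_pipe = 1e-3, 5e-2, 1e-5
    p_top = or_gate(and_gate(p_seal, p_alarm), p_pipe)
    print(f"P(top event) = {p_top:.2e}")   # ~6.0e-05 with these assumed values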
Job-shop scheduling applied to computer vision
NASA Astrophysics Data System (ADS)
Sebastian y Zuniga, Jose M.; Torres-Medina, Fernando; Aracil, Rafael; Reinoso, Oscar; Jimenez, Luis M.; Garcia, David
1997-09-01
This paper presents a method for minimizing the total elapsed time spent by n tasks running on m different processors working in parallel. The developed algorithm not only minimizes the total elapsed time but also reduces the idle time and the waiting time of in-process tasks. This is very important in applications of computer vision in which the time to finish the whole process is particularly critical -- quality control in industrial inspection, real-time computer vision, guided robots. The scheduling algorithm is based on two matrices obtained from the precedence relationships between tasks and on the data derived from those matrices. The developed scheduling algorithm has been tested in an application of quality control using computer vision, and the results obtained with different image processing algorithms have been satisfactory.
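For orientation, a generic greedy list-scheduling heuristic for precedence-constrained tasks on m identical processors is sketched below. It illustrates the class of problem being solved, not the authors' specific two-matrix algorithm, and the task durations and precedences are hypothetical.

    def list_schedule(durations, preds, m):
        """Greedy precedence-constrained list scheduling on m identical processors.
        Returns per-task (start, finish, processor) and the makespan."""
        n = len(durations)
        finish = {}
        proc_free = [0.0] * m
        schedule = {}
        unscheduled = set(range(n))
        while unscheduled:
            # tasks whose predecessors have all been scheduled already
            ready = [t for t in unscheduled
                     if all(p in finish for p in preds.get(t, []))]
            # among ready tasks, pick the one that can start earliest
            best = None
            for t in ready:
                pred_done = max((finish[p] for p in preds.get(t, [])), default=0.0)
                k = min(range(m), key=lambda i: proc_free[i])   # earliest-free processor
                start = max(proc_free[k], pred_done)
                if best is None or start < best[0]:
                    best = (start, t, k)
            start, t, k = best
            finish[t] = start + durations[t]
            proc_free[k] = finish[t]
            schedule[t] = (start, finish[t], k)
            unscheduled.remove(t)
        return schedule, max(finish.values())

    durations = [3, 2, 2, 4, 1]                    # hypothetical processing times
    preds = {2: [0], 3: [0, 1], 4: [2, 3]}         # precedence constraints
    sched, makespan = list_schedule(durations, preds, m=2)
    print(sched, makespan)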
NASA Astrophysics Data System (ADS)
Komen, E. M. J.; Camilo, L. H.; Shams, A.; Geurts, B. J.; Koren, B.
2017-09-01
LES for industrial applications with complex geometries is mostly characterised by: a) a finite volume CFD method using a non-staggered arrangement of the flow variables and second order accurate spatial and temporal discretisation schemes, b) an implicit top-hat filter, where the filter length is equal to the local computational cell size, and c) eddy-viscosity type LES models. LES based on these three main characteristics is indicated as industrial LES in this paper. It becomes increasingly clear that the numerical dissipation in CFD codes typically used in industrial applications with complex geometries may inhibit the predictive capabilities of explicit LES. Therefore, there is a need to quantify the numerical dissipation rate in such CFD codes. In this paper, we quantify the numerical dissipation rate in physical space based on an analysis of the transport equation for the mean turbulent kinetic energy. Using this method, we quantify the numerical dissipation rate in a quasi-Direct Numerical Simulation (DNS) and in under-resolved DNS of, as a basic demonstration case, fully-developed turbulent channel flow. With quasi-DNS, we indicate a DNS performed using a second order accurate finite volume method typically used in industrial applications. Furthermore, we determine and explain the trends in the performance of industrial LES for fully-developed turbulent channel flow for four different Reynolds numbers for three different LES mesh resolutions. The presented explanation of the mechanisms behind the observed trends is based on an analysis of the turbulent kinetic energy budgets. The presented quantitative analyses demonstrate that the numerical errors in the industrial LES computations of the considered turbulent channel flows result in a net numerical dissipation rate which is larger than the subgrid-scale dissipation rate. No new computational methods are presented in this paper. Instead, the main new elements in this paper are our detailed quantification method for the numerical dissipation rate, the application of this method to a quasi-DNS and under-resolved DNS of fully-developed turbulent channel flow, and the explanation of the effects of the numerical dissipation on the observed trends in the performance of industrial LES for fully-developed turbulent channel flows.
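The abstract describes the quantification method only in words. Schematically, for statistically steady fully-developed channel flow, the budget-residual idea amounts to estimating the numerical dissipation rate from the otherwise computable terms of the mean resolved turbulent kinetic energy budget (a sketch of the idea, not the paper's exact decomposition):

\[
\varepsilon_{\mathrm{num}} \;\approx\; P_k \;-\; \varepsilon_{\nu} \;-\; \varepsilon_{\mathrm{sgs}} \;-\; \nabla \cdot T_k ,
\]

where P_k is the production rate, \varepsilon_{\nu} the resolved viscous dissipation, \varepsilon_{\mathrm{sgs}} the subgrid-scale (eddy-viscosity) dissipation, and \nabla \cdot T_k collects the transport terms of the mean resolved turbulent kinetic energy.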
Electronic Data Interchange: Selected Issues and Trends.
ERIC Educational Resources Information Center
Wigand, Rolf T.; And Others
1993-01-01
Describes electronic data interchange (EDI) as the application-to-application exchange of business documents in a computer-readable format. Topics discussed include EDI in various industries, EDI in finance and banking, organizational impacts of EDI, future EDI markets and organizations, and implications for information resources management.…
ERIC Educational Resources Information Center
San Jose State Coll., CA.
The papers from a conference on computer communication networks are divided into five groups--trends, applications, problems and impairments, solutions and tools, impact on society and education. The impact of such developing technologies as cable television, the "wired nation," the telephone industry, and analog data storage is…
Ma, Ji; Sun, Da-Wen; Qu, Jia-Huan; Liu, Dan; Pu, Hongbin; Gao, Wen-Hong; Zeng, Xin-An
2016-01-01
With consumer concerns increasing over food quality and safety, the food industry has begun to pay much more attention to the development of rapid and reliable food-evaluation systems over the years. As a result, there is a great need for manufacturers and retailers to operate effective real-time assessments for food quality and safety during food production and processing. Computer vision, comprising a nondestructive assessment approach, has the aptitude to estimate the characteristics of food products with its advantages of fast speed, ease of use, and minimal sample preparation. Specifically, computer vision systems are feasible for classifying food products into specific grades, detecting defects, and estimating properties such as color, shape, size, surface defects, and contamination. Therefore, in order to track the latest research developments of this technology in the agri-food industry, this review aims to present the fundamentals and instrumentation of computer vision systems with details of applications in quality assessment of agri-food products from 2007 to 2013 and also discuss its future trends in combination with spectroscopy.
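As a toy illustration of the kinds of features such systems extract (product area/size and mean colour), here is a minimal sketch with a crude brightness-threshold segmentation on a synthetic image. Real grading systems use calibrated imaging and far more sophisticated segmentation and classification; the threshold and test image below are assumptions for demonstration only.

    import numpy as np

    def simple_features(rgb_image, threshold=0.5):
        """Return the pixel area and mean RGB colour of the bright product region,
        assuming a bright product on a dark background (crude segmentation)."""
        gray = rgb_image.mean(axis=2)
        mask = gray > threshold * gray.max()
        area_px = int(mask.sum())
        mean_rgb = rgb_image[mask].mean(axis=0) if area_px else np.zeros(3)
        return {"area_px": area_px, "mean_rgb": mean_rgb}

    # Synthetic example: a bright square "sample" on a dark field
    img = np.zeros((100, 100, 3))
    img[30:70, 30:70] = [0.8, 0.6, 0.2]
    print(simple_features(img))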
John, Temitope M; Badejo, Joke A; Popoola, Segun I; Omole, David O; Odukoya, Jonathan A; Ajayi, Priscilla O; Aboyade, Mary; Atayero, Aderemi A
2018-06-01
This data article presents data of academic performances of undergraduate students in Science, Technology, Engineering and Mathematics (STEM) disciplines in Covenant University, Nigeria. The data shows academic performances of Male and Female students who graduated from 2010 to 2014. The total population of samples in the observation is 3046 undergraduates mined from Biochemistry (BCH), Building technology (BLD), Computer Engineering (CEN), Chemical Engineering (CHE), Industrial Chemistry (CHM), Computer Science (CIS), Civil Engineering (CVE), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mathematics (MAT), Microbiology (MCB), Mechanical Engineering (MCE), Management and Information System (MIS), Petroleum Engineering (PET), Industrial Physics-Electronics and IT Applications (PHYE), Industrial Physics-Applied Geophysics (PHYG) and Industrial Physics-Renewable Energy (PHYR). The detailed dataset is made available in form of a Microsoft Excel spreadsheet in the supplementary material of this article.
Final Report Extreme Computing and U.S. Competitiveness DOE Award. DE-FG02-11ER26087/DE-SC0008764
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mustain, Christopher J.
The Council has acted on each of the grant deliverables during the funding period. The deliverables are: (1) convening the Council’s High Performance Computing Advisory Committee (HPCAC) on a bi-annual basis; (2) broadening public awareness of high performance computing (HPC) and exascale developments; (3) assessing the industrial applications of extreme computing; and (4) establishing a policy and business case for an exascale economy.
Project JOVE. [microgravity experiments and applications
NASA Technical Reports Server (NTRS)
Lyell, M. J.
1994-01-01
The goal of this project is to investigate new areas of research pertaining to free surface-interface fluids mechanics and/or microgravity which have potential commercial applications. This paper presents an introduction to ferrohydrodynamics (FHD), and discusses some applications. Also, computational methods for solving free surface flow problems are presented in detail. Both have diverse applications in industry and in microgravity fluids applications. Three different modeling schemes for FHD flows are addressed and the governing equations, including Maxwell's equations, are introduced. In the area of computational modeling of free surface flows, both Eulerian and Lagrangian schemes are discussed. The state of the art in computational methods applied to free surface flows is elucidated. In particular, adaptive grids and re-zoning methods are discussed. Additional research results are addressed and copies of the publications produced under the JOVE Project are included.
There is a need to properly develop the application of Computational Fluid Dynamics (CFD) methods in support of air quality studies involving pollution sources near buildings at industrial sites. CFD models are emerging as a promising technology for such assessments, in part due ...
MapReduce Based Parallel Neural Networks in Enabling Large Scale Machine Learning
Yang, Jie; Huang, Yuan; Xu, Lixiong; Li, Siguang; Qi, Man
2015-01-01
Artificial neural networks (ANNs) have been widely used in pattern recognition and classification applications. However, ANNs are notably slow in computation especially when the size of data is large. Nowadays, big data has received a momentum from both industry and academia. To fulfill the potentials of ANNs for big data applications, the computation process must be speeded up. For this purpose, this paper parallelizes neural networks based on MapReduce, which has become a major computing model to facilitate data intensive applications. Three data intensive scenarios are considered in the parallelization process in terms of the volume of classification data, the size of the training data, and the number of neurons in the neural network. The performance of the parallelized neural networks is evaluated in an experimental MapReduce computer cluster from the aspects of accuracy in classification and efficiency in computation. PMID:26681933
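The core data-parallel idea can be sketched compactly: each map task computes a gradient on its data shard, and the reduce step averages the per-shard gradients. The sketch below uses a single-layer network (logistic regression) and emulates MapReduce with a plain Python loop on synthetic data; it illustrates the general pattern only, not the paper's three scenarios or its Hadoop cluster implementation.

    import numpy as np

    def map_gradient(w, X_shard, y_shard):
        """Per-shard gradient of the logistic loss (one 'map' task)."""
        p = 1.0 / (1.0 + np.exp(-X_shard @ w))
        return X_shard.T @ (p - y_shard) / len(y_shard)

    def reduce_gradients(grads):
        """Average the per-shard gradients (the 'reduce' step)."""
        return np.mean(grads, axis=0)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1200, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)     # synthetic labels
    shards = np.array_split(np.arange(len(y)), 4)       # 4 "mapper" shards

    w = np.zeros(5)
    for _ in range(200):                                # gradient-descent iterations
        grads = [map_gradient(w, X[idx], y[idx]) for idx in shards]
        w -= 0.5 * reduce_gradients(grads)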
High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations
NASA Technical Reports Server (NTRS)
Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.
2003-01-01
Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.
ERIC Educational Resources Information Center
Technology & Learning, 2005
2005-01-01
In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…
Analysis on the University’s Network Security Level System in the Big Data Era
NASA Astrophysics Data System (ADS)
Li, Tianli
2017-12-01
The rapid development of science and technology and the continuous expansion of computer network applications have gradually improved social productive forces and have had a positive impact on the production efficiency and industrial scale of China's different industries. Looking at the actual application of computer networks in the era of big data, we can see influencing factors such as network viruses, hackers and other attack modes that threaten network security and pose a potential threat to the safe use of computer networks in colleges and universities. In view of this unfavorable situation, universities need to analyze the big-data environment and, combined with the requirements of secure network use, build a reliable cyberspace security system at the equipment, system, data and other levels, so as to avoid the security risks existing in the network. On this basis, this paper analyzes the hierarchical system of cyberspace security in universities in the era of big data.
Application of computer virtual simulation technology in 3D animation production
NASA Astrophysics Data System (ADS)
Mo, Can
2017-11-01
With the continuous development of computer technology, the application systems of virtual simulation technology have been further optimized and improved, and the technology has been widely used in various fields of social development, such as city construction, interior design, industrial simulation and tourism teaching. This paper mainly introduces the virtual simulation technology used in 3D animation. Based on an analysis of the characteristics of virtual simulation technology, the ways and means of applying this technology in 3D animation are researched. The purpose is to provide a reference for improving 3D effects in future work.
Input Scanners: A Growing Impact In A Diverse Marketplace
NASA Astrophysics Data System (ADS)
Marks, Kevin E.
1989-08-01
Just as newly invented photographic processes revolutionized the printing industry at the turn of the century, electronic imaging has affected almost every computer application today. To completely emulate traditionally mechanical means of information handling, computer-based systems must be able to capture graphic images. Thus, there is a widespread need for the electronic camera, the digitizer, the input scanner. This paper will review how various types of input scanners are being used in many diverse applications. The following topics will be covered:
- Historical overview of input scanners
- New applications for scanners
- Impact of scanning technology on select markets
- Scanning systems issues
A Resource Service Model in the Industrial IoT System Based on Transparent Computing.
Li, Weimin; Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang
2018-03-26
The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in the underground metal mine, this paper summarizes the main challenges of industrial IoT in terms of heterogeneity in devices and resources, security, reliability, and deployment and maintenance costs. Then, a novel resource service model for industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources including the operating system (OS), programs, and data on the server side for the IoT devices, thus offering an effective, reliable, secure, and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: a sensing layer, an aggregation layer, a network layer, a service and storage layer, and an interface and management layer. We also present a detailed analysis of the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experimental prototype system.
Modular chassis simplifies packaging and interconnecting of circuit boards
NASA Technical Reports Server (NTRS)
Arens, W. E.; Boline, K. G.
1964-01-01
A system of modular chassis structures has simplified the design for mounting a number of printed circuit boards. This design is structurally adaptable to computer and industrial control system applications.
Shaded-Color Picture Generation of Computer-Defined Arbitrary Shapes
NASA Technical Reports Server (NTRS)
Cozzolongo, J. V.; Hermstad, D. L.; Mccoy, D. S.; Clark, J.
1986-01-01
SHADE computer program generates realistic color-shaded pictures from computer-defined arbitrary shapes. Objects defined for computer representation are displayed as smooth, color-shaded surfaces, including varying degrees of transparency. Results are also used for presentation of computational results. By performing color mapping, SHADE colors the model surface to display analysis results such as pressures, stresses, and temperatures. NASA has used SHADE extensively in design and analysis of high-performance aircraft. Industry should find applications for SHADE in computer-aided design and computer-aided manufacturing. SHADE written in VAX FORTRAN and MACRO Assembler for either interactive or batch execution.
Evaluation of Digital Technology and Software Use among Business Education Teachers
ERIC Educational Resources Information Center
Ellis, Richard S.; Okpala, Comfort O.
2004-01-01
Digital video cameras are part of the evolution of multimedia digital products that have positive applications for educators, students, and industry. Multimedia digital video can be utilized by any personal computer and it allows the user to control, combine, and manipulate different types of media, such as text, sound, video, computer graphics,…
ERIC Educational Resources Information Center
Klein, David C.
2014-01-01
As advancements in automation continue to alter the systemic behavior of computer systems in a wide variety of industrial applications, human-machine interactions are increasingly becoming supervisory in nature, with less hands-on human involvement. This maturing of the human role within the human-computer relationship is relegating operations…
Architectures for single-chip image computing
NASA Astrophysics Data System (ADS)
Gove, Robert J.
1992-04-01
This paper will focus on the architectures of VLSI programmable processing components for image computing applications. TI, the maker of industry-leading RISC, DSP, and graphics components, has developed an architecture for a new generation of image processors capable of implementing a plurality of image, graphics, video, and audio computing functions. We will show that the use of a single-chip heterogeneous MIMD parallel architecture best suits this class of processors--those which will dominate the desktop multimedia, document imaging, computer graphics, and visualization systems of this decade.
The application of computer image analysis in life sciences and environmental engineering
NASA Astrophysics Data System (ADS)
Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.
2014-04-01
The main aim of this article is to present research on the application of computer image analysis in life sciences and environmental engineering. The authors used different methods of computer image analysis in the development of an innovative biotest for modern biomonitoring of water quality. The created tools were based on live organisms, the bioindicators Lemna minor L. and Hydra vulgaris Pallas, together with computer image analysis methods for assessing negative reactions during exposure of the organisms to selected water toxicants. All of these methods belong to acute toxicity tests and are particularly essential in the ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but are also applicable in environmental engineering and agriculture in the study of adverse effects on water quality of various compounds used in agriculture and industry.
Recent development of radiation measurement instrument for industrial and medical applications
NASA Astrophysics Data System (ADS)
Baba, Sueki; Ohmori, Koichi; Mito, Yoshio; Tanoue, Toshiya; Yano, Shigeki; Tokumori, Kenji; Toyofuku, Fukai; Kanda, Shigenobu
2001-02-01
Recently, computer imaging technology has achieved very high image quality and fast processing times. X-rays have been used for many purposes such as medical diagnosis and analyzing the structure of industrial materials. However, as X-rays are hazardous to the human body, it is desirable to reduce the exposed dose to a minimum. For this purpose, it is necessary to use a semiconductor radiation detector with high efficiency for X-rays. We have developed a photon-counting CdTe array detector system for medical and industrial use. A bone densitometer based on Dual Energy X-ray Absorptiometry (DEXA) has been developed for the diagnosis of osteoporosis, and the system has also been developed for elemental analysis of materials in industrial use. Recently, we have developed a monochromatic X-ray CT using a 256-channel CdTe array detector. We found that the array detector systems are very useful for medical and industrial applications.
Application of desktop computers in nuclear engineering education
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graves, H.W. Jr.
1990-01-01
Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment - increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy Analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem being solved, and yet be flexible enough to accommodate most problem solution options.
NASA Technical Reports Server (NTRS)
Anyiwo, Joshua C.
2000-01-01
Vixen is a collection of enabling technologies for uninhibited distributed object computing. In the Spring of 1995 when Vixen was proposed, it was an innovative idea very much ahead of its time. But today the technologies proposed in Vixen have become standard technologies for Enterprise Computing. Sun Microsystems J2EE/EJB specifications, among others, are independently proposed technologies of the Vixen type. I have brought Vixen completely under the J2EE standard in order to maximize interoperability and compatibility with other computing industry efforts. Vixen and the Enterprise JavaBean (EJB) Server technologies are now practically identical; OIL, another Vixen technology, and the Java Messaging System (JMS) are practically identical; and so on. There is no longer anything novel or patentable in the Vixen work performed under this grant. The above discussion notwithstanding, my independent development of Vixen has significantly helped me, my university, my students, and the local community. The undergraduate students who worked with me in developing Vixen have enhanced their expertise in what has become the cutting-edge technology of their industry and are therefore well positioned for lucrative employment opportunities in the industry. My academic department has gained a new course, "Multi-media System Development", which provides highly desirable expertise to our students for employment in any enterprise today. The many Outreach Programs that I conducted during this grant period have exposed local Middle School students to the contributions that NASA is making in our society as well as awakened desires in many such students for careers in Science and Technology. I have applied Vixen to the development of two software packages: (a) JAS, Joshua Application Server, which allows a user to configure an EJB Server to serve a J2EE compliant application over the World Wide Web; (b) PCM, Professor Course Manager, a J2EE compliant application for configuring a course for distance learning. These types of applications are, however, generally available in the industry today.
Gray, John
2017-01-01
Machine-to-machine (M2M) communication is a key enabling technology for industrial internet of things (IIoT)-empowered industrial networks, where machines communicate with one another for collaborative automation and intelligent optimisation. This new industrial computing paradigm features high-quality connectivity, ubiquitous messaging, and interoperable interactions between machines. However, manufacturing IIoT applications have specificities that distinguish them from many other internet of things (IoT) scenarios in machine communications. By highlighting the key requirements and the major technical gaps of M2M in industrial applications, this article describes a collaboration-oriented M2M (CoM2M) messaging mechanism focusing on flexible connectivity and discovery, ubiquitous messaging, and semantic interoperability that are well suited for the production line-scale interoperability of manufacturing applications. The designs toward machine collaboration and data interoperability at both the communication and semantic level are presented. Then, the application scenarios of the presented methods are illustrated with a proof-of-concept implementation in the PicknPack food packaging line. Eventually, the advantages and some potential issues are discussed based on the PicknPack practice. PMID:29165347
Practical advantages of evolutionary computation
NASA Astrophysics Data System (ADS)
Fogel, David B.
1997-10-01
Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
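A hedged, minimal sketch of the kind of self-adapting search the abstract refers to: a (1+1) evolution strategy whose mutation step size widens after successes and shrinks after failures (a rough 1/5-success rule). The objective function and constants are illustrative assumptions, not taken from the paper.

```python
import random

def evolve(objective, x0, sigma=1.0, iters=500):
    """Minimal (1+1) evolution strategy with 1/5-success-rule step-size adaptation."""
    x, fx = list(x0), objective(x0)
    for _ in range(iters):
        child = [xi + random.gauss(0.0, sigma) for xi in x]
        fc = objective(child)
        if fc <= fx:                      # accept the improvement
            x, fx = child, fc
            sigma *= 1.22                 # widen the search after a success
        else:
            sigma *= 0.82                 # narrow the search after a failure
    return x, fx

# Illustrative objective: minimize a shifted sphere function.
best, value = evolve(lambda v: sum((vi - 3.0) ** 2 for vi in v), [0.0, 0.0, 0.0])
print(best, value)
```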
New solutions and applications of 3D computer tomography image processing
NASA Astrophysics Data System (ADS)
Effenberger, Ira; Kroll, Julia; Verl, Alexander
2008-02-01
As industry nowadays aims at fast, high-quality product development and manufacturing processes, modern and efficient quality inspection is essential. Compared to conventional measurement technologies, industrial computer tomography (CT) is a non-destructive technology for 3D image data acquisition which helps to overcome their disadvantages by offering the possibility of scanning complex parts with all their outer and inner geometric features. In this paper, new and optimized methods for 3D image processing are presented, including innovative ways of surface reconstruction and automatic geometric feature detection of complex components, in particular our work on smart online data processing and data handling methods with integrated intelligent online mesh reduction. This guarantees the processing of huge, high-resolution data sets. In addition, new approaches for surface reconstruction and segmentation based on statistical methods are demonstrated. Automated and precise algorithms for geometric inspection are deployed on the extracted 3D point cloud or surface triangulation. All algorithms are applied to different real data sets generated by computer tomography in order to demonstrate the capabilities of the new tools. Since CT is an emerging technology for non-destructive testing and inspection, more and more industrial application fields will use and profit from it.
Lewis Structures Technology, 1988. Volume 2: Structural Mechanics
NASA Technical Reports Server (NTRS)
1988-01-01
Lewis Structures Div. performs and disseminates results of research conducted in support of aerospace engine structures. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The engineering community was familiarized with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics codes, computational methods for dynamics, structural optimization, applications of structural dynamics, and structural mechanics computer codes.
Applications of laser ablation to microengineering
NASA Astrophysics Data System (ADS)
Gower, Malcolm C.; Rizvi, Nadeem H.
2000-08-01
Applications of pulsed laser ablation to the manufacture of micro- electro-mechanical systems (MEMS) and micro-opto-electro-mechanical systems (MOEMS) devices are presented. Laser ablative processes used to manufacture a variety of microsystems technology (MST) components in the computer peripheral, sensing and biomedical industries are described together with a view of some future developments.
Semantic computing and language knowledge bases
NASA Astrophysics Data System (ADS)
Wang, Lei; Wang, Houfeng; Yu, Shiwen
2017-09-01
With the proposal of the next-generation Web, the Semantic Web, semantic computing has been drawing more and more attention in both academia and industry. A lot of research has been conducted on the theory and methodology of the subject, and potential applications have also been investigated and proposed in many fields. The progress of semantic computing made so far cannot be detached from its supporting pivot: language resources, for instance language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current state of affairs in the construction of language knowledge bases and the related research and applications that have been carried out on the basis of these resources, via a case study in the Institute of Computational Linguistics at Peking University.
Ab initio calculations for industrial materials engineering: successes and challenges.
Wimmer, Erich; Najafabadi, Reza; Young, George A; Ballard, Jake D; Angeliu, Thomas M; Vollmer, James; Chambers, James J; Niimi, Hiroaki; Shaw, Judy B; Freeman, Clive; Christensen, Mikael; Wolf, Walter; Saxe, Paul
2010-09-29
Computational materials science based on ab initio calculations has become an important partner to experiment. This is demonstrated here for the effect of impurities and alloying elements on the strength of a Zr twist grain boundary, the dissociative adsorption and diffusion of iodine on a zirconium surface, the diffusion of oxygen atoms in a Ni twist grain boundary and in bulk Ni, and the dependence of the work function of a TiN-HfO(2) junction on the replacement of N by O atoms. In all of these cases, computations provide atomic-scale understanding as well as quantitative materials property data of value to industrial research and development. There are two key challenges in applying ab initio calculations, namely a higher accuracy in the electronic energy and the efficient exploration of large parts of the configurational space. While progress in these areas is fueled by advances in computer hardware, innovative theoretical concepts combined with systematic large-scale computations will be needed to realize the full potential of ab initio calculations for industrial applications.
Adaptation of a Control Center Development Environment for Industrial Process Control
NASA Technical Reports Server (NTRS)
Killough, Ronnie L.; Malik, James M.
1994-01-01
In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.
Mendikute, Alberto; Yagüe-Fabra, José A; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai
2017-09-09
Photogrammetry methods are being used more and more as a 3D technique for large scale metrology applications in industry. Optical targets are placed on an object and images are taken around it, where measuring traceability is provided by precise off-process pre-calibrated digital cameras and scale bars. According to the 2D target image coordinates, target 3D coordinates and camera views are jointly computed. One of the applications of photogrammetry is the measurement of raw part surfaces prior to its machining. For this application, post-process bundle adjustment has usually been adopted for computing the 3D scene. With that approach, a high computation time is observed, leading in practice to time consuming and user dependent iterative review and re-processing procedures until an adequate set of images is taken, limiting its potential for fast, easy-to-use, and precise measurements. In this paper, a new efficient procedure is presented for solving the bundle adjustment problem in portable photogrammetry. In-process bundle computing capability is demonstrated on a consumer grade desktop PC, enabling quasi real time 2D image and 3D scene computing. Additionally, a method for the self-calibration of camera and lens distortion has been integrated into the in-process approach due to its potential for highest precision when using low cost non-specialized digital cameras. Measurement traceability is set only by scale bars available in the measuring scene, avoiding the uncertainty contribution of off-process camera calibration procedures or the use of special purpose calibration artifacts. The developed self-calibrated in-process photogrammetry has been evaluated both in a pilot case scenario and in industrial scenarios for raw part measurement, showing a total in-process computing time typically below 1 s per image up to a maximum of 2 s during the last stages of the computed industrial scenes, along with a relative precision of 1/10,000 (e.g. 0.1 mm error in 1 m) with an error RMS below 0.2 pixels at image plane, ranging at the same performance reported for portable photogrammetry with precise off-process pre-calibrated cameras.
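As a rough illustration of the quantity a bundle adjustment minimizes, the sketch below computes reprojection residuals for one camera under an ideal pinhole model with no lens distortion; the self-calibration and in-process solver described in the abstract are far more elaborate. Function names and the camera model are assumptions for illustration only.

```python
import numpy as np

def reprojection_residuals(points_3d, observed_2d, R, t, focal):
    """Residuals a bundle adjustment would minimize: observed minus projected image coordinates.

    points_3d: (N, 3) optical-target coordinates, observed_2d: (N, 2) measured image points,
    R, t: camera rotation (3, 3) and translation (3,), focal: focal length in pixels.
    Ideal pinhole camera with the principal point at the image origin and no distortion.
    """
    cam = points_3d @ R.T + t                 # world frame -> camera frame
    proj = focal * cam[:, :2] / cam[:, 2:3]   # perspective projection
    return (observed_2d - proj).ravel()       # stacked residual vector

# In a full pipeline, residuals over all cameras and targets (plus self-calibration
# parameters) would be handed to a sparse nonlinear least-squares solver.
```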
Discovery, Molecular Mechanisms, and Industrial Applications of Cold-Active Enzymes
Santiago, Margarita; Ramírez-Sarmiento, César A.; Zamora, Ricardo A.; Parra, Loreto P.
2016-01-01
Cold-active enzymes constitute an attractive resource for biotechnological applications. Their high catalytic activity at temperatures below 25°C makes them excellent biocatalysts that eliminate the need of heating processes hampering the quality, sustainability, and cost-effectiveness of industrial production. Here we provide a review of the isolation and characterization of novel cold-active enzymes from microorganisms inhabiting different environments, including a revision of the latest techniques that have been used for accomplishing these paramount tasks. We address the progress made in the overexpression and purification of cold-adapted enzymes, the evolutionary and molecular basis of their high activity at low temperatures and the experimental and computational techniques used for their identification, along with protein engineering endeavors based on these observations to improve some of the properties of cold-adapted enzymes to better suit specific applications. We finally focus on examples of the evaluation of their potential use as biocatalysts under conditions that reproduce the challenges imposed by the use of solvents and additives in industrial processes and of the successful use of cold-adapted enzymes in biotechnological and industrial applications. PMID:27667987
NASA Astrophysics Data System (ADS)
Fourment, Lionel; Ducloux, Richard; Marie, Stéphane; Ejday, Mohsen; Monnereau, Dominique; Massé, Thomas; Montmitonnet, Pierre
2010-06-01
The use of numerical simulation in material processing allows a trial-and-error strategy to improve virtual processes without incurring material costs or interrupting production, and therefore saves a lot of money, but it requires user time to analyze the results, adjust the operating conditions, and restart the simulation. Automatic optimization is the perfect complement to simulation. An evolutionary algorithm coupled with metamodelling makes it possible to obtain industrially relevant results on a very large range of applications within a few tens of simulations and without any specific knowledge of automatic optimization techniques. Ten industrial partners have been selected to cover the different areas of the mechanical forging industry and to provide different examples for the forming simulation tools, demonstrating this claim. The large computational time is handled by a metamodel approach, which interpolates the objective function over the entire parameter space by knowing the exact function values only at a reduced number of "master points". Two algorithms are used: an evolution strategy combined with a Kriging metamodel, and a genetic algorithm combined with a meshless finite difference method. The latter approach is extended to multi-objective optimization, where the set of solutions corresponding to the best possible compromises between the different objectives is computed in the same way. The population-based approach exploits the parallel capabilities of the available computer with high efficiency. An optimization module, fully embedded within the Forge2009 IHM, makes it possible to cover all the defined examples, and the use of new multi-core hardware to compute several simulations at the same time reduces the required time dramatically. The presented examples demonstrate the method's versatility; they include billet shape optimization of a common rail, the cogging of a bar, and a wire drawing problem.
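A hedged sketch of the metamodel idea described above: evaluate the expensive simulation only at a few "master points", fit an interpolating surrogate over the parameter space, rank many candidates cheaply on the surrogate, and run only the most promising candidate exactly. An RBF interpolant stands in for the Kriging metamodel, and the objective, bounds, and budget are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_simulation(x):
    """Stand-in for a forging simulation objective (illustrative only)."""
    return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x[0]))

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(12, 2))                 # "master points" run exactly
y = np.array([expensive_simulation(x) for x in X])

for _ in range(5):                                  # a few surrogate-guided refinements
    surrogate = RBFInterpolator(X, y)               # metamodel over the parameter space
    candidates = rng.uniform(0, 1, size=(500, 2))   # cheap to rank on the surrogate
    best = candidates[np.argmin(surrogate(candidates))]
    X = np.vstack([X, best])                        # run only the best candidate exactly
    y = np.append(y, expensive_simulation(best))

print("best design:", X[np.argmin(y)], "objective:", y.min())
```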
Automotive applications of supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ginsberg, M.
1987-01-01
These proceedings compile papers on supercomputers in the automobile industry. Titles include: An automotive engineer's guide to the effective use of scalar, vector, and parallel computers; fluid mechanics, finite elements, and supercomputers; and Automotive crashworthiness performance on a supercomputer.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four projects in the probabilistic methods area and two in the RMSL area have been identified:
(1) Evaluation of Probabilistic Technology - Progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to industry and government: case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for which problems.
(2) Probabilistic Input Preparation - Progress has been made in identifying problem cases such as those with no data, little data, and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to industry and government: too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true; there are ways to do credible probabilistic analysis with little data.
(3) Probabilistic Reliability - A literature search on probabilistic reliability has been completed, along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to industry and government: correct reliability computations at both the component and system level are needed so that an item can be designed based on its expected usage and life span.
(4) Real World Applications of Probabilistic Methods (PM) - A draft of Volume 1, comprising aerospace applications, has been released. Volume 2, a compilation of real-world applications of probabilistic methods with essential information demonstrating application type and time/cost savings from the use of probabilistic methods for generic applications, is in progress. Relevance to industry and government: too often we say, "the proof is in the pudding"; with help from many contributors, we hope to produce such a document. The problem is that not many people are coming forward, due to the proprietary nature of the work, so we ask for only minimum information, including the problem description, what method was used, whether it resulted in any savings, and how much.
(5) Software Reliability - Software reliability concepts, programs, implementation, guidelines, and standards are being documented. Relevance to industry and government: software reliability is a complex issue that must be understood and addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability.
(6) Maintainability Standards - Maintainability/serviceability industry standards and guidelines and the industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to industry and government: any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance and life.
Teaching Web Application Development: A Case Study in a Computer Science Course
ERIC Educational Resources Information Center
Del Fabro, Marcos Didonet; de Alimeda, Eduardo Cunha; Sluzarski, Fabiano
2012-01-01
Teaching web development in Computer Science undergraduate courses is a difficult task. Often, there is a gap between the students' experiences and the reality in the industry. As a consequence, the students are not always well-prepared once they get the degree. This gap is due to several reasons, such as the complexity of the assignments, the…
Consistent data-driven computational mechanics
NASA Astrophysics Data System (ADS)
González, D.; Chinesta, F.; Cueto, E.
2018-05-01
We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of Industry 4.0.
Error Mitigation of Point-to-Point Communication for Fault-Tolerant Computing
NASA Technical Reports Server (NTRS)
Akamine, Robert L.; Hodson, Robert F.; LaMeres, Brock J.; Ray, Robert E.
2011-01-01
Fault tolerant systems require the ability to detect and recover from physical damage caused by the hardware's environment, faulty connectors, and system degradation over time. This ability applies to military, space, and industrial computing applications. The integrity of Point-to-Point (P2P) communication, between two microcontrollers for example, is an essential part of fault tolerant computing systems. In this paper, different methods of fault detection and recovery are presented and analyzed.
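One common detection-and-recovery pattern for a point-to-point link, sketched below under stated assumptions (the paper's own schemes may differ): each payload is framed with a length and a CRC32, and the sender retransmits until an acknowledgement arrives or a retry budget is exhausted. The link_send/link_recv callables are hypothetical stand-ins for the physical interface.

```python
import struct
import zlib

def frame(payload: bytes) -> bytes:
    """Prepend a length field and append a CRC32 so the receiver can detect corruption."""
    return struct.pack(">H", len(payload)) + payload + struct.pack(">I", zlib.crc32(payload))

def unframe(frame_bytes: bytes):
    """Return the payload if the CRC matches, else None (caller requests retransmission)."""
    (length,) = struct.unpack(">H", frame_bytes[:2])
    payload = frame_bytes[2:2 + length]
    (crc,) = struct.unpack(">I", frame_bytes[2 + length:6 + length])
    return payload if zlib.crc32(payload) == crc else None

def send_with_retry(link_send, link_recv, payload: bytes, retries: int = 3) -> bool:
    """Stop-and-wait recovery: resend until an ACK arrives or the retry budget runs out.

    link_send/link_recv are caller-supplied hooks for the physical P2P interface.
    """
    for _ in range(retries):
        link_send(frame(payload))
        if link_recv(timeout=0.1) == b"ACK":
            return True
    return False
```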
The Role of Networks in Cloud Computing
NASA Astrophysics Data System (ADS)
Lin, Geng; Devine, Mac
The confluence of technology advancements and business developments in Broadband Internet, Web services, computing systems, and application software over the past decade has created a perfect storm for cloud computing. The "cloud model" of delivering and consuming IT functions as services is poised to fundamentally transform the IT industry and rebalance the inter-relationships among end users, enterprise IT, software companies, and the service providers in the IT ecosystem (Armbrust et al., 2009; Lin, Fu, Zhu, & Dasmalchi, 2009).
Transonic CFD applications at Boeing
NASA Technical Reports Server (NTRS)
Tinoco, E. N.
1989-01-01
The use of computational methods for three dimensional transonic flow design and analysis at the Boeing Company is presented. A range of computational tools is described, consisting of production tools for everyday use by project engineers, expert-user tools for special applications by computational researchers, and an emerging tool which may see considerable use in the near future. These methods include full potential and Euler solvers, some coupled to three dimensional boundary layer analysis methods, for transonic flow analysis about nacelle, wing-body, wing-body-strut-nacelle, and complete aircraft configurations. As the examples presented show, such a toolbox of codes is necessary for the variety of applications typical of an industrial environment. Such a toolbox of codes makes possible aerodynamic advances not previously achievable in a timely manner, if at all.
Bioinformatics and Microarray Data Analysis on the Cloud.
Calabrese, Barbara; Cannataro, Mario
2016-01-01
High-throughput platforms such as microarray, mass spectrometry, and next-generation sequencing are producing an increasing volume of omics data that needs large data storage and computing power. Cloud computing offers massively scalable computing and storage, data sharing, and on-demand anytime and anywhere access to resources and applications, and thus it may represent the key technology for facing those issues. In fact, in recent years it has been adopted for the deployment of different bioinformatics solutions and services both in academia and in industry. Despite this, cloud computing presents several issues regarding the security and privacy of data, which are particularly important when analyzing patient data, such as in personalized medicine. This chapter reviews the main academic and industrial cloud-based bioinformatics solutions, with a special focus on microarray data analysis solutions, and underlines the main issues and problems related to the use of such platforms for the storage and analysis of patient data.
Computer image generation: Reconfigurability as a strategy in high fidelity space applications
NASA Technical Reports Server (NTRS)
Bartholomew, Michael J.
1989-01-01
The demand for realistic, high-fidelity computer image generation systems to support space simulation is well established. However, as the number and diversity of space applications increase, the complexity and cost of computer image generation systems also increase. One strategy used to harmonize cost with varied requirements is the establishment of a reconfigurable image generation system that can be adapted rapidly and easily to meet new and changing requirements. The reconfigurability strategy through the life cycle of system conception, specification, design, implementation, operation, and support for high-fidelity computer image generation systems is discussed. The discussion is limited to those issues directly associated with reconfigurability and adaptability of a specialized scene generation system in a multi-faceted space applications environment. Examples and insights gained through the recent development and installation of the Improved Multi-function Scene Generation System at the Johnson Space Center Systems Engineering Simulator are reviewed and compared with current simulator industry practices. The results are clear; the strategy of reconfigurability applied to space simulation requirements provides a viable path to supporting diverse applications with an adaptable computer image generation system.
Park, Hyun June; Park, Kyungmoon; Kim, Yong Hwan; Yoo, Young Je
2014-12-20
Candida antarctica lipase B (CalB) is one of the most useful enzymes for various reactions and bioconversions. Enhancing the thermostability of CalB is required for industrial applications. In this study, we propose a computational design strategy to improve the thermostability of CalB. Molecular dynamics simulations at various temperatures were used to investigate the common fluctuation sites in CalB, which are considered to be thermally weak points. The RosettaDesign algorithm was used to design the selected residues. The redesigned CalB was simulated to verify both the enhancement of intramolecular interactions and the lowering of the overall root-mean-square deviation (RMSD) values. The A251E mutant designed using this strategy showed a 2.5-fold higher thermostability than the wild-type CalB. This strategy could be applied to other industrially relevant enzymes.
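A hedged numpy sketch of the fluctuation-screening step implied by the abstract: compute per-residue RMSF from aligned trajectory coordinates and keep the residues that rank among the most mobile at every simulated temperature. The array layout, the top-fraction threshold, and the function names are illustrative assumptions; the published workflow relied on full MD simulations and RosettaDesign.

```python
import numpy as np

def per_residue_rmsf(coords):
    """coords: (frames, residues, 3) aligned C-alpha coordinates from one MD trajectory.

    Returns the root-mean-square fluctuation of each residue about its mean position.
    """
    mean_pos = coords.mean(axis=0)
    return np.sqrt(((coords - mean_pos) ** 2).sum(axis=-1).mean(axis=0))

def common_flexible_sites(trajectories, top_fraction=0.1):
    """Residues ranked in the most mobile fraction in every simulated temperature."""
    hot_sets = []
    for traj in trajectories:
        rmsf = per_residue_rmsf(traj)
        cutoff = np.quantile(rmsf, 1.0 - top_fraction)
        hot_sets.append(set(np.where(rmsf >= cutoff)[0]))
    return sorted(set.intersection(*hot_sets))
```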
A precise goniometer/tensiometer using a low cost single-board computer
NASA Astrophysics Data System (ADS)
Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.
2017-12-01
Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest in small, cheap, but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successful testing with various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared to the commercial solutions.
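ADSA itself fits the full Young-Laplace profile; as a much simpler hedged illustration of extracting a contact angle from an imaged drop edge, the sketch below fits a circle to edge points (valid only for small drops where gravity is negligible) and evaluates the tangent angle at the baseline. It assumes profile coordinates with y increasing upward from the solid surface and is not the DropToolKit algorithm.

```python
import numpy as np

def contact_angle_circle_fit(x, y, baseline_y=0.0):
    """Fit a circle to drop-edge points (x, y) and return the contact angle in degrees.

    Algebraic least-squares (Kasa) circle fit; assumes a spherical-cap drop resting
    on the line y = baseline_y, with y increasing upward from the solid surface.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy, c = sol
    r = np.sqrt(c + cx**2 + cy**2)
    # cos(theta) = (baseline height - centre height) / radius for a spherical cap:
    # centre below the baseline gives theta < 90 deg, centre above gives theta > 90 deg.
    return float(np.degrees(np.arccos(np.clip((baseline_y - cy) / r, -1.0, 1.0))))
```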
ERIC Educational Resources Information Center
Wise, Stuart; Greenwood, Janinka; Davis, Niki
2011-01-01
The music industry in the 21st century uses digital technology in a wide range of applications including performance, composition and in recording and publishing. Much of this digital technology is freely available via downloads from the internet, as part of software included with computers when they are purchased and via applications that are…
Data Mining and Knowledge Discover - IBM Cognitive Alternatives for NASA KSC
NASA Technical Reports Server (NTRS)
Velez, Victor Hugo
2016-01-01
Cognitive computing tools capable of transforming industries have been found favorable and profitable for different directorates at NASA KSC. This study shows how cognitive computing systems can be useful for NASA when computers are trained, in the same way humans are, to gain knowledge over time. Increasing knowledge through senses, learning, and the accumulation of events is how the applications created by IBM empower the artificial intelligence in a cognitive computing system. Over the last decades, NASA has explored and applied the artificial intelligence approach, specifically cognitive computing, in a few projects adopting models similar to those proposed for IBM Watson. However, the use of semantic technologies by IBM's dedicated business unit allows these cognitive computing applications to outperform in-house tools and provide outstanding analysis that facilitates decision making for managers and leads in a management information system.
JPRS report: Science and Technology. Europe and Latin America
NASA Astrophysics Data System (ADS)
1988-01-01
Articles from the popular and trade press are included on the following subjects: advanced materials, aerospace industry, automotive industry, biotechnology, computers, factory automation and robotics, microelectronics, and science and technology policy. The aerospace articles discuss briefly and in a nontechnical way the SAGEM bubble memories for space applications, Ariane V new testing facilities, innovative technologies of TDF-1 satellite, and the restructuring of the Aviation Division at France's Aerospatiale.
Laboratory and software applications for clinical trials: the global laboratory environment.
Briscoe, Chad
2011-11-01
The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.
Computer vision for general purpose visual inspection: a fuzzy logic approach
NASA Astrophysics Data System (ADS)
Chen, Y. H.
In automatic visual industrial inspection, computer vision systems have been widely used. Such systems are often application specific, and therefore require domain knowledge in order to have a successful implementation. Since visual inspection can be viewed as a decision making process, it is argued that the integration of fuzzy logic analysis and computer vision systems provides a practical approach to general purpose visual inspection applications. This paper describes the development of an integrated fuzzy-rule-based automatic visual inspection system. Domain knowledge about a particular application is represented as a set of fuzzy rules. From the status of predefined fuzzy variables, the set of fuzzy rules is defuzzified to give the inspection results. A practical application, the inspection of IC marks (often in the form of English characters and a company logo), is demonstrated and shows a more consistent result compared to a conventional thresholding method.
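A hedged miniature of the fuzzy-rule approach the abstract describes: fuzzify a single measured feature (here a mark-contrast score), apply two rules, and defuzzify by a weighted centroid into an accept/reject score. The membership shapes, thresholds, and rule outputs are illustrative assumptions rather than the paper's rule base.

```python
def ramp_down(x, lo, hi):
    """Membership that is 1 below lo, 0 above hi, linear in between."""
    return max(0.0, min(1.0, (hi - x) / (hi - lo)))

def ramp_up(x, lo, hi):
    """Membership that is 0 below lo, 1 above hi, linear in between."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def inspect(contrast):
    """Two-rule fuzzy verdict on an IC-mark contrast score in [0, 1]."""
    low = ramp_down(contrast, 0.2, 0.5)    # degree to which "contrast is low"
    high = ramp_up(contrast, 0.3, 0.8)     # degree to which "contrast is high"
    # Rule 1: IF contrast is low  THEN reject (output centred at 0.2)
    # Rule 2: IF contrast is high THEN accept (output centred at 0.8)
    weights = [(0.2, low), (0.8, high)]
    den = sum(w for _, w in weights)
    return sum(c * w for c, w in weights) / den if den else 0.5   # centroid defuzzification

print(inspect(0.85))   # near 0.8 -> mark judged acceptable
```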
Fundamental Concepts of Digital Image Processing
DOE R&D Accomplishments Database
Twogood, R. E.
1983-03-01
The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in the volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.
Design considerations for computationally constrained two-way real-time video communication
NASA Astrophysics Data System (ADS)
Bivolarski, Lazar M.; Saunders, Steven E.; Ralston, John D.
2009-08-01
Today's video codecs have evolved primarily to meet the requirements of the motion picture and broadcast industries, where high-complexity studio encoding can be utilized to create highly-compressed master copies that are then broadcast one-way for playback using less-expensive, lower-complexity consumer devices for decoding and playback. Related standards activities have largely ignored the computational complexity and bandwidth constraints of wireless or Internet based real-time video communications using devices such as cell phones or webcams. Telecommunications industry efforts to develop and standardize video codecs for applications such as video telephony and video conferencing have not yielded image size, quality, and frame-rate performance that match today's consumer expectations and market requirements for Internet and mobile video services. This paper reviews the constraints and the corresponding video codec requirements imposed by real-time, 2-way mobile video applications. Several promising elements of a new mobile video codec architecture are identified, and more comprehensive computational complexity metrics and video quality metrics are proposed in order to support the design, testing, and standardization of these new mobile video codecs.
A Survey of CAD/CAM Technology Applications in the U.S. Shipbuilding Industry
1984-01-01
...operation for drafting. Computer Aided Engineering (CAE) analysis is used primarily to determine the validity of design characteristics and production... Applications include time standard generation, sea trial analysis, and group... Computer Aided Design (CAD) is the technology... and the most interfaced category, with links... While no systems surveyed are truly integrated, many are interfaced; systems integration is the largest problem involving software packages.
Zhu, Zhen; Puliga, Michelangelo; Cerina, Federica; Chessa, Alessandro; Riccaboni, Massimo
2015-01-01
The fragmentation of production across countries has become an important feature of globalization in recent decades and is often conceptualized by the term “global value chains” (GVCs). When empirically investigating the GVCs, previous studies are mainly interested in knowing how global the GVCs are rather than what the GVCs look like. From a complex networks perspective, we use the World Input-Output Database (WIOD) to study the evolution of the global production system. We find that the industry-level GVCs are indeed not chain-like but are better characterized by the tree topology. Hence, we compute the global value trees (GVTs) for all the industries available in the WIOD. Moreover, we compute an industry importance measure based on the GVTs and compare it with other network centrality measures. Finally, we discuss some future applications of the GVTs. PMID:25978067
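One simple way to read a tree out of an input-output table, sketched below in the spirit of the abstract: attach each industry to the downstream industry that absorbs the largest share of its output, then keep the branch rooted at a chosen industry. The toy matrix is invented, and the paper's exact GVT construction may differ.

```python
import numpy as np

def value_tree(io_matrix, root):
    """Attach each industry to its largest downstream buyer, then keep the branch rooted at `root`.

    io_matrix[i, j] = value flowing from supplying industry i to buying industry j.
    Returns {child: parent} links whose chain of parents reaches the root industry.
    """
    n = io_matrix.shape[0]
    parent = {}
    for i in range(n):
        buyers = io_matrix[i].copy()
        buyers[i] = 0.0                      # ignore self-supply
        parent[i] = int(np.argmax(buyers))   # dominant downstream industry
    tree = {}
    for i in range(n):
        node, seen = i, set()
        while node != root and node not in seen:   # guard against cycles
            seen.add(node)
            node = parent[node]
        if node == root and i != root:
            tree[i] = parent[i]
    return tree

toy = np.array([[0, 5, 1], [2, 0, 7], [1, 1, 0]], dtype=float)
print(value_tree(toy, root=2))   # e.g. {0: 1, 1: 2}
```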
I3Mote: An Open Development Platform for the Intelligent Industrial Internet
Martinez, Borja; Vilajosana, Xavier; Kim, Il Han; Zhou, Jianwei; Tuset-Peiró, Pere; Xhafa, Ariton; Poissonnier, Dominique; Lu, Xiaolin
2017-01-01
In this article we present the Intelligent Industrial Internet (I3) Mote, an open hardware platform targeting industrial connectivity and sensing deployments. The I3Mote features the most advanced low-power components to tackle sensing, on-board computing, and wireless/wired connectivity for demanding industrial applications. The platform has been designed to fill the gap in the industrial prototyping and early deployment market with a compact form factor and a low-cost, robust industrial design. I3Mote is an advanced and compact prototyping system integrating the components required to be deployed as a product, reducing the need for adopting industries to build their own tailored solutions. This article describes the platform design, firmware, and software ecosystem and characterizes its performance in terms of energy consumption. PMID:28452945
Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.
1989-01-01
The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Solid-liquid phase coexistence of alkali nitrates from molecular dynamics simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jayaraman, Saivenkataraman
2010-03-01
Alkali nitrate eutectic mixtures are finding application as industrial heat transfer fluids in concentrated solar power generation systems. An important property for such applications is the melting point, or phase coexistence temperature. We have computed melting points for lithium, sodium, and potassium nitrate from molecular dynamics simulations using a recently developed method, which uses thermodynamic integration to compute the free energy difference between the solid and liquid phases. The computed melting point for NaNO3 was within 15 K of its experimental value, while for LiNO3 and KNO3 the computed melting points were within 100 K of the experimental values [4]. We are currently extending the approach to calculate melting temperatures for binary mixtures of lithium and sodium nitrate.
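A hedged sketch of the final step such a calculation implies: once the solid-liquid free energy difference has been obtained at several temperatures (by thermodynamic integration in the paper), the coexistence temperature is where that difference crosses zero. The sample values below are invented for illustration and are not the paper's results.

```python
import numpy as np

def coexistence_temperature(temps, delta_g):
    """Linearly interpolate the temperature where G_solid - G_liquid changes sign."""
    temps, delta_g = np.asarray(temps, float), np.asarray(delta_g, float)
    sign_change = np.where(np.diff(np.sign(delta_g)) != 0)[0]
    if sign_change.size == 0:
        raise ValueError("no solid-liquid crossing in the sampled temperature range")
    i = sign_change[0]
    t0, t1, g0, g1 = temps[i], temps[i + 1], delta_g[i], delta_g[i + 1]
    return t0 - g0 * (t1 - t0) / (g1 - g0)

# Invented free-energy differences (kJ/mol) bracketing a melting point near 580 K.
print(coexistence_temperature([540, 560, 580, 600, 620],
                              [-0.8, -0.4, -0.05, 0.35, 0.7]))
```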
A Strategy for Improved System Assurance
2007-06-20
Quality-related standards (measurements, life cycle, safety, security, and others) referenced include:
- ISO/IEC 12207, Software Life Cycle Processes
- ISO 9001, Quality Management Systems
- ISO/IEC 14598, Software Product Evaluation
- ISO/IEC 90003, Guidelines for the Application of ISO 9001:2000 to Computer Software
- IEEE 12207, Industry Implementation of International Standard ISO/IEC 12207
- IEEE 1220, Standard for Application and Management of the Systems Engineering Process
NASA Technical Reports Server (NTRS)
Kemeny, Sabrina E.
1994-01-01
Electronic and optoelectronic hardware implementations of highly parallel computing architectures address several ill-defined and/or computation-intensive problems not easily solved by conventional computing techniques. The concurrent processing architectures developed are derived from a variety of advanced computing paradigms including neural network models, fuzzy logic, and cellular automata. Hardware implementation technologies range from state-of-the-art digital/analog custom-VLSI to advanced optoelectronic devices such as computer-generated holograms and e-beam fabricated Dammann gratings. JPL's concurrent processing devices group has developed a broad technology base in hardware implementable parallel algorithms, low-power and high-speed VLSI designs and building block VLSI chips, leading to application-specific high-performance embeddable processors. Application areas include high-throughput map-data classification using feedforward neural networks, a terrain-based tactical movement planner using cellular automata, resource optimization (weapon-target assignment) using a multidimensional feedback network with lateral inhibition, and classification of rocks using an inner-product scheme on thematic mapper data. In addition to addressing specific functional needs of DOD and NASA, the JPL-developed concurrent processing device technology is also being customized for a variety of commercial applications (in collaboration with industrial partners), and is being transferred to U.S. industries. This viewgraph presentation focuses on two application-specific processors which solve the computation-intensive tasks of resource allocation (weapon-target assignment) and terrain-based tactical movement planning using two extremely different topologies. Resource allocation is implemented as an asynchronous analog competitive assignment architecture inspired by the Hopfield network. Hardware realization leads to a two to four order of magnitude speed-up over conventional techniques and enables multiple assignments (many to many) not achievable with standard statistical approaches. Tactical movement planning (finding the best path from A to B) is accomplished with a digital two-dimensional concurrent processor array. By exploiting the natural parallel decomposition of the problem in silicon, a four order of magnitude speed-up over optimized software approaches has been demonstrated.
ERIC Educational Resources Information Center
Rogers, David F., Ed.; Smith, P. R., Ed.
1984-01-01
Ten papers focus on applications in specific curriculum areas, modelling and simulation, and computer managed learning. Projects described include voice support for the visually handicapped, distance education, and industrial training, as well as teaching applied mathematics, several facets of engineering, zoology, and, with videodisc, observation…
2010-03-01
Approved for public release; distribution unlimited (Ref. AFRL/RXQ Public Affairs Case #10-100). The document contains color images. An aqueous fire-fighting agent is modeled in conjunction with the standard Eulerian multiphase flow model. The two-equation k-ε turbulence model was selected due to its wide industrial application; it solves transport equations for the turbulent kinetic energy (k) and its dissipation rate (ε). Because of their heuristic development, RANS models have limitations on applicability and in general must be…
NASA Astrophysics Data System (ADS)
Jimenez, Edward S.; Goodman, Eric L.; Park, Ryeojin; Orr, Laurel J.; Thompson, Kyle R.
2014-09-01
This paper will investigate energy efficiency for various real-world industrial computed-tomography reconstruction algorithms, covering both CPU- and GPU-based implementations. This work shows that the energy required for a given reconstruction depends on performance and problem size. There are many ways to describe performance and energy efficiency, so this work investigates multiple metrics, including performance-per-watt, energy-delay product, and energy consumption. This work found that irregular GPU-based approaches realized tremendous savings in energy consumption when compared to CPU implementations, while also significantly improving the performance-per-watt and energy-delay product metrics. Additional energy savings and metric improvements were realized in the GPU-based reconstructions by improving storage I/O through a parallel MIMD-like modularization of the compute and I/O tasks.
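For readers unfamiliar with the metrics named above, the following hypothetical helper (run times, power draws, and problem size are invented) shows how energy consumption, performance-per-watt, and the energy-delay product are typically derived from a measured run.

```python
def energy_metrics(runtime_s, avg_power_w, work_units):
    """Return the three efficiency metrics discussed above for one run.

    runtime_s   -- wall-clock reconstruction time in seconds
    avg_power_w -- average power draw of the device in watts
    work_units  -- problem size, e.g. voxels or projections processed
    """
    energy_j = avg_power_w * runtime_s            # energy consumption (joules)
    perf_per_watt = (work_units / runtime_s) / avg_power_w
    energy_delay = energy_j * runtime_s           # energy-delay product (J*s)
    return energy_j, perf_per_watt, energy_delay

# Illustrative comparison of a CPU run vs. a GPU run of the same problem size.
print(energy_metrics(runtime_s=3600, avg_power_w=250, work_units=1e9))  # CPU
print(energy_metrics(runtime_s=400,  avg_power_w=300, work_units=1e9))  # GPU
```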
Product definition data interface
NASA Technical Reports Server (NTRS)
Birchfield, B.; Downey, P.
1984-01-01
The development and application of advanced Computer Aided Design/Computer Aided Manufacturing (CAD/CAM) technology in the aerospace industry is discussed. New CAD/CAM capabilities provide the engineer and production worker with tools to produce better products and significantly improve productivity. This technology is expanding into all phases of engineering and manufacturing, with large potential for improvements in productivity. The systematic integration of CAD and CAM to ensure maximum utility throughout the U.S. aerospace industry, its large community of supporting suppliers, and the Department of Defense aircraft overhaul and repair facilities is outlined. The need for a framework for the exchange of digital product definition data, which serves the function of the conventional engineering drawing, is emphasized.
Applications of structural optimization methods to fixed-wing aircraft and spacecraft in the 1980s
NASA Technical Reports Server (NTRS)
Miura, Hirokazu; Neill, Douglas J.
1992-01-01
This report is the summary of a technical survey on the applications of structural optimization in the U.S. aerospace industry through the 1980s. Since applications to rotary wing aircraft will be covered by other literature, applications to fixed-wing aircraft and spacecraft were considered. It became clear that very significant progress has been made during this decade, indicating this technology is about to become one of the practical tools in computer aided structural design.
NASA Technical Reports Server (NTRS)
Aaronson, A. C.; Buelow, K.; David, F. C.; Packard, R. L.; Ravet, F. W. (Principal Investigator)
1979-01-01
The latest satellite and computer processing and analysis technologies were tested and evaluated in terms of their application feasibility. Technologies evaluated include those developed, tested, and evaluated by the LACIE, as well as candidate technologies developed by the research community and private industry. The implementation of the applications test system and the technology transfer experience between the LACIE and the applications test system are discussed, highlighting the approach, the achievements, and the shortcomings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyonnais, Marc; Smith, Matt; Mace, Kate P.
SCinet is the purpose-built network that operates during the International Conference for High Performance Computing, Networking, Storage and Analysis (Supercomputing, or SC). Created each year for the conference, SCinet brings to life a high-capacity network that supports applications and experiments that are a hallmark of the SC conference. The network links the convention center to research and commercial networks around the world. This resource serves as a platform for exhibitors to demonstrate the advanced computing resources of their home institutions and elsewhere by supporting a wide variety of applications. Volunteers from academia, government and industry work together to design and deliver the SCinet infrastructure. Industry vendors and carriers donate millions of dollars in equipment and services needed to build and support the local and wide area networks. Planning begins more than a year in advance of each SC conference and culminates in a high-intensity installation in the days leading up to the conference. The SCinet architecture for SC16 illustrates a dramatic increase in participation from the vendor community, particularly those that focus on network equipment. Software-Defined Networking (SDN) and Data Center Networking (DCN) are present in nearly all aspects of the design.
Computer integrated manufacturing/processing in the HPI. [Hydrocarbon Processing Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoshimura, J.S.
1993-05-01
Hydrocarbon Processing and Systemhouse Inc. developed a comprehensive survey on the status of computer integrated manufacturing/processing (CIM/CIP) targeted specifically to the unique requirements of the hydrocarbon processing industry. Surveys of this kind, and other benchmarking techniques, can be invaluable in helping companies maximize the business benefits of technology investments. The survey was organized into five major areas: CIM/CIP planning, management perspective, functional applications, integration, and technology infrastructure and trends. The CIM/CIP planning area dealt with the use and type of planning methods used to plan, justify, and implement information technology projects. The management perspective section addressed management priorities, expenditure levels, and implementation barriers. The functional application area covered virtually all functional areas of the organization and focused on the specific solutions and benefits in each. The integration section addressed the needs and integration status of the organization's functional areas. Finally, the technology infrastructure and trends section dealt with specific technologies in use as well as trends over the next three years. In February 1993, summary areas from preliminary results were presented at the 2nd International Conference on Productivity and Quality in the Hydrocarbon Processing Industry.
Navier-Stokes computations useful in aircraft design
NASA Technical Reports Server (NTRS)
Holst, Terry L.
1990-01-01
Large scale Navier-Stokes computations about aircraft components as well as reasonably complete aircraft configurations are presented and discussed. Speed and memory requirements are described for various general problem classes, which in some cases are already being used in the industrial design environment. Recent computed results, with experimental comparisons when available, are included to highlight the presentation. Finally, prospects for the future are described and recommendations for areas of concentrated research are indicated. The future of Navier-Stokes computations is seen to be rapidly expanding across a broad front of applications, which includes the entire subsonic-to-hypersonic speed regime.
Parallel computations and control of adaptive structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Alvin, Kenneth F.; Belvin, W. Keith; Chong, K. P. (Editor); Liu, S. C. (Editor); Li, J. C. (Editor)
1991-01-01
The equations of motion for structures with adaptive elements for vibration control are presented for parallel computation, to be used as a software package for real-time control of flexible space structures. A brief introduction to the state of the art in parallel computational capability is also presented. Time-marching strategies are developed for effective use of massively parallel mapping, partitioning, and the necessary arithmetic operations. An example is offered for the simulation of control-structure interaction on a parallel computer, and the impact of the approach for applications in disciplines beyond the aerospace industry is assessed.
Applications of airborne ultrasound in human-computer interaction.
Dahl, Tobias; Ealo, Joao L; Bang, Hans J; Holm, Sverre; Khuri-Yakub, Pierre
2014-09-01
Airborne ultrasound is a rapidly developing subfield within human-computer interaction (HCI). Touchless ultrasonic interfaces and pen tracking systems are part of recent trends in HCI and are gaining industry momentum. This paper aims to provide the background and overview necessary to understand the capabilities of ultrasound and its potential future in human-computer interaction. The latest developments on the ultrasound transducer side are presented, focusing on capacitive micro-machined ultrasonic transducers, or CMUTs. Their introduction is an important step toward providing real, low-cost multi-sensor array and beam-forming options. We also provide a unified mathematical framework for understanding and analyzing algorithms used for ultrasound detection and tracking for some of the most relevant applications. Copyright © 2014. Published by Elsevier B.V.
NASA Technical Reports Server (NTRS)
1988-01-01
The charter of the Structures Division is to perform and disseminate the results of research conducted in support of aerospace engine structures. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The specific purpose of the symposium was to familiarize the engineering structures community with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics codes, computational methods for dynamics, structural optimization, applications of structural dynamics, and structural mechanics computer codes.
Initiatives in the Education and Training of Young People.
ERIC Educational Resources Information Center
Lister, Alan, Ed.
1985-01-01
Eight articles on educational technology's application to youth education and training describe United Kingdom's Junior Army leadership skills training; educational technology within Youth Training Scheme (YTS); YTS hotel and catering industry initiatives; Coventry's computer based learning project; cross-cultural courseware transfer; mathematics…
Multiscale Mechanical Characterization of Biomimetic Gels for Army Applications
2006-11-01
…organs is critical for the design of protective equipment used in the automotive, body armor, and sports industries, among others. Improved… displacement curve. The total displaced volume is V = πr²h. (9) The relations in this section are used to compute…
CFD simulation of airflow inside tree canopies discharged from air-assisted sprayers
USDA-ARS?s Scientific Manuscript database
Effective pesticide application is not only essential for specialty crop industries but also very important for addressing increasing concerns about environmental contamination caused by pesticide spray drift. Numerical analysis using computational fluid dynamics (CFD) can contribute to better under...
AEROELASTIC SIMULATION TOOL FOR INFLATABLE BALLUTE AEROCAPTURE
NASA Technical Reports Server (NTRS)
Liever, P. A.; Sheta, E. F.; Habchi, S. D.
2006-01-01
A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.
Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes
NASA Technical Reports Server (NTRS)
Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)
2000-01-01
The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs built with compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the techniques used in the implementation of the tool and discuss its application to the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the tool's great potential for quickly producing parallel programs that achieve good performance, exceeding that of some commercial tools.
Technical Risk Prevention in the Workplace
NASA Astrophysics Data System (ADS)
Ricaud, Myriam
Nanotechnology has become a major economic and technological issue today. Indeed, nanometric dimensions give matter novel physical, chemical, and biological properties with a host of applications. Nanotechnology is thus having an increasing impact on new and emerging industries, such as computing, electronics, aerospace, and alternative energy supplies, but also on traditional forms of industry such as the automobile, aeronautics, food, pharmaceutical, and cosmetics sectors. In this way, nanotechnology has led to both gradual and radical innovation in many areas of industry: biochips, drug delivery, self-cleaning and antipollution concretes, antibacterial clothing, antiscratch paints, and the list continues [1, 2, 3].
Electronic Nose and Electronic Tongue
NASA Astrophysics Data System (ADS)
Bhattacharyya, Nabarun; Bandhopadhyay, Rajib
Human beings have five senses, namely vision, hearing, touch, smell and taste. Sensors for vision, hearing and touch have been under development for many years. The need for sensors capable of mimicking the senses of smell and taste has been felt only recently, in the food industry, environmental monitoring and several industrial applications. In the ever-widening horizon of frontier research in the field of electronics and advanced computing, the emergence of the electronic nose (E-Nose) and electronic tongue (E-Tongue) has been drawing the attention of scientists and technologists for more than a decade. By intelligent integration of a multitude of technologies such as chemometrics, microelectronics and advanced soft computing, human olfaction has been successfully mimicked by such new techniques, called machine olfaction (Pearce et al. 2002). But the very essence of such research and development efforts has centered on the development of customized electronic nose and electronic tongue solutions specific to individual applications. In fact, research trends to date clearly point to the fact that a machine olfaction system as versatile, universal and broadband as the human nose and tongue may not be feasible in the decades to come. But application-specific solutions may certainly be demonstrated and commercialized by modulating sensor design and fine-tuning the soft computing solutions. This chapter deals with the theory and development of E-Nose and E-Tongue technology and their applications. A succinct account of future trends in R&D efforts in this field, with the objective of establishing a correlation between machine olfaction and human perception, is also included.
P.C. disposal decisions: a banking industry case study
NASA Astrophysics Data System (ADS)
Shah, Sejal P.; Sarkis, Joseph
2002-02-01
The service industry and the manufacturing industry are interlinked in a supply chain. Part of the effectiveness of manufacturing-industry environmental performance based on remanufacturing and recycling depends on service-industry decisions. In the information technology arena, personal computers (PCs) are the hard equipment of the service industry. The end-of-life decisions made by the service industry, in this case the banking industry, have implications for the number of systems entering the waste or reverse logistics stream for manufacturers. This paper investigates some of the issues related to decision making on end-of-life disposition of PCs and presents a model for evaluating such decisions. The analytic hierarchy process (AHP) is applied in this setting. The development of the model, its application, and the results provide the basis for much of the discussion in this paper.
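For orientation, the core AHP computation is a priority vector obtained from a pairwise-comparison matrix. The sketch below is a generic illustration with invented criteria and judgments; it is not the comparison data from the banking case study.

```python
import numpy as np

# Hypothetical pairwise comparisons of three disposal criteria, e.g.
# cost, environmental impact, data security (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 0.5],
    [1/3, 1.0, 0.25],
    [2.0, 4.0, 1.0],
])

# The principal eigenvector of the comparison matrix gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()
print(np.round(weights, 3))   # relative priority of each criterion
```

Alternative weights (for example the end-of-life options themselves) are scored against each criterion in the same way and aggregated with these weights.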
Physics through the 1990s: Scientific interfaces and technological applications
NASA Technical Reports Server (NTRS)
1986-01-01
The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.
ERIC Educational Resources Information Center
Bintas, Jale; Barut, Asim
2008-01-01
The aim of the research is to compare differences between tenth-grade students and determine their levels of success with classic and web-based educational applications of a Turbo Pascal lesson. This research was applied to the 10 A and 10 TLB students of the Izmir Karsikaya Anatolian Technical and Industrial High School computer department in the second term of…
Linear programming computational experience with ONYX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atrek, E.
1994-12-31
ONYX is a linear programming software package based on an efficient variation of the gradient projection method. When fully configured, it is intended for application to industrial size problems. While the computational experience is limited at the time of this abstract, the technique is found to be robust and competitive with existing methodology in terms of both accuracy and speed. An overview of the approach is presented together with a description of program capabilities, followed by a discussion of up-to-date computational experience with the program. Conclusions include advantages of the approach and envisioned future developments.
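ONYX itself is not described in detail here, but the flavour of a gradient projection step for a linear program in equality form (minimize c^T x subject to Ax = b, x >= 0) can be sketched as follows; this is a textbook-style illustration under those assumptions, not the ONYX algorithm.

```python
import numpy as np

def projected_gradient_step(A, c, x, step=0.1):
    """One gradient projection step for min c^T x s.t. Ax = b, x >= 0.

    The negative gradient -c is projected onto the null space of A so the
    equality constraints stay satisfied; the step is then clipped so that
    no variable becomes negative.
    """
    # Projection onto the null space of A: P = I - A^T (A A^T)^{-1} A
    P = np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)
    d = -P @ c                                   # feasible descent direction
    neg = d < 0                                  # variables that would decrease
    if neg.any():
        step = min(step, np.min(-x[neg] / d[neg]))   # stay non-negative
    return x + step * d

# Tiny illustrative problem: minimize x0 + 2*x1 subject to x0 + x1 = 1, x >= 0.
A = np.array([[1.0, 1.0]])
c = np.array([1.0, 2.0])
x = np.array([0.5, 0.5])                         # feasible starting point
print(projected_gradient_step(A, c, x))          # moves toward the vertex (1, 0)
```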
[INVITED] Computational intelligence for smart laser materials processing
NASA Astrophysics Data System (ADS)
Casalino, Giuseppe
2018-03-01
Computational intelligence (CI) involves using a computer algorithm to capture hidden knowledge from data and to use it to train an 'intelligent machine' to make complex decisions without human intervention. As simulation becomes more prevalent, from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state of the art in the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on the methods that have been proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligence techniques employed in manufacturing, four sections, covering laser joining, machining, surface treatment, and additive manufacturing, survey the most recent applications in the already extensive literature on CI in LMP. Finally, emerging trends and future challenges are identified and discussed.
Applied Operations Research: Augmented Reality in an Industrial Environment
NASA Technical Reports Server (NTRS)
Cole, Stuart K.
2015-01-01
Augmented reality (AR) is the overlay of computer-generated data or graphics onto a real-world view. Its use provides the operator with additional information or heightened situational awareness. While advancements have been made in the automation and diagnostics of high-value critical equipment (HVCE) to improve readiness, reliability and maintenance, the need to assist and support Operations and Maintenance staff persists. AR can improve the human-machine interface, letting computer capabilities augment human experience and analysis capabilities. NASA operates multiple facilities with complex ground-based HVCE in support of national aerodynamics and space exploration, and the need exists to improve operational support and close a capability-sustainment gap that arises as key, experienced staff rotate work assignments and reach the end of their terms of service. Initiating an AR capability to augment and improve human abilities and the training experience in the industrial environment requires planning and the establishment of a goal and objectives for the systems and specific applications. This paper explored the use of AR in support of Operations staff in the real-time operation and maintenance of HVCE. The results include the identification of a specific goal and objectives, and of challenges related to availability and computer system infrastructure.
Lewis Structures Technology, 1988. Volume 1: Structural Dynamics
NASA Technical Reports Server (NTRS)
1988-01-01
The specific purpose of the symposium was to familiarize the engineering structures community with the depth and range of research performed by the Structures Division of the Lewis Research Center and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive testing, dynamical systems, fatigue and damage, wind turbines, hot section technology, structural mechanics codes, computational methods for dynamics, structural optimization, and applications of structural dynamics.
Post-Fisherian Experimentation: From Physical to Virtual
Jeff Wu, C. F.
2014-04-24
Fisher's pioneering work in design of experiments has inspired further work with broader applications, especially in industrial experimentation. Three topics in physical experiments are discussed: the principles of effect hierarchy, sparsity, and heredity for factorial designs; a new method called CME for de-aliasing aliased effects; and robust parameter design. The recent emergence of virtual experiments on a computer is reviewed. Here, some major challenges in computer experiments, which must go beyond Fisherian principles, are outlined.
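As a reminder of what "effects" means in the factorial-design setting above, the toy example below estimates main effects and an interaction from a 2^2 experiment with ±1 coded factors; the response values are invented.

```python
import numpy as np

# 2^2 full factorial in coded units: columns are factors A and B.
design = np.array([
    [-1, -1],
    [ 1, -1],
    [-1,  1],
    [ 1,  1],
])
y = np.array([20.0, 32.0, 23.0, 41.0])   # invented responses

# Main effect of a factor: mean response at +1 minus mean response at -1.
for name, col in zip("AB", design.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"main effect of {name}: {effect:+.1f}")

# Interaction AB: contrast of the elementwise product column.
ab = design[:, 0] * design[:, 1]
print(f"interaction AB: {y[ab == 1].mean() - y[ab == -1].mean():+.1f}")
```

Effect sparsity and hierarchy are the working assumptions that only a few such effects, mostly lower-order ones, are large enough to matter.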
Structural Analysis Made 'NESSUSary'
NASA Technical Reports Server (NTRS)
2005-01-01
Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers during that time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for the current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.
Job Prospects for Petroleum Engineers.
ERIC Educational Resources Information Center
Basta, Nicholas
1988-01-01
Describes petroleum engineering as one area in industry where job opportunities are few but where the worst of the declines has been seen. Discusses the causes of the decline. Lists several areas where petroleum engineers have found alternatives including environmental projects, water supply projects, and computer applications. (CW)
Information Systems Curriculum.
ERIC Educational Resources Information Center
O'Neil, Sharon Lund
This guide outlines an information systems curriculum that has been developed for postsecondary institutions in Texas. The curriculum, which is intended to help students acquire the competencies necessary to function in automated offices in business and industry, includes the following core courses: computer business applications I and II,…
Optics for Processes, Products and Metrology
NASA Astrophysics Data System (ADS)
Mather, George
1999-04-01
Optical physics has a variety of applications in industry, including process inspection, coatings development, vision instrumentation, spectroscopy, and many others. Optics has been used extensively in the design of solar energy collection systems and coatings, for example. Also, with the availability of good CCD cameras and fast computers, it has become possible to develop real-time inspection and metrology devices that can accommodate the high throughputs encountered in modern production processes. More recently, developments in moiré interferometry show great promise for applications in the basic metals and electronics industries. The talk will illustrate applications of optics by discussing process inspection techniques for defect detection, part dimensioning, birefringence measurement, and the analysis of optical coatings in the automotive, glass, and optical disc industries. In particular, examples of optical techniques for the quality control of CD-R, MO, and CD-RW discs will be presented. In addition, the application of optical concepts to solar energy collector design and to metrology by moiré techniques will be discussed. Finally, some of the modern techniques and instruments used for qualitative and quantitative material analysis will be presented.
High Performance Computing at NASA
NASA Technical Reports Server (NTRS)
Bailey, David H.; Cooper, D. M. (Technical Monitor)
1994-01-01
The speaker will give an overview of high performance computing in the U.S. in general and within NASA in particular, including a description of the recently signed NASA-IBM cooperative agreement. The latest performance figures of various parallel systems on the NAS Parallel Benchmarks will be presented. The speaker was one of the authors of the NAS (Numerical Aerodynamic Simulation) Parallel Benchmarks, which are now widely cited in the industry as a measure of sustained performance on realistic high-end scientific applications. It will be shown that significant progress has been made by the highly parallel supercomputer industry during the past year or so, with several new systems, based on high-performance RISC processors, that now deliver superior performance per dollar compared to conventional supercomputers. Various pitfalls in reporting performance will be discussed. The speaker will then conclude by assessing the general state of the high performance computing field.
Molecular simulation of separation of CO{sub 2} from flue gases in Cu-BTC metal-organic framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Q.Y.; Xue, C.Y.; Zhong, C.L.
2007-11-15
In this work, a computational study was performed on the adsorption separation of CO{sub 2} from flue gases (mixtures of CO{sub 2}/N{sub 2}/O{sub 2}) in Cu-BTC metal-organic framework (MOF) to investigate the applicability of MOFs to this important industrial system. The computational results showed that Cu-BTC is a promising material for separation of CO{sub 2} from flue gases, and the macroscopic separation behaviors of the MOF were elucidated at a molecular level to give insight into the underlying mechanisms. The present work not only provided useful information for understanding the separation characteristics of MOFs, but also showed their potential applications in the chemical industry.
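A common figure of merit in such adsorption-separation studies, though not necessarily the exact quantity reported in this work, is the adsorption selectivity of CO2 over N2, defined from the adsorbed-phase mole fractions x and the bulk gas-phase mole fractions y:

```latex
S_{\mathrm{CO_2}/\mathrm{N_2}}
  = \frac{x_{\mathrm{CO_2}} / x_{\mathrm{N_2}}}
         {y_{\mathrm{CO_2}} / y_{\mathrm{N_2}}}
```

Values of S well above unity indicate that the framework preferentially adsorbs CO2 from the mixture.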
National meeting to review IPAD status and goals. [Integrated Programs for Aerospace-vehicle Design
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1980-01-01
A joint NASA/industry project called Integrated Programs for Aerospace-vehicle Design (IPAD) is described, which has the goal of raising aerospace-industry productivity through the application of computers to integrate company-wide management of engineering data. Basically a general-purpose interactive computing system developed to support engineering design processes, the IPAD design is composed of three major software components: the executive, data management, and geometry and graphics software. Results of IPAD activities include a comprehensive description of a future representative aerospace vehicle design process and its interface to manufacturing, and requirements and preliminary design of a future IPAD software system to integrate engineering activities of an aerospace company having several products under simultaneous development.
Zhao, Linping; Patel, Pravin K; Cohen, Mimis
2012-07-01
Computer-aided design and manufacturing (CAD/CAM) technology today is the standard in the manufacturing industry. The application of CAD/CAM technology, together with the emerging virtual surgical planning (VSP) technology based on 3D medical images, to craniomaxillofacial reconstruction has been gaining increasing attention from reconstructive surgeons. This article illustrates the components, system, and clinical management of VSP and CAD/CAM technology, including data acquisition, virtual surgical and treatment planning, individual implant design and fabrication, and outcome assessment. It focuses primarily on the technical aspects of the VSP and CAD/CAM system to improve the predictability of the planning and outcome.
Adaptive Fuzzy Systems in Computational Intelligence
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.
1996-01-01
In recent years, interest in computational intelligence techniques, which currently include neural networks, fuzzy systems, and evolutionary programming, has grown significantly, and a number of their applications have been developed in government and industry. In the future, an essential element in these systems will be fuzzy systems that can learn from experience by using neural networks to refine their performance. The GARIC architecture, introduced earlier, is an example of a fuzzy reinforcement learning system which has been applied in several control domains such as cart-pole balancing, simulation of Space Shuttle orbital operations, and tether control. A number of examples from GARIC's applications in these domains will be demonstrated.
Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT.
Lavassani, Mehrzad; Forsström, Stefan; Jennehag, Ulf; Zhang, Tingting
2018-05-12
Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications.
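The central idea, transmitting model parameters instead of raw samples, can be illustrated with a hypothetical sensor-side loop that maintains running statistics (Welford's online algorithm) and only sends the summary at a long interval; the model, parameter names and reporting interval are assumptions for illustration, not the authors' protocol.

```python
import random

class OnlineStats:
    """Incrementally track mean and variance of a sensor stream (Welford)."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def parameters(self):
        var = self.m2 / (self.n - 1) if self.n > 1 else 0.0
        return {"n": self.n, "mean": self.mean, "var": var}

stats, REPORT_EVERY = OnlineStats(), 100          # hypothetical reporting interval
for t in range(1, 1001):
    stats.update(20.0 + random.gauss(0, 0.5))     # simulated temperature reading
    if t % REPORT_EVERY == 0:
        # One small packet per 100 samples; the fog node can regenerate a
        # statistically similar stream from these parameters.
        print("send to fog node:", stats.parameters())
```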
Computers for Manned Space Applications Based on Commercial Off-the-Shelf Components
NASA Astrophysics Data System (ADS)
Vogel, T.; Gronowski, M.
2009-05-01
Similar to the consumer markets, there has been an ever-increasing demand in processing power, signal processing capabilities and memory space for computers used for science data processing in space. An important driver of this development has been the payload developers for the International Space Station, requesting high-speed data acquisition and fast control loops in increasingly complex systems. Current experiments even perform video processing and compression with their payload controllers. Nowadays the requirements for a space-qualified computer are often far beyond the capabilities of, for example, the classic SPARC architecture found in ERC32 or LEON CPUs. An increase in performance usually demands costly and power-consuming application-specific solutions. Continuous developments over the last few years have now led to an alternative approach that is based on complete electronics modules manufactured for commercial and industrial customers. Computer modules used in industrial environments with a high demand for reliability under harsh environmental conditions, such as chemical reactors, electrical power plants or manufacturing lines, are entered into a selection procedure. Promising candidates then undergo a detailed characterisation process developed by Astrium Space Transportation. After thorough analysis and some modifications, these modules can replace fully qualified custom-built electronics in specific, although not safety-critical, applications in manned space. This paper focuses on the benefits of COTS (commercial off-the-shelf) based electronics modules and the necessary analyses and modifications for their utilisation in manned space applications on the ISS. Some considerations regarding overall systems architecture are also included. Furthermore, this paper pinpoints issues that render such modules unsuitable for specific tasks, and justifies the reasons. Finally, the conclusion of this paper advocates the implementation of COTS-based electronics for a range of applications within specifically adapted systems. The findings in this paper are extrapolated from two reference computer systems, both launched in 2008. One was a LEON-2-based computer installed onboard the Columbus Orbital Facility, while the other consisted mainly of a commercial PowerPC module that was modified for launch mounted on the ICC pallet in the Space Shuttle's cargo bay. Both systems are currently being upgraded and extended for future applications.
Velocity precision measurements using laser Doppler anemometry
NASA Astrophysics Data System (ADS)
Dopheide, D.; Taux, G.; Narjes, L.
1985-07-01
A Laser Doppler Anemometer (LDA) was calibrated to determine its applicability to high-pressure measurements (up to 10 bar) for industrial purposes. The measurement procedure with the LDA and the computerized experimental layouts are presented. The calibration procedure is based on the absolute accuracy of the Doppler frequency and on calibration of the fringe spacing. A four-quadrant detector allows comparison of the fringe-spacing measurements with computed profiles. Further development of the LDA is recommended to increase accuracy (to 0.1% inaccuracy) and to apply the method industrially.
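The calibration rests on the usual dual-beam LDA relation between the measured Doppler frequency f_D and the velocity component perpendicular to the fringes; with laser wavelength λ and beam intersection angle θ, the fringe spacing d_f and velocity v are:

```latex
d_f = \frac{\lambda}{2\sin(\theta/2)}, \qquad
v   = d_f \, f_D = \frac{\lambda \, f_D}{2\sin(\theta/2)}
```

This is why the stated strategy of calibrating both the Doppler frequency measurement and the fringe spacing determines the overall velocity accuracy.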
Local Area Networks and the Learning Lab of the Future.
ERIC Educational Resources Information Center
Ebersole, Dennis C.
1987-01-01
Considers educational applications of local area computer networks and discusses industry standards for design established by the International Standards Organization (ISO) and Institute of Electrical and Electronic Engineers (IEEE). A futuristic view of a learning laboratory using a local area network is presented. (Author/LRW)
Computer Tomography and Hybrid Optical/Digital Methods for Aerodynamic Measurements.
1987-12-28
…cites work on industrial applications of computed tomography and NMR imaging (Optical Society of America) and on axisymmetric flame temperatures measured by holographic tomography; contributors include the Escuela de Ingenieria, Pontificia Universidad Catolica de Chile, Santiago, Chile. …The optical path length difference (OPD) between the two rays…
ERIC Educational Resources Information Center
Simkin, Mark G.
2008-01-01
Data-validation routines enable computer applications to test data to ensure their accuracy, completeness, and conformance to industry or proprietary standards. This paper presents five programming cases that require students to validate five different types of data: (1) simple user data entries, (2) UPC codes, (3) passwords, (4) ISBN numbers, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.
Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to fully embrace the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.
NASA Technical Reports Server (NTRS)
1993-01-01
Under an Army Small Business Innovation Research (SBIR) grant, Symbiotics, Inc. developed a software system that permits users to upgrade products from standalone applications so they can communicate in a distributed computing environment. Under a subsequent NASA SBIR grant, Symbiotics added additional tools to the SOCIAL product to enable NASA to coordinate conventional systems for planning Shuttle launch support operations. Using SOCIAL, data may be shared among applications in a computer network even when the applications are written in different programming languages. The product was introduced to the commercial market in 1993 and is used to monitor and control equipment for operation support and to integrate financial networks. The SBIR program was established to increase small business participation in federal R&D activities and to transfer government research to industry. InQuisiX is a reuse library providing high performance classification, cataloging, searching, browsing, retrieval and synthesis capabilities. These form the foundation for software reuse, producing higher quality software at lower cost and in less time. Software Productivity Solutions, Inc. developed the technology under Small Business Innovation Research (SBIR) projects funded by NASA and the Army and is marketing InQuisiX in conjunction with Science Applications International Corporation (SAIC). The SBIR program was established to increase small business participation in federal R&D activities and to transfer government research to industry.
The development and application of CFD technology in mechanical engineering
NASA Astrophysics Data System (ADS)
Wei, Yufeng
2017-12-01
Computational Fluid Dynamics (CFD) is the analysis of the physical phenomena involved in fluid flow and heat conduction by numerical computation and graphical display. How faithfully the numerical method can represent the complexity of the physical problem, and the precision of the numerical solution, are directly related to the speed of the computer and to hardware resources such as memory. With the continuous improvement of computer performance and CFD technology, CFD has been widely applied in water conservancy engineering, environmental engineering and industrial engineering. This paper summarizes the development of CFD, its theoretical basis and the governing equations of fluid mechanics, and introduces the various methods of numerical calculation and the related development of CFD technology. Finally, applications of CFD technology related to mechanical engineering are summarized. It is hoped that this review will help researchers in the field of mechanical engineering.
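The governing equations referred to above are, for an incompressible Newtonian fluid, the continuity and momentum (Navier-Stokes) equations; heat conduction adds an analogous advection-diffusion equation for temperature:

```latex
\nabla \cdot \mathbf{u} = 0, \qquad
\rho \left( \frac{\partial \mathbf{u}}{\partial t}
      + (\mathbf{u} \cdot \nabla)\mathbf{u} \right)
  = -\nabla p + \mu \nabla^2 \mathbf{u} + \rho \mathbf{f}
```

Here u is the velocity field, p the pressure, ρ the density, μ the dynamic viscosity, and f a body force per unit mass.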
NASA Technical Reports Server (NTRS)
Kline, S. J. (Editor); Cantwell, B. J. (Editor); Lilley, G. M.
1982-01-01
Computational techniques for simulating turbulent flows were explored, together with the results of experimental investigations. Particular attention was devoted to the possibility of defining a universal closure model, applicable for all turbulence situations; however, conclusions were drawn that zonal models, describing localized structures, were the most promising techniques to date. The taxonomy of turbulent flows was summarized, as were algebraic, differential, integral, and partial differential methods for numerical depiction of turbulent flows. Numerous comparisons of theoretically predicted and experimentally obtained data for wall pressure distributions, velocity profiles, turbulent kinetic energy profiles, Reynolds shear stress profiles, and flows around transonic airfoils were presented. Simplifying techniques for reducing the necessary computational time for modeling complex flowfields were surveyed, together with the industrial requirements and applications of computational fluid dynamics techniques.
Security Risks of Cloud Computing and Its Emergence as 5th Utility Service
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
Cloud computing is being projected by the major cloud service provider IT companies, such as IBM, Google, Yahoo, Amazon and others, as the fifth utility, where clients have access to processing for applications and software projects that need very high processing speed for compute-intensive work and huge data capacity for scientific and engineering research problems, as well as for e-business and data content network applications. These services for different types of clients are provided under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in cloud computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT industry experts and cloud clients, and highlights cloud providers' responses to cloud security risks.
NASA Astrophysics Data System (ADS)
Astley, R. J.; Sugimoto, R.; Mustafi, P.
2011-08-01
Novel techniques are presented to reduce noise from turbofan aircraft engines by optimising the acoustic treatment in engine ducts. The application of Computational Aero-Acoustics (CAA) to predict acoustic propagation and absorption in turbofan ducts is reviewed and a critical assessment of performance indicates that validated and accurate techniques are now available for realistic engine predictions. A procedure for integrating CAA methods with state of the art optimisation techniques is proposed in the remainder of the article. This is achieved by embedding advanced computational methods for noise prediction within automated and semi-automated optimisation schemes. Two different strategies are described and applied to realistic nacelle geometries and fan sources to demonstrate the feasibility of this approach for industry scale problems.
Research and Application of Autodesk Fusion360 in Industrial Design
NASA Astrophysics Data System (ADS)
Song, P. P.; Qi, Y. M.; Cai, D. C.
2018-05-01
In 2016, Autodesk introduced Fusion 360, a product integrating industrial design, structural design, mechanical simulation, and CAM into a design platform that supports collaboration and sharing both across platforms and via the cloud. In previous products, design and manufacturing used to be isolated. In the course of design, research and development, communication between designers and engineers used to take place through different software products, tool commands, and even industry terms. Moreover, difficulty also lies in the communication between design intent and machining strategies. Naturally, a difficult product design and R&D process tends to produce a noticeable gap between the design model and the actual product. A complete product development process tends to cover several major areas, such as industrial design, mechanical design, rendering and animation, computer-aided engineering (CAE), and computer-aided manufacturing (CAM). Fusion 360, by solving the technical problems of cross-platform data exchange, enables effective control of cross-regional collaboration and breaks the barriers between art and manufacturing and the blocks between design and processing. The "eco-development of the Fusion 360 industrial chain" is both a significant means and an inevitable trend for manufacturers and industrial designers to carry out innovation in China.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
[Formerly FDA-2007D-0393] Announces the availability of the document entitled "Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility," dated April 2013.
ERIC Educational Resources Information Center
Murtha, Judith Rush
The purpose of this study was to write a computer program that would not only output a color pattern weave to a cathode ray tube (CRT), but would also analyze a painted design and output a printed diagram that would show how to set up a loom in order to produce the woven design. The first of seven chapters describes the problem and the intent of…
Trends in computer hardware and software.
Frankenfeld, F M
1993-04-01
Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.
None
2018-01-24
The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing â from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with Seti@Home. Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. 
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance. 4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. 
From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, or if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and ETH Zurich. He holds a PhD in Mathematics from the University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management, and continued his professional career in the consulting industry. At KPMG and Arthur Andersen he advised international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank, where he was assigned to develop and implement credit portfolio risk and economic capital methodologies, and built up a competence center for high performance and cluster computing. Currently, Daniel Egloff heads the Financial Computing unit in the ZKB Financial Engineering division. He and his team engineer and operate high performance cluster applications for computationally intensive problems in financial risk management.
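To make the "embarrassingly parallel" point concrete, the following is a minimal sketch (not taken from the talk) of a one-factor Gaussian copula credit loss simulation in which independent scenario batches are farmed out to worker processes and their loss samples merged at the end. The portfolio size, default probabilities, correlation, and function names are illustrative assumptions only; the same split-by-batch pattern carries over to a distributed memory cluster, for example with MPI.

    import numpy as np
    from scipy.stats import norm
    from multiprocessing import Pool

    # Hypothetical portfolio: 1,000 obligors, unit exposure, 2% default probability each,
    # one-factor Gaussian copula with asset correlation 0.15. All figures are illustrative.
    N_OBLIGORS = 1000
    EXPOSURE = np.full(N_OBLIGORS, 1.0)
    PD = np.full(N_OBLIGORS, 0.02)
    RHO = 0.15
    THRESHOLD = norm.ppf(PD)  # default if the latent asset value falls below this

    def simulate_batch(args):
        """One independent batch of loss scenarios; batches share nothing but the model."""
        n_scenarios, seed = args
        rng = np.random.default_rng(seed)
        z = rng.standard_normal((n_scenarios, 1))              # systematic factor
        eps = rng.standard_normal((n_scenarios, N_OBLIGORS))   # idiosyncratic shocks
        assets = np.sqrt(RHO) * z + np.sqrt(1.0 - RHO) * eps
        defaults = assets < THRESHOLD                          # broadcast comparison
        return defaults @ EXPOSURE                             # portfolio loss per scenario

    if __name__ == "__main__":
        batches = [(10_000, seed) for seed in range(8)]        # 8 x 10,000 scenarios
        with Pool() as pool:                                   # one worker per batch slot
            losses = np.concatenate(pool.map(simulate_batch, batches))
        # Economic-capital style summary: mean loss and a high quantile of the loss distribution
        print(f"expected loss      : {losses.mean():.2f}")
        print(f"99.9% loss quantile: {np.quantile(losses, 0.999):.2f}")

For the operator-method pricing framework mentioned at the end of the abstract, the analogous performance-critical primitive would be a single large dense matrix-matrix multiply (a BLAS level-3 GEMM, i.e. C = A @ B dispatched to an optimized BLAS), which is exactly the kind of kernel GPU or FPGA boards are used to accelerate.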
Clinical operations generation next… The age of technology and outsourcing
Temkar, Priya
2015-01-01
Huge cost pressures and the need to drive faster approvals have driven a technology transformation in the clinical trial (CT) industry. The CT industry is thus leveraging mobile data, cloud computing, social media, robotic automation, and electronic source to drive efficiencies in a big way. Outsourcing of clinical operations support services to technology companies with a clinical edge is gaining tremendous importance. This paper provides an overview of current technology trends, applicable Food and Drug Administration (FDA) guidelines, the basic challenges that the pharma industry faces in trying to implement such changes, and its shift towards outsourcing these services so that it can focus on site operations. PMID:26623386
Standards: The Keys to Domestic and International Competitiveness.
ERIC Educational Resources Information Center
Hunter, Robert D.
1993-01-01
Demonstrates the importance of standards for the competitiveness of U.S. companies and for international trade. The value of standards in research and development, marketing, design, purchasing, manufacturing, installation, and service is explained. Examples of specific standards and their application to the computer industry are included. (10…
1991-06-14
American firms of foreign origin as regards R&TD is being practised by the Department of Defense, and Sematech is one example here. As negotiations... taxation measures. Training schemes for staff in the banking sector encompassing both the financial side and computerized systems applications
Teacher Training for High Technology. Final Report.
ERIC Educational Resources Information Center
Goettmann, Thomas L.
The objective of this project was to develop computer literacy and a working knowledge of microprocessor applications and digital circuits for teachers in selected vocational subject areas. Twenty-four vocational trade and industry teachers completed 16 hours of training in microprocessor skills for computerized instruction and curriculum update.…
Secondary School Projects and the Microchip.
ERIC Educational Resources Information Center
Irvine, A. F.
This study of the applications of microelectronic devices in industry, together with an assessment of their value for use in schools, emphasizes the basic principles underlying the new technology and the practical ways in which these can contribute to associated work in computing and other disciplines in the school curriculum. Following a…
Building Virtual Models by Postprocessing Radiology Images: A Guide for Anatomy Faculty
ERIC Educational Resources Information Center
Tam, Matthew D. B. S.
2010-01-01
Radiology and radiologists are recognized as increasingly valuable resources for the teaching and learning of anatomy. State-of-the-art radiology department workstations with industry-standard software applications can provide exquisite demonstrations of anatomy, pathology, and more recently, physiology. Similar advances in personal computers and…
Application of process tomography in gas-solid fluidised beds in different scales and structures
NASA Astrophysics Data System (ADS)
Wang, H. G.; Che, H. Q.; Ye, J. M.; Tu, Q. Y.; Wu, Z. P.; Yang, W. Q.; Ocone, R.
2018-04-01
Gas-solid fluidised beds are commonly used in particle-related processes, e.g. for coal combustion and gasification in the power industry, and for coating and granulation in the pharmaceutical industry. Because the operation efficiency depends on the gas-solid flow characteristics, it is necessary to investigate the flow behaviour. This paper is about the application of process tomography, including electrical capacitance tomography (ECT) and microwave tomography (MWT), in multi-scale gas-solid fluidisation processes in the pharmaceutical and power industries. This is the first time that both ECT and MWT have been applied for this purpose across multiple scales and complex structures. To evaluate the sensor design and image reconstruction, and to investigate the effects of sensor structure and dimension on the image quality, a normalised sensitivity coefficient is introduced. In addition, computational fluid dynamics (CFD) analysis based on a computational particle fluid dynamics (CPFD) model and a two-phase fluid model (TFM) is used. Part of the CPFD-TFM simulation results are compared with and validated against experimental results from ECT and/or MWT. By both simulation and experiment, the complex hydrodynamic flow behaviour at different scales is analysed. Time-series capacitance data are analysed in both the time and frequency domains to reveal the flow characteristics.
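As a rough illustration of the time- and frequency-domain analysis of time-series capacitance data mentioned above, the following Python sketch is assumed rather than taken from the paper; the sampling rate and the synthetic signal standing in for ECT frame data are invented for the example.

```python
# Assumed sketch: time- and frequency-domain analysis of a time series of
# normalised ECT capacitance readings from a fluidised bed.
import numpy as np

fs = 200.0                                   # frames per second (assumed)
t = np.arange(0, 60, 1 / fs)                 # one minute of data
cap = 0.5 + 0.1 * np.sin(2 * np.pi * 2.5 * t) + 0.02 * np.random.randn(t.size)

# Time domain: mean solids concentration proxy and fluctuation intensity.
print("mean = %.3f, std = %.3f" % (cap.mean(), cap.std()))

# Frequency domain: power spectrum to locate the dominant bubbling frequency.
spectrum = np.abs(np.fft.rfft(cap - cap.mean())) ** 2
freqs = np.fft.rfftfreq(cap.size, d=1 / fs)
print("dominant frequency = %.2f Hz" % freqs[np.argmax(spectrum)])
```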
NASA Astrophysics Data System (ADS)
Wray, Timothy J.
Computational fluid dynamics (CFD) is routinely used in performance prediction and design of aircraft, turbomachinery, automobiles, and in many other industrial applications. Despite its wide range of use, deficiencies in its prediction accuracy still exist. One critical weakness is the accurate simulation of complex turbulent flows using the Reynolds-Averaged Navier-Stokes equations in conjunction with a turbulence model. The goal of this research has been to develop an eddy viscosity type turbulence model to increase the accuracy of flow simulations for mildly separated flows, flows with rotation and curvature effects, and flows with surface roughness. This is accomplished by developing a new zonal one-equation turbulence model which relies heavily on the flow physics; it is now known in the literature as the Wray-Agarwal one-equation turbulence model. The effectiveness of the new model is demonstrated by comparing its results with those obtained by the industry-standard one-equation Spalart-Allmaras model and the two-equation Shear Stress Transport k-ω model, as well as with experimental data. Results for subsonic, transonic, and supersonic flows in and about complex geometries are presented. It is demonstrated that the Wray-Agarwal model can provide industry and CFD researchers an accurate, efficient, and reliable turbulence model for the computation of a large class of complex turbulent flows.
Advanced Computational Methods in Bio-Mechanics.
Al Qahtani, Waleed M S; El-Anwar, Mohamed I
2018-04-15
A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with less trauma, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have an impact on surgery similar to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering. Computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since the skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used nowadays in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research projects funded by billions of dollars are still under way to build a new future for sports and human healthcare in what is called the biomechanics era.
Some infra-red applications in combustion technology. Interim report 1 March-31 August 78
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swithenbank, J.; Turan, A.; Taylor, D.S.
1978-01-01
Infrared technology finds many applications in the field of combustion, ranging from pollution monitoring, through military systems, to the control of industrial furnaces and boilers. This review of some selected concepts highlights the interaction between the diagnostic role of infrared measurements and the current status of mathematical modelling of combustion systems. The link between measurement and computing has also evolved to the point where a digital processor is becoming an inherent part of many new instruments. This point is illustrated by reference to the diffraction particle size meter, fire detection and alarm systems, and furnace control. In the future, as fuels become scarce and expensive, and micro-electronics become more available and inexpensive, it is certain that infrared devices will find increasing application in smaller industries and the home. (Author)
NASA Astrophysics Data System (ADS)
Brebbia, C. A.; Futagami, T.; Tanaka, M.
The boundary-element method (BEM) in computational fluid and solid mechanics is examined in reviews and reports of theoretical studies and practical applications. Topics presented include the fundamental mathematical principles of BEMs, potential problems, EM-field problems, heat transfer, potential-wave problems, fluid flow, elasticity problems, fracture mechanics, plates and shells, inelastic problems, geomechanics, dynamics, industrial applications of BEMs, optimization methods based on the BEM, numerical techniques, and coupling.
Fast computation of the kurtogram for the detection of transient faults
NASA Astrophysics Data System (ADS)
Antoni, Jérôme
2007-01-01
The kurtogram is a fourth-order spectral analysis tool recently introduced for detecting and characterising non-stationarities in a signal. The paradigm relies on the assertion that each type of transient is associated with an optimal (frequency/frequency resolution) dyad {f, Δf} which maximises its kurtosis, and hence its detection. However, the complete exploration of the whole (f, Δf) plane is a formidable task hardly amenable to on-line industrial applications. In this communication we describe a fast algorithm for computing the kurtogram over a grid that finely samples the (f, Δf) plane. Its complexity is on the order of N log N, similar to the FFT. The efficiency of the algorithm is then illustrated on several industrial cases concerned with the detection of incipient transient faults.
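To make the underlying quantity concrete, the following Python sketch evaluates the spectral kurtosis of a signal at a few dyadic frequency resolutions via the STFT and picks the (f, Δf) pair with the largest value. It is a brute-force illustration of the idea, not Antoni's fast N log N algorithm; the window lengths, the sampling rate, and the synthetic fault signal are assumptions made for the example.

```python
# Brute-force illustration of the kurtogram idea: spectral kurtosis on a coarse
# (frequency, frequency-resolution) grid. Not the fast algorithm of the paper.
import numpy as np
from scipy.signal import stft

def spectral_kurtosis(x, fs, nperseg):
    """Spectral kurtosis SK(f) from an STFT with window length nperseg."""
    f, _, X = stft(x, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
    p2 = np.mean(np.abs(X) ** 2, axis=1)
    p4 = np.mean(np.abs(X) ** 4, axis=1)
    return f, p4 / p2 ** 2 - 2.0          # ~0 for stationary Gaussian noise

if __name__ == "__main__":
    fs = 20_000
    x = np.random.randn(fs)                # one second of background noise
    x[::2000] += 20.0                      # periodic transients (simulated fault)
    best = None
    for nperseg in (32, 64, 128, 256, 512):   # dyadic frequency resolutions
        f, sk = spectral_kurtosis(x, fs, nperseg)
        i = np.argmax(sk)
        if best is None or sk[i] > best[0]:
            best = (sk[i], f[i], fs / nperseg)
    print("max SK %.1f at f = %.0f Hz, df = %.0f Hz" % best)
```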
NASA Astrophysics Data System (ADS)
Homainejad, Amir S.; Satari, Mehran
2000-05-01
Virtual reality (VR) brings users closer to reality by means of the computer, and a virtual environment (VE) is a simulated world that can take users to any point and direction of the object. VR and VE can be very useful if accurate and precise data are used, allowing users to work with a realistic model. Photogrammetry is a technique able to collect and provide accurate and precise data for building a 3D model in a computer. Data can be collected from various sensors and cameras, and data collection methods vary with the method of image acquisition. VR involves real-time graphics, 3D models, and displays, and it has applications in the entertainment industry, flight simulators, and industrial design.
The Computer Industry. High Technology Industries: Profiles and Outlooks.
ERIC Educational Resources Information Center
International Trade Administration (DOC), Washington, DC.
A series of meetings was held to assess future problems in United States high technology, particularly in the fields of robotics, computers, semiconductors, and telecommunications. This report, which focuses on the computer industry, includes a profile of this industry and the papers presented by industry speakers during the meetings. The profile…
An assembly system based on industrial robot with binocular stereo vision
NASA Astrophysics Data System (ADS)
Tang, Hong; Xiao, Nanfeng
2017-01-01
This paper proposes an electronic part and component assembly system based on an industrial robot with binocular stereo vision. Firstly, binocular stereo vision with a visual attention mechanism model is used to quickly locate the image regions which contain the electronic parts and components. Secondly, a deep neural network is adopted to recognize the features of the electronic parts and components. Thirdly, in order to control the end-effector of the industrial robot to grasp the electronic parts and components, a genetic algorithm (GA) is proposed to compute the transition matrix and the inverse kinematics of the industrial robot (end-effector), which plays a key role in bridging the binocular stereo vision and the industrial robot. Finally, the proposed assembly system is tested in LED component assembly experiments, and the results show that it has high efficiency and good applicability.
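As an illustration of how a genetic algorithm can solve an inverse kinematics problem, the sketch below fits the joint angles of an assumed planar two-link arm to a target end-effector position. The link lengths, target, population size, and mutation scheme are invented for the example and are not the configuration of the robot in the paper.

```python
# Assumed sketch: GA-based inverse kinematics for a planar two-link arm.
import numpy as np

L1, L2 = 0.4, 0.3                     # link lengths in metres (assumed)
TARGET = np.array([0.5, 0.2])         # desired end-effector position (assumed)

def forward(thetas):
    """Forward kinematics: joint angles -> end-effector position."""
    t1, t2 = thetas
    x = L1 * np.cos(t1) + L2 * np.cos(t1 + t2)
    y = L1 * np.sin(t1) + L2 * np.sin(t1 + t2)
    return np.array([x, y])

def fitness(thetas):
    """Negative distance to the target; larger is better."""
    return -np.linalg.norm(forward(thetas) - TARGET)

def ga(pop_size=60, generations=200, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-np.pi, np.pi, size=(pop_size, 2))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]                       # truncation selection
        kids = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        kids = kids + rng.normal(0.0, sigma, kids.shape)            # Gaussian mutation
        pop = np.vstack([parents, kids])
    return pop[np.argmax([fitness(ind) for ind in pop])]

if __name__ == "__main__":
    best = ga()
    print("joint angles:", best, "reached:", forward(best))
```

A real system would optimise over all joint angles of the manipulator and add joint-limit and collision penalties to the fitness function, but the selection-mutation loop has the same shape.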
Java Performance for Scientific Applications on LLNL Computer Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kapfer, C; Wissink, A
2002-05-10
Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.
Turbulent Bubbly Flow in a Vertical Pipe Computed By an Eddy-Resolving Reynolds Stress Model
2014-09-19
the numerical code OpenFOAM R©. 1 Introduction Turbulent bubbly flows are encountered in many industrially relevant applications, such as chemical in...performed using the OpenFOAM -2.2.2 computational code utilizing a cell- center-based finite volume method on an unstructured numerical grid. The...the mean Courant number is always below 0.4. The utilized turbulence models were implemented into the so-called twoPhaseEulerFoam solver in OpenFOAM , to
Testing For EM Upsets In Aircraft Control Computers
NASA Technical Reports Server (NTRS)
Belcastro, Celeste M.
1994-01-01
Effects of transient electrical signals evaluated in laboratory tests. Method of evaluating nominally fault-tolerant, aircraft-type digital-computer-based control system devised. Provides for evaluation of susceptibility of system to upset and evaluation of integrity of control when system subjected to transient electrical signals like those induced by electromagnetic (EM) source, in this case lightning. Beyond aerospace applications, fault-tolerant control systems becoming more widespread in industry, for example in automobiles. Method supports practical, systematic tests for evaluation of designs of fault-tolerant control systems.
[Hardware for graphics systems].
Goetz, C
1991-02-01
In all personal computer applications, be it for private or professional use, the decision of which "brand" of computer to buy is of central importance. In the USA Apple computers are mainly used in universities, while in Europe computers of the so-called "industry standard" by IBM (or clones thereof) have been increasingly used for many years. Independently of any brand name considerations, the computer components purchased must meet the current (and projected) needs of the user. Graphic capabilities and standards, processor speed, the use of co-processors, as well as input and output devices such as "mouse", printers and scanners are discussed. This overview is meant to serve as a decision aid. Potential users are given a short but detailed summary of current technical features.
NASA Astrophysics Data System (ADS)
Bouchpan-Lerust-Juéry, L.
2007-08-01
Current and next-generation on-board computer systems tend to implement real-time embedded control applications (e.g. Attitude and Orbit Control Subsystem (AOCS), Packet Utilization Standard (PUS), spacecraft autonomy, etc.) which must meet high standards of reliability, predictability and safety. Meeting these requirements demands a considerable amount of effort and cost from the space software industry. In its first part, this paper presents a free open-source integrated solution to develop RTAI applications, from analysis, design and simulation to direct implementation using code generation based on open source; in its second part, it summarises this suggested approach, its results and conclusions for further work.
CFD studies on biomass thermochemical conversion.
Wang, Yiqun; Yan, Lifeng
2008-06-01
Thermochemical conversion of biomass offers an efficient and economical process to provide gaseous, liquid and solid fuels and to prepare chemicals derived from biomass. Computational fluid dynamics (CFD) modeling applications on biomass thermochemical processes help to optimize the design and operation of thermochemical reactors. Recent progress in numerical techniques and computing power has established CFD as a widely used approach to provide efficient design solutions in industry. This paper introduces the fundamentals involved in developing a CFD solution. Mathematical equations governing the fluid flow, heat and mass transfer and chemical reactions in thermochemical systems are described, and sub-models for individual processes are presented. The paper also provides a review of various applications of CFD in the biomass thermochemical process field.
NASA Astrophysics Data System (ADS)
Liu, Lei; Hong, Xiaobin; Wu, Jian; Lin, Jintong
As Grid computing continues to gain popularity in industry and the research community, it also attracts more attention at the customer level. The large number of users and the high frequency of job requests in the consumer market make this setting challenging. Clearly, current Client/Server (C/S)-based architectures will become unfeasible for supporting large-scale Grid applications due to their poor scalability and poor fault tolerance. In this paper, based on our previous works [1, 2], a novel self-organized architecture to realize a highly scalable and flexible platform for Grids is proposed. Experimental results show that this architecture is suitable and efficient for consumer-oriented Grids.
Artificial Intelligence in Surgery: Promises and Perils.
Hashimoto, Daniel A; Rosman, Guy; Rus, Daniela; Meireles, Ozanan R
2018-07-01
The aim of this review was to summarize major topics in artificial intelligence (AI), including their applications and limitations in surgery. This paper reviews the key capabilities of AI to help surgeons understand and critically evaluate new AI applications and to contribute to new developments. AI is composed of various subfields that each provide potential solutions to clinical problems. Each of the core subfields of AI reviewed in this piece has also been used in other industries such as the autonomous car, social networks, and deep learning computers. A review of AI papers across computer science, statistics, and medical sources was conducted to identify key concepts and techniques within AI that are driving innovation across industries, including surgery. Limitations and challenges of working with AI were also reviewed. Four main subfields of AI were defined: (1) machine learning, (2) artificial neural networks, (3) natural language processing, and (4) computer vision. Their current and future applications to surgical practice were introduced, including big data analytics and clinical decision support systems. The implications of AI for surgeons and the role of surgeons in advancing the technology to optimize clinical effectiveness were discussed. Surgeons are well positioned to help integrate AI into modern practice. Surgeons should partner with data scientists to capture data across phases of care and to provide clinical context, for AI has the potential to revolutionize the way surgery is taught and practiced with the promise of a future optimized for the highest quality patient care.
A portable data-logging system for industrial hygiene personal chlorine monitoring.
Langhorst, M L; Illes, S P
1986-02-01
The combination of suitable portable sensors or instruments with small microprocessor-based data-logger units has made it possible to obtain detailed monitoring data for many health and environmental applications. Following data acquisition in field use, the logged data may be transferred to a desk-top personal computer for complete flexibility in manipulation of data and formatting of results. A system has been assembled from commercial components and demonstrated for chlorine personal monitoring applications. The system consists of personal chlorine sensors, a Metrosonics data-logger and reader unit, and an Apple II Plus personal computer. The computer software was developed to handle sensor calibration, data evaluation and reduction, report formatting and long-term storage of raw data on a disk. This system makes it possible to generate time-concentration profiles, evaluate dose above a threshold, quantitate short-term excursions and summarize time-weighted average (TWA) results. Field data from plant trials demonstrated feasibility of use, ruggedness and reliability. No significant differences were found between the time-weighted average chlorine concentrations determined by the sensor/logger system and two other methods: the sulfamic acid bubbler reference method and the 3M Poroplastic diffusional dosimeter. The sensor/data-logger system, however, provided far more information than the other two methods in terms of peak excursions, TWAs and exposure doses. For industrial hygiene applications, the system allows better definition of employee exposures, particularly for chemicals with acute as well as chronic health effects.
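As an illustration of the data reduction the abstract describes, the following is a minimal sketch of how a logged time-concentration series can be reduced to a time-weighted average, a dose above a threshold, and short-term excursions. The sample interval, threshold values, and concentration data are illustrative assumptions, not values from the cited system.

```python
# Minimal sketch: reducing a logged chlorine time-concentration series to a
# time-weighted average (TWA), a dose above a threshold, and excursion counts.
# Interval, thresholds, and data below are assumed for illustration only.
import numpy as np

dt_minutes = 1.0                                                   # assumed logging interval
conc_ppm = np.array([0.1, 0.2, 0.15, 1.2, 0.9, 0.3, 0.1, 0.05])   # example logged data

duration_min = dt_minutes * len(conc_ppm)
twa = conc_ppm.mean()                                              # TWA over the logged period
dose_above = np.clip(conc_ppm - 0.5, 0, None).sum() * dt_minutes   # ppm-min above 0.5 ppm
excursions = np.flatnonzero(conc_ppm > 1.0)                        # samples above a short-term limit

print(f"TWA over {duration_min:.0f} min: {twa:.2f} ppm")
print(f"Dose above 0.5 ppm threshold: {dose_above:.2f} ppm-min")
print(f"Short-term excursions (>1.0 ppm) at sample indices: {excursions}")
```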
NASA Astrophysics Data System (ADS)
Singh, Krishna P.; Baweja, Lokesh; Wolkenhauer, Olaf; Rahman, Qamar; Gupta, Shailendra K.
2018-03-01
Graphene-based nanomaterials (GBNMs) are widely used in various industrial and biomedical applications. GBNMs of different compositions, sizes and shapes are being introduced without thorough toxicity evaluation due to the unavailability of regulatory guidelines. Computational toxicity prediction methods are used by regulatory bodies to quickly assess health hazards caused by newer materials. Because of the increasing demand for GBNMs of various sizes and functional groups in industrial and consumer applications, rapid and reliable computational toxicity assessment methods are urgently needed. In the present work, we investigate the impact of graphene and graphene oxide nanomaterials on the structural conformations of the small hepcidin peptide and compare the materials for the structural and conformational changes they induce. Our molecular dynamics simulation studies revealed conformational changes in hepcidin due to its interaction with GBNMs, which result in a loss of its functional properties. Our results indicate that the hepcidin peptide undergoes severe structural deformations when superimposed on the graphene sheet in comparison to the graphene oxide sheet. These observations suggest that graphene is more toxic than a graphene oxide nanosheet of similar area. Overall, this study indicates that computational methods based on structural deformation, using molecular dynamics (MD) simulations, can be used for the early evaluation of the toxicity potential of novel nanomaterials.
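The abstract does not state which deformation metric was used; a common choice for quantifying structural deformation between two conformations is the RMSD after optimal superposition (Kabsch algorithm). The sketch below shows that computation with plain NumPy; the coordinates are synthetic stand-ins, not data from the cited simulations.

```python
# Minimal sketch: structural deformation measured as RMSD between two
# conformations after optimal superposition (Kabsch algorithm). Shown only as
# a common deformation measure; coordinates below are illustrative.
import numpy as np

def rmsd_after_superposition(P, Q):
    """RMSD between Nx3 coordinate sets P and Q after optimal rotation of P onto Q."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # correct for a possible reflection
    R = Vt.T @ D @ U.T                  # optimal rotation matrix
    P_rot = P @ R.T
    return np.sqrt(((P_rot - Q) ** 2).sum() / len(P))

rng = np.random.default_rng(1)
free_peptide = rng.normal(size=(25, 3))                       # stand-in backbone coordinates
adsorbed = free_peptide + rng.normal(0, 0.5, size=(25, 3))    # deformed copy
print(f"backbone RMSD: {rmsd_after_superposition(adsorbed, free_peptide):.2f} (arbitrary units)")
```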
New design environment for defect detection in web inspection systems
NASA Astrophysics Data System (ADS)
Hajimowlana, S. Hossain; Muscedere, Roberto; Jullien, Graham A.; Roberts, James W.
1997-09-01
One of the aims of industrial machine vision is to develop computer and electronic systems destined to replace human vision in the quality control of industrial production. In this paper we discuss a new design environment for real-time defect detection using a reconfigurable FPGA and a DSP processor mounted inside a DALSA programmable CCD camera. The FPGA is directly connected to the video data stream and outputs data to a low-bandwidth output bus. The system is targeted at web inspection but has the potential for broader application areas. We describe and show test results of the prototype system board, mounted inside a DALSA camera, and discuss some of the algorithms currently simulated and implemented for web inspection applications.
Enhancing Security by System-Level Virtualization in Cloud Computing Environments
NASA Astrophysics Data System (ADS)
Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei
Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run a variety of operating systems as needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and then the service and deployment models are introduced. Security issues and challenges in the implementation of cloud computing are analyzed. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.
Distributed data mining on grids: services, tools, and applications.
Cannataro, Mario; Congiusta, Antonio; Pugliese, Andrea; Talia, Domenico; Trunfio, Paolo
2004-12-01
Data mining algorithms are widely used today for the analysis of large corporate and scientific datasets stored in databases and data archives. Industry, science, and commerce fields often need to analyze very large datasets maintained over geographically distributed sites by using the computational power of distributed and parallel systems. The grid can play a significant role in providing an effective computational support for distributed knowledge discovery applications. For the development of data mining applications on grids we designed a system called Knowledge Grid. This paper describes the Knowledge Grid framework and presents the toolset provided by the Knowledge Grid for implementing distributed knowledge discovery. The paper discusses how to design and implement data mining applications by using the Knowledge Grid tools starting from searching grid resources, composing software and data components, and executing the resulting data mining process on a grid. Some performance results are also discussed.
Nanopyroxene Grafting with β-Cyclodextrin Monomer for Wastewater Applications.
Nafie, Ghada; Vitale, Gerardo; Carbognani Ortega, Lante; Nassar, Nashaat N
2017-12-06
Emerging nanoparticle technology provides opportunities for environmentally friendly wastewater treatment applications, including those in the large liquid tailings containments in the Alberta oil sands. In this study, we synthesize β-cyclodextrin-grafted nanopyroxenes to offer an ecofriendly platform for the selective removal of organic compounds typically present in these types of applications. We carry out computational modeling at the micro level, through molecular mechanics and molecular dynamics simulations, and laboratory experiments at the macro level to understand the interactions between the synthesized nanomaterials and two model naphthenic acid molecules (cyclopentanecarboxylic and trans-4-pentylcyclohexanecarboxylic acids) typically existing in tailings ponds. The proof-of-concept computational modeling and experiments demonstrate that the monomer-grafted nanopyroxenes (nano-AE) based on the sodium iron silicate aegirine are promising candidates for the removal of polar organic compounds from wastewater, among other applications. These nano-AE offer new possibilities for treating tailings ponds generated by the oil sands industry.
Photonics for aerospace sensors
NASA Astrophysics Data System (ADS)
Pellegrino, John; Adler, Eric D.; Filipov, Andree N.; Harrison, Lorna J.; van der Gracht, Joseph; Smith, Dale J.; Tayag, Tristan J.; Viveiros, Edward A.
1992-11-01
The maturation in the state-of-the-art of optical components is enabling increased applications for the technology. Most notable is the ever-expanding market for fiber optic data and communications links, familiar in both commercial and military markets. The inherent properties of optics and photonics, however, have suggested that components and processors may be designed that offer advantages over more commonly considered digital approaches for a variety of airborne sensor and signal processing applications. Various academic, industrial, and governmental research groups have been actively investigating and exploiting these properties of high bandwidth, large degree of parallelism in computation (e.g., processing in parallel over a two-dimensional field), and interconnectivity, and have succeeded in advancing the technology to the stage of systems demonstration. Such advantages as computational throughput and low operating power consumption are highly attractive for many computationally intensive problems. This review covers the key devices necessary for optical signal and image processors, some of the system application demonstration programs currently in progress, and active research directions for the implementation of next-generation architectures.
Present, future of automotive hybrid IC applications discussed
NASA Astrophysics Data System (ADS)
Matsuda, Nobuyoshi; Fukuoka, Atuhisa
1987-09-01
Hybrid ICs are presently utilized in various fields such as commercial televisions, VTRs, and audio devices, industrial communication equipment, computers, terminals, and automobiles. Their applications and operating environments are varied and diverse. The functions required of hybrid ICs range from simple high-density mounting for a system to the realization of higher-level mechanisms with the application of function timing. The functions are used as appropriate depending on the system containing the hybrid ICs and its circuit composition. Considering the structural and reliability requirements for automotive hybrid ICs, an application example of hybrid ICs that use the COMPACT package is discussed.
NASA Technical Reports Server (NTRS)
Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.
1992-01-01
Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.
A Review Study on Cloud Computing Issues
NASA Astrophysics Data System (ADS)
Kanaan Kadhim, Qusay; Yusof, Robiah; Sadeq Mahdi, Hamid; Al-shami, Sayed Samer Ali; Rahayu Selamat, Siti
2018-05-01
Cloud computing is the most promising current implementation of utility computing in the business world, because it provides some key features over classic utility computing, such as elasticity that allows clients to dynamically scale resources up and down at execution time. Nevertheless, cloud computing is still at an immature stage and suffers from a lack of standardization. Security issues are the main challenges to cloud computing adoption. Thus, critical industries such as government organizations (ministries) are reluctant to trust cloud computing for fear of losing their sensitive data, as it resides on the cloud with no knowledge of the data's location and a lack of transparency in the mechanisms Cloud Service Providers (CSPs) use to secure their data and applications, which has created a barrier against adopting this agile computing paradigm. This study aims to review and classify the issues that surround the implementation of cloud computing, a hot area that needs to be addressed by future research.
Nonlinear optical THz generation and sensing applications
NASA Astrophysics Data System (ADS)
Kawase, Kodo
2012-03-01
We have suggested a wide range of real-life applications using novel terahertz imaging techniques. High-resolution terahertz tomography was demonstrated with ultrashort terahertz pulses using an optical fiber and a nonlinear organic crystal. We also report on the thickness measurement of very thin films using a high-sensitivity metal mesh filter. Further, we have succeeded in non-destructive inspection that can monitor the soot distribution in a ceramic filter using millimeter-to-terahertz wave computed tomography. These techniques are directly applicable to non-destructive testing in industry.
A Computer Model for Teaching the Dynamic Behavior of AC Contactors
ERIC Educational Resources Information Center
Ruiz, J.-R. R.; Espinosa, A. G.; Romeral, L.
2010-01-01
AC-powered contactors are extensively used in industry in applications such as automatic electrical devices, motor starters, and heaters. In this work, a practical session that allows students to model and simulate the dynamic behavior of AC-powered electromechanical contactors is presented. Simulation is carried out using a rigorous parametric…
The Application of Large-Scale Hypermedia Information Systems to Training.
ERIC Educational Resources Information Center
Crowder, Richard; And Others
1995-01-01
Discusses the use of hypermedia in electronic information systems that support maintenance operations in large-scale industrial plants. Findings show that after establishing an information system, the same resource base can be used to train personnel how to use the computer system and how to perform operational and maintenance tasks. (Author/JMV)
Interdisciplinary Project Experiences: Collaboration between Majors and Non-Majors
ERIC Educational Resources Information Center
Smarkusky, Debra L.; Toman, Sharon A.
2014-01-01
Students in computer science and information technology should be engaged in solving real-world problems received from government and industry as well as those that expose them to various areas of application. In this paper, we discuss interdisciplinary project experiences between majors and non-majors that offered a creative and innovative…
Calibration Experiments for a Computer Vision Oyster Volume Estimation System
ERIC Educational Resources Information Center
Chang, G. Andy; Kerns, G. Jay; Lee, D. J.; Stanek, Gary L.
2009-01-01
Calibration is a technique commonly used in science and engineering research, where measurement tools must be calibrated to obtain more accurate measurements. It is an important technique in various industries. In many situations, calibration is an application of linear regression, and it is a good topic to be included when explaining and…
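To make the linear-regression view of calibration concrete, the following is a minimal sketch: fit the instrument's readings against known reference values, then invert the fitted line to correct new readings. The oyster-volume numbers are made up for illustration and are not data from the cited system.

```python
# Minimal sketch of calibration as linear regression: fit readings against
# known reference values, then invert the line to correct new readings.
# The numbers below are illustrative, not data from the cited system.
import numpy as np

reference_volume = np.array([10.0, 20.0, 30.0, 40.0, 50.0])     # known standards (cm^3)
instrument_reading = np.array([12.1, 21.8, 32.5, 41.9, 52.3])   # what the system reported

# Least-squares fit: reading ~= slope * volume + intercept
slope, intercept = np.polyfit(reference_volume, instrument_reading, deg=1)

def calibrated_volume(reading):
    """Invert the fitted line to recover a calibrated volume estimate."""
    return (reading - intercept) / slope

print(f"fit: reading = {slope:.3f} * volume + {intercept:.3f}")
print(f"calibrated estimate for a raw reading of 35.0: {calibrated_volume(35.0):.1f} cm^3")
```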
Videodisc/Microcomputer Technology in Wildland Fire Behavior Training
M. J. Jenkins; K.Y. Matsumoto-Grah
1987-01-01
Interactive video is a powerful medium, bringing together the emotional impact of video and film and the interactive capabilities of the computer. Interactive videodisc instruction can be used as a tutorial, for drill and practice and in simulations, as well as for information storage. Videodisc technology is being used in industrial, military and medical applications...
Electronic noses and tongues: Applications for the food and pharmaceutical industries
USDA-ARS?s Scientific Manuscript database
The electronic nose (enose) is designed to crudely mimic the human brain in that most enoses contain sensors that non-selectively interact with odor molecules to produce some sort of signal, which is then sent to a computer that uses multivariate statistics to determine patterns in the data. This pattern rec...
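As an illustration of the multivariate pattern-finding step, the following is a minimal sketch that projects non-selective sensor responses onto principal components so samples of the same odor cluster together. The six-sensor array and the sample values are illustrative assumptions, not data from the cited work.

```python
# Minimal sketch: PCA on e-nose sensor-array responses to reveal odor patterns.
# The 6-sensor array and values below are assumed for illustration only.
import numpy as np
from sklearn.decomposition import PCA

# rows = odor samples, columns = responses of 6 non-selective gas sensors
readings = np.array([
    [0.52, 0.81, 0.33, 0.10, 0.44, 0.27],   # sample of odor A
    [0.55, 0.79, 0.30, 0.12, 0.41, 0.25],   # sample of odor A
    [0.20, 0.35, 0.72, 0.60, 0.15, 0.58],   # sample of odor B
    [0.18, 0.33, 0.75, 0.63, 0.17, 0.55],   # sample of odor B
])

pca = PCA(n_components=2)
scores = pca.fit_transform(readings)        # 2-D pattern space
print("explained variance ratio:", pca.explained_variance_ratio_)
print("sample scores:\n", scores)           # samples of the same odor cluster together
```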
Trends in HFE Methods and Tools and Their Applicability to Safety Reviews
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Hara, J.M.; Plott, C.; Milanski, J.
2009-09-30
The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M&T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M&Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support its safety reviews. We conducted a survey that identified over 100 new HFE M&Ts. The M&Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's reviews by considering (1) whether the nuclear industry is making use of M&Ts for each trend, and (2) whether M&Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M&T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.
Microparticle Separation by Cyclonic Separation
NASA Astrophysics Data System (ADS)
Karback, Keegan; Leith, Alexander
2017-11-01
The ability to separate particles based on their size has wide-ranging applications, from the industrial to the medical. Currently, cyclonic separators are primarily used in agriculture and manufacturing to siphon contaminants or products out of an air supply. This has led us to believe that cyclonic separation has applications beyond the agricultural and industrial. Using the OpenFOAM computational package, we were able to determine the flow parameters of a vortex in a cyclonic separator in order to segregate dust particles down to a cutoff size of tens of nanometers. To test the model, we constructed an experiment to separate a test dust of various particle sizes. We filled a chamber with Arizona test dust and utilized an acoustic suspension technique to segregate particles finer than a coarse cutoff size and introduce them into the cyclonic separation apparatus, where they were further separated via a vortex following our computational model. The sizes of the particles separated in this experiment will be used to further refine our model. Metropolitan State University of Denver, Colorado University of Denver, Dr. Randall Tagg, Dr. Richard Krantz.
IP-Based Video Modem Extender Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierson, L G; Boorman, T M; Howe, R E
2003-12-16
Visualization is one of the keys to understanding large complex data sets such as those generated by the large computing resources purchased and developed by the Advanced Simulation and Computing program (aka ASCI). In order to be convenient to researchers, visualization data must be distributed to offices and large complex visualization theaters. Currently, local distribution of the visual data is accomplished by distance limited modems and RGB switches that simply do not scale to hundreds of users across the local, metropolitan, and WAN distances without incurring large costs in fiber plant installation and maintenance. Wide Area application over the DOE Complex is infeasible using these limited distance RGB extenders. On the other hand, Internet Protocols (IP) over Ethernet is a scalable well-proven technology that can distribute large volumes of data over these distances. Visual data has been distributed at lower resolutions over IP in industrial applications. This document describes requirements of the ASCI program in visual signal distribution for the purpose of identifying industrial partners willing to develop products to meet ASCI's needs.
NASA Astrophysics Data System (ADS)
Staszak, Katarzyna
2017-11-01
Membrane processes play an important role in industrial separation processes. These technologies can be found in all industrial areas such as food, beverages, metallurgy, pulp and paper, textiles, pharmaceuticals, automotive, biotechnology and the chemical industry, as well as in water treatment for domestic and industrial applications. Although these processes have been known since the twentieth century, there are still many studies that focus on testing new membrane materials and determining the conditions for optimal selectivity, i.e. the optimum transmembrane pressure (TMP) or permeate flux that minimizes fouling. Moreover, researchers have proposed calculation methods to predict membrane process properties. In this article, laboratory-scale experiments on membrane separation techniques, as well as their validation by calculation methods, are presented. Because the membrane is the "heart" of the process, experimental and computational methods for its characterization are also described.
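To show how transmembrane pressure relates to permeate flux in such calculations, the following is a minimal sketch using the resistance-in-series form of Darcy's law. The viscosity, membrane resistance, and fouling resistance are illustrative assumptions, not values from the experiments described above.

```python
# Minimal sketch: permeate flux from transmembrane pressure via Darcy's law,
# J = TMP / (mu * (Rm + Rf)). All parameter values are assumed for illustration.
mu = 1.0e-3        # water viscosity [Pa*s]
Rm = 2.0e12        # clean-membrane resistance [1/m] (assumed)
Rf = 5.0e11        # fouling-layer resistance [1/m] (assumed)

def permeate_flux(tmp_pa):
    """Permeate flux [m^3/(m^2*s)] for a given transmembrane pressure [Pa]."""
    return tmp_pa / (mu * (Rm + Rf))

for tmp_bar in (0.5, 1.0, 2.0):
    flux = permeate_flux(tmp_bar * 1e5)
    # convert to the commonly reported unit L/(m^2*h)
    print(f"TMP = {tmp_bar:.1f} bar -> flux = {flux * 1000 * 3600:.1f} L/(m^2*h)")
```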
Handbook of Industrial Engineering Equations, Formulas, and Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badiru, Adedeji B; Omitaomu, Olufemi A
The first handbook to focus exclusively on industrial engineering calculations with a correlation to applications, Handbook of Industrial Engineering Equations, Formulas, and Calculations contains a general collection of the mathematical equations often used in the practice of industrial engineering. Many books cover individual areas of engineering and some cover all areas, but none covers industrial engineering specifically, nor do they highlight topics such as project management, materials, and systems engineering from an integrated viewpoint. Written by acclaimed researchers and authors, this concise reference marries theory and practice, making it a versatile and flexible resource. Succinctly formatted for functionality, the book presents: Basic Math Calculations; Engineering Math Calculations; Production Engineering Calculations; Engineering Economics Calculations; Ergonomics Calculations; Facility Layout Calculations; Production Sequencing and Scheduling Calculations; Systems Engineering Calculations; Data Engineering Calculations; Project Engineering Calculations; and Simulation and Statistical Equations. It has been said that engineers make things while industrial engineers make things better. To make something better requires an understanding of its basic characteristics and the underlying equations and calculations that facilitate that understanding. To do this, however, you do not have to be computational experts; you just have to know where to get the computational resources that are needed. This book elucidates the underlying equations that facilitate the understanding required to improve design processes, continuously improving the answer to the age-old question: What is the best way to do a job?
Job Prospects for Computer Engineers.
ERIC Educational Resources Information Center
Basta, Nicholas
1988-01-01
Discusses the computer engineering industry in the United States. Recounts recent shifts in the computer industry and notes that despite foreign competition, the industry offers graduating computer engineers ample opportunities for employment. Claims that skill and technical knowledge are the most important assets for getting a job. (TW)
NASA Astrophysics Data System (ADS)
Jakovics, A.
2007-06-01
The International Scientific Colloquium "Modelling for Material Processing" took place last year on June 8-9. It was the fourth time the colloquium was organized. The first colloquium took place in 1999. All colloquia were organized by the University of Latvia together with Leibniz University of Hannover (Germany), which signifies a long-term tradition (since 1988) of scientific cooperation between researchers of these two universities in the field of electrothermal process modelling. During the last colloquium, scientific reports on mathematical modelling of industrial electromagnetic applications for different materials (liquid metals, semiconductor technology, porous materials, melting of oxides and inductive heating) were presented. 70 researchers from 10 countries attended the colloquium. The contributions included about 30 oral presentations and 12 posters. The most illustrative presentations (oral and poster) in the field of MHD were selected for publication in a special issue of the international journal "Magnetohydrodynamics". Traditionally, many reports of the colloquium discuss the problems of MHD methods and devices applied to metallurgical technologies and processes of semiconductor crystal growth. The new results illustrate the influence of combined electromagnetic fields on the hydrodynamics and heat/mass transfer in melts. The presented reports demonstrate that the models for simulation of turbulent liquid metal flows in melting furnaces, crystallization of alloys and single crystal growth in electromagnetic fields have become much more complex. The adequate description of the occurring physical phenomena and the use of high-performance computers and clusters allow the number of experiments in industrial facilities to be reduced. The use of software and computers for modelling technological and environmental processes has a very long history at the University of Latvia. The first modelling activities in the field of industrial MHD applications led to the establishment of the chair of Electrodynamics and Continuum Mechanics in 1970, whose first head was Professor Juris Mikelsons. In the early 90's, when all research institutions in our country underwent dramatic changes, not all research directions and institutions managed to adapt successfully to the new conditions. Fortunately, the people who were involved in computer modelling of physical processes were among the most successful. First, existing and newly established contacts in Western Europe were used actively to reorient applied research towards the directions actively studied at the universities and companies that were partners of the University of Latvia. As a result, research groups involved in these activities successfully joined the international effort related to the application of computer models to industrial processes, and the scientific laboratory for Mathematical Modelling of Environmental and Technological Processes was founded in 1994. The second direction of modelling development was related to the application of computer-based models to the environmental and technological processes (e.g., sediment transport in harbours, heat transfer in building constructions) that were important for companies and state institutions in Latvia.
Currently, the field of engineering physics, the core of which is the computer modelling of technological and environmental processes, is one of the largest and most successfully developing areas of research and educational programs at the Department of Physics of the University of Latvia, with very good prospects for the development of new technologies and knowledge transfer.
Thermodynamic and economic analysis of heat pumps for energy recovery in industrial processes
NASA Astrophysics Data System (ADS)
Urdaneta-B, A. H.; Schmidt, P. S.
1980-09-01
A computer code has been developed for analyzing the thermodynamic performance, cost and economic return for heat pump applications in industrial heat recovery. Starting with basic defining characteristics of the waste heat stream and the desired heat sink, the algorithm first evaluates the potential for conventional heat recovery with heat exchangers, and if applicable, sizes the exchanger. A heat pump system is then designed to process the residual heating and cooling requirements of the streams. In configuring the heat pump, the program searches a number of parameters, including condenser temperature, evaporator temperature, and condenser and evaporator approaches. All system components are sized for each set of parameters, and economic return is estimated and compared with system economics for conventional processing of the heated and cooled streams (i.e., with process heaters and coolers). Two case studies are evaluated, one in a food processing application and the other in an oil refinery unit.
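To illustrate the kind of parameter search the abstract describes, the following is a minimal sketch that sweeps condenser and evaporator approach temperatures, estimates the heat-pump COP as a fraction of the Carnot limit, and compares operating cost with a conventional process heater. All temperatures, efficiencies, and prices are illustrative assumptions, not the values used in the cited code.

```python
# Minimal sketch: sweep heat-pump design parameters, estimate COP from a
# fraction of the Carnot limit, and compare with a process heater.
# Every number below is an assumed, illustrative value.
import itertools

waste_heat_T = 45.0      # waste stream temperature [degC]
sink_T = 85.0            # required process heat temperature [degC]
carnot_fraction = 0.5    # assumed real-cycle efficiency relative to Carnot
elec_price = 0.08        # $/kWh electricity (assumed)
gas_price = 0.03         # $/kWh of heat delivered by a process heater (assumed)

best = None
for cond_approach, evap_approach in itertools.product([5, 10, 15], repeat=2):
    T_cond = sink_T + cond_approach + 273.15        # condensing temperature [K]
    T_evap = waste_heat_T - evap_approach + 273.15  # evaporating temperature [K]
    cop = carnot_fraction * T_cond / (T_cond - T_evap)
    cost_per_kwh_heat = elec_price / cop
    saving = gas_price - cost_per_kwh_heat          # $ saved per kWh of heat delivered
    if best is None or saving > best[0]:
        best = (saving, cond_approach, evap_approach, cop)

saving, ca, ea, cop = best
print(f"best approaches: condenser {ca} K, evaporator {ea} K, COP = {cop:.2f}")
print(f"saving vs. process heater: {saving * 100:.2f} cents per kWh of heat")
```

In practice the economics also depends on heat exchanger sizes (smaller approaches mean larger, costlier exchangers), which the cited code accounts for but this sketch deliberately omits.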
Computational analysis of fluid dynamics in pharmaceutical freeze-drying.
Alexeenko, Alina A; Ganguly, Arnab; Nail, Steven L
2009-09-01
An analysis of water vapor flows encountered in laboratory-scale and industrial pharmaceutical freeze-drying systems is presented, based on computational fluid dynamics (CFD) techniques. The flows under continuum gas conditions are analyzed using the solution of the Navier-Stokes equations, whereas the rarefied flow solutions are obtained by the direct simulation Monte Carlo (DSMC) method for the Boltzmann equation. Examples of the application of CFD techniques to laboratory-scale and industrial-scale freeze-drying processes are discussed, with an emphasis on the utility of CFD for improving the design and experimental characterization of pharmaceutical freeze-drying hardware and processes. The current article presents a two-dimensional simulation of a laboratory-scale dryer, with an emphasis on the importance of drying conditions and hardware design for process control, and a three-dimensional simulation of an industrial dryer, including a comparison of the obtained results with analytical viscous flow solutions. It was found that the presence of clean-in-place (CIP)/sterilize-in-place (SIP) piping in the duct led to significant changes in the flow field characteristics. The simulation results for vapor flow rates in an industrial freeze-dryer have been compared to tunable diode laser absorption spectroscopy (TDLAS) and gravimetric measurements.
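The choice between the Navier-Stokes (continuum) and DSMC (rarefied) treatments is commonly made via the Knudsen number. The sketch below estimates it from kinetic theory; the molecular diameter, temperature, duct dimension, and regime cutoff are illustrative assumptions, not parameters from the cited study.

```python
# Minimal sketch: continuum vs. rarefied regime check via the Knudsen number,
# Kn = lambda / L, with the mean free path from kinetic theory.
# Parameter values are assumed for illustration only.
import math

k_B = 1.380649e-23   # Boltzmann constant [J/K]
d = 4.0e-10          # assumed effective molecular diameter of water vapor [m]
T = 250.0            # assumed chamber temperature during drying [K]
L = 0.05             # assumed characteristic duct dimension [m]

for p in (100.0, 10.0, 1.0):   # chamber pressures [Pa]
    mean_free_path = k_B * T / (math.sqrt(2.0) * math.pi * d ** 2 * p)
    kn = mean_free_path / L
    regime = "continuum (Navier-Stokes)" if kn < 0.01 else "slip/transitional or rarefied (DSMC)"
    print(f"p = {p:6.1f} Pa: lambda = {mean_free_path * 1e3:.3f} mm, Kn = {kn:.4f} -> {regime}")
```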
NASA Astrophysics Data System (ADS)
Valle, Fabio
The paper analyzes satellite broadband systems for consumers from the perspective of technological innovation. The suggested interpretation relies upon such concepts as the technological paradigm, the technological trajectory and salient points. Satellite technology for broadband is a complex system in which each component (i.e. the satellite, the end-user equipment, the on-ground systems and related infrastructure) develops at a different speed. Innovation in this industry has recently concentrated on the satellite spacecraft, which appears to be the component with the highest perceived opportunity for improvement. The industry has recently designed satellite systems with continuously increasing capacity, suggesting that there is a technological trajectory in this area similar to Moore's law in the computer industry. The implications for industry players, Ka-band systems, and the growth of future applications are also examined.
University Researchers Approach to Providing Computer Simulations to Industry.
NASA Astrophysics Data System (ADS)
Birdsall, Charles
1996-05-01
University researchers perform in an exploratory mode in developing and applying computer simulations to their research problems. Post-docs and students make codes suited to their problems, and to thesis and article writing, with little code use planned beyond that. Industry product developers want well-tested, cleanly applicable simulation codes, with freedom to go to the code developers for bug fixing and improvements (and not to have to hunt for a student who has graduated). Hence, these different modes clash; some cushion of understanding and new elements are needed to effect broader, continuing use of university-developed codes. We and others have evolved approaches that appear to work, including providing free software, but with follow-ups done by small companies (see Ref. 1 for more). We will present our development of plasma device codes over 15 years, evolving into free distribution on the Internet (Ref. 2) with short courses and workshops; follow-ups are done by a small company (of former students, the code writers). In addition, an example of university code development will be given, that of the application of the series (or dipole) resonance to providing plasma surface-wave generated plasmas, drawing on decades-old research; potential applications will be given. We will present what other university groups are doing, and reflections on these modes by modelers and designers in the plasma processing industry (semiconductor manufacturing equipment companies), which is highly empirical at present. All of this interaction is still evolving. References: 1. J. Browning, Sci. Am., Jan. 1996, p. 35. 2. http://ptsg.eecs.berkeley.edu
NASA/CARES dual-use ceramic technology spinoff applications
NASA Technical Reports Server (NTRS)
Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.; Nemeth, Noel N.
1994-01-01
NASA has developed software that enables American industry to establish the reliability and life of ceramic structures in a wide variety of 21st-century applications. Designing ceramic components to survive at higher temperatures than most metals can withstand, and in severe loading environments, involves the disciplines of statistics and fracture mechanics. Successful application of advanced ceramics requires knowledge of material properties and the use of a probabilistic brittle-material design methodology. The NASA program, known as CARES (Ceramics Analysis and Reliability Evaluation of Structures), is a comprehensive general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. The latest version of this software, CARES/LIFE, is coupled to several commercially available finite element analysis programs (ANSYS, MSC/NASTRAN, ABAQUS, COSMOS/M, MARC), resulting in an advanced integrated design tool adapted to the computing environment of the user. The NASA-developed CARES software has been successfully used by industrial, government, and academic organizations to design and optimize ceramic components for many demanding applications. Industrial sectors impacted by this program include aerospace, automotive, electronic, medical, and energy applications. Dual-use applications include engine components, graphite and ceramic high-temperature valves, TV picture tubes, ceramic bearings, electronic chips, glass building panels, infrared windows, radiant heater tubes, heat exchangers, and artificial hips, knee caps, and teeth.
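The probabilistic idea behind such tools is commonly expressed with Weibull statistics for brittle materials. The following is a minimal sketch of a two-parameter Weibull failure probability under uniform uniaxial stress; the Weibull modulus, scale parameter, and volumes are illustrative assumptions, not CARES inputs or outputs.

```python
# Minimal sketch: two-parameter Weibull failure probability for a brittle
# ceramic under uniform uniaxial stress, P_f = 1 - exp(-(V/V0)*(sigma/sigma_0)^m).
# All parameter values are assumed for illustration only.
import math

m = 10.0          # Weibull modulus (assumed)
sigma_0 = 400.0   # characteristic strength [MPa] for the reference volume (assumed)
V0 = 1.0          # reference volume [cm^3]
V = 5.0           # stressed volume of the component [cm^3] (assumed)

def failure_probability(stress_mpa):
    """Probability of failure for a uniform uniaxial stress [MPa]."""
    return 1.0 - math.exp(-(V / V0) * (stress_mpa / sigma_0) ** m)

for stress in (200.0, 300.0, 350.0):
    print(f"stress = {stress:.0f} MPa -> P_f = {failure_probability(stress):.4f}")
```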
Machine learning methods for classifying human physical activity from on-body accelerometers.
Mannini, Andrea; Sabatini, Angelo Maria
2010-01-01
The use of on-body wearable sensors is widespread in several academic and industrial domains. Of great interest are their applications in ambulatory monitoring and pervasive computing systems; here, quantitative analysis of human motion and its automatic classification are the main computational tasks to be pursued. In this paper, we discuss how human physical activity can be classified using on-body accelerometers, with a major emphasis on the computational algorithms employed for this purpose. In particular, we motivate our current interest in classifiers based on Hidden Markov Models (HMMs). An example is illustrated and discussed by analysing a dataset of accelerometer time series.
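One common way to use HMMs for activity classification is to train one model per activity and label a new sequence by the model with the highest log-likelihood. The sketch below shows this with hmmlearn on synthetic data; the features, number of states, and activity labels are illustrative assumptions, not the setup used in the cited paper.

```python
# Minimal sketch of HMM-based activity classification: one Gaussian HMM per
# activity, new segments labeled by maximum log-likelihood. Data are synthetic.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

def make_sequence(mean, n=200):
    """Synthetic 3-axis accelerometer-feature sequence around a given mean."""
    return mean + 0.3 * rng.standard_normal((n, 3))

training = {
    "walking": make_sequence(np.array([1.0, 0.2, 0.1])),
    "sitting": make_sequence(np.array([0.0, 0.0, 1.0])),
}

models = {}
for activity, X in training.items():
    model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
    model.fit(X)                      # one HMM per activity class
    models[activity] = model

test = make_sequence(np.array([1.0, 0.2, 0.1]), n=80)    # unknown segment
scores = {a: m.score(test) for a, m in models.items()}    # log-likelihoods
print("predicted activity:", max(scores, key=scores.get))
```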
A Software Development Platform for Wearable Medical Applications.
Zhang, Ruikai; Lin, Wei
2015-10-01
Wearable medical devices have become a leading trend in the healthcare industry. Microcontrollers are computers on a chip with sufficient processing power and are the preferred embedded computing units in those devices. We have developed a software platform specifically for the design of wearable medical applications with a small code footprint on microcontrollers. It is supported by the open-source real-time operating system FreeRTOS and supplemented with a set of standard APIs for the architecture-specific hardware interfaces on the microcontrollers for data acquisition and wireless communication. We modified the tick counter routine in FreeRTOS to include a real-time soft clock. When combined with the multitasking features in FreeRTOS, the platform offers quick development of wearable applications and easy porting of the application code to different microprocessors. Test results have demonstrated that application software developed using this platform is highly efficient in CPU usage while maintaining a small code footprint to accommodate the limited memory space in microcontrollers.
Integrated Computational Materials Engineering for Magnesium in Automotive Body Applications
NASA Astrophysics Data System (ADS)
Allison, John E.; Liu, Baicheng; Boyle, Kevin P.; Hector, Lou; McCune, Robert
This paper provides an overview and progress report for an international collaborative project which aims to develop an ICME infrastructure for magnesium for use in automotive body applications. Quantitative processing-microstructure-property relationships are being developed for extruded Mg alloys, sheet-formed Mg alloys and high-pressure die-cast Mg alloys. These relationships are captured in computational models which are then linked with manufacturing process simulation and used to provide constitutive models for component performance analysis. The long-term goal is to capture this information in efficient computational models and in a web-centered knowledge base. The work is being conducted at leading universities, national labs and industrial research facilities in the US, China and Canada. This project is sponsored by the U.S. Department of Energy, the U.S. Automotive Materials Partnership (USAMP), the Chinese Ministry of Science and Technology (MOST) and Natural Resources Canada (NRCan).
The new landscape of parallel computer architecture
NASA Astrophysics Data System (ADS)
Shalf, John
2007-07-01
The past few years have seen a sea change in computer architecture that will impact every facet of our society, as every electronic device from cell phone to supercomputer will need to confront parallelism of unprecedented scale. Whereas the conventional multicore approach (2, 4, and even 8 cores) adopted by the computing industry will eventually hit a performance plateau, the highest performance per watt and per chip area is achieved using manycore technology (hundreds or even thousands of cores). However, fully unleashing the potential of the manycore approach to ensure future advances in sustained computational performance will require fundamental advances in computer architecture and programming models that are nothing short of reinventing computing. In this paper we examine the reasons behind the movement to exponentially increasing parallelism, and its ramifications for system design, applications and programming models.
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed based on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains related to spatial properties. We tested the performance of the platform with a taxi trajectory analysis. The results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
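As an illustration of the kind of spatiotemporal workload mentioned above (taxi-trajectory aggregation), the following is a minimal sketch written against plain PySpark rather than GISpark's own APIs, which are not shown in the abstract. The column names, input path, and grid size are illustrative assumptions.

```python
# Minimal sketch: count taxi GPS points per spatial grid cell with PySpark.
# Input path, schema, and grid size below are assumed for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("taxi-trajectory-sketch").getOrCreate()

points = spark.read.csv("hdfs:///data/taxi_points.csv", header=True, inferSchema=True)
# assumed columns: taxi_id, timestamp, lon, lat

cell = 0.01  # roughly 1 km grid cell in degrees (assumed)
counts = (points
          .withColumn("cell_x", F.floor(F.col("lon") / cell))
          .withColumn("cell_y", F.floor(F.col("lat") / cell))
          .groupBy("cell_x", "cell_y")
          .count()
          .orderBy(F.desc("count")))

counts.show(10)   # the ten busiest grid cells
spark.stop()
```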
Computer Utilization in Industrial Arts/Technology Education. Curriculum Guide.
ERIC Educational Resources Information Center
Connecticut Industrial Arts Association.
This guide is intended to assist industrial arts/technology education teachers in helping students in grades K-12 understand the impact of computers and computer technology in the world. Discussed in the introductory sections are the ways in which computers have changed the face of business, industry, and education and training; the scope and…
Moreno-Tapia, Sandra Veronica; Vera-Salas, Luis Alberto; Osornio-Rios, Roque Alfredo; Dominguez-Gonzalez, Aurelio; Stiharu, Ion; de Jesus Romero-Troncoso, Rene
2010-01-01
Computer numerically controlled (CNC) machines have evolved to adapt to increasing technological and industrial requirements. To cover these needs, new-generation machines have to perform monitoring strategies by incorporating multiple sensors. Since in most applications the online processing of the variables is essential, the use of smart sensors is necessary. The contribution of this work is the development of a wireless network platform of reconfigurable smart sensors for CNC machine applications complying with the measurement requirements of new-generation CNC machines. Four different smart sensors are put under test in the network and their corresponding signal processing techniques are implemented in a Field Programmable Gate Array (FPGA)-based sensor node.
NASA Astrophysics Data System (ADS)
Isnur Haryudo, Subuh; Imam Agung, Achmad; Firmansyah, Rifqi
2018-04-01
The purpose of this research is to develop learning media for control techniques using Matrix Laboratory software with an industry-requirements approach. Learning media serve as a tool for creating a better and more effective teaching and learning situation because they can accelerate the learning process and thus enhance the quality of learning. Control techniques taught with Matrix Laboratory software can increase the interest and attention of students, provide real experience and foster an independent attitude. The research design follows research and development (R & D) methods modified by a multi-disciplinary team of researchers. The research used a computer-based learning method consisting of a computer and Matrix Laboratory software integrated with props. Matrix Laboratory can visualize the theory and analysis of control systems, integrating computing, visualization and programming in a way that is easy to use. The result of this instructional media development is the use of mathematical equations in Matrix Laboratory software for a control system application with a DC motor plant and PID (Proportional-Integral-Derivative) control. This is relevant because distributed control systems (DCSs), programmable controllers (PLCs), and microcontrollers (MCUs) that apply PID control in production processes are widely used in industry.
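To show the kind of exercise this learning media targets, the following is a minimal sketch of a discrete PID controller driving a first-order DC-motor speed model. The motor gain, time constant, PID gains, and setpoint are illustrative assumptions, not the values used in the cited media (which uses Matrix Laboratory rather than Python).

```python
# Minimal sketch: discrete PID control of a first-order DC-motor speed model,
# d(omega)/dt = (K*u - omega) / tau. All parameter values are assumed.
import numpy as np

K, tau = 2.0, 0.5            # motor gain [(rad/s)/V] and time constant [s] (assumed)
Kp, Ki, Kd = 1.2, 2.0, 0.05  # assumed PID gains
dt, t_end = 0.001, 3.0
setpoint = 100.0             # desired speed [rad/s]

omega, integral, prev_err = 0.0, 0.0, 0.0
history = []
for step in range(int(t_end / dt)):
    err = setpoint - omega
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = Kp * err + Ki * integral + Kd * derivative   # controller output [V]
    u = np.clip(u, 0.0, 60.0)                        # actuator saturation
    omega += dt * (K * u - omega) / tau              # first-order motor model
    prev_err = err
    history.append(omega)

print(f"speed after {t_end:.1f} s: {history[-1]:.1f} rad/s (setpoint {setpoint:.0f})")
```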
Future prospect 2012-2025 - How will our business change for the next 10 years -
NASA Astrophysics Data System (ADS)
Tanaka, Sakae
2013-04-01
The purpose of this lecture is to discuss the "future": how will our business change in the next 10 years? I believe the key is three mega-trends: "sustainability", "cloud computing" and "life innovation". As the social environment develops, the business that is required will change, too. The future would remain invisible if you shut yourself up in a single industry. It is important to look across various business fields horizontally, and to recognize various key changes stereoscopically, such as demographics, economy, technology, sense of value and lifestyle, when you develop a mid- and long-term strategy. "Cloud" is a silent but real revolution in personal computing. It will bring drastic changes to every industry. It will make it possible to use "voice" and "moving images" as interfaces to access your computer. Cloud computing will also make client devices more diversified and spread the range of applications widely. Fifteen years ago, the term "IT" was equivalent to "personal computer". Recently, it rather means the use of smartphones and tablet devices. In the next several years, TVs and car-navigation systems will be connected to broadband and will become a part of personal computing. The meaning of personal computing is changing essentially year by year. In the near future, the universe of computing will expand to energy, medicine and healthcare, agriculture, and other fields. Only about 20 years have passed since computers came into full-scale use. Recently, computers have started understanding a few of our words and talking in babble like a baby. The history of computing has just started.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bramley, A.N.
1985-01-01
This book presents the Proceedings of the Second Materials Engineering Conference. This valuable collection of papers deals with the awareness, creative use, economics, reliability, selection, design, testing and warranty of materials. The papers address topics of both immediate and lasting industrial importance at a readily assimilated level and contain information which will lead speedily to improvements in industrial practice. Topics considered include recent developments in the science and technology of high-modulus polymers; computer-aided design of advanced composites; a systematic approach to materials testing in metal forming; new cold-working tool steels; friction surfacing and its applications; fatigue life assessment and materials engineering; alternative materials for internal combustion engines; adhesives and the engineer; thermoplastic bearings; engineering applications of ZA alloys; and utility and complexity in the selection of polymeric materials.
[Consideration of Mobile Medical Device Regulation].
Peng, Liang; Yang, Pengfei; He, Weigang
2015-07-01
The regulation of mobile medical devices is one of the hot topics in the industry now. The definition, regulation scope and requirements, and potential risks of mobile medical devices were analyzed and discussed based on mobile computing techniques and the FDA guidance on mobile medical applications. The regulation of mobile medical devices in China needs to adopt a risk-based method.
NASA Computational Fluid Dynamics Conference. Volume 2: Sessions 7-12
NASA Technical Reports Server (NTRS)
1989-01-01
The objectives of the conference were to disseminate CFD research results to industry and university CFD researchers, to promote synergy among NASA CFD researchers, and to permit feedback from researchers outside of NASA on issues pacing the discipline of CFD. The focus of the conference was on the application of CFD technology but also included fundamental activities.
Technological trends in automobiles.
Horton, E J; Compton, W D
1984-08-10
Current technological trends in the automotive industry reflect many diverse disciplines. Electronics and microprocessors, new engine transmission concepts, composite and ceramic materials, and computer-aided design and manufacture will combine to make possible the creation of advanced automobiles offering outstanding quality, fuel economy, and performance. A projected "average" vehicle of the 1990's is described to illustrate the application of these new concepts.
ERIC Educational Resources Information Center
Saulnier, Bruce
2016-01-01
To more effectively meet the expectations of industry for entry-level IT employees, a case is made for the inclusion of writing throughout the Computer Information Systems (CIS) curriculum. "Writing Across the Curriculum" ("WAC") principles are explained, and it is opined that both Writing to Learn (WTL) and Writing in the…
ERIC Educational Resources Information Center
White, Robert C.; Cormier, Robert J., Eds.
1986-01-01
In cooperation with the Maine Bureau of Rehabilitation, the University of Maine established the Rehabilitation Project in Data Processing in 1978 to train physically handicapped individuals to become business application computer programmers. Discusses various aspects of the program, considered one of the most successful rehabilitation programs in…
A Multiple Sensor Machine Vision System for Automatic Hardwood Feature Detection
D. Earl Kline; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman; Robert L. Brisbin
1993-01-01
A multiple sensor machine vision prototype is being developed to scan full size hardwood lumber at industrial speeds for automatically detecting features such as knots, holes, wane, stain, splits, checks, and color. The prototype integrates a multiple sensor imaging system, a materials handling system, a computer system, and application software. The prototype provides...
Application on Internet of Things Technology Using in Library Management
NASA Astrophysics Data System (ADS)
Liu, Xueqing; Sheng, Wenwen
Following the computer, the Internet and the mobile communication network, the Internet of Things (IoT) will bring a new stage of development to the information industry; moreover, it is a global technology revolution that is bound to have a profound impact on economic development and social life. This paper analyzes the key technologies and working principles of the IoT, its development at home and abroad, and its application in library management, and proposes its development direction and promotion programs in the field of library management.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
IPAD: A unique approach to government/industry cooperation for technology development and transfer
NASA Technical Reports Server (NTRS)
Fulton, Robert E.; Salley, George C.
1985-01-01
A key element to improved industry productivity is effective management of Computer Aided Design / Computer Aided Manufacturing (CAD/CAM) information. To stimulate advancement, a unique joint government/industry project designated Integrated Programs for Aerospace-Vehicle Design (IPAD) was carried out from 1971 to 1984. The goal was to raise aerospace industry productivity through advancement of computer based technology to integrate and manage information involved in the design and manufacturing process. IPAD research was guided by an Industry Technical Advisory Board (ITAB) composed of over 100 representatives from aerospace and computer companies. The project complemented traditional NASA/DOD research to develop aerospace design technology and the Air Force's Integrated Computer Aided Manufacturing (ICAM) program to advance CAM technology. IPAD had unprecedented industry support and involvement and served as a unique approach to government industry cooperation in the development and transfer of advanced technology. The IPAD project background, approach, accomplishments, industry involvement, technology transfer mechanisms and lessons learned are summarized.
NASA Astrophysics Data System (ADS)
Claussen, U.
1984-01-01
The improvement of contrast and visibility of LCD by two different means was undertaken. The two methods are: (1) development of fluorescent dyes to increase the visibility of fluorescent activated displays (FLAD); and (2) development of dichroic dyes to increase the contrast of displays. This work was done in close cooperation with the electronic industry, where the newly synthesized dyes were tested. The targets for the chemical synthesis were selected with the help of computer model calculations. A marketable range of dyes was developed. Since the interest of the electronic industries concerning FLAD was low, the investigations were stopped. Dichroic dyes, especially black mixtures with good light fastness, order parameter, and solubility in nematic phases were developed. The application of these dyes is restricted to indoor use because of an increase of viscosity below -10 C. Applications on a technical scale, e.g., for the automotive industry, will be possible if the displays work at temperatures down to -40 C. This problem requires a complex optimization of the dye/nematic phase system.
Biophysical functionality in polysaccharides: from Lego-blocks to nano-particles.
Cesàro, Attilio; Bellich, Barbara; Borgogna, Massimiliano
2012-04-01
The objective of the paper is to show the very important biophysical concepts that have been developed with polysaccharides. In particular, an attempt will be made to relate "a posteriori" the fundamental aspects, both experimental and theoretical, with some industrial applications of polysaccharide-based materials. The overview of chain conformational aspects includes relationships between topological features and local dynamics, exemplified for some naturally occurring carbohydrate polymers. Thus, by using simulation techniques and computational studies, the physicochemical properties of aqueous solutions of polysaccharides are interpreted. The relevance of conformational disorder-order transitions, chain aggregation, and phase separation to the underlying role of the ionic contribution to these processes is discussed. We stress the importance of combining information from analysis of experimental data with that from statistical-thermodynamic models for understanding the conformation, size, and functional stability of industrially important polysaccharides. The peculiar properties of polysaccharides in industrial applications are summarized for the particularly important example of nanoparticles production, a field of growing relevance and scientific interest.
The Technology Information Environment with Industry{trademark} system description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Detry, R.; Machin, G.
The Technology Information Environment with Industry (TIE-In{trademark}) provides users with controlled access to distributed laboratory resources that are packaged in intelligent user interfaces. These interfaces help users access resources without requiring the user to have technical or computer expertise. TIE-In utilizes existing, proven technologies such as the Kerberos authentication system, X-Windows, and UNIX sockets. A Front End System (FES) authenticates users and allows them to register for resources and subsequently access them. The FES also stores status and accounting information, and provides an automated method for the resource owners to recover costs from users. The resources available through TIE-In are typically laboratory-developed applications that are used to help design, analyze, and test components in the nation's nuclear stockpile. Many of these applications can also be used by US companies for non-weapons-related work. TIE-In allows these industry partners to obtain laboratory-developed technical solutions without requiring them to duplicate the technical resources (people, hardware, and software) at Sandia.
Multicore Programming Challenges
NASA Astrophysics Data System (ADS)
Perrone, Michael
The computer industry is facing fundamental challenges that are driving a major change in the design of computer processors. Due to restrictions imposed by quantum physics, one historical path to higher computer processor performance - by increased clock frequency - has come to an end. Increasing clock frequency now leads to power consumption costs that are too high to justify. As a result, we have seen in recent years that the processor frequencies have peaked and are receding from their high point. At the same time, competitive market conditions are giving business advantage to those companies that can field new streaming applications, handle larger data sets, and update their models to market conditions faster. The desire for newer, faster and larger is driving continued demand for higher computer performance.
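As background to the power constraint described above (not stated in the abstract itself), the standard CMOS dynamic-power relation summarizes why frequency scaling became uneconomical; the symbols are the usual textbook ones, not quantities from the cited work:

```latex
P_{\mathrm{dyn}} \approx \alpha \, C \, V_{\mathrm{dd}}^{2} \, f
```

Here α is the switching activity factor, C the switched capacitance, V_dd the supply voltage and f the clock frequency. With supply-voltage scaling largely exhausted, any further increase in f raises power and heat roughly in proportion, which is why processor vendors added cores at roughly constant frequency instead.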
Computational Tools and Facilities for the Next-Generation Analysis and Design Environment
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)
1997-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including, real-time simulations, immersive systems, collaborative engineering environment, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.
Computer graphics in architecture and engineering
NASA Technical Reports Server (NTRS)
Greenberg, D. P.
1975-01-01
The present status of the application of computer graphics to the building profession or architecture and its relationship to other scientific and technical areas were discussed. It was explained that, due to the fragmented nature of architecture and building activities (in contrast to the aerospace industry), a comprehensive, economic utilization of computer graphics in this area is not practical and its true potential cannot now be realized due to the present inability of architects and structural, mechanical, and site engineers to rely on a common data base. Future emphasis will therefore have to be placed on a vertical integration of the construction process and effective use of a three-dimensional data base, rather than on waiting for any technological breakthrough in interactive computing.
An integrated computational tool for precipitation simulation
NASA Astrophysics Data System (ADS)
Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.
2011-07-01
Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.
NASA Technical Reports Server (NTRS)
Liu, D. D.; Kao, Y. F.; Fung, K. Y.
1989-01-01
A transonic equivalent strip (TES) method was further developed for unsteady flow computations of arbitrary wing planforms. The TES method consists of two consecutive correction steps to a given nonlinear code such as LTRAN2; namely, the chordwise mean flow correction and the spanwise phase correction. The computation procedure requires direct pressure input from other computed or measured data. Otherwise, it does not require airfoil shape or grid generation for given planforms. To validate the computed results, four swept wings of various aspect ratios, including those with control surfaces, are selected as computational examples. Overall trends in unsteady pressures are established with those obtained by XTRAN3S codes, Isogai's full potential code and measured data by NLR and RAE. In comparison with these methods, the TES has achieved considerable saving in computer time and reasonable accuracy which suggests immediate industrial applications.
The case for responsibility of the IT industry to promote equality for women in computing.
Turner, E
2001-04-01
This paper investigates the relationship between the role that information technology (IT) has played in the development of women's employment, the possibility of women having a significant influence on the technology's development, and the way that the IT industry perceives women as computer scientists, users and consumers. The industry's perception of women and men is investigated through the portrayal of them in computing advertisements. While women are increasingly updating their technological skills and know-how, and through this process are entering some positions in the workplace traditionally occupied by men, these achievements are not mirrored in their social and occupational status. The computer industry and higher education have worryingly low numbers of women, while the possibility of women influencing the development of computer technology is just emerging in feminist research. This paper argues that, though the IT industry, through their self-regulatory codes, subscribes to equal treatment of sexes, races and persons with disabilities, the industry nevertheless paints a stereotyped picture of inequality when portraying men and women in computer advertisements. As long as such a perception of women prevails within the industry, it will stand as a barrier to women having equal access to computer technology. If advertisements influence the way society perceives major social constructs and issues, then the computing industry has a social responsibility to portray men and women in an equal and non-stereotypical fashion.
Virtualization in education: Information Security lab in your hands
NASA Astrophysics Data System (ADS)
Karlov, A. A.
2016-09-01
The growing demand for qualified specialists in advanced information technologies poses serious challenges to the education and training of young personnel for science, industry and social problems. Virtualization as a way to isolate the user from the physical characteristics of computing resources (processors, servers, operating systems, networks, applications, etc.), has, in particular, an enormous influence in the field of education, increasing its efficiency, reducing the cost, making it more widely and readily available. The study of Information Security of computer systems is considered as an example of use of virtualization in education.
Proceedings of a Conference on Telecommunication Technologies, Networkings and Libraries
NASA Astrophysics Data System (ADS)
Knight, N. K.
1981-12-01
Current and developing technologies for digital transmission of image data likely to have an impact on the operations of libraries and information centers or provide support for information networking are reviewed. Technologies reviewed include slow scan television, teleconferencing, and videodisc technology; standards development for computer network interconnection through hardware and software, particularly packet switched networks; computer network protocols for library and information service applications; the structure of a national bibliographic telecommunications network; and the major policy issues involved in the regulation or deregulation of the common communications carriers industry.
An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints
Sung, Jinmo; Jeong, Bongju
2014-01-01
The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approach, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm to efficiently obtain good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments. PMID:24701158
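The abstract does not give the operators themselves, but the key idea it names, genetic operators that never produce precedence-violating offspring, can be sketched as follows. This is a minimal illustrative sketch, not the authors' algorithm; the precedence representation and the names (`precedes`, `feasible_insert_mutation`) are assumptions made purely for illustration.

```python
import random

def is_feasible(tour, precedes):
    """True if every (a, b) pair in `precedes` has a appearing before b in the tour."""
    pos = {city: i for i, city in enumerate(tour)}
    return all(pos[a] < pos[b] for a, b in precedes)

def feasible_insert_mutation(tour, precedes):
    """Remove one city and reinsert it only at positions that keep the
    precedence constraints satisfied, so offspring are always feasible."""
    tour = tour[:]
    i = random.randrange(len(tour))
    city = tour.pop(i)
    candidates = [j for j in range(len(tour) + 1)
                  if is_feasible(tour[:j] + [city] + tour[j:], precedes)]
    j = random.choice(candidates)  # the original slot is always feasible
    return tour[:j] + [city] + tour[j:]

# toy usage: city 0 must precede city 2, and city 1 must precede city 3
precedes = [(0, 2), (1, 3)]
tour = [0, 1, 2, 3, 4]
print(feasible_insert_mutation(tour, precedes))
```

Restricting operators to feasibility-preserving moves avoids costly repair or rejection steps, which is the efficiency gain the abstract refers to.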
NASA Astrophysics Data System (ADS)
Xiao, Jie
Polymer nanocomposites have great potential to become a dominant coating material in a wide range of applications in the automotive, aerospace, ship-making, construction, and pharmaceutical industries. However, how to realize design sustainability for this type of nanostructured material and how to ensure the true optimality of product quality and process performance in coating manufacturing remain mountaintop challenges. The major challenges arise from the intrinsic multiscale nature of the material-process-product system and the need to manage high levels of complexity and uncertainty in design and manufacturing processes. This research centers on the development of a comprehensive multiscale computational methodology and a computer-aided tool set that can facilitate multifunctional nanocoating design and application, from novel function envisioning and idea refinement, to knowledge discovery and design solution derivation, and further to performance testing in industrial applications and life cycle analysis. The principal idea is to achieve exceptional system performance through concurrent characterization and optimization of materials, products, and the associated manufacturing processes covering a wide range of length and time scales. Multiscale modeling and simulation techniques ranging from microscopic molecular modeling to classical continuum modeling are seamlessly coupled. The tight integration of different methods and theories at individual scales allows the prediction of macroscopic coating performance from fundamental molecular behavior. Goal-oriented design is also pursued by integrating additional methods for bio-inspired dynamic optimization and computational task management that can be implemented in a hierarchical computing architecture. Furthermore, multiscale systems methodologies are developed to achieve the best possible material application towards sustainable manufacturing. Automotive coating manufacturing, which involves paint spray and curing, is specifically discussed in this dissertation. Nevertheless, the multiscale considerations for sustainable manufacturing, the novel concept of IPP control, and the new PPDE-based optimization method are applicable to other types of manufacturing, e.g., metal coating development through electroplating. It is demonstrated that the methodological developments in this dissertation can greatly facilitate experimentalists in novel material invention and new knowledge discovery. At the same time, they can provide scientific guidance and reveal various new opportunities and effective strategies for sustainable manufacturing.
Intelligent robot trends and predictions for the first year of the new millennium
NASA Astrophysics Data System (ADS)
Hall, Ernest L.
2000-10-01
An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The current use of these machines in outer space, medicine, hazardous materials, defense applications and industry is being pursued with vigor. In factory automation, industrial robots can improve productivity, increase product quality and improve competitiveness. The computer and the robot have both been developed during recent times. The intelligent robot combines both technologies and requires a thorough understanding and knowledge of mechatronics. Today's robotic machines are faster, cheaper, more repeatable, more reliable and safer than ever. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision and motion controls. New areas of usage are emerging for service robots, remote manipulators and automated guided vehicles. Economically, the robotics industry now has more than a billion-dollar market in the U.S. and is growing. Feasibility studies show decreasing costs for robots and unaudited healthy rates of return for a variety of robotic applications. However, the road from inspiration to successful application can be long and difficult, often taking decades to achieve a new product. A greater emphasis on mechatronics is needed in our universities. Certainly, more cooperation between government, industry and universities is needed to speed the development of intelligent robots that will benefit industry and society. The fearful robot stories may help us prevent future disaster. The inspirational robot ideas may inspire the scientists of tomorrow. However, the intelligent robot ideas, which can be reduced to practice, will change the world.
NASA Technical Reports Server (NTRS)
Polzien, R. E.; Rodriguez, D.
1981-01-01
Aspects of incorporating a thermal energy transport system (ETS) into a field of parabolic dish collectors for industrial process heat (IPH) applications were investigated. Specific objectives are to: (1) verify the mathematical optimization of pipe diameters and insulation thicknesses calculated by a computer code; (2) verify the cost model for pipe network costs using conventional pipe network construction; (3) develop a design and the associated production costs for incorporating risers and downcomers on a low cost concentrator (LCC); (4) investigate the cost reduction of using unconventional pipe construction technology. The pipe network design and costs for a particular IPH application, specifically solar thermally enhanced oil recovery (STEOR) are analyzed. The application involves the hybrid operation of a solar powered steam generator in conjunction with a steam generator using fossil fuels to generate STEOR steam for wells. It is concluded that the STEOR application provides a baseline pipe network geometry used for optimization studies of pipe diameter and insulation thickness, and for development of comparative cost data, and operating parameters for the design of riser/downcomer modifications to the low cost concentrator.
Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing
NASA Astrophysics Data System (ADS)
Klems, Markus; Nimis, Jens; Tai, Stefan
On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
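To make the cost-comparison idea concrete, here is a deliberately simplified break-even sketch in Python. It is not the valuation framework proposed in the paper; the cost figures and the linear pay-per-use model are invented purely for illustration.

```python
def on_premises_cost(years, capex=120_000.0, opex_per_year=30_000.0):
    """Fixed infrastructure: up-front hardware plus yearly operating expense."""
    return capex + opex_per_year * years

def cloud_cost(years, hours_per_year=8_760, price_per_hour=4.0):
    """Pay-per-use: no capital expense, charged per consumed compute hour."""
    return price_per_hour * hours_per_year * years

# compare cumulative cost over a five-year horizon
for years in range(1, 6):
    print(years, on_premises_cost(years), cloud_cost(years))
```

A real valuation would, as the paper argues, also account for elasticity, opportunity costs, and the benefits of faster deployment rather than direct costs alone.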
Applications of lasers and electro-optics
NASA Astrophysics Data System (ADS)
Tan, B. C.; Low, K. S.; Chen, Y. H.; Ahmad, Harith; Tou, T. Y.
Supported by the IRPA Programme on Laser Technology and Applications, many types of lasers have been designed, constructed and applied in various areas of science, medicine and industry. Amongst the lasers constructed were high power carbon dioxide lasers, rare gas halide excimer lasers, solid state Neodymium-YAG lasers, nitrogen lasers, flashlamp pumped dye lasers, and nitrogen and excimer laser pumped dye lasers. These lasers and the associated electro-optic systems, some with computer control, are designed and developed for the following areas of application: (1) industrial applications of high power carbon dioxide lasers for the making of i.c. components and other materials processing purposes -- prototype operational systems have been developed; (2) medical applications of lasers for cancer treatment using the technique of photodynamic therapy -- a new and more effective treatment protocol has been proposed; (3) agricultural applications of lasers in palm oil and palm fruit fluorescence diagnostic studies -- a fruit ripeness signature has been developed and palm oil oxidation levels were investigated; (4) development of atmospheric pollution monitoring systems using laser lidar techniques -- laboratory scale systems were developed; and (5) other applications of lasers, including laser holographic and interferometric methods for the non-destructive testing of materials.
Study on the integration approaches to CAD/CAPP/FMS in garment CIMS
NASA Astrophysics Data System (ADS)
Wang, Xiankui; Tian, Wensheng; Liu, Chengying; Li, Zhizhong
1995-08-01
Computer integrated manufacturing system (CIMS), as an advanced methodology, has been applied in many industry fields. There is, however, little research on the application of CIMS in the garment industry, especially on the integrated approach to CAD, CAPP, and FMS in garment CIMS. In this paper, the current situations of CAD, CAPP, and FMS in the garment industry are discussed, and information requirements between them as well as the integrated approaches are also investigated. The representation of the garments' product data by the group technology coding is proposed. Based on the group technology, a shared data base as an integration element can be constructed, which leads to the integration of CAD/CAPP/FMS in garment CIMS.
Micro-video display with ocular tracking and interactive voice control
NASA Technical Reports Server (NTRS)
Miller, James E.
1993-01-01
In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movements and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated with a desktop computer and with a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovative Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.
A computerized compensator design algorithm with launch vehicle applications
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Mcdaniel, W. L., Jr.
1976-01-01
This short paper presents a computerized algorithm for the design of compensators for large launch vehicles. The algorithm is applicable to the design of compensators for linear, time-invariant, control systems with a plant possessing a single control input and multioutputs. The achievement of frequency response specifications is cast into a strict constraint mathematical programming format. An improved solution algorithm for solving this type of problem is given, along with the mathematical necessities for application to systems of the above type. A computer program, compensator improvement program (CIP), has been developed and applied to a pragmatic space-industry-related example.
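The abstract describes casting frequency-response specifications as strict constraints in a mathematical program. A generic and much simplified version of that idea, tuning a lead-compensator gain and corner frequencies so that the loop magnitude meets bounds at sample frequencies, might look like the sketch below. The plant, the compensator form, and the numeric specifications are all assumptions made for illustration, not the CIP algorithm from the paper.

```python
import numpy as np
from scipy.optimize import minimize

w = np.logspace(-1, 2, 60)                     # sample frequencies, rad/s

def loop_mag(params, w):
    k, z, p = params                           # gain, zero and pole of a lead compensator
    comp = k * (1j * w / z + 1) / (1j * w / p + 1)
    plant = 1.0 / ((1j * w) * (1j * w + 1))    # assumed plant: integrator plus first-order lag
    return np.abs(comp * plant)

def objective(params):
    return params[0]                           # e.g. prefer the smallest workable gain

constraints = [
    # at least 20 dB of loop gain below 0.5 rad/s (tracking / disturbance rejection)
    {"type": "ineq", "fun": lambda p: np.min(loop_mag(p, w[w < 0.5])) - 10.0},
    # at most -20 dB above 30 rad/s (noise attenuation / robustness)
    {"type": "ineq", "fun": lambda p: 0.1 - np.max(loop_mag(p, w[w > 30.0]))},
]

result = minimize(objective, x0=[5.0, 1.0, 10.0], constraints=constraints)
print(result.x, result.success)
```

The point of the sketch is only the structure: specifications become inequality constraints evaluated on the frequency response, and a numerical optimizer searches the compensator parameters.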
NASA Astrophysics Data System (ADS)
Popa, L.; Popa, V.
2017-08-01
The article focuses on modeling an electro-pneumatically operated industrial robotic arm and simulating its operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on the Zelio Logic automation platform. The innovative modeling and simulation procedures address specific problems in the development of a new type of technical product in the field of robotics. In this way, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions with varying levels of complexity.
NASA Astrophysics Data System (ADS)
Beshears, Ronald D.; Hediger, Lisa H.
1994-10-01
The Advanced Computed Tomography Inspection System (ACTIS) was developed by the Marshall Space Flight Center to support in-house solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through Technology Utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has even been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been demonstrated, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing Aerospace Company. Smaller systems, based on ACTIS technology are becoming increasingly available. This technology has much to offer small businesses and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in pursuing this technology.
NASA Technical Reports Server (NTRS)
Hediger, Lisa H.
1991-01-01
The Advanced Computed Tomography Inspection System (ACTIS) was developed by NASA Marshall to support solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through technology utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been shown, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing. Smaller systems, based on ACTIS technology, are becoming increasingly available. This technology has much to offer the small business and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in this technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dennig, Yasmin
Sandia National Laboratories has a long history of significant contributions to the high performance computing community and industry. Our innovative computer architectures allowed the United States to become the first to break the teraFLOP barrier, propelling us into the international spotlight. Our advanced simulation and modeling capabilities have been integral in high consequence US operations such as Operation Burnt Frost. Strong partnerships with industry leaders, such as Cray, Inc. and Goodyear, have enabled them to leverage our high performance computing (HPC) capabilities to gain a tremendous competitive edge in the marketplace. As part of our continuing commitment to providing modern computing infrastructure and systems in support of Sandia missions, we made a major investment in expanding Building 725 to serve as the new home of HPC systems at Sandia. Work is expected to be completed in 2018 and will result in a modern facility of approximately 15,000 square feet of computer center space. The facility will be ready to house the newest National Nuclear Security Administration/Advanced Simulation and Computing (NNSA/ASC) Prototype platform being acquired by Sandia, with delivery in late 2019 or early 2020. This new system will enable continuing advances by Sandia science and engineering staff in the areas of operating system R&D, operational cost effectiveness (power and innovative cooling technologies), user environment, and application code performance.
NASA Technical Reports Server (NTRS)
Habib-Agahi, H.
1981-01-01
Market assessment, refined with analysis disaggregated from a national level to the regional level and to specific market applications, resulted in more accurate and detailed market estimates. The development of an integrated set of computer simulations, coupled with refined market data, allowed progress in the ability to evaluate the worth of solar thermal parabolic dish systems. In-depth analyses of both electric and thermal market applications of these systems are described. The following market assessment studies were undertaken: (1) regional analysis of the near term market for parabolic dish systems; (2) potential early market estimate for electric applications; (3) potential early market estimate for industrial process heat/cogeneration applications; and (4) selection of thermal and electric application case studies for fiscal year 1981.
Ship Trim Optimization: Assessment of Influence of Trim on Resistance of MOERI Container Ship
Duan, Wenyang
2014-01-01
Environmental issues and rising fuel prices necessitate better energy efficiency in all sectors. The shipping industry is a stakeholder in environmental issues, being responsible for approximately 3% of global CO2 emissions, 14-15% of global NOX emissions, and 16% of global SOX emissions. Ship trim optimization has gained enormous momentum in recent years as an effective operational measure for better energy efficiency and reduced emissions. Ship trim optimization analysis has traditionally been done through tow-tank testing for a specific hullform. Computational techniques are increasingly popular in ship hydrodynamics applications. The purpose of this study is to present MOERI container ship (KCS) hull trim optimization by employing computational methods. Computed total resistance, trim, and sinkage values for the KCS hull in the even-keel condition are compared with experimental values and found to be in reasonable agreement. The agreement validates that the mesh, boundary conditions, and solution techniques are correct. The same mesh, boundary conditions, and solution techniques are used to obtain resistance values in different trim conditions at Fn = 0.2274. Based on the attained results, an optimum trim is suggested. This research serves as a foundation for employing computational techniques for ship trim optimization. PMID:24578649
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hillis, D.R.
A computer-based simulation with an artificial intelligence component and discovery learning was investigated as a method to formulate training needs for new or unfamiliar technologies. Specifically, the study examined whether this simulation method would provide for the recognition of applications and knowledge/skills which would be the basis for establishing training needs. The study also examined the effect of field-dependence/independence on recognition of applications and knowledge/skills. A pretest-posttest control group experimental design involving fifty-eight college students from an industrial technology program was used. The study concluded that the simulation was effective in developing recognition of applications and knowledge/skills for a new or unfamiliar technology. Moreover, the simulation's effectiveness in providing this recognition was not limited by an individual's field-dependence/independence.
Methodology, status and plans for development and assessment of TUF and CATHENA codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luxat, J.C.; Liu, W.S.; Leung, R.K.
1997-07-01
An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA, with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally, a process for systematic assessment of these codes is described which is part of a broader, industry-based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area.
Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs
NASA Astrophysics Data System (ADS)
Pianese, C.; Sorrentino, M.
2009-08-01
Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, the high working temperatures enable their use in efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort, and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs destined for a wide application area, ranging from automotive to marine and airplane APUs.
Understanding of and applications for robot vision guidance at KSC
NASA Technical Reports Server (NTRS)
Shawaga, Lawrence M.
1988-01-01
The primary thrust of robotics at KSC is for the servicing of Space Shuttle remote umbilical docking functions. In order for this to occur, robots performing servicing operations must be capable of tracking a swaying Orbiter in Six Degrees of Freedom (6-DOF). Currently, in NASA KSC's Robotic Applications Development Laboratory (RADL), an ASEA IRB-90 industrial robot is being equipped with a real-time computer vision (hardware and software) system to allow it to track a simulated Orbiter interface (target) in 6-DOF. The real-time computer vision system effectively becomes the eyes for the lab robot, guiding it through a closed loop visual feedback system to move with the simulated Orbiter interface. This paper will address an understanding of this vision guidance system and how it will be applied to remote umbilical servicing at KSC. In addition, other current and future applications will be addressed.
Commercial and industrial applications of color ink jet: a technological perspective
NASA Astrophysics Data System (ADS)
Dunand, Alain
1996-03-01
In just 5 years, color ink-jet has become the dominant technology for printing color images and graphics in the office and home markets. In commercial printing, the traditional printing processes are being influenced by new digital techniques. Color ink-jet proofing, and concepts such as computer to film/plate or digital processes are contributing to the evolution of the industry. In industrial color printing, the penetration of digital techniques is just beginning. All widely used conventional contact printing technologies involve mechanical printing forms including plates, screens or engraved cylinders. Such forms, which need to be newly created and set up for each job, increase costs. In our era of fast changing customer demands, growing needs for customization, and increasing use of digital exchange of information, the commercial and industrial printing markets represent an enormous potential for digital printing technologies. The adoption characteristics for the use of color ink-jet in these industries are discussed. Examples of color ink-jet applications in the fields of billboard printing, floor/wall covering decoration, and textile printing are described. The requirements on print quality, productivity, reliability, substrate compatibility, and color lead to the consideration of various types of ink-jet technologies. Key technical enabling factors and directions for future improvements are presented.
Bu, Yifan; Cui, Yinglu; Peng, Ying; Hu, Meirong; Tian, Yu'e; Tao, Yong; Wu, Bian
2018-04-01
Xylanases, which cleave the β-1,4-glycosidic bond between xylose residues to release xylooligosaccharides (XOS), are widely used as food additives, animal feeds, and pulp bleaching agents. However, the thermally unstable nature of xylanases hampers their industrial application. In this study, we applied in silico design to a glycoside hydrolase family (GH) 11 xylanase to stabilize the enzyme. A combination of the best mutations increased the apparent melting temperature by 14 °C and significantly enhanced thermostability and thermoactivation. The variant also showed an upward-shifted optimal temperature for catalysis without compromising its activity at low temperatures. Moreover, a 10-fold higher XOS production yield was obtained at 70 °C, which compensated for the low yield obtained with the wild-type enzyme. Collectively, the variant constructed by the computational strategy can be used as an efficient biocatalyst for XOS production under industrially viable conditions.
CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability
NASA Technical Reports Server (NTRS)
Claus, Russell; Weitzer, Ilan
2002-01-01
Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meeting these demands is to use 'Geometry Centric' design. In this approach, design engineers team their efforts through one unified representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link enabling 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and a proposed application.
Advances in Integrated Computational Materials Engineering "ICME"
NASA Astrophysics Data System (ADS)
Hirsch, Jürgen
The methods of Integrated Computational Materials Engineering that were developed and successfully applied for Aluminium have been constantly improved. The main aspects and recent advances of integrated material and process modeling are simulations of material properties, such as strength and forming properties, and of the specific microstructure evolution during processing (rolling, extrusion, annealing) under the influence of material constitution and process variations, through the production process down to the final application. Examples are discussed for the through-process simulation of microstructures and related properties of Aluminium sheet, including DC ingot casting, pre-heating and homogenization, hot and cold rolling, and final annealing. New results on the simulation of solution annealing and age hardening of 6xxx alloys for automotive applications are included. Physically based quantitative descriptions and computer-assisted evaluation methods are new ICME approaches for integrating simulation tools, also for customer applications such as heat-affected zones in welding of age-hardening alloys. The aspects of estimating the effect of specific elements due to growing recycling volumes, requested also for high-end Aluminium products, are also discussed, being of special interest in the Aluminium producing industries.
Robust tuning of robot control systems
NASA Technical Reports Server (NTRS)
Minis, I.; Uebel, M.
1992-01-01
The computed torque control problem is examined for a robot arm with flexible, geared joint drive systems, which are typical of many industrial robots. The standard computed torque algorithm is not directly applicable to this class of manipulators because of the dynamics introduced by the joint drive system. The proposed approach to computed torque control combines a computed torque algorithm with a torque controller at each joint. Three such control schemes are proposed. The first scheme uses the joint torque control system currently implemented on the robot arm and a novel form of the computed torque algorithm. The other two use the standard computed torque algorithm and a novel torque control system based on model following techniques. Standard tasks and performance indices are used to evaluate the performance of the controllers. Both numerical simulations and experiments are used in the evaluation. The study shows that all three proposed systems lead to improved tracking performance over a conventional PD controller.
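For readers unfamiliar with the baseline algorithm referenced above, the standard textbook computed torque law, before the joint-drive and model-following extensions discussed in the paper, can be sketched as below. The dynamics terms `M`, `C` and `g` are placeholders for a real robot model, and the gains are illustrative; this is not the paper's modified controller.

```python
import numpy as np

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, g, Kp, Kd):
    """Standard computed-torque (inverse-dynamics) control:
    tau = M(q) (qdd_des + Kd (qd_des - qd) + Kp (q_des - q)) + C(q, qd) qd + g(q)."""
    e, ed = q_des - q, qd_des - qd
    v = qdd_des + Kd @ ed + Kp @ e            # outer-loop PD on the tracking error
    return M(q) @ v + C(q, qd) @ qd + g(q)    # inner-loop cancellation of the rigid dynamics

# toy 2-joint example with made-up constant dynamics terms
M = lambda q: np.diag([2.0, 1.0])
C = lambda q, qd: np.zeros((2, 2))
g = lambda q: np.zeros(2)
Kp, Kd = np.diag([100.0, 100.0]), np.diag([20.0, 20.0])
tau = computed_torque(np.zeros(2), np.zeros(2),
                      np.array([0.5, -0.3]), np.zeros(2), np.zeros(2),
                      M, C, g, Kp, Kd)
print(tau)
```

The paper's point is that when flexible, geared drives sit between the motor and the link, this rigid-body cancellation alone is insufficient, hence the added per-joint torque loops.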
Synthetic mixed-signal computation in living cells
Rubens, Jacob R.; Selvaggio, Gianluca; Lu, Timothy K.
2016-01-01
Living cells implement complex computations on the continuous environmental signals that they encounter. These computations involve both analogue- and digital-like processing of signals to give rise to complex developmental programs, context-dependent behaviours and homeostatic activities. In contrast to natural biological systems, synthetic biological systems have largely focused on either digital or analogue computation separately. Here we integrate analogue and digital computation to implement complex hybrid synthetic genetic programs in living cells. We present a framework for building comparator gene circuits to digitize analogue inputs based on different thresholds. We then demonstrate that comparators can be predictably composed together to build band-pass filters, ternary logic systems and multi-level analogue-to-digital converters. In addition, we interface these analogue-to-digital circuits with other digital gene circuits to enable concentration-dependent logic. We expect that this hybrid computational paradigm will enable new industrial, diagnostic and therapeutic applications with engineered cells. PMID:27255669
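The composition idea in the abstract, thresholded comparators combined into band-pass filters and multi-level analog-to-digital converters, can be mimicked in a few lines of ordinary code. This is only a conceptual analogue with arbitrary threshold values; it says nothing about how the genetic circuits are implemented biologically.

```python
def comparator(signal, threshold):
    """Digitize an analogue input: 1 if the signal exceeds the threshold, else 0."""
    return 1 if signal > threshold else 0

def band_pass(signal, low, high):
    """ON only when the input lies between two thresholds (AND of two comparators)."""
    return int(comparator(signal, low) and not comparator(signal, high))

def analog_to_digital(signal, thresholds):
    """Multi-level ADC: the output level is the number of thresholds exceeded."""
    return sum(comparator(signal, t) for t in sorted(thresholds))

for x in (0.2, 1.5, 3.7):
    print(x, band_pass(x, 1.0, 3.0), analog_to_digital(x, [1.0, 2.0, 3.0]))
```

Composing such comparators with downstream digital logic is the software analogue of the concentration-dependent logic the authors build in cells.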
A brief overview of NASA Langley's research program in formal methods
NASA Technical Reports Server (NTRS)
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
Technologies for Achieving Field Ubiquitous Computing
NASA Astrophysics Data System (ADS)
Nagashima, Akira
Although the term "ubiquitous" may sound like jargon used in information appliances, ubiquitous computing is an emerging concept in industrial automation. This paper presents the author's visions of field ubiquitous computing, which is based on the novel Internet Protocol IPv6. IPv6-based instrumentation will realize the next generation of manufacturing excellence. This paper focuses on the following five key issues: 1. IPv6 standardization; 2. IPv6 interfaces embedded in field devices; 3. Compatibility with FOUNDATION fieldbus; 4. Network security for field applications; and 5. Wireless technologies to complement IP instrumentation. Furthermore, the principles of digital plant operations and ubiquitous production that support the above key technologies to achieve field ubiquitous systems are discussed.
NDE for the 21st century: industry 4.0 requires NDE 4.0 (Conference Presentation)
NASA Astrophysics Data System (ADS)
Meyendorf, Norbert G.
2017-04-01
Industry 4.0 stands for the fourth industrial revolution that is ongoing at present. Industry 4.0 is a term used mainly in Europe to characterize the integration of production and communication technologies, the so-called "smart factory". The first industrial revolution was the mechanization of work; the second was mass production and the assembly line; the third was computer-integrated manufacturing. Industry 4.0 encompasses the complete networking of all industrial areas. Lower costs and efficient just-in-time production will become possible even for small numbers of very unique parts, for example through additive manufacturing (3D printing). A significant aspect is also the quality and maintainability of these sometimes unique structures and components. NDE has to follow these trends, not only by adapting NDE techniques to the new technologies, but also by introducing the capability of cyber systems into inspection and maintenance processes. The requirements and challenges of this new technological area will be discussed, and opportunities for applying new technologies and systems for NDE will be demonstrated online.
Methods of Mathematical and Computational Physics for Industry, Science, and Technology
NASA Astrophysics Data System (ADS)
Melnik, Roderick V. N.; Voss, Frands
2006-11-01
Many industrial problems provide scientists with important and challenging problems that need to be solved today rather than tomorrow. The key role of mathematical physics, modelling, and computational methodologies in addressing such problems continues to increase. Science has never been exogenous to applied research. Gigantic ships and steam engines, repeating catapult of Dionysius and the Antikythera `computer' invented around 80BC are just a few examples demonstrating a profound link between theoretical and applied science in the ancient world. Nowadays, many industrial problems are typically approached by groups of researchers who are working as a team bringing their expertise to the success of the entire enterprise. Since the late 1960s several groups of European mathematicians and scientists have started organizing regular meetings, seeking new challenges from industry and contributing to the solution of important industrial problems. In particular, this often took the format of week-long workshops originally initiated by the Oxford Study Groups with Industry in 1968. Such workshops are now held in many European countries (typically under the auspices of the European Study Groups with Industry - ESGI), as well as in Australia, Canada, the United States, and other countries around the world. Problems given by industrial partners are sometimes very difficult to complete within a week. However, during a week of brainstorming activities these problems inevitably stimulate developing fruitful new ideas, new approaches, and new collaborations. At the same time, there are cases where as soon as the problem is formulated mathematically, it is relatively easy to solve. Hence, putting the industrial problem into a mathematical framework, based on physical laws, often provides a key element to the success. In addition to this important first step, the value in such cases is the real, practical applicability of the results obtained for an industrial partner who presents the problem. Under both outlined scenarios, scientists and mathematicians are provided with an opportunity to challenge themselves with real-world problems and to work together in a team on important industrial issues. This issue is a result of selected contributions by participants of the meeting that took place in the Sønderborg area of Denmark, one of the most important centers for information technology, telecommunication and electronics in the country. The meeting was hosted by the University of Southern Denmark in a picturesque area of Southern Jutland. It brought together about 65 participants, among whom were professional mathematicians, engineers, physicists, and industrial participants. The meeting was a truly international one, with delegates from four major Danish Universities, the UK, Norway, Italy, Czech Republic, Turkey, China, Germany, Latvia, Canada, the United States, and Finland. Five challenging projects were presented by leading industrial companies, including Grundfos, Danfoss Industrial Control, Unisensor, and Danfoss Flow Division (now Siemens). The meeting featured also the Mathematics for Industry Workshop with several distinguished international speakers. This volume of Journal of Physics: Conference Series on `Methods of Mathematical and Computational Physics for Industry, Science, and Technology' contains contributions from some of the participants of the workshop as well as the papers produced as a result of collaborative efforts with the above mentioned industrial companies. 
We would like to thank all authors and participants for their contributions and for bearing with us during the review process and preparation of this issue. We thank also all our referees for their timely and detailed reports. The publication of the proceedings of this meeting in Denmark was delayed due to problems with a previous publisher. We are very grateful that Journal of Physics: Conference Series kindly agreed to publish the proceedings rapidly at this late stage. As industrial problems become increasingly multidisciplinary, their successful solutions are often contingent on effective collaborative efforts between scientists, mathematicians, industrialists, and engineers. This volume has provided several examples of such collaborative efforts in the context of real-world industrial problems along with the analysis of important physics-based mathematical models applicable in a range of industrial contexts. Roderick V N Melnik, Professor of Mathematical Modelling, Syddansk Universitet (Denmark) and Professor and Canada Research Chair, Wilfrid Laurier University, Waterloo, Canada E-mail: rmelnik@wlu.ca Frands Voss, Director of the Mads Clausen Institute, Syddansk Universitet (Denmark)
Artificial Intelligence Controls Tape-Recording Sequence
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.; Otamura, Roy M.; Zottarelli, Lawrence J.
1989-01-01
Developmental expert-system computer program intended to schedule recording of large amounts of data on limited amount of magnetic tape. Schedules recording using two sets of rules. First set incorporates knowledge of locations for recording of new data. Second set incorporates knowledge about issuing commands to recorder. Designed primarily for use on Voyager Spacecraft, also applicable to planning and sequencing in industry.
Using Biometric Measurement in Real-Time as a Sympathetic System in Computer Games
ERIC Educational Resources Information Center
Charij, Stephanie; Oikonomou, Andreas
2013-01-01
With the increasing potential for gaming hardware and peripherals to support biometrics, their application within the games industry for software and design should be considered. This paper assesses the ability to use a form of biometric measurement, heart rate, in real-time to improve the challenge and enjoyment of a game by catering it to…
NASA Technical Reports Server (NTRS)
Murthy, T. Sreekanta; Kvaternik, Raymond G.
1991-01-01
A NASA/industry rotorcraft structural dynamics program known as Design Analysis Methods for VIBrationS (DAMVIBS) was initiated at Langley Research Center in 1984 with the objective of establishing the technology base needed by the industry for developing an advanced finite-element-based vibrations design analysis capability for airframe structures. As a part of the in-house activities contributing to that program, a study was undertaken to investigate the use of formal, nonlinear programming-based, numerical optimization techniques for airframe vibrations design work. Considerable progress has been made in connection with that study since its inception in 1985. This paper presents a unified summary of the experiences and results of that study. The formulation and solution of airframe optimization problems are discussed. Particular attention is given to describing the implementation of a new computational procedure based on MSC/NASTRAN and CONstrained function MINimization (CONMIN) in a computer program system called DYNOPT for the optimization of airframes subject to strength, frequency, dynamic response, and fatigue constraints. The results from the application of the DYNOPT program to the Bell AH-1G helicopter are presented and discussed.
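For orientation only, the sketch below (in Python, not the DYNOPT system itself) illustrates the kind of constrained sizing problem such a tool solves: SciPy's SLSQP stands in for CONMIN, and the mass and frequency functions are invented surrogates rather than MSC/NASTRAN responses.

```python
# Illustrative sketch only: a toy constrained sizing problem solved with SciPy's
# SLSQP, standing in for the CONMIN optimizer coupled to a structural solver in
# DYNOPT. The "frequency" model below is a made-up surrogate, not NASTRAN output.
import numpy as np
from scipy.optimize import minimize

def mass(x):
    # structural mass to minimize (arbitrary units); x = member thicknesses
    return 2.0 * x[0] + 3.0 * x[1]

def first_mode_frequency(x):
    # hypothetical surrogate: stiffness grows with thickness, and so does frequency
    return 10.0 * np.sqrt(x[0] + 0.5 * x[1])

constraints = [
    # keep the first elastic mode above 25 Hz (placeholder requirement)
    {"type": "ineq", "fun": lambda x: first_mode_frequency(x) - 25.0},
]
bounds = [(0.1, 5.0), (0.1, 5.0)]          # gauge limits on each design variable

result = minimize(mass, x0=[1.0, 1.0], method="SLSQP",
                  bounds=bounds, constraints=constraints)
print(result.x, first_mode_frequency(result.x))
```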
Automated surface inspection for steel products using computer vision approach.
Xi, Jiaqi; Shentu, Lifeng; Hu, Jikang; Li, Mian
2017-01-10
Surface inspection is a critical step in ensuring the product quality in the steel-making industry. In order to relieve inspectors of laborious work and improve the consistency of inspection, much effort has been dedicated to the automated inspection using computer vision approaches over the past decades. However, due to non-uniform illumination conditions and similarity between the surface textures and defects, the present methods are usually applicable to very specific cases. In this paper a new framework for surface inspection has been proposed to overcome these limitations. By investigating the image formation process, a quantitative model characterizing the impact of illumination on the image quality is developed, based on which the non-uniform brightness in the image can be effectively removed. Then a simple classifier is designed to identify the defects among the surface textures. The significance of this approach lies in its robustness to illumination changes and wide applicability to different inspection scenarios. The proposed approach has been successfully applied to the real-time surface inspection of round billets in real manufacturing. Implemented on a conventional industrial PC, the algorithm can proceed at 12.5 frames per second with the successful detection rate being over 90% for turned and skinned billets.
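The two-stage idea described above (illumination correction followed by a simple classifier) can be sketched roughly as follows; this is not the paper's actual model, and the smoothing scale and threshold are assumed values.

```python
# Minimal sketch (not the paper's model): correct non-uniform illumination by
# dividing out a smooth background estimate, then flag defect pixels with a
# simple statistical threshold. Parameters (sigma, k) are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_illumination(image, sigma=51):
    """Estimate the slowly varying brightness field and normalise it away."""
    background = gaussian_filter(image.astype(float), sigma=sigma)
    return image / np.maximum(background, 1e-6)

def detect_defects(image, k=3.0):
    """Return a boolean mask of pixels deviating strongly from the texture."""
    flat = correct_illumination(image)
    mu, std = flat.mean(), flat.std()
    return np.abs(flat - mu) > k * std

# usage (hypothetical frame): mask = detect_defects(billet_gray_frame)
```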
Quantum mechanics implementation in drug-design workflows: does it really help?
Arodola, Olayide A; Soliman, Mahmoud Es
2017-01-01
The pharmaceutical industry is progressively operating in an era where development costs are constantly under pressure, higher percentages of drugs are demanded, and the drug-discovery process is a trial-and-error run. The profit that flows in with the discovery of new drugs has always been the motivation for the industry to keep up the pace and keep abreast with the endless demand for medicines. The process of finding a molecule that binds to the target protein using in silico tools has made computational chemistry a valuable tool in drug discovery in both academic research and the pharmaceutical industry. However, the complexity of many protein-ligand interactions challenges the accuracy and efficiency of the commonly used empirical methods. The usefulness of quantum mechanics (QM) in drug-protein interaction cannot be overemphasized; however, this approach has little significance in some empirical methods. In this review, we discuss recent developments in, and application of, QM to medically relevant biomolecules. We critically discuss the different types of QM-based methods and how they may be incorporated into drug-design and -discovery workflows, while trying to answer a critical question: are QM-based methods of real help in drug-design and -discovery research and industry?
Non-linear aeroelastic prediction for aircraft applications
NASA Astrophysics Data System (ADS)
de C. Henshaw, M. J.; Badcock, K. J.; Vio, G. A.; Allen, C. B.; Chamberlain, J.; Kaynes, I.; Dimitriadis, G.; Cooper, J. E.; Woodgate, M. A.; Rampurawala, A. M.; Jones, D.; Fenwick, C.; Gaitonde, A. L.; Taylor, N. V.; Amor, D. S.; Eccles, T. A.; Denley, C. J.
2007-05-01
Current industrial practice for the prediction and analysis of flutter relies heavily on linear methods and this has led to overly conservative design and envelope restrictions for aircraft. Although the methods have served the industry well, it is clear that for a number of reasons the inclusion of non-linearity in the mathematical and computational aeroelastic prediction tools is highly desirable. The increase in available and affordable computational resources, together with major advances in algorithms, mean that non-linear aeroelastic tools are now viable within the aircraft design and qualification environment. The Partnership for Unsteady Methods in Aerodynamics (PUMA) Defence and Aerospace Research Partnership (DARP) was sponsored in 2002 to conduct research into non-linear aeroelastic prediction methods and an academic, industry, and government consortium collaborated to address the following objectives: To develop useable methodologies to model and predict non-linear aeroelastic behaviour of complete aircraft. To evaluate the methodologies on real aircraft problems. To investigate the effect of non-linearities on aeroelastic behaviour and to determine which have the greatest effect on the flutter qualification process. These aims have been very effectively met during the course of the programme and the research outputs include: New methods available to industry for use in the flutter prediction process, together with the appropriate coaching of industry engineers. Interesting results in both linear and non-linear aeroelastics, with comprehensive comparison of methods and approaches for challenging problems. Additional embryonic techniques that, with further research, will further improve aeroelastics capability. This paper describes the methods that have been developed and how they are deployable within the industrial environment. We present a thorough review of the PUMA aeroelastics programme together with a comprehensive review of the relevant research in this domain. This is set within the context of a generic industrial process and the requirements of UK and US aeroelastic qualification. A range of test cases, from simple small DOF cases to full aircraft, have been used to evaluate and validate the non-linear methods developed and to make comparison with the linear methods in everyday use. These have focused mainly on aerodynamic non-linearity, although some results for structural non-linearity are also presented. The challenges associated with time domain (coupled computational fluid dynamics-computational structural model (CFD-CSM)) methods have been addressed through the development of grid movement, fluid-structure coupling, and control surface movement technologies. Conclusions regarding the accuracy and computational cost of these are presented. The computational cost of time-domain methods, despite substantial improvements in efficiency, remains high. However, significant advances have been made in reduced order methods, that allow non-linear behaviour to be modelled, but at a cost comparable with that of the regular linear methods. Of particular note is a method based on Hopf bifurcation that has reached an appropriate maturity for deployment on real aircraft configurations, though only limited results are presented herein. Results are also presented for dynamically linearised CFD approaches that hold out the possibility of non-linear results at a fraction of the cost of time coupled CFD-CSM methods. 
Local linearisation approaches (higher order harmonic balance and continuation method) are also presented; these have the advantage that no prior assumption of the nature of the aeroelastic instability is required, but currently these methods are limited to low DOF problems and it is thought that these will not reach a level of maturity appropriate to real aircraft problems for some years to come. Nevertheless, guidance on the most likely approaches has been derived and this forms the basis for ongoing research. It is important to recognise that the aeroelastic design and qualification requires a variety of methods applicable at different stages of the process. The methods reported herein are mapped to the process, so that their applicability and complementarity may be understood. Overall, the programme has provided a suite of methods that allow realistic consideration of non-linearity in the aeroelastic design and qualification of aircraft. Deployment of these methods is underway in the industrial environment, but full realisation of the benefit of these approaches will require appropriate engagement with the standards community so that safety standards may take proper account of the inclusion of non-linearity.
Caballero, Daniel; Antequera, Teresa; Caro, Andrés; Ávila, María Del Mar; G Rodríguez, Pablo; Perez-Palacios, Trinidad
2017-07-01
Magnetic resonance imaging (MRI) combined with computer vision techniques have been proposed as an alternative or complementary technique to determine the quality parameters of food in a non-destructive way. The aim of this work was to analyze the sensory attributes of dry-cured loins using this technique. For that, different MRI acquisition sequences (spin echo, gradient echo and turbo 3D), algorithms for MRI analysis (GLCM, NGLDM, GLRLM and GLCM-NGLDM-GLRLM) and predictive data mining techniques (multiple linear regression and isotonic regression) were tested. The correlation coefficient (R) and mean absolute error (MAE) were used to validate the prediction results. The combination of spin echo, GLCM and isotonic regression produced the most accurate results. In addition, the MRI data from dry-cured loins seems to be more suitable than the data from fresh loins. The application of predictive data mining techniques on computational texture features from the MRI data of loins enables the determination of the sensory traits of dry-cured loins in a non-destructive way. © 2016 Society of Chemical Industry. © 2016 Society of Chemical Industry.
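A rough sketch of such a texture-feature-plus-regression pipeline is given below, assuming a recent scikit-image and scikit-learn; the GLCM settings, the placeholder variables mri_slices and sensory_scores, and the use of ordinary linear regression (rather than the study's full set of algorithms) are illustrative assumptions.

```python
# Rough sketch of the pipeline described above, assuming a recent scikit-image
# (graycomatrix/graycoprops) and scikit-learn; the feature set and the sensory
# scores are placeholders, not the study's actual data.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import LinearRegression

def glcm_features(slice_8bit):
    """Co-occurrence texture features from one 8-bit MRI slice."""
    glcm = graycomatrix(slice_8bit, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy", "correlation")]

# X: one feature row per loin image, y: panel score for one sensory attribute
X = np.array([glcm_features(img) for img in mri_slices])   # mri_slices assumed
model = LinearRegression().fit(X, sensory_scores)          # scores assumed
print("R:", np.corrcoef(model.predict(X), sensory_scores)[0, 1])
```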
Westlund, Harold B.; Meyer, Gary W.; Hunt, Fern Y.
2002-01-01
Computer rendering is used to simulate the appearance of lighted objects for applications in architectural design, for animation and simulation in the entertainment industry, and for display and design in the automobile industry. Rapid advances in computer graphics technology suggest that in the near future it will be possible to produce photorealistic images of coated surfaces from scattering data. This could enable the identification of important parameters in the coatings manufacturing process that lead to desirable appearance, and to the design of virtual surfaces by visualizing prospective coating formulations once their optical properties are known. Here we report the results of our work to produce visually and radiometrically accurate renderings of selected appearance attributes of sample coated surfaces. It required changes in the rendering programs, which in general are not designed to accept high quality optical and material measurements, and changes in the optical measurement protocols. An outcome of this research is that some current ASTM standards can be replaced or enhanced by computer based standards of appearance. PMID:27446729
Advances in multi-scale modeling of solidification and casting processes
NASA Astrophysics Data System (ADS)
Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang
2011-04-01
The development of the aviation, energy and automobile industries requires advanced, integrated product/process R&D systems that can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economical, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, simulated using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies are discussed, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry.
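As a flavour of the cellular-automaton style of microstructure model mentioned above, the following drastically simplified sketch grows grains from random nuclei on a 2D grid; real solidification CA models couple temperature, undercooling and growth kinetics, none of which is represented here.

```python
# Drastically simplified 2D cellular-automaton sketch of nucleation and grain
# growth, only to illustrate the style of CA microstructure models mentioned
# above; capture probability and grid size are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
N, steps, nuclei = 200, 60, 30
grid = np.zeros((N, N), dtype=int)                 # 0 = liquid, >0 = grain id

# random nucleation sites, each with its own grain identity
for gid, (i, j) in enumerate(rng.integers(0, N, size=(nuclei, 2)), start=1):
    grid[i, j] = gid

for _ in range(steps):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        neighbour = np.roll(grid, (di, dj), axis=(0, 1))
        capture = (grid == 0) & (neighbour > 0) & (rng.random((N, N)) < 0.5)
        grid[capture] = neighbour[capture]          # liquid cell joins the grain

print("solid fraction:", (grid > 0).mean())
```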
Line x-ray source for diffraction enhanced imaging in clinical and industrial applications
NASA Astrophysics Data System (ADS)
Wang, Xiaoqin
Mammography is an imaging modality that uses low-dose x-rays or other radiation sources for the examination of breasts. It plays a central role in the early detection of breast cancers. The material similarity of tumor cells and healthy cells, breast implant surgery, and other factors make breast cancers hard to visualize and detect. Diffraction enhanced imaging (DEI), first proposed and investigated by D. Chapman, is a new x-ray radiographic imaging modality using monochromatic x-rays from a synchrotron source, which produces images of thick absorbing objects that are almost completely free of scatter. It shows dramatically improved contrast over standard imaging when applied to the same phantom. The contrast is based not only on attenuation but also on the refraction and diffraction properties of the sample. This imaging method may improve the image quality of mammography, other medical applications, industrial radiography for non-destructive testing, and x-ray computed tomography. However, the size and cost of a synchrotron source limit the applicability of the new modality at clinical levels. This research investigates the feasibility of a designed line x-ray source producing intensity comparable to synchrotron sources. It is composed of a 2-cm-long tungsten filament, installed on a carbon steel filament cup (backing plate), as the cathode, and a stationary oxygen-free copper anode with a molybdenum coating on the front surface serving as the target. Characteristic properties of the line x-ray source were studied computationally, and the prototype was investigated experimentally. The SIMION code was used to computationally study the electron trajectories emanating from the filament towards the molybdenum target. A Faraday cup on the prototype (proof-of-principle) device was used to measure the distribution of electrons on the target, which compares favorably to computational results. The intensities of characteristic x-rays for molybdenum, tungsten and rhodium targets were investigated with different window materials for -30 kV to -100 kV applied potential. Heat loading and thermal management of the target were investigated computationally using the COMSOL code package, and experimental measurements of the target temperature rise were taken via thermocouples attached to the target. Temperature measurements for the low-voltage, low-current regime without active cooling were compared to computational results for code-experiment benchmarking. Two different phantoms were used in the simulation of DEI images, which showed that the designed x-ray source with the DEI setup could produce images with significantly improved contrast. The computational results, along with experimental measurements on the prototype setup, indicate the possibility of scaling up to a larger-area x-ray source adequate for DEI applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
An Efficient Wireless Sensor Network for Industrial Monitoring and Control.
Aponte-Luis, Juan; Gómez-Galán, Juan Antonio; Gómez-Bravo, Fernando; Sánchez-Raya, Manuel; Alcina-Espigado, Javier; Teixido-Rovira, Pedro Miguel
2018-01-10
This paper presents the design of a wireless sensor network particularly intended for remote monitoring and control of industrial parameters. The article describes the network components, protocol and sensor deployment, aimed at meeting industrial constraints and assuring reliability and low power consumption. A particular case study is presented. The system consists of a base station, gas sensing nodes, a tree-based routing scheme for the wireless sensor nodes and a real-time monitoring application that operates from a remote computer and a mobile phone. The system assures industrial safety and quality, and the measurement and monitoring system achieves efficient industrial monitoring operations. The robustness of the developed system and the security of the communications have been guaranteed at both the hardware and software levels. The system is flexible and can be adapted to different environments. The testing of the system confirms the feasibility of the proposed implementation and validates the functional requirements of the developed devices, the networking solution and the power consumption management.
An Efficient Wireless Sensor Network for Industrial Monitoring and Control
Aponte-Luis, Juan; Gómez-Bravo, Fernando; Sánchez-Raya, Manuel; Alcina-Espigado, Javier; Teixido-Rovira, Pedro Miguel
2018-01-01
This paper presents the design of a wireless sensor network particularly intended for remote monitoring and control of industrial parameters. The article describes the network components, protocol and sensor deployment, aimed at meeting industrial constraints and assuring reliability and low power consumption. A particular case study is presented. The system consists of a base station, gas sensing nodes, a tree-based routing scheme for the wireless sensor nodes and a real-time monitoring application that operates from a remote computer and a mobile phone. The system assures industrial safety and quality, and the measurement and monitoring system achieves efficient industrial monitoring operations. The robustness of the developed system and the security of the communications have been guaranteed at both the hardware and software levels. The system is flexible and can be adapted to different environments. The testing of the system confirms the feasibility of the proposed implementation and validates the functional requirements of the developed devices, the networking solution and the power consumption management. PMID:29320466
The application of dynamic programming in production planning
NASA Astrophysics Data System (ADS)
Wu, Run
2017-05-01
Nowadays, with the popularity of computers, computer information technology is widely applied across industries and fields, which creates huge demand for a variety of application software. In order to develop software that meets diverse needs at the most economical cost and with the best quality, programmers must design efficient algorithms. A superior algorithm not only solves the problem at hand, but also maximizes the benefits while incurring the smallest overhead. As one of the common algorithmic techniques, dynamic programming is used to solve problems that exhibit optimal substructure. When a problem contains a large number of sub-problems that need repetitive calculation, the naive recursive method requires exponential time, whereas a dynamic programming algorithm can reduce the time complexity to the polynomial level; dynamic programming is therefore very efficient compared with other approaches, reducing the computational complexity and enriching the computational results. In this paper, we expound the concept, basic elements, properties, core ideas, solution steps and difficulties of dynamic programming, and establish a dynamic programming model of the production planning problem.
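As an illustration of the kind of model the paper describes (not its actual formulation), a classic single-item lot-sizing problem can be solved by dynamic programming in a few lines; the costs and demands below are invented.

```python
# Hedged illustration only: a classic single-item lot-sizing recurrence solved by
# dynamic programming, in the spirit of the production-planning model discussed
# above (the paper's actual model and data are not reproduced here).
def plan_production(demand, setup_cost, holding_cost):
    """Wagner-Whitin style DP: best[t] = cheapest way to cover periods 0..t-1."""
    T = len(demand)
    best = [0.0] + [float("inf")] * T          # best[t]: min cost for first t periods
    choice = [0] * (T + 1)
    for t in range(1, T + 1):
        for s in range(t):                     # produce in period s for periods s..t-1
            hold = sum(holding_cost * (k - s) * demand[k] for k in range(s, t))
            cost = best[s] + setup_cost + hold
            if cost < best[t]:
                best[t], choice[t] = cost, s
    return best[T], choice

total, _ = plan_production(demand=[20, 10, 40, 30], setup_cost=50.0, holding_cost=1.0)
print(total)
```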
Application-Program-Installer Builder
NASA Technical Reports Server (NTRS)
Wolgast, Paul; Demore, Martha; Lowik, Paul
2007-01-01
A computer program builds application programming interfaces (APIs) and related software components for installing and uninstalling application programs in any of a variety of computers and operating systems that support the Java programming language in its binary form. This program is partly similar in function to commercial (e.g., Install-Shield) software. This program is intended to enable satisfaction of a quasi-industry-standard set of requirements for a set of APIs that would enable such installation and uninstallation and that would avoid the pitfalls that are commonly encountered during installation of software. The requirements include the following: 1) Properly detecting prerequisites to an application program before performing the installation; 2) Properly registering component requirements; 3) Correctly measuring the required hard-disk space, including accounting for prerequisite components that have already been installed; and 4) Correctly uninstalling an application program. Correct uninstallation includes (1) detecting whether any component of the program to be removed is required by another program, (2) not removing that component, and (3) deleting references to requirements of the to-be-removed program for components of other programs so that those components can be properly removed at a later time.
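The dependency bookkeeping behind requirement 4 can be sketched as follows; this is a hypothetical Python illustration of the reference-counting idea, not the Java APIs of the program described above.

```python
# Purely illustrative sketch (the program described above exposes Java APIs; this
# only mimics requirement 4): track which installed programs require which shared
# components, and delete a component only when nothing else still references it.
registry = {
    "app_a": {"runtime_x", "lib_y"},   # hypothetical installed programs
    "app_b": {"lib_y"},                # and the components they require
}

def uninstall(program, registry):
    """Remove a program, deleting only components no other program still needs."""
    released = registry.pop(program)
    still_needed = set().union(*registry.values()) if registry else set()
    removable = released - still_needed
    print(f"removing {program}; deleting components: {sorted(removable)}")
    return removable

uninstall("app_a", registry)   # keeps lib_y (app_b needs it), drops runtime_x
```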
The transition of oncologic imaging from its “industrial era” to its “information era” demands analytical methods that 1) extract information from this data that is clinically and biologically relevant; 2) integrate imaging, clinical, and genomic data via rigorous statistical and computational methodologies in order to derive models valuable for understanding cancer mechanisms, diagnosis, prognostic assessment, response evaluation, and personalized treatment management; 3) are available to the biomedical community for easy use and application, with the aim of understanding, diagnosing, an
FUTURE APPLICATIONS OF EXPERT SYSTEMS FOR THE EVALUATION OF ENERGY RESOURCES.
Miller, Betty M.
1988-01-01
The loss of professional experience and expertise in the domain of the earth sciences may prove to be one of the most serious outcomes of the boom-and-bust cyclic nature of the volatile energy and mining industries. Promising new applications of powerful computer systems, known as 'expert systems' or 'knowledge-based systems', are predicted for use in the earth science. These systems have the potential capability to capture and preserve the invaluable knowledge bases essential to the evaluation of US energy and mineral resources.
FUTURE APPLICATIONS OF EXPERT SYSTEMS FOR THE EVALUATION OF ENERGY RESOURCES.
Miller, B.M.
1987-01-01
The loss of professional experience and expertise in the domain of the earth sciences may prove to be one of the most serious outcomes of the boom-and-bust cyclic nature of the volatile energy and mining industries. Promising new applications of powerful computer systems, known as 'expert systems' or 'knowledge-based systems', are predicted for use in the earth sciences. These systems have the potential capability to capture and preserve the invaluable knowledge bases essential to the evaluation of the Nation's energy and mineral resources.
BCAT (Binary Colloid Alloy Test) experiment documentation
2009-05-02
ISS019-E-013244 (2 May 2009) --- Astronaut Michael Barratt, Expedition 19/20 flight engineer, uses a computer during a session with the Binodal Colloidal Aggregation Test-4 (BCAT-4) in the Destiny laboratory of the International Space Station. This experiment studies the long-term behavior of colloids (fine particles suspended in a fluid) in a microgravity environment, where the effects of sedimentation and convection are removed. Results from this study may lead to new colloid materials with applications in the communications and computer industries for switches, displays and optical devices with properties that could rival those of lasers.
Intellectual property and information controversy (II)
NASA Astrophysics Data System (ADS)
Aoyama, Hirokazu
As the informatization of society has proceeded rapidly, intellectual property has become more important than ever as a business resource of enterprises. Based on the author's earlier report, "Present status of and trend in intellectual property", this paper describes "information"-related intellectual property controversies that have occurred, namely: 1) affairs related to computer hardware and software (the case of compatible machines and operating systems, the case of application software, computer crimes) and 2) affairs concerning trade secrets (the case of leaking enterprise secrets, the case of industrial espionage). It also discusses how intellectual property should be protected and utilized from now on.
Fundamental organometallic reactions: Applications on the CYBER 205
NASA Technical Reports Server (NTRS)
Rappe, A. K.
1984-01-01
Two of the most challenging problems of Organometallic chemistry (loosely defined) are pollution control with the large space velocities needed and nitrogen fixation, a process so capably done by nature and so relatively poorly done by man (industry). For a computational chemist these problems are on the fringe of what is possible with conventional computers (large models needed and accurate energetics required). A summary of the algorithmic modification needed to address these problems on a vector processor such as the CYBER 205 and a sketch of findings to date on deNOx catalysis and nitrogen fixation are presented.
NASA Technical Reports Server (NTRS)
1990-01-01
Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.
New Challenges in Computational Thermal Hydraulics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yadigaroglu, George; Lakehal, Djamel
New needs and opportunities drive the development of novel computational methods for the design and safety analysis of light water reactors (LWRs). Some new methods are likely to be three dimensional. Coupling is expected between system codes, computational fluid dynamics (CFD) modules, and cascades of computations at scales ranging from the macro- or system scale to the micro- or turbulence scales, with the various levels continuously exchanging information back and forth. The ISP-42/PANDA and the international SETH project provide opportunities for testing applications of single-phase CFD methods to LWR safety problems. Although industrial single-phase CFD applications are commonplace, computational multifluid dynamics is still under development. However, first applications are appearing; the state of the art and its potential uses are discussed. The case study of condensation of steam/air mixtures injected from a downward-facing vent into a pool of water is a perfect illustration of a simulation cascade: At the top of the hierarchy of scales, system behavior can be modeled with a system code; at the central level, the volume-of-fluid method can be applied to predict large-scale bubbling behavior; at the bottom of the cascade, direct-contact condensation can be treated with direct numerical simulation, in which turbulent flow (in both the gas and the liquid), interfacial dynamics, and heat/mass transfer are directly simulated without resorting to models.
Center for Advanced Computational Technology
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2000-01-01
The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.
Reis, H; Rasulev, B; Papadopoulos, M G; Leszczynski, J
2015-01-01
Fullerene and its derivatives are currently among the most intensively investigated species in the areas of nanomedicine and nanochemistry. Various unique properties of fullerenes are responsible for their wide range of applications in industry, biology and medicine. A large pool of functionalized C60 and C70 fullerenes is investigated theoretically at different levels of quantum-mechanical theory. The semiempirical PM6 method, density functional theory with the B3LYP functional, and the correlated ab initio MP2 method are employed to compute the optimized structures and an array of properties for the considered species. In addition to the calculations for isolated molecules, the results of solution calculations are also reported at the DFT level, using the polarizable continuum model (PCM). Ionization potentials (IPs) and electron affinities (EAs) are computed by means of Koopmans' theorem as well as with the more accurate but computationally expensive ΔSCF method. Both procedures yield comparable values, while comparison of IPs and EAs computed with different quantum-mechanical methods shows surprisingly large differences. Harmonic vibrational frequencies are computed at the PM6 and B3LYP levels of theory and compared with each other. A possible application of the frequencies as 3D descriptors in the EVA (EigenVAlues) method is shown. All the computed data are made available, and may be used to replace experimental data in routine applications where large amounts of data are required, e.g. in structure-activity relationship studies of the toxicity of fullerene derivatives.
Computers: Good for Education?
ERIC Educational Resources Information Center
Skinner, David
1997-01-01
Explores the use of computers in the classroom, and concludes that the burden should be on the computer industry to prove that it really has something to offer to the educational system. Instead, article notes, the computer industry is pushing computer use that has not been demonstrated to be an educational necessity. (SLD)
Zhao, Yu; Liu, Yide; Lai, Ivan K W; Zhang, Hongfeng; Zhang, Yi
2016-03-18
As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid wireless technology development and diffusion of mobile devices, social networks experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge amount of users. Therefore, an in-depth discussion on the human-computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussions on HCI by examining the relationships of users' compound attitudes (i.e., affective attitudes, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which, 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including the compound attitude and engagement, which can bring valuable discussions on the HCI of mobile sensor computing in further study. Besides, we find that user engagement plays a mediating role between the user's compound attitudes and eWOM. The research result can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user-product (brand) relationships.
Zhao, Yu; Liu, Yide; Lai, Ivan K. W.; Zhang, Hongfeng; Zhang, Yi
2016-01-01
As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid wireless technology development and diffusion of mobile devices, social networks experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge amount of users. Therefore, an in-depth discussion on the human–computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussions on HCI by examining the relationships of users’ compound attitudes (i.e., affective attitudes, cognitive attitude), engagement and electronic word of mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which, 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including the compound attitude and engagement, which can bring valuable discussions on the HCI of mobile sensor computing in further study. Besides, we find that user engagement plays a mediating role between the user’s compound attitudes and eWOM. The research result can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user—product (brand) relationships. PMID:26999155
Use of a Computer-Mediated Delphi Process to Validate a Mass Casualty Conceptual Model
CULLEY, JOAN M.
2012-01-01
Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters. PMID:21076283
Use of a computer-mediated Delphi process to validate a mass casualty conceptual model.
Culley, Joan M
2011-05-01
Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters.
1999-11-10
Space Vacuum Epitaxy Center works with industry and government laboratories to develop advanced thin film materials and devices by utilizing the most abundant free resource in orbit: the vacuum of space. SVEC, along with its affiliates, is developing semiconductor mid-IR lasers for environmental sensing and defense applications, high efficiency solar cells for space satellite applications, oxide thin films for computer memory applications, and ultra-hard thin film coatings for wear resistance in micro devices. Performance of these vacuum deposited thin film materials and devices can be enhanced by using the ultra-vacuum of space for which SVEC has developed the Wake Shield Facility---a free flying research platform dedicated to thin film materials development in space.
2000-11-10
Space Vacuum Epitaxy Center works with industry and government laboratories to develop advanced thin film materials and devices by utilizing the most abundant free resource in orbit: the vacuum of space. SVEC, along with its affiliates, is developing semiconductor mid-IR lasers for environmental sensing and defense applications, high efficiency solar cells for space satellite applications, oxide thin films for computer memory applications, and ultra-hard thin film coatings for wear resistance in micro devices. Performance of these vacuum deposited thin film materials and devices can be enhanced by using the ultra-vacuum of space for which SVEC has developed the Wake Shield Facility---a free flying research platform dedicated to thin film materials development in space.
Technology Requirements and Selection for Securely Partitioning OBSW
NASA Astrophysics Data System (ADS)
Mendham, Peter; Windsor, James; Eckstein, Knut
2010-08-01
The Securely Partitioning Spacecraft Computing Resources project is a current ESA TRP activity investigating the application of secure time and space partitioning (TSP) technologies to enable multi-use missions from a single platform. Secure TSP technologies are used in a number of application areas outside the space domain and an opportunity exists to 'spin-in' a suitable solution. The selection of a technology for use within the European space industry relies on an understanding of the requirements for the application of secure TSP, of which this paper presents a summary. Further, the paper outlines the selection process taken by the project and highlights promising solutions for use today.
Behavioural science at work for Canada: National Research Council laboratories.
Veitch, Jennifer A
2007-03-01
The National Research Council is Canada's principal research and development agency. Its 20 institutes are structured to address interdisciplinary problems for industrial sectors, and to provide the necessary scientific infrastructure, such as the national science library. Behavioural scientists are active in five institutes: Biological Sciences, Biodiagnostics, Aerospace, Information Technology, and Construction. Research topics include basic cellular neuroscience, brain function, human factors in the cockpit, human-computer interaction, emergency evacuation, and indoor environment effects on occupants. Working in collaboration with NRC colleagues and with researchers from universities and industry, NRC behavioural scientists develop knowledge, designs, and applications that put technology to work for people, designed with people in mind.
Technological inductive power transfer systems
NASA Astrophysics Data System (ADS)
Madzharov, Nikolay D.; Nemkov, Valentin S.
2017-05-01
Inductive power transfer is a very fast expanding technology with multiple design principles and practical implementations, ranging from charging phones and computers to bionic systems, car chargers and continuous power transfer in technological lines. Only the group of devices working in the near magnetic field is considered. This article is devoted to an overview of different inductive power transfer (IPT) devices. A review of the literature in this area showed that industrial IPTs are not much discussed and examined. The authors have experience in the design and implementation of several types of IPTs belonging to wireless automotive chargers and to the industrial application group. Main attention in the article is paid to the principles and design of technological IPTs.
NASA Technical Reports Server (NTRS)
Morrison, Dennis R.
2005-01-01
The microparticle flow sensor (MFS) is a system for identifying and counting microscopic particles entrained in a flowing liquid. The MFS includes a transparent, optoelectronically instrumented laminar-flow chamber (see figure) and a computer for processing instrument-readout data. The MFS could be used to count microparticles (including micro-organisms) in diverse applications -- for example, production of microcapsules, treatment of wastewater, pumping of industrial chemicals, and identification of ownership of liquid products.
Design and Integration of Hydrostatic Transmission in a 300-HP Marine Corps Amphibious Vehicle
1985-03-01
tests, and the control logic, micro-computer hardware, and electro-hydraulic actuators that transform operator inputs into drivetrain outputs. Also ... actually the case based on manufacturers' information. The use of swash plate pumps in this application presents no real problem and is in fact the ... industry norm. Although the swash plate pumps do suffer slightly from a decrease in
Mineral resource of the month: cobalt
Shedd, Kim B.
2009-01-01
Cobalt is a metal used in numerous commercial, industrial and military applications. On a global basis, the leading use of cobalt is in rechargeable lithium-ion, nickel-cadmium and nickel-metal hydride battery electrodes. Cobalt use has grown rapidly since the early 1990s, with the development of new battery technologies and an increase in demand for portable electronics such as cell phones, laptop computers and cordless power tools.
Singularities in Free Surface Flows
NASA Astrophysics Data System (ADS)
Thete, Sumeet Suresh
Free surface flows, where the shape of the interface separating two or more phases or liquids is unknown a priori, are commonplace in industrial applications and nature. The distribution of drop sizes, the coalescence rate of drops, and the behavior of thin liquid films are crucial to understanding and enhancing industrial practices such as ink-jet printing, spraying, separation of chemicals, and coating flows. When a contiguous mass of liquid such as a drop, filament or a film undergoes breakup to give rise to multiple masses, the topological transition is accompanied by a finite-time singularity. Such a singularity also arises when two or more masses of liquid merge into each other or coalesce. Thus the dynamics close to singularity determines the fate of about-to-form drops or films and the applications they are involved in, and therefore needs to be analyzed precisely. The primary goal of this thesis is to resolve and analyze the dynamics close to singularity when free surface flows experience a topological transition, using a combination of theory, experiments, and numerical simulations. The first problem under consideration focuses on the dynamics following flow shut-off in bottle-filling applications that are relevant to the pharmaceutical and consumer products industry, using numerical techniques based on Galerkin Finite Element Methods (GFEM). The second problem addresses the dual flow behavior of aqueous foams that are observed in oil and gas fields and estimates the relevant parameters that describe such flows through a series of experiments. The third problem aims at understanding the drop formation of Newtonian and Carreau fluids, computationally using GFEM. The drops are formed as a result of imposed flow rates or expanding bubbles similar to those of piezo-actuated and thermal ink-jet nozzles. The focus of the fourth problem is on the evolution of thinning threads of Newtonian fluids and suspensions towards singularity, using computations based on GFEM and experimental techniques. The aim of the fifth problem is to analyze the coalescence dynamics of drops through a combination of GFEM and scaling theory. Lastly, the sixth problem concerns the thinning and rupture dynamics of thin films of Newtonian and power-law fluids using scaling theory based on asymptotic analysis; the predictions of this theory are corroborated using computations based on GFEM.
Liu, Ping; Li, Guodong; Liu, Xinggao
2015-09-01
Control vector parameterization (CVP) is an important approach to the engineering optimization of industrial dynamic processes. However, its major defect, namely the low optimization efficiency caused by repeatedly solving the relevant differential equations within the generated nonlinear programming (NLP) problem, limits its wide application to such processes. A novel, highly effective control parameterization approach, fast-CVP, is proposed to improve the optimization efficiency for industrial dynamic processes; it employs costate gradient formulae and a fast approximate scheme for solving the differential equations in the dynamic process simulation. Three well-known engineering optimization benchmark problems for industrial dynamic processes serve as illustrations. The results show that the proposed approach saves at least 90% of the computation time compared with the traditional CVP method, which demonstrates the effectiveness of the proposed fast engineering optimization approach for industrial dynamic processes. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
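For readers unfamiliar with CVP, the following sketch shows the basic (non-accelerated) idea on a toy first-order plant: the control is parameterized as piecewise-constant segments, the state equation is integrated by forward Euler, and a generic optimizer adjusts the segment values. It does not implement the paper's fast-CVP or its costate gradient formulae.

```python
# Hedged sketch of plain control vector parameterization (not the paper's
# fast-CVP): the control u(t) is approximated by piecewise-constant segments,
# the state ODE is integrated by forward Euler, and the segment values are
# tuned by a generic NLP solver. System, horizon, and cost are toy choices.
import numpy as np
from scipy.optimize import minimize

T, n_seg, dt = 1.0, 8, 0.01                      # horizon, control segments, step

def simulate_cost(u_segments):
    x, t, cost = 1.0, 0.0, 0.0
    while t < T:
        u = u_segments[min(int(t / T * n_seg), n_seg - 1)]
        dx = -x + u                              # toy first-order plant dx/dt = -x + u
        cost += (x ** 2 + 0.1 * u ** 2) * dt     # quadratic running cost
        x, t = x + dx * dt, t + dt
    return cost

res = minimize(simulate_cost, x0=np.zeros(n_seg), method="Nelder-Mead")
print("optimal segment values:", np.round(res.x, 3), "cost:", res.fun)
```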
Recent Advances in X-ray Cone-beam Computed Laminography.
O'Brien, Neil S; Boardman, Richard P; Sinclair, Ian; Blumensath, Thomas
2016-10-06
X-ray computed tomography is an established volume imaging technique used routinely in medical diagnosis, industrial non-destructive testing, and a wide range of scientific fields. Traditionally, computed tomography uses scanning geometries with a single axis of rotation together with reconstruction algorithms specifically designed for this setup. Recently, however, there has been increasing interest in more complex scanning geometries. These include so-called X-ray computed laminography systems capable of imaging specimens with large lateral dimensions or large aspect ratios, neither of which is well suited to conventional CT scanning procedures. Developments throughout this field have thus been rapid, including the introduction of novel system trajectories, the application and refinement of various reconstruction methods, and the use of recently developed computational hardware and software techniques to accelerate reconstruction times. Here we examine the advances made in the last several years and consider their impact on the state of the art.
Snore related signals processing in a private cloud computing system.
Qian, Kun; Guo, Jian; Xu, Huijie; Zhu, Zhaomeng; Zhang, Gongxuan
2014-09-01
Snore related signals (SRS) have been demonstrated in recent years to carry important information about the obstruction site and degree in the upper airway of Obstructive Sleep Apnea-Hypopnea Syndrome (OSAHS) patients. To make this acoustic signal analysis method more accurate and robust, processing of large SRS datasets is inevitable. As an emerging concept and technology, cloud computing has motivated numerous researchers and engineers to develop applications in both academia and industry, and it holds great promise for biomedical engineering. Considering the security and transfer requirements of biomedical data, we designed a system based on private cloud computing to process SRS. We then conducted comparative experiments, processing a 5-hour audio recording of an OSAHS patient on a personal computer, a server, and a private cloud computing system, to demonstrate the efficiency of the proposed infrastructure.
Advances in computer-aided well-test interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, R.N.
1994-07-01
Despite the feeling, expressed several times over the past 40 years, that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with further improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
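One of the listed methodology pieces, the pressure derivative, is commonly computed as the derivative of drawdown with respect to the natural logarithm of elapsed time. The sketch below is a generic illustration of that calculation; the synthetic radial-flow data and the plain finite-difference gradient are assumptions, not the paper's smoothing algorithm.

```python
# Generic pressure-derivative sketch: d(delta-p)/d(ln t), a standard
# well-test diagnostic. Synthetic semilog drawdown data are assumed.
import numpy as np

t = np.logspace(-2, 2, 60)          # elapsed time, hours (assumed)
dp = 12.0 + 2.5 * np.log(t)         # synthetic drawdown: constant semilog slope

derivative = np.gradient(dp, np.log(t))   # ~constant (2.5) during radial flow
print(np.round(derivative[-5:], 3))
```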
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
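As a generic illustration of the comparison metrics named above (maximum amplitude and power spectral density), the sketch below compares a simulated and a "measured" waveform via Welch PSDs; the signals, sampling rate, and metric definitions are placeholders and not PNNL's data or procedure.

```python
# Sketch: compare two ultrasonic A-scan-like signals by maximum amplitude
# and by power spectral density (Welch). Signals are synthetic stand-ins.
import numpy as np
from scipy.signal import welch

fs = 50e6                                    # sampling rate, Hz (assumed)
t = np.arange(0, 20e-6, 1 / fs)
model = np.sin(2 * np.pi * 5e6 * t) * np.exp(-((t - 5e-6) / 2e-6) ** 2)
field = model + 0.05 * np.random.default_rng(1).standard_normal(t.size)

amp_diff = abs(model.max() - field.max()) / field.max()
f, p_model = welch(model, fs=fs, nperseg=256)
_, p_field = welch(field, fs=fs, nperseg=256)
psd_diff = np.trapz(abs(p_model - p_field), f) / np.trapz(p_field, f)
print(f"max-amplitude difference: {amp_diff:.2%}, integrated PSD difference: {psd_diff:.2%}")
```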
Exploring Computer Technology. The Illinois Plan for Industrial Education.
ERIC Educational Resources Information Center
Illinois State Univ., Normal.
This guide, which is one in the "Exploration" series of curriculum guides intended to assist junior high and middle school industrial educators in helping their students explore diverse industrial situations and technologies used in industry, deals with exploring computer technology. The following topics are covered in the individual…
Transient Solid Dynamics Simulations on the Sandia/Intel Teraflop Computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Attaway, S.; Brown, K.; Gardner, D.
1997-12-31
Transient solid dynamics simulations are among the most widely used engineering calculations. Industrial applications include vehicle crashworthiness studies, metal forging, and powder compaction prior to sintering. These calculations are also critical to defense applications including safety studies and weapons simulations. The practical importance of these calculations and their computational intensiveness make them natural candidates for parallelization. This has proved to be difficult, and existing implementations fail to scale to more than a few dozen processors. In this paper we describe our parallelization of PRONTO, Sandia's transient solid dynamics code, via a novel algorithmic approach that utilizes multiple decompositions for different key segments of the computations, including the material contact calculation. This latter calculation is notoriously difficult to perform well in parallel, because it involves dynamically changing geometry, global searches for elements in contact, and unstructured communications among the compute nodes. Our approach scales to at least 3600 compute nodes of the Sandia/Intel Teraflop computer (the largest set of nodes to which we have had access to date) on problems involving millions of finite elements. On this machine we can simulate models using more than ten million elements in a few tenths of a second per timestep, and solve problems more than 3000 times faster than a single processor Cray Jedi.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter
In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, which computes the HSS form of an input dense matrix, relies on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful in their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.
Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter; ...
2016-06-30
In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, which computes the HSS form of an input dense matrix, relies on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful in their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.
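The compression step described above rests on randomized sampling of numerically low-rank blocks. The snippet below illustrates only the basic randomized range-finder idea on a single block, using plain NumPy with a fixed sample size and no adaptivity; it is a simplification of the adaptive scheme in the paper and is not STRUMPACK code.

```python
# Randomized range-finder sketch: approximate a low-rank block B as Q @ (Q.T @ B)
# from a few random matrix products.
import numpy as np

rng = np.random.default_rng(0)
n, true_rank, k = 500, 15, 20                    # block size, true rank, sample size
B = rng.standard_normal((n, true_rank)) @ rng.standard_normal((true_rank, n))

Y = B @ rng.standard_normal((n, k))              # sample the range of B
Q, _ = np.linalg.qr(Y)                           # orthonormal basis for the sampled range
B_approx = Q @ (Q.T @ B)                         # rank-k approximation

rel_err = np.linalg.norm(B - B_approx) / np.linalg.norm(B)
print(f"relative error of rank-{k} approximation: {rel_err:.2e}")
```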
Gaspar, Paula; Carvalho, Ana L; Vinga, Susana; Santos, Helena; Neves, Ana Rute
2013-11-01
The lactic acid bacteria (LAB) are a functionally related group of low-GC Gram-positive bacteria known essentially for their roles in bioprocessing of foods and animal feeds. Due to extensive industrial use and enormous economical value, LAB have been intensively studied and a large body of comprehensive data on their metabolism and genetics was generated throughout the years. This knowledge has been instrumental in the implementation of successful applications in the food industry, such as the selection of robust starter cultures with desired phenotypic traits. The advent of genomics, functional genomics and high-throughput experimentation combined with powerful computational tools currently allows for a systems level understanding of these food industry workhorses. The technological developments in the last decade have provided the foundation for the use of LAB in applications beyond the classic food fermentations. Here we discuss recent metabolic engineering strategies to improve particular cellular traits of LAB and to design LAB cell factories for the bioproduction of added value chemicals. Copyright © 2013 Elsevier Inc. All rights reserved.
Computer integration of engineering design and production: A national opportunity
NASA Astrophysics Data System (ADS)
1984-10-01
The National Aeronautics and Space Administration (NASA), as a purchaser of a variety of manufactured products, including complex space vehicles and systems, clearly has a stake in the advantages of computer-integrated manufacturing (CIM). Two major NASA objectives are to launch a Manned Space Station by 1992 with a budget of $8 billion, and to be a leader in the development and application of productivity-enhancing technology. At the request of NASA, a National Research Council committee visited five companies that have been leaders in using CIM. Based on these case studies, technical, organizational, and financial issues that influence computer integration are described, guidelines for its implementation in industry are offered, and the use of CIM to manage the space station program is recommended.
Computer integration of engineering design and production: A national opportunity
NASA Technical Reports Server (NTRS)
1984-01-01
The National Aeronautics and Space Administration (NASA), as a purchaser of a variety of manufactured products, including complex space vehicles and systems, clearly has a stake in the advantages of computer-integrated manufacturing (CIM). Two major NASA objectives are to launch a Manned Space Station by 1992 with a budget of $8 billion, and to be a leader in the development and application of productivity-enhancing technology. At the request of NASA, a National Research Council committee visited five companies that have been leaders in using CIM. Based on these case studies, technical, organizational, and financial issues that influence computer integration are described, guidelines for its implementation in industry are offered, and the use of CIM to manage the space station program is recommended.
An introduction to quantum machine learning
NASA Astrophysics Data System (ADS)
Schuld, Maria; Sinayskiy, Ilya; Petruccione, Francesco
2015-04-01
Machine learning algorithms learn a desired input-output relation from examples in order to interpret new inputs. This is important for tasks such as image and speech recognition or strategy optimisation, with growing applications in the IT industry. In the last couple of years, researchers have investigated whether quantum computing can help to improve classical machine learning algorithms. Ideas range from running computationally costly algorithms or their subroutines efficiently on a quantum computer to the translation of stochastic methods into the language of quantum theory. This contribution gives a systematic overview of the emerging field of quantum machine learning. It presents the approaches as well as technical details in an accessible way, and discusses the potential of a future theory of quantum learning.
Cost-effective use of minicomputers to solve structural problems
NASA Technical Reports Server (NTRS)
Storaasli, O. O.; Foster, E. P.
1978-01-01
Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.
TeleMed: Wide-area, secure, collaborative object computing with Java and CORBA for healthcare
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forslund, D.W.; George, J.E.; Gavrilov, E.M.
1998-12-31
Distributed computing is becoming commonplace in a variety of industries, with healthcare being a particularly important one for society. The authors describe the development and deployment of TeleMed in a few healthcare domains. TeleMed is a 100% Java distributed application built on CORBA and OMG standards, enabling collaboration on the treatment of chronically ill patients in a secure manner over the Internet. These standards enable other systems to work interoperably with TeleMed and provide transparent access to high-performance distributed computing for the healthcare domain. The goal of wide-scale integration of electronic medical records is a grand-challenge scale problem of global proportions with far-reaching social benefits.
Surveillance of industrial processes with correlated parameters
White, Andrew M.; Gross, Kenny C.; Kubic, William L.; Wigeland, Roald A.
1996-01-01
A system and method for surveillance of an industrial process. The system and method includes a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions.
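As a generic illustration of the probability-ratio idea described in this record (not the patented implementation), a sequential probability ratio test accumulates log-likelihood ratios of a residual sequence under "normal" versus "degraded" hypotheses and raises an alarm when a threshold is crossed. All parameters and the synthetic residual stream below are assumptions.

```python
# Sequential probability ratio test (SPRT) sketch on a residual stream,
# testing mean 0 (normal) against a shifted mean (degraded).
import numpy as np

rng = np.random.default_rng(2)
residuals = np.concatenate([rng.normal(0.0, 1.0, 200),    # healthy behaviour
                            rng.normal(1.0, 1.0, 50)])    # onset of a fault

mu0, mu1, sigma = 0.0, 1.0, 1.0
alpha, beta = 0.01, 0.01                                  # false/missed alarm rates (assumed)
upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))

llr = 0.0
for i, r in enumerate(residuals):
    llr += ((r - mu0) ** 2 - (r - mu1) ** 2) / (2 * sigma ** 2)   # log-likelihood ratio increment
    if llr >= upper:
        print(f"alarm at sample {i}")
        break
    if llr <= lower:
        llr = 0.0                                          # accept 'normal' and restart the test
```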
College Stores and Computers and Students and Faculty.
ERIC Educational Resources Information Center
Newcomb, Jack, Ed.
1982-01-01
Information on the computer industry and computer use by students, faculty, and the publishing industry that may be useful in planning college store merchandising is compiled from a variety of sources. (MSE)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
...] Guidances for Industry and Food and Drug Administration Staff: Computer-Assisted Detection Devices Applied... Clinical Performance Assessment: Considerations for Computer-Assisted Detection Devices Applied to... guidance, entitled ``Computer-Assisted Detection Devices Applied to Radiology Images and Radiology Device...
Fused smart sensor network for multi-axis forward kinematics estimation in industrial robots.
Rodriguez-Donate, Carlos; Osornio-Rios, Roque Alfredo; Rivera-Guillen, Jesus Rooney; Romero-Troncoso, Rene de Jesus
2011-01-01
Flexible manipulator robots have wide industrial application. Robot performance requires adequately sensing the robot's position and orientation, known as forward kinematics. Commercially available motion controllers use high-resolution optical encoders to sense the position of each joint, but these cannot detect some mechanical deformations that decrease the accuracy of the robot's position and orientation. To overcome these problems, several sensor-fusion methods have been proposed, but at the expense of a high computational load, which prevents online measurement of the joints' angular positions and online forward kinematics estimation. The contribution of this work is a fused smart sensor network to estimate the forward kinematics of an industrial robot. The developed smart processor uses Kalman filters to filter and fuse the information of the sensor network. Two primary sensors are used: an optical encoder and a 3-axis accelerometer. In order to obtain the position and orientation of each joint online, a field-programmable gate array (FPGA) is used in the hardware implementation, taking advantage of the parallel computation capabilities and reconfigurability of this device. With the aim of evaluating the smart sensor network's performance, three real-operation-oriented paths are executed and monitored in a 6-degree-of-freedom robot.
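As a generic illustration of the sensor-fusion idea (not the FPGA smart-processor design itself), a scalar Kalman filter can fuse an encoder's angle reading with an accelerometer-derived angle estimate for a single joint. The synthetic signals, noise variances, and static state model below are assumptions for illustration.

```python
# Scalar Kalman-filter sketch fusing two noisy angle estimates of one joint:
# an encoder reading and an angle inferred from a 3-axis accelerometer.
import numpy as np

rng = np.random.default_rng(3)
true_angle = np.deg2rad(30) * np.sin(np.linspace(0, 2 * np.pi, 500))
enc = true_angle + rng.normal(0, np.deg2rad(0.05), true_angle.size)   # encoder: low noise
acc = true_angle + rng.normal(0, np.deg2rad(1.0), true_angle.size)    # accelerometer: high noise

x, P = 0.0, 1.0                      # state estimate and its variance
Q = 1e-6                             # process noise (assumed)
R_enc, R_acc = np.deg2rad(0.05) ** 2, np.deg2rad(1.0) ** 2
fused = []
for z_enc, z_acc in zip(enc, acc):
    P += Q                                           # predict (static model)
    for z, R in ((z_enc, R_enc), (z_acc, R_acc)):    # sequential measurement updates
        K = P / (P + R)
        x += K * (z - x)
        P *= (1 - K)
    fused.append(x)

rmse = np.sqrt(np.mean((np.array(fused) - true_angle) ** 2))
print(f"fused RMSE: {np.rad2deg(rmse):.3f} deg")
```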
NASA Astrophysics Data System (ADS)
Hannachi, Ammar; Kohler, Sophie; Lallement, Alex; Hirsch, Ernest
2015-04-01
3D modeling of scene contents is of increasing importance for many computer vision based applications. In particular, industrial applications of computer vision require efficient tools for computing this 3D information. Stereo-vision is routinely used as a powerful technique to obtain the 3D outlines of imaged objects from the corresponding 2D images; as a consequence, however, this approach provides only a sparse and partial description of the scene contents. On the other hand, with structured-light based reconstruction techniques, the 3D surfaces of imaged objects can often be computed with high accuracy, but the resulting active range data fail to characterize the object edges. Thus, in order to benefit from the strengths of both acquisition techniques, we introduce in this paper promising approaches for computing a complete 3D reconstruction based on the cooperation of two complementary acquisition and processing techniques, in our case stereoscopic and structured-light based methods, which provide two 3D data sets describing respectively the outlines and the surfaces of the imaged objects. We present, accordingly, the principles of three fusion techniques and compare them using evaluation criteria related to the nature of the workpiece and the type of application addressed. The proposed fusion methods rely on geometric characteristics of the workpiece, which favour the quality of the registration. Further, the results obtained demonstrate that the developed approaches are well adapted for 3D modeling of manufactured parts including free-form surfaces and, consequently, for quality control applications using these 3D reconstructions.
NASA Astrophysics Data System (ADS)
Stacey, Weston M.
2001-02-01
An authoritative textbook and up-to-date professional's guide to basic and advanced principles and practices Nuclear reactors now account for a significant portion of the electrical power generated worldwide. At the same time, the past few decades have seen an ever-increasing number of industrial, medical, military, and research applications for nuclear reactors. Nuclear reactor physics is the core discipline of nuclear engineering, and as the first comprehensive textbook and reference on basic and advanced nuclear reactor physics to appear in a quarter century, this book fills a large gap in the professional literature. Nuclear Reactor Physics is a textbook for students new to the subject, for others who need a basic understanding of how nuclear reactors work, as well as for those who are, or wish to become, specialists in nuclear reactor physics and reactor physics computations. It is also a valuable resource for engineers responsible for the operation of nuclear reactors. Dr. Weston Stacey begins with clear presentations of the basic physical principles, nuclear data, and computational methodology needed to understand both the static and dynamic behaviors of nuclear reactors. This is followed by in-depth discussions of advanced concepts, including extensive treatment of neutron transport computational methods. As an aid to comprehension and quick mastery of computational skills, he provides numerous examples illustrating step-by-step procedures for performing the calculations described and chapter-end problems. Nuclear Reactor Physics is a useful textbook and working reference. It is an excellent self-teaching guide for research scientists, engineers, and technicians involved in industrial, research, and military applications of nuclear reactors, as well as government regulators who wish to increase their understanding of nuclear reactors.
Practical applications of new research information in the practice of bovine embryo transfer.
Looney, C R; Pryor, J H
2010-01-01
For more than 40 years, practitioners have sought to improve all aspects of commercial bovine embryo transfer. The development of new technologies for this industry has been substantial, with recent focus on cryopreservation techniques and the in vitro production of embryos fertilised with sexed spermatozoa. When these and other new technologies are developed, the following questions remain: (1) is said technology regulated or does it require licensing; and (2) is it applicable and, if so, is it financially feasible? Computer access to published research and the advancement of data software programs conducive to the industry for data procurement have been essential for helping practitioners answer these questions by enhancing their ability to analyse and apply data. The focus of the present paper is to aid commercial embryo transfer practitioners in determining new technologies that are available and whether they can be implemented effectively, benefiting their programs.
Laser-induced plasmonic colours on metals
NASA Astrophysics Data System (ADS)
Guay, Jean-Michel; Calà Lesina, Antonino; Côté, Guillaume; Charron, Martin; Poitras, Daniel; Ramunno, Lora; Berini, Pierre; Weck, Arnaud
2017-07-01
Plasmonic resonances in metallic nanoparticles have been used since antiquity to colour glasses. The use of metal nanostructures for surface colourization has attracted considerable interest following recent developments in plasmonics. However, current top-down colourization methods are not ideally suited to large-scale industrial applications. Here we use a bottom-up approach where picosecond laser pulses can produce a full palette of non-iridescent colours on silver, gold, copper and aluminium. We demonstrate the process on silver coins weighing up to 5 kg and bearing large topographic variations (~1.5 cm). We find that colours are related to a single parameter, the total accumulated fluence, making the process suitable for high-throughput industrial applications. Statistical image analyses of laser-irradiated surfaces reveal various nanoparticle size distributions. Large-scale finite-difference time-domain computations based on these nanoparticle distributions reproduce trends seen in reflectance measurements, and demonstrate the key role of plasmonic resonances in colour formation.
Laser-induced plasmonic colours on metals
Guay, Jean-Michel; Calà Lesina, Antonino; Côté, Guillaume; Charron, Martin; Poitras, Daniel; Ramunno, Lora; Berini, Pierre; Weck, Arnaud
2017-01-01
Plasmonic resonances in metallic nanoparticles have been used since antiquity to colour glasses. The use of metal nanostructures for surface colourization has attracted considerable interest following recent developments in plasmonics. However, current top-down colourization methods are not ideally suited to large-scale industrial applications. Here we use a bottom-up approach where picosecond laser pulses can produce a full palette of non-iridescent colours on silver, gold, copper and aluminium. We demonstrate the process on silver coins weighing up to 5 kg and bearing large topographic variations (∼1.5 cm). We find that colours are related to a single parameter, the total accumulated fluence, making the process suitable for high-throughput industrial applications. Statistical image analyses of laser-irradiated surfaces reveal various nanoparticle size distributions. Large-scale finite-difference time-domain computations based on these nanoparticle distributions reproduce trends seen in reflectance measurements, and demonstrate the key role of plasmonic resonances in colour formation. PMID:28719576
Enhanced sampling techniques in biomolecular simulations.
Spiwok, Vojtech; Sucur, Zoran; Hosek, Petr
2015-11-01
Biomolecular simulations are routinely used in biochemistry and molecular biology research; however, they often fail to match expectations of their impact on the pharmaceutical and biotech industries. This is caused by the fact that a vast amount of computer time is required to simulate short episodes from the life of biomolecules. Several approaches have been developed to overcome this obstacle, including the application of massively parallel and special-purpose computers or non-conventional hardware. Methodological approaches are represented by coarse-grained models and enhanced sampling techniques. These techniques can show how the studied system behaves on long time-scales on the basis of relatively short simulations. This review presents an overview of new simulation approaches, the theory behind enhanced sampling methods, and success stories of their applications with a direct impact on biotechnology or drug design. Copyright © 2014 Elsevier Inc. All rights reserved.
Li, Jia; Zhou, Quan; Xu, Zhenming
2014-12-01
Although corona electrostatic separation is successfully used to recycle waste printed circuit boards in industrial applications, some problems cannot be resolved completely, such as nonmetal particle aggregation and spark discharge. Both of these problems harm the separation process and are not easy to identify during separation in industrial applications. This paper presents a systematic study of a real-time monitoring system. Weight monitoring systems were established to continuously monitor the separation process. A virtual instrumentation program written in LabVIEW was used to sample and analyse the mass increment of the middling product. It includes four modules: historical data storage, steady-state analysis, data computation, and alarm. Three kinds of operating conditions were used to verify the applicability of the monitoring system. The system achieved the goal of monitoring the separation process and performed real-time analysis of the received data. The system also gave comprehensible feedback on incidents of material blockage at the feed inlet and high-voltage spark discharge. With the warning function of the alarm system, the whole monitoring system can reduce labour costs and help this new technology to be applied more easily in industry. © The Author(s) 2014.
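A hedged sketch of the kind of real-time check such a system might run on the sampled mass-increment stream is given below: a moving-window average with simple threshold alarms. The window length, thresholds, and synthetic data are assumptions, not the authors' LabVIEW logic.

```python
# Sketch: monitor the mass increment of the middling product per sampling
# interval; flag a blockage (increment collapses) or an abnormal surge.
import numpy as np

rng = np.random.default_rng(4)
increments = np.concatenate([rng.normal(5.0, 0.3, 300),   # steady separation, g per interval
                             rng.normal(0.5, 0.2, 30)])   # simulated feed-inlet blockage

window, low_limit, high_limit = 20, 3.0, 8.0               # illustrative settings
for i in range(window, increments.size):
    mean_inc = increments[i - window:i].mean()
    if mean_inc < low_limit:
        print(f"sample {i}: mean increment {mean_inc:.2f} g below limit -> blockage alarm")
        break
    if mean_inc > high_limit:
        print(f"sample {i}: mean increment {mean_inc:.2f} g above limit -> check for abnormal feed")
        break
```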
The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC
NASA Astrophysics Data System (ADS)
Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan
2016-04-01
The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne, and Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also for the wider geoscientific community; and (iv) industry and the public sector via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.
Surveillance of industrial processes with correlated parameters
White, A.M.; Gross, K.C.; Kubic, W.L.; Wigeland, R.A.
1996-12-17
A system and method for surveillance of an industrial process are disclosed. The system and method includes a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions. 10 figs.
NASA Astrophysics Data System (ADS)
Wimmer, E.
2008-02-01
A workshop, 'Theory Meets Industry', was held on 12-14 June 2007 in Vienna, Austria, attended by a well balanced number of academic and industrial scientists from America, Europe, and Japan. The focus was on advances in ab initio solid state calculations and their practical use in industry. The theoretical papers addressed three dominant themes, namely (i) more accurate total energies and electronic excitations, (ii) more complex systems, and (iii) more diverse and accurate materials properties. Hybrid functionals give some improvements in energies, but encounter difficulties for metallic systems. Quantum Monte Carlo methods are progressing, but no clear breakthrough is on the horizon. Progress in order-N methods is steady, as is the case for efficient methods for exploring complex energy hypersurfaces and large numbers of structural configurations. The industrial applications were dominated by materials issues in energy conversion systems, the quest for hydrogen storage materials, improvements of electronic and optical properties of microelectronic and display materials, and the simulation of reactions on heterogeneous catalysts. The workshop is a clear testimony that ab initio computations have become an industrial practice with increasingly recognized impact.
Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming
Philip A. Araman
1990-01-01
This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...
Patent Law for Computer Scientists
NASA Astrophysics Data System (ADS)
Closa, Daniel; Gardiner, Alex; Giemsa, Falk; Machek, Jörg
More than five centuries ago the first patent statute was passed by the Venetian senate. It already had most of the features of modern patent law, recognizing the public interest in innovation and granting exclusive right in exchange for a full disclosure. Some 350 years later the industrial revolution led to globalisation. The wish to protect intellectual property on a more international level evolved and supranational treaties were negotiated. Patent laws are still different in many countries, however, and inventors are sometimes at a loss to understand which basic requirements should be satisfied if an invention is to be granted a patent. This is particularly true for inventions implemented on a computer. While roughly a third of all applications (and granted patents) relate, in one way or another, to a computer, applications where the innovation mainly resides in software or in a business method are treated differently by the major patent offices. The procedures at the USPTO, JPO and EPO and, in particular, the differences in the treatment of applications centring on software are briefly explained. In later sections of this book, a wealth of examples will be presented. The methodology behind the treatment of these examples is explained.
A Formal Model of Partitioning for Integrated Modular Avionics
NASA Technical Reports Server (NTRS)
DiVito, Ben L.
1998-01-01
The aviation industry is gradually moving toward the use of integrated modular avionics (IMA) for civilian transport aircraft. An important concern for IMA is ensuring that applications are safely partitioned so they cannot interfere with one another. We have investigated the problem of ensuring safe partitioning and logical non-interference among separate applications running on a shared Avionics Computer Resource (ACR). This research was performed in the context of ongoing standardization efforts, in particular, the work of RTCA committee SC-182, and the recently completed ARINC 653 application executive (APEX) interface standard. We have developed a formal model of partitioning suitable for evaluating the design of an ACR. The model draws from the mathematical modeling techniques developed by the computer security community. This report presents a formulation of partitioning requirements expressed first using conventional mathematical notation, then formalized using the language of SRI's Prototype Verification System (PVS). The approach is demonstrated on three candidate designs, each an abstraction of features found in real systems.
Modeling adsorption with lattice Boltzmann equation
Guo, Long; Xiao, Lizhi; Shan, Xiaowen; Zhang, Xiaoling
2016-01-01
Adsorption theory has recently gained renewed attention due to its critical relevance to a number of trending industrial applications, hydrogen storage and shale gas exploration for instance. The existing theoretical foundation, laid mostly in the early twentieth century, was largely based on simple heuristic molecular interaction models and static interaction potentials which, although insightful in illuminating the fundamental mechanisms, are insufficient for computations with realistic adsorbent structure and adsorbate hydrodynamics, both critical for real-life applications. Here we present and validate a novel lattice Boltzmann model incorporating both adsorbate-adsorbate and adsorbate-adsorbent interactions with hydrodynamics which, for the first time, allows adsorption to be computed with real-life details. A connection with the classic Ono-Kondo lattice theory is established, and various adsorption isotherms, both within and beyond the IUPAC classification, are observed as a pseudo-potential is varied. This new approach not only enables an important physical process to be simulated for real-life applications, but also provides an enabling theoretical framework within which the fundamentals of adsorption can be studied. PMID:27256325
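A minimal sketch of the pseudo-potential (Shan-Chen-type) interaction that such lattice Boltzmann models typically employ is shown below. The D2Q9 lattice, weights, interaction strength G, and reference density rho0 are generic illustrative choices and stand in for the adsorbate-adsorbate interaction only; they are not the authors' exact model, which also includes adsorbate-adsorbent interactions.

```python
# Pseudo-potential interaction force on a D2Q9 lattice (Shan-Chen style):
# F(x) = -G * psi(x) * sum_i w_i * psi(x + e_i) * e_i
import numpy as np

w = np.array([4/9] + [1/9]*4 + [1/36]*4)                     # D2Q9 weights (rest, axis, diagonal)
e = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])

def shan_chen_force(rho, G=-5.0, rho0=1.0):
    psi = rho0 * (1.0 - np.exp(-rho / rho0))                  # pseudo-potential (common choice)
    Fx = np.zeros_like(rho)
    Fy = np.zeros_like(rho)
    for wi, (ex, ey) in zip(w[1:], e[1:]):                    # skip the rest particle
        psi_shift = np.roll(np.roll(psi, -ex, axis=0), -ey, axis=1)   # psi(x + e_i)
        Fx += wi * psi_shift * ex
        Fy += wi * psi_shift * ey
    return -G * psi * Fx, -G * psi * Fy

rho = 1.0 + 0.1 * np.random.default_rng(5).random((64, 64))   # toy density field
Fx, Fy = shan_chen_force(rho)
print("max |F|:", np.hypot(Fx, Fy).max())
```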
Carbon footprint of electronic devices
NASA Astrophysics Data System (ADS)
Sloma, Marcin
2013-07-01
This paper assesses the greenhouse gas emissions of the electronics sector, including the information and communication technology (ICT) and media sectors. While the media often present the carbon emission problem of other industries, such as the petroleum industry, airlines and the automobile sector, or plastics and steel manufacturers, the electronics industry must account for the increasing carbon footprint of its applications, including media and entertainment, computers and cooling devices, complex telecommunications networks, cloud computing, and powerful mobile phones. In that sense, the greenhouse gas emissions of electronics should be studied from a life-cycle perspective, including regular operational electricity use. The paper identifies the product groups and processes that are the major contributors to emissions. From available data and extrapolation of existing information, the information and communication technology sector produced 1.3% and the media sector 1.7% of global greenhouse gas emissions within the production cycle, using data from 2007. In the same year, the global electricity use of these sectors was 3.9% and 3.2%, respectively. The results indicate that for both sectors operation leads to more gas emissions than manufacture, although the impact of manufacture is significant, especially in the supply chain. Media electronics led to more emissions than PCs (manufacture and operation). Examining the role of electronics in climate change, including the disposal of its waste, will enable the industry to take internal actions, lowering its impact on climate change within the sector itself.
The Magnesium Industry Today…The Global Perspective
NASA Astrophysics Data System (ADS)
Patzer, Greg
World demand for magnesium will show a decline in 2009. The outlook for 2010, which is guardedly optimistic, will be for a resumption of slow growth. The industry has seen marked changes in the sources of supply for primary and alloyed magnesium in recent years. Technological advances in magnesium continue at a strong pace as does interest in the material as a substitute for other light metals. The automotive segment remains the end-use area with the largest growth potential, if for no other reason than the size and quantity of the potential materials substitution applications. However, the shrinkage of that market, particularly in North America will have a definite impact on expectations for magnesium. The 3C market (computers, communications & consumer electronics) will continue to show above average growth. Other niche markets related to medical and construction industries also offer potential.
NASA Astrophysics Data System (ADS)
Reece Roth, J.
2004-11-01
The majority of industrial plasma processing with glow discharges has been conducted at pressures below 10 torr. This tends to limit applications to high value workpieces as a result of the high capital cost of vacuum systems and the production constraints of batch processing. It has long been recognized that glow discharge plasmas would play a much larger industrial role if they could be generated at one atmosphere. The One Atmosphere Uniform Glow Discharge Plasma (OAUGDP), developed at the University of Tennessee's Plasma Sciences Laboratory, is a non-thermal RF plasma operating on displacement currents with the time-resolved characteristics of a classical low pressure DC normal glow discharge. As a glow discharge, the OAUGDP operates with maximum electrical efficiency at the Stoletow point, where the energy input per ion-electron pair is a minimum [1, 2]. Several interdisciplinary teams have investigated potential applications of the OAUGDP. These teams included collaborators from the UTK Textiles and Nonwovens Development Center (TANDEC), and the Departments of Electrical and Computer Engineering, Microbiology, and Food Science and Technology, as well as the NASA Langley Research Center. The potential applications of the OAUGDP have all been at one atmosphere and room temperature, using air as the working gas. These applications include sterilizing medical and dental equipment; sterilizable air filters to deal with the "sick building syndrome"; removal of soot from Diesel engine exhaust; subsonic plasma aerodynamic effects, including flow re-attachment to airfoils and boundary layer modification; electrohydrodynamic (EHD) flow control of working gases; increasing the surface energy of materials; improving the adhesion of paints and electroplated layers; improving the wettability and wickability of fabrics; stripping of photoresist; and plasma deposition and directional etching of potential microelectronic relevance. [1] J. R. Roth, Industrial Plasma Engineering: Volume I, Principles. Institute of Physics Publishing, Bristol and Philadelphia, 1995, ISBN 0-7503-0318-2. [2] Roth, J. R., Industrial Plasma Engineering: Volume II, Applications to Nonthermal Plasma Processing. Institute of Physics Publishing, Bristol and Philadelphia, 2001, ISBN 0-7503-0545-2.
Economic Evaluation of Computerized Structural Analysis
NASA Technical Reports Server (NTRS)
Fortin, P. E.
1985-01-01
This completed effort involved a technical and economic study of the capabilities of computer programs in the area of structural analysis. The applicability of the programs to NASA projects and to other users was studied. Applications in other industries were explored, including both research and development and applied areas. The costs of several alternative analysis programs were compared. A literature search covered applicable technical literature, including journals, trade publications, and books. In addition to the literature search, several commercial companies that have developed computerized structural analysis programs were contacted and their technical brochures reviewed. These programs include SDRC I-DEAS, MSC/NASTRAN, SCADA, SUPERSAP, NISA/DISPLAY, STAAD-III, MICAS, GTSTRUDL, and STARS. These programs were briefly reviewed as applicable to NASA projects.
NASA Technical Reports Server (NTRS)
1986-01-01
Digital Imaging is the computer processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments, locates objects within an image and measures them to extract quantitative information. Applications are CAT scanners, radiography, microscopy in medicine as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology, is accurate and cost efficient.
1978-02-09
incorporation of subsystems and tasks into systems are specified by state and industrial-sector standards. However, such rigid requirements interfere... construction of the step-down substation, without which the new sector is inoperable. The question of who will build the substation is still unresolved... economico-mathematical evaluation). Among such applications, program decks completed by the "Soyuzsistemprom" are those for data integration and
The National Shipbuilding Research Program Executive Summary Robotics in Shipbuilding Workshop
1981-01-01
based on technoeconomic analysis and consideration of the working environment. (3) The conceptual designs were based on application of commercial... results of our study. We identified shipbuilding tasks that should be performed by industrial robots based on technoeconomic and working-life incentives... is the TV image of the illuminated workplaces. The image is analyzed by the computer. The analysis includes noise rejection and fitting of straight
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holling, G.H.
1994-12-31
Variable switched reluctance (VSR) motors are gaining importance for industrial applications. The paper introduces a novel approach to simplify the computation involved in the control of VSR motors. Results are shown that validate the approach and demonstrate superior performance compared with tabulated control parameters with linear interpolation, which are widely used in implementations.
Managing Engineering Design Information
1989-10-01
aerospace industry, and design operations cannot be delayed until a prior task is completed [Ref. 9]. ... [Figure 4. Translator Interface Between Application Tools] 2. Directory Data Base Approach: The directory approach uses a data base with the traditional ... Technologies, 1985, pp. 313-320. 17. Bray, O.H., "Computer-Integrated Manufacturing: The Data Management Strategy," Digital Press, Bedford, MA, 1988. 18. Atre
NASA Astrophysics Data System (ADS)
Schaub, Scott A.; Naqwi, Amir A.; Harding, Foster L.
1998-01-01
We present fundamental studies examining the design of a phase/Doppler laser light-scattering system applicable to on-line measurements of small-diameter (<15 μm) fibers during fiberglass manufacturing. We first discuss off-line diameter measurement techniques currently used in the fiberglass industry and outline the limitations and problems associated with these methods. For the phase/Doppler design study we have developed a theoretical computer model for the response of the measurement system to cylindrical fibers, which is based on electromagnetic scattering theory. The model, valid for arbitrary fiber diameters and hardware configurations, generates simulated detector output as a function of time for a finite absorbing, cylindrical fiber oriented perpendicular to the two incident laser beams. Results of experimental measurements are presented, confirming predictions of the theoretical model. Parametric studies have also been conducted using the computer model to identify experimental arrangements that provide linear phase-diameter relationships for small-diameter fibers, within the measurement constraints imposed by the fiberglass production environment. The effect of variations in optical properties of the glass as well as fiber orientation effects are discussed. Through this research we have identified phase/Doppler arrangements that we expect to have future applications in the fiberglass industry for on-line diameter monitoring and process control.
Schaub, S A; Naqwi, A A; Harding, F L
1998-01-20
We present fundamental studies examining the design of a phase/Doppler laser light-scattering system applicable to on-line measurements of small-diameter (<15 μm) fibers during fiberglass manufacturing. We first discuss off-line diameter measurement techniques currently used in the fiberglass industry and outline the limitations and problems associated with these methods. For the phase/Doppler design study we have developed a theoretical computer model for the response of the measurement system to cylindrical fibers, which is based on electromagnetic scattering theory. The model, valid for arbitrary fiber diameters and hardware configurations, generates simulated detector output as a function of time for a finite absorbing, cylindrical fiber oriented perpendicular to the two incident laser beams. Results of experimental measurements are presented, confirming predictions of the theoretical model. Parametric studies have also been conducted using the computer model to identify experimental arrangements that provide linear phase-diameter relationships for small-diameter fibers, within the measurement constraints imposed by the fiberglass production environment. The effect of variations in optical properties of the glass as well as fiber orientation effects are discussed. Through this research we have identified phase/Doppler arrangements that we expect to have future applications in the fiberglass industry for on-line diameter monitoring and process control.
Detailed model for practical pulverized coal furnaces and gasifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, P.J.; Smoot, L.D.
1989-08-01
This study was supported by a consortium of nine industrial and governmental sponsors. Work was initiated on May 1, 1985 and completed August 31, 1989. The central objective of this work was to develop, evaluate and apply a practical combustion model for utility boilers, industrial furnaces and gasifiers. Key accomplishments have included: development of an advanced first-generation computer model for combustion in three-dimensional furnaces; development of a new first-generation fouling and slagging submodel; detailed evaluation of an existing NOx submodel; development and evaluation of an improved radiation submodel; preparation and distribution of a three-volume final report: (a) Volume 1: General Technical Report; (b) Volume 2: PCGC-3 User's Manual; (c) Volume 3: Data Book for Evaluation of Three-Dimensional Combustion Models; and organization of a user's workshop on the three-dimensional code. The furnace computer model developed under this study requires further development before it can be applied generally to all applications; however, it can be used now by specialists for many specific applications, including non-combusting systems and combusting gaseous systems. A new combustion center was organized and work was initiated to continue the important research effort initiated by this study. 212 refs., 72 figs., 38 tabs.
DOE planning workshop advanced biomedical technology initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-06-01
The Department of Energy has made major contributions in the biomedical sciences with programs in medical applications and instrumentation development, molecular biology, the human genome, and computational sciences. In an effort to help determine DOE's role in applying these capabilities to the nation's health care needs, a planning workshop was held on January 11-12, 1994. The workshop was co-sponsored by the Department's Office of Energy Research and Defense Programs organizations. Participants represented industry, medical research institutions, national laboratories, and several government agencies. They attempted to define the needs of the health care industry, identify DOE laboratory capabilities that address these needs, and determine how DOE, in cooperation with other team members, could begin an initiative with the goals of reducing health care costs while improving the quality of health care delivery through the proper application of technology and computational systems. This document is a report of that workshop. Seven major technology development thrust areas were considered. Each involves development of various aspects of imaging, optical, sensor, and data processing and storage technologies. The thrust areas, as prioritized for DOE, are: (1) Minimally Invasive Procedures; (2) Technologies for Individual Self Care; (3) Outcomes Research; (4) Telemedicine; (5) Decision Support Systems; (6) Assistive Technology; (7) Prevention and Education.
A review of emerging non-volatile memory (NVM) technologies and applications
NASA Astrophysics Data System (ADS)
Chen, An
2016-11-01
This paper will review emerging non-volatile memory (NVM) technologies, with the focus on phase change memory (PCM), spin-transfer-torque random-access-memory (STTRAM), resistive random-access-memory (RRAM), and ferroelectric field-effect-transistor (FeFET) memory. These promising NVM devices are evaluated in terms of their advantages, challenges, and applications. Their performance is compared based on reported parameters of major industrial test chips. Memory selector devices and cell structures are discussed. Changing market trends toward low power (e.g., mobile, IoT) and data-centric applications create opportunities for emerging NVMs. High-performance and low-cost emerging NVMs may simplify memory hierarchy, introduce non-volatility in logic gates and circuits, reduce system power, and enable novel architectures. Storage-class memory (SCM) based on high-density NVMs could fill the performance and density gap between memory and storage. Some unique characteristics of emerging NVMs can be utilized for novel applications beyond the memory space, e.g., neuromorphic computing, hardware security, etc. In the beyond-CMOS era, emerging NVMs have the potential to fulfill more important functions and enable more efficient, intelligent, and secure computing systems.
The internet of things and the development of network technology in China
NASA Astrophysics Data System (ADS)
Wang, Ruxin; Zhao, Jianzhen; Ma, Hangtong
2018-04-01
The Internet of Things (IoT) uses sensing, radio-frequency identification (RFID), and global positioning system (GPS) technologies to acquire, in real time, information about any monitored or connected object or process, including its sound, light, heat, electricity, mechanics, chemistry, biology, and location, and it connects objects to objects and objects to people through any available network, enabling pervasive, intelligent perception, identification, and management of items and processes. The fusion of IoT sensing and recognition technology with pervasive computing and ubiquitous networks is regarded as the third wave of the world's information industry, following the computer and the Internet. The Internet of Things is not so much a network as a set of services and applications, and it can also be seen as a development of Internet applications. Application innovation is therefore the core of the development of the Internet of Things, and innovation centred on the user experience, in the spirit of Web 2.0, is its soul.
Industry-Wide Workshop on Computational Turbulence Modeling
NASA Technical Reports Server (NTRS)
Shabbir, Aamir (Compiler)
1995-01-01
This publication contains the presentations made at the Industry-Wide Workshop on Computational Turbulence Modeling which took place on October 6-7, 1994. The purpose of the workshop was to initiate the transfer of technology developed at Lewis Research Center to industry and to discuss the current status and the future needs of turbulence models in industrial CFD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks, initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry-standard object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, of a standard Computer-Aided Software Engineering (CASE) tool notation, and of a standard CASE tool repository has limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as to assemble new tools on demand, from existing tools and architecture design repositories.
Automated Modeling and Simulation Using the Bond Graph Method for the Aerospace Industry
NASA Technical Reports Server (NTRS)
Granda, Jose J.; Montgomery, Raymond C.
2003-01-01
Bond graph modeling was originally developed in the late 1950s by the late Prof. Henry M. Paynter of M.I.T. Prof. Paynter acted well before his time, as the main advantages of his creation, other than the modeling insight that it provides and the ability to deal effectively with mechatronics, came to fruition only with the recent advent of modern computer technology and the tools derived from it, including symbolic manipulation, MATLAB, SIMULINK, and the Computer Aided Modeling Program (CAMPG). Thus, only recently have these tools been available, allowing one to fully utilize the advantages that the bond graph method has to offer. The purpose of this paper is to help fill the knowledge void concerning the use of bond graphs in the aerospace industry. The paper first presents simple examples to serve as a tutorial on bond graphs for those not familiar with the technique. The reader is given the basic understanding needed to appreciate the applications that follow. After that, several aerospace applications are developed, such as modeling of an arresting system for aircraft carrier landings, suspension models used for landing gears, and multibody dynamics. The paper also presents an update on NASA's progress in modeling the International Space Station (ISS) using bond graph techniques, and an advanced actuation system utilizing shape memory alloys. The latter covers the mechatronics advantages of the bond graph method, in applications that simultaneously involve mechanical, hydraulic, thermal, and electrical subsystem modeling.
Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, D. J.; McCabe, J.
This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, The National Center of Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.
WARP: Weight Associative Rule Processor. A dedicated VLSI fuzzy logic megacell
NASA Technical Reports Server (NTRS)
Pagni, A.; Poluzzi, R.; Rizzotto, G. G.
1992-01-01
During the last five years Fuzzy Logic has gained enormous popularity in the academic and industrial worlds. The success of this new methodology has led the microelectronics industry to create a new class of machines, called fuzzy machines, to overcome the limitations of traditional computing systems when utilized as fuzzy systems. This paper gives an overview of the methods by which fuzzy logic data structures are represented in these machines (each with its own advantages and inefficiencies). Next, the paper introduces WARP (Weight Associative Rule Processor), a dedicated VLSI megacell allowing the realization of a fuzzy controller suitable for a wide range of applications. WARP represents an innovative approach to VLSI fuzzy controllers by utilizing different types of data structures for characterizing the membership functions during the various stages of fuzzy processing. WARP's dedicated architecture has been designed to achieve high performance by exploiting the computational advantages offered by the different data representations.
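As an illustration of the kind of rule evaluation a fuzzy controller such as WARP accelerates in hardware, the following minimal Python sketch evaluates triangular membership functions and defuzzifies by a weighted average; the breakpoints and singleton outputs are invented for illustration and do not reflect WARP's internal data structures.

def tri(x, a, b, c):
    # Triangular membership function with breakpoints a <= b <= c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_controller(error):
    # Firing strengths of three illustrative rules.
    neg = tri(error, -2.0, -1.0, 0.0)
    zero = tri(error, -1.0, 0.0, 1.0)
    pos = tri(error, 0.0, 1.0, 2.0)
    # Weighted-average defuzzification over singleton outputs (-1, 0, +1).
    num = neg * (-1.0) + zero * 0.0 + pos * 1.0
    den = neg + zero + pos
    return num / den if den else 0.0

A hardware megacell performs the same membership evaluation and rule aggregation in parallel rather than sequentially, which is where the speed advantage comes from.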
A Computational Study of the Rheology and Structure of Surfactant Covered Droplets
NASA Astrophysics Data System (ADS)
Maia, Joao; Boromand, Arman; Jamali, Safa
2015-11-01
The use of different types of surface-active agents is a ubiquitous practice in industrial applications ranging from the cosmetic and food industries to polymeric nanocomposites and blends, as it allows stable multiphasic systems such as foams and emulsions to be produced. The stability and shelf life of those products are directly determined by the efficiency of the surfactant molecules. Although the effect of the molecular configuration of surface-active molecules on planar interfaces has been studied both experimentally and computationally, it remains challenging to track the efficiency and effectiveness of different surfactant molecules on curved interfaces. In this study we address this gap by using Dissipative Particle Dynamics to study the effectiveness and efficiency of different surfactant molecules (linear vs. branched) on a curved interface at equilibrium and far from equilibrium. In particular, we are interested in relating the interfacial properties of surfactant-covered droplets and their dynamics to the molecular configuration of the surface-active molecules under equilibrium and far-from-equilibrium conditions.
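For readers unfamiliar with Dissipative Particle Dynamics, the sketch below computes the standard Groot-Warren pairwise force (conservative, dissipative, and random contributions) for a single particle pair; the parameter values are generic illustrative defaults and are not those used in this study.

import numpy as np

def dpd_pair_force(r_ij, v_ij, a=25.0, gamma=4.5, sigma=3.0, rc=1.0, dt=0.01):
    # r_ij: vector from particle j to i; v_ij: relative velocity v_i - v_j.
    r = np.linalg.norm(r_ij)
    if r >= rc or r == 0.0:
        return np.zeros(3)
    e = r_ij / r
    w = 1.0 - r / rc                                   # Groot-Warren weight function
    f_c = a * w * e                                    # conservative (soft repulsion)
    f_d = -gamma * w**2 * np.dot(e, v_ij) * e          # dissipative (friction)
    f_r = sigma * w * np.random.standard_normal() * e / np.sqrt(dt)  # random (thermal)
    return f_c + f_d + f_r

Summing such pair forces over all neighbours within the cutoff, for every bead of the solvent and surfactant chains, gives the per-timestep forces that drive the droplet simulations described above.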
An Introduction to the Industrial Applications of Microcontrollers
NASA Astrophysics Data System (ADS)
Carelse, Xavier F.
A microcontroller is sometimes described as a "computer on a chip" because it contains all the features of a full computer, including a central processor, in-built clock circuitry, ROM, RAM, and input and output ports with special features such as serial communication, analogue-to-digital conversion and, more recently, signal processing. The smallest microcontroller has only eight pins, but devices with 68 pins are also being marketed. In the last five years, the prices of microcontrollers have dropped by 80%, making them one of the most cost-effective components in industry. Being software-driven, microcontrollers greatly simplify the design of sophisticated instrumentation and control circuitry. They are able to perform the precise calculations sometimes needed for feedback in control systems and now form the basis of all intelligent embedded systems such as those required in television and VCR remote controls, microwave ovens, washing machines, etc. More than ten times as many microcontrollers as microprocessors are manufactured and sold in the world, in spite of the high profile the latter enjoys because of the personal computer market. In Zimbabwe, extensive research is being carried out to use microcontrollers to aid the cost recovery of domestic and commercial solar installations as part of the rural electrification programme.
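As a concrete example of the on-chip analogue-to-digital conversion mentioned above, the following MicroPython-style sketch polls an ADC pin and converts the raw reading to a voltage; the pin number and the 3.3 V reference are hypothetical and depend on the particular board.

from machine import ADC, Pin
import time

adc = ADC(Pin(26))                 # hypothetical analogue input pin
while True:
    raw = adc.read_u16()           # 16-bit reading, 0..65535
    volts = raw * 3.3 / 65535      # assumes a 3.3 V reference
    print(volts)
    time.sleep_ms(100)             # sample at roughly 10 Hz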
Some Thoughts Regarding Practical Quantum Computing
NASA Astrophysics Data System (ADS)
Ghoshal, Debabrata; Gomez, Richard; Lanzagorta, Marco; Uhlmann, Jeffrey
2006-03-01
Quantum computing has become an important area of research in computer science because of its potential to provide more efficient algorithmic solutions to certain problems than are possible with classical computing. The ability to perform parallel operations over an exponentially large computational space has proved to be the main advantage of the quantum computing model. In this regard, we are particularly interested in the potential applications of quantum computers to enhance real software systems of interest to the defense, industrial, scientific, and financial communities. However, while much has been written in the popular and scientific literature about the benefits of the quantum computational model, several of the problems associated with the practical implementation of real-life complex software systems on quantum computers are often ignored. In this presentation we will argue that practical quantum computation is not as straightforward as commonly advertised, even if the technological problems associated with the manufacturing and engineering of large-scale quantum registers were solved overnight. We will discuss some of the frequently overlooked difficulties that plague quantum computing in the areas of memories, I/O, addressing schemes, compilers, oracles, approximate information copying, logical debugging, error correction, and fault-tolerant computing protocols.
For Drafting Programs--Computer Graphics in Industrial Tech.
ERIC Educational Resources Information Center
Sutliff, Ron
1980-01-01
Posits that computer-aided drafting and design should be introduced to students in industrial technology programs. Discusses ways the technical educator can get involved in computer graphics to familiarize students with it without a large outlay of money. (JOW)
NASA Astrophysics Data System (ADS)
Oukacha, Hassan
The rapid advancement of Complementary Metal Oxide Semiconductor (CMOS) technology has formed the backbone of the modern computing revolution, enabling the development of computationally intensive electronic devices that are smaller, faster, less expensive, and consume less power. This well-established technology has transformed the mobile computing and communications industries by providing high levels of system integration on a single substrate, high reliability, and low manufacturing cost. The driving force behind this computing revolution is the scaling of semiconductor devices to smaller geometries, which has resulted in faster switching speeds and the promise of replacing traditional, bulky radio frequency (RF) components with miniaturized devices. Such devices play an important role in our society, enabling ubiquitous computing and on-demand data access. This thesis presents the design and development of a magnetic circulator component in a standard 180 nm CMOS process. The design approach involves integration of nanoscale ferrite materials on a CMOS chip to avoid using the bulky magnetic materials employed in conventional circulators. This device constitutes the next generation broadband millimeter-wave circulator integrated in CMOS using ferrite materials operating in the 60 GHz frequency band. The unlicensed ultra-high frequency spectrum around 60 GHz offers many benefits: very high immunity to interference, high security, and frequency re-use. Results of both simulations and measurements are presented in this thesis. The presented results show the benefits of this technique and its potential for incorporating a complete system-on-chip (SoC) that includes a low noise amplifier, power amplifier, and antenna. This system-on-chip can be used in the same applications where the conventional circulator has been employed, including communication systems, radar systems, navigation and air traffic control, and military equipment. This set of applications shows how crucial this device is to many industries and the need for smaller, cost-effective RF components.
Parametric Estimation of Load for Air Force Data Centers
2015-03-27
R. Nelson, L. Orsenigo and S. Winter, "'History-friendly' models of industry evolution: the computer industry," Industrial and Corporate Change, vol. 8, no. 1, pp. 3-40, 1999. [7] VMWare... Sponsoring/monitoring agency: Vinh Phung, 38ES/ENOC, 5813 Arnold St, Building 4064, Tinker AFB OK 73145-8120, COM: 405-734-7461, vinh.phung@us.af.mil
Carbon Nanotubes for Space Applications
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya
2000-01-01
The potential of nanotube technology for NASA missions is significant and is properly recognized by NASA management. Ames has done much pioneering research in the last five years on carbon nanotube growth, characterization, atomic force microscopy, sensor development and computational nanotechnology. NASA Johnson Space Center has focused on laser ablation production of nanotubes and composites development. These in-house efforts, along with strategic collaboration with academia and industry, are geared towards meeting the agency's mission requirements. This viewgraph presentation (including an explanation for each slide) outlines the research focus for Ames nanotechnology, including details on carbon nanotubes' properties, applications, and synthesis.
Executive control systems in the engineering design environment
NASA Technical Reports Server (NTRS)
Hurst, P. W.; Pratt, T. W.
1985-01-01
Executive Control Systems (ECSs) are software structures for the unification of various engineering design application programs into comprehensive systems with a central user interface (uniform access) method and a data management facility. Attention is given here to the most significant findings of a research program that examined 24 ECSs used in government and industry engineering design environments to integrate CAD/CAE application programs. Characterizations are given for the systems' major architectural components and the alternative design approaches considered in their development. Attention is also given to ECS development prospects in the areas of interdisciplinary usage, standardization, knowledge utilization, and computer science technology transfer.
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Holt, Katherine B
2007-12-15
Although nanocrystalline diamond powders have been produced in industrial quantities, mainly by detonation synthesis, for many decades their use in applications other than traditional polishing and grinding have been limited, until recently. This paper presents the wide-ranging applications of nanodiamond particles to date and discusses future research directions in this field. Owing to the recent commercial availability of these powders and the present interest in nanotechnology, one can predict a huge increase in research with these materials in the very near future. However, to fully exploit these materials, fundamental as well as applied research is required to understand the transition between bulk and surface properties as the size of particles decreases.
NASA Technical Reports Server (NTRS)
Wu, S. T. (Editor); Christensen, D. L.; Head, R. R.
1978-01-01
Demonstration projects, systems-subsystems simulation programs, applications (heating, cooling, agricultural, industrial), and climatic data testing (standards, economics, institutional) are the topics of the book. Economics of preheating water for commercial use and collecting, processing, and dissemination of data for the national demonstration program are discussed. Computer simulation of a solar energy system and graphical representation of solar collector performance are considered. Attention is given to solar driven heat pumps, solar cooling equipment, hybrid passive/active solar systems, and solar farm buildings. Evaluation of a thermographic scanning device for solar energy and conservation applications, use of meteorological data in system evaluation, and biomass conversion potential are presented.
NASA Astrophysics Data System (ADS)
Upputuri, Paul Kumar; Pramanik, Manojit
2018-02-01
Phase shifting white light interferometry (PSWLI) has been widely used for optical metrology applications because of its precision, reliability, and versatility. White light interferometry using a monochrome CCD makes the measurement process slow for metrology applications. WLI integrated with a Red-Green-Blue (RGB) CCD camera is finding imaging applications in the fields of optical metrology and bio-imaging. Wavelength-dependent refractive index profiles of biological samples were computed from colour white light interferograms. In recent years, whole-field refractive index profiles of red blood cells (RBCs), onion skin, fish cornea, etc. were measured from RGB interferograms. In this paper, we discuss the bio-imaging applications of colour CCD based white light interferometry. The approach makes the measurement faster, easier, cost-effective, and even dynamic for industrial applications by using single fringe analysis methods.
Numerical Propulsion System Simulation (NPSS) 1999 Industry Review
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Greg; Naiman, Cynthia; Evans, Austin
2000-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. In addition, the paper contains a summary of the feedback received from industry partners in the development effort and the actions taken over the past year to respond to that feedback. The NPSS development was supported in FY99 by the High Performance Computing and Communications Program.
A Real-Time Monitoring System of Industry Carbon Monoxide Based on Wireless Sensor Networks.
Yang, Jiachen; Zhou, Jianxiong; Lv, Zhihan; Wei, Wei; Song, Houbing
2015-11-20
Carbon monoxide (CO) burns or explodes at over-standard concentrations. Hence, in this paper, a Wi-Fi-based real-time CO monitoring system is proposed for application in the construction industry, in which a sensor measuring node is designed with a low-frequency modulation method to acquire the CO concentration reliably, and a digital filtering method is adopted for noise filtering. Based on triangulation, the Wi-Fi network is constructed to transmit information and determine the position of the nodes. The measured data are displayed on a computer or smart phone through a graphical interface. The experiments show that the monitoring system achieves excellent accuracy and stability in long-term continuous monitoring.
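The digital filtering step can be as simple as a moving average over recent samples combined with an alarm threshold. The sketch below illustrates this idea; the window length and the 50 ppm alarm level are assumptions for illustration, not values taken from the paper.

from collections import deque

class COFilter:
    # Simple moving-average filter over the last `window` samples,
    # with an over-concentration alarm threshold in ppm.
    def __init__(self, window=8, alarm_ppm=50.0):
        self.samples = deque(maxlen=window)
        self.alarm_ppm = alarm_ppm

    def update(self, raw_ppm):
        self.samples.append(raw_ppm)
        avg = sum(self.samples) / len(self.samples)
        return avg, avg >= self.alarm_ppm

# Usage: f = COFilter(); avg, alarm = f.update(reading_from_sensor)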
Zero cylinder coordinate system approach to image reconstruction in fan beam ICT
NASA Astrophysics Data System (ADS)
Yan, Yan-Chun; Xian, Wu; Hall, Ernest L.
1992-11-01
The state of the art of transform algorithms has allowed the newest versions to produce excellent and efficient reconstructed images in most applications, especially in medical and industrial CT. Based on the Zero Cylinder Coordinate (ZCC) system presented in this paper, a new transform algorithm for image reconstruction in fan beam industrial CT is suggested. It greatly reduces the amount of computation in the backprojection, which requires only two INC instructions to calculate the weighting factor and the sub-coordinate. A new backprojector is designed, which simplifies its assembly-line mechanism based on the ZCC method. Finally, simulation results on a microcomputer are given, which show that this method is effective and practical.
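For orientation, the sketch below shows an unfiltered, pixel-driven backprojection for the simpler parallel-beam geometry; the ZCC fan-beam algorithm described in the paper reorganizes this computation so that the weighting factor and sub-coordinate need only increment operations, which is not reproduced here.

import numpy as np

def backproject(sinogram, angles_deg, size):
    # sinogram: array of shape (n_angles, n_detectors); returns a size x size image.
    recon = np.zeros((size, size))
    xs = np.arange(size) - size // 2
    X, Y = np.meshgrid(xs, xs)
    n_det = sinogram.shape[1]
    det_center = n_det // 2
    for proj, theta in zip(sinogram, np.deg2rad(angles_deg)):
        # Detector coordinate seen by each pixel under this view angle.
        t = X * np.cos(theta) + Y * np.sin(theta)
        idx = np.clip(np.round(t).astype(int) + det_center, 0, n_det - 1)
        recon += proj[idx]                 # smear the projection back across the image
    return recon * np.pi / len(angles_deg)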
NASA Astrophysics Data System (ADS)
Ramulu, M.; Rogers, E.
1994-04-01
The predominant machining application for graphite/epoxy composite materials in the aerospace industry is peripheral trimming. The computer numerically controlled (CNC) high-speed routers required for edge trimming work are generally scheduled for production work in industry and are not available for extensive cutter testing. Therefore, an experimental method of simulating the conditions of peripheral trimming using a lathe is developed in this paper. The validity of the test technique is demonstrated by conducting carbide tool wear tests under dry cutting conditions. The experimental results are analyzed to characterize the wear behavior of carbide cutting tools in machining these composite materials.
Machine Vision For Industrial Control:The Unsung Opportunity
NASA Astrophysics Data System (ADS)
Falkman, Gerald A.; Murray, Lawrence A.; Cooper, James E.
1984-05-01
Vision modules have primarily been developed to relieve those pressures newly brought into existence by Inspection (QUALITY) and Robotic (PRODUCTIVITY) mandates. Industrial control pressure stems, on the other hand, from the older first-industrial-revolution mandate of throughput. Satisfying such pressure calls for speed in both imaging and decision making. Vision companies have, however, put speed on a back burner or ignored it entirely because most modules are computer/software based, which limits their speed potential. Increasingly, the keynote being struck at machine vision seminars is that "Visual and Computational Speed Must Be Increased and Dramatically!" There are modular hardwired-logic systems that are fast but, all too often, they are not very bright. Such units measure the fill factor of bottles as they spin by, read labels on cans, count stacked plastic cups, or monitor the width of parts streaming past the camera. Many are only a bit more complex than a photodetector. Once in place, most of these units are incapable of simple upgrading to a new task and are vision's analog to the robot industry's pick-and-place (RIA Type E) robot. Vision thus finds itself amidst the same quandaries that once beset the Robot Industry of America when it tried to define a robot, excluded dumb ones, and was left with only slow machines whose unit volume potential is shatteringly low. This paper develops an approach to meeting the need for a vision system that cuts a swath into the terra incognita of intelligent, high-speed vision processing. Main attention is directed to vision for industrial control. Some presently untapped vision application areas that will be serviced include: electronics, food, sports, pharmaceuticals, machine tools, and arc welding.
Marshal Wrubel and the Electronic Computer as an Astronomical Instrument
NASA Astrophysics Data System (ADS)
Mutschlecner, J. P.; Olsen, K. H.
1998-05-01
In 1960, Marshal H. Wrubel, professor of astrophysics at Indiana University, published an influential review paper under the title, "The Electronic Computer as an Astronomical Instrument." This essay pointed out the enormous potential of the electronic computer as an instrument of observational and theoretical research in astronomy, illustrated programming concepts, and made specific recommendations for the increased use of computers in astronomy. He noted that, with a few scattered exceptions, computer use by the astronomical community had heretofore been "timid and sporadic." This situation was to improve dramatically in the next few years. By the late 1950s, general-purpose, high-speed, "mainframe" computers were just emerging from the experimental, developmental stage, but few were affordable by or available to academic and research institutions not closely associated with large industrial or national defense programs. Yet by 1960 Wrubel had spent a decade actively pioneering and promoting the imaginative application of electronic computation within the astronomical community. Astronomy upper-level undergraduate and graduate students at Indiana were introduced to computing, and Ph.D. candidates whom he supervised applied computer techniques to problems in theoretical astrophysics. He wrote an early textbook on programming, taught programming classes, and helped establish and direct the Research Computing Center at Indiana, later named the Wrubel Computing Center in his honor. He and his students created a variety of algorithms and subroutines and exchanged these throughout the astronomical community by distributing the Astronomical Computation News Letter. Nationally as well as internationally, Wrubel actively cooperated with other groups interested in computing applications for theoretical astrophysics, often through his position as secretary of the IAU commission on Stellar Constitution.
Manual control models of industrial management
NASA Technical Reports Server (NTRS)
Crossman, E. R. F. W.
1972-01-01
The industrial engineer is often required to design and implement control systems and organization for manufacturing and service facilities, to optimize quality, delivery, and yield, and minimize cost. Despite progress in computer science most such systems still employ human operators and managers as real-time control elements. Manual control theory should therefore be applicable to at least some aspects of industrial system design and operations. Formulation of adequate model structures is an essential prerequisite to progress in this area; since real-world production systems invariably include multilevel and multiloop control, and are implemented by timeshared human effort. A modular structure incorporating certain new types of functional element, has been developed. This forms the basis for analysis of an industrial process operation. In this case it appears that managerial controllers operate in a discrete predictive mode based on fast time modelling, with sampling interval related to plant dynamics. Successive aggregation causes reduced response bandwidth and hence increased sampling interval as a function of level.
Photonics: Maintaining competitiveness in the information era
NASA Astrophysics Data System (ADS)
Photonics concerns the use of photons to work with or to replace electrons in certain communications, computer, or control applications traditionally carried out by electronics. It is a key high-technology area, well established in long-distance fiber-optic telecommunications and rapidly growing in other areas of great importance to society. This report concentrates on technical areas where the overall worldwide market for equipment approaches $400 billion per year: telecommunications; information processing; optical storage and display; and optical sensors; it also addresses policy issues. The report concludes that it is essential to increase our industrial competitiveness in product development, manufacturing skills, and marketing; that there must be continuing industrial effort in long-range research and innovation; that the photonics industry should consider the advantages of an industry association that could help organize consortia to conduct cooperative research and address technical problems and policy issues beyond the scope of any one organization; and that government contractors who receive a percentage of sales for their independent research should devote a sizable fraction to projects with a life span of 5 to 10 years.
CAFE: Computer aided fabric evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sims, J.E.
1994-05-06
With the intent of automating the inspection of color printed fabrics for defects, the Engineering Research Division of the Lawrence Livermore National Laboratory, together with several other national laboratories and in conjunction with the textile industry, has initiated the CAFE project. The project's objective is predicated on the development, implementation, and testing of an algorithm for the inspection of color printed fabrics. We attempt to take advantage of the wide-ranging applications possible with computer vision in order to achieve this. The first job of the algorithm is to teach the computer the "correct" repeat; using this repeat as the reference, the algorithm then tests the remaining repeats in the pattern. There are two different ways to go about doing the first job, and in this paper we describe both methods.
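A minimal version of the testing step, comparing repeats against the learned reference, could look like the following sketch, which flags repeats whose mean absolute difference from the reference exceeds a threshold; the regular tiling scheme and the threshold value are assumptions for illustration, not the CAFE algorithm itself.

import numpy as np

def inspect_repeats(image, ref, threshold=0.15):
    # image: grayscale fabric image; ref: the learned "correct" repeat (2-D array).
    # Returns the top-left (row, col) indices of repeats flagged as defective.
    rh, rw = ref.shape
    defects = []
    for i in range(0, image.shape[0] - rh + 1, rh):
        for j in range(0, image.shape[1] - rw + 1, rw):
            tile = image[i:i + rh, j:j + rw].astype(float)
            diff = np.mean(np.abs(tile - ref.astype(float))) / 255.0
            if diff > threshold:
                defects.append((i, j))
    return defects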
Addressing Failures in Exascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snir, Marc; Wisniewski, Robert; Abraham, Jacob
2014-01-01
We present here a report produced by a workshop on 'Addressing failures in exascale computing' held in Park City, Utah, 4-11 August 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system, discuss existing knowledge on resilience across the various hardware and software layers of an exascale system, and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia, and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.
NDE scanning and imaging of aircraft structure
NASA Astrophysics Data System (ADS)
Bailey, Donald; Kepler, Carl; Le, Cuong
1995-07-01
The Science and Engineering Lab at McClellan Air Force Base, Sacramento, Calif., has been involved in the development and use of computer-based scanning systems for NDE (nondestructive evaluation) since 1985. This paper describes the history leading up to our current applications, which employ eddy current and ultrasonic scanning of aircraft structures that contain both metallics and advanced composites. The scanning is performed using industrialized computers interfaced to proprietary acquisition equipment and software. Examples are shown that image several types of damage such as exfoliation and fuselage lap joint corrosion in aluminum, impact damage, embedded foreign material, and porosity in Kevlar and graphite epoxy composites. Image analysis techniques are reported that are performed using consumer-oriented computer hardware and software that are not NDE specific and not expensive.
3D chemical imaging in the laboratory by hyperspectral X-ray computed tomography
Egan, C. K.; Jacques, S. D. M.; Wilson, M. D.; Veale, M. C.; Seller, P.; Beale, A. M.; Pattrick, R. A. D.; Withers, P. J.; Cernik, R. J.
2015-01-01
We report the development of laboratory based hyperspectral X-ray computed tomography which allows the internal elemental chemistry of an object to be reconstructed and visualised in three dimensions. The method employs a spectroscopic X-ray imaging detector with sufficient energy resolution to distinguish individual elemental absorption edges. Elemental distributions can then be made by K-edge subtraction, or alternatively by voxel-wise spectral fitting to give relative atomic concentrations. We demonstrate its application to two material systems: studying the distribution of catalyst material on porous substrates for industrial scale chemical processing; and mapping of minerals and inclusion phases inside a mineralised ore sample. The method makes use of a standard laboratory X-ray source with measurement times similar to that required for conventional computed tomography. PMID:26514938
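The K-edge subtraction step amounts to differencing reconstructions from energy bins just below and just above an element's absorption edge. A minimal sketch, assuming the two reconstructions are already co-registered numpy arrays of attenuation values:

import numpy as np

def k_edge_map(recon_below, recon_above):
    # recon_below / recon_above: reconstructions from energy bins just below
    # and just above the target element's K absorption edge.
    # The positive difference highlights voxels containing that element.
    return np.clip(recon_above - recon_below, 0.0, None)

Voxel-wise spectral fitting, mentioned as the alternative, would instead fit each voxel's full measured spectrum to a weighted sum of reference elemental attenuation curves.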
On Emulation of Flueric Devices in Excitable Chemical Medium
Adamatzky, Andrew
2016-01-01
Flueric devices are fluidic devices without moving parts. Fluidic devices use fluid as a medium for information transfer and computation. A Belousov-Zhabotinsky (BZ) medium is a thin-layer spatially extended excitable chemical medium which exhibits travelling excitation wave-fronts. The excitation wave-fronts transfer information. Flueric devices compute via jets interaction. BZ devices compute via excitation wave-fronts interaction. In numerical model of BZ medium we show that functions of key flueric devices are implemented in the excitable chemical system: signal generator, and, xor, not and nor Boolean gates, delay elements, diodes and sensors. Flueric devices have been widely used in industry since late 1960s and are still employed in automotive and aircraft technologies. Implementation of analog of the flueric devices in the excitable chemical systems opens doors to further applications of excitation wave-based unconventional computing in soft robotics, embedded organic electronics and living technologies. PMID:27997561
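A common way to model such an excitable medium numerically is a two-variable reaction-diffusion system. The sketch below advances a FitzHugh-Nagumo-type activator-inhibitor field by one explicit Euler step on a periodic grid; it is a generic excitable-medium model for illustration, not the specific BZ model used by the author, and the parameter values are illustrative.

import numpy as np

def fhn_step(u, v, dt=0.05, d=1.0, a=0.1, eps=0.02, b=0.5):
    # One explicit time step of a FitzHugh-Nagumo excitable medium.
    # np.roll implements a 2-D Laplacian with periodic boundary conditions.
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
    u_new = u + dt * (d * lap + u * (1.0 - u) * (u - a) - v)
    v_new = v + dt * eps * (b * u - v)
    return u_new, v_new

# Usage: seed a small excited patch in an otherwise quiescent field and iterate
# fhn_step to watch a travelling wave-front emerge; gates are then built by
# arranging channels so that wave-fronts collide or annihilate.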
Mind the Gap! A Journey towards Computational Toxicology.
Mangiatordi, Giuseppe Felice; Alberga, Domenico; Altomare, Cosimo Damiano; Carotti, Angelo; Catto, Marco; Cellamare, Saverio; Gadaleta, Domenico; Lattanzi, Gianluca; Leonetti, Francesco; Pisani, Leonardo; Stefanachi, Angela; Trisciuzzi, Daniela; Nicolotti, Orazio
2016-09-01
Computational methods have advanced toxicology towards the development of target-specific models based on a clear cause-effect rationale. However, the predictive potential of these models presents strengths and weaknesses. On the good side, in silico models are valuable cheap alternatives to in vitro and in vivo experiments. On the other, the unconscious use of in silico methods can mislead end-users with elusive results. The focus of this review is on the basic scientific and regulatory recommendations in the derivation and application of computational models. Attention is paid to examine the interplay between computational toxicology and drug discovery and development. Avoiding the easy temptation of an overoptimistic future, we report our view on what can, or cannot, realistically be done. Indeed, studies of safety/toxicity represent a key element of chemical prioritization programs carried out by chemical industries, and primarily by pharmaceutical companies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Task allocation model for minimization of completion time in distributed computer systems
NASA Astrophysics Data System (ADS)
Wang, Jai-Ping; Steidley, Carl W.
1993-08-01
A task in a distributed computing system consists of a set of related modules. Each of the modules will execute on one of the processors of the system and communicate with some other modules. In addition, precedence relationships may exist among the modules. Task allocation is an essential activity in distributed-software design. This activity is of importance to all phases of the development of a distributed system. This paper establishes task completion-time models and task allocation models for minimizing task completion time. Current work in this area is either at the experimental level or without the consideration of precedence relationships among modules. The development of mathematical models for the computation of task completion time and task allocation will benefit many real-time computer applications such as radar systems, navigation systems, industrial process control systems, image processing systems, and artificial intelligence oriented systems.
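A simple baseline for the allocation problem described here is greedy list scheduling, which respects module precedence and assigns each ready module to the earliest-free processor. The sketch below returns the resulting completion time (makespan); the shortest-first priority rule is an arbitrary illustrative choice and is not the model proposed in the paper.

from collections import defaultdict

def list_schedule(durations, deps, n_procs):
    # durations: {module: execution time}; deps: {module: set of prerequisite modules}.
    indeg = {m: len(deps.get(m, ())) for m in durations}
    children = defaultdict(list)
    for m, reqs in deps.items():
        for r in reqs:
            children[r].append(m)
    ready = [m for m, d in indeg.items() if d == 0]
    procs = [0.0] * n_procs            # time at which each processor becomes free
    finish = {}                        # finish time of each scheduled module
    while ready:
        m = min(ready, key=durations.get)      # shortest-first priority (illustrative)
        ready.remove(m)
        p = procs.index(min(procs))            # earliest-free processor
        start = max(procs[p],
                    max((finish[r] for r in deps.get(m, ())), default=0.0))
        finish[m] = start + durations[m]
        procs[p] = finish[m]
        for c in children[m]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    return max(finish.values(), default=0.0)   # task completion time (makespan)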
Bouhaddou, Omar; Lincoln, Michael J.; Maulden, Sarah; Murphy, Holli; Warnekar, Pradnya; Nguyen, Viet; Lam, Siew; Brown, Steven H; Frankson, Ferdinand J.; Crandall, Glen; Hughes, Carla; Sigley, Roger; Insley, Marcia; Graham, Gail
2006-01-01
The Veterans Administration (VA) has adopted an ambitious program to standardize its clinical terminology to comply with industry-wide standards. The VA is using commercially available tools and in-house software to create a high-quality reference terminology system. The terminology will be used by current and future applications with no planned disruption to operational systems. The first large customer of the group is the national VA Health Data Repository (HDR). Unique enterprise identifiers are assigned to each standard term, and a rich network of semantic relationships makes the resulting data not only recognizable, but highly computable and reusable in a variety of applications, including decision support and data sharing with partners such as the Department of Defense (DoD). This paper describes the specific methods and approaches that the VA has employed to develop and implement this innovative program in existing information systems. The goal is to share with others our experience with key issues that face our industry as we move toward an electronic health record for every individual. PMID:17238306
General purpose optimization software for engineering design
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.
1990-01-01
The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research codes to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.
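As a minimal example of using a general-purpose optimizer as an everyday design tool, the sketch below minimizes a made-up two-variable "weight" objective under a single inequality constraint with SciPy; the objective, constraint, starting point, and bounds are hypothetical and unrelated to the author's software.

from scipy.optimize import minimize

# Hypothetical sizing problem: minimise a weight-like objective x0 * x1
# subject to a made-up stiffness requirement x0 * x1**2 >= 4.
def weight(x):
    return x[0] * x[1]

constraints = [{"type": "ineq", "fun": lambda x: x[0] * x[1] ** 2 - 4.0}]
result = minimize(weight, x0=[1.0, 1.0],
                  bounds=[(0.1, 10.0), (0.1, 10.0)],
                  constraints=constraints)
print(result.x, result.fun)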
Computer vision in the poultry industry
USDA-ARS?s Scientific Manuscript database
Computer vision is becoming increasingly important in the poultry industry due to increasing use and speed of automation in processing operations. Growing awareness of food safety concerns has helped add food safety inspection to the list of tasks that automated computer vision can assist. Researc...
NASA Astrophysics Data System (ADS)
Coutu, S.; Ragaz, M.; Mäder, D.; Hammer, P.; Andriesse, M.; Güttinger, U.; Feyen, H.
2017-12-01
The insurance industry has been contributing to the resilient development of agriculture in multiple regions of the globe since the beginning of the 19th century. It has also, from the very beginning of the development of EO sciences, kept a very close eye on the development of technologies and techniques in this domain. Recent advances in this area, such as increased satellite imagery resolution, faster computation times, and Big Data management, combined with the ground-based knowledge of the insurance industry, have offered farmers not only tools permitting better crop management but also reliable and live yield coverage. This study presents several of these applications at different scales (industrial farming and micro-farming) and in different climate regions, with an emphasis on the limits of current products. Some of these limits, such as lack of access to ground data, R&D effort, or understanding of ground needs, could be quickly overcome through closer public-private or private-private collaborations. However, despite a clear benefit for the Food Security nexus and potential win-win situations, those collaborations are not always simple to develop. We present here successful but also disappointing collaboration cases based on the experience of Swiss Re, a global insurance leader. As a conclusion, we highlight how academia, NGOs, governmental organizations, start-ups, and the insurance industry can get together to foster the development of EO in the domain of Food Security and bring cutting-edge science to game-changing industrial applications.
Kanarska, Yuliya; Walton, Otis
2015-11-30
Fluid-granular flows are common phenomena in nature and industry. Here, an efficient computational technique based on the distributed Lagrange multiplier method is utilized to simulate complex fluid-granular flows. Each particle is explicitly resolved on an Eulerian grid as a separate domain, using solid volume fractions. The fluid equations are solved through the entire computational domain; however, Lagrange multiplier constraints are applied inside the particle domain such that the fluid within any volume associated with a solid particle moves as an incompressible rigid body. The particle-particle interactions are implemented using explicit force-displacement interactions for frictional inelastic particles, similar to the DEM method, with some modifications using the volume of the overlapping region as an input to the contact forces. A parallel implementation of the method is based on the SAMRAI (Structured Adaptive Mesh Refinement Application Infrastructure) library.
NASA Astrophysics Data System (ADS)
Stockert, Sven; Wehr, Matthias; Lohmar, Johannes; Abel, Dirk; Hirt, Gerhard
2017-10-01
In the electrical and medical industries, the trend towards further miniaturization of devices is accompanied by the demand for smaller manufacturing tolerances. Such industries use a plentitude of small and narrow cold rolled metal strips with high thickness accuracy. Conventional rolling mills can hardly achieve further improvement of these tolerances. However, a model-based controller in combination with an additional piezoelectric actuator for high dynamic roll adjustment is expected to enable the production of the required metal strips with a thickness tolerance of +/-1 µm. The model-based controller has to be based on a rolling theory which can describe the rolling process very accurately. Additionally, the required computing time has to be low in order to predict the rolling process in real time. In this work, four rolling theories from the literature with different levels of complexity are tested for their suitability for the predictive controller. The rolling theories of von Kármán, Siebel, Bland & Ford, and Alexander are implemented in Matlab and afterwards transferred to the real-time computer used for the controller. The prediction accuracy of these theories is validated using rolling trials with different thickness reductions and a comparison with the calculated results. Furthermore, the required computing time on the real-time computer is measured. Adequate prediction accuracy can be achieved with the rolling theories developed by Bland & Ford and Alexander. A comparison of the computing times of these two theories reveals that the computing time of Alexander's theory exceeds the 1 kHz sample period of the real-time computer.
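To give a feel for what such a rolling theory computes, the sketch below implements a rough, Siebel-style flat-rolling force estimate based on the projected contact length and a friction-hill correction; the correction factor used here is one common textbook form, the parameter values are placeholders, and the sketch does not reproduce the Bland & Ford or Alexander formulations evaluated in the paper.

import math

def roll_force_estimate(h_in, h_out, width, roll_radius, k_fm, mu=0.08):
    # h_in, h_out: strip thickness before/after the pass [mm]
    # width, roll_radius: [mm]; k_fm: mean flow stress [N/mm^2]; mu: friction coefficient
    dh = h_in - h_out
    l_d = math.sqrt(roll_radius * dh)     # projected contact length
    h_m = 0.5 * (h_in + h_out)            # mean strip thickness
    q = 1.0 + mu * l_d / (2.0 * h_m)      # approximate friction-hill factor (textbook form)
    return k_fm * width * l_d * q         # rolling force [N]

A real-time controller evaluates a relation of this kind (or its more accurate counterparts) at every sample instant, which is why the computing time of the chosen theory must fit within the 1 kHz sample period.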
Synthetic biology advances and applications in the biotechnology industry: a perspective.
Katz, Leonard; Chen, Yvonne Y; Gonzalez, Ramon; Peterson, Todd C; Zhao, Huimin; Baltz, Richard H
2018-06-18
Synthetic biology is a logical extension of what has been called recombinant DNA (rDNA) technology or genetic engineering since the 1970s. As rDNA technology has been the driver for the development of a thriving biotechnology industry today, starting with the commercialization of biosynthetic human insulin in the early 1980s, synthetic biology has the potential to take the industry to new heights in the coming years. Synthetic biology advances have been driven by dramatic cost reductions in DNA sequencing and DNA synthesis; by the development of sophisticated tools for genome editing, such as CRISPR/Cas9; and by advances in informatics, computational tools, and infrastructure to facilitate and scale analysis and design. Synthetic biology approaches have already been applied to the metabolic engineering of microorganisms for the production of industrially important chemicals and for the engineering of human cells to treat medical disorders. It also shows great promise to accelerate the discovery and development of novel secondary metabolites from microorganisms through traditional, engineered, and combinatorial biosynthesis. We anticipate that synthetic biology will continue to have broadening impacts on the biotechnology industry to address ongoing issues of human health, world food supply, renewable energy, and industrial chemicals and enzymes.
PFEM-based modeling of industrial granular flows
NASA Astrophysics Data System (ADS)
Cante, J.; Dávalos, C.; Hernández, J. A.; Oliver, J.; Jonsén, P.; Gustafsson, G.; Häggblad, H.-Å.
2014-05-01
The potential of numerical methods for the solution and optimization of industrial granular flows problems is widely accepted by the industries of this field, the challenge being to promote effectively their industrial practice. In this paper, we attempt to make an exploratory step in this regard by using a numerical model based on continuous mechanics and on the so-called Particle Finite Element Method (PFEM). This goal is achieved by focusing two specific industrial applications in mining industry and pellet manufacturing: silo discharge and calculation of power draw in tumbling mills. Both examples are representative of variations on the granular material mechanical response—varying from a stagnant configuration to a flow condition. The silo discharge is validated using the experimental data, collected on a full-scale flat bottomed cylindrical silo. The simulation is conducted with the aim of characterizing and understanding the correlation between flow patterns and pressures for concentric discharges. In the second example, the potential of PFEM as a numerical tool to track the positions of the particles inside the drum is analyzed. Pressures and wall pressures distribution are also studied. The power draw is also computed and validated against experiments in which the power is plotted in terms of the rotational speed of the drum.
Design on intelligent gateway technique in home network
NASA Astrophysics Data System (ADS)
Hu, Zhonggong; Feng, Xiancheng
2008-12-01
Based on digitization, multimedia, mobility, broadband, real-time interaction, and so on, home networks are receiving more and more attention from the market because they can provide diverse and personalized integrated services in information, communication, entertainment, education, health care, and so on. Home network product development has become a focus of the related industries. In this paper, the concept of the home network and its overall reference model are introduced first. Then the core techniques and the communication standards related to the home network are presented. Key analysis is given to the functions of the home gateway, its software function modules, the key technologies of the client-side software architecture, and the development trend of home network entertainment audio-visual services. The present situation of home gateway products, their future development trends, and application solutions for digital home services are introduced. Finally, the development of home network products is shown to drive the digital home network industry, which in turn stimulates related industries such as the communication industry, the electrical appliance industry, computing, gaming, and real estate.
1990-05-01
Research is conducted primarily by visiting scientists from universities and industry who have resident appointments for limited periods of time, and... Elsevier Science Publishers B.V. (North-Holland), IFIP, 1989. Crowley, Kay, Joel Saltz, Ravi Mirchandaney, and Harry Berryman: Run-time scheduling... Inverse problem techniques for beams with tip body and time hysteresis damping. ICASE Report No. 89-22, April 18, 1989. 24 pages. To appear in
The NASA aircraft icing research program
NASA Technical Reports Server (NTRS)
Shaw, Robert J.; Reinmann, John J.
1990-01-01
The objective of the NASA aircraft icing research program is to develop and make available to industry icing technology to support the needs and requirements for all-weather aircraft designs. Research is being done for both fixed wing and rotary wing applications. The NASA program emphasizes technology development in two areas, advanced ice protection concepts and icing simulation. Reviewed here are the computer code development/validation, icing wind tunnel testing, and icing flight testing efforts.
Computational Analysis of Effect of Transient Fluid Force on Composite Structures
2013-12-01
as they well represent an E-glass fiber reinforced composite frequently used in research and industrial applications. The fluid domain was sized... provide unique perspectives on peak stress ratios. The two models both share increased structural rigidity. The cylinder is reinforced by... A Poisson ratio of 0.3 and a Young's modulus of 20 GPa were added to the transient structural engineering data cell (Figure 69).
Webinar: Delivering Transformational HPC Solutions to Industry
Streitz, Frederick
2018-01-16
Dr. Frederick Streitz, director of the High Performance Computing Innovation Center, discusses Lawrence Livermore National Laboratory computational capabilities and expertise available to industry in this webinar.
Computing at the speed limit (supercomputers)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernhard, R.
1982-07-01
The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers--about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Debellefontaine, H.; Foussard, J.N.
2000-07-01
Aqueous wastes containing organic pollutants can be efficiently treated by wet air oxidation (WAO), i.e., oxidation (or combustion) by molecular oxygen in the liquid phase, at high temperature (200-325 C) and pressure (up to 175 bar). This method is suited to the elimination of special aqueous wastes from the chemical industry as well as to the treatment of domestic sludge. It is an enclosed process, with limited interaction with the environment, as opposed to incineration. Usually, the operating cost is lower than 95 Euro per m³, and the preferred COD load ranges from 10 to 80 kg/m³. Only a handful of industrial reactors are in operation worldwide, mainly because of the high capital investment they require. This paper reviews the major results obtained with the WAO process and assesses its field of possible application to industrial wastes. In addition, as only very few studies have been devoted to the scientific design of such reactors (bubble columns), what needs to be known for this scientific design is discussed. At present, a computer program aimed at determining the performance of a wet air oxidation reactor depending on the various operating parameters has been implemented at the laboratory. Some typical results are presented, pointing out the most important parameters and the specific behavior of these units.
Using variable homography to measure emergent fibers on textile fabrics
NASA Astrophysics Data System (ADS)
Xu, Jun; Cudel, Christophe; Kohler, Sophie; Fontaine, Stéphane; Haeberlé, Olivier; Klotz, Marie-Louise
2011-07-01
A fabric's smoothness is a key factor in determining the quality of finished textile products and has great influence on the functionality of industrial textiles and high-end textile products. With the popularization of the 'zero defect' industrial concept, identifying and measuring defective material in the early stages of production is of great interest to the industry. In the current market, many systems are able to achieve automatic monitoring and control of fabric, paper, and nonwoven material during the entire production process; however, online measurement of hairiness is still an open topic and highly desirable for industrial applications. In this paper we propose a computer vision approach, based on variable homography, which can be used to measure the length of emergent fibers on textile fabrics. The main challenges addressed in this paper are the application of variable homography to textile monitoring and measurement, as well as the accuracy of the estimated calculation. We propose that a fibrous structure can be considered as a two-layer structure and then show how variable homography can estimate the length of the fiber defects. Simulations are carried out to show the effectiveness of this method for measuring the length of emergent fibers. The true lengths of selected fibers are measured precisely using a digital optical microscope, and then the same fibers are tested by our method. Our experimental results suggest that smoothness monitoring by variable homography is an accurate and robust method for quality control of industrially important fabrics.
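The core geometric step, mapping image points to metric lengths through a homography, can be sketched as follows; this shows a plain fixed homography applied to a fiber's two endpoints, whereas the paper's variable homography additionally accounts for the height of the fiber layer above the fabric surface.

import numpy as np

def map_points(H, pts):
    # Apply a 3x3 homography H to an Nx2 array of image points (pixels),
    # returning the corresponding points on the reference plane.
    pts_h = np.hstack([np.asarray(pts, dtype=float), np.ones((len(pts), 1))])
    mapped = (H @ pts_h.T).T
    return mapped[:, :2] / mapped[:, 2:3]

def fiber_length(H, endpoint_a, endpoint_b):
    # Metric length of a fiber segment from its two image endpoints.
    p = map_points(H, [endpoint_a, endpoint_b])
    return float(np.linalg.norm(p[1] - p[0]))

The homography H itself would be estimated beforehand from known calibration points on the fabric plane; the endpoint coordinates here are whatever the fiber-detection stage produces.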
Automation; The New Industrial Revolution.
ERIC Educational Resources Information Center
Arnstein, George E.
Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…
National Survey of Computer Aided Manufacturing in Industrial Technology Programs.
ERIC Educational Resources Information Center
Heidari, Farzin
The current status of computer-aided manufacturing in the 4-year industrial technology programs in the United States was studied. All industrial technology department chairs were mailed a questionnaire divided into program information, equipment information, and general comments sections. The questionnaire was designed to determine the subjects…
Computers Transform an Industry.
ERIC Educational Resources Information Center
Simich, Jack
1982-01-01
Describes the use of computer technology in the graphics communication industry. Areas that are examined include typesetting, color scanners, communications satellites, page make-up systems, and the business office. (CT)
Automation of the longwall mining system
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Aster, R. W.; Harris, J.; High, J.
1982-01-01
Cost effective, safe, and technologically sound applications of automation technology to underground coal mining were identified. The longwall analysis commenced with a general search for government and industry experience of mining automation technology. A brief industry survey was conducted to identify longwall operational, safety, and design problems. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if system delays associated with all of the above four areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state of the art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system.