Sample records for conventional design processes

  1. Design and fabrication of conventional and unconventional superconductors

    NASA Technical Reports Server (NTRS)

    Collings, E. W.

    1983-01-01

    The design and fabrication of composite superconductors, conventionally processed Ti-Nb-base and unconventionally processed A15-compound-base conductors respectively, are discussed in a nine-section review. The first two sections introduce the general properties of alloy and compound superconductors, and the design and processing requirements for the production of long lengths of stable, low-loss conductor. All aspects of flux-jump stability and the general requirements of cryogenic stabilization are addressed. Conductor design is then treated from an a.c.-loss standpoint: some basic formulae describing hysteretic and eddy-current losses, and the influences on a.c. loss of filament diameter, strand (conductor) diameter, twist pitch, and matrix resistivity, are discussed. The basic techniques used in the fabrication of conventional multifilamentary conductors are described.
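
    As a rough guide to the loss scalings such reviews present (these are generic textbook forms, not necessarily the exact formulae of this report), the per-cycle hysteresis loss density of a fully penetrated filament scales as

        Q_h \propto J_c \, d_f \, \Delta B,

    so finer filament diameter d_f directly reduces hysteretic loss, while the interfilament coupling (eddy-current) loss in a twisted strand is governed by a time constant of the form

        \tau \approx \frac{\mu_0}{2\,\rho_{\mathrm{eff}}} \left(\frac{L_p}{2\pi}\right)^2,

    so a shorter twist pitch L_p and a higher effective matrix resistivity \rho_{\mathrm{eff}} suppress coupling losses. These are the parameter dependences the review's a.c.-loss design discussion turns on.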

  2. Design and fabrication of wraparound contact silicon solar cells

    NASA Technical Reports Server (NTRS)

    Goodelle, G.

    1972-01-01

    Work is reported on the development and production of 1,000 N+/P wraparound solar cells of two different design configurations: Design 1, a bar-configuration wraparound, and Design 2, a corner-pad-configuration wraparound. The project goal was to determine which of the two designs was better with regard to production cost, where the typical cost of a conventional solar cell was taken as the norm. Emphasis was also placed on obtaining the highest possible output efficiency, although a minimum efficiency of 10.5% was required. Five hundred cells of Design 1 and 500 cells of Design 2 were fabricated. Design 1, which used procedures similar to those used in the fabrication of conventional cells, was the less expensive, with a cost very close to that of a conventional cell. Design 2 was more expensive, mainly because the more exotic process procedures used were less developed than those of Design 1. However, Design 2 processing technology demonstrated a feasibility that should warrant future investigation toward improvement and refinement.

  3. Improved design of constrained model predictive tracking control for batch processes against unknown uncertainties.

    PubMed

    Wu, Sheng; Jin, Qibing; Zhang, Ridong; Zhang, Junfeng; Gao, Furong

    2017-07-01

    In this paper, an improved constrained tracking control design is proposed for batch processes under uncertainties. A new process model that facilitates process state and tracking error augmentation with additional tuning flexibility is first proposed. A subsequent controller design is then formulated using robust stable constrained MPC optimization. Unlike conventional robust model predictive control (MPC), the proposed method gives the controller design additional degrees of tuning freedom so that improved tracking control can be obtained, which is important because uncertainties inevitably exist in practice and cause model/plant mismatch. An injection molding process is introduced to illustrate the effectiveness of the proposed MPC approach in comparison with conventional robust MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
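
    As a generic sketch of the kind of state/error augmentation the abstract refers to (the paper's specific model adds further tuning terms beyond this), the plant can be written in incremental form and the state increment stacked with the tracking error e_k = y_k - r for a constant setpoint r:

        \Delta x_{k+1} = A\,\Delta x_k + B\,\Delta u_k, \qquad
        e_{k+1} = e_k + C A\,\Delta x_k + C B\,\Delta u_k,

        z_k = \begin{bmatrix} \Delta x_k \\ e_k \end{bmatrix}, \qquad
        z_{k+1} = \begin{bmatrix} A & 0 \\ CA & I \end{bmatrix} z_k
                + \begin{bmatrix} B \\ CB \end{bmatrix} \Delta u_k.

    The MPC then minimizes a quadratic cost in z_k and \Delta u_k subject to input and output constraints, with the separate weights on \Delta x_k and e_k providing the extra tuning knobs.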

  4. Conceptual design of single turbofan engine powered light aircraft

    NASA Technical Reports Server (NTRS)

    Snyder, F. S.; Voorhees, C. G.; Heinrich, A. M.; Baisden, D. N.

    1977-01-01

    The conceptual design of a four place single turbofan engine powered light aircraft was accomplished utilizing contemporary light aircraft conventional design techniques as a means of evaluating the NASA-Ames General Aviation Synthesis Program (GASP) as a preliminary design tool. In certain areas, disagreement or exclusion were found to exist between the results of the conventional design and GASP processes. Detail discussion of these points along with the associated contemporary design methodology are presented.

  5. HMI conventions for process control graphics.

    PubMed

    Pikaar, Ruud N

    2012-01-01

    Process operators supervise and control complex processes. To enable the operator to do an adequate job, instrumentation and process control engineers need to address several related topics, such as console design, information design, navigation, and alarm management. In process control upgrade projects, a 1:1 conversion of existing graphics is usually proposed. This paper suggests another approach, leading efficiently to a reduced number of new, powerful process graphics supported by permanent process overview displays. In addition, a road map for structuring content (process information) and conventions for the presentation of objects, symbols, and so on have been developed. The impact of the human factors engineering approach on process control upgrade projects is illustrated by several cases.

  6. A preferential design approach for energy-efficient and robust implantable neural signal processing hardware.

    PubMed

    Narasimhan, Seetharam; Chiel, Hillel J; Bhunia, Swarup

    2009-01-01

    For implantable neural interface applications, it is important to compress data and analyze spike patterns across multiple channels in real time. Such a computational task for online neural data processing requires an innovative circuit-architecture level design approach for low-power, robust and area-efficient hardware implementation. Conventional microprocessor or Digital Signal Processing (DSP) chips would dissipate too much power and are too large in size for an implantable system. In this paper, we propose a novel hardware design approach, referred to as "Preferential Design" that exploits the nature of the neural signal processing algorithm to achieve a low-voltage, robust and area-efficient implementation using nanoscale process technology. The basic idea is to isolate the critical components with respect to system performance and design them more conservatively compared to the noncritical ones. This allows aggressive voltage scaling for low power operation while ensuring robustness and area efficiency. We have applied the proposed approach to a neural signal processing algorithm using the Discrete Wavelet Transform (DWT) and observed significant improvement in power and robustness over conventional design.
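
    The papers implement this algorithm in custom low-voltage hardware; purely as a software illustration of the underlying DWT compression idea (not the authors' implementation, and with an arbitrary keep_fraction parameter chosen here for the example), a sketch using the PyWavelets library might look like:

        import numpy as np
        import pywt  # PyWavelets

        def dwt_compress(signal, wavelet="db4", level=4, keep_fraction=0.1):
            # Decompose the 1-D neural trace, hard-threshold all but the largest
            # coefficients, and reconstruct; the surviving coefficients are what
            # an implant would transmit instead of the raw samples.
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            flat = np.concatenate(coeffs)
            thr = np.quantile(np.abs(flat), 1.0 - keep_fraction)
            kept = [pywt.threshold(c, thr, mode="hard") for c in coeffs]
            return pywt.waverec(kept, wavelet)

        # Example: a synthetic "spike" riding on noise, sampled at 10 kHz
        t = np.arange(0.0, 1.0, 1e-4)
        x = np.exp(-((t - 0.5) ** 2) / 1e-4) + 0.05 * np.random.randn(t.size)
        y = dwt_compress(x)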

  7. "Glitch Logic" and Applications to Computing and Information Security

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Katkoori, Srinivas

    2009-01-01

    This paper introduces a new method of information processing in digital systems and discusses its potential benefits to computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden from conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables new logic design methods that allow an additional controllable "glitch logic" processing layer to be embedded into a conventional synchronous digital circuit as a hidden/covert information flow channel. The combination of synchronous logic with specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.

  8. Microwave processing of a dental ceramic used in computer-aided design/computer-aided manufacturing.

    PubMed

    Pendola, Martin; Saha, Subrata

    2015-01-01

    Because of their favorable mechanical properties and natural esthetics, ceramics are widely used in restorative dentistry. The conventional ceramic sintering process required for their use is usually slow, however, and the equipment has an elevated energy consumption. Sintering processes that use microwaves have several advantages compared to regular sintering: shorter processing times, lower energy consumption, and the capacity for volumetric heating. The objective of this study was to test the mechanical properties of a dental ceramic used in computer-aided design/computer-aided manufacturing (CAD/CAM) after the specimens were processed with microwave hybrid sintering. Density, hardness, and bending strength were measured. When ceramic specimens were sintered with microwaves, the processing times were reduced and protocols were simplified. Hardness was improved almost 20% compared to regular sintering, and flexural strength measurements suggested that specimens were approximately 50% stronger than specimens sintered in a conventional system. Microwave hybrid sintering may preserve or improve the mechanical properties of dental ceramics designed for CAD/CAM processing systems, reducing processing and waiting times.

  9. A human performance evaluation of graphic symbol-design features.

    PubMed

    Samet, M G; Geiselman, R E; Landee, B M

    1982-06-01

    16 subjects learned each of two tactical display symbol sets (conventional symbols and iconic symbols) in turn and were then shown a series of graphic displays containing various symbol configurations. For each display, the subject was asked questions corresponding to different behavioral processes relating to symbol use (identification, search, comparison, pattern recognition). The results indicated that: (a) conventional symbols yielded faster pattern-recognition performance than iconic symbols, and iconic symbols did not yield faster identification than conventional symbols, and (b) the portrayal of additional feature information (through the use of perimeter density or vector projection coding) slowed processing of the core symbol information in four tasks, but certain symbol-design features created less perceptual interference and had greater correspondence with the portrayal of specific tactical concepts than others. The results were discussed in terms of the complexities involved in the selection of symbol design features for use in graphic tactical displays.

  10. Design of a control configured tanker aircraft

    NASA Technical Reports Server (NTRS)

    Walker, S. A.

    1976-01-01

    The benefits that accrue from using control configured vehicle (CCV) concepts were examined along with the techniques for applying these concepts to an advanced tanker aircraft design. Reduced static stability (RSS) and flutter mode control (FMC) were the two primary CCV concepts used in the design. The CCV tanker was designed to the same mission requirements specified for a conventional tanker design. A seven degree of freedom mathematical model of the flexible aircraft was derived and used to synthesize a lateral stability augmentation system (SAS), a longitudinal control augmentation system (CAS), and a FMC system. Fatigue life and cost analyses followed the control system synthesis, after which a comparative evaluation of the CCV and conventional tankers was made. This comparison indicated that the CCV weight and cost were lower but that, for this design iteration, the CCV fatigue life was shorter. Also, the CCV crew station acceleration was lower, but the acceleration at the boom operator station was higher relative to the corresponding conventional tanker. Comparison of the design processes used in the CCV and conventional design studies revealed that they were basically the same.

  11. Concepts for the development of nanoscale stable precipitation-strengthened steels manufactured by conventional methods

    DOE PAGES

    Yablinsky, C. A.; Tippey, K. E.; Vaynman, S.; ...

    2014-11-11

    In this study, the development of oxide dispersion strengthened ferrous alloys has shown that microstructures designed for excellent irradiation resistance and thermal stability ideally contain stable nanoscale precipitates and dislocation sinks. Based upon this understanding, the microstructures of conventionally manufactured ferritic and ferritic-martensitic steels can be designed to include controlled volume fractions of fine, stable precipitates and dislocation sinks via specific alloying and processing paths. The concepts proposed here are categorized as advanced high-Cr ferritic-martensitic (AHCr-FM) and novel tailored precipitate ferritic (TPF) steels, which have the potential to improve the in-reactor performance of conventionally manufactured alloys. AHCr-FM steels have modified alloy content relative to current reactor materials (such as alloy NF616/P92) to maximize desirable precipitates and control phase stability. TPF steels are designed to incorporate nickel aluminides, in addition to microalloy carbides, in a ferritic matrix to produce fine precipitate arrays with good thermal stability. Both alloying concepts may also benefit from thermomechanical processing to establish dislocation sinks and modify phase transformation behaviors. Alloying and processing paths toward designed microstructures are discussed for both AHCr-FM and TPF material classes.

  12. Nontraditional Intersections/Interchanges: Informational Report

    DOT National Transportation Integrated Search

    2007-06-18

    Comprehensive coverage: geometric design considerations; traffic analysis and comparison with a similar conventional design; signal settings; signing and marking; material and cost comparison; and a selection process in a spreadsheet.

  13. The Seductive Power of an Innovation: Enrolling Non-Conventional Actors in a Drip Irrigation Community in Morocco

    ERIC Educational Resources Information Center

    Benouniche, Maya; Errahj, Mostafa; Kuper, Marcel

    2016-01-01

    Purpose: The aim of this study was to analyze the motivations of non-conventional innovation actors to engage in innovation processes, how their involvement changed the technology and their own social-professional status, and to analyze their role in the diffusion of the innovation. Design/methodology/approach: We studied the innovation process of…

  14. Additive manufacturing: Toward holistic design

    DOE PAGES

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...

    2017-03-18

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  15. Ultra-low-power and robust digital-signal-processing hardware for implantable neural interface microsystems.

    PubMed

    Narasimhan, S; Chiel, H J; Bhunia, S

    2011-04-01

    Implantable microsystems for monitoring or manipulating brain activity typically require on-chip real-time processing of multichannel neural data using ultra low-power, miniaturized electronics. In this paper, we propose an integrated-circuit/architecture-level hardware design framework for neural signal processing that exploits the nature of the signal-processing algorithm. First, we consider different power reduction techniques and compare the energy efficiency between the ultra-low frequency subthreshold and conventional superthreshold design. We show that the superthreshold design operating at a much higher frequency can achieve comparable energy dissipation by taking advantage of extensive power gating. It also provides significantly higher robustness of operation and yield under large process variations. Next, we propose an architecture level preferential design approach for further energy reduction by isolating the critical computation blocks (with respect to the quality of the output signal) and assigning them higher delay margins compared to the noncritical ones. Possible delay failures under parameter variations are confined to the noncritical components, allowing graceful degradation in quality under voltage scaling. Simulation results using prerecorded neural data from the sea-slug (Aplysia californica) show that the application of the proposed design approach can lead to significant improvement in total energy, without compromising the output signal quality under process variations, compared to conventional design approaches.

  16. NULL Convention Floating Point Multiplier

    PubMed Central

    Ramachandran, Seshasayanan

    2015-01-01

    Floating point multiplication is a critical part in high dynamic range and computational intensive digital signal processing applications which require high precision and low power. This paper presents the design of an IEEE 754 single precision floating point multiplier using asynchronous NULL convention logic paradigm. Rounding has not been implemented to suit high precision applications. The novelty of the research is that it is the first ever NULL convention logic multiplier, designed to perform floating point multiplication. The proposed multiplier offers substantial decrease in power consumption when compared with its synchronous version. Performance attributes of the NULL convention logic floating point multiplier, obtained from Xilinx simulation and Cadence, are compared with its equivalent synchronous implementation. PMID:25879069

  17. NULL convention floating point multiplier.

    PubMed

    Albert, Anitha Juliette; Ramachandran, Seshasayanan

    2015-01-01

    Floating point multiplication is a critical part in high dynamic range and computational intensive digital signal processing applications which require high precision and low power. This paper presents the design of an IEEE 754 single precision floating point multiplier using asynchronous NULL convention logic paradigm. Rounding has not been implemented to suit high precision applications. The novelty of the research is that it is the first ever NULL convention logic multiplier, designed to perform floating point multiplication. The proposed multiplier offers substantial decrease in power consumption when compared with its synchronous version. Performance attributes of the NULL convention logic floating point multiplier, obtained from Xilinx simulation and Cadence, are compared with its equivalent synchronous implementation.
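
    As a purely software reference model of the arithmetic data path these multipliers implement (the papers' contribution is the asynchronous NULL convention logic hardware, not this code; zeros, infinities, NaNs, subnormals and overflow are ignored here for brevity), IEEE 754 single-precision multiplication with truncation instead of rounding can be sketched in Python as:

        import struct

        def f32_bits(x: float) -> int:
            return struct.unpack('>I', struct.pack('>f', x))[0]

        def bits_f32(b: int) -> float:
            return struct.unpack('>f', struct.pack('>I', b & 0xFFFFFFFF))[0]

        def fp32_mul_truncate(a: float, b: float) -> float:
            # IEEE 754 single precision: 1 sign bit, 8 exponent bits (bias 127),
            # 23 fraction bits with an implicit leading 1 (normal numbers only).
            ba, bb = f32_bits(a), f32_bits(b)
            sign = (ba >> 31) ^ (bb >> 31)
            ea, eb = (ba >> 23) & 0xFF, (bb >> 23) & 0xFF
            ma = (ba & 0x7FFFFF) | 0x800000
            mb = (bb & 0x7FFFFF) | 0x800000
            exp = ea + eb - 127                 # re-bias the exponent sum
            prod = ma * mb                      # 48-bit significand product
            if prod & (1 << 47):                # product in [2, 4): renormalize
                frac = (prod >> 24) & 0x7FFFFF  # truncate, no rounding
                exp += 1
            else:                               # product in [1, 2)
                frac = (prod >> 23) & 0x7FFFFF
            return bits_f32((sign << 31) | ((exp & 0xFF) << 23) | frac)

        assert fp32_mul_truncate(1.5, 2.5) == 3.75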

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  19. Software For Design Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1991-01-01

    The Design Assistant Workstation (DAWN) computer program is a prototype of an expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. It incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and an expert system offering the user stored knowledge about materials and processes. It constructs a task tree as it leads the user through a simulated process, offers alternatives, and indicates where an alternative is not feasible. It also enables the user to jump from one design level to another.

  20. NON-POLLUTING REPLACEMENT FOR CHROMATE CONVERSION COATING & ZINC PHOSPHATING IN POWDER COATING APPLICATIONS

    EPA Science Inventory

    Picklex, a proprietary formulation, is an alternative to conventional metal surface pretreatments. Its developers claim that it does not produce waste or lower production rates, and it will maintain performance compared to conventional processes. A laboratory program was designed...

  1. Reduction of oxygen concentration by heater design during Czochralski Si growth

    NASA Astrophysics Data System (ADS)

    Zhou, Bing; Chen, Wenliang; Li, Zhihui; Yue, Ruicun; Liu, Guowei; Huang, Xinming

    2018-02-01

    Oxygen is one of the highest-concentration impurities in single crystals grown by the Czochralski (CZ) process, and seriously impairs the quality of the Si wafer. In this study, computer simulations were applied to design a new CZ system. A more appropriate thermal field was acquired by optimization of the heater structure. The simulation results showed that, compared with the conventional system, the oxygen concentration in the newly designed CZ system was reduced significantly throughout the entire CZ process because of the lower crucible wall temperature and optimized convection. To verify the simulation results, experiments were conducted on an industrial single-crystal furnace. The experimental results showed that the oxygen concentration was reduced significantly, especially at the top of the CZ-Si ingot. Specifically, the oxygen concentration was 6.19 × 1017 atom/cm3 at the top of the CZ-Si ingot with the newly designed CZ system, compared with 9.22 × 1017 atom/cm3 with the conventional system. Corresponding light-induced degradation of solar cells based on the top of crystals from the newly designed CZ system was 1.62%, a reduction of 0.64% compared with crystals from the conventional system (2.26%).

  2. A New Comparison Between Conventional Indexing (MEDLARS) and Automatic Text Processing (SMART)

    ERIC Educational Resources Information Center

    Salton, G.

    1972-01-01

    A new testing process is described. The design of the test procedure is covered in detail, and the several language processing features incorporated into the SMART system are individually evaluated. (20 references) (Author)

  3. A guide to structural factors for advanced composites used on spacecraft

    NASA Technical Reports Server (NTRS)

    Vanwagenen, Robert

    1989-01-01

    The use of composite materials in spacecraft systems is constantly increasing. Although the areas of composite design and fabrication are maturing, they remain distinct from the same activities performed using conventional materials and processes. This has led to some confusion regarding the precise meaning of the term 'factor of safety' as it applies to these structures. In addition, composite engineering introduces terms such as 'knock-down factors' to further modify material properties for design purposes. This guide is intended to clarify these terms as well as their use in the design of composite structures for spacecraft. It is particularly intended to be used by the engineering community not involved in the day-to-day composites design process. An attempt is also made to explain the wide range of factors of safety encountered in composite designs as well as their relationship to the 1.4 factor of safety conventionally applied to metallic structures.

  4. Data on processing of Ti-25Nb-25Zr β-titanium alloys via powder metallurgy route: Methodology, microstructure and mechanical properties.

    PubMed

    Ueda, D; Dirras, G; Hocini, A; Tingaud, D; Ameyama, K; Langlois, P; Vrel, D; Trzaska, Z

    2018-04-01

    The data presented in this article are related to the research article entitled "Cyclic Shear behavior of conventional and harmonic structure-designed Ti-25Nb-25Zr β-titanium alloy: Back-stress hardening and twinning inhibition" (Dirras et al., 2017) [1]. The datasheet describes the methods used to fabricate two β-titanium alloys having a conventional microstructure and a so-called harmonic structure (HS) design via a powder metallurgy route, namely the spark plasma sintering (SPS) route. The data show the as-processed unconsolidated powder microstructures as well as the post-SPS ones. The data illustrate the mechanical response under cyclic shear loading of consolidated alloy specimens. The data show how the electron backscatter diffraction (EBSD) method is used to clearly identify induced deformation features in the case of the conventional alloy.

  5. The Modern Design of Experiments: A Technical and Marketing Framework

    NASA Technical Reports Server (NTRS)

    DeLoach, R.

    2000-01-01

    A new wind tunnel testing process under development at NASA Langley Research Center, called Modern Design of Experiments (MDOE), differs from conventional wind tunnel testing techniques on a number of levels. Chief among these is that MDOE focuses on the generation of adequate prediction models rather than high-volume data collection. Some cultural issues attached to this and other distinctions between MDOE and conventional wind tunnel testing are addressed in this paper.

  6. Improvements in the efficiency of turboexpanders in cryogenic applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agahi, R.R.; Lin, M.C.; Ershaghi, B.

    1996-12-31

    Process designers have utilized turboexpanders in cryogenic processes because of their higher thermal efficiencies when compared with conventional refrigeration cycles. Process design and equipment performance have improved substantially through the utilization of modern technologies. Turboexpander manufacturers have also adopted computational fluid dynamics software, computer numerical control technology, and holography techniques to further improve an already impressive turboexpander efficiency performance. In this paper, the authors explain the design process of the turboexpander utilizing modern technology. Two cases of turboexpanders processing helium (4.35 K) and hydrogen (56 K) will be presented.

  7. Utilization of design data on conventional system to building information modeling (BIM)

    NASA Astrophysics Data System (ADS)

    Akbar, Boyke M.; Z. R., Dewi Larasati

    2017-11-01

    Nowadays, infrastructure development is one of the main priorities in a developing country such as Indonesia. The conventional design system is no longer considered to effectively support infrastructure projects, especially for high-complexity building designs, because of its fragmented-system issues. BIM offers a solution for managing projects in an integrated manner. Despite the well-known benefits of BIM, there are obstacles to the migration process. The two main obstacles are the unpreparedness of some project parties to implement BIM, and a reluctance to leave behind the existing database and create a new one in the BIM system. This paper discusses the possibilities for utilizing existing CAD data from the conventional design system in a BIM system. The existing conventional CAD data and the BIM design system output were studied to examine compatibility issues between the two, followed by possible utilization schemes and strategies. The goal of this study is to increase project parties' willingness to migrate to BIM by maximizing the utilization of existing data, which may also improve the quality of BIM-based project workflows.

  8. Estimating the safety benefits of context sensitive solutions.

    DOT National Transportation Integrated Search

    2011-11-01

    Context Sensitive Solutions (CSS), also commonly known by the original name Context Sensitive Design (CSD), is an alternative approach to the conventional transportation-oriented decision-making and design processes. The CSS approach can be used ...

  9. Utility of a Newly Designed Film Holder for Premolar Bitewing Radiography.

    PubMed

    Safi, Yaser; Esmaeelinejad, Mohammad; Vasegh, Zahra; Valizadeh, Solmaz; Aghdasi, Mohammad Mehdi; Sarani, Omid; Afsahi, Mahmoud

    2015-11-01

    Bitewing radiography is a valuable technique for assessment of proximal caries, the alveolar crest and periodontal status. Technical errors during radiography result in erroneous radiographic interpretation, misdiagnosis, possible mistreatment, or unnecessary exposure of the patient for a repeat radiograph. In this study, we aimed to evaluate the efficacy of a film holder modified from the conventional one and compare it with that of the conventional film holder. Our study population comprised 70 patients who were referred to the Radiology Department for bilateral premolar bitewing radiographs as requested by their attending clinician. Bitewing radiographs in each patient were taken using the newly designed holder on one side and the conventional holder on the other side. The acceptability of the two holders from the perspectives of the technician and patients was determined using a 0-20 point scale. The frequency of overlap and film positioning errors was calculated for each method. The conventional holder had greater acceptability among patients compared to the newly designed holder (mean score of 16.59 versus 13.37). From the technicians' point of view, the newly designed holder was superior to the conventional holder (mean score of 17.33 versus 16.44). The frequency of overlap was lower using the newly designed holder (p<0.001) and it allowed more accurate film positioning (p=0.005). The newly designed holder may facilitate the process of radiography for technicians and may be associated with a lower frequency of radiographic errors compared to the conventional holder.

  10. Distributive Distillation Enabled by Microchannel Process Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arora, Ravi

    The application of microchannel technology for distributive distillation was studied to achieve the Grand Challenge goals of 25% energy savings and 10% return on investment. In Task 1, a detailed study was conducted and two distillation systems were identified that would meet the Grand Challenge goals if microchannel distillation technology were used. Material and heat balance calculations were performed to develop process flow sheet designs for the two distillation systems in Task 2. The process designs focused on two methods of integrating the microchannel technology: 1) integrating microchannel distillation with an existing conventional column, and 2) microchannel distillation for new plants. A design concept for a modular microchannel distillation unit was developed in Task 3. In Task 4, Ultrasonic Additive Machining (UAM) was evaluated as a manufacturing method for microchannel distillation units. However, it was found that significant development work would be required to establish process parameters for using UAM in commercial distillation manufacturing. Two alternate manufacturing methods were explored, and both were experimentally tested to confirm their validity. The conceptual design of the microchannel distillation unit (Task 3) was combined with the manufacturing methods developed in Task 4 and the flowsheet designs in Task 2 to estimate the cost of the microchannel distillation unit, which was then compared to a conventional distillation column. The best results were for a methanol-water separation unit for use in a biodiesel facility. For this application, microchannel distillation was found to be more cost effective than a conventional system and capable of meeting the DOE Grand Challenge performance requirements.

  11. Harnessing the Potential of Additive Manufacturing

    DTIC Science & Technology

    2016-12-01

    manufacturing age, which is dominated by standards for materials, processes and process control. Conventional manufacturing is based upon a design that is ... documented either in a drawing or a computer-aided design (CAD) file. The manufacturing team then develops a documented public or private process for ...

  12. Launch vehicle systems design analysis

    NASA Technical Reports Server (NTRS)

    Ryan, Robert; Verderaime, V.

    1993-01-01

    Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least-cost can be realized through competent concurrent engineering teams and brilliance of their technical leadership.

  13. Modelling of teeth of a gear transmission for modern manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Monica, Z.; Banaś, W.; Ćwikla, G.; Topolska, S.

    2017-08-01

    The technological process for manufacturing gear wheels is influenced by many factors. It is specified depending on the type of material from which the gear is to be produced, its heat treatment parameters, the required accuracy, the geometrical form and the modifications of the tooth. The parameter selection process is therefore neither easy nor unambiguous. Another important stage of the technological process is the selection of appropriate tools to properly machine the teeth in both roughing and finishing operations. The presented work focuses primarily on modern methods of gear production using technologically advanced tools, in comparison with conventional tools. Conventional tools such as gear hobbing cutters and Fellows gear-shaper cutters have been used since machines were first employed for the production of gear wheels. With the development of technology and the introduction of CNC machines designed for machining gear wheels, the manufacturing technology and the design knowledge concerning the tooling have also developed. Leading manufacturers of cutting tools have extended their range of gear-machining tools with so-called hobbing cutters with inserted cemented-carbide tips, and the same has been introduced for Fellows gear-shaper cutters. The results of tests show that it is advantageous to use hobbing cutters with inserted cemented-carbide tips for milling gear wheels with a high number of teeth, where the time gains relative to conventional milling cutters are very large.

  14. 3D Printer-Manufacturing of Complex Geometry Elements

    NASA Astrophysics Data System (ADS)

    Ciubară, A.; Burlea, Ș L.; Axinte, M.; Cimpoeșu, R.; Chicet, D. L.; Manole, V.; Burlea, G.; Cimpoeșu, N.

    2018-06-01

    In the last 5-10 years, 3D printing has advanced incredibly in all fields, with a tremendous number of applications. Plastic materials exhibit highly beneficial mechanical properties while delivering complex designs impossible to achieve using conventional manufacturing. In this article, the printing process (filling degree, time, complications and fineness of detail) of a few plastic elements with complicated geometry and fine details is analyzed and commented on. 3D printing offers many of the thermoplastics and industrial materials found in conventional manufacturing. The advantages and disadvantages of 3D printing for plastic parts are discussed. The time of production for an element with complex geometry, from the design to the final cut, was evaluated.

  15. Nondestructive Evaluation of the Friction Weld Process on 2195/2219 Grade Aluminum

    NASA Technical Reports Server (NTRS)

    Suits, Michael W.; Clark, Linda S.; Cox, Dwight E.

    1999-01-01

    In 1996, NASA's Marshall Space Flight Center began an ambitious program designed to find alternative methods of repairing conventional TIG (Tungsten Inert Gas) welds and VPPA (Variable Polarity Plasma Arc) welds on the Space Shuttle External Tank without producing additional heat-related anomalies or conditions. Therefore, a relatively new method, invented by The Welding Institute (TWI) in Cambridge, England, called Friction Stir Welding (FSW), was investigated for use in this application, as well as being used potentially as an initial weld process. As with the conventional repair welding processes, nondestructive evaluation (NDE) plays a crucial role in the verification of these repairs. Since it was feared that conventional NDE might have trouble with this type of weld structure (due to shape of nugget, grain structure, etc.) it was imperative that a complete study be performed to address the adequacy of the NDE process. This paper summarizes that process.

  16. Improved molding process ensures plastic parts of higher tensile strength

    NASA Technical Reports Server (NTRS)

    Heier, W. C.

    1968-01-01

    Single molding process ensures that plastic parts /of a given mechanical design/ produced from a conventional thermosetting molding compound will have a maximum tensile strength. The process can also be used for other thermosetting compounds to produce parts with improved physical properties.

  17. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  18. An economical route to high quality lubricants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J.P.; Hahn, S.K.; Kwon, S.H.

    1996-12-01

    The current trends in the automotive and industrial markets toward more efficient engines, longer drain intervals, and lower emissions all contribute to placing increasingly stringent performance requirements on lubricants. The demand for higher quality synthetic and non-conventional basestocks is expected to grow at a much faster rate than that of conventional lube basestocks to meet these higher performance standards. Yukong Limited has developed a novel technology (the Yukong UCO Lube Process) for the economic production of high quality, high-viscosity-index lube basestocks from a fuels hydrocracker unconverted oil stream. A pilot plant based on this process has been producing oils for testing purposes since May 1994. A commercial facility designed to produce 3,500 BPD of VHVI lube basestocks came on-stream at Yukong's Ulsan refinery in October 1995. The Badger Technology Center of Raytheon Engineers and Constructors assisted Yukong during the development of the technology and prepared the basic process design package for the commercial facility. This paper presents process aspects of the technology and comparative data on investment and operating costs. Yukong lube basestock product properties and performance data are compared to basestocks produced by conventional means and by lube hydrocracking.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yablinsky, C. A.; Tippey, K. E.; Vaynman, S.

    In this study, the development of oxide dispersion strengthened ferrous alloys has shown that microstructures designed for excellent irradiation resistance and thermal stability ideally contain stable nanoscale precipitates and dislocation sinks. Based upon this understanding, the microstructures of conventionally manufactured ferritic and ferritic-martensitic steels can be designed to include controlled volume fractions of fine, stable precipitates and dislocation sinks via specific alloying and processing paths. The concepts proposed here are categorized as advanced high-Cr ferritic-martensitic (AHCr-FM) and novel tailored precipitate ferritic (TPF) steels, which have the potential to improve the in-reactor performance of conventionally manufactured alloys. AHCr-FM steels have modified alloy content relative to current reactor materials (such as alloy NF616/P92) to maximize desirable precipitates and control phase stability. TPF steels are designed to incorporate nickel aluminides, in addition to microalloy carbides, in a ferritic matrix to produce fine precipitate arrays with good thermal stability. Both alloying concepts may also benefit from thermomechanical processing to establish dislocation sinks and modify phase transformation behaviors. Alloying and processing paths toward designed microstructures are discussed for both AHCr-FM and TPF material classes.

  20. Integrating Green Building Criteria Into Housing Design Processes Case Study: Tropical Apartment At Kebon Melati, Jakarta

    NASA Astrophysics Data System (ADS)

    Farid, V. L.; Wonorahardjo, S.

    2018-05-01

    The implementation of Green Building criteria is relatively new in architectural practice, especially in Indonesia. Consequently, the integration of these criteria into the design process has the potential to change the design process itself. The implementation of the green building criteria in the conventional design process is discussed in this paper. The concept of this project is to design a residential unit with a natural air-conditioning system. To achieve this purpose, the Green Building criteria have been applied from the beginning of the design process through the detailing at the end of the project. Several studies were performed throughout the design process, such as: (1) a conceptual review, in which several professionally proven theories related to Tropical Architecture and passive design are used as references, and (2) computer simulations, such as Computational Fluid Dynamics (CFD) and wind tunnel simulation, used to represent the dynamic response of the surrounding environment toward the building. Hopefully this paper may become a reference for designing a green residential building.

  1. Energy-efficient digital and wireless IC design for wireless smart sensing

    NASA Astrophysics Data System (ADS)

    Zhou, Jun; Huang, Xiongchuan; Wang, Chao; Tae-Hyoung Kim, Tony; Lian, Yong

    2017-10-01

    Wireless smart sensing is now widely used in various applications such as health monitoring and structural monitoring. In conventional wireless sensor nodes, significant power is consumed in wirelessly transmitting the raw data. Smart sensing adds local intelligence to the sensor node and reduces the amount of wireless data transmission via on-node digital signal processing. While the total power consumption is reduced compared to conventional wireless sensing, the power consumption of the digital processing becomes as dominant as wireless data transmission. This paper reviews the state-of-the-art energy-efficient digital and wireless IC design techniques for reducing the power consumption of the wireless smart sensor node to prolong battery life and enable self-powered applications.

  2. Lithographic fabrication of nanoapertures

    DOEpatents

    Fleming, James G.

    2003-01-01

    A new class of silicon-based lithographically defined nanoapertures and processes for their fabrication using conventional silicon microprocessing technology have been invented. The new ability to create and control such structures should significantly extend our ability to design and implement chemically selective devices and processes.

  3. High-speed machining of Space Shuttle External Tank (ET) panels

    NASA Technical Reports Server (NTRS)

    Miller, J. A.

    1983-01-01

    Potential production rates and project cost savings achieved by converting the conventional machining process in manufacturing shuttle external tank panels to high speed machining (HSM) techniques were studied. Savings were projected from the comparison of current production rates with HSM rates and with rates attainable on new conventional machines. The HSM estimates were also based on rates attainable by retrofitting existing conventional equipment with high speed spindle motors and rates attainable using new state of the art machines designed and built for HSM.

  4. CMLLite: a design philosophy for CML

    PubMed Central

    2011-01-01

    CMLLite is a collection of definitions and processes which provide strong and flexible validation for a document in Chemical Markup Language (CML). It consists of an updated CML schema (schema3), conventions specifying rules in both human and machine-understandable forms and a validator available both online and offline to check conformance. This article explores the rationale behind the changes which have been made to the schema, explains how conventions interact and how they are designed, formulated, implemented and tested, and gives an overview of the validation service. PMID:21999395

  5. Hypersonic aircraft design

    NASA Technical Reports Server (NTRS)

    Alkamhawi, Hani; Greiner, Tom; Fuerst, Gerry; Luich, Shawn; Stonebraker, Bob; Wray, Todd

    1990-01-01

    A hypersonic aircraft is designed which uses scramjets to accelerate from Mach 6 to Mach 10 and sustain that speed for two minutes. Different propulsion systems were considered and it was decided that the aircraft would use one full scale turbofan-ramjet. Two solid rocket boosters were added to save fuel and help the aircraft pass through the transonic region. After considering aerodynamics, aircraft design, stability and control, cooling systems, mission profile, and landing systems, a conventional aircraft configuration was chosen over that of a waverider. The conventional design was chosen due to its landing characteristics and the relative expense compared to the waverider. Fuel requirements and the integration of the engine systems and their inlets are also taken into consideration in the final design.

  6. Comparison of Conventional and Microwave Treatment on Soymilk for Inactivation of Trypsin Inhibitors and In Vitro Protein Digestibility

    PubMed Central

    Vagadia, Brinda Harish; Raghavan, Vijaya

    2018-01-01

    Soymilk is lower in calories than cow's milk, since it is derived from a plant source (no cholesterol), and it is an excellent source of protein. Despite these beneficial factors, soymilk is considered one of the most controversial foods in the world. It contains serine protease inhibitors, which lower its nutritional value and digestibility. Processing techniques for the elimination of trypsin inhibitors and lipoxygenase, with shorter processing times and lower production costs, are required for the large-scale manufacturing of soymilk. In this study, suitable time and temperature conditions for microwave processing were optimized to obtain soymilk with maximum digestibility and inactivation of trypsin inhibitors, in comparison with conventional thermal treatment. Microwave processing at a frequency of 2.45 GHz and temperatures of 70 °C, 85 °C and 100 °C for 2, 5 and 8 min was investigated and compared to conventional thermal treatments at the same temperatures for 10, 20 and 30 min. Response surface methodology was used to design and optimize the experimental conditions. Thermal processing increased digestibility by 7% (microwave) and 11% (conventional) compared to the control, while trypsin inhibitor activity was reduced to 1% by microwave processing and 3% by conventional thermal treatment, compared to 10% in raw soybean. PMID:29316679

  7. Materials and Process Design for High-Temperature Carburizing: Integrating Processing and Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Apelian

    2007-07-23

    The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order of magnitude reduction in cycle time compared to conventional carburizing and represents significant energy savings in addition to a corresponding reduction of scrap associated with distortion free carburizing steels.

  8. Word Processing and the Writing Process: Enhancement or Distraction?

    ERIC Educational Resources Information Center

    Dalton, David W.; Watson, James F.

    This study examined the effects of a year-long word processing program on learners' holistic writing skills. Based on results of a writing pretest, 80 seventh grade students were designated as relatively high or low in prior writing achievement and assigned to one of two groups: a word processing treatment and a conventional writing process…

  9. Strategic Co-Location in a Hybrid Process Involving Desalination and Pressure Retarded Osmosis (PRO).

    PubMed

    Sim, Victor S T; She, Qianhong; Chong, Tzyy Haur; Tang, Chuyang Y; Fane, Anthony G; Krantz, William B

    2013-07-04

    This paper focuses on a Hybrid Process that uses feed salinity dilution and osmotic power recovery from Pressure Retarded Osmosis (PRO) to achieve higher overall water recovery. This reduces the energy consumption and capital costs of conventional seawater desalination and water reuse processes. The Hybrid Process increases the amount of water recovered from the current 66.7% for conventional seawater desalination and water reuse processes to a potential 80% through the use of reclaimed water brine as an impaired water source. A reduction of up to 23% in energy consumption is projected via the Hybrid Process. The attractiveness is amplified by potential capital cost savings ranging from 8.7%-20% compared to conventional designs of seawater desalination plants. A decision matrix in the form of a customizable scorecard is introduced for evaluating a Hybrid Process based on the importance of land space, capital costs, energy consumption and membrane fouling. This study provides a new perspective, looking at processes not as individual systems but as a whole utilizing strategic co-location to unlock the synergies available in the water-energy nexus for more sustainable desalination.
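
    The scorecard itself is not reproduced in the abstract; purely as a hedged illustration of how such a customizable weighted-sum decision matrix could be computed (the criteria weights and 1-10 scores below are hypothetical, not the paper's values), a minimal Python sketch might look like:

        # Hypothetical weights and scores; the paper's actual scorecard entries differ.
        weights = {"land space": 0.2, "capital cost": 0.3, "energy": 0.3, "fouling": 0.2}
        scores = {
            "conventional desalination + reuse": {"land space": 7, "capital cost": 5, "energy": 4, "fouling": 8},
            "hybrid process with PRO":           {"land space": 6, "capital cost": 7, "energy": 8, "fouling": 6},
        }

        def weighted_total(option):
            # Sum each criterion's score scaled by its importance weight.
            return sum(weights[c] * scores[option][c] for c in weights)

        for option in scores:
            print(f"{option}: {weighted_total(option):.2f}")

    Adjusting the weights lets a utility emphasize, for example, land scarcity over energy, which is the customization the scorecard is meant to support.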

  10. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental) or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This begs the question: how does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges? The answer is rapid and flexible conceptual design parametric sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations to meet combinations of mission and technology. This research undertaking contributes to the state-of-the-art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composites, natural laminar flow, thrust vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research undertaking demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to meet current aerospace challenges. The overarching goal is to avoid the recurring situation of optimizing an already ill-fated solution.

  11. Using a retrospective pretest instead of a conventional pretest is replacing biases: a qualitative study of cognitive processes underlying responses to thentest items.

    PubMed

    Taminiau-Bloem, Elsbeth F; Schwartz, Carolyn E; van Zuuren, Florence J; Koeneman, Margot A; Visser, Mechteld R M; Tishelman, Carol; Koning, Caro C E; Sprangers, Mirjam A G

    2016-06-01

    The thentest design aims to detect and control for recalibration response shift. This design assumes (1) more consistency in the content of the cognitive processes underlying patients' quality of life (QoL) between posttest and thentest assessments than between posttest and pretest assessments; and (2) consistency in the time frame and description of functioning referenced at pretest and thentest. Our objective is to utilize cognitive interviewing to qualitatively examine both assumptions. We conducted think-aloud interviews with 24 patients with cancer prior to and after radiotherapy to elicit cognitive processes underlying their assessment of seven EORTC QLQ-C30 items at pretest, posttest and thentest. We used an analytic scheme based on the cognitive process models of Tourangeau et al. and Rapkin and Schwartz that yielded five cognitive processes. We subsequently used this input for quantitative analysis of count data. Contrary to expectation, the number of dissimilar cognitive processes between posttest and thentest was generally larger than between pretest and posttest across patients. Further, patients considered a range of time frames when answering the thentest questions. Moreover, patients' description at the thentest of their pretest functioning was often not similar to that which was noted at pretest. Items referring to trouble taking a short walk, overall health and QoL were most often violating the assumptions. Both assumptions underlying the thentest design appear not to be supported by the patients' cognitive processes. Replacing the conventional pretest-posttest design with the thentest design may simply be replacing one set of biases with another.

  12. X-ray topography as a process control tool in semiconductor and microcircuit manufacture

    NASA Technical Reports Server (NTRS)

    Parker, D. L.; Porter, W. A.

    1977-01-01

    A bent wafer camera, designed to identify crystal lattice defects in semiconductor materials, was investigated. The camera makes use of conventional X-ray topographs and an innovative slightly bent wafer which allows rays from the point source to strike all portions of the wafer simultaneously. In addition to being utilized in solving production process control problems, this camera design substantially reduces the cost per topograph.

  13. COST ESTIMATION MODELS FOR DRINKING WATER TREATMENT UNIT PROCESSES

    EPA Science Inventory

    Cost models for unit processes typically utilized in a conventional water treatment plant and in package treatment plant technology are compiled in this paper. The cost curves are represented as a function of specified design parameters and are categorized into four major catego...
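    The cost curves themselves are not reproduced in this record. As a generic illustration of the form such unit-process models typically take, the sketch below evaluates a hypothetical power-law capital-cost curve, cost = a * Q**b, as a function of design flow Q; the coefficients a and b are placeholders, not values from the EPA models.

    ```python
    # Hypothetical power-law capital-cost curve for one treatment unit process:
    # cost = a * (design flow)**b, with b < 1 capturing economies of scale.
    def capital_cost_usd(design_flow_mgd, a=250_000.0, b=0.72):
        return a * design_flow_mgd ** b

    for q_mgd in (1, 5, 10, 50):
        print(f"{q_mgd:>3} MGD plant: ${capital_cost_usd(q_mgd):,.0f}")
    ```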

  14. Demonstration-scale evaluation of a novel high-solids anaerobic digestion process for converting organic wastes to fuel gas and compost.

    PubMed

    Rivard, C J; Duff, B W; Dickow, J H; Wiles, C C; Nagle, N J; Gaddy, J L; Clausen, E C

    1998-01-01

    Early evaluations of the bioconversion potential for combined wastes such as tuna sludge and sorted municipal solid waste (MSW) were conducted at laboratory scale and compared conventional low-solids, stirred-tank anaerobic systems with the novel, high-solids anaerobic digester (HSAD) design. Enhanced feedstock conversion rates and yields were determined for the HSAD system. In addition, the HSAD system demonstrated superior resiliency to process failure. Utilizing relatively dry feedstocks, the HSAD system is approximately one-tenth the size of conventional low-solids systems. In addition, the HSAD system is capable of organic loading rates (OLRs) on the order of 20-25 g volatile solids per liter digester volume per d (gVS/L/d), roughly 4-5 times those of conventional systems. Current efforts involve developing a demonstration-scale (pilot-scale) HSAD system. A two-ton/d plant has been constructed in Stanton, CA and is currently in the commissioning/startup phase. The purposes of the project are to verify laboratory- and intermediate-scale process performance; test the performance of large-scale prototype mechanical systems; demonstrate the long-term reliability of the process; and generate the process and economic data required for the design, financing, and construction of full-scale commercial systems. This study presents confirmatory fermentation data obtained at intermediate scale and a snapshot of the pilot-scale project.
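    As a rough illustration of why the reported organic loading rates translate into a much smaller vessel, the sketch below sizes a digester from a daily volatile-solids feed and an OLR. The feed quantity and the conventional-system OLR are hypothetical placeholders; only the 20-25 g VS/L/d range for the HSAD system comes from this record.

    ```python
    def digester_volume_m3(vs_feed_kg_per_day, olr_g_vs_per_l_per_day):
        # 1 g VS per litre per day equals 1 kg VS per m^3 per day, so volume = feed / OLR
        return vs_feed_kg_per_day / olr_g_vs_per_l_per_day

    vs_feed = 800.0        # hypothetical volatile-solids feed for a ~2 ton/d plant, kg VS/d
    hsad_olr = 22.5        # mid-range of the 20-25 g VS/L/d reported for the HSAD system
    low_solids_olr = 5.0   # hypothetical OLR for a conventional low-solids digester

    print(f"HSAD digester volume:       {digester_volume_m3(vs_feed, hsad_olr):6.1f} m^3")
    print(f"low-solids digester volume: {digester_volume_m3(vs_feed, low_solids_olr):6.1f} m^3")
    # The ~10x size difference quoted above also reflects the dilution water a
    # low-solids system needs, which this volatile-solids comparison ignores.
    ```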

  15. A New Approach to Flood Protection Design and Riparian Management

    Treesearch

    Philip B. Williams; Mitchell L. Swanson

    1989-01-01

    Conventional engineering methods of flood control design focus narrowly on the efficient conveyance of water, with little regard for environmental resource planning and natural geomorphic processes. Consequently, flood control projects are often environmentally disastrous, expensive to maintain, and even inadequate to control floods. In addition, maintenance programs...

  16. Feedstock Supply System Design and Economics for Conversion of Lignocellulosic Biomass to Hydrocarbon Fuels Conversion Pathway: Fast Pyrolysis and Hydrotreating Bio-Oil Pathway "The 2017 Design Case"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin L. Kenney; Kara G. Cafferty; Jacob J. Jacobson

    The U.S. Department of Energy promotes the production of liquid fuels from lignocellulosic biomass feedstocks by funding fundamental and applied research that advances the state of technology in biomass sustainable supply, logistics, conversion, and overall system sustainability. As part of its involvement in this program, Idaho National Laboratory (INL) investigates the feedstock logistics economics and sustainability of these fuels. Between 2000 and 2012, INL quantified the economics and sustainability of moving biomass from the field or stand to the throat of the conversion process using conventional equipment and processes. All previous work to 2012 was designed to improve the efficiency and decrease costs under conventional supply systems. The 2012 programmatic target was to demonstrate a biomass logistics cost of $55/dry ton for woody biomass delivered to a fast pyrolysis conversion facility. The goal was achieved by applying field and process demonstration unit-scale data from harvest, collection, storage, preprocessing, handling, and transportation operations into INL’s biomass logistics model.

  17. Experimental comparison of conventional and nonlinear model-based control of a mixing tank

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haeggblom, K.E.

    1993-11-01

    In this case study concerning control of a laboratory-scale mixing tank, conventional multiloop single-input single-output (SISO) control is compared with "model-based" control, where the nonlinearity and multivariable characteristics of the process are explicitly taken into account. It is shown, especially if the operating range of the process is large, that the two outputs (level and temperature) cannot be adequately controlled by multiloop SISO control even if gain scheduling is used. By nonlinear multiple-input multiple-output (MIMO) control, on the other hand, very good control performance is obtained. The basic approach to nonlinear control used in this study is first to transform the process into a globally linear and decoupled system, and then to design controllers for this system. Because of the properties of the resulting MIMO system, the controller design is very easy. Two nonlinear control system designs based on a steady-state and a dynamic model, respectively, are considered. In the dynamic case, both setpoint tracking and disturbance rejection can be addressed separately.
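    A minimal sketch of the "transform into a linear, decoupled system, then design simple controllers" idea for a mixing tank with hot and cold inlet streams is given below. The model (mass and energy balances with a square-root outflow), the parameter values and the controller gains are generic textbook assumptions, not the laboratory rig or the design reported in the study.

    ```python
    import numpy as np

    # Generic mixing-tank model (not the laboratory rig from the report):
    #   A_TANK * dh/dt = F_h + F_c - K_OUT*sqrt(h)
    #   dT/dt = (F_h*(T_HOT - T) + F_c*(T_COLD - T)) / (A_TANK * h)
    A_TANK, K_OUT = 0.05, 2e-4      # tank area [m^2], outflow coefficient [m^2.5/s]
    T_HOT, T_COLD = 70.0, 15.0      # inlet temperatures [deg C]

    def linearizing_controller(h, T, h_sp, T_sp, kp_h=0.02, kp_T=0.05):
        """Pick (F_h, F_c) so that dh/dt = v1 and dT/dt = v2 (linear, decoupled loops)."""
        v1 = -kp_h * (h - h_sp)
        v2 = -kp_T * (T - T_sp)
        M = np.array([[1.0, 1.0],
                      [T_HOT - T, T_COLD - T]])
        rhs = np.array([A_TANK * v1 + K_OUT * np.sqrt(h),
                        A_TANK * h * v2])
        F_h, F_c = np.linalg.solve(M, rhs)
        return max(F_h, 0.0), max(F_c, 0.0)             # no negative flows (actuator limits)

    # Simple Euler simulation toward a new operating point
    h, T, dt = 0.30, 40.0, 1.0
    for _ in range(3600):
        F_h, F_c = linearizing_controller(h, T, h_sp=0.50, T_sp=55.0)
        dh = (F_h + F_c - K_OUT * np.sqrt(h)) / A_TANK
        dT = (F_h * (T_HOT - T) + F_c * (T_COLD - T)) / (A_TANK * h)
        h, T = h + dt * dh, T + dt * dT
    print(f"after 1 h: level = {h:.3f} m, temperature = {T:.1f} C")
    ```

    Once the transformation holds, the remaining controllers are just two independent first-order loops, which is the "very easy controller design" the abstract refers to.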

  18. 10 CFR 434.601 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... a conventional simulation tool, of the Proposed Design. A life cycle cost analysis shall be used to select the fuel source for the HVAC systems, service hot water, and process loads from available...

  19. DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1989-01-01

    This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.

  20. Preliminary engineering design of sodium-cooled CANDLE core

    NASA Astrophysics Data System (ADS)

    Takaki, Naoyuki; Namekawa, Azuma; Yoda, Tomoyuki; Mizutani, Akihiko; Sekimoto, Hiroshi

    2012-06-01

    The CANDLE burning process is characterized by the autonomous shifting of the burning region with constant reactivity and a constant spatial power distribution. Evaluations of such a critical burning process using widely used neutron diffusion and burnup codes under realistic engineering constraints are valuable to confirm the technical feasibility of the CANDLE concept and to turn the idea into a concrete core design. The first part of this paper discusses whether the sustainable and stable CANDLE burning process can be reproduced even with conventional core analysis tools such as SLAROM and CITATION-FBR. As a result, it is certainly possible to demonstrate it if the proper core configuration and initial fuel composition required for a CANDLE core are applied to the analysis. In the latter part, a concrete example of a sodium-cooled, metal-fuel, 2000 MWt CANDLE core is presented, assuming recladding as an emerging but indispensable technology. The core satisfies engineering design criteria including cladding temperature, pressure drop, linear heat rate, cumulative damage fraction (CDF) of cladding, fast neutron fluence and sodium void reactivity, as defined in the Japanese FBR design project. It can be concluded that it is feasible to design a CANDLE core using conventional codes while satisfying realistic engineering design constraints, provided that recladding at a certain time interval is technically feasible.

  1. Appendix B: Rapid development approaches for system engineering and design

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Conventional processes often produce systems which are obsolete before they are fielded. This paper explores some of the reasons for this, and provides a vision of how we can do better. This vision is based on our explorations in improved processes and system/software engineering tools.

  2. From conventional towards new - natural surfactants in drug delivery systems design: current status and perspectives.

    PubMed

    Savić, Snezana; Tamburić, Slobodanka; Savić, Miroslav M

    2010-03-01

    Surfactants play an important role in the development of both conventional and advanced (colloidal) drug delivery systems. There are several commercial surfactants, but a proportionally small group of them is approved as pharmaceutical excipients, recognized in various pharmacopoeias and therefore widely accepted by the pharmaceutical industry. The review covers some of the main categories of natural, sugar-based surfactants (alkyl polyglucosides and sugar esters) as prospective pharmaceutical excipients. It provides analysis of the physicochemical characteristics of sugar-based surfactants and their possible roles in the design of conventional or advanced drug delivery systems for different routes of administration. Summary and analysis of recent data on functionality, applied concentrations and formulation improvements produced by alkyl polyglucosides and sugar esters in different conventional and advanced delivery systems could be of interest to researchers dealing with drug formulation. Recent FDA certification of an alkyl polyglucoside surfactant for topical formulation presents a significant step in the process of recognition of this relatively new group of surfactants. This could trigger further research into the potential benefits of naturally derived materials in both conventional and new drug delivery systems.

  3. Developing a workstation-based, real-time simulation for rapid handling qualities evaluations during design

    NASA Technical Reports Server (NTRS)

    Anderson, Frederick; Biezad, Daniel J.

    1994-01-01

    This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the AirCraft SYNThesis (ACSYNT) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are then linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students, who view it as a significant advance over prior methods.

  4. Recent advancements in low cost solar cell processing

    NASA Technical Reports Server (NTRS)

    Ralph, E. L.

    1975-01-01

    A proof-of-concept solar cell process has been developed that is adaptable to automation. This involved the development of a new contact system, a new antireflection coating system, a drift field cell design and a new contoured surface treatment. All these processes are performed without the use of vacuum chambers and expensive masking techniques, thus providing the possibility of reduced costs by automation using conventional semiconductor processing machinery. The contacts were printed on the cells by conventional silk screen machinery. The P(+) back field was formed by diffusing in aluminum from a printed aluminum back contact. The antireflection coating was formed by spinning on and baking a TiO2-SiO2 glass film. Air-mass-zero efficiencies of over 10% were achieved using this completely vacuum-free process.

  5. Next Generation Non-Vacuum, Maskless, Low Temperature Nanoparticle Ink Laser Digital Direct Metal Patterning for a Large Area Flexible Electronics

    PubMed Central

    Yeo, Junyeob; Hong, Sukjoon; Lee, Daehoo; Hotz, Nico; Lee, Ming-Tsang; Grigoropoulos, Costas P.; Ko, Seung Hwan

    2012-01-01

    Flexible electronics opened a new class of future electronics. The foldable, light and durable nature of flexible electronics allows vast flexibility in applications such as display, energy devices and mobile electronics. Even though conventional electronics fabrication methods are well developed for rigid substrates, direct application or slight modification of conventional processes for flexible electronics fabrication cannot work. The future flexible electronics fabrication requires totally new low-temperature process development optimized for flexible substrate and it should be based on new material too. Here we present a simple approach to developing a flexible electronics fabrication without using conventional vacuum deposition and photolithography. We found that direct metal patterning based on laser-induced local melting of metal nanoparticle ink is a promising low-temperature alternative to vacuum deposition– and photolithography-based conventional metal patterning processes. The “digital” nature of the proposed direct metal patterning process removes the need for expensive photomask and allows easy design modification and short turnaround time. This new process can be extremely useful for current small-volume, large-variety manufacturing paradigms. Besides, simple, scalable, fast and low-temperature processes can lead to cost-effective fabrication methods on a large-area polymer substrate. The developed process was successfully applied to demonstrate high-quality Ag patterning (2.1 µΩ·cm) and high-performance flexible organic field effect transistor arrays. PMID:22900011

  6. Next generation non-vacuum, maskless, low temperature nanoparticle ink laser digital direct metal patterning for a large area flexible electronics.

    PubMed

    Yeo, Junyeob; Hong, Sukjoon; Lee, Daehoo; Hotz, Nico; Lee, Ming-Tsang; Grigoropoulos, Costas P; Ko, Seung Hwan

    2012-01-01

    Flexible electronics opened a new class of future electronics. The foldable, light and durable nature of flexible electronics allows vast flexibility in applications such as display, energy devices and mobile electronics. Even though conventional electronics fabrication methods are well developed for rigid substrates, direct application or slight modification of conventional processes for flexible electronics fabrication cannot work. The future flexible electronics fabrication requires totally new low-temperature process development optimized for flexible substrate and it should be based on new material too. Here we present a simple approach to developing a flexible electronics fabrication without using conventional vacuum deposition and photolithography. We found that direct metal patterning based on laser-induced local melting of metal nanoparticle ink is a promising low-temperature alternative to vacuum deposition- and photolithography-based conventional metal patterning processes. The "digital" nature of the proposed direct metal patterning process removes the need for expensive photomask and allows easy design modification and short turnaround time. This new process can be extremely useful for current small-volume, large-variety manufacturing paradigms. Besides, simple, scalable, fast and low-temperature processes can lead to cost-effective fabrication methods on a large-area polymer substrate. The developed process was successfully applied to demonstrate high-quality Ag patterning (2.1 µΩ·cm) and high-performance flexible organic field effect transistor arrays.

  7. Strategic Co-Location in a Hybrid Process Involving Desalination and Pressure Retarded Osmosis (PRO)

    PubMed Central

    Sim, Victor S.T.; She, Qianhong; Chong, Tzyy Haur; Tang, Chuyang Y.; Fane, Anthony G.; Krantz, William B.

    2013-01-01

    This paper focuses on a Hybrid Process that uses feed salinity dilution and osmotic power recovery from Pressure Retarded Osmosis (PRO) to achieve higher overall water recovery. This reduces the energy consumption and capital costs of conventional seawater desalination and water reuse processes. The Hybrid Process increases the amount of water recovered from the current 66.7% for conventional seawater desalination and water reuse processes to a potential 80% through the use of reclaimed water brine as an impaired water source. A reduction of up to 23% in energy consumption is projected via the Hybrid Process. The attractiveness is amplified by potential capital cost savings ranging from 8.7%–20% compared to conventional designs of seawater desalination plants. A decision matrix in the form of a customizable scorecard is introduced for evaluating a Hybrid Process based on the importance of land space, capital costs, energy consumption and membrane fouling. This study provides a new perspective, looking at processes not as individual systems but as a whole utilizing strategic co-location to unlock the synergies available in the water-energy nexus for more sustainable desalination. PMID:24956940

  8. The Effect of Adaptive Nonlinear Frequency Compression on Phoneme Perception.

    PubMed

    Glista, Danielle; Hawkins, Marianne; Bohnert, Andrea; Rehmann, Julia; Wolfe, Jace; Scollie, Susan

    2017-12-12

    This study implemented a fitting method, developed for use with frequency lowering hearing aids, across multiple testing sites, participants, and hearing aid conditions to evaluate speech perception with a novel type of frequency lowering. A total of 8 participants, including children and young adults, participated in real-world hearing aid trials. A blinded crossover design, including posttrial withdrawal testing, was used to assess aided phoneme perception. The hearing aid conditions included adaptive nonlinear frequency compression (NFC), static NFC, and conventional processing. Enabling either adaptive NFC or static NFC improved group-level detection and recognition results for some high-frequency phonemes, when compared with conventional processing. Mean results for the distinction component of the Phoneme Perception Test (Schmitt, Winkler, Boretzki, & Holube, 2016) were similar to those obtained with conventional processing. Findings suggest that both types of NFC tested in this study provided a similar amount of speech perception benefit, when compared with group-level performance with conventional hearing aid technology. Individual-level results are presented with discussion around patterns of results that differ from the group average.

  9. FOCIS: A forest classification and inventory system using LANDSAT and digital terrain data

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Franklin, J.; Woodcook, C. E.; Logan, T. L.

    1981-01-01

    Accurate, cost-effective stratification of forest vegetation and timber inventory is the primary goal of the Forest Classification and Inventory System (FOCIS). Conventional timber stratification using photointerpretation can be time-consuming, costly, and inconsistent from analyst to analyst. FOCIS was designed to overcome these problems by using machine processing techniques to extract and process tonal, textural, and terrain information from registered LANDSAT multispectral and digital terrain data. Comparison of samples from timber strata identified by FOCIS and by conventional procedures showed that both have about the same potential to reduce the variance of timber volume estimates over simple random sampling.

  10. The design of wavefront coded imaging system

    NASA Astrophysics Data System (ADS)

    Lan, Shun; Cen, Zhaofeng; Li, Xiaotong

    2016-10-01

    Wavefront coding is a method of extending the depth of field that combines optical design and signal processing. Using the optical design software ZEMAX, we designed a practical wavefront-coded imaging system based on a conventional Cooke triplet. Unlike a conventional optical system, the wavefront of this new system is modulated by a specially designed phase mask, which makes the point spread function (PSF) of the optical system insensitive to defocus. Therefore, a series of similarly blurred images is obtained at the image plane. In addition, the optical transfer function (OTF) of the wavefront-coded imaging system is independent of focus: it is nearly constant with misfocus and has no regions of zeros. All object information can therefore be recovered through digital filtering at different defocus positions. The focus invariance of the MTF is selected as the merit function in this design, and the coefficients of the phase mask are set as optimization variables. Compared with the conventional optical system, the wavefront-coded imaging system obtains better-quality images over a range of object distances. Some deficiencies appear in the restored images due to the digital filtering algorithm; these are also analyzed in this paper. The depth of field of the designed wavefront-coded imaging system is about 28 times larger than that of the initial optical system, while keeping high optical power and resolution at the image plane.
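    A small numerical sketch of the defocus-invariance idea follows: it compares an MTF slice of a clear pupil with that of a cubic-phase pupil (a common wavefront-coding mask, used here as a stand-in for the specially designed mask in the paper) for several amounts of defocus. The grid size, phase-mask strength and defocus values are arbitrary illustration choices.

    ```python
    import numpy as np

    N = 512
    x = np.linspace(-2.0, 2.0, N)          # pupil plane; unit-radius pupil on a zero-padded grid
    X, Y = np.meshgrid(x, x)
    pupil = (X**2 + Y**2) <= 1.0

    def mtf_slice(alpha_waves, defocus_waves):
        """Horizontal MTF slice for a pupil with cubic phase (alpha) and defocus (W20), in waves."""
        phase = alpha_waves * (X**3 + Y**3) + defocus_waves * (X**2 + Y**2)
        P = pupil * np.exp(2j * np.pi * phase)
        psf = np.abs(np.fft.fft2(P)) ** 2            # incoherent PSF
        mtf = np.abs(np.fft.fft2(psf))
        mtf /= mtf[0, 0]                             # normalize to unity at zero frequency
        return np.fft.fftshift(mtf)[N // 2]          # row through zero frequency

    mid = N // 2 + N // 8                            # a mid-band spatial frequency sample
    for w20 in (0.0, 1.0, 2.0):
        clear = mtf_slice(0.0, w20)[mid]
        coded = mtf_slice(10.0, w20)[mid]
        print(f"defocus {w20:.0f} waves: clear-pupil MTF = {clear:.3f}, cubic-mask MTF = {coded:.3f}")
    ```

    The coded MTF is lower but varies far less with defocus, which is the property the subsequent digital filtering step exploits to restore a sharp image at any object distance.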

  11. Practical 3D Printing of Antennas and RF Electronics

    DTIC Science & Technology

    2017-03-01

    Passive RF; Combiners. Introduction: Additive manufacturing can reduce the time and material costs in a design cycle and enable the on-demand printing of... performance, and create Computer Assisted Manufacturing (CAM) files. By intelligently leveraging this process, the design can be readily updated or... advances in 3D printing technology now enable antennas and RF electronics to be designed and prototyped significantly faster than conventional

  12. Effect of enzyme concentration, addition of water and incubation time on increase in yield of starch from potato.

    PubMed

    Sit, Nandan; Agrawal, U S; Deka, Sankar C

    2014-05-01

    An enzymatic treatment process for starch extraction from potato was investigated using cellulase enzyme and compared with the conventional process. The effects of three parameters, cellulase enzyme concentration, incubation time and addition of water, were evaluated for increase in starch yield relative to the conventional process, i.e., without using enzyme. A two-level full factorial design was used to study the process. The results indicated that all the main parameters and their interactions were statistically significant. Enzyme concentration and incubation time had a positive effect on the increase in starch yield, while addition of water had a negative effect. The increase in starch yield over the conventional process ranged from 1.9%, at low enzyme concentration and incubation time with high addition of water, to a maximum of 70%, achieved when enzyme concentration and incubation time were high and addition of water was low, suggesting that the water already present in the ground potato meal is sufficient for the enzyme to reach the substrate within the slurry and ensure adequate contact.
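    A minimal sketch of how main effects and interactions are estimated from a two-level full factorial design of this kind is shown below. The coded factors follow the abstract (enzyme concentration, incubation time, added water), and the 1.9% and 70% corner values echo the reported extremes, but the remaining responses are invented placeholders, not the study's data.

    ```python
    import itertools
    import numpy as np

    # Coded levels: -1 = low, +1 = high, factor order = (enzyme, time, water)
    runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

    # Hypothetical yield increase (%) for each run, in the same order as `runs`;
    # only the 1.9 and 70 corner values echo the extremes quoted in the abstract.
    y = np.array([10.0, 1.9, 30.0, 15.0, 40.0, 25.0, 70.0, 50.0])

    labels = ["enzyme", "time", "water"]
    for j, name in enumerate(labels):
        # main effect = mean response at the high level minus mean response at the low level
        effect = y[runs[:, j] == 1].mean() - y[runs[:, j] == -1].mean()
        print(f"{name:>6} main effect: {effect:+6.1f} % yield increase")

    # Example two-factor interaction: enzyme x water
    ew = runs[:, 0] * runs[:, 2]
    print(f"enzyme*water interaction: {y[ew == 1].mean() - y[ew == -1].mean():+6.1f}")
    ```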

  13. DNA analysis using an integrated microchip for multiplex PCR amplification and electrophoresis for reference samples.

    PubMed

    Le Roux, Delphine; Root, Brian E; Reedy, Carmen R; Hickey, Jeffrey A; Scott, Orion N; Bienvenue, Joan M; Landers, James P; Chassagne, Luc; de Mazancourt, Philippe

    2014-08-19

    A system that automatically performs the PCR amplification and microchip electrophoretic (ME) separation for rapid forensic short tandem repeat (STR) profiling in a single disposable plastic chip is demonstrated. The microchip subassays were optimized to deliver results comparable to conventional benchtop methods. The microchip process was accomplished in sub-90 min compared with >2.5 h for the conventional approach. An infrared laser with a noncontact temperature sensing system was optimized for a 45 min PCR compared with the conventional 90 min amplification time. The separation conditions were optimized using LPA-co-dihexylacrylamide block copolymers specifically designed for microchip separations to achieve accurate DNA size calling in an effective length of 7 cm in a plastic microchip. This effective separation length is less than half of other reports for integrated STR analysis and allows a compact, inexpensive microchip design. This separation quality was maintained when integrated with microchip PCR. Thirty samples were analyzed conventionally and then compared with data generated by the microfluidic chip system. The microfluidic system allele calling was 100% concordant with the conventional process. This study also investigated allelic ladder consistency over time. The PCR-ME genetic profiles were analyzed using binning palettes generated from two sets of allelic ladders run three and six months apart. Using these binning palettes, no allele calling errors were detected in the 30 samples, demonstrating that a microfluidic platform can be highly consistent over long periods of time.

  14. Rapid prototyping of multi-scale biomedical microdevices by combining additive manufacturing technologies.

    PubMed

    Hengsbach, Stefan; Lantada, Andrés Díaz

    2014-08-01

    The possibility of designing and manufacturing biomedical microdevices with multiple length-scale geometries can help to promote special interactions both with their environment and with surrounding biological systems. These interactions aim to enhance biocompatibility and overall performance by using biomimetic approaches. In this paper, we present a design and manufacturing procedure for obtaining multi-scale biomedical microsystems based on the combination of two additive manufacturing processes: a conventional laser writer to manufacture the overall device structure, and a direct-laser writer based on two-photon polymerization to yield finer details. The process excels for its versatility, accuracy and manufacturing speed and allows for the manufacture of microsystems and implants with overall sizes up to several millimeters and with details down to sub-micrometric structures. As an application example we have focused on manufacturing a biomedical microsystem to analyze the impact of microtextured surfaces on cell motility. This process yielded a relevant increase in precision and manufacturing speed when compared with more conventional rapid prototyping procedures.

  15. Conventions and workflows for using Situs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wriggers, Willy, E-mail: wriggers@biomachina.org

    2012-04-01

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed.

  16. Production of durable expanded perlite microspheres in a Vertical Electrical Furnace

    NASA Astrophysics Data System (ADS)

    Panagiotis, M.; Angelopoulos, P.; Taxiarchou, M.; Paspaliaris, I.

    2016-04-01

    Expanded perlite constitutes one of the most competitive insulating materials and is widely used in the construction and manufacturing industries due to its unique combination of properties: it is white, natural, lightweight and chemically inert, and it exhibits superior insulating properties (thermal and acoustic) and fire resistance. Conventionally, perlite expansion is performed in vertical gas-fired furnaces; this conventional expansion process has certain disadvantages which affect expanded product quality, thus limiting performance and range of applications. In order to overcome the drawbacks of the conventional expansion technique, a new perlite expansion process has been designed based on a vertical electrical furnace (VEF). In the current study, fine perlite samples (-150 μm) from Milos Island, Greece, were expanded in the novel VEF and in a conventional gas-fired furnace with the aim of evaluating and comparing the main physical properties of the expanded products. The novel expanded perlite particles were characterised by superior properties, namely increased compression strength, competitive water and oil absorption capability, size homogeneity, spherical shape and decreased surface porosity in comparison to conventionally expanded samples.

  17. WORKING PARK-FUEL CELL COMBINED HEAT AND POWER SYSTEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan Jones

    2003-09-01

    This report covers the aims and objectives of the project, which was to design, install and operate a fuel cell combined heat and power (CHP) system in Woking Park, the first fuel cell CHP system in the United Kingdom. The report also covers the benefits that were expected to accrue from the work in an understanding of the full technology procurement process (including planning, design, installation, operation and maintenance), the economic and environmental performance in comparison with both conventional UK fuel supply and conventional CHP, and the commercial viability of fuel cell CHP energy supply in the new deregulated energy markets.

  18. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial.

    PubMed

    Joda, Tim; Brägger, Urs

    2015-01-01

    To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using a conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was calculated with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis for the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, with a significant decrease of 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
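    The comparison rests on a Wilcoxon rank sum test of per-patient production times. The sketch below runs the same kind of test on invented timing data (20 values per workflow, centred on the reported means and standard deviations) purely to show the mechanics of the analysis, not to reproduce the study's result.

    ```python
    import numpy as np
    from scipy.stats import ranksums

    rng = np.random.default_rng(0)

    # Invented total production times (minutes), 20 patients per workflow,
    # centred on the means/SDs reported in the abstract (185.4 +/- 17.9 vs 223.0 +/- 26.2).
    digital = rng.normal(185.4, 17.9, size=20)
    conventional = rng.normal(223.0, 26.2, size=20)

    stat, p = ranksums(digital, conventional)
    print(f"digital mean {digital.mean():.1f} min, conventional mean {conventional.mean():.1f} min")
    print(f"Wilcoxon rank-sum statistic {stat:.2f}, p = {p:.4g}")
    ```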

  19. [Establishment of design space for production process of traditional Chinese medicine preparation].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Qiao, Yan-Jiang; Wu, Zhi-Sheng; Lin, Zhao-Zhou

    2013-03-01

    The philosophy of quality by design (QbD) is now leading the changes in the drug manufacturing mode from the conventional test-based approach to the science and risk based approach focusing on the detailed research and understanding of the production process. Along with the constant deepening of the understanding of the manufacturing process, the design space will be determined, and the emphasis of quality control will be shifted from the quality standards to the design space. Therefore, the establishment of the design space is core step in the implementation of QbD, and it is of great importance to study the methods for building the design space. This essay proposes the concept of design space for the production process of traditional Chinese medicine (TCM) preparations, gives a systematic introduction of the concept of the design space, analyzes the feasibility and significance to build the design space in the production process of traditional Chinese medicine preparations, and proposes study approaches on the basis of examples that comply with the characteristics of traditional Chinese medicine preparations, as well as future study orientations.

  20. Enculturating Seamless Language Learning through Artifact Creation and Social Interaction Process

    ERIC Educational Resources Information Center

    Wong, Lung-Hsiang; Chai, Ching Sing; Aw, Guat Poh; King, Ronnel B.

    2015-01-01

    This paper reports a design-based research (DBR) cycle of MyCLOUD (My Chinese ubiquitOUs learning Days). MyCLOUD is a seamless language learning model that addresses identified limitations of conventional Chinese language teaching, such as the decontextualized and unauthentic learning processes that usually hinder reflection and deep learning.…

  1. Reflectance measurements

    NASA Technical Reports Server (NTRS)

    Brown, R. A.

    1982-01-01

    The productivity of spectroreflectometer equipment and operating personnel and the accuracy and sensitivity of the measurements were investigated. Work was directed at increasing optical sensitivity and at better design of the data collection and processing scheme to eliminate some unnecessary existing operations. Two promising approaches to increased sensitivity were identified: conventional processing with error compensation, and detection of random noise modulation.

  2. Passive-solar homes for Texas

    NASA Astrophysics Data System (ADS)

    Garrison, M. L.

    1982-06-01

    Acceptance of passive solar technologies has been slow within the conventional building trades in Texas because it is a common misconception that solar is expensive, and data on local applications is severely limited or nonexistent. It is the purpose of this solar development to move passive solar design into the mainstream of public acceptance by helping to overcome and eliminate these barriers. Specifically, the goal is to develop a set of regional climatic building standards to help guide the conventional building trade toward the utilization of soft energy systems which will reduce overall consumption at a price and convenience most Texans can afford. To meet this objective, eight sample passive design structures are presented. These designs represent state of the art regional applications of passive solar space conditioning. The methodology used in the passive solar design process included: analysis of regional climatic data; analysis of historical regional building prototypes; determination of regional climatic design priorities and assets; prototypical design models for the discretionary housing market; quantitative thermal analysis of prototypical designs; and construction drawings of building prototypes.

  3. Application of Optical Coherence Tomography Freeze-Drying Microscopy for Designing Lyophilization Process and Its Impact on Process Efficiency and Product Quality.

    PubMed

    Korang-Yeboah, Maxwell; Srinivasan, Charudharshini; Siddiqui, Akhtar; Awotwe-Otoo, David; Cruz, Celia N; Muhammad, Ashraf

    2018-01-01

    Optical coherence tomography freeze-drying microscopy (OCT-FDM) is a novel technique that allows the three-dimensional imaging of a drug product during the entire lyophilization process. OCT-FDM consists of a single-vial freeze dryer (SVFD) affixed with an optical coherence tomography (OCT) imaging system. Unlike the conventional techniques, such as modulated differential scanning calorimetry (mDSC) and light transmission freeze-drying microscopy, used for predicting the product collapse temperature (Tc), the OCT-FDM approach seeks to mimic the actual product and process conditions during the lyophilization process. However, there is limited understanding of the application of this emerging technique to the design of the lyophilization process. In this study, we investigated the suitability of the OCT-FDM technique for designing a lyophilization process. Moreover, we compared the quality attributes of the resulting lyophilized product manufactured using Tc, a critical process control parameter, as determined by OCT-FDM versus as estimated by mDSC. OCT-FDM analysis revealed the absence of collapse even for the low protein concentration (5 mg/ml), low solid content (1% w/v) formulation studied. This was confirmed by lab-scale lyophilization. In addition, lyophilization cycles designed using Tc values obtained from OCT-FDM were more efficient, with higher sublimation rates and mass flux than the conventional cycles, since drying was conducted at a higher shelf temperature. Finally, the quality attributes of the products lyophilized using Tc determined by OCT-FDM and by mDSC were similar, and product shrinkage and cracks were observed in all batches of freeze-dried products irrespective of the technique employed in predicting Tc.

  4. Sorption enhanced reaction process (SERP) for the production of hydrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufton, J.; Mayorga, S.; Gaffney, T.

    1998-08-01

    The novel Sorption Enhanced Reaction Process has the potential to decrease the cost of hydrogen production by steam methane reforming. Current effort for development of this technology has focused on adsorbent development, experimental process concept testing, and process development and design. A preferred CO{sub 2} adsorbent, K{sub 2}CO{sub 3} promoted hydrotalcite, satisfies all of the performance targets and it has been scaled up for process testing. A separate class of adsorbents has been identified which could potentially improve the performance of the H{sub 2}-SER process. Although this material exhibits improved CO{sub 2} adsorption capacity compared to the HTC adsorbent, its hydrothermal stability must be improved. Single-step process experiments (not cyclic) indicate that the H{sub 2}-SER reactor performance during the reaction step improves with decreasing pressure and increasing temperature and steam to methane ratio in the feed. Methane conversion in the H{sub 2}-SER reactor is higher than for a conventional catalyst-only reactor operated at similar temperature and pressure. The reactor effluent gas consists of 90+% H{sub 2}, balance CH{sub 4}, with only trace levels (< 50 ppm) of carbon oxides. A best-case process design (2.5 MMSCFD of 99.9+% H{sub 2}) based on the HTC adsorbent properties and a revised SER process cycle has been generated. Economic analysis of this design indicates the process has the potential to reduce the H{sub 2} product cost by 25-31% compared to conventional steam methane reforming.

  5. Decisionmaking in practice: The dynamics of muddling through.

    PubMed

    Flach, John M; Feufel, Markus A; Reynolds, Peter L; Parker, Sarah Henrickson; Kellogg, Kathryn M

    2017-09-01

    An alternative to conventional models that treat decisions as open-loop independent choices is presented. The alterative model is based on observations of work situations such as healthcare, where decisionmaking is more typically a closed-loop, dynamic, problem-solving process. The article suggests five important distinctions between the processes assumed by conventional models and the reality of decisionmaking in practice. It is suggested that the logic of abduction in the form of an adaptive, muddling through process is more consistent with the realities of practice in domains such as healthcare. The practical implication is that the design goal should not be to improve consistency with normative models of rationality, but to tune the representations guiding the muddling process to increase functional perspicacity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Investigation of coherent receiver designs in high-speed optical inter-satellite links using digital signal processing

    NASA Astrophysics Data System (ADS)

    Schaefer, S.; Gregory, M.; Rosenkranz, W.

    2017-09-01

    Due to higher data rates, better data security and unlicensed spectral usage, optical inter-satellite links (OISL) offer an attractive alternative to conventional RF communication. However, the very long transmission distances necessitate an optical receiver design that enables high receiver sensitivity, which requires careful carrier synchronization and a quasi-coherent detection scheme.

  7. Glyphosate-tolerant soybeans remain compositionally equivalent to conventional soybeans (Glycine max L.) during three years of field testing.

    PubMed

    McCann, Melinda C; Liu, Keshun; Trujillo, William A; Dobert, Raymond C

    2005-06-29

    Previous studies have shown that the composition of glyphosate-tolerant soybeans (GTS) and selected processed fractions was substantially equivalent to that of conventional soybeans over a wide range of analytes. This study was designed to determine if the composition of GTS remains substantially equivalent to conventional soybeans over the course of several years and when introduced into multiple genetic backgrounds. Soybean seed samples of both GTS and conventional varieties were harvested during 2000, 2001, and 2002 and analyzed for the levels of proximates, lectin, trypsin inhibitor, and isoflavones. The measured analytes are representative of the basic nutritional and biologically active components in soybeans. Results show a similar range of natural variability for the GTS soybeans as well as conventional soybeans. It was concluded that the composition of commercial GTS over the three years of breeding into multiple varieties remains equivalent to that of conventional soybeans.

  8. An In-Depth Review on Direct Additive Manufacturing of Metals

    NASA Astrophysics Data System (ADS)

    Azam, Farooq I.; Rani, Ahmad Majdi Abdul; Altaf, Khurram; Rao, T. V. V. L. N.; Aimi Zaharin, Haizum

    2018-03-01

    Additive manufacturing (AM), also known as 3D Printing, is a revolutionary manufacturing technique which has been developing rapidly over the last 30 years. The evolution of this precision manufacturing process from rapid prototyping to ready-to-use parts has significantly alleviated manufacturing constraints and greatly widened design freedom. AM is a non-conventional manufacturing technique which uses 3D CAD model data to build parts by adding one material layer at a time, rather than removing material, and it meets the demand for parts with complex geometric shapes and great dimensional accuracy that are easy to assemble. Additive manufacturing of metals has become an area of extensive research, progressing towards the production of final products and replacing conventional manufacturing methods. This paper provides an insight into the available metal additive manufacturing technologies that can be used to produce end-user products without conventional manufacturing methods. The paper also includes a comparison of the mechanical and physical properties of parts produced by AM with those of parts manufactured using conventional processes.

  9. Application of Multi-Threshold NULL Convention Logic to Adaptive Beamforming Circuits for Ultra-Low Power

    DTIC Science & Technology

    2016-03-31

    Abstract: With the decrease of transistor feature sizes into the ultra-deep submicron range, leakage power becomes an important design challenge for... MTNCL design showed substantial improvements in terms of active energy and leakage power compared to the equivalent synchronous design. Keywords... switching could use a large portion of power. Additionally, leakage power has come to dominate power consumption as process sizes shrink. Adaptive

  10. Optimization of rotor shaft shrink fit method for motor using "Robust design"

    NASA Astrophysics Data System (ADS)

    Toma, Eiji

    2018-01-01

    This research is a collaborative investigation with a general-purpose motor manufacturer. To review the construction method used in the production process, we applied the parameter design method of quality engineering and approached the optimization of the construction method. Conventionally, a press-fitting method has been adopted in the process of fitting the rotor core and the shaft, which are the main components of the motor, but quality defects such as core-shaft deflection occurred at the time of press fitting. In this research, optimization of a "shrink-fitting method by high-frequency induction heating", devised as a new construction method, showed that the method is feasible and made it possible to extract the optimum processing conditions.
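    A minimal sketch of the parameter-design bookkeeping follows: for each candidate combination of shrink-fit factors, replicate measurements of core-shaft deflection are summarized by a smaller-the-better signal-to-noise ratio, and the combination with the highest S/N is preferred. The factor names, levels and deflection values are invented placeholders; the record does not give the actual factors or orthogonal-array layout used in the study.

    ```python
    import numpy as np

    def sn_smaller_the_better(y):
        """Taguchi smaller-the-better S/N ratio in dB: -10*log10(mean(y^2))."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    # Hypothetical core-shaft deflection measurements (micrometres) for three
    # candidate combinations of heating temperature / heating time / interference fit.
    trials = {
        "A: 250 C, 20 s, 30 um": [18.0, 22.0, 20.0],
        "B: 300 C, 15 s, 30 um": [9.0, 11.0, 10.0],
        "C: 300 C, 25 s, 40 um": [12.0, 15.0, 14.0],
    }

    for name, deflections in trials.items():
        print(f"{name}: S/N = {sn_smaller_the_better(deflections):6.2f} dB")

    best = max(trials, key=lambda k: sn_smaller_the_better(trials[k]))
    print("preferred setting:", best)
    ```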

  11. Exploring open innovation with a patient focus in drug discovery: an evolving paradigm of patient engagement.

    PubMed

    Allarakhia, Minna

    2015-06-01

    It is suggested in this article that patient engagement should occur further upstream during the drug discovery stage. 'Lead patients', namely those patients who are proactive with respect to their health, possess knowledge of their disease and resulting symptoms. They are also well informed about the conventional as well as non-conventional treatments for disease management; and so can provide a nuanced perspective to drug design. Understanding how patients view the management of their diseases and how they view the use of conventional versus non-conventional interventions is of imperative importance to researchers. Indeed, this can provide insight into how conventional treatments might be designed from the outset to encourage compliance and positive health outcomes. Consequently, a continuum of lead patient engagement is employed that focuses on drug discovery processes ranging from participative, informative to collaborative engagement. This article looks at a variety of open innovation models that are currently employed across this engagement spectrum. It is no longer sufficient for industry stakeholders to consider conventional therapies as the only mechanisms being sought after by patients. Without patient engagement, the industry risks being re-prioritized in terms of its role in the patient journey towards not only recovery of health, but also sustained health and wellness before disease onset.

  12. Development of an Ointment Formulation Using Hot-Melt Extrusion Technology.

    PubMed

    Bhagurkar, Ajinkya M; Angamuthu, Muralikrishnan; Patil, Hemlata; Tiwari, Roshan V; Maurya, Abhijeet; Hashemnejad, Seyed Meysam; Kundu, Santanu; Murthy, S Narasimha; Repka, Michael A

    2016-02-01

    Ointments are generally prepared either by fusion or by levigation methods. The current study proposes the use of hot-melt extrusion (HME) processing for the preparation of a polyethylene glycol base ointment. Lidocaine was used as a model drug. A modified screw design was used in this process, and parameters such as feeding rate, barrel temperature, and screw speed were optimized to obtain a uniform product. The product characteristics were compared with an ointment of similar composition prepared by conventional fusion method. The rheological properties, drug release profile, and texture characteristics of the hot-melt extruded product were similar to the conventionally prepared product. This study demonstrates a novel application of the hot-melt extrusion process in the manufacturing of topical semi-solids.

  13. High efficiency solar cell processing

    NASA Technical Reports Server (NTRS)

    Ho, F.; Iles, P. A.

    1985-01-01

    At the time of writing, cells made by several groups are approaching 19% efficiency. General aspects of the processing required for such cells are discussed. Most processing used for high efficiency cells is derived from space-cell or concentrator cell technology, and recent advances have been obtained from improved techniques rather than from better understanding of the limiting mechanisms. Theory and modeling are fairly well developed, and adequate to guide further asymptotic increases in performance of near conventional cells. There are several competitive cell designs with promise of higher performance ( 20%) but for these designs further improvements are required. The available cell processing technology to fabricate high efficiency cells is examined.

  14. Pattern centric design based sensitive patterns and process monitor in manufacturing

    NASA Astrophysics Data System (ADS)

    Hsiang, Chingyun; Cheng, Guojie; Wu, Kechih

    2017-03-01

    As design rules migrate to smaller dimensions, the allowable process variation becomes tighter than ever and challenges the limits of device yield. Masks, lithography, etching and other processes have to meet very tight specifications in order to keep defects and CD within the margins of the process window. Conventionally, inspection and metrology equipment is used to monitor and control wafer quality in-line. In high-throughput optical inspection, nuisance filtering and review classification become tedious, labor-intensive tasks in manufacturing. High-resolution SEM images are taken to validate defects after optical inspection; these images capture not only the point highlighted by optical inspection but also its surrounding patterns. However, this pattern information is not well utilized in the conventional quality control method. A complementary design-based pattern monitor not only monitors and analyzes pattern sensitivity to variation but also reduces nuisance and highlights defective patterns and killer defects. After grouping, in either single or multiple layers, systematic defects can be identified quickly in this flow. In this paper, we applied design-based pattern monitoring on different layers to monitor the impact of process variation on all kinds of patterns. First, the contour of the high-resolution SEM image is extracted and aligned to the design with offset adjustment and fine alignment [1]. Second, specified pattern rules are applied to the design clip area, the same size as the SEM image, to form POI (pattern of interest) areas. Third, the discrepancy between contour and design is measured for different pattern types in measurement blocks. Fourth, defective patterns are reported according to discrepancy detection criteria and pattern grouping [4]. The reported pattern defects are ranked by count and by severity of discrepancy; in this step, process-sensitive, highly repeatable systematic defects can be identified quickly. Through this design-based process pattern monitoring method, most optical inspection nuisances can be filtered out at the contour-to-design discrepancy measurement. Daily analysis results are stored in a database as a reference for comparison with incoming data. A defective pattern library contains existing and known systematic defect patterns, which helps to catch and identify new pattern defects or process impacts. In addition, this defect pattern library provides valuable information for mask, pattern and defect verification, inspection care area generation, further OPC fixes, and process enhancement and investigation.

  15. Comparison of performance and operation of side-by-side integrated fixed-film and conventional activated sludge processes at demonstration scale.

    PubMed

    Stricker, Anne-Emmanuelle; Barrie, Ashley; Maas, Carol L A; Fernandes, William; Lishman, Lori

    2009-03-01

    A full-scale demonstration of an integrated fixed-film activated sludge (IFFAS) process with floating carriers has been conducted in Ontario, Canada, since August 2003. In this study, data collected on-site from July 2005 to December 2006 are analyzed and compared with the performance of a conventional activated sludge train operated in parallel. Both trains received similar loadings and maintained comparable mixed liquor concentrations; however, the IFFAS had 50% more biomass when the attached growth was considered. In the winter, the conventional train operated at the critical solids retention time (SRT) and had fluctuating partial nitrification. The IFFAS nitrified more consistently and had a doubled average capacity. In the summer, the suspended SRT was less limiting, and the benefit of IFFAS for nitrification was marginal. The lessons learned from the operational requirements and challenges of the IFFAS process (air flow, carrier management, and seasonal foaming) are discussed, and design recommendations are proposed for whole plant retrofit.

  16. Pre-processing SAR image stream to facilitate compression for transport on bandwidth-limited-link

    DOEpatents

    Rush, Bobby G.; Riley, Robert

    2015-09-29

    Pre-processing is applied to a raw VideoSAR (or similar near-video rate) product to transform the image frame sequence into a product that resembles more closely the type of product for which conventional video codecs are designed, while sufficiently maintaining utility and visual quality of the product delivered by the codec.

  17. Design of a novel freeform lens for LED uniform illumination and conformal phosphor coating.

    PubMed

    Hu, Run; Luo, Xiaobing; Zheng, Huai; Qin, Zong; Gan, Zhiqiang; Wu, Bulong; Liu, Sheng

    2012-06-18

    A conformal phosphor coating can realize a phosphor layer with uniform thickness, which could enhance the angular color uniformity (ACU) of light-emitting diode (LED) packaging. In this study, a novel freeform lens was designed for simultaneous realization of LED uniform illumination and conformal phosphor coating. The detailed algorithm of the design method, which involves an extended light source and double refractions, was presented. The packaging configuration of the LED modules and the modeling of the light-conversion process were also presented. Monte Carlo ray-tracing simulations were conducted to validate the design method by comparisons with a conventional freeform lens. It is demonstrated that for the LED module with the present freeform lens, the illumination uniformity and ACU was 0.89 and 0.9283, respectively. The present freeform lens can realize equivalent illumination uniformity, but the angular color uniformity can be enhanced by 282.3% when compared with the conventional freeform lens.

  18. Synthetic-lattice enabled all-optical devices based on orbital angular momentum of light.

    PubMed

    Luo, Xi-Wang; Zhou, Xingxiang; Xu, Jin-Shi; Li, Chuan-Feng; Guo, Guang-Can; Zhang, Chuanwei; Zhou, Zheng-Wei

    2017-07-14

    All-optical photonic devices are crucial for many important photonic technologies and applications, ranging from optical communication to quantum information processing. Conventional design of all-optical devices is based on photon propagation and interference in real space, which may rely on large numbers of optical elements, and the requirement of precise control makes this approach challenging. Here we propose an unconventional route for engineering all-optical devices using the photon's internal degrees of freedom, which form photonic crystals in such synthetic dimensions for photon propagation and interference. We demonstrate this design concept by showing how important optical devices such as quantum memory and optical filters can be realized using synthetic orbital angular momentum (OAM) lattices in degenerate cavities. The design route utilizing synthetic photonic lattices may significantly reduce the requirement for numerous optical elements and their fine tuning in conventional design, paving the way for realistic all-optical photonic devices with novel functionalities.

  19. Standing wave design and experimental validation of a tandem simulated moving bed process for insulin purification.

    PubMed

    Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda

    2002-01-01

    A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.

  20. Cognitive dimensions of talim: evaluating weaving notation through cognitive dimensions (CDs) framework.

    PubMed

    Kaur, Gagan Deep

    2017-05-01

    The design process in Kashmiri carpet weaving is distributed over a number of actors and artifacts and is mediated by a weaving notation called talim. The script encodes the entire design in practice-specific symbols. This encoded script is decoded and interpreted via design-specific conventions by weavers to weave the design embedded in it. The cognitive properties of this notational system are described in the paper employing the cognitive dimensions (CDs) framework of Green (People and computers, Cambridge University Press, Cambridge, 1989) and Blackwell et al. (Cognitive technology: instruments of mind-CT 2001, LNAI 2117, Springer, Berlin, 2001). After an introduction to the practice, the design process is described in 'The design process' section, which includes the coding and decoding of talim. In the 'Cognitive dimensions of talim' section, after briefly discussing the CDs framework, the specific cognitive dimensions possessed by talim are described in detail.

  1. Enzymatic corn wet milling: engineering process and cost model

    PubMed Central

    Ramírez, Edna C; Johnston, David B; McAloon, Andrew J; Singh, Vijay

    2009-01-01

    Background Enzymatic corn wet milling (E-milling) is a process derived from conventional wet milling for the recovery and purification of starch and co-products using proteases to eliminate the need for sulfites and decrease the steeping time. In 2006, the total starch production in USA by conventional wet milling equaled 23 billion kilograms, including modified starches and starches used for sweeteners and ethanol production [1]. Process engineering and cost models for an E-milling process have been developed for a processing plant with a capacity of 2.54 million kg of corn per day (100,000 bu/day). These models are based on the previously published models for a traditional wet milling plant with the same capacity. The E-milling process includes grain cleaning, pretreatment, enzymatic treatment, germ separation and recovery, fiber separation and recovery, gluten separation and recovery and starch separation. Information for the development of the conventional models was obtained from a variety of technical sources including commercial wet milling companies, industry experts and equipment suppliers. Additional information for the present models was obtained from our own experience with the development of the E-milling process and trials in the laboratory and at the pilot plant scale. The models were developed using process and cost simulation software (SuperPro Designer®) and include processing information such as composition and flow rates of the various process streams, descriptions of the various unit operations and detailed breakdowns of the operating and capital cost of the facility. Results Based on the information from the model, we can estimate the cost of production per kilogram of starch using the input prices for corn, enzyme and other wet milling co-products. The work presented here describes the E-milling process and compares the process, the operation and costs with the conventional process. Conclusion The E-milling process was found to be cost competitive with the conventional process during periods of high corn feedstock costs since the enzymatic process enhances the yields of the products in a corn wet milling process. This model is available upon request from the authors for educational, research and non-commercial uses. PMID:19154623
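    The per-kilogram cost roll-up the model supports can be sketched in a few lines; every price, yield, and credit below is a hypothetical placeholder for illustration only, not a value from the published SuperPro Designer model, which additionally tracks full stream compositions and capital costs.

    # Back-of-the-envelope sketch of a per-kilogram starch production cost.
    # All numbers are hypothetical placeholders, not values from the model.
    def starch_cost_per_kg(corn_price_per_kg, enzyme_cost_per_kg_corn,
                           other_opex_per_kg_corn, coproduct_credit_per_kg_corn,
                           starch_yield_kg_per_kg_corn):
        """Net production cost of starch, expressed per kg of starch recovered."""
        net_cost_per_kg_corn = (corn_price_per_kg + enzyme_cost_per_kg_corn
                                + other_opex_per_kg_corn
                                - coproduct_credit_per_kg_corn)
        return net_cost_per_kg_corn / starch_yield_kg_per_kg_corn

    if __name__ == "__main__":
        # hypothetical inputs: corn $0.15/kg, enzyme $0.01, other opex $0.04,
        # co-product credit $0.06, starch yield 0.67 kg per kg corn
        print(round(starch_cost_per_kg(0.15, 0.01, 0.04, 0.06, 0.67), 3), "$ per kg starch")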

  2. New, high-efficiency ion trap mobility detection system for narcotics and explosives

    NASA Astrophysics Data System (ADS)

    McGann, William J.; Bradley, V.; Borsody, A.; Lepine, S.

    1994-10-01

    A new patented Ion Trap Mobility Spectrometer (ITMS) design is presented. Conventional IMS designs typically operate below 0.1% efficiency. This is due primarily to electric field driven, sample ion discharge on a shutter grid. Since 99.9% of the sample ions generated in the reaction region are lost in this discharge process, the sensitivity of conventional systems is limited. The new design provides greater detection efficiency than conventional designs through the use of an `ion trap' concept. The paper describes the plasma and sample ion dynamics in the reaction region of the new detector and discusses the advantages of utilizing a `field-free' space to generate sample ions with high efficiency. Fast electronic switching is described which is used to perturb the field-free space and pulse the sample ions into the drift region for separation and subsequent detection using pseudo real-time software for analysis and display of the data. Many applications for this new detector are now being considered including the detection of narcotics and explosives. Preliminary ion spectra, reduced mobility data and sensitivity data are presented for fifteen narcotics, including cocaine, THC and LSD.

  3. New high-efficiency ion-trap mobility detection system for narcotics

    NASA Astrophysics Data System (ADS)

    McGann, William J.

    1997-02-01

    A new patented Ion Trap Mobility Spectrometer design is presented. Conventional IMS designs typically operate below 0.1 percent efficiency. This is due primarily to electric field driven, sample ion discharge on a shutter grid. Since 99.9 percent of the sample ions generated in the reaction region are lost in this discharge process, the sensitivity of conventional systems is limited. The new design provides greater detection efficiency than conventional designs through the use of an 'ion trap' concept. The paper describes the plasma and sample ion dynamics in the reaction region of the new detector and discusses the advantages of utilizing a 'field-free' space to generate sample ions with high efficiency. Fast electronic switching is described which is used to perturb the field-free space and pulse the sample ions into the drift region for separation and subsequent detection using pseudo real-time software for analysis and display of the data. One application for this new detector is now being developed: a portable, hand-held system with switching capability for the detection of drugs and explosives. Preliminary ion spectra and sensitivity data are presented for cocaine and heroin using a hand sniffer configuration.

  4. New high-efficiency ion trap mobility detection system for narcotics and explosives

    NASA Astrophysics Data System (ADS)

    McGann, William J.; Jenkins, Anthony; Ribiero, K.; Napoli, J.

    1994-03-01

    A new patented ion trap mobility spectrometer design is presented. Conventional IMS designs typically operate below 0.1% efficiency. This is due primarily to electrical-field-driven, sample ion discharge on a shutter grid. Since 99.9% of the sample ions generated in the reaction region are lost in this discharge process, the sensitivity of conventional systems is limited. The new design provides greater detection efficiency than conventional designs through the use of an `ion trap' concept. The paper describes the plasma and sample ion dynamics in the reaction region of the new detector and discusses the advantages of utilizing a `field-free' space to generate sample ions with high efficiency. Fast electronic switching is described which is used to perturb the field-free space and pulse the sample ions into the drift region for separation and subsequent detection using pseudo real-time software for analysis and display of the data. Many applications for this new detector are now being considered including the detection of narcotics and explosives. Preliminary ion spectra, reduced mobility data and sensitivity data are presented for fifteen narcotics, including cocaine, THC, and LSD.

  5. SRAM As An Array Of Energetic-Ion Detectors

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G.; Blaes, Brent R.; Lieneweg, Udo; Nixon, Robert H.

    1993-01-01

    Static random-access memory (SRAM) designed for use as array of energetic-ion detectors. Exploits well-known tendency of incident energetic ions to cause bit flips in cells of electronic memories. Design of ion-detector SRAM involves modifications of standard SRAM design to increase sensitivity to ions. Device fabricated by use of conventional complementary metal oxide/semiconductor (CMOS) processes. Potential uses include gas densimetry, position sensing, and measurement of cosmic-ray spectrum.

  6. Morphology control in polymer blend fibers—a high throughput computing approach

    NASA Astrophysics Data System (ADS)

    Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar

    2016-08-01

    Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications, as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent based fabrication. We focus on a PS:PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for a specific morphology class. We finally demonstrate the potential for time-dependent processing conditions to obtain desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.
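    The sweep-classify-map workflow described above can be outlined as follows; the grid of conditions and the stand-in classifier are illustrative assumptions, whereas the actual study derives the morphology class from full phase-separation simulations of the PS:PMMA blend.

    # Sketch of a high-throughput sweep that maps processing conditions to a
    # morphology class and assembles a 'phase diagram'. classify_morphology is a
    # toy stand-in for the simulation-plus-analysis step used in the study.
    import itertools

    def classify_morphology(blend_ratio, evaporation_rate):
        """Toy rule returning a morphology class label for one condition."""
        return "class_1" if blend_ratio * evaporation_rate < 0.25 else "class_2"

    def build_phase_diagram(blend_ratios, evaporation_rates):
        return {(phi, k): classify_morphology(phi, k)
                for phi, k in itertools.product(blend_ratios, evaporation_rates)}

    if __name__ == "__main__":
        ratios = [0.2, 0.35, 0.5, 0.65, 0.8]   # PS fraction (hypothetical grid)
        rates = [0.1, 0.3, 0.5, 0.7, 0.9]      # normalized evaporation rate (hypothetical)
        for condition, label in sorted(build_phase_diagram(ratios, rates).items()):
            print(condition, "->", label)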

  7. Thermodynamic and economic analysis of heat pumps for energy recovery in industrial processes

    NASA Astrophysics Data System (ADS)

    Urdaneta-B, A. H.; Schmidt, P. S.

    1980-09-01

    A computer code has been developed for analyzing the thermodynamic performance, cost and economic return for heat pump applications in industrial heat recovery. Starting with basic defining characteristics of the waste heat stream and the desired heat sink, the algorithm first evaluates the potential for conventional heat recovery with heat exchangers, and if applicable, sizes the exchanger. A heat pump system is then designed to process the residual heating and cooling requirements of the streams. In configuring the heat pump, the program searches a number of parameters, including condenser temperature, evaporator temperature, and condenser and evaporator approaches. All system components are sized for each set of parameters, and economic return is estimated and compared with system economics for conventional processing of the heated and cooled streams (i.e., with process heaters and coolers). Two case studies are evaluated, one in a food processing application and the other in an oil refinery unit.
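    The screening logic described above (recover what a conventional exchanger can, then design a heat pump for the residual duty) can be sketched as below; the Carnot-fraction COP estimate and all numerical inputs are illustrative assumptions, not the published code.

    # Sketch: exchanger-first heat recovery, heat pump for the residual duty.
    # COP is taken as a fraction of the Carnot limit; all values are assumptions.
    def exchanger_recoverable(duty_kw, t_waste_c, t_sink_c, approach_c=10.0):
        """Duty a conventional exchanger can supply given a minimum approach."""
        return duty_kw if t_waste_c - approach_c >= t_sink_c else 0.0

    def heat_pump_power(residual_duty_kw, t_evap_c, t_cond_c, carnot_fraction=0.5):
        """Compressor power needed to deliver the residual duty at the sink."""
        t_evap_k, t_cond_k = t_evap_c + 273.15, t_cond_c + 273.15
        cop_heating = carnot_fraction * t_cond_k / (t_cond_k - t_evap_k)
        return residual_duty_kw / cop_heating

    if __name__ == "__main__":
        duty, t_waste, t_sink = 500.0, 60.0, 85.0      # kW and deg C (hypothetical)
        recovered = exchanger_recoverable(duty, t_waste, t_sink)
        residual = duty - recovered
        power = heat_pump_power(residual, t_evap_c=t_waste - 5.0, t_cond_c=t_sink + 5.0)
        print(f"exchanger duty: {recovered} kW, heat pump duty: {residual} kW, "
              f"compressor power: {power:.0f} kW")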

  8. Factorial design studies of antiretroviral drug-loaded stealth liposomal injectable: PEGylation, lyophilization and pharmacokinetic studies

    NASA Astrophysics Data System (ADS)

    Sudhakar, Beeravelli; Krishna, Mylangam Chaitanya; Murthy, Kolapalli Venkata Ramana

    2016-01-01

    The aim of the present study was to formulate and evaluate ritonavir-loaded stealth liposomes using a 3² factorial design, intended for parenteral delivery. Liposomes were prepared by the ethanol injection method using the 3² factorial design and characterized for various physicochemical parameters such as drug content, size, zeta potential, entrapment efficiency and in vitro drug release. The optimization process was carried out using desirability and overlay plots. The selected formulation was subjected to PEGylation using 10 % PEG-10000 solution. Stealth liposomes were characterized for the above-mentioned parameters along with surface morphology, Fourier transform infrared spectrophotometry, differential scanning calorimetry, stability and in vivo pharmacokinetic studies in rats. Stealth liposomes showed better results compared to conventional liposomes due to the effect of PEG-10000. The in vivo studies revealed that stealth liposomes showed better residence time compared to conventional liposomes and pure drug solution. The conventional liposomes and pure drug showed dose-dependent pharmacokinetics, whereas stealth liposomes showed a long circulation half-life compared to conventional liposomes and pure ritonavir solution. Statistical analysis by one-way ANOVA showed a significant difference (p < 0.05). The results of the present study revealed that stealth liposomes are a promising tool in antiretroviral therapy.
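    For readers unfamiliar with the notation, a 3² factorial design enumerates all combinations of two factors at three levels each, i.e. nine formulation runs. The sketch below generates such a design matrix; the factor names and level values are hypothetical and do not correspond to the study's actual variables.

    # Minimal 3^2 full factorial design matrix: two factors at three levels each
    # gives nine runs. Factor names and levels are hypothetical illustrations.
    import itertools

    factors = {
        "lipid_to_drug_ratio": [5, 10, 15],
        "cholesterol_percent": [10, 20, 30],
    }

    runs = [dict(zip(factors, levels))
            for levels in itertools.product(*factors.values())]

    assert len(runs) == 3 ** len(factors)   # 3^2 = 9 runs
    for i, run in enumerate(runs, start=1):
        print(f"run {i}: {run}")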

  9. Lab-on-a-chip based total-phosphorus analysis device utilizing a photocatalytic reaction

    NASA Astrophysics Data System (ADS)

    Jung, Dong Geon; Jung, Daewoong; Kong, Seong Ho

    2018-02-01

    A lab-on-a-chip (LOC) device for total phosphorus (TP) analysis was fabricated for water quality monitoring. Many commercially available TP analysis systems used to estimate water quality have good sensitivity and accuracy. However, these systems also have many disadvantages such as bulky size, complex pretreatment processes, and high cost, which limit their application. In particular, conventional TP analysis systems require an indispensable pretreatment step, in which the fluidic analyte is heated to 120 °C for 30 min to release the dissolved phosphate, because many phosphates are soluble in water at a standard temperature and pressure. In addition, this pretreatment process requires elevated pressures of up to 1.1 kg cm-2 in order to prevent the evaporation of the heated analyte. Because of these limiting conditions required by the pretreatment processes used in conventional systems, it is difficult to miniaturize TP analysis systems. In this study, we employed a photocatalytic reaction in the pretreatment process. The reaction was carried out by illuminating a photocatalytic titanium dioxide (TiO2) surface formed in a microfluidic channel with ultraviolet (UV) light. This pretreatment process does not require elevated temperatures and pressures. By applying this simplified, photocatalytic-reaction-based pretreatment process to a TP analysis system, greater degrees of freedom are conferred to the design and fabrication of LOC devices for TP monitoring. The fabricated LOC device presented in this paper was characterized by measuring the TP concentration of an unknown sample, and comparing the results with those measured by a conventional TP analysis system. The TP concentrations of the unknown sample measured by the proposed LOC device and the conventional TP analysis system were 0.018 mgP/25 mL and 0.019 mgP/25 mL, respectively. The experimental results revealed that the proposed LOC device had a performance comparable to the conventional bulky TP analysis system. Therefore, our device could be directly employed in water quality monitoring as an alternative to conventional TP analysis systems.

  10. The Joint Convention - Its Structure, the Articles and its Administration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metcalf, P.; Louvat, D.

    The objective of the Joint Convention on the Safety of Spent Fuel Management and on the Safety of Radioactive Waste Management (The Joint Convention) is to achieve a high level of safety worldwide in the management of spent nuclear fuel and radioactive waste. [1] It is an incentive convention designed to encourage and assist countries to achieve the objective. Contracting Parties to the Joint Convention are required to compile and submit a national report on how they meet the articles of the Joint Convention. The reports are peer reviewed by other Contracting Parties to the Joint Convention, and then countries have to defend the report at a review meeting of all the Contracting Parties. The process entails both a self-appraisal in compiling the report and independent international peer review. Summaries are compiled of the various reviews and these are presented in plenary, with a view to identifying generic issues and areas in which countries are improving safety or have identified for further development. The process also presents an opportunity for countries involved to benchmark their national spent fuel and radioactive waste safety programmes against prevailing international practice. The paper elaborates the detailed elements involved and discusses the experience from the first review meeting of Contracting Parties, and issues envisaged for consideration at the second review meeting scheduled for May 2006. (authors)

  11. Control structures for high speed processors

    NASA Technical Reports Server (NTRS)

    Maki, G. K.; Mankin, R.; Owsley, P. A.; Kim, G. M.

    1982-01-01

    A special processor was designed to function as a Reed Solomon decoder with throughput data rate in the MHz range. This data rate is significantly greater than is possible with conventional digital architectures. To achieve this rate, the processor design includes sequential, pipelined, distributed, and parallel processing. The processor was designed using a high-level register transfer language (RTL), which can be used to describe how the different processes are implemented by the hardware. One problem of special interest was the development of dependent processes, which are analogous to software subroutines. For greater flexibility, the RTL control structure was implemented in ROM. The special purpose hardware required approximately 1000 SSI and MSI components. The data rate throughput is 2.5 megabits/second. This data rate is achieved through the use of pipelined and distributed processing. This data rate can be compared with 800 kilobits/second in a recently proposed very large scale integration design of a Reed Solomon encoder.

  12. On the use of distributed sensing in control of large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Montgomery, Raymond C.; Ghosh, Dave

    1990-01-01

    Distributed processing technology is being developed to process signals from distributed sensors using distributed computations. This work presents a scheme for calculating the operators required to emulate a conventional Kalman filter and regulator using such a computer. The scheme makes use of conventional Kalman theory as applied to the control of large flexible structures. The required computation of the distributed operators given the conventional Kalman filter and regulator is explained. A straightforward application of this scheme may lead to nonsmooth operators whose convergence is not apparent. This is illustrated by application to the Mini-Mast, a large flexible truss at the Langley Research Center used for research in structural dynamics and control. Techniques for developing smooth operators are presented. These involve spatial filtering as well as adjusting the design constants in the Kalman theory. Results are presented that illustrate the degree of smoothness achieved.

  13. A Debugger for Computational Grid Applications

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele

    2000-01-01

    The p2d2 project at NAS has built a debugger for applications running on heterogeneous computational grids. It employs a client-server architecture to simplify the implementation. Its user interface has been designed to provide process control and state examination functions on a computation containing a large number of processes. It can find processes participating in distributed computations even when those processes were not created under debugger control. These process identification techniques work both on conventional distributed executions as well as those on a computational grid.

  14. Culture: Copying, Compression, and Conventionality

    ERIC Educational Resources Information Center

    Tamariz, Mónica; Kirby, Simon

    2015-01-01

    Through cultural transmission, repeated learning by new individuals transforms cultural information, which tends to become increasingly compressible (Kirby, Cornish, & Smith, 2008; Smith, Tamariz, & Kirby, 2013). Existing diffusion chain studies include in their design two processes that could be responsible for this tendency: learning…

  15. Magnetic resonance techniques for investigation of multiple sclerosis

    NASA Astrophysics Data System (ADS)

    MacKay, Alex; Laule, Cornelia; Li, David K. B.; Meyers, Sandra M.; Russell-Schulz, Bretta; Vavasour, Irene M.

    2014-11-01

    Multiple sclerosis (MS) is a common neurological disease which can cause loss of vision and balance, muscle weakness, impaired speech, fatigue, cognitive dysfunction and even paralysis. The key pathological processes in MS are inflammation, edema, myelin loss, axonal loss and gliosis. Unfortunately, the cause of MS is still not understood and there is currently no cure. Magnetic resonance imaging (MRI) is an important clinical and research tool for MS. 'Conventional' MRI images of the MS brain reveal bright lesions, or plaques, which demark regions of severe tissue damage. Conventional MRI has been extremely valuable for the diagnosis and management of people who have MS and also for the assessment of therapies designed to reduce inflammation and promote repair. While conventional MRI is clearly valuable, it lacks pathological specificity and, in some cases, sensitivity to non-lesional pathology. Advanced MR techniques have been developed to provide information that is more sensitive and specific than what is available with clinical scanning. Diffusion tensor imaging and magnetization transfer provide a general but non-specific measure of the pathological state of brain tissue. MR spectroscopy provides concentrations of brain metabolites which can be related to specific pathologies. Myelin water imaging was designed to assess brain myelination and has proved useful for measuring myelin loss in MS. To combat MS, it is crucial that the pharmaceutical industry finds therapies which can reverse the neurodegenerative processes which occur in the disease. The challenge for magnetic resonance researchers is to design imaging techniques which can provide detailed pathological information relating to the mechanisms of MS therapies. This paper briefly describes the pathologies of MS and demonstrates how MS-associated pathologies can be followed using both conventional and advanced MR imaging protocols.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Broad Funding Opportunity Announcement Project: Led by MIT professor Donald Sadoway, the Electroville project team is creating a community-scale electricity storage device using new materials and a battery design inspired by the aluminum production process known as smelting. A conventional battery includes a liquid electrolyte and a solid separator between its 2 solid electrodes. MIT’s battery contains liquid metal electrodes and a molten salt electrolyte. Because metals and salt don’t mix, these 3 liquids of different densities naturally separate into layers, eliminating the need for a solid separator. This efficient design significantly reduces packaging materials, which reduces cost and allows more space for storing energy than conventional batteries offer. MIT’s battery also uses cheap, earth-abundant, domestically available materials and is more scalable. By using all liquids, the design can also easily be resized according to the changing needs of local communities.

  17. Sodium metasilicate based fiber opening for greener leather processing.

    PubMed

    Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasamit, Thirumalachari

    2008-03-01

    Growing environmental regulations propound the need for a transformation in the current practice of leather making. The conventional dehairing and fiber opening process results in a high negative impact on the environment because of its uncleanliness. This process accounts for most of the biochemical oxygen demand and chemical oxygen demand in tannery wastewater and the generation of H2S gas. Hence, this study explores the use of a biological material and a nontoxic chemical for performing the above process more cleanly. In this study, the dehairing and fiber opening processes have been designed using enzyme and sodium metasilicate. The amount of sodium metasilicate required for fiber opening is standardized through the removal of proteoglycan, increase in weight, and bulk properties of leathers. It has been found that the extent of opening up of fiber bundles is comparable to that of conventionally processed leathers using a 2% sodium metasilicate solution. This has been substantiated through scanning electron microscopic analysis and softness measurements. The presence of silica in the crust leather enhances the bulk properties of the leather. This has been confirmed from the energy dispersive X-ray analysis. Performance of the leathers is shown to be on par with conventionally processed leathers through physical and hand evaluation. The process also exhibits significant reductions in chemical oxygen demand and total solids loads of 55 and 24%, respectively. Further, this newly developed process seems to be economically beneficial.

  18. Evaluation of advanced wastewater treatment systems for water reuse in the era of advanced wastewater treatment

    NASA Astrophysics Data System (ADS)

    Kon, Hisao; Watanabe, Masahiro

    This study focuses on effluent COD concentration from wastewater treatment in regard to the reduction of pathogenic bacteria and trace substances in public waters. The main types of secondary wastewater treatment were conventional activated sludge processes. Recently, however, advanced wastewater treatment processes have been developed aimed at the removal of nitrogen and phosphorus, and the effluent quality of these processes was analyzed in this study. Treatment processes for water reclamation that make the effluent meet the target water quality for reuse purposes were selected, and optimum design parameters for these processes were proposed. It was found that the treatment cost for water reclamation was greatly affected by the effluent COD of the secondary treatment. It is important to maintain a low COD concentration in the secondary treated effluent. Therefore, it is considered that adequate cost benefits would be obtained by achieving the target COD quality through shifting from a conventional activated sludge process to an advanced treatment process.

  19. Optimization of Surfactant Mixtures and Their Interfacial Behavior for Advanced Oil Recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somasundaran, Prof. P.

    2002-03-04

    The objective of this project was to develop a knowledge base that is helpful for the design of improved processes for mobilizing and producing oil left untapped using conventional techniques. The main goal was to develop and evaluate mixtures of new or modified surfactants for improved oil recovery. In this regard, interfacial properties of novel biodegradable n-alkyl pyrrolidones and sugar-based surfactants have been studied systematically. Emphasis was on designing cost-effective processes compatible with existing conditions and operations in addition to ensuring minimal reagent loss.

  20. On processing development for fabrication of fiber reinforced composite, part 2

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung; Hou, Gene J. W.; Sheen, Jeen S.

    1989-01-01

    Fiber-reinforced composite laminates are used in many aerospace and automobile applications. The magnitudes and durations of the cure temperature and the cure pressure applied during the curing process have significant consequences for the performance of the finished product. The objective of this study is to exploit the potential of applying the optimization technique to the cure cycle design. Using the compression molding of a filled polyester sheet molding compound (SMC) as an example, a unified Computer Aided Design (CAD) methodology, consisting of three uncoupled modules (i.e., optimization, analysis and sensitivity calculations), is developed to systematically generate optimal cure cycle designs. Various optimization formulations for the cure cycle design are investigated. The uniformities in the distributions of the temperature and the degree of cure are compared with those resulting from conventional isothermal processing conditions with pre-warmed platens. Recommendations with regard to further research in the computerization of the cure cycle design are also addressed.
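    The three-module structure (analysis, sensitivity, optimization) maps naturally onto a gradient-based loop. The sketch below wires a toy cure-quality objective and its finite-difference sensitivities into a standard optimizer; the objective is a made-up surrogate, not the SMC heat-transfer and cure-kinetics model used in the study.

    # Sketch of the uncoupled analysis / sensitivity / optimization modules.
    # The analysis objective is a toy surrogate with assumed nominal values.
    import numpy as np
    from scipy.optimize import minimize

    def analysis(x):
        """Toy objective: penalize deviation from a nominal cure temperature and time."""
        cure_temp_c, cure_time_min = x
        return (cure_temp_c - 150.0) ** 2 / 100.0 + (cure_time_min - 3.0) ** 2

    def sensitivity(x, h=1e-5):
        """Central finite-difference gradient of the analysis module."""
        grad = np.zeros_like(x)
        for i in range(len(x)):
            xp, xm = x.copy(), x.copy()
            xp[i] += h
            xm[i] -= h
            grad[i] = (analysis(xp) - analysis(xm)) / (2.0 * h)
        return grad

    if __name__ == "__main__":
        x0 = np.array([130.0, 5.0])   # initial cure temperature (C) and time (min)
        result = minimize(analysis, x0, jac=sensitivity,
                          bounds=[(100.0, 180.0), (1.0, 10.0)])
        print("optimal cure cycle:", result.x, "objective:", result.fun)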

  1. Fabrication of the planar angular rotator using the CMOS process

    NASA Astrophysics Data System (ADS)

    Dai, Ching-Liang; Chang, Chien-Liu; Chen, Hung-Lin; Chang, Pei-Zen

    2002-05-01

    In this investigation we propose a novel planar angular rotator fabricated by the conventional complementary metal-oxide semiconductor (CMOS) process. Following the 0.6 μm single poly triple metal (SPTM) CMOS process, the device is completed by a simple maskless, post-process etching step. The rotor of the planar angular rotator rotates around its geometric center with electrostatic actuation. The proposed design adopts an intelligent mechanism including the slider-crank system to permit simultaneous motion. The CMOS planar angular rotator could be driven with driving voltages of around 40 V. The design proposed here has a shorter response time and longer life, without problems of friction and wear, compared to the more common planar angular micromotor.

  2. A Comparative Study : Microprogrammed Vs Risc Architectures For Symbolic Processing

    NASA Astrophysics Data System (ADS)

    Heudin, J. C.; Metivier, C.; Demigny, D.; Maurin, T.; Zavidovique, B.; Devos, F.

    1987-05-01

    It is often claimed that conventional computers are not well suited for human-like tasks: Vision (Image Processing), Intelligence (Symbolic Processing) ... In the particular case of Artificial Intelligence, dynamic type-checking is one example of a basic task that must be improved. The solution implemented in most Lisp workstations consists in a microprogrammed architecture with a tagged memory. Another way to gain efficiency is to design a well suited instruction set for symbolic processing, which reduces the semantic gap between the high level language and the machine code. In this framework, the RISC concept provides a convenient approach to study new architectures for symbolic processing. This paper compares both approaches and describes our project of designing a compact symbolic processor for Artificial Intelligence applications.

  3. Supercritical Fluid Technologies to Fabricate Proliposomes.

    PubMed

    Falconer, James R; Svirskis, Darren; Adil, Ali A; Wu, Zimei

    2015-01-01

    Proliposomes are stable drug carrier systems designed to form liposomes upon addition of an aqueous phase. In this review, current trends in the use of supercritical fluid (SCF) technologies to prepare proliposomes are discussed. SCF methods are used in pharmaceutical research and industry to address limitations associated with conventional methods of pro/liposome fabrication. The SCF solvent methods of proliposome preparation are eco-friendly (known as green technology) and, along with the SCF anti-solvent methods, could be advantageous over conventional methods; enabling better design of particle morphology (size and shape). The major hurdles of SCF methods include poor scalability to industrial manufacturing which may result in variable particle characteristics. In the case of SCF anti-solvent methods, another hurdle is the reliance on organic solvents. However, the amount of solvent required is typically less than that used by the conventional methods. Another hurdle is that most of the SCF methods used have complicated manufacturing processes, although once the setup has been completed, SCF technologies offer a single-step process in the preparation of proliposomes compared to the multiple steps required by many other methods. Furthermore, there is limited research into how proliposomes will be converted into liposomes for the end-user, and how such a product can be prepared reproducibly in terms of vesicle size and drug loading. These hurdles must be overcome and with more research, SCF methods, especially where the SCF acts as a solvent, have the potential to offer a strong alternative to the conventional methods to prepare proliposomes.

  4. Numerical investigation & comparison of a tandem-bladed turbocharger centrifugal compressor stage with conventional design

    NASA Astrophysics Data System (ADS)

    Danish, Syed Noman; Qureshi, Shafiq Rehman; EL-Leathy, Abdelrahman; Khan, Salah Ud-Din; Umer, Usama; Ma, Chaochen

    2014-12-01

    Extensive numerical investigations of the performance and flow structure in an unshrouded tandem-bladed centrifugal compressor are presented in comparison to a conventional compressor. Stage characteristics are explored for various tip clearance levels, axial spacings and circumferential clockings. The conventional impeller was modified to a tandem-bladed design with no modifications in backsweep angle, meridional gas passage and camber distributions in order to have a true comparison with the conventional design. Performance degradation is observed for both the conventional and tandem designs with increase in tip clearance. Linear-equation models for correlating stage characteristics with tip clearance are proposed. Comparing the two designs, it is clearly evident that the conventional design shows better performance at moderate flow rates. However, near choke flow, the tandem design gives better results primarily because of the increase in throat area. The surge point flow rate also seems to drop for the tandem compressor, resulting in an increased range of operation.
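    The proposed linear-equation correlation amounts to a least-squares line of a stage characteristic against tip clearance, as in the short sketch below; the clearance and efficiency pairs are hypothetical, not the paper's CFD data.

    # Sketch of a linear correlation of stage efficiency with tip clearance.
    # Data points are hypothetical placeholders.
    import numpy as np

    tip_clearance_ratio = np.array([0.02, 0.05, 0.08, 0.11])   # clearance / blade height
    stage_efficiency = np.array([0.84, 0.82, 0.79, 0.77])      # hypothetical values

    slope, intercept = np.polyfit(tip_clearance_ratio, stage_efficiency, deg=1)
    print(f"eta = {intercept:.3f} + ({slope:.3f}) * clearance_ratio")
    print("predicted efficiency at 0.06 clearance ratio:", round(intercept + slope * 0.06, 3))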

  5. Investigation of thermochemical biorefinery sizing and environmental sustainability impacts for conventional supply system and distributed pre-processing supply system designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David J. Muth, Jr.; Matthew H. Langholtz; Eric C. D. Tan

    The 2011 US Billion-Ton Update estimates that by 2030 there will be enough agricultural and forest resources to sustainably provide at least one billion dry tons of biomass annually, enough to displace approximately 30% of the country's current petroleum consumption. A portion of these resources are inaccessible at current cost targets with conventional feedstock supply systems because of their remoteness or low yields. Reliable analyses and projections of US biofuels production depend on assumptions about the supply system and biorefinery capacity, which, in turn, depend upon economic value, feedstock logistics, and sustainability. A cross-functional team has examined combinations of advances in feedstock supply systems and biorefinery capacities with rigorous design information, improved crop yield and agronomic practices, and improved estimates of sustainable biomass availability. A previous report on biochemical refinery capacity noted that under advanced feedstock logistic supply systems that include depots and pre-processing operations there are cost advantages that support larger biorefineries, up to 10 000 DMT/day facilities compared to the smaller 2000 DMT/day facilities. This report focuses on analyzing conventional versus advanced depot biomass supply systems for a thermochemical conversion and refinery sizing based on woody biomass. The results of this analysis demonstrate that the economies of scale enabled by advanced logistics offset much of the added logistics costs from additional depot processing and transportation, resulting in a small overall increase to the minimum ethanol selling price compared to the conventional logistic supply system. While the overall costs do increase slightly for the advanced logistic supply systems, the ability to mitigate moisture and ash in the system will improve the storage and conversion processes. In addition, being able to draw on feedstocks from further distances will decrease the risk of biomass supply to the conversion facility.

  6. NOVEL REACTOR DESIGN FOR BIODIESEL PRODUCTION

    EPA Science Inventory

    The goal of this project is to scale-up a novel reactor for producing Biodiesel from alternative feedstocks. Biodiesel is an alternative fuel that can be produced from a wide variety of plant oils, animal oils and waste oils from food processing. The conventional feedstocks fo...

  7. Multidisciplinary design of a rocket-based combined cycle SSTO launch vehicle using Taguchi methods

    NASA Technical Reports Server (NTRS)

    Olds, John R.; Walberg, Gerald D.

    1993-01-01

    Results are presented from the optimization process of a winged-cone configuration SSTO launch vehicle that employs a rocket-based ejector/ramjet/scramjet/rocket operational mode variable-cycle engine. The Taguchi multidisciplinary parametric-design method was used to evaluate the effects of simultaneously changing a total of eight design variables, rather than changing them one at a time as in conventional tradeoff studies. A combination of design variables was in this way identified which yields very attractive vehicle dry and gross weights.
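    The essence of the Taguchi approach is to run an orthogonal array over all design variables at once and then compare level averages (main effects) for each variable. The sketch below shows that calculation on a standard L4 array with three two-level factors; the response values are hypothetical, and the actual study used eight design variables in a larger array.

    # Taguchi main-effects analysis on a standard L4 orthogonal array
    # (three two-level factors). Response values are hypothetical.
    import numpy as np

    L4 = np.array([[1, 1, 1],
                   [1, 2, 2],
                   [2, 1, 2],
                   [2, 2, 1]])                       # rows: runs, columns: factor levels
    response = np.array([100.0, 96.0, 94.0, 92.0])   # hypothetical dry-weight metric

    for factor in range(L4.shape[1]):
        level1 = response[L4[:, factor] == 1].mean()
        level2 = response[L4[:, factor] == 2].mean()
        print(f"factor {factor}: level-1 mean = {level1}, level-2 mean = {level2}, "
              f"effect = {level2 - level1}")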

  8. Heat pipe cooling of power processing magnetics

    NASA Technical Reports Server (NTRS)

    Hansen, I. G.; Chester, M.

    1979-01-01

    The constant demand for increased power and reduced mass has raised the internal temperature of conventionally cooled power magnetics toward the upper limit of acceptability. The conflicting demands of electrical isolation, mechanical integrity, and thermal conductivity preclude significant further advancements using conventional approaches. However, the size and mass of multikilowatt power processing systems may be further reduced by the incorporation of heat pipe cooling directly into the power magnetics. Additionally, by maintaining lower more constant temperatures, the life and reliability of the magnetic devices will be improved. A heat pipe cooled transformer and input filter have been developed for the 2.4 kW beam supply of a 30-cm ion thruster system. This development yielded a mass reduction of 40% (1.76 kg) and lower mean winding temperature (20 C lower). While these improvements are significant, preliminary designs predict even greater benefits to be realized at higher power. This paper presents the design details along with the results of thermal vacuum operation and the component performance in a 3 kW breadboard power processor.

  9. The Effect of Process Intervention on the Attitudes and Learning in a College Freshman Composition Class.

    ERIC Educational Resources Information Center

    Wahlberg, William Auman

    This study was designed to explore one method of intervening in the process of a conventional academic classroom to affect student attitude and improve the learning climate. Two college freshman composition classes of 22 students each provided the subjects for the study. Each class was taught by the same instructor for three hours a week; one…

  10. Simplify to survive: prescriptive layouts ensure profitable scaling to 32nm and beyond

    NASA Astrophysics Data System (ADS)

    Liebmann, Lars; Pileggi, Larry; Hibbeler, Jason; Rovner, Vyacheslav; Jhaveri, Tejas; Northrop, Greg

    2009-03-01

    The time-to-market driven need to maintain concurrent process-design co-development, even in spite of discontinuous patterning, process, and device innovation is reiterated. The escalating design rule complexity resulting from increasing layout sensitivities in physical and electrical yield and the resulting risk to profitable technology scaling is reviewed. Shortcomings in traditional Design for Manufacturability (DfM) solutions are identified and contrasted to the highly successful integrated design-technology co-optimization used for SRAM and other memory arrays. The feasibility of extending memory-style design-technology co-optimization, based on a highly simplified layout environment, to logic chips is demonstrated. Layout density benefits, modeled patterning and electrical yield improvements, as well as substantially improved layout simplicity are quantified in a conventional versus template-based design comparison on a 65nm IBM PowerPC 405 microprocessor core. The adaptability of this highly regularized template-based design solution to different yield concerns and design styles is shown in the extension of this work to 32nm with an increased focus on interconnect redundancy. In closing, the work not covered in this paper, focused on the process side of the integrated process-design co-optimization, is introduced.

  11. Design of pyrolysis reactor for production of bio-oil and bio-char simultaneously

    NASA Astrophysics Data System (ADS)

    Aladin, Andi; Alwi, Ratna Surya; Syarif, Takdir

    2017-05-01

    The residues from the wood industry are the main contributors to biomass waste in Indonesia. The conventional pyrolysis process requires a large amount of energy and releases various toxic chemicals into the environment. Therefore, a laboratory-scale pyrolysis unit was designed that can be a good alternative to achieve zero waste and low energy cost. This paper discusses the design and system of a pyrolysis reactor to produce bio-oil and bio-char simultaneously.

  12. Magnetohydrodynamics (MHD) Engineering Test Facility (ETF) 200 MWe power plant. Design Requirements Document (DRD)

    NASA Technical Reports Server (NTRS)

    Rigo, H. S.; Bercaw, R. W.; Burkhart, J. A.; Mroz, T. S.; Bents, D. J.; Hatch, A. M.

    1981-01-01

    A description and the design requirements for the 200 MWe (nominal) net output MHD Engineering Test Facility (ETF) Conceptual Design are presented. Performance requirements for the plant are identified and process conditions are indicated at interface stations between the major systems comprising the plant. Also included are the description, functions, interfaces and requirements for each of these major systems. The latest information (1980-1981) from the MHD technology program is integrated with elements of a conventional steam electric power generating plant.

  13. Influence of dynamic coupled hydro-bio-mechanical processes on response of municipal solid waste and liner system in bioreactor landfills.

    PubMed

    Reddy, Krishna R; Kumar, Girish; Giri, Rajiv K

    2017-05-01

    A two-dimensional (2-D) mathematical model is presented to predict the response of municipal solid waste (MSW) of conventional as well as bioreactor landfills undergoing coupled hydro-bio-mechanical processes. The newly developed and validated 2-D coupled mathematical modeling framework combines and simultaneously solves a two-phase flow model based on the unsaturated Richards equation, a plane-strain formulation of the Mohr-Coulomb mechanical model, and a first-order decay kinetics biodegradation model. The performance of both conventional and bioreactor landfills was investigated holistically, by evaluating the mechanical settlement, the extent of waste degradation with subsequent changes in geotechnical properties, landfill slope stability, and the in-plane shear behavior (shear stress-displacement) of the composite liner system and final cover system. It is concluded that for the given specific conditions considered, the bioreactor landfill attained overall stabilization after a continuous leachate injection of 16 years, whereas stabilization was observed after around 50 years of post-closure in conventional landfills, with a total vertical strain of 36% and 37% for bioreactor and conventional landfills, respectively. The significant changes in landfill settlement, the extent of MSW degradation, and MSW geotechnical properties, along with their influence on the in-plane shear response of the composite liner and final cover system, between the conventional and bioreactor landfills, observed using the mathematical model proposed in this study, corroborate the importance of considering coupled hydro-bio-mechanical processes while designing and predicting the performance of engineered bioreactor landfills. The study underscores the importance of considering the effect of coupled processes while examining the stability and integrity of the liner and cover systems, which form the integral components of a landfill. Moreover, the spatial and temporal variations in the landfill settlement, the stability of the landfill slope under pressurized leachate injection conditions and the rapid changes in the MSW properties with degradation emphasize the complexity of the bioreactor landfill system and the need for understanding the interrelated processes to design and operate stable and effective bioreactor landfills. A detailed discussion of the results obtained from the numerical simulations along with limitations and key challenges in this study is also presented. Copyright © 2016 Elsevier Ltd. All rights reserved.
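    For reference, first-order decay kinetics of the kind used in such biodegradation modules take the familiar form below (a generic statement of first-order kinetics, not the paper's full coupled formulation), where m_s is the remaining degradable solid mass and k the decay constant:

    \frac{dm_s(t)}{dt} = -k\, m_s(t), \qquad m_s(t) = m_{s,0}\, e^{-kt}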

  14. Smart Screening System (S3) In Taconite Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daryoush Allaei; Ryan Wartman; David Tarnowski

    2006-03-01

    The conventional screening machines used in processing plants have had undesirably high noise and vibration levels. They also have had unsatisfactorily low screening efficiency, high energy consumption, high maintenance cost, low productivity, and poor worker safety. These conventional vibrating machines have been used in almost every processing plant. Most of the current material separation technology uses heavy and inefficient electric motors with an unbalanced rotating mass to generate the shaking. In addition to being excessively noisy, inefficient, and high-maintenance, these vibrating machines are often the bottleneck in the entire process. Furthermore, these motors, along with the vibrating machines and supporting structure, shake other machines and structures in the vicinity. The latter increases maintenance costs while reducing worker health and safety. The conventional vibrating fine screens at taconite processing plants have had the same problems as those listed above. This has resulted in lower screening efficiency, higher energy and maintenance cost, lower productivity and worker safety concerns. The focus of this work is on the design of a high performance screening machine suitable for taconite processing plants. SmartScreens™ technology uses miniaturized motors, based on smart materials, to generate the shaking. The underlying technologies are Energy Flow Control™ and Vibration Control by Confinement™. These concepts are used to direct energy flow and confine energy efficiently and effectively to the screen function. The SmartScreens™ technology addresses problems related to noise and vibration, screening efficiency, productivity, maintenance cost and worker safety. Successful development of SmartScreens™ technology will bring drastic changes to the screening and physical separation industry. The final designs for key components of the SmartScreens™ have been developed. The key components include the smart motor and associated electronics, resonators, and supporting structural elements. It is shown that the smart motors have an acceptable life and performance. Resonator (or motion amplifier) designs are selected based on the final system requirement and vibration characteristics. All the components for a fully functional prototype are fabricated. The development program is on schedule. The last semi-annual report described the completion of the design refinement phase. This phase resulted in a Smart Screen design that meets performance targets both in the dry condition and with taconite slurry flow using PZT motors. This system was successfully demonstrated for the DOE and partner companies at the Coleraine Mineral Research Laboratory in Coleraine, Minnesota. Since then, the fabrication of the dry application prototype (incorporating an electromagnetic drive mechanism and a new deblinding concept) has been completed and successfully tested at QRDC's lab.

  15. Automatic mathematical modeling for real time simulation system

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1988-01-01

    A methodology for automatic mathematical modeling and generating simulation models is described. The models will be verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment for engineers to design, maintain, and verify their model and also automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp Machine. The program provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. The future goal, currently under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.

  16. Memristor-based cellular nonlinear/neural network: design, analysis, and applications.

    PubMed

    Duan, Shukai; Hu, Xiaofang; Dong, Zhekang; Wang, Lidan; Mazumder, Pinaki

    2015-06-01

    Cellular nonlinear/neural network (CNN) has been recognized as a powerful massively parallel architecture capable of solving complex engineering problems by performing trillions of analog operations per second. The memristor was theoretically predicted in the late seventies, but it garnered nascent research interest due to the recent much-acclaimed discovery of nanocrossbar memories by engineers at the Hewlett-Packard Laboratory. The memristor is expected to be co-integrated with nanoscale CMOS technology to revolutionize conventional von Neumann as well as neuromorphic computing. In this paper, a compact CNN model based on memristors is presented along with its performance analysis and applications. In the new CNN design, the memristor bridge circuit acts as the synaptic circuit element and substitutes the complex multiplication circuit used in traditional CNN architectures. In addition, the negative differential resistance and nonlinear current-voltage characteristics of the memristor have been leveraged to replace the linear resistor in conventional CNNs. The proposed CNN design has several merits, for example, high density, nonvolatility, and programmability of synaptic weights. The proposed memristor-based CNN design operations for implementing several image processing functions are illustrated through simulation and contrasted with conventional CNNs. Monte-Carlo simulation has been used to demonstrate the behavior of the proposed CNN due to the variations in memristor synaptic weights.

  17. Research in the Optical Sciences.

    DTIC Science & Technology

    1984-10-01

    cannot tolerate the high temperatures used for conventional hard MgF2 depositions. The ion beam processing led to durable films (in some cases more...sputter epitaxy techniques for the production of high-reflectivity mirrors for near-normal incidence in the x-ray-ultraviolet (X-UV) wavelength range...codes for X-UV multilayer mirror design, (2) acquisition of a data base of optical constants in this wavelength range, (3) theoretical designs of

  18. First 65nm tape-out using inverse lithography technology (ILT)

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Zhang, Bin; Tang, Deming; Guo, Eric; Pang, Linyong; Liu, Yong; Moore, Andrew; Wang, Kechang

    2005-11-01

    This paper presents SMIC's first 65nm tape-out results, in particular those obtained using ILT. ILT mathematically determines the mask features that produce the desired on-wafer results with best wafer pattern fidelity, largest process window, or both. SMIC applied it to its first 65nm tape-out to study ILT performance and benefits for deep sub-wavelength lithography. SMIC selected 3 SRAM designs as the first test case, because SRAM bit-cells contain features which are challenging lithographically. Mask patterns generated from both conventional OPC and ILT were placed on the mask side-by-side. Mask manufacturability (including fracturing, writing time, inspection, and metrology) and wafer print performance of ILT were studied. The results demonstrated that ILT achieved better CD accuracy, produced a substantially larger process window than conventional OPC, and met SMIC's 65nm process window requirements.

  19. Development of High-Power Hall Thruster Power Processing Units at NASA GRC

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Bozak, Karin E.; Santiago, Walter; Scheidegger, Robert J.; Birchenough, Arthur G.

    2015-01-01

    NASA GRC successfully designed, built, and tested four different power processor concepts for high-power Hall thrusters. Each design satisfies unique goals, including evaluation of a novel silicon carbide semiconductor technology, validation of innovative circuits to overcome the problems of high-input-voltage converter design, development of a direct-drive unit to demonstrate potential benefits, or simply identification of lessons learned from the development of a PPU using a conventional design approach. Any of these designs could be developed further to satisfy NASA's needs for high-power electric propulsion in the near future.

  20. Manufacturing of hybrid aluminum copper joints by electromagnetic pulse welding - Identification of quantitative process windows

    NASA Astrophysics Data System (ADS)

    Psyk, Verena; Scheffler, Christian; Linnemann, Maik; Landgrebe, Dirk

    2017-10-01

    Compared to conventional joining techniques, electromagnetic pulse welding offers important advantages, especially for dissimilar material connections such as copper-aluminum welds. However, due to missing guidelines and tools for process design, the process has not yet been widely implemented in industrial production. In order to contribute to overcoming this obstacle, a combined numerical and experimental process analysis of electromagnetic pulse welding of Cu-DHP and EN AW-1050 was carried out, and the results were consolidated into a quantitative, collision-parameter-based process window.

  1. Synthetic-lattice enabled all-optical devices based on orbital angular momentum of light

    PubMed Central

    Luo, Xi-Wang; Zhou, Xingxiang; Xu, Jin-Shi; Li, Chuan-Feng; Guo, Guang-Can; Zhang, Chuanwei; Zhou, Zheng-Wei

    2017-01-01

    All-optical photonic devices are crucial for many important photonic technologies and applications, ranging from optical communication to quantum information processing. Conventional design of all-optical devices is based on photon propagation and interference in real space, which may rely on large numbers of optical elements, and the requirement of precise control makes this approach challenging. Here we propose an unconventional route for engineering all-optical devices using the photon’s internal degrees of freedom, which form photonic crystals in such synthetic dimensions for photon propagation and interference. We demonstrate this design concept by showing how important optical devices such as quantum memory and optical filters can be realized using synthetic orbital angular momentum (OAM) lattices in degenerate cavities. The design route utilizing synthetic photonic lattices may significantly reduce the requirement for numerous optical elements and their fine tuning in conventional design, paving the way for realistic all-optical photonic devices with novel functionalities. PMID:28706215

  2. Optimization of chlorine fluxing process for magnesium removal from molten aluminum

    NASA Astrophysics Data System (ADS)

    Fu, Qian

    High-throughput and low operational cost are the keys to a successful industrial process. Much aluminum is now recycled in the form of used beverage cans and this aluminum is of alloys that contain high levels of magnesium. It is common practice to "demag" the metal by injecting chlorine that preferentially reacts with the magnesium. In the conventional chlorine fluxing processes, low reaction efficiency results in excessive reactive gas emissions. In this study, through an experimental investigation of the reaction kinetics involved in this process, a mathematical model is set up for the purpose of process optimization. A feedback controlled chlorine reduction process strategy is suggested for demagging the molten aluminum to the desired magnesium level without significant gas emissions. This strategy also needs the least modification of the existing process facility. The suggested process time will only be slightly longer than conventional methods and chlorine usage and emissions will be reduced. In order to achieve process optimization through novel designs in any fluxing process, a system is necessary for measuring the bubble distribution in liquid metals. An electro-resistivity probe described in the literature has low accuracy and its capability to measure bubble distribution has not yet been fully demonstrated. A capacitance bubble probe was designed for bubble measurements in molten metals. The probe signal was collected and processed digitally. Higher accuracy was obtained by higher discrimination against corrupted signals. A single-size bubble experiment in Belmont metal was designed to reveal the characteristic response of the capacitance probe. This characteristic response fits well with a theoretical model. It is suggested that using a properly designed deconvolution process, the actual bubble size distribution can be calculated. The capacitance probe was used to study some practical bubble generation devices. Preliminary results on bubble distribution generated by a porous plug in Belmont metal showed bubbles much bigger than those in a water model. Preliminary results in molten aluminum showed that the probe was applicable in this harsh environment. An interesting bubble coalescence phenomenon was also observed in both Belmont metal and molten aluminum.
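    A hedged sketch of the feedback-controlled demagging idea is given below. The first-order efficiency model, the throttling law, and every numeric parameter are assumptions made for illustration; they are not the kinetics or control strategy developed in the study.

        # Illustrative sketch of feedback-controlled chlorine fluxing for magnesium removal.
        # The efficiency model, throttling law, and all numbers are assumptions for the sketch,
        # not the kinetic model or control strategy developed in the study.
        def demag_simulation(mg0=1.2, mg_target=0.10, melt_mass=50_000.0, dt=10.0):
            """mg0, mg_target in wt%; melt_mass in kg; dt in s."""
            M_MG, M_CL2 = 24.305, 70.906                     # molar masses, g/mol
            mg, t, cl2_used, cl2_unreacted = mg0, 0.0, 0.0, 0.0
            while mg > mg_target:
                # Feedback law: throttle chlorine injection as Mg approaches the target level.
                cl2_rate = max(0.02, 0.2 * min(1.0, (mg - mg_target) / 0.3))   # kg Cl2 per s
                # Assumed efficiency: near 100% at high Mg, dropping off below 0.2 wt%.
                eff = min(1.0, mg / 0.2)
                mg_removed = eff * cl2_rate * dt * (M_MG / M_CL2)              # kg Mg reacted
                mg -= 100.0 * mg_removed / melt_mass                           # update wt%
                cl2_used += cl2_rate * dt
                cl2_unreacted += (1.0 - eff) * cl2_rate * dt
                t += dt
            return t / 60.0, cl2_used, cl2_unreacted

        minutes, used, unreacted = demag_simulation()
        print(f"time {minutes:.0f} min, Cl2 injected {used:.0f} kg, unreacted Cl2 {unreacted:.1f} kg")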

  3. Implantable electronics: emerging design issues and an ultra light-weight security solution.

    PubMed

    Narasimhan, Seetharam; Wang, Xinmu; Bhunia, Swarup

    2010-01-01

    Implantable systems that monitor biological signals require increasingly complex digital signal processing (DSP) electronics for real-time in-situ analysis and compression of the recorded signals. While it is well-known that such signal processing hardware needs to be implemented under tight area and power constraints, new design requirements emerge with their increasing complexity. Use of nanoscale technology shows tremendous benefits in implementing these advanced circuits due to dramatic improvement in integration density and power dissipation per operation. However, it also brings in new challenges such as reliability and large idle power (due to higher leakage current). Besides, programmability of the device as well as security of the recorded information are rapidly becoming major design considerations of such systems. In this paper, we analyze the emerging issues associated with the design of the DSP unit in an implantable system. Next, we propose a novel ultra light-weight solution to address the information security issue. Unlike the conventional information security approaches like data encryption, which come at large area and power overhead and hence are not amenable for resource-constrained implantable systems, we propose a multilevel key-based scrambling algorithm, which exploits the nature of the biological signal to effectively obfuscate it. Analysis of the proposed algorithm in the context of neural signal processing and its hardware implementation shows that we can achieve high level of security with ∼ 13X lower power and ∼ 5X lower area overhead than conventional cryptographic solutions.
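    The abstract does not disclose the multilevel scrambling algorithm itself, so the sketch below only illustrates the general idea of lightweight, key-seeded obfuscation of a sampled biosignal (here a per-block permutation); the block size, key value, and function names are hypothetical.

        # Illustrative key-based scrambling of a sampled biosignal. This is NOT the paper's
        # multilevel algorithm, only a generic sketch of lightweight, key-seeded obfuscation
        # (per-block permutation) as a low-overhead alternative to full encryption.
        import random

        def scramble(samples, key, block=32):
            rng = random.Random(key)                          # key seeds the permutation schedule
            out = list(samples)
            for start in range(0, len(out) - block + 1, block):
                idx = list(range(block))
                rng.shuffle(idx)                              # per-block permutation from key
                out[start:start + block] = [out[start + i] for i in idx]
            return out

        def descramble(samples, key, block=32):
            rng = random.Random(key)
            out = list(samples)
            for start in range(0, len(out) - block + 1, block):
                idx = list(range(block))
                rng.shuffle(idx)
                inverse = [0] * block
                for new_pos, old_pos in enumerate(idx):
                    inverse[old_pos] = new_pos                # invert the permutation
                out[start:start + block] = [out[start + inverse[i]] for i in range(block)]
            return out

        signal = [i % 97 for i in range(256)]                 # stand-in for neural samples
        assert descramble(scramble(signal, key=0xBEEF), key=0xBEEF) == signal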

  4. A Taxonomy of Knowledge Types for Use in Curriculum Design

    ERIC Educational Resources Information Center

    Carson, Robert N.

    2004-01-01

    This article proposes the use of a taxonomy to help curriculum planners distinguish between different kinds of knowledge. Nine categories are suggested: empirical, rational, conventional, conceptual, cognitive process skills, psychomotor, affective, narrative, and received. Analyzing lessons into the sources of their resident knowledge helps the…

  5. Stepping Stones to Literacy. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2007

    2007-01-01

    Stepping Stones to Literacy (SSL) is a supplemental curriculum designed to promote listening, print conventions, phonological awareness, phonemic awareness, and serial processing/rapid naming (quickly naming familiar visual symbols and stimuli such as letters or colors). The program targets kindergarten and older preschool students considered to…

  6. Multifunctional Nanomaterials: Design, Synthesis and Application Properties.

    PubMed

    Martinelli, Marisa; Strumia, Miriam Cristina

    2017-02-07

    The immense scope of variation in dendritic molecules (hyper-branching, nano-sized, hydrophobicity/hydrophilicity, rigidity/flexibility balance, etc.) and their versatile functionalization, with the possibility of multivalent binding, permit the design of highly improved, novel materials. Dendritic-based materials are therefore viable alternatives to conventional polymers. The overall aim of this work is to show the advantages of dendronization processes by presenting the synthesis and characterization of three different dendronized systems: (I) microbeads of functionalized chitosan; (II) nanostructuration of polypropylene surfaces; and (III) smart dendritic nanogels. The particular properties yielded by these systems could only be achieved thanks to the dendronization process.

  7. Development of the weld-braze joining process

    NASA Technical Reports Server (NTRS)

    Bales, T. T.; Royster, D. M.; Arnold, W. E., Jr.

    1973-01-01

    A joining process, designated weld-brazing, was developed which combines resistance spot welding and brazing. Resistance spot welding is used to position and aline the parts, as well as to establish a suitable faying-surface gap for brazing. Fabrication is then completed at elevated temperature by capillary flow of the braze alloy into the joint. The process was used successfully to fabricate Ti-6Al-4V alloy joints by using 3003 aluminum braze alloy and should be applicable to other metal-braze systems. Test results obtained on single-overlap and hat-stiffened panel specimens show that weld-brazed joints were superior in tensile shear, stress rupture, fatigue, and buckling compared with joints fabricated by conventional means. Another attractive feature of the process is that the brazed joint is hermetically sealed by the braze material, which may eliminate many of the sealing problems encountered with riveted or spot welded structures. The relative ease of fabrication associated with the weld-brazing process may make it cost effective over conventional joining techniques.

  8. Rapid prototype fabrication processes for high-performance thrust cells

    NASA Technical Reports Server (NTRS)

    Hunt, K.; Chwiedor, T.; Diab, J.; Williams, R.

    1994-01-01

    The Thrust Cell Technologies Program (Air Force Phillips Laboratory Contract No. F04611-92-C-0050) is currently being performed by Rocketdyne to demonstrate advanced materials and fabrication technologies which can be utilized to produce low-cost, high-performance thrust cells for launch and space transportation rocket engines. Under Phase 2 of the Thrust Cell Technologies Program (TCTP), rapid prototyping and investment casting techniques are being employed to fabricate a 12,000-lbf thrust class combustion chamber for delivery and hot-fire testing at Phillips Lab. The integrated process of investment casting directly from rapid prototype patterns dramatically reduces design-to-delivery cycle time, and greatly enhances design flexibility over conventionally processed cast or machined parts.

  9. Development of a Bunched Beam Electron Cooler based on ERL and Circulator Ring Technology for the Jefferson Lab Electron-Ion Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, Stephen V.; Derbenev, Yaroslav S.; Douglas, David R.

    Jefferson Lab is in the process of designing an electron ion collider with unprecedented luminosity at a 45 GeV center-of-mass energy. This luminosity relies on ion cooling in both the booster and the storage ring of the accelerator complex. The cooling in the booster will use a conventional DC cooler similar to the one at COSY. The high-energy storage ring, operating at a momentum of up to 100 GeV/nucleon, requires novel use of bunched-beam cooling. There are two designs for such a cooler. The first uses a conventional Energy Recovery Linac (ERL) with a magnetized beam while the second uses a circulating ring to enhance both peak and average currents experienced by the ion beam. This presentation will describe both the Circulator Cooling Ring (CCR) design and that of the backup option using the stand-alone ERL operated at lower charge but higher repetition rate than the ERL injector required by the CCR-based design.

  10. Proposal for a new categorization of aseptic processing facilities based on risk assessment scores.

    PubMed

    Katayama, Hirohito; Toda, Atsushi; Tokunaga, Yuji; Katoh, Shigeo

    2008-01-01

    Risk assessment of aseptic processing facilities was performed using two published risk assessment tools. Calculated risk scores were compared with experimental test results, including environmental monitoring and media fill run results, in three different types of facilities. The two risk assessment tools used gave a generally similar outcome. However, depending on the tool used, variations were observed in the relative scores between the facilities. For the facility yielding the lowest risk scores, the corresponding experimental test results showed no contamination, indicating that these ordinal testing methods are insufficient to evaluate this kind of facility. A conventional facility having acceptable aseptic processing lines gave relatively high risk scores. The facility showing a rather high risk score demonstrated the usefulness of conventional microbiological test methods. Considering the significant gaps observed in calculated risk scores and in the ordinal microbiological test results between advanced and conventional facilities, we propose a facility categorization based on risk assessment. The most important risk factor in aseptic processing is human intervention. When human intervention is eliminated from the process by advanced hardware design, the aseptic processing facility can be classified into a new risk category that is better suited for assuring sterility based on a new set of criteria rather than on currently used microbiological analysis. To fully benefit from advanced technologies, we propose three risk categories for these aseptic facilities.

  11. The Effect of Scientific Inquiry Learning Model Based on Conceptual Change on Physics Cognitive Competence and Science Process Skill (SPS) of Students at Senior High School

    ERIC Educational Resources Information Center

    Sahhyar; Nst, Febriani Hastini

    2017-01-01

    The purpose of this research was to determine whether the physics cognitive competence and science process skills of students taught with a scientific inquiry learning model based on conceptual change were better than those of students taught with conventional learning. The research type was a quasi-experiment, and a two-group pretest-posttest design was used in this study. The sample was Class…

  12. Sliding wear and corrosion behaviour of alloyed austempered ductile iron subjected to novel two step austempering treatment

    NASA Astrophysics Data System (ADS)

    Sethuram, D.; Srisailam, Shravani; Rao Ponangi, Babu

    2018-04-01

    Austempered Ductile Iron (ADI) is an exciting alloy of iron which offers design engineers the best combination of high strength-to-weight ratio, low-cost design flexibility, good toughness, and wear resistance along with fatigue strength. The two-step austempering procedure helps simultaneously improve the tensile strength as well as the ductility beyond that of the conventional austempering process. An extensive literature survey reveals that its mechanical and wear behaviour are dependent on heat treatment and alloy additions. The current work focuses on characterizing two-step ADI samples (TSADI) developed by a novel heat treatment process for resistance to corrosion and wear. The samples of ductile iron were austempered by the two-step austempering process at temperatures from 300°C to 450°C in steps of 50°C. Temperatures are gradually increased at a rate of 14°C/hour. In acidic medium (H2SO4), the austempered samples showed better corrosion resistance compared to conventional ductile iron. It has been observed from the wear studies that the TSADI sample treated at 350°C shows better wear resistance compared to ductile iron. The results are discussed in terms of fractographs, process variables, and microstructural features of the TSADI samples.

  13. Approximate simulation model for analysis and optimization in engineering system design

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Computational support of the engineering design process routinely requires mathematical models of behavior to inform designers of the system response to external stimuli. However, designers also need to know the effect of the changes in design variable values on the system behavior. For large engineering systems, the conventional way of evaluating these effects by repetitive simulation of behavior for perturbed variables is impractical because of excessive cost and inadequate accuracy. An alternative is described based on recently developed system sensitivity analysis that is combined with extrapolation to form a model of design. This design model is complementary to the model of behavior and capable of direct simulation of the effects of design variable changes.
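    A minimal sketch of the idea follows: sensitivity derivatives computed once at a baseline design are reused to extrapolate the behavior response for perturbed design variables, avoiding repeated full simulations. The toy behavior function and the finite-difference Jacobian are stand-ins; the reference relies on analytic system sensitivity analysis.

        # Minimal sketch of a "design model": first-order extrapolation of system behavior
        # using sensitivity derivatives evaluated once at a baseline design, instead of
        # re-running the full behavior simulation for every perturbed design variable set.
        import numpy as np

        def behavior(x):
            """Toy stand-in for an expensive behavior simulation (e.g., a structural response)."""
            return np.array([x[0] ** 2 + np.sin(x[1]), x[0] * x[1]])

        def sensitivities(f, x0, h=1e-6):
            """Finite-difference Jacobian; a real system would use analytic sensitivity analysis."""
            y0 = f(x0)
            J = np.zeros((len(y0), len(x0)))
            for j in range(len(x0)):
                xp = x0.copy(); xp[j] += h
                J[:, j] = (f(xp) - y0) / h
            return y0, J

        x0 = np.array([1.0, 0.5])           # baseline design variables
        y0, J = sensitivities(behavior, x0)

        dx = np.array([0.05, -0.02])        # proposed design change
        y_extrapolated = y0 + J @ dx        # cheap design-model prediction
        y_actual = behavior(x0 + dx)        # what a full re-simulation would give
        print(y_extrapolated, y_actual)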

  14. Machining and characterization of self-reinforced polymers

    NASA Astrophysics Data System (ADS)

    Deepa, A.; Padmanabhan, K.; Kuppan, P.

    2017-11-01

    This paper focuses on obtaining the mechanical properties of self-reinforced composite samples, assessing the effect of different machining techniques on them, and deriving the best machining method. The samples were fabricated by hot compaction, subjected to tensile and flexural tests, and the corresponding loads were calculated. Such composites are usually machined using conventional methods because most industries lack advanced machinery. Advanced non-conventional methods such as abrasive water jet machining were also used. These machining techniques aim to give better output for composite materials with good mechanical properties compared to conventional methods. However, the use of non-conventional methods causes changes in workpiece and tool properties and is more economical than conventional methods. The work concludes by identifying the machining method best suited to these self-reinforced composites, with and without defects, and by using Scanning Electron Microscope (SEM) analysis to compare the microstructure of the PP and PE samples.

  15. Design and implementation of a sigma delta technology based pulse oximeter's acquisition stage

    NASA Astrophysics Data System (ADS)

    Rossi, E. E.; Peñalva, A.; Schaumburg, F.

    2011-12-01

    Pulse oximetry is a widely used tool in medical practice for estimating a patient's fraction of hemoglobin bound to oxygen. Conventional oximetry presents limitations when baseline changes occur or when the signals involved have low amplitude. The aim of this paper is to simultaneously address these constraints and simplify the circuitry needed by using ΣΔ technology. For this purpose, a board for the acquisition of the needed signals was developed, together with PC-based software which controls it and displays and processes the acquired information in real time. Laboratory and field tests were also designed and executed to verify the performance of this equipment in adverse situations. A simple, robust, and economical instrument was achieved, capable of obtaining signals even in situations where conventional oximetry fails.
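
    To illustrate the ΣΔ principle the acquisition stage relies on (not the authors' board or firmware), the sketch below models a textbook first-order sigma-delta modulator and recovers the signal by crude decimation; the sampling rate, oversampling ratio, and test tone are arbitrary assumptions.

        # First-order sigma-delta modulator model, to illustrate the acquisition principle
        # (a generic textbook model; it is not the authors' circuit or firmware).
        import numpy as np

        def sigma_delta(signal):
            integrator, bits = 0.0, []
            for s in signal:
                feedback = 1.0 if (bits and bits[-1]) else -1.0   # previous 1-bit output
                integrator += s - feedback
                bits.append(integrator >= 0.0)                    # 1-bit quantizer
            return np.array(bits, dtype=float) * 2 - 1            # map {0,1} -> {-1,+1}

        fs, osr = 800, 64                             # output rate (Hz) and oversampling ratio
        t = np.arange(fs * osr) / (fs * osr)          # one second of oversampled time
        x = 0.5 * np.sin(2 * np.pi * 5 * t)           # slow test tone standing in for a pleth wave

        bitstream = sigma_delta(x)
        recovered = bitstream.reshape(-1, osr).mean(axis=1)       # crude decimation filter
        print("max recovery error:", np.max(np.abs(recovered - x.reshape(-1, osr).mean(axis=1))))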

  16. Slush hydrogen liquid level system

    NASA Technical Reports Server (NTRS)

    Hamlet, J. F.; Adams, R. G.

    1972-01-01

    A discrete capacitance liquid level system, developed specifically for slush hydrogen but applicable to LOX, LN2, LH2, and RP1 without modification, is described. The signal processing portion of the system is compatible with conventional liquid level sensors. Compatibility with slush hydrogen was achieved by designing the sensor with adequate spacing while retaining the electrical characteristics of conventional sensors. Tests indicate excellent stability of the system over a temperature range of -20 C to 70 C for the circuit, and down to cryogenic temperatures for the sensor. The sensor was tested up to 40 g's rms random vibration with no damage. Operation with 305 m of cable between the sensor and signal processor was demonstrated. It is concluded that this design is more than adequate for most flight and ground applications.

  17. New microwave-integrated Soxhlet extraction. An advantageous tool for the extraction of lipids from food products.

    PubMed

    Virot, Matthieu; Tomao, Valérie; Colnagui, Giulio; Visinoni, Franco; Chemat, Farid

    2007-12-07

    A new process of Soxhlet extraction assisted by microwave was designed and developed. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. A second-order central composite design (CCD) has been used to investigate the performance of the new device. The results provided by analysis of variance and a Pareto chart indicated that extraction time was the most important factor, followed by leaching time. The response surface methodology allowed us to determine optimal conditions for olive oil extraction: 13 min of extraction time, 17 min of leaching time, and 720 W of irradiation power. The proposed process is suitable for lipid determination in food. Microwave-integrated Soxhlet (MIS) extraction has been compared with a conventional technique, Soxhlet extraction, for the extraction of oil from olives (Aglandau, Vaucluse, France). The oils extracted by MIS for 32 min were quantitatively (yield) and qualitatively (fatty acid composition) similar to those obtained by conventional Soxhlet extraction for 8 h. MIS is a green technology and appears to be a good alternative for the extraction of fats and oils from food products.
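
    The response-surface step described here can be sketched as follows: fit a full second-order model to the design points and locate its stationary point. The data below are synthetic placeholders generated around a known optimum, not the paper's measurements, and the factor coding is assumed.

        # Sketch of the response-surface step: fit a second-order model to design points and
        # locate its optimum. The points and yields below are synthetic placeholders (a real
        # study would use the central-composite design runs and measured responses).
        import numpy as np
        from itertools import combinations_with_replacement

        def quadratic_features(X):
            """[1, x_i, x_i*x_j] feature matrix for a full second-order model."""
            cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
            cols += [X[:, i] * X[:, j]
                     for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
            return np.column_stack(cols)

        rng = np.random.default_rng(0)
        # Coded factors: extraction time, leaching time, irradiation power.
        X = rng.uniform(-1, 1, size=(20, 3))
        true_opt = np.array([0.3, 0.6, -0.2])
        y = 40 - np.sum((X - true_opt) ** 2, axis=1) + rng.normal(0, 0.1, 20)   # synthetic yield

        beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)

        # Stationary point of y = b0 + b'x + x'Bx  ->  x* = -0.5 * B^{-1} b
        k = 3
        b = beta[1:1 + k]
        B = np.zeros((k, k))
        for coef, (i, j) in zip(beta[1 + k:], combinations_with_replacement(range(k), 2)):
            B[i, j] += coef / (1 if i == j else 2)
            B[j, i] = B[i, j]
        print("estimated optimum (coded units):", -0.5 * np.linalg.solve(B, b))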

  18. Knowledge-Based Manufacturing and Structural Design for a High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.

    1994-01-01

    The aerospace industry is currently addressing the problem of integrating manufacturing and design. To address the difficulties associated with using many conventional procedural techniques and algorithms, one feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors present their reasons for selecting a KBS to integrate design and manufacturing. A methodology for an aircraft producibility assessment is proposed, utilizing a KBS for manufacturing process selection, that addresses both procedural and heuristic aspects of designing and manufacturing of a High Speed Civil Transport (HSCT) wing. A cost model is discussed that would allow system level trades utilizing information describing the material characteristics as well as the manufacturing process selections. Statements of future work conclude the paper.

  19. Low temperature multi-alkali photocathode processing technique for sealed intensified CCD tubes

    NASA Technical Reports Server (NTRS)

    Doliber, D. L.; Dozier, E. E.; Wenzel, H.; Beaver, E. A.; Hier, R. G.

    1989-01-01

    A low temperature photocathode process has been used to fabricate an intensified CCD visual photocathode image tube, by incorporating a thinned, backside-illuminated CCD as the target anode of a digicon tube of Hubble Space Telescope (HST) design. The CCD digicon tube employs the HST's sodium bialkali photocathode and MgF2 substrate, thereby allowing a direct photocathode quantum efficiency comparison between photocathodes produced by the presently employed low temperature process and those of the conventional high temperature process. Attention is given to the processing chamber used, as well as the details of gas desorption and photocathode processing.

  20. The effect of hearing aid technologies on listening in an automobile

    PubMed Central

    Wu, Yu-Hsiang; Stangl, Elizabeth; Bentler, Ruth A.; Stanziola, Rachel W.

    2014-01-01

    Background Communication while traveling in an automobile often is very difficult for hearing aid users. This is because the automobile /road noise level is usually high, and listeners/drivers often do not have access to visual cues. Since the talker of interest usually is not located in front of the driver/listener, conventional directional processing that places the directivity beam toward the listener’s front may not be helpful, and in fact, could have a negative impact on speech recognition (when compared to omnidirectional processing). Recently, technologies have become available in commercial hearing aids that are designed to improve speech recognition and/or listening effort in noisy conditions where talkers are located behind or beside the listener. These technologies include (1) a directional microphone system that uses a backward-facing directivity pattern (Back-DIR processing), (2) a technology that transmits audio signals from the ear with the better signal-to-noise ratio (SNR) to the ear with the poorer SNR (Side-Transmission processing), and (3) a signal processing scheme that suppresses the noise at the ear with the poorer SNR (Side-Suppression processing). Purpose The purpose of the current study was to determine the effect of (1) conventional directional microphones and (2) newer signal processing schemes (Back-DIR, Side-Transmission, and Side-Suppression) on listener’s speech recognition performance and preference for communication in a traveling automobile. Research design A single-blinded, repeated-measures design was used. Study Sample Twenty-five adults with bilateral symmetrical sensorineural hearing loss aged 44 through 84 years participated in the study. Data Collection and Analysis The automobile/road noise and sentences of the Connected Speech Test (CST) were recorded through hearing aids in a standard van moving at a speed of 70 miles/hour on a paved highway. The hearing aids were programmed to omnidirectional microphone, conventional adaptive directional microphone, and the three newer schemes. CST sentences were presented from the side and back of the hearing aids, which were placed on the ears of a manikin. The recorded stimuli were presented to listeners via earphones in a sound treated booth to assess speech recognition performance and preference with each programmed condition. Results Compared to omnidirectional microphones, conventional adaptive directional processing had a detrimental effect on speech recognition when speech was presented from the back or side of the listener. Back-DIR and Side-Transmission processing improved speech recognition performance (relative to both omnidirectional and adaptive directional processing) when speech was from the back and side, respectively. The performance with Side-Suppression processing was better than with adaptive directional processing when speech was from the side. The participants’ preferences for a given processing scheme were generally consistent with speech recognition results. Conclusions The finding that performance with adaptive directional processing was poorer than with omnidirectional microphones demonstrates the importance of selecting the correct microphone technology for different listening situations. The results also suggest the feasibility of using hearing aid technologies to provide a better listening experience for hearing aid users in automobiles. PMID:23886425

  1. Smart Screening System (S3) In Taconite Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daryoush Allaei; Angus Morison; David Tarnowski

    2005-09-01

    The conventional screening machines used in processing plants have had undesirably high noise and vibration levels. They also have had unsatisfactorily low screening efficiency, high energy consumption, high maintenance cost, low productivity, and poor worker safety. These conventional vibrating machines have been used in almost every processing plant. Most of the current material separation technology uses heavy and inefficient electric motors with an unbalanced rotating mass to generate the shaking. In addition to being excessively noisy, inefficient, and high-maintenance, these vibrating machines are often the bottleneck in the entire process. Furthermore, these motors, along with the vibrating machines and supporting structure, shake other machines and structures in the vicinity. The latter increases maintenance costs while reducing worker health and safety. The conventional vibrating fine screens at taconite processing plants have had the same problems as those listed above. This has resulted in lower screening efficiency, higher energy and maintenance cost, lower productivity, and worker safety concerns. The focus of this work is on the design of a high performance screening machine suitable for taconite processing plants. SmartScreens{trademark} technology uses miniaturized motors, based on smart materials, to generate the shaking. The underlying technologies are Energy Flow Control{trademark} and Vibration Control by Confinement{trademark}. These concepts are used to direct energy flow and confine energy efficiently and effectively to the screen function. The SmartScreens{trademark} technology addresses problems related to noise and vibration, screening efficiency, productivity, and maintenance cost and worker safety. Successful development of SmartScreens{trademark} technology will bring drastic changes to the screening and physical separation industry. The final designs for key components of the SmartScreens{trademark} have been developed. The key components include the smart motor and associated electronics, resonators, and supporting structural elements. It is shown that the smart motors have an acceptable life and performance. Resonator (or motion amplifier) designs are selected based on the final system requirement and vibration characteristics. All the components for a fully functional prototype are fabricated. The development program is on schedule. The last semi-annual report described the process of FE model validation and correlation with experimental data in terms of dynamic performance and predicted stresses. It also detailed efforts into making the supporting structure less important to system performance. Finally, an introduction into the dry application concept was presented. Since then, the design refinement phase was completed. This has resulted in a Smart Screen design that meets performance targets both in the dry condition and with taconite slurry flow using PZT motors. Furthermore, this system was successfully demonstrated for the DOE and partner companies at the Coleraine Mineral Research Laboratory in Coleraine, Minnesota.

  2. Efficiency and economics of large scale hydrogen liquefaction. [for future generation aircraft requirements

    NASA Technical Reports Server (NTRS)

    Baker, C. R.

    1975-01-01

    Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.

  3. Extended performance solar electric propulsion thrust system study. Volume 4: Thruster technology evaluation

    NASA Technical Reports Server (NTRS)

    Poeschel, R. L.; Hawthorne, E. I.; Weisman, Y. C.; Frisman, M.; Benson, G. C.; Mcgrath, R. J.; Martinelli, R. M.; Linsenbardt, T. L.; Beattie, J. R.

    1977-01-01

    Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30 cm engineering model thruster as the technology base. Emphasis was placed on relatively high power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept and is designed to interface with the Space Shuttle.

  4. Extended performance solar electric propulsion thrust system study. Volume 2: Baseline thrust system

    NASA Technical Reports Server (NTRS)

    Poeschel, R. L.; Hawthorne, E. I.

    1977-01-01

    Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30- cm engineering model thruster as the technology base. Emphasis was placed on relatively high-power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power-processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept and is designed to interface with the space shuttle.

  5. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.

  6. Optimization of a novel enzyme treatment process for early-stage processing of sheepskins.

    PubMed

    Lim, Y F; Bronlund, J E; Allsop, T F; Shilton, A N; Edmonds, R L

    2010-01-01

    An enzyme treatment process for early-stage processing of sheepskins has been previously reported by the Leather and Shoe Research Association of New Zealand (LASRA) as an alternative to current industry operations. The newly developed process had marked benefits over conventional processing in terms of a lowered energy usage (73%), processing time (47%) as well as water use (49%), but had been developed as a "proof of principle". The objective of this work was to develop the process further to a stage ready for adoption by industry. Mass balancing was used to investigate potential modifications for the process based on the understanding developed from a detailed analysis of preliminary design trials. Results showed that a configuration utilising a 2 stage counter-current system for the washing stages and segregation and recycling of enzyme float prior to dilution in the neutralization stage was a significant improvement. Benefits over conventional processing include a reduction of residual TDS by 50% at the washing stages and 70% savings on water use overall. Benefits over the un-optimized LASRA process are reduction of solids in product after enzyme treatment and neutralization stages by 30%, additional water savings of 21%, as well as 10% savings of enzyme usage.

  7. The NASA Exploration Design Team; Blueprint for a New Design Paradigm

    NASA Technical Reports Server (NTRS)

    Oberto, Robert E.; Nilsen, Erik; Cohen, Ron; Wheeler, Rebecca; DeFlorio, Paul

    2005-01-01

    NASA has chosen JPL to deliver a NASA-wide rapid-response real-time collaborative design team to perform rapid execution of program, system, mission, and technology trade studies. This team will draw on the expertise of all NASA centers and external partners necessary. The NASA Exploration Design Team (NEDT) will be led by NASA Headquarters, with field centers and partners added according to the needs of each study. Through real-time distributed collaboration we will effectively bring all NASA field centers directly inside Headquarters. JPL's Team X pioneered the technique of real time collaborative design 8 years ago. Since its inception, Team X has performed over 600 mission studies and has reduced per-study cost by a factor of 5 and per-study duration by a factor of 10 compared to conventional design processes. The Team X concept has spread to other NASA centers, industry, academia, and international partners. In this paper, we discuss the extension of the JPL Team X process to the NASA-wide collaborative design team. We discuss the architecture for such a process and elaborate on the implementation challenges of this process. We further discuss our current ideas on how to address these challenges.

  8. Micro- to Macroroughness of Additively Manufactured Titanium Implants in Terms of Coagulation and Contact Activation.

    PubMed

    Klingvall Ek, Rebecca; Hong, Jaan; Thor, Andreas; Bäckström, Mikael; Rännar, Lars-Erik

    This study aimed to evaluate how as-built electron beam melting (EBM) surface properties affect the onset of blood coagulation. The properties of EBM-manufactured implant surfaces for placement have, until now, remained largely unexplored in literature. Implants with conventional designs and custom-made implants have been manufactured using EBM technology and later placed into the human body. Many of the conventional implants used today, such as dental implants, display modified surfaces to optimize bone ingrowth, whereas custom-made implants, by and large, have machined surfaces. However, titanium in itself demonstrates good material properties for the purpose of bone ingrowth. Specimens manufactured using EBM were selected according to their surface roughness and process parameters. EBM-produced specimens, conventional machined titanium surfaces, as well as PVC surfaces for control were evaluated using the slide chamber model. A significant increase in activation was found, in all factors evaluated, between the machined samples and EBM-manufactured samples. The results show that EBM-manufactured implants with as-built surfaces augment the thrombogenic properties. EBM that uses Ti6Al4V powder appears to be a good manufacturing solution for load-bearing implants with bone anchorage. The as-built surfaces can be used "as is" for direct bone contact, although any surface treatment available for conventional implants can be performed on EBM-manufactured implants with a conventional design.

  9. Design of a lamella settler for biomass recycling in continuous ethanol fermentation process.

    PubMed

    Tabera, J; Iznaola, M A

    1989-04-20

    The design and application of a settler to a continuous fermentation process with yeast recycle were studied. The compact lamella-type settler was chosen to avoid the large volumes associated with conventional settling tanks. A rationale of the design method is covered. The sedimentation area was determined by classical batch settling rate tests and sedimentation capacity calculation. Limitations on the residence time of the microorganisms in the settler, rather than sludge thickening considerations, formed the basis for the volume calculation. Fermentation rate tests with yeast after different sedimentation periods were carried out to define a suitable residence time. Continuous cell recycle fermentation runs, performed with the old and new sedimentation devices, show that the lamella settler improves biomass recycling efficiency, enabling the process to operate at higher sugar concentrations and faster dilution rates.
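
    The classical sizing logic implied by the abstract, where clarification capacity is set by projected settling area so that inclined lamella plates pack a large area into a small volume, can be sketched as below; the flow rate, settling velocity, plate geometry, and safety factor are illustrative assumptions, not values from the paper.

        # Classical lamella-settler sizing sketch: the clarification capacity is set by the
        # projected horizontal area, so inclined plates multiply capacity in a small footprint.
        # All numbers are illustrative, not taken from the paper.
        import math

        Q = 0.4 / 3600.0          # broth flow to be clarified, m^3/s (0.4 m^3/h)
        v_settling = 1.2e-4       # yeast floc settling velocity from batch tests, m/s
        safety_factor = 1.5

        A_required = safety_factor * Q / v_settling          # projected horizontal area, m^2

        # Each inclined plate contributes (plate area * cos(theta)) of projected area.
        plate_w, plate_l = 0.25, 0.60                        # m
        theta = math.radians(55)                             # inclination from horizontal
        A_per_plate = plate_w * plate_l * math.cos(theta)

        n_plates = math.ceil(A_required / A_per_plate)
        print(f"required projected area: {A_required:.2f} m^2 -> {n_plates} plates")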

  10. A Discriminative Approach to EEG Seizure Detection

    PubMed Central

    Johnson, Ashley N.; Sow, Daby; Biem, Alain

    2011-01-01

    Seizures are abnormal sudden discharges in the brain with signatures represented in electroencephalograms (EEG). The efficacy of the application of speech processing techniques to discriminate between seizure and non-seizure states in EEGs is reported. The approach accounts for the challenges of unbalanced datasets (seizure and non-seizure), while also showing a system capable of real-time seizure detection. The Minimum Classification Error (MCE) algorithm, which is a discriminative learning algorithm in wide use in speech processing, is applied and compared with conventional classification techniques that have already been applied to the discrimination between seizure and non-seizure states in the literature. The system is evaluated on multi-channel EEG recordings from 22 pediatric patients. Experimental results show that the application of speech processing techniques and MCE compare favorably with conventional classification techniques in terms of classification performance, while requiring less computational overhead. The results strongly suggest the possibility of deploying the designed system at the bedside.
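
    A hedged sketch of MCE training follows: per-class discriminant functions, a misclassification measure, and gradient descent on a sigmoid-smoothed error count. Linear discriminants and Gaussian toy data stand in for the EEG feature vectors and model forms actually used in the paper.

        # Sketch of Minimum Classification Error (MCE) training with linear discriminants.
        # Toy 2-class data stands in for the EEG feature vectors used in the paper.
        import numpy as np

        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(-1, 1, (200, 4)), rng.normal(+1, 1, (200, 4))])
        labels = np.array([0] * 200 + [1] * 200)

        W = np.zeros((2, 4)); b = np.zeros(2)             # one discriminant g_k(x) per class

        def smoothed_loss(d, gamma=2.0):
            return 1.0 / (1.0 + np.exp(-gamma * d))       # sigmoid-smoothed 0-1 loss

        lr, gamma = 0.05, 2.0
        for _ in range(200):
            for x, k in zip(X, labels):
                g = W @ x + b
                other = 1 - k
                d = -g[k] + g[other]                      # misclassification measure (2-class case)
                s = smoothed_loss(d, gamma)
                grad_d = gamma * s * (1 - s)              # d(loss)/dd
                # dd/dW[k] = -x and dd/dW[other] = +x (similarly for the biases)
                W[k]     += lr * grad_d * x
                W[other] -= lr * grad_d * x
                b[k]     += lr * grad_d
                b[other] -= lr * grad_d
            lr *= 0.99

        pred = np.argmax(X @ W.T + b, axis=1)
        print("training accuracy:", (pred == labels).mean())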

  11. Fuzzy control of a fluidized bed dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taprantzis, A.V.; Siettos, C.I.; Bafas, G.V.

    1997-05-01

    Fluidized bed dryers are utilized in almost every area of drying applications and therefore improved control strategies are always of great interest. The nonlinear character of the process, exhibited in the mathematical model and the open loop analysis, implies that a fuzzy logic controller is appropriate because, in contrast with conventional control schemes, fuzzy control inherently compensates for process nonlinearities and exhibits more robust behavior. In this study, a fuzzy logic controller is proposed; its design is based on a heuristic approach and its performance is compared against a conventional PI controller for a variety of responses. It is shown that the fuzzy controller exhibits a remarkable dynamic behavior, equivalent if not better than the PI controller, for a wide range of disturbances. In addition, the proposed fuzzy controller seems to be less sensitive to the nonlinearities of the process, achieves energy savings and enables MIMO control.
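
    The structure of such a controller can be sketched generically as below: triangular membership functions on the error and its rate of change, a small rule table, and centroid defuzzification. The membership ranges, rule base, and output actions are textbook placeholders, not the rules designed for the dryer.

        # Generic Mamdani-style fuzzy controller sketch (error / change-of-error inputs,
        # triangular membership functions, centroid defuzzification). The rule base is a
        # textbook placeholder, not the heuristic rule base designed for the fluidized bed dryer.
        LABELS = ["N", "Z", "P"]                       # negative, zero, positive
        MFS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
        OUT = {"N": -1.0, "Z": 0.0, "P": 1.0}          # singleton output actions (heater change)

        # Rule table: RULES[error_label][derror_label] -> output label
        RULES = {"N": {"N": "P", "Z": "P", "P": "Z"},
                 "Z": {"N": "P", "Z": "Z", "P": "N"},
                 "P": {"N": "Z", "Z": "N", "P": "N"}}

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            return max(0.0, min((x - a) / (b - a) if x <= b else (c - x) / (c - b), 1.0))

        def fuzzy_control(error, derror):
            num, den = 0.0, 0.0
            for e_lab in LABELS:
                for de_lab in LABELS:
                    w = min(tri(error, *MFS[e_lab]), tri(derror, *MFS[de_lab]))  # firing strength
                    num += w * OUT[RULES[e_lab][de_lab]]
                    den += w
            return num / den if den > 0 else 0.0       # centroid of weighted singleton outputs

        # Example: negative error and still falling -> controller pushes the input up.
        print(fuzzy_control(error=-0.6, derror=-0.3))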

  12. Manufacturing process design for multi commodities in agriculture

    NASA Astrophysics Data System (ADS)

    Prasetyawan, Yudha; Santosa, Andrian Henry

    2017-06-01

    High-potential commodities within particular agricultural sectors should be accompanied by the maximum benefit value that can be attained by both local farmers and business players. In several cases, the business players are small-medium enterprises (SMEs), which have limited resources to perform the value-adding processing of local commodities into potential products. The weaknesses of SMEs include manual production processes with low productivity, limited capacity to maintain prices, and unattractive packaging resulting from conventional production. Agricultural commodities are commonly processed into products such as flour, chips, crackers, oil, and juice. This research was initiated by collecting data through interviews, particularly to obtain the perspectives of SMEs as the business players. Subsequently, the information was processed using Quality Function Deployment (QFD) to determine the House of Quality from the first to the fourth level. A proposed design resulting from the QFD was produced, evaluated with the Technology Assessment Model (TAM), and refined into a revised design. Finally, the revised design was analyzed from a financial perspective to obtain the cost structure of investment, operation, maintenance, and labor. The machine performing the manufacturing process, resulting from the revised design, was prototyped and tested to determine the initial production process. The designed manufacturing process offers a Net Present Value (NPV) of IDR 337,897,651, compared with IDR 9,491,522 for the existing process, based on similar production inputs.
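
    Since the comparison is quoted in net-present-value terms, the standard NPV computation is sketched below; the discount rate, horizon, and all cash-flow figures are hypothetical and deliberately different from the study's numbers.

        # Standard net-present-value computation of the kind used to compare the proposed
        # machine with the existing process. Cash flows and discount rate are hypothetical.
        def npv(rate, cash_flows):
            """cash_flows[0] is the year-0 (investment) flow, usually negative."""
            return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

        investment = -150_000_000                     # IDR, year 0 (hypothetical)
        annual_net = 90_000_000 - 20_000_000 - 8_000_000 - 12_000_000   # revenue - op - maint - labor
        flows = [investment] + [annual_net] * 5       # five-year horizon

        print(f"NPV at 12%: IDR {npv(0.12, flows):,.0f}")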

  13. Centrifugal Sieve for Gravity-Level-Independent Size Segregation of Granular Materials

    NASA Technical Reports Server (NTRS)

    Walton, Otis R.; Dreyer, Christopher; Riedel, Edward

    2013-01-01

    Conventional size segregation or screening in batch mode, using stacked vibrated screens, is often a time-consuming process. Utilization of centrifugal force instead of gravity as the primary body force can significantly shorten the time to segregate feedstock into a set of different-sized fractions. Likewise, under reduced gravity or microgravity, a centrifugal sieve system would function as well as it does terrestrially. When vibratory and mechanical blade sieving screens designed for terrestrial conditions were tested under lunar gravity conditions, they did not function well. The centrifugal sieving design of this technology overcomes the issues that prevented sieves designed for terrestrial conditions from functioning under reduced gravity. These sieves feature a rotating outer (cylindrical or conical) screen wall, rotating fast enough for the centrifugal forces near the wall to hold granular material against the rotating screen. Conventional centrifugal sieves have a stationary screen and rapidly rotating blades that shear the granular solid near the stationary screen, and effect the sieving process assisted by the airflow inside the unit. The centrifugal sieves of this new design may (or may not) have an inner blade or blades, moving relative to the rotating wall screen. Some continuous flow embodiments would have no inner auger or blades, but achieve axial motion through vibration. In all cases, the shearing action is gentler than conventional centrifugal sieves, which have very high velocity differences between the stationary outer screen and the rapidly rotating blades. The new design does not depend on airflow in the sieving unit, so it will function just as well in vacuum as in air. One advantage of the innovation for batch sieving is that a batch-mode centrifugal sieve may accomplish the same sieving operation in much less time than a conventional stacked set of vibrated screens (which utilize gravity as the primary driving force for size separation). In continuous mode, the centrifugal sieves can provide steady streams of fine and coarse material separated from a mixed feedstock flow stream. The centrifugal sieves can be scaled to any desired size and/or mass flow rate. Thus, they could be made in sizes suitable for small robotic exploratory missions, or for semi-permanent processing of regolith for extraction of volatiles of minerals. An advantage of the continuous-mode system is that it can be made with absolutely no gravity flow components for feeding material into, or for extracting the separated size streams from, the centrifugal sieve. Thus, the system is capable of functioning in a true microgravity environment. Another advantage of the continuous-mode system is that some embodiments of the innovation have no internal blades or vanes, and thus, can be designed to handle a very wide range of feedstock sizes, including occasional very large oversized pieces, without jamming or seizing up.
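    The gravity-level independence rests on making the centripetal acceleration at the screen wall the dominant body force; a short worked calculation of the required drum speed is sketched below with illustrative radius and g-level values.

        # Drum speed needed for the centripetal acceleration at the screen wall to reach a
        # chosen multiple of Earth gravity (illustrative numbers). Because the chosen omega
        # dominates local gravity, the same unit works terrestrially, on the Moon, or in orbit.
        import math

        g_earth = 9.81            # m/s^2
        r = 0.15                  # screen wall radius, m
        target_g = 5.0            # desired wall acceleration, in multiples of g_earth

        omega = math.sqrt(target_g * g_earth / r)        # from a = omega^2 * r
        rpm = omega * 60 / (2 * math.pi)
        print(f"{rpm:.0f} rpm gives {target_g:g} g at r = {r} m")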

  14. Optical surface analysis: a new technique for the inspection and metrology of optoelectronic films and wafers

    NASA Astrophysics Data System (ADS)

    Bechtler, Laurie; Velidandla, Vamsi

    2003-04-01

    In response to demand for higher volumes and greater product capability, integrated optoelectronic device processing is rapidly increasing in complexity, benefiting from techniques developed for conventional silicon integrated circuit processing. The needs for high product yield and low manufacturing cost are also similar to the silicon wafer processing industry. This paper discusses the design and use of an automated inspection instrument called the Optical Surface Analyzer (OSA) to evaluate two critical production issues in optoelectronic device manufacturing: (1) film thickness uniformity, and (2) defectivity at various process steps. The OSA measurement instrument is better suited to photonics process development than most equipment developed for conventional silicon wafer processing in two important ways: it can handle both transparent and opaque substrates (unlike most inspection and metrology tools), and it is a full-wafer inspection method that captures defects and film variations over the entire substrate surface (unlike most film thickness measurement tools). Measurement examples will be provided in the paper for a variety of films and substrates used for optoelectronics manufacturing.

  15. 40 CFR 408.210 - Applicability; description of the non-Alaskan conventional bottom fish processing subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...-Alaskan conventional bottom fish processing subcategory. 408.210 Section 408.210 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Non-Alaskan Conventional Bottom Fish Processing Subcategory § 408.210 Applicability; description of the non-Alaskan conventional bottom fish processing subcategory. The provisions of...

  16. 40 CFR 408.210 - Applicability; description of the non-Alaskan conventional bottom fish processing subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...-Alaskan conventional bottom fish processing subcategory. 408.210 Section 408.210 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Non-Alaskan Conventional Bottom Fish Processing Subcategory § 408.210 Applicability; description of the non-Alaskan conventional bottom fish processing subcategory. The provisions of...

  17. 40 CFR 408.210 - Applicability; description of the non-Alaskan conventional bottom fish processing subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-Alaskan conventional bottom fish processing subcategory. 408.210 Section 408.210 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Non-Alaskan Conventional Bottom Fish Processing Subcategory § 408.210 Applicability; description of the non-Alaskan conventional bottom fish processing subcategory. The provisions of...

  18. 40 CFR 408.210 - Applicability; description of the non-Alaskan conventional bottom fish processing subcategory.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...-Alaskan conventional bottom fish processing subcategory. 408.210 Section 408.210 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Non-Alaskan Conventional Bottom Fish Processing Subcategory § 408.210 Applicability; description of the non-Alaskan conventional bottom fish processing subcategory. The provisions of...

  19. 40 CFR 408.210 - Applicability; description of the non-Alaskan conventional bottom fish processing subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-Alaskan conventional bottom fish processing subcategory. 408.210 Section 408.210 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Non-Alaskan Conventional Bottom Fish Processing Subcategory § 408.210 Applicability; description of the non-Alaskan conventional bottom fish processing subcategory. The provisions of...

  20. High-Fidelity Multidisciplinary Design Optimization of Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Martins, Joaquim R. R. A.; Kenway, Gaetan K. W.; Burdette, David; Jonsson, Eirikur; Kennedy, Graeme J.

    2017-01-01

    To evaluate new airframe technologies we need design tools based on high-fidelity models that consider multidisciplinary interactions early in the design process. The overarching goal of this NRA is to develop tools that enable high-fidelity multidisciplinary design optimization of aircraft configurations, and to apply these tools to the design of high aspect ratio flexible wings. We develop a geometry engine that is capable of quickly generating conventional and unconventional aircraft configurations including the internal structure. This geometry engine features adjoint derivative computation for efficient gradient-based optimization. We also added overset capability to a computational fluid dynamics solver, complete with an adjoint implementation and semiautomatic mesh generation. We also developed an approach to constraining buffet and started the development of an approach for constraining flutter. On the applications side, we developed a new common high-fidelity model for aeroelastic studies of high aspect ratio wings. We performed optimal design trade-offs between fuel burn and aircraft weight for metal, conventional composite, and carbon nanotube composite wings. We also assessed a continuous morphing trailing edge technology applied to high aspect ratio wings. This research resulted in the publication of 26 manuscripts so far, and the developed methodologies were used in two other NRAs.

  1. Evolution and Development of Effective Feedstock Specifications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garold Gresham; Rachel Emerson; Amber Hoover

    The U.S. Department of Energy promotes the production of a range of liquid fuels and fuel blend stocks from lignocellulosic biomass feedstocks by funding fundamental and applied research that advances the state of technology in biomass collection, conversion, and sustainability. As part of its involvement in this program, the Idaho National Laboratory (INL) investigates the feedstock logistics economics and sustainability of these fuels. The 2012 feedstock logistics milestone demonstrated that for high-yield areas that minimize the transportation distances of a low-density, unstable biomass, we could achieve a delivered cost of $35/ton. Based on current conventional equipment and processes, the 2012 logistics design is able to deliver the volume of biomass needed to fulfill the 2012 Renewable Fuel Standard’s targets for ethanol. However, the Renewable Fuel Standard’s volume targets are continuing to increase and are expected to peak in 2022 at 36 billion gallons. Meeting these volume targets and achieving a national-scale biofuels industry will require expansion of production capacity beyond the 2012 Conventional Feedstock Supply Design Case to access diverse available feedstocks, regardless of their inherent ability to meet preliminary biorefinery quality feedstock specifications. Implementation of quality specifications (specs), as outlined in the 2017 Design Case – “Feedstock Supply System Design and Economics for Conversion of Lignocellulosic Biomass to Hydrocarbon Fuels” (in progress), requires insertion of deliberate, active quality controls into the feedstock supply chain, whereas the 2012 Conventional Design only utilizes passive quality controls.

  2. Accessing microfluidics through feature-based design software for 3D printing.

    PubMed

    Shankles, Peter G; Millet, Larry J; Aufrecht, Jayde A; Retterer, Scott T

    2018-01-01

    Additive manufacturing has been a cornerstone of the product development pipeline for decades, playing an essential role in the creation of both functional and cosmetic prototypes. In recent years, the prospects for distributed and open source manufacturing have grown tremendously. This growth has been enabled by an expanding library of printable materials, low-cost printers, and communities dedicated to platform development. The microfluidics community has embraced this opportunity to integrate 3D printing into the suite of manufacturing strategies used to create novel fluidic architectures. The rapid turnaround time and low cost to implement these strategies in the lab makes 3D printing an attractive alternative to conventional micro- and nanofabrication techniques. In this work, the production of multiple microfluidic architectures using a hybrid 3D printing-soft lithography approach is demonstrated and shown to enable rapid device fabrication with channel dimensions that take advantage of laminar flow characteristics. The fabrication process outlined here is underpinned by the implementation of custom design software with an integrated slicer program that replaces less intuitive computer aided design and slicer software tools. Devices are designed in the program by assembling parameterized microfluidic building blocks. The fabrication process and flow control within 3D printed devices were demonstrated with a gradient generator and two droplet generator designs. Precise control over the printing process allowed 3D microfluidics to be printed in a single step by extruding bridge structures to 'jump-over' channels in the same plane. This strategy was shown to integrate with conventional nanofabrication strategies to simplify the operation of a platform that incorporates both nanoscale features and 3D printed microfluidics.

  3. Accessing microfluidics through feature-based design software for 3D printing

    PubMed Central

    Shankles, Peter G.; Millet, Larry J.; Aufrecht, Jayde A.

    2018-01-01

    Additive manufacturing has been a cornerstone of the product development pipeline for decades, playing an essential role in the creation of both functional and cosmetic prototypes. In recent years, the prospects for distributed and open source manufacturing have grown tremendously. This growth has been enabled by an expanding library of printable materials, low-cost printers, and communities dedicated to platform development. The microfluidics community has embraced this opportunity to integrate 3D printing into the suite of manufacturing strategies used to create novel fluidic architectures. The rapid turnaround time and low cost to implement these strategies in the lab makes 3D printing an attractive alternative to conventional micro- and nanofabrication techniques. In this work, the production of multiple microfluidic architectures using a hybrid 3D printing-soft lithography approach is demonstrated and shown to enable rapid device fabrication with channel dimensions that take advantage of laminar flow characteristics. The fabrication process outlined here is underpinned by the implementation of custom design software with an integrated slicer program that replaces less intuitive computer aided design and slicer software tools. Devices are designed in the program by assembling parameterized microfluidic building blocks. The fabrication process and flow control within 3D printed devices were demonstrated with a gradient generator and two droplet generator designs. Precise control over the printing process allowed 3D microfluidics to be printed in a single step by extruding bridge structures to ‘jump-over’ channels in the same plane. This strategy was shown to integrate with conventional nanofabrication strategies to simplify the operation of a platform that incorporates both nanoscale features and 3D printed microfluidics. PMID:29596418
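
    Both records above describe assembling devices from parameterized microfluidic building blocks. The sketch below is a generic illustration of that idea, not the authors' design software: hypothetical channel blocks are composed in series, and the laminar-flow behavior follows from their dimensions via a standard rectangular-channel resistance approximation; all dimensions and the applied pressure are assumed values.

    ```python
    # Generic "parameterized building block" sketch for planar microfluidics.
    # Block dimensions, the layout, and the pressure are hypothetical.
    from dataclasses import dataclass

    MU_WATER = 1.0e-3  # Pa*s, dynamic viscosity of water near room temperature

    @dataclass
    class Channel:
        length: float  # m
        width: float   # m
        height: float  # m (height <= width assumed)

        def resistance(self) -> float:
            """Approximate hydraulic resistance of a rectangular microchannel (laminar flow)."""
            w, h, L = self.width, self.height, self.length
            return 12 * MU_WATER * L / (w * h**3 * (1 - 0.63 * h / w))

    # Compose a toy device from blocks connected in series.
    device = [
        Channel(length=5e-3, width=200e-6, height=100e-6),   # inlet
        Channel(length=20e-3, width=100e-6, height=100e-6),  # serpentine mixer (unrolled)
        Channel(length=5e-3, width=200e-6, height=100e-6),   # outlet
    ]

    delta_p = 5e3  # Pa, applied pressure difference
    r_total = sum(block.resistance() for block in device)
    flow_rate = delta_p / r_total  # m^3/s
    print(f"total resistance: {r_total:.3e} Pa*s/m^3")
    print(f"flow rate: {flow_rate * 1e9 * 60:.2f} uL/min")
    ```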

  4. Design of a compact disk-like microfluidic platform for enzyme-linked immunosorbent assay.

    PubMed

    Lai, Siyi; Wang, Shengnian; Luo, Jun; Lee, L James; Yang, Shang-Tian; Madou, Marc J

    2004-04-01

    This paper presents an integrated microfluidic device on a compact disk (CD) that performs an enzyme-linked immunosorbent assay (ELISA) for rat IgG from a hybridoma cell culture. Centrifugal and capillary forces were used to control the flow sequence of different solutions involved in the ELISA process. The microfluidic device was fabricated on a plastic CD. Each step of the ELISA process was carried out automatically by controlling the rotation speed of the CD. The work on analysis of rat IgG from hybridoma culture showed that the microchip-based ELISA has the same detection range as the conventional method on the 96-well microtiter plate but has advantages such as less reagent consumption and shorter assay time over the conventional method.
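
    The rotation-speed sequencing described above rests on a simple pressure balance: a liquid plug passes a capillary valve once centrifugal pressure exceeds the capillary barrier. The back-of-envelope sketch below illustrates that balance; the valve geometry, contact angle, and plug position are assumed values, not taken from the ELISA disk in the record.

    ```python
    # Burst-speed estimate for a capillary valve on a spinning disk (illustrative values).
    import math

    rho = 1000.0                 # kg/m^3, aqueous reagent
    sigma = 0.072                # N/m, surface tension of water
    theta = math.radians(110)    # contact angle at a hydrophobic valve (assumed)
    w, h = 100e-6, 50e-6         # valve cross-section, m (assumed)
    r_inner, r_outer = 0.020, 0.030   # radial extent of the liquid plug, m (assumed)

    # Capillary barrier pressure (magnitude) for a rectangular constriction.
    p_capillary = 2 * sigma * abs(math.cos(theta)) * (1 / w + 1 / h)

    # Centrifugal pressure across the plug: dP = rho * omega^2 * r_mean * delta_r.
    r_mean = 0.5 * (r_inner + r_outer)
    delta_r = r_outer - r_inner

    omega_burst = math.sqrt(p_capillary / (rho * r_mean * delta_r))
    rpm_burst = omega_burst * 60 / (2 * math.pi)
    print(f"capillary barrier: {p_capillary:.0f} Pa")
    print(f"burst speed: {rpm_burst:.0f} rpm")
    ```

    Valves placed at different radii or with different cross-sections burst at different speeds, which is how a rotation-speed program can release reagents in a fixed sequence.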

  5. Accelerating Molecular Dynamic Simulation on Graphics Processing Units

    PubMed Central

    Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.

    2009-01-01

    We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337
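
    The heart of such an MD code is the pairwise nonbonded force evaluation, which is what the GPU parallelizes across thousands of threads. The sketch below is a minimal CPU/NumPy illustration of a Lennard-Jones force loop, not the GPU implementation from the record; the parameters and random coordinates are arbitrary.

    ```python
    # Minimal Lennard-Jones force evaluation (no cutoff, no periodic boundaries).
    import numpy as np

    def lj_forces(positions, epsilon=0.238, sigma=0.34):
        """Per-particle Lennard-Jones forces for an array of 3D coordinates."""
        n = len(positions)
        forces = np.zeros_like(positions)
        for i in range(n):
            rij = positions[i] - positions            # vectors from every j to i
            r2 = np.einsum("ij,ij->i", rij, rij)      # squared distances
            r2[i] = np.inf                            # skip self-interaction
            sr6 = (sigma**2 / r2) ** 3
            # F_i = sum_j 24*eps*(2*(sigma/r)^12 - (sigma/r)^6) / r^2 * r_ij
            coef = 24 * epsilon * (2 * sr6**2 - sr6) / r2
            forces[i] = np.sum(coef[:, None] * rij, axis=0)
        return forces

    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 3.0, size=(64, 3))   # nm, arbitrary test configuration
    f = lj_forces(pos)
    print("net force (should be ~0 by Newton's third law):", f.sum(axis=0))
    ```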

  6. Early detection of materials degradation

    NASA Astrophysics Data System (ADS)

    Meyendorf, Norbert

    2017-02-01

    Lightweight components for transportation and aerospace applications are designed for an estimated lifecycle, taking expected mechanical and environmental loads into account. The main reasons for catastrophic failure of components within the expected lifecycle are material inhomogeneities, like pores and inclusions as origins for fatigue cracks, that have not been detected by NDE. However, material degradation by designed or unexpected loading conditions or environmental impacts can accelerate crack initiation or growth. Conventional NDE methods are usually able to detect cracks that are formed at the end of the degradation process, but methods for early detection of fatigue, creep, and corrosion are still a matter of research. For conventional materials, ultrasonic, electromagnetic, or thermographic methods have been demonstrated as promising. Other approaches focus on surface damage, using optical methods or characterization of the residual surface stresses that can significantly affect the creation of fatigue cracks. For conventional metallic materials, material models for nucleation and propagation of damage have been successfully applied for several years. Material microstructure/property relations are well established, and the effect of loading conditions on the component life can be simulated. For advanced materials, for example carbon matrix composites or ceramic matrix composites, the processes of nucleation and propagation of damage are still not fully understood. For these materials, NDE methods can not only be used for periodic inspections but can also significantly contribute to the material scientific knowledge needed to understand and model the behavior of composite materials.

  7. Media processors using a new microsystem architecture designed for the Internet era

    NASA Astrophysics Data System (ADS)

    Wyland, David C.

    1999-12-01

    The demands of digital image processing, communications and multimedia applications are growing more rapidly than traditional design methods can fulfill them. Previously, only custom hardware designs could provide the performance required to meet the demands of these applications. However, hardware design has reached a crisis point. Hardware design can no longer deliver a product with the required performance and cost in a reasonable time for a reasonable risk. Software based designs running on conventional processors can deliver working designs in a reasonable time and with low risk but cannot meet the performance requirements. What is needed is a media processing approach that combines very high performance, a simple programming model, complete programmability, short time to market and scalability. The Universal Micro System (UMS) is a solution to these problems. The UMS is a completely programmable (including I/O) system on a chip that combines hardware performance with the fast time to market, low cost and low risk of software designs.

  8. Design Sketches For Optical Crossbar Switches Intended For Large-Scale Parallel Processing Applications

    NASA Astrophysics Data System (ADS)

    Hartmann, Alfred; Redfield, Steve

    1989-04-01

    This paper discusses the design of large-scale (1000 x 1000) optical crossbar switching networks for use in parallel processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performance of alternative multiple-access channel communications protocols (unslotted and slotted ALOHA and carrier sense multiple access, CSMA) is compared with the performance of the classic arbitrated bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication to arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.
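
    The ALOHA and CSMA comparison in the record rests on standard multiple-access throughput curves; for the two ALOHA variants the closed forms are S = G*exp(-2G) (unslotted) and S = G*exp(-G) (slotted), where G is the offered load in packets per packet time. The short sketch below simply evaluates those textbook formulas.

    ```python
    # Textbook ALOHA throughput curves: peaks at 1/2e (pure) and 1/e (slotted).
    import numpy as np

    G = np.linspace(0.01, 3.0, 300)             # offered load, packets per packet time
    s_pure = G * np.exp(-2 * G)                 # unslotted (pure) ALOHA
    s_slotted = G * np.exp(-G)                  # slotted ALOHA

    print(f"pure ALOHA peak:    S = {s_pure.max():.3f} at G = {G[s_pure.argmax()]:.2f}  (theory 1/2e ~ 0.184)")
    print(f"slotted ALOHA peak: S = {s_slotted.max():.3f} at G = {G[s_slotted.argmax()]:.2f}  (theory 1/e ~ 0.368)")
    ```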

  9. Conductor requirements for high-temperature superconducting utility power transformers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pleva, E. F.; Mehrotra, V.; Schwenterly, S W

    High-temperature superconducting (HTS) coated conductors in utility power transformers must satisfy a set of operating requirements that are driven by two major considerations: HTS transformers must be economically competitive with conventional units, and the conductor must be robust enough to be used in a commercial manufacturing environment. The transformer design and manufacturing process will be described in order to highlight the various requirements that it imposes on the HTS conductor. Spreadsheet estimates of HTS transformer costs allow estimates of the conductor cost required for an HTS transformer to be competitive with a similarly performing conventional unit.

  10. Optical metasurfaces for high angle steering at visible wavelengths

    DOE PAGES

    Lin, Dianmin; Melli, Mauro; Poliakov, Evgeni; ...

    2017-05-23

    Metasurfaces have facilitated the replacement of conventional optical elements with ultrathin and planar photonic structures. Previous designs of metasurfaces were limited to small deflection angles and small ranges of the angle of incidence. Here, we have created two types of Si-based metasurfaces to steer visible light to a large deflection angle. These structures exhibit high diffraction efficiencies over a broad range of angles of incidence. We have demonstrated metasurfaces working both in transmission and reflection modes based on conventional thin film silicon processes that are suitable for the large-scale fabrication of high-performance devices.

  11. Additive Manufacturing of Functional Elements on Sheet Metal

    NASA Astrophysics Data System (ADS)

    Schaub, Adam; Ahuja, Bhrigu; Butzhammer, Lorenz; Osterziel, Johannes; Schmidt, Michael; Merklein, Marion

    The Laser Beam Melting (LBM) process, with its advantages of high design flexibility and free-form manufacturing, is often applied only to a limited extent because of its low productivity and unsuitability for mass production compared with conventional manufacturing processes. In order to overcome these limitations, a hybrid manufacturing methodology is developed combining the additive manufacturing process of laser beam melting with sheet forming processes. With an interest towards the aerospace and medical industries, the material in focus is Ti-6Al-4V. Although Ti-6Al-4V is a commercially established material and its application to the LBM process has been extensively investigated, the combination of LBM of Ti-6Al-4V with sheet metal still needs to be researched. Process dynamics such as high temperature gradients and thermally induced stresses lead to complex stress states at the interaction zone between the sheet and the LBM structure. Within the presented paper, mechanical characterization of hybrid parts is performed by shear testing. The association of shear strength with process parameters is further investigated by analyzing the internal structure of the hybrid geometry at varying energy inputs during the LBM process. In order to compare the hybrid manufacturing methodology with conventional fabrication, the conventional methodologies of subtractive machining and state-of-the-art laser beam melting are evaluated within this work. These processes are analyzed for their mechanical characteristics and productivity by determining the build time and raw material consumption for each case. The paper concludes by presenting the characteristics of the hybrid manufacturing methodology compared to alternative manufacturing technologies.

  12. Design of concrete waste basin in Integrated Temporarily Sanitary Landfill (ITSL) in Siosar, Karo Regency, Indonesia on supporting clean environment and sustainable fertilizers for farmers

    NASA Astrophysics Data System (ADS)

    Ginting, N.; Siahaan, J.; Tarigan, A. P.

    2018-03-01

    A new settlement in Siosar village of Karo Regency has been developed for people whose villages were completely destroyed by the prolonged eruptions of Sinabung. An integrated temporarily sanitary landfill (ITSL) was built there to support the new living environment. The objective of this study is to investigate organic waste decomposition in order to improve the design of the conventional concrete waste basin installed in the ITSL. The study lasted from May until August 2016. A Completely Randomized Design (CRD) was used, in which organic waste was treated using a decomposer with five replications in three composter bins. The decomposition process lasted for three weeks. Research parameters were pH, temperature, waste reduction in weight, C/N, and organic fertilizer production (%). The results of the waste compost were as follows: pH was 9.45, ultimate temperature was 31.6°C, C/N was in the range of 10.5-12.4, waste reduction was 53%, and organic fertilizer production was 47%. Based on the decomposition process and the analysis, it is recommended that the conventional concrete waste basin be divided into three columns, with each column filled with waste only once the previous column is full. It is predicted that by the time the third column is fully occupied, the waste in the first column will already have become a sustainable fertilizer.

  13. Design, fabrication and experimental validation of a novel dry-contact sensor for measuring electroencephalography signals without skin preparation.

    PubMed

    Liao, Lun-De; Wang, I-Jan; Chen, Sheng-Fu; Chang, Jyh-Yeong; Lin, Chin-Teng

    2011-01-01

    In the present study, novel dry-contact sensors for measuring electro-encephalography (EEG) signals without any skin preparation are designed, fabricated by an injection molding manufacturing process and experimentally validated. Conventional wet electrodes are commonly used to measure EEG signals; they provide excellent EEG signals subject to proper skin preparation and conductive gel application. However, a series of skin preparation procedures for applying the wet electrodes is always required and usually creates trouble for users. To overcome these drawbacks, novel dry-contact EEG sensors were proposed for potential operation in the presence or absence of hair and without any skin preparation or conductive gel usage. The dry EEG sensors were designed to contact the scalp surface with 17 spring contact probes. Each probe was designed to include a probe head, plunger, spring, and barrel. The 17 probes were inserted into a flexible substrate using a one-time forming process via an established injection molding procedure. With these 17 spring contact probes, the flexible substrate allows for high geometric conformity between the sensor and the irregular scalp surface to maintain low skin-sensor interface impedance. Additionally, the flexible substrate also initiates a sensor buffer effect, eliminating pain when force is applied. The proposed dry EEG sensor was reliable in measuring EEG signals without any skin preparation or conductive gel usage, as compared with the conventional wet electrodes.

  14. Design, Fabrication and Experimental Validation of a Novel Dry-Contact Sensor for Measuring Electroencephalography Signals without Skin Preparation

    PubMed Central

    Liao, Lun-De; Wang, I-Jan; Chen, Sheng-Fu; Chang, Jyh-Yeong; Lin, Chin-Teng

    2011-01-01

    In the present study, novel dry-contact sensors for measuring electro-encephalography (EEG) signals without any skin preparation are designed, fabricated by an injection molding manufacturing process and experimentally validated. Conventional wet electrodes are commonly used to measure EEG signals; they provide excellent EEG signals subject to proper skin preparation and conductive gel application. However, a series of skin preparation procedures for applying the wet electrodes is always required and usually creates trouble for users. To overcome these drawbacks, novel dry-contact EEG sensors were proposed for potential operation in the presence or absence of hair and without any skin preparation or conductive gel usage. The dry EEG sensors were designed to contact the scalp surface with 17 spring contact probes. Each probe was designed to include a probe head, plunger, spring, and barrel. The 17 probes were inserted into a flexible substrate using a one-time forming process via an established injection molding procedure. With these 17 spring contact probes, the flexible substrate allows for high geometric conformity between the sensor and the irregular scalp surface to maintain low skin-sensor interface impedance. Additionally, the flexible substrate also initiates a sensor buffer effect, eliminating pain when force is applied. The proposed dry EEG sensor was reliable in measuring EEG signals without any skin preparation or conductive gel usage, as compared with the conventional wet electrodes. PMID:22163929

  15. Feedstock and Conversion Supply System Design and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobson, J.; Mohammad, R.; Cafferty, K.

    The success of the earlier logistic pathway designs (Biochemical and Thermochemical) from a feedstock perspective was that it demonstrated that through proper equipment selection and best management practices, conventional supply systems (referred to in this report as “conventional designs,” or specifically the 2012 Conventional Design) can be successfully implemented to address dry matter loss and quality issues, and enable feedstock cost reductions that help to reduce feedstock risk of variable supply and quality and enable industry to commercialize biomass feedstock supply chains. The caveat of this success is that conventional designs depend on high density, low-cost biomass with no disruption from incremental weather. In this respect, the success of conventional designs is tied to specific, highly productive regions such as the southeastern U.S., which has traditionally supported numerous pulp and paper industries, or the Midwest U.S. for corn stover.

  16. Concurrent design of quasi-random photonic nanostructures

    PubMed Central

    Lee, Won-Kyu; Yu, Shuangcheng; Engel, Clifford J.; Reese, Thaddeus; Rhee, Dongjoon; Chen, Wei

    2017-01-01

    Nanostructured surfaces with quasi-random geometries can manipulate light over broadband wavelengths and wide ranges of angles. Optimization and realization of stochastic patterns have typically relied on serial, direct-write fabrication methods combined with real-space design. However, this approach is not suitable for customizable features or scalable nanomanufacturing. Moreover, trial-and-error processing cannot guarantee fabrication feasibility because processing–structure relations are not included in conventional designs. Here, we report wrinkle lithography integrated with concurrent design to produce quasi-random nanostructures in amorphous silicon at wafer scales that achieved over 160% light absorption enhancement from 800 to 1,200 nm. The quasi-periodicity of patterns, materials filling ratio, and feature depths could be independently controlled. We statistically represented the quasi-random patterns by Fourier spectral density functions (SDFs) that could bridge the processing–structure and structure–performance relations. Iterative search of the optimal structure via the SDF representation enabled concurrent design of nanostructures and processing. PMID:28760975
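
    The statistical representation mentioned above, a Fourier spectral density function of the pattern, can be estimated generically by radially averaging the power spectrum of a binary image. The sketch below shows that estimate on a random test pattern; it is a standard SDF estimate, not the authors' wrinkle-lithography pipeline, and the filling ratio is an assumed value.

    ```python
    # Radially averaged spectral density function (SDF) of a 2D binary pattern.
    import numpy as np

    def radial_sdf(pattern):
        """Mean spectral power in each radial-frequency bin of a 2D pattern."""
        f = np.fft.fftshift(np.fft.fft2(pattern - pattern.mean()))
        power = np.abs(f) ** 2 / pattern.size
        ny, nx = pattern.shape
        y, x = np.indices((ny, nx))
        r = np.hypot(x - nx // 2, y - ny // 2).astype(int)   # radial bin per pixel
        sums = np.bincount(r.ravel(), weights=power.ravel())
        counts = np.bincount(r.ravel())
        return sums / np.maximum(counts, 1)

    rng = np.random.default_rng(1)
    pattern = (rng.random((256, 256)) < 0.4).astype(float)   # 40% filling ratio test pattern
    sdf = radial_sdf(pattern)
    print("first radial SDF bins:", np.round(sdf[:5], 4))
    ```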

  17. Design and Analysis of a Two-Stage Adsorption Air Chiller

    NASA Astrophysics Data System (ADS)

    Benrajesh, P.; Rajan, A. John

    2017-05-01

    The objective of this article is to design and build a bio-friendly air conditioner, using an adsorption method in the presence of 15% calcium carbide in water. Aluminum sheet metal is used to form three identical tunnels to pass the air for processing. Exhaust heat generated from the dairy sterilizing unit process is reutilized for cooling the environment through this equipment. The equipment is designed, and the analysis is carried out, to quantify the COP, SCP, and cooling power. Heat exchangers are designed; their performance parameters are quantified and correlated with those of conventional designs. It is observed that the new adsorption chiller can produce a chiller coefficient of performance of 1.068, a specific cooling power of 10.66 W/kg, and a cooling power of 4.2 kW. The equipment needs 0 to 15 minutes to reach the desired cool breeze temperature (24°C) from the existing room temperature (29°C).

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, P.L.

    As the Oct. 31 deadline for an initial design review approaches, the four participants in the Energy Research and Development Administration's (ERDA) industrial process hot water program are putting the final touches to plans for solar systems that will supplement conventional energy sources in the textile, food processing, concrete block and cleaning industries. Participating in the project are AAI Corp., Baltimore, which designed a solar hot water system for the concrete block curing operation of York Building Products Co., Harrisburg, Pa.; Acurex Corp., Mountain View, Calif., which designed a solar hot water system for a can washing line at the Campbell Soup Co. plant in Sacramento, Calif.; General Electric Co., Philadelphia, which designed a solar hot water system for Riegel Textile Corp., La France, S.C.; and Jacobs Engineering Co., Pasadena, Calif., which designed a solar hot water and steam system for commercial laundry use at American Linen Supply in El Centro, Calif. (MCW)

  19. Very high temperature fiber processing and testing through the use of ultrahigh solar energy concentration

    NASA Astrophysics Data System (ADS)

    Jacobson, Benjamin A.; Gleckman, Philip L.; Holman, Robert L.; Sagie, Daniel; Winston, Roland

    1991-10-01

    We have demonstrated the feasibility of a high temperature cool-wall optical furnace that harnesses the unique power of concentrated solar heating for advanced materials processing and testing. Our small-scale test furnace achieved temperatures as high as 2400 C within a 10 mm x 0.44 mm cylindrical hot zone. Optimum performance and efficiency resulted from an innovative two-stage optical design using a long-focal-length, point-focus, conventional primary concentrator and a non-imaging secondary concentrator specifically designed for the cylindrical geometry of the target fiber. A scale-up analysis suggests that even higher temperatures can be achieved over hot zones large enough for practical commercial fiber post-processing and testing.

  20. Integral bypass diodes in an amorphous silicon alloy photovoltaic module

    NASA Technical Reports Server (NTRS)

    Hanak, J. J.; Flaisher, H.

    1991-01-01

    Thin-film, tandem-junction, amorphous silicon (a-Si) photovoltaic modules were constructed in which a part of the a-Si alloy cell material is used to form bypass protection diodes. This integral design circumvents the need for incorporating external, conventional diodes, thus simplifying the manufacturing process and reducing module weight.

  1. Attitudes of Prospective Human Resource Personnel towards Distance Learning Degrees

    ERIC Educational Resources Information Center

    Udegbe, I. Bola

    2012-01-01

    This study investigated the attitudes of Prospective Human Resource Personnel toward degrees obtained by distance learning in comparison to those obtained through conventional degree program. Using a cross-sectional survey design, a total of 215 postgraduate students who had been or had potential to be involved in the hiring process in their…

  2. Interactive Learning Environment for Bio-Inspired Optimization Algorithms for UAV Path Planning

    ERIC Educational Resources Information Center

    Duan, Haibin; Li, Pei; Shi, Yuhui; Zhang, Xiangyin; Sun, Changhao

    2015-01-01

    This paper describes the development of BOLE, a MATLAB-based interactive learning environment, that facilitates the process of learning bio-inspired optimization algorithms, and that is dedicated exclusively to unmanned aerial vehicle path planning. As a complement to conventional teaching methods, BOLE is designed to help students consolidate the…

  3. Design of a Programmable Gain, Temperature Compensated Current-Input Current-Output CMOS Logarithmic Amplifier.

    PubMed

    Ming Gu; Chakrabartty, Shantanu

    2014-06-01

    This paper presents the design of a programmable gain, temperature compensated, current-mode CMOS logarithmic amplifier that can be used for biomedical signal processing. Unlike conventional logarithmic amplifiers that use a transimpedance technique to generate a voltage signal as a logarithmic function of the input current, the proposed approach directly produces a current output as a logarithmic function of the input current. Also, unlike a conventional transimpedance amplifier, the gain of the proposed logarithmic amplifier can be programmed using floating-gate trimming circuits. The synthesis of the proposed circuit is based on Hart's extended translinear principle, which involves embedding a floating-voltage source and a linear resistive element within a translinear loop. Temperature compensation is then achieved using a translinear-based resistive cancelation technique. Measured results from prototypes fabricated in a 0.5 μm CMOS process show that the amplifier has an input dynamic range of 120 dB and a temperature sensitivity of 230 ppm/°C (27 °C to 57 °C), while consuming less than 100 nW of power.

  4. Exploratory development of foams from liquid crystal polymers

    NASA Technical Reports Server (NTRS)

    Chung, T. S.

    1985-01-01

    Two types of liquid crystal polymer (LCP) compositions were studied and evaluated as structural foam materials. One is a copolymer of 6-hydroxy-2-naphthoic acid, terephthalic acid, and p-acetoxyacetanilide (designated HNA/TA/AAA), and the other is a copolymer of p-hydroxybenzoic acid and 6-hydroxy-2-naphthoic acid (designated HBA/HNA). Experimental results showed that the extruded HNA/TA/AAA foams have better mechanical quality and appearance than HBA/HNA foams. Heat treatment improved foam tensile strength and break elongation but reduced their modulus. The injection molding results indicated that nitrogen foaming agents with a low-pressure process gave better void distribution in the injection molded LCP foams than those made by a conventional injection-molding machine with chemical blowing agents. However, in comparing LCP foams with other conventional plastic foams, HBA/HNA foams have better mechanical properties than foamed ABS and PS, are comparable to PBT, and are inferior to polycarbonate foams, especially in heat-deflection temperature and impact resistance energy. These deficiencies are due to the LCP molecules not having been fully oriented during the Union Carbide low-pressure foaming process.

  5. Investigation of thermochemical biorefinery sizing and environmental sustainability impacts for conventional supply system and distributed preprocessing supply system designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muth, jr., David J.; Langholtz, Matthew H.; Tan, Eric

    2014-03-31

    The 2011 US Billion-Ton Update estimates that by 2030 there will be enough agricultural and forest resources to sustainably provide at least one billion dry tons of biomass annually, enough to displace approximately 30% of the country's current petroleum consumption. A portion of these resources are inaccessible at current cost targets with conventional feedstock supply systems because of their remoteness or low yields. Reliable analyses and projections of US biofuels production depend on assumptions about the supply system and biorefinery capacity, which, in turn, depend upon economic value, feedstock logistics, and sustainability. A cross-functional team has examined combinations of advances in feedstock supply systems and biorefinery capacities with rigorous design information, improved crop yield and agronomic practices, and improved estimates of sustainable biomass availability. A previous report on biochemical refinery capacity noted that under advanced feedstock logistic supply systems that include depots and pre-processing operations there are cost advantages that support larger biorefineries up to 10 000 DMT/day facilities compared to the smaller 2000 DMT/day facilities. This report focuses on analyzing conventional versus advanced depot biomass supply systems for a thermochemical conversion and refinery sizing based on woody biomass. The results of this analysis demonstrate that the economies of scale enabled by advanced logistics offsets much of the added logistics costs from additional depot processing and transportation, resulting in a small overall increase to the minimum ethanol selling price compared to the conventional logistic supply system. While the overall costs do increase slightly for the advanced logistic supply systems, the ability to mitigate moisture and ash in the system will improve the storage and conversion processes. In addition, being able to draw on feedstocks from further distances will decrease the risk of biomass supply to the conversion facility.

  6. Image gathering and processing - Information and fidelity

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Halyo, N.; Samms, R. W.; Stacy, K.

    1985-01-01

    In this paper we formulate and use information and fidelity criteria to assess image gathering and processing, combining optical design with image-forming and edge-detection algorithms. The optical design of the image-gathering system revolves around the relationship among sampling passband, spatial response, and signal-to-noise ratio (SNR). Our formulations of information, fidelity, and optimal (Wiener) restoration account for the insufficient sampling (i.e., aliasing) common in image gathering as well as for the blurring and noise that conventional formulations account for. Performance analyses and simulations for ordinary optical-design constraints and random scenes indicate that (1) different image-forming algorithms prefer different optical designs; (2) informationally optimized designs maximize the robustness of optimal image restorations and lead to the highest-spatial-frequency channel (relative to the sampling passband) for which edge detection is reliable (if the SNR is sufficiently high); and (3) combining the informationally optimized design with a 3 by 3 lateral-inhibitory image-plane-processing algorithm leads to a spatial-response shape that approximates the optimal edge-detection response of (Marr's model of) human vision and thus reduces the data preprocessing and transmission required for machine vision.
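
    The optimal (Wiener) restoration referenced above has a compact frequency-domain form, W = H* / (|H|^2 + N/S), where H is the blur transfer function and N/S the noise-to-signal ratio. The sketch below applies that classic filter to a synthetic checkerboard; it is a generic illustration and does not reproduce the paper's image-gathering model, which also accounts for aliasing.

    ```python
    # Classic frequency-domain Wiener restoration on a synthetic blurred, noisy image.
    import numpy as np

    def wiener_restore(degraded, psf, noise_to_signal=1e-2):
        """Restore an image given the blur PSF: W = conj(H) / (|H|^2 + N/S)."""
        H = np.fft.fft2(np.fft.ifftshift(psf), s=degraded.shape)
        W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
        return np.real(np.fft.ifft2(np.fft.fft2(degraded) * W))

    # Build a toy scene, blur it with a Gaussian PSF, add noise, then restore.
    n = 128
    yy, xx = np.mgrid[:n, :n]
    scene = ((xx // 16 + yy // 16) % 2).astype(float)            # checkerboard test scene
    psf = np.exp(-((xx - n // 2) ** 2 + (yy - n // 2) ** 2) / (2 * 2.0**2))
    psf /= psf.sum()

    rng = np.random.default_rng(2)
    blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(np.fft.ifftshift(psf))))
    degraded = blurred + 0.01 * rng.standard_normal(scene.shape)

    restored = wiener_restore(degraded, psf)
    print("RMS error, degraded:", float(np.sqrt(np.mean((degraded - scene) ** 2))))
    print("RMS error, restored:", float(np.sqrt(np.mean((restored - scene) ** 2))))
    ```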

  7. Information theoretical assessment of visual communication with subband coding

    NASA Astrophysics Data System (ADS)

    Rahman, Zia-ur; Fales, Carl L.; Huck, Friedrich O.

    1994-09-01

    A well-designed visual communication channel is one which transmits the most information about a radiance field with the fewest artifacts. The role of image processing, encoding and restoration is to improve the quality of visual communication channels by minimizing the error in the transmitted data. Conventionally this role has been analyzed strictly in the digital domain neglecting the effects of image-gathering and image-display devices on the quality of the image. This results in the design of a visual communication channel which is `suboptimal.' We propose an end-to-end assessment of the imaging process which incorporates the influences of these devices in the design of the encoder and the restoration process. This assessment combines Shannon's communication theory with Wiener's restoration filter and with the critical design factors of the image gathering and display devices, thus providing the metrics needed to quantify and optimize the end-to-end performance of the visual communication channel. Results show that the design of the image-gathering device plays a significant role in determining the quality of the visual communication channel and in designing the analysis filters for subband encoding.

  8. Virtual reality microscope versus conventional microscope regarding time to diagnosis: an experimental study.

    PubMed

    Randell, Rebecca; Ruddle, Roy A; Mello-Thoms, Claudia; Thomas, Rhys G; Quirke, Phil; Treanor, Darren

    2013-01-01

    To create and evaluate a virtual reality (VR) microscope that is as efficient as the conventional microscope, seeking to support the introduction of digital slides into routine practice. A VR microscope was designed and implemented by combining ultra-high-resolution displays with VR technology, techniques for fast interaction, and high usability. It was evaluated using a mixed factorial experimental design with technology and task as within-participant variables and grade of histopathologist as a between-participant variable. Time to diagnosis was similar for the conventional and VR microscopes. However, there was a significant difference in the mean magnification used between the two technologies, with participants working at a higher level of magnification on the VR microscope. The results suggest that, with the right technology, efficient use of digital pathology for routine practice is a realistic possibility. Further work is required to explore what magnification is required on the VR microscope for histopathologists to identify diagnostic features, and the effect on this of the digital slide production process. © 2012 Blackwell Publishing Limited.

  9. Reversing the conventional leather processing sequence for cleaner leather production.

    PubMed

    Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2006-02-01

    Conventional leather processing generally involves a combination of single and multistep processes that employs as well as expels various biological, inorganic, and organic materials. It involves nearly 14-15 steps and discharges a huge amount of pollutants. This is primarily due to the fact that conventional leather processing employs a "do-undo" process logic. In this study, the conventional leather processing steps have been reversed to overcome the problems associated with the conventional method. The charges of the skin matrix and of the chemicals and pH profiles of the process have been judiciously used for reversing the process steps. This reversed process eventually avoids several acidification and basification/neutralization steps used in conventional leather processing. The developed process has been validated through various analyses such as chromium content, shrinkage temperature, softness measurements, scanning electron microscopy, and physical testing of the leathers. Further, the performance of the leathers is shown to be on par with conventionally processed leathers through bulk property evaluation. The process enjoys a significant reduction in COD and TS by 53 and 79%, respectively. Water consumption and discharge is reduced by 65 and 64%, respectively. Also, the process benefits from significant reduction in chemicals, time, power, and cost compared to the conventional process.

  10. Computer-Aided Diagnosis with Deep Learning Architecture: Applications to Breast Lesions in US Images and Pulmonary Nodules in CT Scans

    NASA Astrophysics Data System (ADS)

    Cheng, Jie-Zhi; Ni, Dong; Chou, Yi-Hong; Qin, Jing; Tiu, Chui-Mei; Chang, Yeun-Chung; Huang, Chiun-Sheng; Shen, Dinggang; Chen, Chung-Ming

    2016-04-01

    This paper performs a comprehensive study on the deep-learning-based computer-aided diagnosis (CADx) for the differential diagnosis of benign and malignant nodules/lesions by avoiding the potential errors caused by inaccurate image processing results (e.g., boundary segmentation), as well as the classification bias resulting from a less robust feature set, as involved in most conventional CADx algorithms. Specifically, the stacked denoising auto-encoder (SDAE) is exploited on the two CADx applications for the differentiation of breast ultrasound lesions and lung CT nodules. The SDAE architecture is well equipped with the automatic feature exploration mechanism and noise tolerance advantage, and hence may be suitable to deal with the intrinsically noisy property of medical image data from various imaging modalities. To show the outperformance of SDAE-based CADx over the conventional scheme, two latest conventional CADx algorithms are implemented for comparison. 10 times of 10-fold cross-validations are conducted to illustrate the efficacy of the SDAE-based CADx algorithm. The experimental results show the significant performance boost by the SDAE-based CADx algorithm over the two conventional methods, suggesting that deep learning techniques can potentially change the design paradigm of the CADx systems without the need of explicit design and selection of problem-oriented features.

  11. Computer-Aided Diagnosis with Deep Learning Architecture: Applications to Breast Lesions in US Images and Pulmonary Nodules in CT Scans.

    PubMed

    Cheng, Jie-Zhi; Ni, Dong; Chou, Yi-Hong; Qin, Jing; Tiu, Chui-Mei; Chang, Yeun-Chung; Huang, Chiun-Sheng; Shen, Dinggang; Chen, Chung-Ming

    2016-04-15

    This paper performs a comprehensive study on the deep-learning-based computer-aided diagnosis (CADx) for the differential diagnosis of benign and malignant nodules/lesions by avoiding the potential errors caused by inaccurate image processing results (e.g., boundary segmentation), as well as the classification bias resulting from a less robust feature set, as involved in most conventional CADx algorithms. Specifically, the stacked denoising auto-encoder (SDAE) is exploited on the two CADx applications for the differentiation of breast ultrasound lesions and lung CT nodules. The SDAE architecture is well equipped with the automatic feature exploration mechanism and noise tolerance advantage, and hence may be suitable to deal with the intrinsically noisy property of medical image data from various imaging modalities. To show the outperformance of SDAE-based CADx over the conventional scheme, two latest conventional CADx algorithms are implemented for comparison. 10 times of 10-fold cross-validations are conducted to illustrate the efficacy of the SDAE-based CADx algorithm. The experimental results show the significant performance boost by the SDAE-based CADx algorithm over the two conventional methods, suggesting that deep learning techniques can potentially change the design paradigm of the CADx systems without the need of explicit design and selection of problem-oriented features.

  12. Computer-Aided Diagnosis with Deep Learning Architecture: Applications to Breast Lesions in US Images and Pulmonary Nodules in CT Scans

    PubMed Central

    Cheng, Jie-Zhi; Ni, Dong; Chou, Yi-Hong; Qin, Jing; Tiu, Chui-Mei; Chang, Yeun-Chung; Huang, Chiun-Sheng; Shen, Dinggang; Chen, Chung-Ming

    2016-01-01

    This paper performs a comprehensive study on the deep-learning-based computer-aided diagnosis (CADx) for the differential diagnosis of benign and malignant nodules/lesions by avoiding the potential errors caused by inaccurate image processing results (e.g., boundary segmentation), as well as the classification bias resulting from a less robust feature set, as involved in most conventional CADx algorithms. Specifically, the stacked denoising auto-encoder (SDAE) is exploited on the two CADx applications for the differentiation of breast ultrasound lesions and lung CT nodules. The SDAE architecture is well equipped with the automatic feature exploration mechanism and noise tolerance advantage, and hence may be suitable to deal with the intrinsically noisy property of medical image data from various imaging modalities. To show the outperformance of SDAE-based CADx over the conventional scheme, two latest conventional CADx algorithms are implemented for comparison. 10 times of 10-fold cross-validations are conducted to illustrate the efficacy of the SDAE-based CADx algorithm. The experimental results show the significant performance boost by the SDAE-based CADx algorithm over the two conventional methods, suggesting that deep learning techniques can potentially change the design paradigm of the CADx systems without the need of explicit design and selection of problem-oriented features. PMID:27079888
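
    The three records above center on a stacked denoising auto-encoder (SDAE). The PyTorch sketch below shows a single denoising auto-encoder layer of the kind that gets stacked; the data is random noise, and the layer sizes, corruption level, and training settings are illustrative assumptions, not the paper's configuration.

    ```python
    # One denoising auto-encoder layer: corrupt the input, reconstruct the clean version.
    import torch
    from torch import nn

    class DenoisingAutoencoder(nn.Module):
        def __init__(self, n_in=256, n_hidden=64):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Sigmoid())
            self.decoder = nn.Sequential(nn.Linear(n_hidden, n_in), nn.Sigmoid())

        def forward(self, x, noise_std=0.2):
            corrupted = x + noise_std * torch.randn_like(x)   # corrupt the input...
            return self.decoder(self.encoder(corrupted))      # ...and reconstruct it

    torch.manual_seed(0)
    x = torch.rand(512, 256)        # stand-in for image patches / ROI intensities
    model = DenoisingAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for epoch in range(20):
        optimizer.zero_grad()
        loss = loss_fn(model(x), x)  # reconstruction target is the clean input
        loss.backward()
        optimizer.step()

    # After unsupervised pre-training, the encoder output would feed the next stacked
    # layer and ultimately a benign/malignant classifier.
    features = model.encoder(x).detach()
    print("learned feature shape:", tuple(features.shape))
    ```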

  13. Knowledge Dissemination of Intimate Partner Violence Intervention Studies Measured Using Alternative Metrics: Results From a Scoping Review.

    PubMed

    Madden, Kim; Evaniew, Nathan; Scott, Taryn; Domazetoska, Elena; Dosanjh, Pritnek; Li, Chuan Silvia; Thabane, Lehana; Bhandari, Mohit; Sprague, Sheila

    2016-07-01

    Alternative metrics measure the number of online mentions that an academic paper receives, including mentions in social media and online news outlets. It is important to monitor and measure dispersion of intimate partner violence (IPV) victim intervention research so that we can improve our knowledge translation and exchange (KTE) processes, improving utilization of study findings. The objective of this study is to describe the dissemination of published IPV victim intervention studies and to explore which study characteristics are associated with a greater number of alternative metric mentions and conventional citations. As part of a larger scoping review, we conducted a literature search to identify IPV intervention studies. Outcomes included the number of alternative metric mentions and conventional citations. Fifty-nine studies were included in this study. The median number of alternative metric mentions was six, and the median number of conventional citations was two. Forty-one percent of the studies (24/59) had no alternative metric mentions, and 27% (16/59) had no conventional citations. Longer time since publication was significantly associated with a greater number of mentions and citations, as were systematic reviews and randomized controlled trial designs. The majority of IPV studies receive little to no online attention or citations in academic journals, indicating a need for the field to focus on implementing strong knowledge dissemination plans. The papers receiving the most alternative metric mentions and conventional citations were also the more rigorous study designs, indicating a need to focus on study quality. We recommend using alternative metrics in conjunction with conventional metrics to evaluate the full dissemination of IPV research.

  14. 40 CFR 408.20 - Applicability; description of the conventional blue crab processing subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... conventional blue crab processing subcategory. 408.20 Section 408.20 Protection of Environment ENVIRONMENTAL... SOURCE CATEGORY Conventional Blue Crab Processing Subcategory § 408.20 Applicability; description of the conventional blue crab processing subcategory. The provisions of this subpart are applicable to discharges...

  15. 40 CFR 408.20 - Applicability; description of the conventional blue crab processing subcategory.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... conventional blue crab processing subcategory. 408.20 Section 408.20 Protection of Environment ENVIRONMENTAL... SOURCE CATEGORY Conventional Blue Crab Processing Subcategory § 408.20 Applicability; description of the conventional blue crab processing subcategory. The provisions of this subpart are applicable to discharges...

  16. 40 CFR 408.20 - Applicability; description of the conventional blue crab processing subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... conventional blue crab processing subcategory. 408.20 Section 408.20 Protection of Environment ENVIRONMENTAL... SOURCE CATEGORY Conventional Blue Crab Processing Subcategory § 408.20 Applicability; description of the conventional blue crab processing subcategory. The provisions of this subpart are applicable to discharges...

  17. 40 CFR 408.20 - Applicability; description of the conventional blue crab processing subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... conventional blue crab processing subcategory. 408.20 Section 408.20 Protection of Environment ENVIRONMENTAL... SOURCE CATEGORY Conventional Blue Crab Processing Subcategory § 408.20 Applicability; description of the conventional blue crab processing subcategory. The provisions of this subpart are applicable to discharges...

  18. 40 CFR 408.20 - Applicability; description of the conventional blue crab processing subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... conventional blue crab processing subcategory. 408.20 Section 408.20 Protection of Environment ENVIRONMENTAL... SOURCE CATEGORY Conventional Blue Crab Processing Subcategory § 408.20 Applicability; description of the conventional blue crab processing subcategory. The provisions of this subpart are applicable to discharges...

  19. Hybrid rocket propulsion systems for outer planet exploration missions

    NASA Astrophysics Data System (ADS)

    Jens, Elizabeth T.; Cantwell, Brian J.; Hubbard, G. Scott

    2016-11-01

    Outer planet exploration missions require significant propulsive capability, particularly to achieve orbit insertion. Missions to explore the moons of outer planets place even more demanding requirements on propulsion systems, since they involve multiple large ΔV maneuvers. Hybrid rockets present a favorable alternative to conventional propulsion systems for many of these missions. They typically enjoy higher specific impulse than solids, can be throttled, stopped/restarted, and have more flexibility in their packaging configuration. Hybrids are more compact and easier to throttle than liquids and have similar performance levels. In order to investigate the suitability of these propulsion systems for exploration missions, this paper presents novel hybrid motor designs for two interplanetary missions. Hybrid propulsion systems for missions to Europa and Uranus are presented and compared to conventional in-space propulsion systems. The hybrid motor design for each of these missions is optimized across a range of parameters, including propellant selection, O/F ratio, nozzle area ratio, and chamber pressure. Details of the design process are described in order to provide guidance for researchers wishing to evaluate hybrid rocket motor designs for other missions and applications.

  20. User-centered design in clinical handover: exploring post-implementation outcomes for clinicians.

    PubMed

    Wong, Ming Chao; Cummings, Elizabeth; Turner, Paul

    2013-01-01

    This paper examines the outcomes for clinicians from their involvement in the development of an electronic clinical hand-over tool developed using principles of user-centered design. Conventional e-health post-implementation evaluations tend to emphasize technology-related (mostly positive) outcomes. More recently, unintended (mostly negative) consequences arising from the implementation of e-health technologies have also been reported. There remains limited focus on the post-implementation outcomes for users, particularly those directly involved in e-health design processes. This paper presents detailed analysis and insights into the outcomes experienced post-implementation by a cohort of junior clinicians involved in developing an electronic clinical handover tool in Tasmania, Australia. The qualitative methods used included observations, semi-structured interviews and analysis of clinical handover notes. Significantly, a number of unanticipated flow-on effects were identified that mitigated some of the challenges arising during the design and implementation of the tool. The paper concludes by highlighting the importance of identifying post-implementation user outcomes beyond conventional system adoption and use and also points to the need for more comprehensive evaluative frameworks to encapsulate these broader socio-technical user outcomes.

  1. Optimization of orodispersible and conventional tablets using simplex lattice design: Relationship among excipients and banana extract.

    PubMed

    Duangjit, Sureewan; Kraisit, Pakorn

    2018-08-01

    The objective of this study was focused on the optimization of the pharmaceutical excipients and banana extract in the preparation of orally disintegrating banana extract tablets (OD-BET) and conventional banana extract tablets (CO-BET) using a simplex lattice design. Various ratios of banana extract (BE), dibasic calcium phosphate (DCP) and microcrystalline cellulose (MCC) were used to prepare banana extract tablets (BET). The results indicated that the optimal OD-BET and CO-BET consisted of BE:DCP:MCC at 10.0:88.8:1.2 and 10.0:83.8:6.2, respectively. AFM demonstrated that the surface of BET with BE + MCC was smooth and compacted when compared to BET with BE + DCP + MCC and BE + DCP. FTIR and XRD showed a correlation in the results and indicated that no interaction of the ingredients occurred in the process of BET formulation. Therefore, the experimental design is potentially useful in formulating OD-BET and CO-BET using only one design simultaneously. Copyright © 2018 Elsevier Ltd. All rights reserved.
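
    A simplex lattice design, as used above, is simply the set of mixture points whose component proportions are multiples of 1/m and sum to one. The sketch below enumerates such a design for a three-component mixture; the components are placeholders, and the optimal ratios reported in the record would come from fitting measured responses over points like these.

    ```python
    # Enumerate a {q, m} simplex lattice design for a q-component mixture.
    from itertools import product
    from fractions import Fraction

    def simplex_lattice(q=3, m=2):
        """All q-component mixtures whose proportions are multiples of 1/m and sum to 1."""
        points = []
        for combo in product(range(m + 1), repeat=q):
            if sum(combo) == m:
                points.append(tuple(Fraction(c, m) for c in combo))
        return points

    # {3,2} lattice -> 6 runs: three pure components plus three 50/50 binary blends.
    for point in simplex_lattice(q=3, m=2):
        print(tuple(float(p) for p in point))
    ```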

  2. A new RF window designed for high-power operation in an S-band LINAC RF system

    NASA Astrophysics Data System (ADS)

    Joo, Youngdo; Kim, Seung-Hwan; Hwang, Woonha; Ryu, Jiwan; Roh, Sungjoo

    2016-09-01

    A new RF window is designed for high-power operation at the Pohang Light Source-II (PLSII) S-band linear accelerator (LINAC) RF system. In order to reduce the strength of the electric field component perpendicular to the ceramic disk, which is commonly known as the main cause of most discharge breakdowns in the ceramic disk, we replace the pill-box type cavity in the conventional RF window with an overmoded cavity. The overmoded cavity is coupled with the input and output waveguides through dual side-wall coupling irises to reduce the electric field strength at the iris and the number of possible mode competitions. The finite-difference time-domain (FDTD) simulation, CST MWS, was used in the design process. The simulated maximum electric field component perpendicular to the ceramic for the new RF window is reduced by an order of magnitude compared with that for the conventional RF window, which holds promise for stable high-power operation.

  3. Extending the Capture Volume of an Iris Recognition System Using Wavefront Coding and Super-Resolution.

    PubMed

    Hsieh, Sheng-Hsun; Li, Yung-Hui; Tien, Chung-Hao; Chang, Chin-Chen

    2016-12-01

    Iris recognition has gained increasing popularity over the last few decades; however, the stand-off distance in a conventional iris recognition system is too short, which limits its application. In this paper, we propose a novel hardware-software hybrid method to increase the stand-off distance in an iris recognition system. When designing the system hardware, we use an optimized wavefront coding technique to extend the depth of field. To compensate for the blurring of the image caused by wavefront coding, on the software side, the proposed system uses a local patch-based super-resolution method to restore the blurred image to its clear version. The collaborative effect of the new hardware design and software post-processing showed great potential in our experiment. The experimental results showed that such improvement cannot be achieved by using a hardware-only or software-only design. The proposed system can increase the capture volume of a conventional iris recognition system by three times and maintain the system's high recognition rate.

  4. Novel process windows for enabling, accelerating, and uplifting flow chemistry.

    PubMed

    Hessel, Volker; Kralisch, Dana; Kockmann, Norbert; Noël, Timothy; Wang, Qi

    2013-05-01

    Novel Process Windows make use of process conditions that are far from conventional practices. This involves the use of high temperatures, high pressures, high concentrations (solvent-free), new chemical transformations, explosive conditions, and process simplification and integration to boost synthetic chemistry on both the laboratory and production scale. Such harsh reaction conditions can be safely reached in microstructured reactors due to their excellent transport intensification properties. This Review discusses the different routes towards Novel Process Windows and provides several examples for each route grouped into different classes of chemical and process-design intensification. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Hynol: An economic process for methanol production from biomass and natural gas with reduced CO2 emission

    NASA Astrophysics Data System (ADS)

    Steinberg, M.; Dong, Yuanji

    1993-10-01

    The Hynol process is proposed to meet the demand for an economical process for methanol production with reduced CO2 emission. This new process consists of three reaction steps: (1) hydrogasification of biomass, (2) steam reforming of the produced gas with additional natural gas feedstock, and (3) methanol synthesis of the hydrogen and carbon monoxide produced during the previous two steps. The H2-rich gas remaining after methanol synthesis is recycled to gasify the biomass in an energy neutral reactor so that there is no need for an expensive oxygen plant as required by commercial steam gasifiers. Recycling gas allows the methanol synthesis reactor to perform at a relatively lower pressure than conventional while the plant still maintains high methanol yield. Energy recovery designed into the process minimizes heat loss and increases the process thermal efficiency. If the Hynol methanol is used as an alternative and more efficient automotive fuel, an overall 41% reduction in CO2 emission can be achieved compared to the use of conventional gasoline fuel. A preliminary economic estimate shows that the total capital investment for a Hynol plant is 40% lower than that for a conventional biomass gasification plant. The methanol production cost is $0.43/gal for a 1085 million gal/yr Hynol plant which is competitive with current U.S. methanol and equivalent gasoline prices. Process flowsheet and simulation data using biomass and natural gas as cofeedstocks are presented. The Hynol process can convert any condensed carbonaceous material, especially municipal solid waste (MSW), to produce methanol.

  6. Further Development and Assessment of a Broadband Liner Optimization Process

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.

    2016-01-01

    The utilization of advanced fan designs (including higher bypass ratios) and shorter engine nacelles has highlighted a need for increased fan noise reduction over a broader frequency range. Thus, improved broadband liner designs must account for these constraints and, where applicable, take advantage of advanced manufacturing techniques that have opened new possibilities for novel configurations. This work focuses on the use of an established broadband acoustic liner optimization process to design a variable-depth, multi-degree of freedom liner for a high speed fan. Specifically, in-duct attenuation predictions with a statistical source model are used to obtain optimum impedance spectra over the conditions of interest. The predicted optimum impedance information is then used with acoustic liner modeling tools to design a liner aimed at producing impedance spectra that most closely match the predicted optimum values. The multi-degree of freedom design is carried through design, fabrication, and testing. In-duct attenuation predictions compare well with measured data and the multi-degree of freedom liner is shown to outperform a more conventional liner over a range of flow conditions. These promising results provide further confidence in the design tool, as well as the enhancements made to the overall design process.

  7. Nonlinear inversion of electrical resistivity imaging using pruning Bayesian neural networks

    NASA Astrophysics Data System (ADS)

    Jiang, Fei-Bo; Dai, Qian-Wei; Dong, Li

    2016-06-01

    Conventional artificial neural networks used to solve the electrical resistivity imaging (ERI) inversion problem suffer from overfitting and local minima. To solve these problems, we propose to use a pruning Bayesian neural network (PBNN) nonlinear inversion method and a sample design method based on the K-medoids clustering algorithm. In the sample design method, the training samples of the neural network are designed according to the prior information provided by the K-medoids clustering results; thus, the training process of the neural network is well guided. The proposed PBNN, based on Bayesian regularization, is used to select the hidden layer structure by assessing the effect of each hidden neuron on the inversion results. Then, the hyperparameter α_k, which is based on the generalized mean, is chosen to guide the pruning process according to the prior distribution of the training samples under the small-sample condition. The proposed algorithm is more efficient than other common adaptive regularization methods in geophysics. The inversion of synthetic data and field data suggests that the proposed method suppresses the noise in the neural network training stage and enhances the generalization. The inversion results with the proposed method are better than those of the BPNN, RBFNN, and RRBFNN inversion methods as well as the conventional least squares inversion.
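    The sample-design idea, choosing a small set of representative training models by K-medoids clustering, can be sketched as follows. This is a generic PAM-style implementation written for illustration, not the authors' code, and the candidate pool is a random stand-in for candidate resistivity models.

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist

    def k_medoids(pool, k, n_iter=50, seed=0):
        """Simple PAM-style K-medoids: return indices of k representative rows of pool."""
        rng = np.random.default_rng(seed)
        d = cdist(pool, pool)                                 # pairwise distances
        medoids = rng.choice(len(pool), size=k, replace=False)
        for _ in range(n_iter):
            labels = np.argmin(d[:, medoids], axis=1)         # assign to nearest medoid
            new_medoids = medoids.copy()
            for c in range(k):
                members = np.where(labels == c)[0]
                if len(members):
                    within = d[np.ix_(members, members)].sum(axis=1)
                    new_medoids[c] = members[np.argmin(within)]   # most central member
            if np.array_equal(new_medoids, medoids):
                break
            medoids = new_medoids
        return medoids

    # Hypothetical usage: pick 50 representative training models out of 2000 candidates
    pool = np.random.rand(2000, 16)          # stand-in for candidate resistivity models
    training_set = pool[k_medoids(pool, k=50)]
    ```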

  8. Unstructured Grids for Sonic Boom Analysis and Design

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Nayani, Sudheer N.

    2015-01-01

    An evaluation of two methods for improving the process for generating unstructured CFD grids for sonic boom analysis and design has been conducted. The process involves two steps: the generation of an inner core grid using a conventional unstructured grid generator such as VGRID, followed by the extrusion of a sheared and stretched collar grid through the outer boundary of the core grid. The first method evaluated, known as COB, automatically creates a cylindrical outer boundary definition for use in VGRID that makes the extrusion process more robust. The second method, BG, generates the collar grid by extrusion in a very efficient manner. Parametric studies have been carried out and new options evaluated for each of these codes with the goal of establishing guidelines for best practices for maintaining boom signature accuracy with as small a grid as possible. In addition, a preliminary investigation examining the use of the CDISC design method for reducing sonic boom utilizing these grids was conducted, with initial results confirming the feasibility of a new remote design approach.

  9. Bioreactors for high cell density and continuous multi-stage cultivations: options for process intensification in cell culture-based viral vaccine production.

    PubMed

    Tapia, Felipe; Vázquez-Ramírez, Daniel; Genzel, Yvonne; Reichl, Udo

    2016-03-01

    With an increasing demand for efficacious, safe, and affordable vaccines for human and animal use, process intensification in cell culture-based viral vaccine production demands advanced process strategies to overcome the limitations of conventional batch cultivations. However, the use of fed-batch, perfusion, or continuous modes to drive processes at high cell density (HCD) and overextended operating times has so far been little explored in large-scale viral vaccine manufacturing. Also, possible reductions in cell-specific virus yields for HCD cultivations have been reported frequently. Taking into account that vaccine production is one of the most heavily regulated industries in the pharmaceutical sector with tough margins to meet, it is understandable that process intensification is being considered by both academia and industry as a next step toward more efficient viral vaccine production processes only recently. Compared to conventional batch processes, fed-batch and perfusion strategies could result in ten to a hundred times higher product yields. Both cultivation strategies can be implemented to achieve cell concentrations exceeding 10(7) cells/mL or even 10(8) cells/mL, while keeping low levels of metabolites that potentially inhibit cell growth and virus replication. The trend towards HCD processes is supported by development of GMP-compliant cultivation platforms, i.e., acoustic settlers, hollow fiber bioreactors, and hollow fiber-based perfusion systems including tangential flow filtration (TFF) or alternating tangential flow (ATF) technologies. In this review, these process modes are discussed in detail and compared with conventional batch processes based on productivity indicators such as space-time yield, cell concentration, and product titers. In addition, options for the production of viral vaccines in continuous multi-stage bioreactors such as two- and three-stage systems are addressed. While such systems have shown similar virus titers compared to batch cultivations, keeping high yields for extended production times is still a challenge. Overall, we demonstrate that process intensification of cell culture-based viral vaccine production can be realized by the consequent application of fed-batch, perfusion, and continuous systems with a significant increase in productivity. The potential for even further improvements is high, considering recent developments in establishment of new (designer) cell lines, better characterization of host cell metabolism, advances in media design, and the use of mathematical models as a tool for process optimization and control.

  10. Fabrication development for ODS-superalloy, air-cooled turbine blades

    NASA Technical Reports Server (NTRS)

    Moracz, D. J.

    1984-01-01

    MA-600 is a gamma prime and oxide dispersion strengthened superalloy made by mechanical alloying. At the initiation of this program, MA-6000 was available as an experimental alloy only and did not go into production until late in the program. The objective of this program was to develop a thermal-mechanical-processing approach which would yield the necessary elongated grain structure and desirable mechanical properties after conventional press forging. Forging evaluations were performed to select optimum thermal-mechanical-processing conditions. These forging evaluations indicated that MA-6000 was extremely sensitive to die chilling. In order to conventionally hot forge the alloy, an adherent cladding, either the original extrusion can or a thick plating, was required to prevent cracking of the workpiece. Die design must reflect the requirement of cladding. MA-6000 was found to be sensitive to the forging temperature. The correct temperature required to obtain the proper grain structure after recrystallization was found to be between 1010-1065 C (1850-1950 F). The deformation level did not affect subsequent crystallization; however, sharp transition areas in tooling designs should be avoided in forming a blade shape because of the potential for grain structure discontinuities. Starting material to be used for forging should be processed so that it is capable of being zone annealed to a coarse elongated grain structure as bar stock. This conclusion means that standard processed bar materials can be used.

  11. How to Study Chronic Diseases-Implications of the Convention on the Rights of Persons with Disabilities for Research Designs.

    PubMed

    von Peter, Sebastian; Bieler, Patrick

    2017-01-01

    The Convention on the Rights of Persons with Disabilities (CRPD) has received considerable attention internationally. The Convention's main arguments are conceptually analyzed. Implications for the development of research designs are elaborated upon. The Convention entails both a human rights and a sociopolitical dimension. Advancing a relational notion of disability, it enters terrain that is rather foreign to the medical sciences. Research designs have to be changed accordingly. Research designs in accordance with the CRPD should employ and further develop context-sensitive research strategies and interdisciplinary collaboration. Complex designs that allow for a relational analysis of personalized effects have to be established and evaluated, thereby systematically integrating qualitative methods.

  12. Numerical investigation of effects on blanks for press hardening process during longitudinal flux heating

    NASA Astrophysics Data System (ADS)

    Dietrich, André; Nacke, Bernard

    2018-05-01

    With the induction heating technology, it is possible to heat up blanks for the press hardening process in 20 s or less. Furthermore, the dimension of an induction system is small and easy to control in comparison to conventional heating systems. To bring the induction heating technology to warm forming industry it is necessary to analyze the process under the view of induction. This paper investigates the edge- and end-effects of a batch heated blank. The results facilitate the later design of induction heating systems for the batch process.

  13. Comparison of retention between maxillary milled and conventional denture bases: A clinical study.

    PubMed

    AlHelal, Abdulaziz; AlRumaih, Hamad S; Kattadiyil, Mathew T; Baba, Nadim Z; Goodacre, Charles J

    2017-02-01

    Clinical studies comparing the retention values of milled denture bases with those of conventionally processed denture bases are lacking. The purpose of this clinical study was to compare the retention values of conventional heat-polymerized denture bases with those of digitally milled maxillary denture bases. Twenty individuals with completely edentulous maxillary arches participated in this study. Definitive polyvinyl siloxane impressions were scanned (iSeries; Dental Wings), and the standard tessellation language files were sent to Global Dental Science for the fabrication of a computer-aided design and computer-aided manufacturing (CAD-CAM) milled denture base (group MB) (AvaDent). The impression was then poured to obtain a definitive cast that was used to fabricate a heat-polymerized acrylic resin denture base resin (group HB). A custom-designed testing device was used to measure denture retention (N). Each denture base was subjected to a vertical pulling force by using an advanced digital force gauge 3 times at 10-minute intervals. The average retention of the 2 fabrication methods was compared using repeated ANOVA (α=.05). Significantly increased retention was observed for the milled denture bases compared with that of the conventional heat-polymerized denture bases (P<.001). The retention offered by milled complete denture bases from prepolymerized poly(methyl methacrylate) resin was significantly higher than that offered by conventional heat- polymerized denture bases. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
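    As an illustration of the kind of paired comparison reported (the study used repeated ANOVA at α=.05), a simplified stand-in analysis with invented per-subject retention values might look like the following; the numbers are not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical per-subject mean retention values (N) for 20 edentulous participants;
    # invented for illustration only.
    rng = np.random.default_rng(1)
    milled = rng.normal(25.0, 4.0, size=20)            # group MB: CAD-CAM milled bases
    conventional = milled - rng.normal(6.0, 2.0, 20)   # group HB: heat-polymerized bases

    t, p = stats.ttest_rel(milled, conventional)       # paired comparison, alpha = .05
    print(f"mean difference = {np.mean(milled - conventional):.2f} N, t = {t:.2f}, p = {p:.4g}")
    ```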

  14. Design of Efficient Mirror Adder in Quantum- Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Mishra, Prashant Kumar; Chattopadhyay, Manju K.

    2018-03-01

    Lower power consumption is an essential demand for portable multimedia systems using digital signal processing algorithms and architectures. Quantum-dot cellular automata (QCA) is a rising nanotechnology for the development of high-performance, ultra-dense, low-power digital circuits. Several efficient QCA-based binary and decimal arithmetic circuits have been implemented; however, important improvements are still possible. This paper demonstrates a mirror adder circuit design in QCA. We present a comparative study of mirror adder cells designed using the conventional CMOS technique and mirror adder cells designed using quantum-dot cellular automata. The QCA-based mirror adders are better in terms of area by a factor of three.
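    For readers unfamiliar with the adder cell itself, the Boolean function that a mirror adder realizes can be checked in a few lines; this illustrates only the one-bit logic (sum and carry), not the QCA cell layout or the CMOS mirror topology.

    ```python
    from itertools import product

    def full_adder(a, b, cin):
        """One-bit full adder: sum = a xor b xor cin, carry = majority(a, b, cin)."""
        s = a ^ b ^ cin
        cout = (a & b) | (cin & (a ^ b))
        return s, cout

    # Exhaustive truth-table check against integer addition
    for a, b, cin in product((0, 1), repeat=3):
        s, cout = full_adder(a, b, cin)
        assert 2 * cout + s == a + b + cin
    print("full adder verified over all 8 input combinations")
    ```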

  15. Failure modes and effects analysis automation

    NASA Technical Reports Server (NTRS)

    Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron

    1988-01-01

    A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man years the time required to perform the analysis.

  16. Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)

    2000-01-01

    HiMAP is a three level parallel middleware that can be interfaced to a large scale global design environment for code independent, multidisciplinary analysis using high fidelity equations. Aerospace technology needs are rapidly changing. Computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computation tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).

  17. Nanophotonic particle simulation and inverse design using artificial neural networks.

    PubMed

    Peurifoy, John; Shen, Yichen; Jing, Li; Yang, Yi; Cano-Renteria, Fidel; DeLacy, Brendan G; Joannopoulos, John D; Tegmark, Max; Soljačić, Marin

    2018-06-01

    We propose a method to use artificial neural networks to approximate light scattering by multilayer nanoparticles. We find that the network needs to be trained on only a small sampling of the data to approximate the simulation to high precision. Once the neural network is trained, it can simulate such optical processes orders of magnitude faster than conventional simulations. Furthermore, the trained neural network can be used to solve nanophotonic inverse design problems by using back propagation, where the gradient is analytical, not numerical.
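    A hedged sketch of the two-stage idea, surrogate training followed by gradient-based inverse design on the input, is given below using a toy one-hidden-layer network in NumPy with manual backpropagation. The toy_simulation function, network size, learning rates, and target value are invented stand-ins for the paper's multilayer-nanoparticle scattering calculations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for a scattering simulation: maps 3 "layer thicknesses" to a scalar response.
    def toy_simulation(x):
        return np.sin(3.0 * x[:, :1]) + 0.5 * np.cos(2.0 * x[:, 1:2]) * x[:, 2:3]

    # Small one-hidden-layer surrogate with tanh units, trained by plain gradient descent.
    X = rng.uniform(-1, 1, size=(2000, 3))
    Y = toy_simulation(X)
    W1 = rng.normal(0, 0.5, (3, 32)); b1 = np.zeros(32)
    W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

    def forward(x, w1, b1_, w2, b2_):
        h = np.tanh(x @ w1 + b1_)
        return h, h @ w2 + b2_

    lr = 0.05
    for _ in range(3000):
        h, pred = forward(X, W1, b1, W2, b2)
        err = pred - Y                          # dL/dpred for mean 0.5*(pred - Y)^2
        gW2 = h.T @ err / len(X); gb2 = err.mean(0)
        dh = err @ W2.T * (1 - h ** 2)          # backprop through tanh
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    # Inverse design: gradient descent on the *input* using the analytical gradient
    # of the trained surrogate, seeking a design whose response hits `target`.
    target = 0.8
    x = rng.uniform(-1, 1, size=(1, 3))
    for _ in range(500):
        h, pred = forward(x, W1, b1, W2, b2)
        dpred_dx = ((1 - h ** 2) * W2.T) @ W1.T     # analytical d(pred)/dx, shape (1, 3)
        x -= 0.1 * 2 * (pred - target) * dpred_dx   # descend (pred - target)^2
    print("designed input:", x, "surrogate response:", forward(x, W1, b1, W2, b2)[1])
    ```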

  18. New features and applications of PRESTO, a computer code for the performance of regenerative, superheated steam turbine cycles

    NASA Technical Reports Server (NTRS)

    Choo, Y. K.; Staiger, P. J.

    1982-01-01

    The code was designed to analyze performance at valves-wide-open design flow. The code can model conventional steam cycles as well as cycles that include such special features as process steam extraction and induction and feedwater heating by external heat sources. Convenience features and extensions to the special features were incorporated into the PRESTO code. The features are described, and detailed examples illustrating the use of both the original and the special features are given.

  19. Production of hydrogen by electron transfer catalysis using conventional and photochemical means

    NASA Technical Reports Server (NTRS)

    Rillema, D. P.

    1981-01-01

    Alternate methods of generating hydrogen from the sulfuric acid thermal or electrochemical cycles are presented. A number of processes requiring chemical, electrochemical or photochemical methods are also presented. These include the design of potential photoelectrodes and photocatalytic membranes using Ru impregnated nafion tubing, and the design of experiments to study the catalyzed electrolytic formation of hydrogen and sulfuric acid from sulfur dioxide and water using quinones as catalysts. Experiments are carried out to determine the value of these approaches to energy conversion.

  20. Structure Property Studies for Additively Manufactured Parts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milenski, Helen M; Schmalzer, Andrew Michael; Kelly, Daniel

    2015-08-17

    Since the invention of modern Additive Manufacturing (AM) processes, engineers and designers have worked hard to capitalize on the unique building capabilities that AM allows. By being able to customize the interior fill of parts, it is now possible to design components with a controlled density and customized internal structure. The creation of new polymers and polymer composites allows for even greater control over the mechanical properties of AM parts. One of the key reasons to explore AM is to bring about a new paradigm in part design, where materials can be strategically optimized in a way that conventional subtractive methods cannot achieve. The two processes investigated in my research were the Fused Deposition Modeling (FDM) process and the Direct Ink Write (DIW) process. The objectives of the research were to determine the impact of in-fill density and morphology on the mechanical properties of FDM parts, and to determine if DIW printed samples could be produced where the filament diameter was varied while the overall density remained constant.

  1. Ultrasonically enhanced extraction of bioactive principles from Quillaja Saponaria Molina.

    PubMed

    Gaete-Garretón, L; Vargas-Hernández, Yolanda; Cares-Pacheco, María G; Sainz, Javier; Alarcón, John

    2011-07-01

    A study of ultrasonic enhancement in the extraction of bioactive principles from Quillaja Saponaria Molina (Quillay) is presented. The effects influencing the extraction process were studied through a two-level factorial design. The effects considered in the experimental design were: granulometry, extraction time, acoustic power, raw matter/solvent ratio (concentration), and acoustic impedance. It was found that for aqueous extraction the main factors affecting the ultrasonically assisted process were: granulometry, raw matter/solvent ratio, and extraction time. The extraction ratio was increased by the ultrasonic effect, and a reduction in extraction time was verified without any influence on product quality. In addition, the process can be carried out at lower temperatures than with the conventional method. As the process developed uses chips from the branches of trees, and not only the bark, this research contributes to making saponin exploitation a sustainable industry. Copyright © 2010 Elsevier B.V. All rights reserved.
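    A two-level factorial design over the five factors listed above can be enumerated directly. The sketch below uses coded -1/+1 levels and is only an illustration of the design matrix, not the authors' actual experimental settings.

    ```python
    from itertools import product

    # Five factors from the abstract, each at a low (-1) and high (+1) coded level.
    factors = ["granulometry", "extraction_time", "acoustic_power",
               "raw_matter_solvent_ratio", "acoustic_impedance"]

    # Full 2^5 factorial: 32 runs; each run is a tuple of coded levels.
    design = list(product((-1, +1), repeat=len(factors)))
    print(f"{len(design)} runs")
    for run, levels in enumerate(design[:4], start=1):   # show the first few runs
        print(run, dict(zip(factors, levels)))
    ```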

  2. Design Tool Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, such methods depend on the initial conditions and risk falling into a local solution. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require techniques based on experience. We applied the new optimization method to a hang glider design. In this problem, both the hang glider design and its flight trajectory were optimized. The numerical calculation results show that the performance of the method is sufficient for practical use.
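    A minimal sketch in the spirit of the method, estimating the optimum as a weighted (stochastic) average of sampled candidates rather than a single deterministic iterate, is shown below. The toy cost function and the Boltzmann-style weighting are illustrative assumptions, not the paper's path-integral formulation or its hang-glider model.

    ```python
    import numpy as np

    def stochastic_average_minimize(cost, dim, n_samples=20000, temperature=0.05, seed=0):
        """Estimate a minimizer as the expected value of sampled candidates weighted
        by exp(-cost/temperature), i.e. a stochastic average rather than a single
        deterministic iterate; insensitive to any initial guess."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-2.0, 2.0, size=(n_samples, dim))   # candidate designs
        c = cost(x)
        w = np.exp(-(c - c.min()) / temperature)            # Boltzmann-like weights
        return (w[:, None] * x).sum(axis=0) / w.sum()

    # Toy multimodal cost standing in for the design/trajectory objective.
    cost = lambda x: (x ** 2).sum(axis=1) + 0.3 * np.sin(5 * x).sum(axis=1)
    print(stochastic_average_minimize(cost, dim=2))
    ```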

  3. Experimental evaluation of tool wear throughout a continuous stroke blanking process of quenched 22MnB5 ultra-high-strength steel

    NASA Astrophysics Data System (ADS)

    Vogt, S.; Neumayer, F. F.; Serkyov, I.; Jesner, G.; Kelsch, R.; Geile, M.; Sommer, A.; Golle, R.; Volk, W.

    2017-09-01

    Steel is the most common material used in vehicles’ chassis, which makes its research an important topic for the automotive industry. Recently developed ultra-high-strength steels (UHSS) provide extreme tensile strength up to 1,500 MPa and combine great crashworthiness with good weight reduction potential. However, in order to reach the final shape of sheet metal parts additional cutting steps such as trimming and piercing are often required. The final trimming of quenched metal sheets presents a huge challenge to a conventional process, mainly because of the required extreme cutting force. The high cutting impact, due to the materials’ brittleness, causes excessive tool wear or even sudden tool failure. Therefore, a laser is commonly used for the cutting process, which is time and energy consuming. The purpose of this paper is to demonstrate the capability of a conventional blanking tool design in a continuous stroke piercing process using boron steel 22MnB5 sheets. Two different types of tool steel were tested for their suitability as active cutting elements: electro-slag remelted (ESR) cold work tool steel Bohler K340 ISODUR and powder-metallurgic (PM) high speed steel Bohler S390 MICROCLEAN. A FEM study provided information about an optimized punch design, which withstands buckling under high cutting forces. The wear behaviour of the process was assessed by the tool wear of the active cutting elements as well as the quality of cut surfaces.

  4. The role of simulation in the design of a neural network chip

    NASA Technical Reports Server (NTRS)

    Desai, Utpal; Roppel, Thaddeus A.; Padgett, Mary L.

    1993-01-01

    An iterative, simulation-based design procedure for a neural network chip is introduced. For this design procedure, the goal is to produce a chip layout for a neural network in which the weights are determined by transistor gate width-to-length ratios. In a given iteration, the current layout is simulated using the circuit simulator SPICE, and layout adjustments are made based on conventional gradient-descent methods. After the iteration converges, the chip is fabricated. Monte Carlo analysis is used to predict the effect of statistical fabrication process variations on the overall performance of the neural network chip.
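    The iterate-simulate-adjust loop can be sketched as follows, with a stand-in simulate() function in place of the SPICE run and a finite-difference gradient in place of the paper's gradient-descent adjustment of width-to-length ratios; all names and numbers are illustrative.

    ```python
    import numpy as np

    def simulate(wl_ratios):
        """Stand-in for a SPICE simulation: returns an error measure between the
        network outputs implied by the current layout and the desired outputs."""
        target = np.array([2.0, 0.5, 1.5])
        return float(((wl_ratios - target) ** 2).sum())

    def adjust_layout(wl_ratios, step=0.05, eps=1e-3, n_iter=200):
        """Iteratively adjust transistor W/L ratios by gradient descent on the
        simulated error, re-running the 'simulation' each iteration."""
        wl = wl_ratios.astype(float).copy()
        for _ in range(n_iter):
            base = simulate(wl)
            if base < 1e-6:                        # converged: ready for fabrication
                break
            grad = np.zeros_like(wl)
            for i in range(len(wl)):               # finite-difference gradient
                bumped = wl.copy(); bumped[i] += eps
                grad[i] = (simulate(bumped) - base) / eps
            wl -= step * grad
        return wl

    print(adjust_layout(np.array([1.0, 1.0, 1.0])))
    ```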

  5. Interrupting the telos: locating subsistence in contemporary US forests

    Treesearch

    Marla R. Emery; Alan R. Pierce

    2005-01-01

    People continue to hunt, fish, trap, and gather for subsistence purposes in the contemporary United States. This fact has implications for forest policy, as suggested by an international convention on temperate and boreal forests, commonly known as the Montreal Process. Three canons of law provide a legal basis for subsistence activities by designated social groups in...

  6. Teaching Tobacco Cessation Skills to Uruguayan Physicians Using Information and Communication Technologies

    ERIC Educational Resources Information Center

    Llambi, Laura; Esteves, Elba; Martinez, Elisa; Forster, Thais; Garcia, Sofia; Miranda, Natalia; Arredondo, Antonio Lopez; Margolis, Alvaro

    2011-01-01

    Introduction: Since 2004, with the ratification of the Framework Convention on Tobacco Control, Uruguay has implemented a wide range of legal restrictions designed to reduce the devastating impacts of tobacco. This legal process generated an increase in demand for tobacco cessation treatment, which led to the need to train a large number of…

  7. Welding parameter optimization of alloy material by friction stir welding using Taguchi approach and design of experiments

    NASA Astrophysics Data System (ADS)

    Karwande, Amit H.; Rao, Seeram Srinivasa

    2018-04-01

    Friction stir welding (FSW) is a welding process in which metals are joined in their solid state, without melting. In engineering areas such as civil, mechanical, naval, and aeronautical engineering, beams made of magnesium alloys are widely used for different applications and are typically joined by conventional inert gas welding processes. Magnesium has low density and a low melting point, and the large heat generation of common welding processes therefore makes it necessary to adopt a new welding process. The FSW process increases weld quality, as observed in various mechanical tests using different tool sizes.
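    The Taguchi evaluation step can be illustrated with a standard L4 orthogonal array and a larger-is-better signal-to-noise ratio; the factor labels and response values below are invented for illustration, not the study's measurements.

    ```python
    import numpy as np

    # Standard L4 (2^3) orthogonal array: 4 runs, three two-level factors
    # (e.g. rotational speed, traverse speed, tool pin profile; illustrative labels).
    L4 = np.array([[1, 1, 1],
                   [1, 2, 2],
                   [2, 1, 2],
                   [2, 2, 1]])

    # Invented tensile-strength responses (MPa) for each run, two repetitions each.
    y = np.array([[182.0, 178.0],
                  [205.0, 198.0],
                  [211.0, 215.0],
                  [190.0, 186.0]])

    # Larger-is-better signal-to-noise ratio for each run: -10*log10(mean(1/y^2)).
    sn = -10.0 * np.log10(np.mean(1.0 / y ** 2, axis=1))

    # Main effect of each factor = mean S/N at level 2 minus mean S/N at level 1.
    for col, name in enumerate(["speed", "feed", "tool"]):
        effect = sn[L4[:, col] == 2].mean() - sn[L4[:, col] == 1].mean()
        print(f"{name}: effect on S/N = {effect:+.2f} dB")
    ```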

  8. Laser doppler blood flow imaging using a CMOS imaging sensor with on-chip signal processing.

    PubMed

    He, Diwei; Nguyen, Hoang C; Hayes-Gill, Barrie R; Zhu, Yiqun; Crowe, John A; Gill, Cally; Clough, Geraldine F; Morgan, Stephen P

    2013-09-18

    The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. To obtain a space efficient design over 64 × 64 pixels means that standard processing electronics used off-chip cannot be implemented. Therefore the analog signal processing at each pixel is a tailored design for LDBF signals with balanced optimization for signal-to-noise ratio and silicon area. This custom made sensor offers key advantages over conventional sensors, viz. the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low resource implementation of the digital processor enables on-chip processing and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.

  9. 77 FR 69568 - Special Conditions: Bombardier Aerospace, Model BD-500-1A10 and BD-500-1A11 Airplanes; Sidestick...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-20

    ... sidestick controller instead of a conventional control column and wheel. This kind of controller is designed... conventional control column and wheel. This kind of controller is designed for one-hand operation. Discussion... controller instead of a conventional wheel or control stick. This kind of controller is designed to be...

  10. 78 FR 11089 - Special Conditions: Bombardier Aerospace, Model BD-500-1A10 and BD-500-1A11 Airplanes; Sidestick...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-15

    ... controller instead of a conventional control column and wheel. This kind of controller is designed for only... following novel or unusual design feature: A sidestick controller instead of a conventional control column... conventional wheel or control stick. This kind of controller is designed to be operated using only one hand...

  11. [Evaluation of production and clinical working time of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays for complete denture].

    PubMed

    Wei, L; Chen, H; Zhou, Y S; Sun, Y C; Pan, S X

    2017-02-18

    To compare the technician fabrication time and clinical working time of custom trays fabricated using two different methods, the three-dimensional printing custom trays and the conventional custom trays, and to prove the feasibility of the computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays in clinical use from the perspective of clinical time cost. Twenty edentulous patients were recruited into this study, which was a prospective, single-blind, randomized, self-controlled clinical trial. Two custom trays were fabricated for each participant. One of the custom trays was fabricated using the functional suitable denture (FSD) system through a CAD/CAM process, and the other was manually fabricated using conventional methods. Then the final impressions were taken using both custom trays, followed by utilizing the final impressions to fabricate complete dentures, respectively. The technician production time of the custom trays and the clinical working time of taking the final impression were recorded. The average times spent on fabricating the three-dimensional printing custom trays using the FSD system and fabricating the conventional custom trays manually were (28.6±2.9) min and (31.1±5.7) min, respectively. The average times spent on making the final impression with the three-dimensional printing custom trays using the FSD system and with the conventional custom trays fabricated manually were (23.4±11.5) min and (25.4±13.0) min, respectively. There was a significant difference in the technician fabrication time and the clinical working time between the three-dimensional printing custom trays using the FSD system and the conventional custom trays fabricated manually (P<0.05). The average times spent on fabricating the three-dimensional printing custom trays using the FSD system and on making the final impression with these trays were less than those of the conventional custom trays fabricated manually, which reveals that the FSD three-dimensional printing custom trays are less time-consuming in both the clinical and laboratory processes than the conventional custom trays. In addition, when custom trays are manufactured by the three-dimensional printing method, there is no need to pour a preliminary cast after taking the primary impression; therefore, impression material and model material can be saved. For complete denture restoration, manufacturing custom trays using the FSD system is worth popularizing.

  12. The effect of hearing aid technologies on listening in an automobile.

    PubMed

    Wu, Yu-Hsiang; Stangl, Elizabeth; Bentler, Ruth A; Stanziola, Rachel W

    2013-06-01

    Communication while traveling in an automobile often is very difficult for hearing aid users. This is because the automobile/road noise level is usually high, and listeners/drivers often do not have access to visual cues. Since the talker of interest usually is not located in front of the listener/driver, conventional directional processing that places the directivity beam toward the listener's front may not be helpful and, in fact, could have a negative impact on speech recognition (when compared to omnidirectional processing). Recently, technologies have become available in commercial hearing aids that are designed to improve speech recognition and/or listening effort in noisy conditions where talkers are located behind or beside the listener. These technologies include (1) a directional microphone system that uses a backward-facing directivity pattern (Back-DIR processing), (2) a technology that transmits audio signals from the ear with the better signal-to-noise ratio (SNR) to the ear with the poorer SNR (Side-Transmission processing), and (3) a signal processing scheme that suppresses the noise at the ear with the poorer SNR (Side-Suppression processing). The purpose of the current study was to determine the effect of (1) conventional directional microphones and (2) newer signal processing schemes (Back-DIR, Side-Transmission, and Side-Suppression) on listener's speech recognition performance and preference for communication in a traveling automobile. A single-blinded, repeated-measures design was used. Twenty-five adults with bilateral symmetrical sensorineural hearing loss aged 44 through 84 yr participated in the study. The automobile/road noise and sentences of the Connected Speech Test (CST) were recorded through hearing aids in a standard van moving at a speed of 70 mph on a paved highway. The hearing aids were programmed to omnidirectional microphone, conventional adaptive directional microphone, and the three newer schemes. CST sentences were presented from the side and back of the hearing aids, which were placed on the ears of a manikin. The recorded stimuli were presented to listeners via earphones in a sound-treated booth to assess speech recognition performance and preference with each programmed condition. Compared to omnidirectional microphones, conventional adaptive directional processing had a detrimental effect on speech recognition when speech was presented from the back or side of the listener. Back-DIR and Side-Transmission processing improved speech recognition performance (relative to both omnidirectional and adaptive directional processing) when speech was from the back and side, respectively. The performance with Side-Suppression processing was better than with adaptive directional processing when speech was from the side. The participants' preferences for a given processing scheme were generally consistent with speech recognition results. The finding that performance with adaptive directional processing was poorer than with omnidirectional microphones demonstrates the importance of selecting the correct microphone technology for different listening situations. The results also suggest the feasibility of using hearing aid technologies to provide a better listening experience for hearing aid users in automobiles. American Academy of Audiology.

  13. Potential Space Applications for Body-Centric Wireless and E-Textile Antennas

    NASA Technical Reports Server (NTRS)

    Kennedy, Timothy F.; Fink, Patrick W.; Chu, Andrew W.; Studor, George F.

    2007-01-01

    Space environment benefits of body-centric wireless communications are numerous, particularly in the context of long duration Lunar and Martian outposts that are in planning stages at several space agencies around the world. Since crew time for such missions is a scarce commodity, seamless integration of body-centric wireless from various sources is paramount. Sources include traditional data, such as audio, video, tracking, and biotelemetry. Newer data sources include positioning, orientation, and status of handheld tools and devices, as well as management and status of on-body inventories. In addition to offering lighter weight and flexibility, performance benefits of e-textile antennas are anticipated due to advantageous use of the body's surface area. In creating e-textile antennas and RF devices, researchers are faced with the challenge of transferring conventional and novel designs to textiles. Lack of impedance control, limited conductivity, and the inability to automatically create intricate designs are examples of limitations frequently attributed to e-textiles. Reliable interfaces between e-textiles and conventional hardware also represent significant challenges. Addressing these limitations is critical to the continued development and acceptance of fabric-based circuits for body-centric wireless applications. Here we present several examples of e-textile antennas and RF devices, created using a NASA-developed process, that overcome several of these limitations. The design and performance of an equiangular spiral, miniaturized spiral-loaded slot antenna, and a hybrid coupler are considered, with the e-textile devices showing comparable performance to like designs using conventional materials.

  14. Barriers to the utilization of synthetic fuels for transportation

    NASA Technical Reports Server (NTRS)

    Parker, H. W.; Reilly, M. J.

    1981-01-01

    The principal types of engines for transportation uses are reviewed and the specifications for conventional fuels are compared with specifications for synthetic fuels. Synfuel processes nearing the commercialization phase are reviewed. The barriers to using synfuels can be classified into four groups: technical, such as the uncertainty that a new engine design can satisfy the desired performance criteria; environmental, such as the risk that the engine emissions cannot meet the applicable environmental standards; economic, including the cost of using a synfuel relative to conventional transportation fuels; and market, involving market penetration by offering new engines, establishing new distribution systems and/or changing user expectations.

  15. Subwavelength grating enabled on-chip ultra-compact optical true time delay line

    PubMed Central

    Wang, Junjia; Ashrafi, Reza; Adams, Rhys; Glesk, Ivan; Gasulla, Ivana; Capmany, José; Chen, Lawrence R.

    2016-01-01

    An optical true time delay line (OTTDL) is a basic photonic building block that enables many microwave photonic and optical processing operations. The conventional design for an integrated OTTDL that is based on spatial diversity uses a length-variable waveguide array to create the optical time delays, which can introduce complexities in the integrated circuit design. Here we report the first ever demonstration of an integrated index-variable OTTDL that exploits spatial diversity in an equal length waveguide array. The approach uses subwavelength grating waveguides in silicon-on-insulator (SOI), which enables the realization of OTTDLs having a simple geometry and that occupy a compact chip area. Moreover, compared to conventional wavelength-variable delay lines with a few THz operation bandwidth, our index-variable OTTDL has an extremely broad operation bandwidth practically exceeding several tens of THz, which supports operation for various input optical signals with broad ranges of central wavelength and bandwidth. PMID:27457024

  16. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.

  17. Subwavelength grating enabled on-chip ultra-compact optical true time delay line.

    PubMed

    Wang, Junjia; Ashrafi, Reza; Adams, Rhys; Glesk, Ivan; Gasulla, Ivana; Capmany, José; Chen, Lawrence R

    2016-07-26

    An optical true time delay line (OTTDL) is a basic photonic building block that enables many microwave photonic and optical processing operations. The conventional design for an integrated OTTDL that is based on spatial diversity uses a length-variable waveguide array to create the optical time delays, which can introduce complexities in the integrated circuit design. Here we report the first ever demonstration of an integrated index-variable OTTDL that exploits spatial diversity in an equal length waveguide array. The approach uses subwavelength grating waveguides in silicon-on-insulator (SOI), which enables the realization of OTTDLs having a simple geometry and that occupy a compact chip area. Moreover, compared to conventional wavelength-variable delay lines with a few THz operation bandwidth, our index-variable OTTDL has an extremely broad operation bandwidth practically exceeding several tens of THz, which supports operation for various input optical signals with broad ranges of central wavelength and bandwidth.

  18. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.

  19. Outcomes With a Self-Fitting Hearing Aid.

    PubMed

    Keidser, Gitte; Convery, Elizabeth

    2018-01-01

    Self-fitting hearing aids (SFHAs)-devices that enable self-directed threshold measurements leading to a prescribed hearing aid (HA) setting, and fine-tuning, without the need for professional support-are now commercially available. This study examined outcomes obtained with one commercial SFHA, the Companion (SoundWorld Solutions), when support was available from a clinical assistant during self-fitting. Participants consisted of 27 experienced and 25 new HA users who completed the self-fitting process, resulting in 38 user-driven and 14 clinician-driven fittings. Following 12 weeks' experience with the SFHAs in the field, outcomes measured included the following: coupler gain and output, HA handling and management skills, speech recognition in noise, and self-reported benefit and satisfaction. In addition, the conventionally fitted HAs of 22 of the experienced participants who had user-driven fittings were evaluated. Irrespective of HA experience, the type of fitting (user- or clinician-driven) had no significant effect on coupler gain, speech recognition scores, or self-reported benefit and satisfaction. Users selected significantly higher low-frequency gain in the SFHAs when compared with the conventionally fitted HAs. The conventionally fitted HAs were rated significantly higher for benefit and satisfaction on some subscales due to negative issues with the physical design and implementation of the SFHAs, rather than who drove the fitting process. Poorer cognitive function was associated with poorer handling and management of the SFHAs. Findings suggest that with the right design and support, SFHAs may be a viable option to improve the accessibility of hearing health care.

  20. Anaerobic digestion of municipal solid waste: Technical developments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivard, C.J.

    1996-01-01

    The anaerobic biogasification of organic wastes generates two useful products: a medium-Btu fuel gas and a compost-quality organic residue. Although commercial-scale digestion systems are used to treat municipal sewage wastes, the disposal of solid organic wastes, including municipal solid wastes (MSW), requires a more cost-efficient process. Modern biogasification systems employ high-rate, high-solids fermentation methods to improve process efficiency and reduce capital costs. The design criteria and development stages are discussed. These systems are also compared with conventional low-solids fermentation technology.

  1. Freeform Optics: current challenges for future serial production

    NASA Astrophysics Data System (ADS)

    Schindler, C.; Köhler, T.; Roth, E.

    2017-10-01

    One of the major recent developments in the optics industry is the commercial manufacturing of freeform surfaces for mid- and high-performance optical systems. Removing the restriction to rotational symmetry enables completely new optical design solutions, but it also creates completely new challenges for the manufacturer. Adapting serial production from rotationally symmetric to freeform optics cannot be done just by extending machine capabilities and software for every process step. New solutions for conventional optics production, or completely new process chains, are necessary.

  2. Photonics applications in high-capacity data link terminals

    NASA Astrophysics Data System (ADS)

    Shi, Zan; Foshee, James J.

    2001-12-01

    Radio systems and, in particular, RF data link systems are evolving toward progressively more bandwidth and higher data rates. For many military RF data link applications the data transfer requirements exceed one Gigabit per second. Airborne collectors need to transfer sensor information and other large data files to ground locations and other airborne terminals, including the real-time transfer of files. It is a challenge for the system designer to provide a system design that meets the RF link budget requirements for a one Gigabit per second data link, and there is a corresponding challenge in the development of the terminal architecture and hardware. The utilization of photonic circuitry and devices as a part of the terminal design offers the designer some alternatives to the conventional RF hardware design within the radio. Areas of consideration for the implementation of photonic technology include Gigabit per second baseband data interfaces with fiber, along with the associated clocking rates, and extending these Gigabit data rates into the radio for optical processing technology; optical interconnections within the individual circuit boards in the radio; and optical backplanes to allow the transfer of not only the Gigabit per second data rates and high speed clocks but other RF signals within the radio. True time delay using photonics in phased array antennas has been demonstrated and is an alternative to the conventional phase shifter designs used in phased array antennas. Phased array antennas can also be remoted from the terminal electronics in the Ku and Ka frequency bands using fiber optics as the carrier to minimize the RF losses, eliminate the conventional waveguides, and allow the terminal equipment to be located with other electronic equipment in the aircraft, suitable for a controlled environment, ready access, and maintenance. The various photonic design alternatives will be discussed, including specific photonic design approaches. Packaging, performance, and affordability of the various design alternatives will also be discussed.

  3. Advanced optical sensing and processing technologies for the distributed control of large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Williams, G. M.; Fraser, J. C.

    1991-01-01

    The objective was to examine state-of-the-art optical sensing and processing technology applied to control the motion of flexible spacecraft. Proposed large flexible space systems, such as optical telescopes and antennas, will require control over vast surfaces. Most likely, distributed control will be necessary, involving many sensors to accurately measure the surface. A similarly large number of actuators must act upon the system. The technical approach included reviewing proposed NASA missions to assess system needs and requirements. A candidate mission was chosen as a baseline study spacecraft for comparison of conventional and optical control components. Control system requirements of the baseline system were used for designing both a control system containing current off-the-shelf components and a system utilizing electro-optical devices for sensing and processing. State-of-the-art surveys of conventional sensor, actuator, and processor technologies were performed. A technology development plan is presented that lays out a logical, effective way to develop and integrate these advancing technologies.

  4. Influence of electrical and hybrid heating on bread quality during baking.

    PubMed

    Chhanwal, N; Ezhilarasi, P N; Indrani, D; Anandharamakrishnan, C

    2015-07-01

    Energy efficiency and product quality are the key factors for any food processing industry. The aim of the study was to develop an energy- and time-efficient baking process. The hybrid heating (infrared + electrical) oven was designed and fabricated using two infrared lamps and electric heating coils. The developed oven can be operated in serial or combined heating modes. The standardized baking conditions were 18 min at 220°C to produce the bread from the hybrid heating oven. The effect of baking in hybrid heating mode (H-1 and H-2, hybrid oven) on the quality characteristics of bread, as against conventional heating mode (C-1, pilot-scale oven; C-2, hybrid oven), was studied. The results showed that breads baked in hybrid heating mode (H-2) had higher moisture content (28.87%), higher volume (670 cm³), a lower crumb firmness value (374.6 g), and an overall quality score (67.0) comparable to the conventional baking process (68.5). Moreover, bread baked in hybrid heating mode showed a 28% reduction in baking time.

  5. Cell-controlled hybrid perfusion fed-batch CHO cell process provides significant productivity improvement over conventional fed-batch cultures.

    PubMed

    Hiller, Gregory W; Ovalle, Ana Maria; Gagnon, Matthew P; Curran, Meredith L; Wang, Wenge

    2017-07-01

    A simple method originally designed to control lactate accumulation in fed-batch cultures of Chinese Hamster Ovary (CHO) cells has been modified and extended to allow cells in culture to control their own rate of perfusion to precisely deliver nutritional requirements. The method allows for very fast expansion of cells to high density while using a minimal volume of concentrated perfusion medium. When the short-duration cell-controlled perfusion is performed in the production bioreactor and is immediately followed by a conventional fed-batch culture using highly concentrated feeds, the overall productivity of the culture is approximately doubled when compared with a highly optimized state-of-the-art fed-batch process. The technology was applied with near uniform success to five CHO cell processes producing five different humanized monoclonal antibodies. The increases in productivity were due to the increases in sustained viable cell densities. Biotechnol. Bioeng. 2017;114: 1438-1447. © 2017 Wiley Periodicals, Inc. © 2017 Wiley Periodicals, Inc.

  6. A cognitive approach for design of a multimedia informed consent video and website in pediatric research.

    PubMed

    Antal, Holly; Bunnell, H Timothy; McCahan, Suzanne M; Pennington, Chris; Wysocki, Tim; Blake, Kathryn V

    2017-02-01

    Poor participant comprehension of research procedures following the conventional face-to-face consent process for biomedical research is common. We describe the development of a multimedia informed consent video and website that incorporates cognitive strategies to enhance comprehension of study-related material directed to parents and adolescents. A multidisciplinary team was assembled for development of the video and website that included human subjects professionals; psychologist researchers; institutional video and web developers; bioinformaticians and programmers; and parent and adolescent stakeholders. Five learning strategies that included Sensory-Modality view, Coherence, Signaling, Redundancy, and Personalization were integrated into a 15-min video and website material that describes a clinical research trial. A diverse team collaborated extensively over 15 months to design and build a multimedia platform for obtaining parental permission and adolescent assent for participation in an asthma clinical trial. Examples of the learning principles included having a narrator describe what was being viewed on the video (sensory-modality); eliminating unnecessary text and graphics (coherence); having the initial portion of the video explain the sections of the video to be viewed (signaling); avoiding simultaneous presentation of text and graphics (redundancy); and having a consistent narrator throughout the video (personalization). Existing conventional and multimedia processes for obtaining research informed consent have not actively incorporated basic principles of human cognition and learning in the design and implementation of these processes. The present paper illustrates how this can be achieved, setting the stage for rigorous evaluation of potential benefits such as improved comprehension, satisfaction with the consent process, and completion of research objectives. New consent strategies that have an integrated cognitive approach need to be developed and tested in controlled trials. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. A Three-Dimensional Hydrodynamic Focusing Method for Polyplex Synthesis

    PubMed Central

    Lu, Mengqian; Ho, Yi-Ping; Grigsby, Christopher L.; Nawaz, Ahmad Ahsan; Leong, Kam W.; Huang, Tony Jun

    2014-01-01

    Successful intracellular delivery of nucleic acid therapeutics relies on multi-aspect optimization, one of which is formulation. While there has been ample innovation on chemical design of polymeric gene carriers, the same cannot be said for physical processing of polymer-DNA nanocomplexes (polyplexes). Conventional synthesis of polyplexes by bulk mixing depends on the operators’ experience. The poorly controlled bulk-mixing process may also lead to batch-to-batch variation and consequent irreproducibility. Here, we synthesize polyplexes by using a three-dimensional hydrodynamic focusing (3D-HF) technique in a single-layered, planar microfluidic device. Without any additional chemical treatment or post processing, the polyplexes prepared by the 3D-HF method show smaller size, slower aggregation rate, and higher transfection efficiency, while exhibiting reduced cytotoxicity compared to the ones synthesized by conventional bulk mixing. In addition, by introducing external acoustic perturbation, mixing can be further enhanced, leading to even smaller nanocomplexes. The 3D-HF method provides a simple and reproducible process for synthesizing high-quality polyplexes, addressing a critical barrier in the eventual translation of nucleic acid therapeutics. PMID:24341632

  8. Mini Solar and Sea Current Power Generation System

    NASA Astrophysics Data System (ADS)

    Almenhali, Abdulrahman; Alshamsi, Hatem; Aljunaibi, Yaser; Almussabi, Dheyab; Alshehhi, Ahmed; Hilal, Hassan Bu

    2017-07-01

    The power demand in the United Arab Emirates has increased, leading to frequent power cuts in our region. This is because of high power consumption by factories and also because of the limited availability of conventional energy resources. Electricity is a much-needed facility for human beings. All the conventional energy resources are depleting day by day, so we have to shift from conventional to non-conventional energy resources. In this work, two energy resources are combined, i.e., wind and solar energy. This approach draws on sustainable energy resources without damaging nature. We can provide uninterrupted power by using a hybrid energy system. Basically, this system involves the integration of two energy systems that together give continuous power. Solar panels are used for converting solar energy and wind turbines are used for converting wind energy into electricity. This electrical power can be utilized for various purposes. Generation of electricity takes place at an affordable cost. This paper deals with the generation of electricity by using the two sources combined, which leads to generating electricity at an affordable cost without disturbing the balance of nature. The purpose of this project was to design a portable and low-cost power system that combines both sea current electric turbine and solar electric technologies. This system will be designed in an effort to develop a power solution for remote locations or to serve as another source of green power.

  9. Enzyme reactor design under thermal inactivation.

    PubMed

    Illanes, Andrés; Wilson, Lorena

    2003-01-01

    Temperature is a very relevant variable for any bioprocess. Temperature optimization of bioreactor operation is a key aspect for process economics. This is especially true for enzyme-catalyzed processes, because enzymes are complex, unstable catalysts whose technological potential relies on their operational stability. Enzyme reactor design is presented with a special emphasis on the effect of thermal inactivation. Enzyme thermal inactivation is a very complex process from a mechanistic point of view. However, for the purpose of enzyme reactor design, it has been oversimplified frequently, considering one-stage first-order kinetics of inactivation and data gathered under nonreactive conditions that poorly represent the actual conditions within the reactor. More complex mechanisms are frequent, especially in the case of immobilized enzymes, and most important is the effect of catalytic modulators (substrates and products) on enzyme stability under operation conditions. This review focuses primarily on reactor design and operation under modulated thermal inactivation. It also presents a scheme for bioreactor temperature optimization, based on validated temperature-explicit functions for all the kinetic and inactivation parameters involved. More conventional enzyme reactor design is presented merely as a background for the purpose of highlighting the need for a deeper insight into enzyme inactivation for proper bioreactor design.
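
    The coupling described above, in which the catalytic rate increases with temperature while the catalyst itself decays, can be illustrated with a short numerical sketch. The Arrhenius parameters, Michaelis constant and batch time below are hypothetical placeholders, not values from the review; the point is only to show how a design calculation couples substrate conversion to one-stage, first-order thermal inactivation.

    ```python
    import numpy as np

    # Hypothetical batch enzyme reactor: Michaelis-Menten kinetics with
    # one-stage, first-order thermal inactivation of the catalyst.
    def simulate_batch(T_kelvin, hours=10.0, dt=1e-3):
        R = 8.314                                         # J/(mol K)
        # Assumed Arrhenius parameters (illustrative only).
        vmax0 = 2.0e9 * np.exp(-50e3 / (R * T_kelvin))    # mmol/(L h) at t = 0
        kd    = 1.0e20 * np.exp(-128e3 / (R * T_kelvin))  # 1/h, first-order inactivation
        Km    = 5.0                                       # mmol/L, assumed constant
        S, t  = 100.0, 0.0                                # initial substrate, mmol/L
        while t < hours and S > 1e-3:
            vmax = vmax0 * np.exp(-kd * t)                # residual enzyme activity
            S -= vmax * S / (Km + S) * dt                 # explicit Euler step
            t += dt
        return 1.0 - S / 100.0                            # fractional conversion

    for T in (303.0, 313.0, 323.0, 333.0):
        print(f"T = {T - 273.15:.0f} C -> conversion after 10 h: {simulate_batch(T):.2f}")
    ```

    With these assumed parameters the conversion at fixed batch time passes through a maximum: raising the temperature accelerates catalysis but also accelerates inactivation, which is the trade-off behind the temperature optimization scheme described in the review.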

  10. A Model-based B2B (Batch to Batch) Control for An Industrial Batch Polymerization Process

    NASA Astrophysics Data System (ADS)

    Ogawa, Morimasa

    This paper describes an overview of a model-based B2B (batch to batch) control for an industrial batch polymerization process. In order to control the reaction temperature precisely, several methods based on the rigorous process dynamics model are employed at all design stages of the B2B control, such as modeling and parameter estimation of the reaction kinetics, which is one of the important parts of the process dynamics model. The designed B2B control consists of the gain-scheduled I-PD/II2-PD control (I-PD with double integral control), the feed-forward compensation at the batch start time, and the model adaptation utilizing the results of the last batch operation. Throughout the actual batch operations, the B2B control provides superior control performance compared with that of conventional control methods.

  11. Designing Meaning with Multiple Media Sources: A Case Study of an Eight-Year-Old Student's Writing Processes

    ERIC Educational Resources Information Center

    Ranker, Jason

    2007-01-01

    This case study closely examines how John (a former student of mine, age eight, second grade) composed during an informal writing group at school. Using qualitative research methods, I found that John selectively took up conventions, characters, story grammars, themes, and motifs from video games, television, Web pages, and comics. Likening his…

  12. Proceedings of Selected Research and Development Presentations at the 1996 National Convention of the Association for Educational Communications and Technology Sponsored by the Research and Theory Division (18th, Indianapolis, IN, 1996).

    ERIC Educational Resources Information Center

    Simonson, Michael R., Ed.; And Others

    1996-01-01

    This proceedings volume contains 77 papers. Subjects addressed include: image processing; new faculty research methods; preinstructional activities for preservice teacher education; computer "window" presentation styles; interface design; stress management instruction; cooperative learning; graphical user interfaces; student attitudes,…

  13. Biologically inspired binaural hearing aid algorithms: Design principles and effectiveness

    NASA Astrophysics Data System (ADS)

    Feng, Albert

    2002-05-01

    Despite rapid advances in the sophistication of hearing aid technology and microelectronics, listening in noise remains problematic for people with hearing impairment. To solve this problem, two algorithms were designed for use in binaural hearing aid systems. The signal processing strategies are based on principles in auditory physiology and psychophysics: (a) the location/extraction (L/E) binaural computational scheme determines the directions of source locations and cancels noise by applying a simple subtraction method over every frequency band; and (b) the frequency-domain minimum-variance (FMV) scheme extracts a target sound from a known direction amidst multiple interfering sound sources. Both algorithms were evaluated using standard metrics such as signal-to-noise-ratio gain and articulation index. Results were compared with those from conventional adaptive beam-forming algorithms. In free-field tests with multiple interfering sound sources our algorithms performed better than conventional algorithms. Preliminary intelligibility and speech reception results in multitalker environments showed gains for every listener with normal or impaired hearing when the signals were processed in real time with the FMV binaural hearing aid algorithm. [Work supported by NIH-NIDCD Grant No. R21DC04840 and the Beckman Institute.]

  14. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with design. It realizes the modern civil jet development concept, which is “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method of the requirements are introduced in detail in the study, with the hope of providing experience for other civil jet product designs.

  15. Real-time image processing for non-contact monitoring of dynamic displacements using smartphone technologies

    NASA Astrophysics Data System (ADS)

    Min, Jae-Hong; Gelo, Nikolas J.; Jo, Hongki

    2016-04-01

    The newly developed smartphone application, named RINO, in this study allows measuring absolute dynamic displacements and processing them in real time using state-of-the-art smartphone technologies, such as a high-performance graphics processing unit (GPU), in addition to an already powerful CPU and memory, an embedded high-speed/high-resolution camera, and open-source computer vision libraries. A carefully designed color-patterned target and user-adjustable crop filter enable accurate and fast image processing, allowing up to 240 fps for complete displacement calculation and real-time display. The performance of the developed smartphone application is experimentally validated, showing accuracy comparable with that of a conventional laser displacement sensor.
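
    As an illustration of the kind of target-tracking image processing described above, the sketch below tracks the centroid of a colored target frame by frame with OpenCV and converts its pixel motion into displacement. The HSV color range, calibration factor and video file name are assumptions made for the example; this is not the RINO implementation.

    ```python
    import cv2

    # Minimal sketch: track the centroid of a colored target in each frame and
    # convert its pixel motion into displacement (mm). All numbers are assumed.
    MM_PER_PIXEL = 0.25                                  # hypothetical calibration factor
    LOWER, UPPER = (40, 80, 80), (80, 255, 255)          # assumed HSV range of the target

    cap = cv2.VideoCapture("target_video.mp4")           # placeholder input file
    ref = None
    displacements = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)            # isolate the target color
        m = cv2.moments(mask)
        if m["m00"] == 0:
            continue                                     # target not found in this frame
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        if ref is None:
            ref = (cx, cy)                               # first detection defines zero
        dx = (cx - ref[0]) * MM_PER_PIXEL
        dy = (cy - ref[1]) * MM_PER_PIXEL
        displacements.append((dx, dy))
    cap.release()
    print(f"{len(displacements)} frames processed")
    ```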

  16. The Tool for Designing Engineering Systems Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, these methods depend on the initial conditions and risk falling into local solutions. In this paper, we propose a new optimization method based on the concept of the path integral method used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and that it does not require experience-based techniques. We applied the new optimization method to the design of a hang glider. In this problem, not only the hang glider design but also its flight trajectory were optimized. The numerical calculation results showed that the method has sufficient performance.
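
    A toy version of the idea of taking the solution as a stochastic average can be sketched as follows. It uses a Boltzmann-like weighting of random samples rather than the authors' path-integral formulation, and the objective function and tuning constants are placeholders, not the hang-glider problem.

    ```python
    import numpy as np

    def objective(x):
        # Multimodal test function (placeholder, not the hang-glider problem).
        return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0, axis=-1)

    def stochastic_average_optimize(dim=2, iters=60, samples=500, beta=2.0, seed=0):
        rng = np.random.default_rng(seed)
        mean, spread = np.zeros(dim), 4.0
        for _ in range(iters):
            x = mean + spread * rng.standard_normal((samples, dim))
            f = objective(x)
            w = np.exp(-beta * (f - f.min()))      # Boltzmann-like weights
            w /= w.sum()
            mean = (w[:, None] * x).sum(axis=0)    # solution taken as an expected value
            spread *= 0.93                         # slowly narrow the search
        return mean, objective(mean[None, :])[0]

    x_best, f_best = stochastic_average_optimize()
    print("estimate:", np.round(x_best, 3), "objective:", round(float(f_best), 4))
    ```

    Because each update is an average over many random samples rather than a single descent step, the estimate is insensitive to the starting point, which is the property the abstract emphasises.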

  17. Object-oriented design and programming in medical decision support.

    PubMed

    Heathfield, H; Armstrong, J; Kirkham, N

    1991-12-01

    The concept of object-oriented design and programming has recently received a great deal of attention from the software engineering community. This paper highlights the realisable benefits of using the object-oriented approach in the design and development of clinical decision support systems. These systems seek to build a computational model of some problem domain and therefore tend to be exploratory in nature. Conventional procedural design techniques do not support either the process of model building or rapid prototyping. The central concepts of the object-oriented paradigm are introduced, namely encapsulation, inheritance and polymorphism, and their use illustrated in a case study, taken from the domain of breast histopathology. In particular, the dual roles of inheritance in object-oriented programming are examined, i.e., inheritance as a conceptual modelling tool and inheritance as a code reuse mechanism. It is argued that the use of the former is not entirely intuitive and may be difficult to incorporate into the design process. However, inheritance as a means of optimising code reuse offers substantial technical benefits.
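
    The dual roles of inheritance discussed above can be made concrete with a short example. The classes are invented for illustration (loosely themed on histopathology findings) and are not taken from the system described in the paper.

    ```python
    # Inheritance as conceptual modelling: a specialised finding *is a* finding,
    # so shared state and behaviour live in the base class (encapsulation).
    class Finding:
        def __init__(self, site, grade):
            self.site, self.grade = site, grade

        def summary(self):                      # polymorphic interface
            return f"{self.site}: grade {self.grade}"

    class DuctalCarcinomaInSitu(Finding):       # conceptual specialisation
        def summary(self):
            return "DCIS, " + super().summary()

    # Inheritance purely as code reuse: the subclass borrows the bookkeeping of
    # list but is not conceptually "a kind of list" in the domain model.
    class CaseReport(list):
        def describe(self):
            return "; ".join(f.summary() for f in self)

    report = CaseReport([Finding("left breast", 1),
                         DuctalCarcinomaInSitu("right breast", 2)])
    print(report.describe())   # dispatch picks the right summary() for each object
    ```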

  18. DNA rendering of polyhedral meshes at the nanoscale

    NASA Astrophysics Data System (ADS)

    Benson, Erik; Mohammed, Abdulmelik; Gardell, Johan; Masich, Sergej; Czeizler, Eugen; Orponen, Pekka; Högberg, Björn

    2015-07-01

    It was suggested more than thirty years ago that Watson-Crick base pairing might be used for the rational design of nanometre-scale structures from nucleic acids. Since then, and especially since the introduction of the origami technique, DNA nanotechnology has enabled increasingly more complex structures. But although general approaches for creating DNA origami polygonal meshes and design software are available, there are still important constraints arising from DNA geometry and sense/antisense pairing, necessitating some manual adjustment during the design process. Here we present a general method of folding arbitrary polygonal digital meshes in DNA that readily produces structures that would be very difficult to realize using previous approaches. The design process is highly automated, using a routeing algorithm based on graph theory and a relaxation simulation that traces scaffold strands through the target structures. Moreover, unlike conventional origami designs built from close-packed helices, our structures have a more open conformation with one helix per edge and are therefore stable under the ionic conditions usually used in biological assays.
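
    The routeing of a single scaffold strand through every edge of a mesh is, in its most elementary graph-theoretic form, the problem of finding an Eulerian circuit. The sketch below shows only that ingredient (Hierholzer's algorithm on an octahedron graph); the published method adds constraints from DNA geometry and a relaxation simulation that are not reproduced here.

    ```python
    from collections import defaultdict

    def eulerian_circuit(edges, start):
        """Hierholzer's algorithm: visit every edge exactly once
        (requires all vertices to have even degree)."""
        adj = defaultdict(list)
        for u, v in edges:
            adj[u].append(v)
            adj[v].append(u)
        stack, circuit = [start], []
        while stack:
            v = stack[-1]
            if adj[v]:
                u = adj[v].pop()
                adj[u].remove(v)          # consume the edge in both directions
                stack.append(u)
            else:
                circuit.append(stack.pop())
        return circuit[::-1]

    # Octahedron mesh: 6 vertices, 12 edges, every vertex has degree 4.
    octahedron = [(0, 1), (0, 2), (0, 3), (0, 4), (5, 1), (5, 2), (5, 3), (5, 4),
                  (1, 2), (2, 3), (3, 4), (4, 1)]
    print(eulerian_circuit(octahedron, 0))   # one closed route through all edges
    ```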

  19. DNA rendering of polyhedral meshes at the nanoscale.

    PubMed

    Benson, Erik; Mohammed, Abdulmelik; Gardell, Johan; Masich, Sergej; Czeizler, Eugen; Orponen, Pekka; Högberg, Björn

    2015-07-23

    It was suggested more than thirty years ago that Watson-Crick base pairing might be used for the rational design of nanometre-scale structures from nucleic acids. Since then, and especially since the introduction of the origami technique, DNA nanotechnology has enabled increasingly more complex structures. But although general approaches for creating DNA origami polygonal meshes and design software are available, there are still important constraints arising from DNA geometry and sense/antisense pairing, necessitating some manual adjustment during the design process. Here we present a general method of folding arbitrary polygonal digital meshes in DNA that readily produces structures that would be very difficult to realize using previous approaches. The design process is highly automated, using a routeing algorithm based on graph theory and a relaxation simulation that traces scaffold strands through the target structures. Moreover, unlike conventional origami designs built from close-packed helices, our structures have a more open conformation with one helix per edge and are therefore stable under the ionic conditions usually used in biological assays.

  20. Finite Element Modelling and Analysis of Conventional Pultrusion Processes

    NASA Astrophysics Data System (ADS)

    Akishin, P.; Barkanov, E.; Bondarchuk, A.

    2015-11-01

    Pultrusion is one of many composite manufacturing techniques and one of the most efficient methods for producing fiber reinforced polymer composite parts with a constant cross-section. Numerical simulation is helpful for understanding the manufacturing process and developing scientific means for the pultrusion tooling design. A numerical technique based on the finite element method has been developed for the simulation of pultrusion processes. It uses the general-purpose finite element software ANSYS Mechanical. It is shown that the developed technique predicts the temperature and cure profiles, which are in good agreement with those published in the open literature.

  1. Effect of the self-pumped limiter concept on the tritium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finn, P.A.; Sze, D.K.; Hassanein, A.

    1988-01-01

    The self-pumped limiter concept for impurity control of the plasma of a fusion reactor has a major impact on the design of the tritium systems. To achieve a sustained burn, conventional limiters and divertors remove large quantities of unburnt tritium and deuterium from the plasma, which must then be recycled using a plasma processing system. The self-pumped limiter, which does not remove the hydrogen species, does not require any plasma processing equipment. The blanket system and the coolant processing systems acquire greater importance with the use of this unconventional impurity control system. 3 refs., 2 figs.

  2. DESIGN OF A PATTERN RECOGNITION DIGITAL COMPUTER WITH APPLICATION TO THE AUTOMATIC SCANNING OF BUBBLE CHAMBER NEGATIVES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCormick, B.H.; Narasimhan, R.

    1963-01-01

    The overall computer system contains three main parts: an input device, a pattern recognition unit (PRU), and a control computer. The bubble chamber picture is divided into a grid of 1-mm squares on the film. It is then processed in parallel in a two-dimensional array of 1024 identical processing modules (stalactites) of the PRU. The array can function as a two-dimensional shift register in which results of successive shifting operations can be accumulated. The pattern recognition process is generally controlled by a conventional arithmetic computer. (A.G.W.)

  3. Energy-Efficient Routes for the Production of Gasoline from Biogas and Pyrolysis Oil-Process Design and Life-Cycle Assessment.

    PubMed

    Sundaram, Smitha; Kolb, Gunther; Hessel, Volker; Wang, Qi

    2017-03-29

    Two novel routes for the production of gasoline from pyrolysis oil (from timber pine) and biogas (from ley grass) are simulated, followed by a cradle-to-gate life-cycle assessment of the two production routes. The main aim of this work is to conduct a holistic evaluation of the proposed routes and benchmark them against the conventional route of producing gasoline from natural gas. A previously commercialized method of synthesizing gasoline involves conversion of natural gas to syngas, which is further converted to methanol, and then as a last step, the methanol is converted to gasoline. In the new proposed routes, the syngas production step is different; syngas is produced from a mixture of pyrolysis oil and biogas in the following two ways: (i) autothermal reforming of pyrolysis oil and biogas, in which there are two reactions in one reactor (ATR), and (ii) steam reforming of pyrolysis oil and catalytic partial oxidation of biogas, in which there are separated but thermally coupled reactions and reactors (CR). The other two steps to produce methanol from syngas, and gasoline from methanol, remain the same. The purpose of this simulation is to have an ex-ante comparison of the performance of the new routes against a reference, in terms of energy and sustainability. Thus, at this stage of simulations, nonrigorous, equilibrium-based models have been used for reactors, which will give the best-case conversions for each step. For the conventional production route, conversion and yield data available in the literature have been used, wherever available. The results of the process design showed that the second method (separate, but thermally coupled reforming) has a carbon efficiency of 0.53, compared to the conventional route (0.48), as well as the first route (0.40). The life-cycle assessment results revealed that the newly proposed processes have a clear advantage over the conventional process in some categories, particularly the global warming potential and primary energy demand; but there are also some in which the conventional route fares better, such as the human toxicity potential and the categories related to land-use change such as biotic production potential and the groundwater resistance indicator. The results confirmed that even though using biomass such as timber pine as raw material does result in reduced greenhouse gas emissions, the activities associated with biomass, such as cultivation and harvesting, contribute to the environmental footprint, particularly the land use change categories. This gives an impetus to investigate the potential of agricultural, forest, or even food waste, which would be likely to have a substantially lower impact on the environment. Moreover, it could be seen that the source of electricity used in the process has a major impact on the environmental performance.

  4. CNN based approach for activity recognition using a wrist-worn accelerometer.

    PubMed

    Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R

    2017-07-01

    In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, feature engineering has dominated conventional methods, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data is collected from four different subjects using a single wrist-worn accelerometer sensor. The validation of the proposed model is done under different pre-processing and noisy data conditions, which is evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8% as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machine.
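
    A minimal 1-D convolutional classifier for windows of tri-axial accelerometer data, in the spirit of the deep learning framework described above, might look like the following sketch. The window length, channel counts and three-class output are assumptions made for illustration and are not the architecture reported in the paper.

    ```python
    import torch
    import torch.nn as nn

    class ActivityCNN(nn.Module):
        """Toy 1-D CNN: input is a window of accelerometer samples,
        shape (batch, 3 axes, 128 time steps); output is 3 movement classes."""
        def __init__(self, n_classes=3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            )
            self.classifier = nn.Linear(32 * 32, n_classes)   # 128 -> 64 -> 32 steps

        def forward(self, x):
            x = self.features(x)                 # learned features, no hand engineering
            return self.classifier(x.flatten(1))

    model = ActivityCNN()
    dummy = torch.randn(8, 3, 128)               # a batch of 8 synthetic windows
    logits = model(dummy)
    print(logits.shape)                          # torch.Size([8, 3])
    ```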

  5. Accurate Modeling Method for Cu Interconnect

    NASA Astrophysics Data System (ADS)

    Yamada, Kenta; Kitahara, Hiroshi; Asai, Yoshihiko; Sakamoto, Hideo; Okada, Norio; Yasuda, Makoto; Oda, Noriaki; Sakurai, Michio; Hiroi, Masayuki; Takewaki, Toshiyuki; Ohnishi, Sadayuki; Iguchi, Manabu; Minda, Hiroyasu; Suzuki, Mieko

    This paper proposes an accurate modeling method of the copper interconnect cross-section in which the width and thickness dependence on layout patterns and density caused by processes (CMP, etching, sputtering, lithography, and so on) are fully incorporated and universally expressed. In addition, we have developed specific test patterns for the model parameter extraction, and an efficient extraction flow. We have extracted the model parameters for 0.15μm CMOS using this method and confirmed that the 10% τpd error normally observed with conventional LPE (Layout Parameters Extraction) was completely eliminated. Moreover, it is verified that the model can be applied to more advanced technologies (90nm, 65nm and 55nm CMOS). Since the interconnect delay variations due to the processes constitute a significant part of what have conventionally been treated as random variations, use of the proposed model could enable one to greatly narrow the guardbands required to guarantee a desired yield, thereby facilitating design closure.

  6. Improved fuzzy PID controller design using predictive functional control structure.

    PubMed

    Wang, Yuzhong; Jin, Qibing; Zhang, Ridong

    2017-11-01

    In conventional PID scheme, the ensemble control performance may be unsatisfactory due to limited degrees of freedom under various kinds of uncertainty. To overcome this disadvantage, a novel PID control method that inherits the advantages of fuzzy PID control and the predictive functional control (PFC) is presented and further verified on the temperature model of a coke furnace. Based on the framework of PFC, the prediction of the future process behavior is first obtained using the current process input signal. Then, the fuzzy PID control based on the multi-step prediction is introduced to acquire the optimal control law. Finally, the case study on a temperature model of a coke furnace shows the effectiveness of the fuzzy PID control scheme when compared with conventional PID control and fuzzy self-adaptive PID control. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
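
    For orientation, the conventional PID baseline that such schemes are compared against can be sketched as a discrete loop on a first-order-plus-dead-time temperature model. The process constants and PID settings below are hypothetical, and the fuzzy PID/PFC scheme itself is not reproduced here.

    ```python
    # Conventional discrete PID on a first-order-plus-dead-time temperature model.
    # Process and tuning constants are illustrative only, not the coke-furnace
    # model used in the paper.
    K, tau, theta, dt = 2.0, 50.0, 5.0, 1.0          # gain, time constant, delay, step (s)
    Kp, Ki, Kd = 2.0, 0.1, 0.5                       # hypothetical PID settings
    setpoint, y, integral, prev_err = 100.0, 0.0, 0.0, 0.0
    delay_buf = [0.0] * int(theta / dt)              # models the transport delay

    for k in range(400):
        err = setpoint - y
        integral += err * dt
        u = Kp * err + Ki * integral + Kd * (err - prev_err) / dt
        prev_err = err
        delay_buf.append(u)
        u_delayed = delay_buf.pop(0)                 # control applied theta seconds later
        y += dt / tau * (-y + K * u_delayed)         # first-order process update
        if k % 100 == 0:
            print(f"t = {k*dt:5.0f} s  y = {y:7.2f}")
    ```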

  7. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has been to use a simulation tool and a simple pattern spider mask. At the early stage of a device, the estimation accuracy of the simulation tool is poor, and the evaluation of the simple pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. In order to overcome the difficulty of inspecting many types of patterns, we introduced Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which those mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.

  8. Comparison of seven kinds of drinking water treatment processes to enhance organic material removal: a pilot test.

    PubMed

    Chen, Chao; Zhang, Xiaojian; He, Wenjie; Lu, Wei; Han, Hongda

    2007-08-15

    Organic matter in source water has presented many challenges in the field of water purification, especially for conventional treatment. A two-year-long pilot test comparing water treatment processes was conducted to enhance organic matter removal. The tested process combinations included the conventional process, conventional plus advanced treatment, pre-oxidation plus conventional process and pre-oxidation plus conventional plus advanced treatment. The efficiency of each kind of process was assayed with the comprehensive indices of COD(Mn), TOC, UV(254), AOC, BDOC, THMs, and HAAs and their formation potential. The results showed that the combination of the conventional process and O(3)-BAC provides integrated removal of organic matter and meets the required standards. It is the best performing treatment tested in this investigation for treating polluted source water in China. Moreover, much attention should be paid to organic removal before disinfection to control DBP formation and preserve biostability. This paper also reports the range of efficiency of each unit process to calculate the total efficiency of different process combinations in order to help choose the appropriate water treatment process.

  9. The influence of droplet size on the stability, in vivo digestion, and oral bioavailability of vitamin E emulsions.

    PubMed

    Parthasarathi, S; Muthukumar, S P; Anandharamakrishnan, C

    2016-05-18

    Vitamin E (α-tocopherol) is a nutraceutical compound, which has been shown to possess potent antioxidant and anticancer activity. However, its biological activity may be limited by its poor bioavailability. Colloidal delivery systems have shown wide applications in the food and pharmaceutical industries to deliver lipophilic bioactive compounds. In this study, we have developed conventional and nanoemulsions of vitamin E from food grade ingredients (sunflower oil, saponin, and water) and showed that the nanoemulsion formulation increased the oral bioavailability when compared to the conventional emulsion. The mean droplet diameters in the nano and conventional emulsions were 0.277 and 1.285 μm, respectively. The stability of the emulsion formulations after thermal processing, long-term storage at different temperatures, mechanical stress and in plasma was determined. The results showed that the saponin-coated nanoemulsion was stable to droplet coalescence during thermal processing (30-90 °C), long-term storage and mechanical stress when compared to the conventional emulsion. The biological fate of the emulsion formulations was studied using male Wistar rats as an animal model. The emulsion droplet stability during passage through the gastrointestinal tract was evaluated by their introduction into rat stomachs. Microscopy was used to investigate the structural changes that occurred during digestion. Both the conventional emulsion and nanoemulsion formulations showed strong evidence of droplet flocculation and coalescence during in vivo digestion. The in vivo oral bioavailability study revealed that vitamin E in a nanoemulsion form showed a 3-fold increase in the AUC when compared to the conventional emulsion. The information reported in this study will facilitate the design of colloidal delivery systems using nanoemulsion formulations.

  10. Modified tandem gratings anastigmatic imaging spectrometer with oblique incidence for spectral broadband

    NASA Astrophysics Data System (ADS)

    Cui, Chengguang; Wang, Shurong; Huang, Yu; Xue, Qingsheng; Li, Bo; Yu, Lei

    2015-09-01

    A modified spectrometer with tandem gratings that exhibits high spectral resolution and imaging quality for solar observation, monitoring, and understanding of coastal ocean processes is presented in this study. Spectral broadband anastigmatic imaging condition, spectral resolution, and initial optical structure are obtained based on geometric aberration theory. Compared with conventional tandem gratings spectrometers, this modified design permits flexibility in selecting gratings. A detailed discussion of the optical design and optical performance of an ultraviolet spectrometer with tandem gratings is also included to explain the advantage of oblique incidence for spectral broadband.

  11. Upconversion fiber-optic confocal microscopy under near-infrared pumping.

    PubMed

    Kim, Do-Hyun; Kang, Jin U; Ilev, Ilko K

    2008-03-01

    We present a simple upconversion fiber-optic confocal microscope design using a near-infrared laser for pumping of a rare-earth-doped glass powder. The nonlinear optical frequency conversion process is highly efficient, with more than 2% upconversion fluorescence efficiency at a near-infrared pumping wavelength of 1.55 μm. The upconversion confocal design allows the use of conventional Si detectors and 1.55 μm near-infrared pump light. The lateral and axial resolutions of the system were equal to or better than 1.10 and 13.11 μm, respectively.

  12. Nanophotonic particle simulation and inverse design using artificial neural networks

    PubMed Central

    Peurifoy, John; Shen, Yichen; Jing, Li; Cano-Renteria, Fidel; DeLacy, Brendan G.; Joannopoulos, John D.; Tegmark, Max

    2018-01-01

    We propose a method to use artificial neural networks to approximate light scattering by multilayer nanoparticles. We find that the network needs to be trained on only a small sampling of the data to approximate the simulation to high precision. Once the neural network is trained, it can simulate such optical processes orders of magnitude faster than conventional simulations. Furthermore, the trained neural network can be used to solve nanophotonic inverse design problems by using back propagation, where the gradient is analytical, not numerical. PMID:29868640
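
    The surrogate-plus-backpropagation idea can be illustrated on a toy problem: fit a small network to a known forward function, then hold the weights fixed and optimize the input by gradient descent to hit a target response. The forward function and network size below are placeholders, not the multilayer-nanoparticle model of the paper.

    ```python
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    def toy_spectrum(x):                      # stand-in for the scattering simulation
        return torch.sin(3.0 * x) + 0.5 * torch.cos(7.0 * x)

    # 1) Train a small surrogate network on the forward problem.
    net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(),
                        nn.Linear(64, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    xs = torch.linspace(-2.0, 2.0, 512).unsqueeze(1)
    for _ in range(2000):
        opt.zero_grad()
        loss = ((net(xs) - toy_spectrum(xs)) ** 2).mean()
        loss.backward()
        opt.step()

    # 2) Inverse design: freeze the surrogate and back-propagate to the *input*.
    target = torch.tensor([[0.8]])
    x = torch.zeros(1, 1, requires_grad=True)
    inv_opt = torch.optim.Adam([x], lr=5e-2)
    for _ in range(300):
        inv_opt.zero_grad()
        ((net(x) - target) ** 2).mean().backward()   # analytical gradient w.r.t. x
        inv_opt.step()
    print(f"designed x = {x.item():.3f}, predicted response = {net(x).item():.3f}")
    ```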

  13. Integrated potentiometric detector for use in chip-based flow cells

    PubMed

    Tantra; Manz

    2000-07-01

    A new kind of potentiometric chip sensor for ion-selective electrodes (ISE) based on a solvent polymeric membrane is described. The chip sensor is designed to trap the organic cocktail inside the chip and to permit sample solution to flow past the membrane. The design allows the sensor to overcome technical problems of ruggedness and would therefore be ideal for industrial processes. The sensor performance for a Ba2+-ISE membrane based on a Vogtle ionophore showed electrochemical behavior similar to that observed in conventional electrodes and microelectrode arrangements.

  14. Monitoring system for the quality assessment in additive manufacturing

    NASA Astrophysics Data System (ADS)

    Carl, Volker

    2015-03-01

    Additive Manufacturing (AM) refers to a process by which a set of digital data - representing a certain complex 3dim design - is used to grow the respective 3dim real structure equal to the corresponding design. For the powder-based EOS manufacturing process, a variety of plastic and metal materials can be used. Thereby, AM is in many aspects a very powerful tool as it can help to overcome particular limitations in conventional manufacturing. AM enables more freedom of design, complex, hollow and/or lightweight structures as well as product individualisation and functional integration. As such it is a promising approach with respect to the future design and manufacturing of complex 3dim structures. On the other hand, it certainly calls for new methods and standards in view of quality assessment. In particular, when utilizing AM for the design of complex parts used in aviation and aerospace technologies, appropriate monitoring systems are mandatory. In this respect, recently, sustainable progress has been accomplished by joining the common efforts and concerns of a manufacturer of Additive Manufacturing systems and respective materials (EOS), along with those of an operator of such systems (MTU Aero Engines) and experienced application engineers (Carl Metrology), using decent know-how in the field of optical and infrared methods regarding non-destructive examination (NDE). The newly developed technology is best described by a high-resolution layer-by-layer inspection technique, which allows for a 3D tomography-analysis of the complex part at any time during the manufacturing process. Thereby, inspection costs are kept rather low by using smart image-processing methods as well as CMOS sensors instead of infrared detectors. Moreover, results from conventional physical metallurgy may easily be correlated with the predictive results of the monitoring system, which not only allows for improvements of the AM monitoring system, but finally leads to an optimisation of the quality and the assurance of material security of the complex structure being manufactured. Both our poster and our oral presentation will explain the data flow between the above-mentioned parties involved. A suitable monitoring system for Additive Manufacturing will be introduced, along with a presentation of the respective high resolution data acquisition, as well as the image processing and the data analysis allowing for a precise control of the 3dim growth-process.

  15. Monitoring system for the quality assessment in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carl, Volker, E-mail: carl@t-zfp.de

    Additive Manufacturing (AM) refers to a process by which a set of digital data - representing a certain complex 3dim design - is used to grow the respective 3dim real structure equal to the corresponding design. For the powder-based EOS manufacturing process, a variety of plastic and metal materials can be used. Thereby, AM is in many aspects a very powerful tool as it can help to overcome particular limitations in conventional manufacturing. AM enables more freedom of design, complex, hollow and/or lightweight structures as well as product individualisation and functional integration. As such it is a promising approach with respect to the future design and manufacturing of complex 3dim structures. On the other hand, it certainly calls for new methods and standards in view of quality assessment. In particular, when utilizing AM for the design of complex parts used in aviation and aerospace technologies, appropriate monitoring systems are mandatory. In this respect, recently, sustainable progress has been accomplished by joining the common efforts and concerns of a manufacturer of Additive Manufacturing systems and respective materials (EOS), along with those of an operator of such systems (MTU Aero Engines) and experienced application engineers (Carl Metrology), using decent know-how in the field of optical and infrared methods regarding non-destructive examination (NDE). The newly developed technology is best described by a high-resolution layer-by-layer inspection technique, which allows for a 3D tomography-analysis of the complex part at any time during the manufacturing process. Thereby, inspection costs are kept rather low by using smart image-processing methods as well as CMOS sensors instead of infrared detectors. Moreover, results from conventional physical metallurgy may easily be correlated with the predictive results of the monitoring system, which not only allows for improvements of the AM monitoring system, but finally leads to an optimisation of the quality and the assurance of material security of the complex structure being manufactured. Both our poster and our oral presentation will explain the data flow between the above-mentioned parties involved. A suitable monitoring system for Additive Manufacturing will be introduced, along with a presentation of the respective high resolution data acquisition, as well as the image processing and the data analysis allowing for a precise control of the 3dim growth-process.

  16. Planar junctionless phototransistor: A potential high-performance and low-cost device for optical-communications

    NASA Astrophysics Data System (ADS)

    Ferhati, H.; Djeffal, F.

    2017-12-01

    In this paper, a new junctionless optical controlled field effect transistor (JL-OCFET) and its comprehensive theoretical model is proposed to achieve high optical performance and low cost fabrication process. Exhaustive study of the device characteristics and comparison between the proposed junctionless design and the conventional inversion mode structure (IM-OCFET) for similar dimensions are performed. Our investigation reveals that the proposed design exhibits an outstanding capability to be an alternative to the IM-OCFET due to the high performance and the weak signal detection benefit offered by this design. Moreover, the developed analytical expressions are exploited to formulate the objective functions to optimize the device performance using Genetic Algorithms (GAs) approach. The optimized JL-OCFET not only demonstrates good performance in terms of derived drain current and responsivity, but also exhibits superior signal to noise ratio, low power consumption, high-sensitivity, high ION/IOFF ratio and high-detectivity as compared to the conventional IM-OCFET counterpart. These characteristics make the optimized JL-OCFET potentially suitable for developing low cost and ultrasensitive photodetectors for high-performance and low cost inter-chips data communication applications.

  17. Partial Oxidation Gas Turbine for Power and Hydrogen Co-Production from Coal-Derived Fuel in Industrial Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph Rabovitser

    The report presents a feasibility study of a new type of gas turbine. A partial oxidation gas turbine (POGT) shows potential for very high efficiency power generation and ultra-low emissions. There are two main features that distinguish a POGT from a conventional gas turbine. These are associated with the design arrangement and the thermodynamic processes used in operation. A primary design difference of the POGT is utilization of a non-catalytic partial oxidation reactor (POR) in place of a conventional combustor. Another important distinction is that a much smaller compressor is required, one that typically supplies less than half of the air flow required in a conventional gas turbine. From an operational and thermodynamic point of view, a key distinguishing feature is that the working fluid, fuel gas provided by the POR, has a much higher specific heat than lean combustion products, and more energy per unit mass of fluid can be extracted by the POGT expander than in the conventional systems. The POGT exhaust stream contains unreacted fuel that can be combusted in a bottoming cycle or used as syngas for hydrogen or other chemicals production. POGT studies include a feasibility design for conversion of a conventional turbine to POGT duty, and system analyses of POGT-based units for production of power only, and for combined production of power and syngas/hydrogen for different applications. A retrofit design study was completed for three engines, SGT 800, SGT 400, and SGT 100, and includes replacing the combustor with the POR, compressor downsizing to about 50% of the design flow rate, generator replacement for a 60-90% power output increase, overall unit integration, and extensive testing. POGT performances for four turbines with power output up to 350 MW in POGT mode were calculated. With a POGT as the topping cycle for power generation systems, the power output from the POGT could be increased by up to 90% compared to the conventional engine, keeping hot section temperatures, pressures, and volumetric flows practically identical. In POGT mode, the turbine specific power (turbine net power per lb mass flow from expander exhaust) is twice the value of the conventional turbine. A POGT-based IGCC plant conceptual design was developed and major components have been identified. A fuel-flexible fluid bed gasifier and a novel POGT unit are the key components of the 100 MW IGCC plant for co-producing electricity, hydrogen and/or syngas. Plant performances were calculated for bituminous coal and oxygen-blown versions. Various POGT-based, natural gas fueled systems for production of electricity only, co-production of electricity and hydrogen, and co-production of electricity and syngas for gas-to-liquid and chemical processes were developed and evaluated. Performance calculations for several versions of these systems were conducted. A 64.6% LHV fuel-to-electricity efficiency in combined cycle was achieved. Such a high efficiency arises from using syngas from the POGT exhaust as a fuel that can provide the required temperature level for superheated steam generation in the HRSG, as well as combustion air preheating. Studies of POGT materials and combustion instabilities in the POR were conducted and the results reported. A preliminary market assessment was performed, and recommendations for POGT system applications in the oil industry were defined. POGT technology is ready to proceed to the engineering prototype stage, which is recommended.

  18. Optimization and Simulation of Plastic Injection Process using Genetic Algorithm and Moldflow

    NASA Astrophysics Data System (ADS)

    Martowibowo, Sigit Yoewono; Kaswadi, Agung

    2017-03-01

    The use of plastic-based products is continuously increasing. The increasing demands for thinner products, lower production costs, yet higher product quality have triggered an increase in the number of research projects on plastic molding processes. An important branch of such research is focused on the mold cooling system. Conventional cooling systems are most widely used because they are easy to make by using conventional machining processes. However, non-uniform cooling is considered one of their weaknesses. Apart from the conventional systems, there are also conformal cooling systems that are designed for faster and more uniform plastic mold cooling. In this study, a conformal cooling system is applied for the production of a bowl-shaped product made of PP AZ564. Optimization is conducted to determine the machine setup parameters, namely the melting temperature, injection pressure, holding pressure and holding time. The genetic algorithm method and Moldflow were used to optimize the injection process parameters for a minimum cycle time. It is found that an optimum injection molding process could be obtained by setting the parameters to the following values: TM = 180 °C, Pinj = 20 MPa, Phold = 16 MPa and thold = 8 s, with a cycle time of 14.11 s. Experiments using the conformal cooling system yielded an average cycle time of 14.19 s. The studied conformal cooling system yielded a volumetric shrinkage of 5.61% and the wall shear stress was found to be 0.17 MPa. The difference between the cycle time obtained through simulations and experiments using the conformal cooling system was insignificant (below 1%). Thus, combining process parameter optimization and simulation by using the genetic algorithm method with Moldflow can be considered valid.
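
    Coupling a genetic algorithm to a process simulation can be sketched as below. Because a Moldflow model is not available here, a made-up surrogate for cycle time stands in for the simulation, and the parameter bounds are only loosely inspired by the values quoted above.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Bounds: melt temperature (C), injection pressure (MPa), holding pressure (MPa), holding time (s)
    BOUNDS = np.array([[170.0, 220.0], [10.0, 30.0], [10.0, 20.0], [4.0, 12.0]])

    def cycle_time(p):
        """Surrogate for the simulator's cycle-time prediction (purely illustrative)."""
        T, p_inj, p_hold, t_hold = p
        return (0.05 * (T - 180.0) ** 2 / 40 + 0.02 * (p_inj - 20.0) ** 2
                + 0.03 * (p_hold - 16.0) ** 2 + abs(t_hold - 8.0) + 14.0)

    def ga(pop_size=40, generations=60, mutation=0.1):
        pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 4))
        for _ in range(generations):
            fitness = np.array([cycle_time(p) for p in pop])
            order = np.argsort(fitness)
            parents = pop[order[: pop_size // 2]]            # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = parents[rng.integers(len(parents), size=2)]
                mask = rng.random(4) < 0.5                   # uniform crossover
                child = np.where(mask, a, b)
                child += mutation * (BOUNDS[:, 1] - BOUNDS[:, 0]) * rng.standard_normal(4)
                children.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
            pop = np.vstack([parents, children])
        best = pop[np.argmin([cycle_time(p) for p in pop])]
        return best, cycle_time(best)

    best, t = ga()
    print("best parameters:", np.round(best, 2), "predicted cycle time:", round(t, 2), "s")
    ```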

  19. Dynamic tables: an architecture for managing evolving, heterogeneous biomedical data in relational database management systems.

    PubMed

    Corwin, John; Silberschatz, Avi; Miller, Perry L; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models.
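
    The contrast between a conventional relational design and an entity-attribute-value (vertical) schema can be sketched with SQLite from Python. The tables and attributes are invented examples; the sparse, column-store engine proposed in the paper is not reproduced here.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Conventional design: one column per attribute; sparse data leaves many NULLs
    # and every new attribute requires ALTER TABLE.
    cur.execute("CREATE TABLE patient_conventional (id INTEGER PRIMARY KEY, "
                "heart_rate REAL, glucose REAL, genotype TEXT)")

    # Entity-attribute-value (vertical) design: new attributes are just new rows.
    cur.execute("CREATE TABLE patient_eav (entity INTEGER, attribute TEXT, value TEXT)")
    cur.executemany("INSERT INTO patient_eav VALUES (?, ?, ?)",
                    [(1, "heart_rate", "72"),
                     (1, "genotype", "APOE e3/e3"),
                     (2, "glucose", "5.4")])

    # Reassembling a row ("pivoting") is the price paid for the flexible schema.
    cur.execute("""SELECT entity,
                          MAX(CASE WHEN attribute = 'heart_rate' THEN value END) AS heart_rate,
                          MAX(CASE WHEN attribute = 'glucose'    THEN value END) AS glucose
                   FROM patient_eav GROUP BY entity""")
    print(cur.fetchall())     # e.g. [(1, '72', None), (2, None, '5.4')]
    conn.close()
    ```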

  20. Design and measurement of fully digital ternary content addressable memory using ratioless static random access memory cells and hierarchical-AND matching comparator

    NASA Astrophysics Data System (ADS)

    Nishikata, Daisuke; Ali, Mohammad Alimudin Bin Mohd; Hosoda, Kento; Matsumoto, Hiroshi; Nakamura, Kazuyuki

    2018-04-01

    A 36-bit × 32-entry fully digital ternary content addressable memory (TCAM) using the ratioless static random access memory (RL-SRAM) technology and fully complementary hierarchical-AND matching comparators (HAMCs) was developed. Since its fully complementary and digital operation enables the effect of device variabilities to be avoided, it can operate with a quite low supply voltage. A test chip incorporating a conventional TCAM and a proposed 24-transistor ratioless TCAM (RL-TCAM) cells and HAMCs was developed using a 0.18 µm CMOS process. The minimum operating voltage of 0.25 V of the developed RL-TCAM, which is less than half of that of the conventional TCAM, was measured via the conventional CMOS push–pull output buffers with the level-shifting and flipping technique using optimized pull-up voltage and resistors.

  1. Design and performance of a production-worthy excimer-laser-based stepper

    NASA Astrophysics Data System (ADS)

    Unger, Robert; Sparkes, Christopher; Disessa, Peter A.; Elliott, David J.

    1992-06-01

    Excimer-laser-based steppers have matured to a production-worthy state. Widefield high-NA lenses have been developed and characterized for imaging down to 0.35 micron and below. Excimer lasers have attained practical levels of performance capability and stability, reliability, safety, and operating cost. Excimer stepper system integration and control issues such as focus, exposure, and overlay stability have been addressed. Enabling support technologies -- resist systems, resist processing, metrology and conventional mask making -- continue to progress and are becoming available. This paper discusses specific excimer stepper design challenges, and presents characterization data from several field installations of XLSTM deep-UV steppers configured with an advanced lens design.

  2. Design issues of a reinforcement-based self-learning fuzzy controller for petrochemical process control

    NASA Technical Reports Server (NTRS)

    Yen, John; Wang, Haojin; Daugherity, Walter C.

    1992-01-01

    Fuzzy logic controllers have some often-cited advantages over conventional techniques such as PID control, including easier implementation, accommodation to natural language, and the ability to cover a wider range of operating conditions. One major obstacle that hinders the broader application of fuzzy logic controllers is the lack of a systematic way to develop and modify their rules; as a result the creation and modification of fuzzy rules often depends on trial and error or pure experimentation. One of the proposed approaches to address this issue is a self-learning fuzzy logic controller (SFLC) that uses reinforcement learning techniques to learn the desirability of states and to adjust the consequent part of its fuzzy control rules accordingly. Due to the different dynamics of the controlled processes, the performance of a self-learning fuzzy controller is highly contingent on its design. The design issue has not received sufficient attention. The issues related to the design of a SFLC for application to a petrochemical process are discussed, and its performance is compared with that of a PID and a self-tuning fuzzy logic controller.

  3. Low-cost, high-resolution, single-structure array telescopes for imaging of low-earth-orbit satellites

    NASA Technical Reports Server (NTRS)

    Massie, N. A.; Oster, Yale; Poe, Greg; Seppala, Lynn; Shao, Mike

    1992-01-01

    Telescopes that are designed for the unconventional imaging of near-earth satellites must follow unique design rules. The costs must be reduced substantially over those of the conventional telescope designs, and the design must accommodate a technique to circumvent atmospheric distortion of the image. Apertures of 12 m and more along with altitude-altitude mounts that provide high tracking rates are required. A novel design for such a telescope, optimized for speckle imaging, has been generated. Its mount closely resembles a radar mount, and it does not use the conventional dome. Costs for this design are projected to be considerably lower than those for the conventional designs. Results of a design study are presented with details of the electro-optical and optical designs.

  4. Sparse signal representation and its applications in ultrasonic NDE.

    PubMed

    Zhang, Guang-Ming; Zhang, Cheng-Zhong; Harvey, David M

    2012-03-01

    Many sparse signal representation (SSR) algorithms have been developed in the past decade. The advantages of SSR such as compact representations and super resolution lead to the state of the art performance of SSR for processing ultrasonic non-destructive evaluation (NDE) signals. Choosing a suitable SSR algorithm and designing an appropriate overcomplete dictionary is a key for success. After a brief review of sparse signal representation methods and the design of overcomplete dictionaries, this paper addresses the recent accomplishments of SSR for processing ultrasonic NDE signals. The advantages and limitations of SSR algorithms and various overcomplete dictionaries widely-used in ultrasonic NDE applications are explored in depth. Their performance improvement compared to conventional signal processing methods in many applications such as ultrasonic flaw detection and noise suppression, echo separation and echo estimation, and ultrasonic imaging is investigated. The challenging issues met in practical ultrasonic NDE applications for example the design of a good dictionary are discussed. Representative experimental results are presented for demonstration. Copyright © 2011 Elsevier B.V. All rights reserved.
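
    A minimal example of sparse signal representation with a greedy solver, orthogonal matching pursuit over a random overcomplete dictionary, is sketched below; dictionary design and echo modelling in real ultrasonic NDE work are far more elaborate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 64, 256, 4                     # signal length, dictionary size, sparsity
    D = rng.standard_normal((n, m))
    D /= np.linalg.norm(D, axis=0)           # unit-norm atoms (overcomplete: m > n)
    true_idx = rng.choice(m, k, replace=False)
    x_true = np.zeros(m)
    x_true[true_idx] = rng.standard_normal(k)
    y = D @ x_true + 0.01 * rng.standard_normal(n)   # noisy synthetic "echo" signal

    def omp(D, y, k):
        """Orthogonal matching pursuit: pick k atoms greedily, refit by least squares."""
        residual, support = y.copy(), []
        for _ in range(k):
            support.append(int(np.argmax(np.abs(D.T @ residual))))
            coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
            residual = y - D[:, support] @ coeffs
        x = np.zeros(D.shape[1])
        x[support] = coeffs
        return x

    x_hat = omp(D, y, k)
    print("recovered atoms:", sorted(np.flatnonzero(x_hat)), "true atoms:", sorted(true_idx))
    ```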

  5. From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

    PubMed

    Scandurra, I; Hägglund, M; Koch, S

    2008-08-01

    This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.

  6. Gearing

    NASA Technical Reports Server (NTRS)

    Coy, J. J.; Townsend, D. P.; Zaretsky, E. V.

    1985-01-01

    Gearing technology in its modern form has a history of only 100 years. However, the earliest form of gearing can probably be traced back to fourth century B.C. Greece. Current gear practice and recent advances in the technology are drawn together. The history of gearing is reviewed briefly in the Introduction. Subsequent sections describe types of gearing and their geometry, processing, and manufacture. Both conventional and more recent methods of determining gear stress and deflections are considered. The subjects of life prediction and lubrication are additions to the literature. New and more complete methods of power loss predictions as well as an optimum design of spur gear meshes are described. Conventional and new types of power transmission systems are presented.

  7. Economic feasibility of irradiation-composting plant of sewage sludge

    NASA Astrophysics Data System (ADS)

    Hashimoto, S.; Nishimura, K.; Machi, S.

    Design and cost analysis were made for a sewage sludge treatment plant (capacity 25 - 200 ton sludge/day) with an electron accelerator. Dewatered sludge is spread on a rolling drum through a flat nozzle and disinfected by electron irradiation with a dose of 5 kGy. Composting of the irradiated sludge is then carried out at the optimum temperature for 3 days. The electron accelerating voltage and the capacity of the accelerator are 1.5 MV and 15 kW, respectively. The total volume of the fermentor is about one third of that of a conventional composting process because the irradiation shortens the composting time. The cost of sludge treatment is almost the same as that of the conventional method.

  8. Simplified Analysis of Pulse Detonation Rocket Engine Blowdown Gasdynamics and Performance

    NASA Technical Reports Server (NTRS)

    Morris, C. I.; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Pulse detonation rocket engines (PDREs) offer potential performance improvements over conventional designs, but represent a challenging modelling task. A simplified model for an idealized, straight-tube, single-shot PDRE blowdown process and thrust determination is described and implemented. In order to form an assessment of the accuracy of the model, the flowfield time history is compared to experimental data from Stanford University. Parametric studies of the effect of mixture stoichiometry, initial fill temperature, and blowdown pressure ratio on the performance of a PDRE are performed using the model. PDRE performance is also compared with a conventional steady-state rocket engine over a range of pressure ratios using similar gasdynamic assumptions.

  9. Development of a microcomputer-based magnetic heading sensor

    NASA Technical Reports Server (NTRS)

    Garner, H. D.

    1987-01-01

    This paper explores the development of a flux-gate magnetic heading reference using a single-chip microcomputer to process heading information and to present it to the pilot in appropriate form. This instrument is intended to replace the conventional combination of mechanical compass and directional gyroscope currently in use in general aviation aircraft, at appreciable savings in cost and reduction in maintenance. Design of the sensing element, the signal processing electronics, and the computer algorithms which calculate the magnetic heading of the aircraft from the magnetometer data have been integrated in such a way as to minimize hardware requirements and simplify calibration procedures. Damping and deviation errors are avoided by the inherent design of the device, and a technique for compensating for northerly-turning-error is described.
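
    The heading computation itself reduces to an arctangent of the two horizontal flux-gate components, optionally after removing fixed (hard-iron) deviation offsets determined during a calibration swing. The offsets and sample readings below are placeholders, and the paper's compensation for northerly turning error is not reproduced.

    ```python
    import math

    # Hypothetical hard-iron offsets determined during a calibration swing.
    OFFSET_X, OFFSET_Y = 12.0, -8.0      # sensor counts

    def magnetic_heading(bx, by):
        """Heading in degrees (0 = magnetic north, increasing clockwise) for a level
        sensor with body x-axis pointing forward and y-axis to the right."""
        x = bx - OFFSET_X
        y = by - OFFSET_Y
        heading = math.degrees(math.atan2(-y, x))   # By is negative when heading east
        return heading % 360.0

    for bx, by in [(212.0, -8.0), (12.0, 192.0), (-188.0, -8.0)]:
        print(f"Bx={bx:7.1f}  By={by:7.1f}  ->  heading {magnetic_heading(bx, by):6.1f} deg")
    ```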

  10. Process analysis of an in store production of knitted clothing

    NASA Astrophysics Data System (ADS)

    Buecher, D.; Kemper, M.; Schmenk, B.; Gloy, Y.-S.; Gries, T.

    2017-10-01

    In the textile and clothing industry, global value-added networks are widespread for textile and clothing production. As a result of global networking, the value chain is fragmented and a great deal of effort is required to coordinate the production processes [1]. In addition, the planning effort regarding the quantity and design of the goods is high and risky. Today the fashion industry is facing increasing customer demand for individual and customizable products in addition to short delivery times [2]. These challenges are passed down to the textile and clothing industry, decreasing batch sizes and production times. Conventional clothing production cannot fulfill those demands, especially when combined with more and more individual or customizable designs. Hence new production concepts have to be developed.

  11. Instrumentation, control, and automation for submerged anaerobic membrane bioreactors.

    PubMed

    Robles, Ángel; Durán, Freddy; Ruano, María Victoria; Ribes, Josep; Rosado, Alfredo; Seco, Aurora; Ferrer, José

    2015-01-01

    A submerged anaerobic membrane bioreactor (AnMBR) demonstration plant with two commercial hollow-fibre ultrafiltration systems (PURON®, Koch Membrane Systems, PUR-PSH31) was designed and operated for urban wastewater treatment. An instrumentation, control, and automation (ICA) system was designed and implemented for proper process performance. Several single-input-single-output (SISO) feedback control loops based on conventional on-off and PID algorithms were implemented to control the following operating variables: flow-rates (influent, permeate, sludge recycling and wasting, and recycled biogas through both reactor and membrane tanks), sludge wasting volume, temperature, transmembrane pressure, and gas sparging. The proposed ICA for AnMBRs for urban wastewater treatment enables the optimization of this new technology to be achieved with a high level of process robustness towards disturbances.

  12. Turboexpander plant designs can provide high ethane recovery without inlet CO2 removal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkinson, J.D.; Hudson, H.M.

    1982-05-01

    Several new turboexpander gas-plant schemes offer two advantages over conventional processes. First, they can recover over 85% of the natural gas stream's ethane while handling higher inlet CO2 concentrations without freezing, which saves considerable cost by allowing smaller CO2 removal units or eliminating the need for them entirely. Second, the liquids recovery system requires no more external horsepower, and in many cases even less; this maximizes the quantity of liquids recovered per unit of energy input, further lowering costs. The economic benefits associated with the proven plant designs make the processes attractive even for inlet gas streams containing little or no CO2.

  13. Design and validation of a tissue bath 3-D printed with PLA for optically mapping suspended whole heart preparations.

    PubMed

    Entz, Michael; King, D Ryan; Poelzing, Steven

    2017-12-01

    With the sudden increase in affordable manufacturing technologies, the relationship between experimentalists and the design process for laboratory equipment is rapidly changing. While experimentalists are still dependent on engineers and manufacturers for precision electrical, mechanical, and optical equipment, in-house manufacturing of other laboratory equipment with less precise design requirements has become a realistic option. This is possible due to decreasing costs and increasing functionality of desktop three-dimensional (3-D) printers and 3-D design software. With traditional manufacturing methods, iterative design processes are expensive and time consuming, and making more than one copy of a custom piece of equipment is prohibitive. Here, we provide an overview of the design of a tissue bath and stabilizer for a customizable, suspended, whole heart optical mapping apparatus that can be produced significantly faster and at lower cost than with conventional manufacturing techniques. This was accomplished through a series of design steps to prevent fluid leakage in the areas where the optical imaging glass was attached to the 3-D printed bath. A combination of an acetone dip along with adhesive was found to create a watertight bath. Optical mapping was used to quantify cardiac conduction velocity and action potential duration to compare 3-D printed baths to a bath that was designed and manufactured in a machine shop. Importantly, the manufacturing method did not significantly affect conduction, action potential duration, or contraction, suggesting that 3-D printed baths are equally effective for optical mapping experiments. NEW & NOTEWORTHY This article details three-dimensional printable equipment for use in suspended whole heart optical mapping experiments. This equipment is less expensive than conventionally manufactured equipment as well as easily customizable by the experimentalist. The baths can be waterproofed using only a three-dimensional printer, acetone, a glass microscope slide, c-clamps, and adhesive. Copyright © 2017 the American Physiological Society.
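
    For readers unfamiliar with the conduction-velocity measurement mentioned above, the calculation reduces to an activation-time difference between two mapped sites divided into their separation. The sketch below uses synthetic optical signals and an assumed 5 mm site spacing; it is not the authors' analysis pipeline.

    ```python
    import numpy as np

    def activation_time(t, f):
        """Activation time taken as the time of maximum upstroke (dF/dt max)."""
        return t[np.argmax(np.gradient(f, t))]

    t = np.linspace(0, 0.2, 2000)                              # s
    ap = lambda t0: 1.0 / (1.0 + np.exp(-(t - t0) / 0.002))    # synthetic optical upstroke
    site_a, site_b = ap(0.050), ap(0.060)                      # site B activates 10 ms later

    dx = 5e-3                                                  # m between sites (assumed)
    dt = activation_time(t, site_b) - activation_time(t, site_a)
    print(f"conduction velocity ~ {dx / dt * 100:.1f} cm/s")
    ```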

  14. Clinical results of computerized tomography-based simulation with laser patient marking.

    PubMed

    Ragan, D P; Forman, J D; He, T; Mesina, C F

    1996-02-01

    The accuracy of a patient treatment portal marking device and of computerized tomography (CT) simulation has been clinically tested. A CT-based simulator has been assembled based on a commercial CT scanner. This includes visualization software and a computer-controlled laser drawing device. This laser drawing device is used to transfer the setup, central axis, and/or radiation portals from the CT simulator to the patient for appropriate patient skin marking. A protocol for clinical testing is reported. Twenty-five prospectively, sequentially accessioned patients have been analyzed. The simulation process can be completed in an average time of 62 min. In many cases, the treatment portals can be designed and the patient marked in one session. Mechanical accuracy of the system was found to be within +/- 1 mm. The portal projection accuracy in clinical cases is observed to be better than +/- 1.2 mm. Operating costs are equivalent to those of the conventional simulation process it replaces. Computed tomography simulation is a clinically accurate substitute for conventional simulation when used with an appropriate patient marking system and digitally reconstructed radiographs. Personnel time spent in CT simulation is equivalent to time in conventional simulation.

  15. Calculation of energy recovery and greenhouse gas emission reduction from palm oil mill effluent treatment by an anaerobic granular-sludge process.

    PubMed

    Show, K Y; Ng, C A; Faiza, A R; Wong, L P; Wong, L Y

    2011-01-01

    Conventional aerobic and low-rate anaerobic processes such as pond and open-tank systems have been widely used in wastewater treatment. In order to improve treatment efficacy and to avoid greenhouse gas emissions, conventional treatment can be upgraded to a high performance anaerobic granular-sludge system. The anaerobic granular-sludge systems are designed to capture the biogas produced, rendering a potential for claims of carbon credits under the Kyoto Protocol for reducing emissions of greenhouse gases. Certified Emission Reductions (CERs) would be issued, which can be exchanged between businesses or bought and sold in international markets at the prevailing market prices. As the advanced anaerobic granular systems are capable of handling high organic loadings concomitant with high strength wastewater and short hydraulic retention time, they render more carbon credits than other conventional anaerobic systems. In addition to efficient waste degradation, the carbon credits can be used to generate revenue and to finance the project. This paper presents a scenario on emission avoidance based on a methane recovery and utilization project. An example analysis on emission reduction and an overview of the global emission market are also outlined.
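
    The emission-reduction arithmetic behind such a methane-capture claim is essentially a global-warming-potential conversion. The sketch below is illustrative only; the plant size, destruction efficiency, and the GWP factor (fixed by the applicable methodology, not chosen freely) are all assumptions.

    ```python
    ch4_captured_t = 1200.0          # tonnes CH4 recovered per year (hypothetical plant)
    destruction_efficiency = 0.98    # fraction of captured CH4 actually combusted or used
    gwp_ch4 = 25.0                   # assumed 100-yr global warming potential factor

    co2e_t = ch4_captured_t * destruction_efficiency * gwp_ch4
    print(f"~{co2e_t:,.0f} t CO2e of emission reductions per year")
    ```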

  16. Comparative UAV and Field Phenotyping to Assess Yield and Nitrogen Use Efficiency in Hybrid and Conventional Barley.

    PubMed

    Kefauver, Shawn C; Vicente, Rubén; Vergara-Díaz, Omar; Fernandez-Gallego, Jose A; Kerfal, Samir; Lopez, Antonio; Melichar, James P E; Serret Molins, María D; Araus, José L

    2017-01-01

    With their commercialization and increasing availability, multi-rotor Unmanned Aerial Vehicles (UAVs) have rapidly expanded into plant phenotyping studies thanks to their ability to provide clear, high-resolution images. As such, the traditional bottleneck of plant phenotyping has shifted from data collection to data processing. Fortunately, the necessarily controlled and repetitive design of plant phenotyping allows for the development of semi-automatic computer processing tools that may sufficiently reduce the time spent in data extraction. Here we present a comparison of UAV- and field-based high throughput plant phenotyping (HTPP) using the free, open-source image analysis software FIJI (Fiji Is Just ImageJ) with RGB (conventional digital cameras), multispectral, and thermal aerial imagery in combination with a matching suite of ground sensors in a study of two hybrids and one conventional barley variety with ten different nitrogen treatments, combining different fertilization levels and application schedules. A detailed correlation network for physiological traits and exploration of the data comparing treatments and varieties provided insights into crop performance under different management scenarios. Multivariate regression models explained 77.8, 71.6, and 82.7% of the variance in yield from aerial, ground, and combined data sets, respectively.
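
    The multivariate regression step reported above amounts to an ordinary least-squares fit of yield on the remote-sensing indices and a variance-explained (R^2) summary. The sketch below uses synthetic stand-ins for the aerial and ground indices; the study's actual variables and data are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 60                                              # plots (synthetic)
    X = rng.normal(size=(n, 3))                         # e.g., NDVI, canopy temp, RGB index
    yield_t_ha = 4.0 + 1.2 * X[:, 0] - 0.6 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.5, n)

    A = np.column_stack([np.ones(n), X])                # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, yield_t_ha, rcond=None)
    pred = A @ coef
    r2 = 1 - np.sum((yield_t_ha - pred) ** 2) / np.sum((yield_t_ha - yield_t_ha.mean()) ** 2)
    print(f"R^2 = {r2:.2f}")                            # cf. the 0.72-0.83 range reported above
    ```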

  17. Bottom-up design of de novo thermoelectric hybrid materials using chalcogenide resurfacing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sahu, Ayaskanta; Russ, Boris; Su, Norman C.

    Hybrid organic/inorganic thermoelectric materials based on conducting polymers and inorganic nanostructures have been demonstrated to combine both the inherently low thermal conductivity of the polymer and the superior charge transport properties (high power factors) of the inorganic component. While their performance today still lags behind that of conventional inorganic thermoelectric materials, solution-processable hybrids have made rapid progress and also offer unique advantages not available to conventional rigid inorganic thermoelectrics, namely: (1) low cost fabrication on rigid and flexible substrates, as well as (2) engineering complex conformal geometries for energy harvesting/cooling. While the number of reports of new classes of viable hybrid thermoelectric materials is growing, no group has reported a general approach for bottom-up design of both p- and n-type materials from one common base. Thus, unfortunately, the literature consists mostly of disconnected discoveries, which limits development and calls for a first-principles approach for property manipulation analogous to doping in traditional semiconductor thermoelectrics. Here, molecular engineering at the organic/inorganic interface and simple processing techniques are combined to demonstrate a modular approach enabling de novo design of complex hybrid thermoelectric systems. We chemically modify the surfaces of inorganic nanostructures and graft conductive polymers to yield robust, solution-processable p- and n-type inorganic/organic hybrid nanostructures. Our new modular approach not only offers researchers new tools to perform true bottom-up design of thermoelectric hybrids, but also strong performance advantages due to the quality of the designed interfaces. For example, we obtain enhanced power factors in existing (by up to 500% in Te/PEDOT:PSS) and novel (Bi2S3/PEDOT:PSS) p-type systems, and also generate water-processable and air-stable high performing n-type hybrid systems (Bi2Te3/PEDOT:PSS), thus highlighting the potency of our ex situ strategy in opening up new material options for thermoelectric applications. Finally, this strategy establishes a unique platform with broad handles for custom tailoring of thermal and electrical properties through hybrid material tunability and enables independent control over inorganic material chemistry, nanostructure geometry, and organic material properties, thus providing a robust pathway to major performance enhancements.
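
    The "power factor" quoted above is the thermoelectric quantity PF = S²σ (Seebeck coefficient squared times electrical conductivity). A small worked example of how an enhancement on the order of 500% can arise is sketched below; the numbers are illustrative, not measured values from this work.

    ```python
    def power_factor(seebeck_uV_per_K, sigma_S_per_cm):
        """Thermoelectric power factor S^2 * sigma in uW m^-1 K^-2."""
        s = seebeck_uV_per_K * 1e-6      # V/K
        sigma = sigma_S_per_cm * 1e2     # S/m
        return s * s * sigma * 1e6       # W m^-1 K^-2 -> uW m^-1 K^-2

    pf_before = power_factor(seebeck_uV_per_K=25.0, sigma_S_per_cm=400.0)
    pf_after = power_factor(seebeck_uV_per_K=60.0, sigma_S_per_cm=420.0)
    print(f"{pf_before:.1f} -> {pf_after:.1f} uW/(m*K^2), "
          f"{100 * (pf_after / pf_before - 1):.0f}% enhancement")
    ```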

  18. Bottom-up design of de novo thermoelectric hybrid materials using chalcogenide resurfacing

    DOE PAGES

    Sahu, Ayaskanta; Russ, Boris; Su, Norman C.; ...

    2017-01-01

    Hybrid organic/inorganic thermoelectric materials based on conducting polymers and inorganic nanostructures have been demonstrated to combine both the inherently low thermal conductivity of the polymer and the superior charge transport properties (high power factors) of the inorganic component. While their performance today still lags behind that of conventional inorganic thermoelectric materials, solution-processable hybrids have made rapid progress and also offer unique advantages not available to conventional rigid inorganic thermoelectrics, namely: (1) low cost fabrication on rigid and flexible substrates, as well as (2) engineering complex conformal geometries for energy harvesting/cooling. While the number of reports of new classes of viable hybrid thermoelectric materials is growing, no group has reported a general approach for bottom-up design of both p- and n-type materials from one common base. Thus, unfortunately, the literature consists mostly of disconnected discoveries, which limits development and calls for a first-principles approach for property manipulation analogous to doping in traditional semiconductor thermoelectrics. Here, molecular engineering at the organic/inorganic interface and simple processing techniques are combined to demonstrate a modular approach enabling de novo design of complex hybrid thermoelectric systems. We chemically modify the surfaces of inorganic nanostructures and graft conductive polymers to yield robust, solution-processable p- and n-type inorganic/organic hybrid nanostructures. Our new modular approach not only offers researchers new tools to perform true bottom-up design of thermoelectric hybrids, but also strong performance advantages due to the quality of the designed interfaces. For example, we obtain enhanced power factors in existing (by up to 500% in Te/PEDOT:PSS) and novel (Bi2S3/PEDOT:PSS) p-type systems, and also generate water-processable and air-stable high performing n-type hybrid systems (Bi2Te3/PEDOT:PSS), thus highlighting the potency of our ex situ strategy in opening up new material options for thermoelectric applications. Finally, this strategy establishes a unique platform with broad handles for custom tailoring of thermal and electrical properties through hybrid material tunability and enables independent control over inorganic material chemistry, nanostructure geometry, and organic material properties, thus providing a robust pathway to major performance enhancements.

  19. Integrated Application of Quality-by-Design Principles to Drug Product Development: A Case Study of Brivanib Alaninate Film-Coated Tablets.

    PubMed

    Badawy, Sherif I F; Narang, Ajit S; LaMarche, Keirnan R; Subramanian, Ganeshkumar A; Varia, Sailesh A; Lin, Judy; Stevens, Tim; Shah, Pankaj A

    2016-01-01

    Modern drug product development is expected to follow the quality-by-design (QbD) paradigm. At the same time, although there are several issue-specific examples in the literature that demonstrate the application of QbD principles, a holistic demonstration of the application of QbD principles to drug product development and control strategy is lacking. This article provides an integrated case study on the systematic application of QbD to product development and demonstrates the implementation of QbD concepts in the different aspects of product and process design for brivanib alaninate film-coated tablets. Using a risk-based approach, the strategy for development entailed identification of product critical quality attributes (CQAs), assessment of risks to the CQAs, and performing experiments to understand and mitigate identified risks. Quality risk assessments and design of experiments were performed to understand the quality of the input raw materials required for a robust formulation and the impact of manufacturing process parameters on CQAs. In addition to the material property and process parameter controls, the proposed control strategy includes the use of process analytical technology and conventional analytical tests to control in-process material attributes and ensure quality of the final product. Copyright © 2016. Published by Elsevier Inc.

  20. Modern process designs for very high NGL recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finn, A.J.; Tomlinson, T.R.; Johnson, G.L.

    1999-07-01

    Typical margins between NGL and sales gas can justify consideration of very high NGL recovery from natural gas, but traditionally, very high percentage recovery of propane or ethane has led to disproportionately high incremental power consumption and hence expensive compressors. Recent technical advances in the process design of cryogenic gas processing plants and in the equipment they use have led to a new breed of flowsheets that can cost-effectively give propane recoveries as high as 99%. The high NGL recovery achievable with modern plants is economically possible due to their high thermodynamic efficiency. This is mainly because they use the refrigeration available from the process more effectively and so recover more NGL. A high pressure rectification step can further improve NGL recovery economically, especially on larger plants; this residual NGL content would normally remain in the sales gas on a conventional turboexpander plant. Improved recovery of NGL can be obtained with little or no increase in sales gas compression power compared to conventional plants by judicious use of heat exchanger area. With high feed gas pressure, and particularly with dense phase operation, the use of two expanders in series for feed gas let-down gives good process efficiency and relatively low specific power per ton of NGL recovered. Use of two expanders also avoids excessive liquid flows in the expander exhaust, thus improving the performance and reliability of the turboexpander system. The techniques discussed in the paper can be employed on revamps to improve NGL recovery. Improved process performance relies heavily on the use of efficient, multistream plate-fin exchangers, and these can be easily added to an existing facility to increase NGL production.

  1. Development of the Neuron Assessment for Measuring Biology Students’ Use of Experimental Design Concepts and Representations

    PubMed Central

    Dasgupta, Annwesa P.; Anderson, Trevor R.; Pelaez, Nancy J.

    2016-01-01

    Researchers, instructors, and funding bodies in biology education are unanimous about the importance of developing students’ competence in experimental design. Despite this, only limited measures are available for assessing such competence development, especially in the areas of molecular and cellular biology. Also, existing assessments do not measure how well students use standard symbolism to visualize biological experiments. We propose an assessment-design process that 1) provides background knowledge and questions for developers of new “experimentation assessments,” 2) elicits practices of representing experiments with conventional symbol systems, 3) determines how well the assessment reveals expert knowledge, and 4) determines how well the instrument exposes student knowledge and difficulties. To illustrate this process, we developed the Neuron Assessment and coded responses from a scientist and four undergraduate students using the Rubric for Experimental Design and the Concept-Reasoning Mode of representation (CRM) model. Some students demonstrated sound knowledge of concepts and representations. Other students demonstrated difficulty with depicting treatment and control group data or variability in experimental outcomes. Our process, which incorporates an authentic research situation that discriminates levels of visualization and experimentation abilities, shows potential for informing assessment design in other disciplines. PMID:27146159

  2. A Systematic Process for Developing High Quality SaaS Cloud Services

    NASA Astrophysics Data System (ADS)

    La, Hyun Jung; Kim, Soo Dong

    Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through the Internet. Its benefits are well received in academia and industry. To fully utilize these benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS: its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process accompanied by engineering instructions. Using the proposed process, SaaS services with high quality can be developed effectively.

  3. Design conceptuel d'un avion blended wing body de 200 passagers

    NASA Astrophysics Data System (ADS)

    Ammar, Sami

    The Blended Wing Body (BWB) builds on the flying-wing concept and offers performance improvements over conventional aircraft. However, most studies have focused on large aircraft, and it is not clear whether the gains carry over to smaller aircraft. The main objective is to perform the conceptual design of a 200-passenger BWB and compare its performance with that of a conventional aircraft of equivalent payload and range. The design of the Blended Wing Body was carried out in the CEASIOM environment. This design platform, originally intended for conventional aircraft, was modified and additional tools were integrated to enable aerodynamic, performance, and stability analyses of the blended-wing airframe. An aircraft model is generated in the AcBuilder geometry module of CEASIOM from the wing design variables. Mass estimates are made with semi-empirical formulas adapted to the BWB geometry, and center-of-gravity and inertia calculations are performed with a BWB model developed in CATIA. Low-fidelity methods, such as the TORNADO vortex-lattice code and semi-empirical formulas, are used to analyze the aerodynamic performance and stability of the aircraft, and the aerodynamic results are validated with high-fidelity CFD analyses in FLUENT. An optimization process is then implemented to improve performance while maintaining a feasible design: the planform of the BWB, sized for a passenger capacity equivalent to that of an A320, is optimized for maximum range and compared with a similarly optimized A320. Significant gains were observed. A longitudinal and lateral flight-dynamics analysis is carried out on the BWB optimized for lift-to-drag ratio and mass. This study identified the stable and unstable modes of the aircraft and highlighted the stability problems associated with the short-period pitch oscillation and the Dutch roll, owing to the absence of stabilizers.

  4. Interactive computer aided technology, evolution in the design/manufacturing process

    NASA Technical Reports Server (NTRS)

    English, C. H.

    1975-01-01

    A powerful computer-operated three-dimensional graphic system and associated auxiliary computer equipment used in advanced design, production design, and manufacturing is described. This system has made these activities more productive than older and more conventional methods of designing and building aerospace vehicles. With this graphic system, designers are now able to define parts using a wide variety of geometric entities, and to define parts as fully surfaced three-dimensional models as well as "wire-frame" models. Once a part is geometrically defined, the designer is able to take section cuts of the surfaced model and automatically determine all of the section properties of the planar cut, light-pen detect all of the surface patches, and automatically determine the volume and weight of the part. Further, designs are defined mathematically at a degree of accuracy never before achievable.
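
    The "section properties of the planar cut" computed by such a system are the kind of quantity that follows directly from the cut's outline polygon. A minimal sketch (area and centroid via the shoelace formula) is shown below; it stands in for, and is not, the system's own geometry kernel.

    ```python
    def section_properties(pts):
        """Area and centroid of a planar section given its outline vertices in order."""
        a = cx = cy = 0.0
        n = len(pts)
        for i in range(n):
            x0, y0 = pts[i]
            x1, y1 = pts[(i + 1) % n]
            cross = x0 * y1 - x1 * y0
            a += cross
            cx += (x0 + x1) * cross
            cy += (y0 + y1) * cross
        a *= 0.5
        return abs(a), cx / (6 * a), cy / (6 * a)

    # A 4 x 2 rectangular cut: area 8.0, centroid at (2.0, 1.0).
    print(section_properties([(0, 0), (4, 0), (4, 2), (0, 2)]))
    ```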

  5. Achieving 50% Energy Savings in New Schools, Advanced Energy Design Guides: K-12 Schools (Brochure)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This fact sheet summarizes recommendations for designing elementary, middle, and high school buildings that will result in 50% less energy use than conventional new schools built to minimum code requirements. The recommendations are drawn from the Advanced Energy Design Guide for K-12 School Buildings, an ASHRAE publication that provides comprehensive recommendations for designing low-energy-use school buildings (see sidebar). Designed as a stand-alone document, this fact sheet provides key principles and a set of prescriptive design recommendations appropriate for smaller schools with insufficient budgets to fully implement best practices for integrated design and optimized performance. The recommendations have undergone a thorough analysis and review process through ASHRAE, and have been deemed the best combination of measures to achieve 50% savings in the greatest number of schools.

  6. Silicon CMOS optical receiver circuits with integrated thin-film compound semiconductor detectors

    NASA Astrophysics Data System (ADS)

    Brooke, Martin A.; Lee, Myunghee; Jokerst, Nan Marie; Camperi-Ginestet, C.

    1995-04-01

    While many circuit designers have tackled the problem of CMOS digital communications receiver design, few have considered the problem of circuitry suitable for an all-CMOS digital IC fabrication process. Faced with a high speed receiver design, the circuit designer will soon conclude that a high speed analog-oriented fabrication process provides superior performance advantages over a digital CMOS process. However, for applications where there are overwhelming reasons to integrate the receivers on the same IC as large amounts of conventional digital circuitry, the low yield and high cost of the exotic analog-oriented fabrication make it no longer an option. The issues that result from a requirement to use a digital CMOS IC process cut across all aspects of receiver design, and result in significant differences in circuit design philosophy and topology. Digital ICs are primarily designed to yield small, fast CMOS devices for digital logic gates, so no effort is put into providing accurate or high speed resistors or capacitors. This lack of any reliable resistance or capacitance has a significant impact on receiver design. Since resistance optimization is not a prerogative of the digital IC process engineer, the wisest option is to not use these elements, opting instead for active circuitry to replace the functions normally ascribed to resistance and capacitance. Depending on the application, receiver noise may be a dominant design constraint. The noise performance of CMOS amplifiers differs from that of bipolar or GaAs MESFET circuits: shot noise is generally insignificant when compared to channel thermal noise. As a result, the optimal input stage topology is significantly different for the different technologies. It is found that, at speeds of operation approaching the limits of the digital CMOS process, open loop designs have noise-power-gain-bandwidth tradeoff performance superior to feedback designs. Furthermore, the lack of good resistors and capacitors complicates the use of feedback circuits. Thus feedback is generally not used in the front end of our digital-process CMOS receivers.

  7. Investigation of AWG demultiplexer based SOI for CWDM application

    NASA Astrophysics Data System (ADS)

    Juhari, Nurjuliana; Susthitha Menon, P.; Shaari, Sahbudin; Annuar Ehsan, Abang

    2017-11-01

    9-channel Arrayed Waveguide Grating (AWG) demultiplexers with conventional and tapered structures were simulated using the beam propagation method (BPM) with a channel spacing of 20 nm. The AWG demultiplexer was designed using a high refractive index (n = 3.47) material, namely silicon-on-insulator (SOI), with a rib waveguide structure. The characteristics of insertion loss, adjacent crosstalk, and output spectrum response at a central wavelength of 1.55 μm for both designs were compared and analyzed. The conventional AWG produced a minimum insertion loss of 6.64 dB, whereas the tapered AWG design reduced the insertion loss by 2.66 dB. The lowest adjacent crosstalk obtained with the conventional AWG design was -16.96 dB, while the lowest crosstalk of the tapered AWG design was -17.23 dB. Hence, a tapered AWG design significantly reduces the insertion loss but has a slightly higher adjacent crosstalk compared to the conventional AWG design. On the other hand, the output spectrum responses obtained from both designs were close to the Coarse Wavelength Division Multiplexing (CWDM) wavelength grid.

  8. Hydrocarbonaceous material processing methods and apparatus

    DOEpatents

    Brecher, Lee E [Laramie, WY

    2011-07-12

    Methods and apparatus are disclosed for possibly producing pipeline-ready heavy oil from substantially non-pumpable oil feeds. The methods and apparatus may be designed to produce such pipeline-ready heavy oils in the production field. Such methods and apparatus may involve thermal soaking of liquid hydrocarbonaceous inputs in thermal environments (2) to generate, through chemical reaction, an increased distillate amount as compared with conventional boiling technologies.

  9. Appendix C: Rapid development approaches for system engineering and design

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Conventional system architectures, development processes, and tool environments often produce systems which exceed cost expectations and are obsolete before they are fielded. This paper explores some of the reasons for this and provides recommendations for how we can do better. These recommendations are based on DoD and NASA system developments and on our exploration and development of system/software engineering tools.

  10. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and auralizations.

  11. QFD emphasis of IME design

    NASA Astrophysics Data System (ADS)

    Erickson, C. M.; Martinez, A.

    1993-06-01

    The 1992 Integrated Modular Engine (IME) design concept, proposed to the Air Force Space Systems Division as a candidate for a National Launch System (NLS) upper stage, emphasized a detailed Quality Function Deployment (QFD) procedure which set the basis for its final selection. With a list of engine requirements defined and prioritized by the customer, a QFD procedure was implemented in which the characteristics of a number of engine and component configurations were assessed for their degree of requirement satisfaction. The QFD process emphasized operability, cost, reliability, and performance, with relative importance specified by the customer. Existing technology and near-term advanced technology were surveyed to achieve the required design strategies. In the process, advanced nozzles, advanced turbomachinery, valves, controls, and operational procedures were evaluated. The integrated arrangement of three conventional bell nozzle thrust chambers with two advanced turbopump sets, selected as the configuration meeting all requirements, was rated significantly ahead of the other candidates, including the aerospike and horizontal flow nozzle configurations.
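
    The scoring at the heart of such a QFD selection can be thought of as a weighted sum of requirement-satisfaction ratings. The sketch below illustrates the mechanics only; the weights and ratings are invented for illustration and are not the values used in the IME study.

    ```python
    # Customer-weighted requirements (weights sum to 1; values are invented).
    requirements = {"operability": 0.35, "cost": 0.30, "reliability": 0.20, "performance": 0.15}

    # 1-10 requirement-satisfaction ratings for each candidate (also invented).
    candidates = {
        "3 bell chambers + 2 turbopump sets": {"operability": 9, "cost": 7, "reliability": 8, "performance": 7},
        "aerospike":                          {"operability": 5, "cost": 5, "reliability": 6, "performance": 8},
        "horizontal-flow nozzle":             {"operability": 6, "cost": 6, "reliability": 6, "performance": 7},
    }

    for name, ratings in candidates.items():
        score = sum(requirements[r] * ratings[r] for r in requirements)
        print(f"{name:38s} {score:.2f}")
    ```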

  12. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that, to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The Qualitative Simulation Tool (QST), an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components are discussed.

  13. A robust variable sampling time BLDC motor control design based upon μ-synthesis.

    PubMed

    Hung, Chung-Wen; Yen, Jia-Yush

    2013-01-01

    The variable sampling rate system is encountered in many applications. When the speed information is derived from the position marks along the trajectory, one would have a speed dependent sampling rate system. The conventional fixed or multisampling rate system theory may not work in these cases because the system dynamics include the uncertainties which resulted from the variable sampling rate. This paper derived a convenient expression for the speed dependent sampling rate system. The varying sampling rate effect is then translated into multiplicative uncertainties to the system. The design then uses the popular μ-synthesis process to achieve a robust performance controller design. The implementation on a BLDC motor demonstrates the effectiveness of the design approach.
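
    One way to picture the "variable sampling rate as multiplicative uncertainty" step is to discretize a nominal plant at each sample time in the expected range and bound the relative deviation of its frequency response from the nominal-rate model. The sketch below does exactly that for an assumed first-order motor model; it is an illustration of the idea, not the paper's μ-synthesis procedure.

    ```python
    import numpy as np
    from scipy.signal import cont2discrete

    K, tau = 1.0, 0.05                    # assumed first-order motor model G(s) = K/(tau*s + 1)
    num, den = [K], [tau, 1.0]

    def dtf_response(Ts, w):
        """ZOH-discretized transfer function evaluated at z = exp(j*w*Ts)."""
        numd, dend, _ = cont2discrete((num, den), Ts, method='zoh')
        z = np.exp(1j * w * Ts)
        return np.polyval(np.ravel(numd), z) / np.polyval(np.ravel(dend), z)

    w = np.logspace(0, 3, 200)            # rad/s, below the Nyquist rate of every Ts used
    Ts_nom = 1e-3
    G_nom = dtf_response(Ts_nom, w)

    bound = np.zeros_like(w)
    for Ts in np.linspace(0.5e-3, 2e-3, 16):      # sample time varies with rotor speed
        G = dtf_response(Ts, w)
        bound = np.maximum(bound, np.abs(G - G_nom) / np.abs(G_nom))

    print(f"worst-case multiplicative uncertainty over the sweep: {bound.max():.2f}")
    ```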

  14. A Robust Variable Sampling Time BLDC Motor Control Design Based upon μ-Synthesis

    PubMed Central

    Yen, Jia-Yush

    2013-01-01

    The variable sampling rate system is encountered in many applications. When the speed information is derived from the position marks along the trajectory, one would have a speed dependent sampling rate system. The conventional fixed or multisampling rate system theory may not work in these cases because the system dynamics include the uncertainties which resulted from the variable sampling rate. This paper derived a convenient expression for the speed dependent sampling rate system. The varying sampling rate effect is then translated into multiplicative uncertainties to the system. The design then uses the popular μ-synthesis process to achieve a robust performance controller design. The implementation on a BLDC motor demonstrates the effectiveness of the design approach. PMID:24327804

  15. Digital redesign of anti-wind-up controller for cascaded analog system.

    PubMed

    Chen, Y S; Tsai, J S H; Shieh, L S; Moussighi, M M

    2003-01-01

    The cascaded conventional anti-wind-up (CAW) design method for an integral controller is discussed. Then, the prediction-based digital redesign methodology is utilized to find a new pulse amplitude modulated (PAM) digital controller for effective digital control of an analog plant with an input saturation constraint. The desired digital controller is determined from an existing or pre-designed CAW analog controller. The proposed method provides a novel methodology for indirect digital design of a continuous-time unity output-feedback system with a cascaded analog controller, as in the case of PID controllers for industrial control processes in the presence of actuator saturation. It enables us to implement an existing or pre-designed cascaded CAW analog controller effectively via a digital controller.

  16. High speed civil transport

    NASA Technical Reports Server (NTRS)

    Bogardus, Scott; Loper, Brent; Nauman, Chris; Page, Jeff; Parris, Rusty; Steinbach, Greg

    1990-01-01

    The design process of the High Speed Civil Transport (HSCT) combines existing technology with the expectation of future technology to create a Mach 3.0 transport. The HSCT was designed to have a range in excess of 6000 nautical miles and carry up to 300 passengers. This range will allow the HSCT to service the economically expanding Pacific Basin region. Effort was made in the design to enable the aircraft to use conventional airports with standard 12,000-foot runways. With a takeoff thrust of 250,000 pounds, the four supersonic through-flow engines will accelerate the HSCT to a cruise speed of Mach 3.0. The 679,000-pound (at takeoff) HSCT is designed to cruise at an altitude of 70,000 feet, flying above most atmospheric disturbances.

  17. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
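
    As a concrete (and deliberately toy) illustration of the idea, a gene-expression level can be modeled as a sampled first-order system and regulated with a digital PI controller. The model structure, gains, and rates below are assumptions for illustration, not results from the article.

    ```python
    import numpy as np

    dt = 0.1                      # h, sampling interval
    deg = 0.2                     # 1/h, assumed protein degradation rate
    a = np.exp(-deg * dt)         # discrete decay factor
    b = 1.0 - a                   # normalized input gain

    kp, ki = 1.5, 0.8             # digital PI gains (assumed)
    setpoint, x, integ = 1.0, 0.0, 0.0

    for k in range(200):          # 20 h of sampled closed-loop operation
        e = setpoint - x
        integ += e * dt
        u = max(0.0, kp * e + ki * integ)   # inducer input cannot be negative
        x = a * x + b * u                   # sampled gene-expression dynamics

    print(f"expression level after 20 h: {x:.3f} (target {setpoint})")
    ```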

  18. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  19. Influencing Factors and Workpiece's Microstructure in Laser-Assisted Milling of Titanium

    NASA Astrophysics Data System (ADS)

    Wiedenmann, R.; Liebl, S.; Zaeh, M. F.

    Today's lightweight components have to withstand increasing mechanical and thermal loads. Therefore, advanced materials are replacing conventional materials like steel or aluminum alloys. With these high-performance materials, however, the associated machining costs become prohibitively high. This paper presents the newest fundamental investigations on the hybrid process 'laser-assisted milling', which is an innovative technique to process such materials. The focus is on the validation of a numerical database for a CAD/CAM process control unit, which is calculated using simulation. Prior to that, the influencing factors on a laser-assisted milling process are systematically investigated using Design of Experiments (DoE) to identify the main influencing parameters coming from the laser and the milling operation.
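
    A typical DoE screening of this kind can be sketched as a two-level full factorial with main effects computed as the difference of response means at the high and low factor levels. The factors and the synthetic response below are placeholders, not the study's measured data.

    ```python
    import itertools
    import numpy as np

    factors = ["laser_power", "feed_rate", "spindle_speed"]          # hypothetical factors
    runs = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

    rng = np.random.default_rng(1)
    # Synthetic response (e.g., cutting force) with known factor contributions plus noise.
    response = 100 + 15 * runs[:, 0] - 8 * runs[:, 1] + 2 * runs[:, 2] + rng.normal(0, 1, len(runs))

    for j, name in enumerate(factors):
        effect = response[runs[:, j] == 1].mean() - response[runs[:, j] == -1].mean()
        print(f"main effect of {name:13s}: {effect:+.1f}")
    ```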

  20. Influences of the manufacturing process chain design on the near surface condition and the resulting fatigue behaviour of quenched and tempered SAE 4140

    NASA Astrophysics Data System (ADS)

    Klein, M.; Eifler, D.

    2010-07-01

    To analyse interactions between the individual steps of process chains and the resulting variations in material properties, especially the microstructure and the resulting mechanical properties, specimens with a tension screw geometry were manufactured with five process chains. The different process chains, as well as their parameters, influence the near surface condition, and consequently the fatigue behaviour, in a characteristic manner. The cyclic deformation behaviour of these specimens can be benchmarked equivalently with conventional strain measurements as well as with high-precision temperature and electrical resistance measurements. The development of the temperature values provides substantial information on cyclic load dependent changes in the microstructure.

  1. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircrafts. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  2. Design issues for a reinforcement-based self-learning fuzzy controller

    NASA Technical Reports Server (NTRS)

    Yen, John; Wang, Haojin; Dauherity, Walter

    1993-01-01

    Fuzzy logic controllers have some often-cited advantages over conventional techniques such as PID control: easy implementation, accommodation of natural language, the ability to cover a wider range of operating conditions, and others. One major obstacle that hinders their broader application is the lack of a systematic way to develop and modify the rules; as a result, the creation and modification of fuzzy rules often depends on trial and error or pure experimentation. One of the proposed approaches to address this issue is self-learning fuzzy logic controllers (SFLC), which use reinforcement learning techniques to learn the desirability of states and to adjust the consequent part of fuzzy control rules accordingly. Due to the different dynamics of the controlled processes, the performance of a self-learning fuzzy controller is highly contingent on its design. The design issue has not received sufficient attention. The issues related to the design of an SFLC for application to a chemical process are discussed, and its performance is compared with that of PID and self-tuning fuzzy logic controllers.

  3. Achieving 50% Energy Savings in Office Buildings, Advanced Energy Design Guides: Office Buildings (Brochure)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2014-09-01

    This fact sheet summarizes recommendations for designing new office buildings that result in 50% less energy use than conventional designs meeting minimum code requirements. The recommendations are drawn from the Advanced Energy Design Guide for Small to Medium Office Buildings, an ASHRAE publication that provides comprehensive recommendations for designing low-energy-use office buildings with gross floor areas up to 100,000 ft2 (see sidebar). Designed as a stand-alone document, this fact sheet provides key principles and a set of prescriptive design recommendations appropriate for smaller office buildings with insufficient budgets to fully implement best practices for integrated design and optimized performance. The recommendations have undergone a thorough analysis and review process through ASHRAE, and have been deemed the best combination of measures to achieve 50% savings in the greatest number of office buildings.

  4. Alternatives for randomization in lifestyle intervention studies in cancer patients were not better than conventional randomization.

    PubMed

    Velthuis, Miranda J; May, Anne M; Monninkhof, Evelyn M; van der Wall, Elsken; Peeters, Petra H M

    2012-03-01

    Assessing the effects of lifestyle interventions in cancer patients poses some specific challenges. Although randomization is urgently needed for evidence-based knowledge, it is sometimes difficult to apply conventional randomization (i.e., consent preceding randomization and intervention) in daily settings. Randomization before seeking consent was proposed by Zelen, and additional modifications have been proposed since. We discuss four alternatives to conventional randomization: the single and double randomized consent designs, the two-stage randomized consent design, and the design with consent to postponed information. We considered these designs when designing a study to assess the impact of physical activity on cancer-related fatigue and quality of life. We tested the modified Zelen design with consent to postponed information in a pilot. The design was chosen to prevent dropout of participants in the control group because of disappointment about the allocation. The result was a low overall participation rate, most likely because of a perceived lack of information by eligible patients, and a relatively high dropout in the intervention group. We conclude that the alternatives were not better than conventional randomization. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Macrosorb Kieselguhr-agarose composite adsorbents. New tools for downstream process design and scale up. Scientific note.

    PubMed

    Bite, M G; Berezenko, S; Reed, F J; Derry, L

    1988-08-01

    Incompressible Macrosorb composite adsorbents, while retaining all the desirable properties of traditional agarose-based hydrogel media, overcome the operational limitations imposed by the use of soft hydrogels: They permit useful application of fast flow rates without restrictions on bed depth and they can be used in fluidized bed mode. Considerations which are important when contemplating scaled-up processing are discussed. A comparative cost estimate for a production process for extracting albumin from bovine serum in column equipment illustrates the various advantages which may be exploited when using a composite adsorbent in place of a conventional soft gel equivalent.

  6. Effects of conventional ozonation and electro-peroxone pretreatment of surface water on disinfection by-product formation during subsequent chlorination.

    PubMed

    Mao, Yuqin; Guo, Di; Yao, Weikun; Wang, Xiaomao; Yang, Hongwei; Xie, Yuefeng F; Komarneni, Sridhar; Yu, Gang; Wang, Yujue

    2018-03-01

    The electro-peroxone (E-peroxone) process is an emerging ozone-based electrochemical advanced oxidation process that combines conventional ozonation with in-situ cathodic hydrogen peroxide (H2O2) production for oxidative water treatment. In this study, the effects of E-peroxone pretreatment on disinfection by-product (DBP) formation from chlorination of a synthetic surface water were investigated and compared to conventional ozonation. Results show that, due to the enhanced transformation of ozone (O3) to hydroxyl radicals (·OH) by electro-generated H2O2, the E-peroxone process considerably enhanced dissolved organic carbon (DOC) abatement and significantly reduced bromate (BrO3-) formation compared to conventional ozonation. However, natural organic matter (NOM) with high UV254 absorbance, which is the major precursor of chlorination DBPs, was less efficiently abated during the E-peroxone process than during conventional ozonation. Consequently, while both conventional ozonation and the E-peroxone process substantially reduced the formation of DBPs (trihalomethanes and haloacetic acids) during post-chlorination, higher DBP concentrations were generally observed during chlorination of the E-peroxone pretreated waters than of the conventionally ozonated waters. In addition, after either conventional ozonation or E-peroxone treatment, the DBPs formed during post-chlorination shifted to more brominated species. The overall yields of brominated DBPs exhibited strong correlations with the bromide concentrations in the water. Therefore, while the E-peroxone process can effectively suppress bromide transformation to bromate, it may lead to higher formation of brominated DBPs during post-chlorination compared to conventional ozonation. These results suggest that the E-peroxone process can lead to different DBP formation and speciation in water treatment trains compared to conventional ozonation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Power processing for electric propulsion

    NASA Technical Reports Server (NTRS)

    Finke, R. C.; Herron, B. G.; Gant, G. D.

    1975-01-01

    The potential of achieving up to 30 per cent more spacecraft payload or 50 per cent more useful operating life by the use of electric propulsion in place of conventional cold gas or hydrazine systems in science, communications, and earth applications spacecraft is a compelling reason to consider the inclusion of electric thruster systems in new spacecraft design. The propulsion requirements of such spacecraft dictate a wide range of thruster power levels and operational lifetimes, which must be matched by lightweight, efficient, and reliable thruster power processing systems. This paper will present electron bombardment ion thruster requirements; review the performance characteristics of present power processing systems; discuss design philosophies and alternatives in areas such as inverter type, arc protection, and control methods; and project future performance potentials for meeting goals in the areas of power processor weight (10 kg/kW), efficiency (approaching 92 per cent), reliability (0.96 for 15,000 hr), and thermal control capability (0.3 to 5 AU).

  8. How inverse solver technologies can support die face development and process planning in the automotive industry

    NASA Astrophysics Data System (ADS)

    Huhn, Stefan; Peeling, Derek; Burkart, Maximilian

    2017-10-01

    With the availability of die face design tools and incremental solver technologies to provide detailed forming feasibility results in a timely fashion, the use of inverse solver technologies and resulting process improvements during the product development process of stamped parts often is underestimated. This paper presents some applications of inverse technologies that are currently used in the automotive industry to streamline the product development process and greatly increase the quality of a developed process and the resulting product. The first focus is on the so-called target strain technology. Application examples will show how inverse forming analysis can be applied to support the process engineer during the development of a die face geometry for Class `A' panels. The drawing process is greatly affected by the die face design and the process designer has to ensure that the resulting drawn panel will meet specific requirements regarding surface quality and a minimum strain distribution to ensure dent resistance. The target strain technology provides almost immediate feedback to the process engineer during the die face design process if a specific change of the die face design will help to achieve these specific requirements or will be counterproductive. The paper will further show how an optimization of the material flow can be achieved through the use of a newly developed technology called Sculptured Die Face (SDF). The die face generation in SDF is more suited to be used in optimization loops than any other conventional die face design technology based on cross section design. A second focus in this paper is on the use of inverse solver technologies for secondary forming operations. The paper will show how the application of inverse technology can be used to accurately and quickly develop trim lines on simple as well as on complex support geometries.

  9. Applications of Evolutionary Technology to Manufacturing and Logistics Systems : State-of-the Art Survey

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Lin, Lin

    Many combinatorial optimization problems from industrial engineering and operations research in the real world are very complex in nature and quite hard to solve by conventional techniques. Since the 1960s, there has been increasing interest in imitating living beings to solve such hard combinatorial optimization problems. Simulating the natural evolutionary process of human beings results in stochastic optimization techniques called evolutionary algorithms (EAs), which can often outperform conventional optimization methods when applied to difficult real-world problems. In this survey paper, we provide a comprehensive survey of the current state of the art in the use of EAs in manufacturing and logistics systems. To demonstrate that EAs are powerful and broadly applicable stochastic search and optimization techniques, we deal with the following engineering design problems: transportation planning models, layout design models, and two-stage logistics models in logistics systems; and job-shop scheduling and resource-constrained project scheduling in manufacturing systems.
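
    A minimal example of the kind of EA the survey covers is sketched below: a genetic algorithm assigning jobs to machines to minimize makespan, with tournament selection, uniform crossover, and random-reset mutation. It is a toy stand-in for the scheduling and logistics models discussed, with all sizes and rates chosen arbitrarily.

    ```python
    import random

    random.seed(0)
    jobs = [random.randint(1, 20) for _ in range(30)]     # processing times (synthetic)
    n_machines, pop_size, gens = 4, 60, 200

    def makespan(assign):
        loads = [0] * n_machines
        for job, m in zip(jobs, assign):
            loads[m] += job
        return max(loads)

    def tournament(pop):
        return min(random.sample(pop, 3), key=makespan)

    pop = [[random.randrange(n_machines) for _ in jobs] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = tournament(pop), tournament(pop)
            child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]  # uniform crossover
            if random.random() < 0.2:                                          # mutation
                child[random.randrange(len(child))] = random.randrange(n_machines)
            nxt.append(child)
        pop = nxt

    best = min(pop, key=makespan)
    print("best makespan:", makespan(best), "lower bound:", -(-sum(jobs) // n_machines))
    ```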

  10. Multi-parameter optimization of monolithic high-index contrast grating reflectors

    NASA Astrophysics Data System (ADS)

    Marciniak, Magdalena; Gebski, Marcin; Dems, Maciej; Wasiak, Michał; Czyszanowski, Tomasz

    2016-03-01

    Conventional High-index Contrast Gratings (HCGs) consist of periodically distributed high refractive index stripes surrounded by low index media. In practice, such a low/high index stack can be fabricated in several ways; however, low refractive index layers are electrical insulators with poor thermal conductivity. Monolithic High-index Contrast Gratings (MHCGs) overcome those limitations since they can be implemented in any material with a real refractive index larger than 1.75, without the need to combine low and high refractive index materials. The freedom to use various materials allows more efficient current injection and better heat flow through the mirror, in contrast to conventional HCGs. MHCGs can simplify the construction of VCSELs, reducing their epitaxial design to a monolithic wafer with carrier confinement and an active region inside, and with stripes etched on both surfaces in post-processing. We present a numerical analysis of MHCGs using a three-dimensional, fully vectorial optical model. We investigate possible designs of MHCGs using multidimensional optimization of the grating parameters for different refractive indices.

  11. Nonlinear Frequency Compression in Hearing Aids: Impact on Speech and Language Development

    PubMed Central

    Bentler, Ruth; Walker, Elizabeth; McCreery, Ryan; Arenas, Richard M.; Roush, Patricia

    2015-01-01

    Objectives The research questions of this study were: (1) Are children using nonlinear frequency compression (NLFC) in their hearing aids getting better access to the speech signal than children using conventional processing schemes? The authors hypothesized that children whose hearing aids provided wider input bandwidth would have more access to the speech signal, as measured by an adaptation of the Speech Intelligibility Index, and (2) are speech and language skills different for children who have been fit with the two different technologies; if so, in what areas? The authors hypothesized that if the children were getting increased access to the speech signal as a result of their NLFC hearing aids (question 1), it would be possible to see improved performance in areas of speech production, morphosyntax, and speech perception compared with the group with conventional processing. Design Participants included 66 children with hearing loss recruited as part of a larger multisite National Institutes of Health–funded study, Outcomes for Children with Hearing Loss, designed to explore the developmental outcomes of children with mild to severe hearing loss. For the larger study, data on communication, academic and psychosocial skills were gathered in an accelerated longitudinal design, with entry into the study between 6 months and 7 years of age. Subjects in this report consisted of 3-, 4-, and 5-year-old children recruited at the North Carolina test site. All had at least 6 months of current hearing aid usage with their NLFC or conventional amplification. Demographic characteristics were compared at the three age levels as well as audibility and speech/language outcomes; speech-perception scores were compared for the 5-year-old groups. Results Results indicate that the audibility provided did not differ between the technology options. As a result, there was no difference between groups on speech or language outcome measures at 4 or 5 years of age, and no impact on speech perception (measured at 5 years of age). The difference in Comprehensive Assessment of Spoken Language and mean length of utterance scores for the 3-year-old group favoring the group with conventional amplification may be a consequence of confounding factors such as increased incidence of prematurity in the group using NLFC. Conclusions Children fit with NLFC had similar audibility, as measured by a modified Speech Intelligibility Index, compared with a matched group of children using conventional technology. In turn, there were no differences in their speech and language abilities. PMID:24892229

  12. Flight Test Validation of Optimal Input Design and Comparison to Conventional Inputs

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1997-01-01

    A technique for designing optimal inputs for aerodynamic parameter estimation was flight tested on the F-18 High Angle of Attack Research Vehicle (HARV). Model parameter accuracies calculated from flight test data were compared on an equal basis for optimal input designs and conventional inputs at the same flight condition. In spite of errors in the a priori input design models and distortions of the input form by the feedback control system, the optimal inputs increased estimated parameter accuracies compared to conventional 3-2-1-1 and doublet inputs. In addition, the tests using optimal input designs demonstrated enhanced design flexibility, allowing the optimal input design technique to use a larger input amplitude to achieve further increases in estimated parameter accuracy without departing from the desired flight test condition. This work validated the analysis used to develop the optimal input designs, and demonstrated the feasibility and practical utility of the optimal input design technique.
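
    For readers unfamiliar with the conventional inputs used as the comparison baseline, the following minimal sketch generates a 3-2-1-1 multistep and a doublet as alternating-sign square-wave segments. The sample rate, pulse width, and amplitude are illustrative choices, not values from the flight test.

    ```python
    import numpy as np

    def multistep_input(pattern, base_width, amplitude, dt):
        """Alternating-sign square-wave input: segment i lasts pattern[i]*base_width
        seconds, so [3, 2, 1, 1] gives a 3-2-1-1 and [1, 1] gives a doublet."""
        samples, sign = [], 1.0
        for w in pattern:
            samples.extend([sign * amplitude] * int(round(w * base_width / dt)))
            sign = -sign
        return np.array(samples)

    dt = 0.02                                                   # 50 Hz, illustrative
    u_3211 = multistep_input([3, 2, 1, 1], base_width=0.5, amplitude=2.0, dt=dt)
    u_doublet = multistep_input([1, 1], base_width=1.0, amplitude=2.0, dt=dt)
    t_3211 = np.arange(len(u_3211)) * dt                        # time vector for plotting
    ```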

  13. Computational Design of a Krueger Flap Targeting Conventional Slat Aerodynamics

    NASA Technical Reports Server (NTRS)

    Akaydin, H. Dogus; Housman, Jeffrey A.; Kiris, Cetin C.; Bahr, Christopher J.; Hutcheson, Florence V.

    2016-01-01

    In this study, we demonstrate the design of a Krueger flap as a substitute for a conventional slat in a high-lift system. This notional design, with the objective of matching equivalent-mission performance on aircraft approach, was required for a comparative aeroacoustic study with computational and experimental components. We generated a family of high-lift systems with Krueger flaps based on a set of design parameters. Then, we evaluated the high-lift systems using steady 2D RANS simulations to find a good match for the conventional slat, based on total lift coefficients in free-air. Finally, we evaluated the mean aerodynamics of the high-lift systems with Krueger flap and conventional slat as they were installed in an open-jet wind tunnel flow. The surface pressures predicted with the simulations agreed well with experimental results.

  14. Computer-assisted versus conventional free fibula flap technique for craniofacial reconstruction: an outcomes comparison.

    PubMed

    Seruya, Mitchel; Fisher, Mark; Rodriguez, Eduardo D

    2013-11-01

    There has been rising interest in computer-aided design/computer-aided manufacturing for preoperative planning and execution of osseous free flap reconstruction. The purpose of this study was to compare outcomes between computer-assisted and conventional fibula free flap techniques for craniofacial reconstruction. A two-center, retrospective review was carried out on patients who underwent fibula free flap surgery for craniofacial reconstruction from 2003 to 2012. Patients were categorized by the type of reconstructive technique: conventional (between 2003 and 2009) or computer-aided design/computer-aided manufacturing (from 2010 to 2012). Demographics, surgical factors, and perioperative and long-term outcomes were compared. A total of 68 patients underwent microsurgical craniofacial reconstruction: 58 conventional and 10 computer-aided design and manufacturing fibula free flaps. By demographics, patients undergoing the computer-aided design/computer-aided manufacturing method were significantly older and had a higher rate of radiotherapy exposure compared with conventional patients. Intraoperatively, the median number of osteotomies was significantly higher (2.0 versus 1.0, p=0.002) and the median ischemia time was significantly shorter (120 minutes versus 170 minutes, p=0.004) for the computer-aided design/computer-aided manufacturing technique compared with conventional techniques; operative times were shorter for patients undergoing the computer-aided design/computer-aided manufacturing technique, although this did not reach statistical significance. Perioperative and long-term outcomes were equivalent for the two groups, notably, hospital length of stay, recipient-site infection, partial and total flap loss, and rate of soft-tissue and bony tissue revisions. Microsurgical craniofacial reconstruction using a computer-assisted fibula flap technique yielded significantly shorter ischemia times amidst a higher number of osteotomies compared with conventional techniques. Therapeutic, III.

  15. Screen printing of a capacitive cantilever-based motion sensor on fabric using a novel sacrificial layer process for smart fabric applications

    NASA Astrophysics Data System (ADS)

    Wei, Yang; Torah, Russel; Yang, Kai; Beeby, Steve; Tudor, John

    2013-07-01

    Free-standing cantilevers have been fabricated by screen printing sacrificial and structural layers onto a standard polyester cotton fabric. By printing additional conductive layers, a complete capacitive motion sensor on fabric using only screen printing has been fabricated. This type of free-standing structure cannot currently be fabricated using conventional fabric manufacturing processes. In addition, compared to conventional smart fabric fabrication processes (e.g. weaving and knitting), screen printing offers the advantages of geometric design flexibility and the ability to simultaneously print multiple devices of the same or different designs. Furthermore, a range of active inks exists from the printed electronics industry which can potentially be applied to create many types of smart fabric. Four cantilevers with different lengths have been printed on fabric using a five-layer structure with a sacrificial material underneath the cantilever. The sacrificial layer is subsequently removed at 160 °C for 30 min to achieve a freestanding cantilever above the fabric. Two silver electrodes, one on top of the cantilever and the other on top of the fabric, are used to capacitively detect the movement of the cantilever. In this way, an entirely printed motion sensor is produced on a standard fabric. The motion sensor was initially tested on an electromechanical shaker rig at a low frequency range to examine the linearity and the sensitivity of each design. Then, these sensors were individually attached to a moving human forearm to evaluate more representative results. A commercial accelerometer (Microstrain G-link) was mounted alongside for comparison. The printed sensors have a similar motion response to the commercial accelerometer, demonstrating the potential of a printed smart fabric motion sensor for use in intelligent clothing applications.

  16. Developing geogebra-assisted reciprocal teaching strategy to improve junior high school students’ abstraction ability, lateral thinking and mathematical persistence

    NASA Astrophysics Data System (ADS)

    Priatna, N.; Martadiputra, B. A. P.; Wibisono, Y.

    2018-05-01

    The development of science and technology requires reform in how various resources are used in the mathematics teaching and learning process. One such effort is the implementation of the GeoGebra-assisted Reciprocal Teaching strategy in mathematics instruction as an effective strategy for improving students’ cognitive, affective, and psychomotor abilities. This research implements the GeoGebra-assisted Reciprocal Teaching strategy to improve the abstraction ability, lateral thinking, and mathematical persistence of junior high school students. It employed a quasi-experimental method with a non-random pre-test and post-test control design. More specifically, it used a 2x3 factorial design, with a learning factor (GeoGebra-assisted Reciprocal Teaching versus conventional teaching) and three levels of early mathematical ability (high, middle, and low). The subjects were eighth-grade junior high school students, selected through purposive sampling. The results show that the abstraction ability and lateral thinking of students taught with the GeoGebra-assisted Reciprocal Teaching strategy were significantly higher than those of students who received conventional instruction. The mathematical persistence of students taught with the GeoGebra-assisted Reciprocal Teaching strategy was also significantly higher than that of students taught conventionally.

  17. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  18. Analysis of digester design concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashare, E.; Wilson, E. H.

    1979-01-29

    Engineering economic analyses were performed on various digester design concepts to determine the relative performance for various biomass feedstocks. A comprehensive literature survey describing the state-of-the-art of the various digestion designs is included. The digester designs included in the analyses are CSTR, plug flow, batch, CSTR in series, multi-stage digestion and biomethanation. Other process options investigated included pretreatment processes such as shredding, degritting, and chemical pretreatment, and post-digestion processes, such as dewatering and gas purification. The biomass sources considered include feedlot manure, rice straw, and bagasse. The results of the analysis indicate that the most economical (on a unit gas cost basis) digester design concept is the plug flow reactor. This conclusion results from this system providing a high gas production rate combined with a low capital hole-in-the-ground digester design concept. The costs determined in this analysis do not include any credits or penalties for feedstock or by-products, but present the costs only for conversion of biomass to methane. The batch land-fill type digester design was shown to have a unit gas cost comparable to that for a conventional stirred tank digester, with the potential of reducing the cost if a land-fill site were available for a lower cost per unit volume. The use of chemical pretreatment resulted in a higher unit gas cost, primarily due to the cost of pretreatment chemical. A sensitivity analysis indicated that the use of chemical pretreatment could improve the economics provided a process could be developed which utilized either less pretreatment chemical or a less costly chemical. The use of other process options resulted in higher unit gas costs. These options should only be used when necessary for proper process performance, or to result in production of a valuable by-product.
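
    The comparison metric throughout this analysis is the unit gas cost on a conversion-only basis (no feedstock or by-product credits). The abstract does not give the underlying cost model, so the following is only a minimal levelized-cost sketch under an assumed discount rate and lifetime, with hypothetical capital, operating, and production figures.

    ```python
    def unit_gas_cost(capital_cost, annual_operating_cost, annual_gas_output,
                      discount_rate=0.10, lifetime_years=20):
        """Levelized unit gas cost: annualized capital (via a capital recovery
        factor) plus yearly O&M, divided by yearly methane production. No
        feedstock or by-product credits, matching the conversion-only basis."""
        crf = (discount_rate * (1 + discount_rate) ** lifetime_years
               / ((1 + discount_rate) ** lifetime_years - 1))
        return (capital_cost * crf + annual_operating_cost) / annual_gas_output

    # Hypothetical comparison of two digester concepts, $ per GJ of methane.
    print(unit_gas_cost(2.0e6, 1.5e5, 8.0e4))   # e.g. plug flow (illustrative numbers)
    print(unit_gas_cost(3.5e6, 2.0e5, 8.0e4))   # e.g. CSTR (illustrative numbers)
    ```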

  19. Light-weight cryptography for resource-constrained environments

    NASA Astrophysics Data System (ADS)

    Baier, Patrick; Szu, Harold

    2006-04-01

    We give a survey of "light-weight" encryption algorithms designed to maximise security within tight resource constraints (limited memory, power consumption, processor speed, chip area, etc.). The target applications of such algorithms are RFIDs, smart cards, mobile phones, etc., which may store, process and transmit sensitive data, but at the same time do not always support conventional strong algorithms. A survey of existing algorithms is given and a new proposal is introduced.

  20. Testing the Efficacy of Two New Variants of Recasts with Standard Recasts in Communicative Conversational Settings: An Exploratory Longitudinal Study

    ERIC Educational Resources Information Center

    Wacha, Richard Charles; Liu, Yeu-Ting

    2017-01-01

    The purpose of this exploratory longitudinal study was to evaluate the efficacy of two new forms of recasts (i.e., elaborated and paraphrased recasts), each of which was designed to be more in accordance with contested views of input processing. The effectiveness of the two new forms of recasts was compared to that of conventional standard…

  1. Color separation gratings

    NASA Technical Reports Server (NTRS)

    Farn, Michael W.; Knowlden, Robert E.

    1993-01-01

    In this paper, we describe the theory, fabrication and test of a binary optics 'echelon'. The echelon is a grating structure which separates electromagnetic radiation of different wavelengths, but it does so according to diffraction order rather than by dispersion within one diffraction order, as is the case with conventional gratings. A prototype echelon, designed for the visible spectrum, is fabricated using the binary optics process. Tests of the prototype show good agreement with theoretical predictions.

  2. Modeling of the cranking and charging processes of conventional valve regulated lead acid (VRLA) batteries in micro-hybrid applications

    NASA Astrophysics Data System (ADS)

    Gou, Jun; Lee, Anson; Pyko, Jan

    2014-10-01

    The cranking and charging processes of a VRLA battery during stop-start cycling in micro-hybrid applications were simulated by one dimensional mathematical modeling, to study the formation and distribution of lead sulfate across the cell and analyze the resulting effect on battery aging. The battery focused on in this study represents a conventional VRLA battery without any carbon additives in the electrodes or carbon-based electrodes. The modeling results were validated against experimental data and used to analyze the "sulfation" of negative electrodes - the common failure mode of lead acid batteries under high-rate partial state of charge (HRPSoC) cycling. The analyses were based on two aging mechanisms proposed in previous studies and the predictions showed consistency with the previous teardown observations that the sulfate formed at the negative interface is more difficult to be converted back than anywhere else in the electrodes. The impact of cranking pulses during stop-start cycling on current density and the corresponding sulfate layer production was estimated. The effects of some critical design parameters on sulfate formation, distribution and aging over cycling were investigated, which provided guidelines for developing models and designing of VRLA batteries in micro-hybrid applications.

  3. Monitoring Pre-Stressed Composites Using Optical Fibre Sensors.

    PubMed

    Krishnamurthy, Sriram; Badcock, Rodney A; Machavaram, Venkata R; Fernando, Gerard F

    2016-05-28

    Residual stresses in fibre reinforced composites can give rise to a number of undesired effects such as loss of dimensional stability and premature fracture. Hence, there is significant merit in developing processing techniques to mitigate the development of residual stresses. However, tracking and quantifying the development of these fabrication-induced stresses in real-time using conventional non-destructive techniques is not straightforward. This article reports on the design and evaluation of a technique for manufacturing pre-stressed composite panels from unidirectional E-glass/epoxy prepregs. Here, the magnitude of the applied pre-stress was monitored using an integrated load-cell. The pre-stressing rig was based on a flat-bed design which enabled autoclave-based processing. A method was developed to end-tab the laminated prepregs prior to pre-stressing. The development of process-induced residual strain was monitored in-situ using embedded optical fibre sensors. Surface-mounted electrical resistance strain gauges were used to measure the strain when the composite was unloaded from the pre-stressing rig at room temperature. Four pre-stress levels were applied prior to processing the laminated preforms in an autoclave. The results showed that the application of a pre-stress of 108 MPa to a unidirectional [0]16 E-glass/913 epoxy preform reduced the residual strain in the composite from -600 µε (conventional processing without pre-stress) to approximately zero. A good correlation was observed between the data obtained from the surface-mounted electrical resistance strain gauge and the embedded optical fibre sensors. In addition to "neutralising" the residual stresses, superior axial orientation of the reinforcement can be obtained from pre-stressed composites. A subsequent publication will highlight the consequences of pre-stressing on fibre alignment and on the tensile, flexural, compressive and fatigue performance of unidirectional E-glass composites.

  4. Monitoring Pre-Stressed Composites Using Optical Fibre Sensors

    PubMed Central

    Krishnamurthy, Sriram; Badcock, Rodney A.; Machavaram, Venkata R.; Fernando, Gerard F.

    2016-01-01

    Residual stresses in fibre reinforced composites can give rise to a number of undesired effects such as loss of dimensional stability and premature fracture. Hence, there is significant merit in developing processing techniques to mitigate the development of residual stresses. However, tracking and quantifying the development of these fabrication-induced stresses in real-time using conventional non-destructive techniques is not straightforward. This article reports on the design and evaluation of a technique for manufacturing pre-stressed composite panels from unidirectional E-glass/epoxy prepregs. Here, the magnitude of the applied pre-stress was monitored using an integrated load-cell. The pre-stressing rig was based on a flat-bed design which enabled autoclave-based processing. A method was developed to end-tab the laminated prepregs prior to pre-stressing. The development of process-induced residual strain was monitored in-situ using embedded optical fibre sensors. Surface-mounted electrical resistance strain gauges were used to measure the strain when the composite was unloaded from the pre-stressing rig at room temperature. Four pre-stress levels were applied prior to processing the laminated preforms in an autoclave. The results showed that the application of a pre-stress of 108 MPa to a unidirectional [0]16 E-glass/913 epoxy preform reduced the residual strain in the composite from −600 µε (conventional processing without pre-stress) to approximately zero. A good correlation was observed between the data obtained from the surface-mounted electrical resistance strain gauge and the embedded optical fibre sensors. In addition to “neutralising” the residual stresses, superior axial orientation of the reinforcement can be obtained from pre-stressed composites. A subsequent publication will highlight the consequences of pre-stressing on fibre alignment and on the tensile, flexural, compressive and fatigue performance of unidirectional E-glass composites. PMID:27240378

  5. The Effect of Surfactant Content over Cu-Ni Coatings Electroplated by the sc-CO₂ Technique.

    PubMed

    Chuang, Ho-Chiao; Sánchez, Jorge; Cheng, Hsiang-Yun

    2017-04-19

    Co-plating of Cu-Ni coatings by supercritical CO₂ (sc-CO₂) and conventional electroplating processes was studied in this work. 1,4-butynediol was chosen as the surfactant and the effects of adjusting the surfactant content were described. Although the sc-CO₂ process displayed lower current efficiency, it effectively removed excess hydrogen that causes defects on the coating surface, refined the grain size, reduced surface roughness, and increased electrochemical resistance. The surface roughness of coatings fabricated by the sc-CO₂ process was reduced by an average of 10%, and a maximum of 55%, compared to the conventional process at different fabrication parameters. Cu-Ni coatings produced by the sc-CO₂ process displayed a corrosion potential increased by ~0.05 V over Cu-Ni coatings produced by the conventional process, and by 0.175 V over pure Cu coatings produced by the conventional process. For coatings ~10 µm thick, the internal stresses developed in the sc-CO₂ process were ~20 MPa lower than in the conventional process. Finally, the preferred crystal orientation of the fabricated coatings remained in the (111) direction regardless of the process used or surfactant content.

  6. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios that typically describe a system are more global than the view obtained by examining individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  7. 3D Displays And User Interface Design For A Radiation Therapy Treatment Planning CAD Tool

    NASA Astrophysics Data System (ADS)

    Mosher, Charles E.; Sherouse, George W.; Chaney, Edward L.; Rosenman, Julian G.

    1988-06-01

    The long term goal of the project described in this paper is to improve local tumor control through the use of computer-aided treatment design methods that can result in selection of better treatment plans compared with conventional planning methods. To this end, a CAD tool for the design of radiation treatment beams is described. Crucial to the effectiveness of this tool are high quality 3D display techniques. We have found that 2D and 3D display methods dramatically improve the comprehension of the complex spatial relationships between patient anatomy, radiation beams, and dose distributions. In order to take full advantage of these displays, an intuitive and highly interactive user interface was created. If the system is to be used by physicians unfamiliar with computer systems, it is essential that a user interface is incorporated that allows the user to navigate through each step of the design process in a manner similar to what they are used to. Compared with conventional systems, we believe our display and CAD tools will allow the radiotherapist to achieve more accurate beam targeting, leading to a better radiation dose configuration to the tumor volume. This would result in a reduction of the dose to normal tissue.

  8. On-chip infrared sensors: redefining the benefits of scaling

    NASA Astrophysics Data System (ADS)

    Kita, Derek; Lin, Hongtao; Agarwal, Anu; Yadav, Anupama; Richardson, Kathleen; Luzinov, Igor; Gu, Tian; Hu, Juejun

    2017-03-01

    Infrared (IR) spectroscopy is widely recognized as a gold standard technique for chemical and biological analysis. Traditional IR spectroscopy relies on fragile bench-top instruments located in dedicated laboratory settings, and is thus not suitable for emerging field-deployed applications such as in-line industrial process control, environmental monitoring, and point-of-care diagnosis. Recent strides in photonic integration technologies provide a promising route towards enabling miniaturized, rugged platforms for IR spectroscopic analysis. It is therefore tempting to simply replace the bulky discrete optical elements used in conventional IR spectroscopy with their on-chip counterparts. This size down-scaling approach, however, cripples the system performance as both the sensitivity of spectroscopic sensors and spectral resolution of spectrometers scale with optical path length. In light of this challenge, we will discuss two novel photonic device designs uniquely capable of reaping performance benefits from microphotonic scaling. We leverage strong optical and thermal confinement in judiciously designed micro-cavities to circumvent the thermal diffusion and optical diffraction limits in conventional photothermal sensors and achieve a record 10⁴ photothermal sensitivity enhancement. In the second example, an on-chip spectrometer design with Fellgett's advantage is analyzed. The design enables sub-nm spectral resolution on a millimeter-sized, fully packaged chip without moving parts.

  9. Experimental Evaluation of Acoustic Engine Liner Models Developed with COMSOL Multiphysics

    NASA Technical Reports Server (NTRS)

    Schiller, Noah H.; Jones, Michael G.; Bertolucci, Brandon

    2017-01-01

    Accurate modeling tools are needed to design new engine liners capable of reducing aircraft noise. The purpose of this study is to determine if a commercially-available finite element package, COMSOL Multiphysics, can be used to accurately model a range of different acoustic engine liner designs, and in the process, collect and document a benchmark dataset that can be used in both current and future code evaluation activities. To achieve these goals, a variety of liner samples, ranging from conventional perforate-over-honeycomb to extended-reaction designs, were installed in one wall of the grazing flow impedance tube at the NASA Langley Research Center. The liners were exposed to high sound pressure levels and grazing flow, and the effect of the liner on the sound field in the flow duct was measured. These measurements were then compared with predictions. While this report only includes comparisons for a subset of the configurations, the full database of all measurements and predictions is available in electronic format upon request. The results demonstrate that both conventional perforate-over-honeycomb and extended-reaction liners can be accurately modeled using COMSOL. Therefore, this modeling tool can be used with confidence to supplement the current suite of acoustic propagation codes, and ultimately develop new acoustic engine liners designed to reduce aircraft noise.

  10. Development of a Multifidelity Approach to Acoustic Liner Impedance Eduction

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.

    2017-01-01

    The use of acoustic liners has proven to be extremely effective in reducing aircraft engine fan noise transmission/radiation. However, the introduction of advanced fan designs and shorter engine nacelles has highlighted a need for novel acoustic liner designs that provide increased fan noise reduction over a broader frequency range. To achieve aggressive noise reduction goals, advanced broadband liner designs, such as zone liners and variable impedance liners, will likely depart from conventional uniform impedance configurations. Therefore, educing the impedance of these axial- and/or spanwise-variable impedance liners will require models that account for three-dimensional effects, thereby increasing computational expense. Thus, it would seem advantageous to investigate the use of multifidelity modeling approaches to impedance eduction for these advanced designs. This paper describes an extension of the use of the CDUCT-LaRC code to acoustic liner impedance eduction. The proposed approach is applied to a hardwall insert and conventional liner using simulated data. Educed values compare well with those educed using two extensively tested and validated approaches. The results are very promising and provide justification to further pursue the complementary use of CDUCT-LaRC with the currently used finite element codes to increase the efficiency of the eduction process for configurations involving three-dimensional effects.

  11. Liquid rocket performance computer model with distributed energy release

    NASA Technical Reports Server (NTRS)

    Combs, L. P.

    1972-01-01

    Development of a computer program for analyzing the effects of bipropellant spray combustion processes on liquid rocket performance is described and discussed. The distributed energy release (DER) computer program was designed to become part of the JANNAF liquid rocket performance evaluation methodology and to account for performance losses associated with the propellant combustion processes, e.g., incomplete spray gasification, imperfect mixing between sprays and their reacting vapors, residual mixture ratio striations in the flow, and two-phase flow effects. The DER computer program begins by initializing the combustion field at the injection end of a conventional liquid rocket engine, based on injector and chamber design detail, and on propellant and combustion gas properties. It analyzes bipropellant combustion, proceeding stepwise down the chamber from those initial conditions through the nozzle throat.

  12. Design and implementation of a control structure for quality products in a crude oil atmospheric distillation column.

    PubMed

    Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis

    2017-11-01

    In recent years, interest in petrochemical processes has been increasing, especially in the refining area. However, the high variability in the dynamic characteristics of the atmospheric distillation column poses a challenge to obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new control strategy design for a conventional crude oil distillation plant, defined using formal interaction analysis tools. The process dynamics and its control are simulated in the Aspen HYSYS® dynamic environment under real operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
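
    The abstract does not name the interaction analysis tools used; one common formal tool for pairing manipulated and controlled variables is the relative gain array (RGA), sketched below for an assumed 2x2 steady-state gain matrix. The gain values shown are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    def relative_gain_array(G):
        """RGA of a square steady-state gain matrix: Lambda = G * (G^-1)^T
        (element-wise product). Pairings with relative gains close to 1 are
        normally preferred; negative relative gains flag problematic pairings."""
        G = np.asarray(G, dtype=float)
        return G * np.linalg.inv(G).T

    # Hypothetical 2x2 gain matrix between two manipulated flows and two product qualities.
    G = np.array([[0.8, -0.3],
                  [0.4,  1.1]])
    print(relative_gain_array(G))
    ```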

  13. Self-organization of maze-like structures via guided wrinkling.

    PubMed

    Bae, Hyung Jong; Bae, Sangwook; Yoon, Jinsik; Park, Cheolheon; Kim, Kibeom; Kwon, Sunghoon; Park, Wook

    2017-06-01

    Sophisticated three-dimensional (3D) structures found in nature are self-organized by bottom-up natural processes. To artificially construct these complex systems, various bottom-up fabrication methods, designed to transform 2D structures into 3D structures, have been developed as alternatives to conventional top-down lithography processes. We present a different self-organization approach, in which we construct microstructures that are periodic and ordered yet have random architectures, like mazes. For this purpose, we transformed planar surfaces using wrinkling to directly use randomly generated ridges as maze walls. Highly regular maze structures, consisting of several tessellations with customized designs, were fabricated by precisely controlling wrinkling with the ridge-guiding structure, analogous to the creases in origami. The method presented here could have widespread applications in various material systems with multiple length scales.

  14. Application of computational methods to the design and characterisation of porous molecular materials.

    PubMed

    Evans, Jack D; Jelfs, Kim E; Day, Graeme M; Doonan, Christian J

    2017-06-06

    Composed of discrete units, porous molecular materials (PMMs) possess unique properties not observed for conventional, extended solids, such as solution processibility and permanent porosity in the liquid phase. However, identifying the origin of porosity is not a trivial process, especially for amorphous or liquid phases. Furthermore, the assembly of molecular components is typically governed by a subtle balance of weak intermolecular forces that makes structure prediction challenging. Accordingly, in this review we canvass the crucial role of molecular simulations in the characterisation and design of PMMs. We will outline strategies for modelling porosity in crystalline, amorphous and liquid phases and also describe the state-of-the-art methods used for high-throughput screening of large datasets to identify materials that exhibit novel performance characteristics.

  15. Comparison of electron beam and laser beam powder bed fusion additive manufacturing process for high temperature turbine component materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dryepondt, Sebastien N; Pint, Bruce A; Ryan, Daniel

    2016-04-01

    The evolving 3D printer technology is now at the point where some turbine components could be additive manufactured (AM) for both development and production purposes. However, this will require a significant evaluation program to qualify the process and components to meet current design and quality standards. The goal of the project was to begin characterization of the microstructure and mechanical properties of Nickel Alloy X (Ni-22Cr-18Fe-9Mo) test bars fabricated by powder bed fusion (PBF) AM processes that use either an electron beam (EB) or laser beam (LB) power source. The AM materials produced with the EB and LB processes displayed significant differences in microstructure and resultant mechanical properties. Accordingly, during the design analysis of AM turbine components, the specific mechanical behavior of the material produced with the selected AM process should be considered. Comparison of the mechanical properties of both the EB and LB materials to those of conventionally processed Nickel Alloy X materials indicates the subject AM materials are viable alternatives for manufacture of some turbine components.

  16. A comparison study on microwave-assisted extraction of Artemisia sphaerocephala polysaccharides with conventional method: Molecule structure and antioxidant activities evaluation.

    PubMed

    Wang, Junlong; Zhang, Ji; Wang, Xiaofang; Zhao, Baotang; Wu, Yiqian; Yao, Jian

    2009-12-01

    Conventional extraction methods for polysaccharides are time-consuming, laborious and energy-intensive. A microwave-assisted extraction (MAE) technique was therefore employed for the extraction of Artemisia sphaerocephala polysaccharides (ASP), a traditional Chinese food. The extraction parameters were optimized with a Box-Behnken design. During microwave heating, a decrease in molecular weight (M(w)) was detected by SEC-LLS measurement. A d(f) value of 2.85 indicated that ASP extracted by MAE adopts a spherical conformation of branched clusters in aqueous solution. Furthermore, it showed stronger antioxidant activities compared with hot-water extraction. The data obtained showed that the molecular weights played a more important role in the antioxidant activities.
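
    The abstract does not list the factors or levels of the Box-Behnken design. As a generic illustration of how such a three-level design is constructed in coded units, the sketch below builds a three-factor Box-Behnken design; the example factor names (microwave power, extraction time, liquid-to-solid ratio) are assumptions, not the study's actual factors.

    ```python
    import numpy as np
    from itertools import combinations, product

    def box_behnken(n_factors, n_center=3):
        """Box-Behnken design in coded units (-1, 0, +1): for every pair of
        factors, run the four (+/-1, +/-1) combinations with the remaining
        factors held at 0, then append replicate center points."""
        runs = []
        for i, j in combinations(range(n_factors), 2):
            for a, b in product((-1, 1), repeat=2):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
        runs.extend([[0] * n_factors for _ in range(n_center)])
        return np.array(runs)

    # Three factors, e.g. microwave power, extraction time, liquid-to-solid ratio (assumed).
    design = box_behnken(3)
    print(design.shape)   # (15, 3): 12 edge-midpoint runs plus 3 center points
    ```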

  17. A practical radial basis function equalizer.

    PubMed

    Lee, J; Beach, C; Tepedelenlioglu, N

    1999-01-01

    A radial basis function (RBF) equalizer design process has been developed in which the number of basis function centers used is substantially smaller than conventionally required. The reduction of centers is accomplished in two steps. First, an algorithm is used to select a reduced set of centers that lie close to the decision boundary. Then the centers in this reduced set are grouped, and an average position is chosen to represent each group. Channel order and delay, which are determining factors in setting the initial number of centers, are estimated from regression analysis. In simulation studies, an RBF equalizer with more than a 2000-to-1 reduction in centers performed as well as the RBF equalizer without reduction in centers, and better than a conventional linear equalizer.
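
    The two-step center reduction can be illustrated with a small sketch: keep the channel states closest to the opposite class (i.e., near the decision boundary), then replace groups of survivors by their mean positions. This is a simplified stand-in for the authors' algorithm, with an arbitrary grouping rule and synthetic data.

    ```python
    import numpy as np

    def reduce_centers(states, labels, boundary_fraction=0.2, n_groups=4):
        """Two-step center reduction (sketch): (1) keep the channel states whose
        distance to the nearest opposite-class state is smallest, i.e. those near
        the decision boundary; (2) split the survivors into groups and represent
        each group by its mean position."""
        states, labels = np.asarray(states, float), np.asarray(labels)
        pos, neg = states[labels == 1], states[labels == -1]
        d_pos = np.min(np.linalg.norm(pos[:, None, :] - neg[None, :, :], axis=2), axis=1)
        d_neg = np.min(np.linalg.norm(neg[:, None, :] - pos[None, :, :], axis=2), axis=1)
        keep_pos = pos[np.argsort(d_pos)[:max(1, int(boundary_fraction * len(pos)))]]
        keep_neg = neg[np.argsort(d_neg)[:max(1, int(boundary_fraction * len(neg)))]]

        def group_means(pts):
            pts = pts[np.argsort(pts[:, 0])]          # crude ordering before grouping
            return np.array([g.mean(axis=0) for g in np.array_split(pts, n_groups) if len(g)])

        return group_means(keep_pos), group_means(keep_neg)

    # Synthetic 2-D channel states (e.g. [y(k), y(k-1)] pairs) with binary labels.
    rng = np.random.default_rng(0)
    states = rng.normal(size=(200, 2))
    labels = np.where(states.sum(axis=1) > 0, 1, -1)
    centers_pos, centers_neg = reduce_centers(states, labels)
    ```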

  18. Surface-specific additive manufacturing test artefacts

    NASA Astrophysics Data System (ADS)

    Townsend, Andrew; Racasan, Radu; Blunt, Liam

    2018-06-01

    Many test artefact designs have been proposed for use with additive manufacturing (AM) systems. These test artefacts have primarily been designed for the evaluation of AM form and dimensional performance. A series of surface-specific measurement test artefacts designed for use in the verification of AM manufacturing processes are proposed here. Surface-specific test artefacts can be made more compact because they do not require the large dimensions needed for accurate dimensional and form measurements. The series of three test artefacts are designed to provide comprehensive information pertaining to the manufactured surface. Measurement possibilities include deviation analysis, surface texture parameter data generation, sub-surface analysis, layer step analysis and build resolution comparison. The test artefacts are designed to provide easy access for measurement using conventional surface measurement techniques, for example, focus variation microscopy, stylus profilometry, confocal microscopy and scanning electron microscopy. Additionally, the test artefacts may be simply visually inspected as a comparative tool, giving a fast indication of process variation between builds. The three test artefacts are small enough to be included in every build and include built-in manufacturing traceability information, making them a convenient physical record of the build.
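
    As an example of the surface texture parameter data such artefacts support, the sketch below computes two common areal parameters, Sa and Sq, from a height map after least-squares plane levelling. The synthetic height map stands in for a measured focus-variation or confocal dataset; it is not data from the paper.

    ```python
    import numpy as np

    def areal_texture_parameters(z):
        """Sa (arithmetic mean deviation) and Sq (RMS deviation) of a height map,
        computed after removing a least-squares best-fit plane (levelling)."""
        z = np.asarray(z, dtype=float)
        ny, nx = z.shape
        x, y = np.meshgrid(np.arange(nx), np.arange(ny))
        A = np.column_stack([x.ravel(), y.ravel(), np.ones(z.size)])
        coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
        residual = z.ravel() - A @ coeffs
        return np.mean(np.abs(residual)), np.sqrt(np.mean(residual ** 2))

    # Synthetic tilted, noisy height map (micrometres) standing in for a measured surface.
    rng = np.random.default_rng(0)
    z = 0.02 * np.arange(128)[None, :] + rng.normal(0.0, 5.0, size=(128, 128))
    sa, sq = areal_texture_parameters(z)
    print(f"Sa = {sa:.2f} um, Sq = {sq:.2f} um")
    ```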

  19. Design Criteria For Networked Image Analysis System

    NASA Astrophysics Data System (ADS)

    Reader, Cliff; Nitteberg, Alan

    1982-01-01

    Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notably among which is the increased opportunity for high performance with low cost offered by advances in semiconductor technology. Another key issue is a maturing in understanding of problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above stated issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks and software systems control. The paper concludes with a description of the functional subystems and architectural framework for networked image analysis in a production environment.

  20. Energy-efficient ovens for unpolluted balady bread

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gadalla, M.A.; Mansour, M.S.; Mahdy, E.

    A new bread oven has been developed, tested and presented in this work for local balady bread. The design has the advantage of being efficient and producing unpolluted bread. An extensive study of the conventional and available designs has been carried out in order to help developing the new design. Evaluation of the conventional design is based on numerous tests and measurements. A computer code utilizing the indirect method has been developed to evaluate the thermal performance of the tested ovens. The present design achieves higher thermal efficiency of about 50% than the conventional ones. In addition, its capital costmore » is much cheaper than other imported designs. Thus, the present design achieves higher efficiency, pollutant free products and less cost. Moreover, it may be modified for different types of bread baking systems.« less

  1. Energy-Efficient Routes for the Production of Gasoline from Biogas and Pyrolysis Oil—Process Design and Life-Cycle Assessment

    PubMed Central

    2017-01-01

    Two novel routes for the production of gasoline from pyrolysis oil (from timber pine) and biogas (from ley grass) are simulated, followed by a cradle-to-gate life-cycle assessment of the two production routes. The main aim of this work is to conduct a holistic evaluation of the proposed routes and benchmark them against the conventional route of producing gasoline from natural gas. A previously commercialized method of synthesizing gasoline involves conversion of natural gas to syngas, which is further converted to methanol, and then as a last step, the methanol is converted to gasoline. In the new proposed routes, the syngas production step is different; syngas is produced from a mixture of pyrolysis oil and biogas in the following two ways: (i) autothermal reforming of pyrolysis oil and biogas, in which there are two reactions in one reactor (ATR) and (ii) steam reforming of pyrolysis oil and catalytic partial oxidation of biogas, in which there are separated but thermally coupled reactions and reactors (CR). The other two steps to produce methanol from syngas, and gasoline from methanol, remain the same. The purpose of this simulation is to have an ex-ante comparison of the performance of the new routes against a reference, in terms of energy and sustainability. Thus, at this stage of simulations, nonrigorous, equilibrium-based models have been used for reactors, which will give the best case conversions for each step. For the conventional production route, conversion and yield data available in the literature have been used, wherever available.The results of the process design showed that the second method (separate, but thermally coupled reforming) has a carbon efficiency of 0.53, compared to the conventional route (0.48), as well as the first route (0.40). The life-cycle assessment results revealed that the newly proposed processes have a clear advantage over the conventional process in some categories, particularly the global warming potential and primary energy demand; but there are also some in which the conventional route fares better, such as the human toxicity potential and the categories related to land-use change such as biotic production potential and the groundwater resistance indicator. The results confirmed that even though using biomass such as timber pine as raw material does result in reduced greenhouse gas emissions, the activities associated with biomass, such as cultivation and harvesting, contribute to the environmental footprint, particularly the land use change categories. This gives an impetus to investigate the potential of agricultural, forest, or even food waste, which would be likely to have a substantially lower impact on the environment. Moreover, it could be seen that the source of electricity used in the process has a major impact on the environmental performance. PMID:28405056

  2. The performances of different overlay mark types at 65nm node on 300-mm wafers

    NASA Astrophysics Data System (ADS)

    Tseng, H. T.; Lin, Ling-Chieh; Huang, I. H.; Lin, Benjamin S.; Huang, Chin-Chou K.; Huang, Chien-Jen

    2005-05-01

    Integrated circuit (IC) manufacturing factories have measured overlay with conventional "box-in-box" (BiB) or "frame-in-frame" (FiF) structures for many years. Since UMC serves as a world-class IC foundry service provider, tighter and tighter alignment accuracy specs must be achieved from generation to generation to meet every customer's requirements, especially according to the International Technology Roadmap for Semiconductors (ITRS) 2003 METROLOGY section. Process noise resulting from dishing, overlay mark damage caused by chemical mechanical polishing (CMP), and variation of film thickness during deposition are factors that can be very problematic in mark alignment. For example, conventional "box-in-box" overlay marks are easily damaged by CMP, because the low local pattern density and wide feature width of the box induce either dishing or asymmetric damage of the measurement targets, which makes the overlay measurement variable and difficult. After Advanced Imaging Metrology (AIM) overlay targets were introduced by KLA-Tencor, past studies showed that AIM is more robust in overlay metrology than conventional FiF or BiB targets. In this study, the applications of AIM overlay marks under different process conditions are discussed and compared with the conventional overlay targets. To evaluate overlay mark performance against process variation at the 65nm technology node on 300-mm wafers, three critical layers were chosen in this study: Poly, Contact, and Cu-Metal. The overlay targets used for the performance comparison were BiB and Non-Segmented AIM (NS AIM) marks. We compared overlay mark performance in two main areas. The first was total measurement uncertainty (TMU)-related items, which include Tool Induced Shift (TIS) variability, precision, and matching. The other was target robustness against process variations. In the present study, the AIM mark demonstrated equal or better performance in the TMU-related items under our process conditions. However, when non-optimized tungsten CMP was introduced in the tungsten contact process, we found that the AIM mark, owing to its dense grating line structure design, was much more robust than the BiB overlay target.
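
    For reference, the TIS component of the TMU comparison is conventionally computed per site from overlay readings taken at 0° and 180° wafer orientations. The sketch below shows that calculation with hypothetical readings; the summary statistics (mean TIS and a 3-sigma spread as a variability measure) are one common convention, not necessarily the exact metrics used in this study.

    ```python
    import numpy as np

    def tool_induced_shift(o_0deg, o_180deg):
        """TIS per site from overlay readings at 0 and 180 degree wafer loads:
        TIS = (reading_0 + reading_180) / 2. A perfect tool gives zero; the
        spread across sites characterizes TIS variability."""
        return (np.asarray(o_0deg, float) + np.asarray(o_180deg, float)) / 2.0

    # Hypothetical overlay readings (nm) at five measurement sites.
    o_0   = np.array([ 3.1, -1.2, 0.8,  2.4, -0.5])
    o_180 = np.array([-2.5,  1.8, 0.2, -1.6,  1.1])
    tis = tool_induced_shift(o_0, o_180)
    print(f"mean TIS = {tis.mean():.2f} nm, TIS 3-sigma = {3 * tis.std(ddof=1):.2f} nm")
    ```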

  3. Magnetic carbon nanostructures: microwave energy-assisted pyrolysis vs. conventional pyrolysis.

    PubMed

    Zhu, Jiahua; Pallavkar, Sameer; Chen, Minjiao; Yerra, Narendranath; Luo, Zhiping; Colorado, Henry A; Lin, Hongfei; Haldolaarachchige, Neel; Khasanov, Airat; Ho, Thomas C; Young, David P; Wei, Suying; Guo, Zhanhu

    2013-01-11

    Magnetic carbon nanostructures from microwave assisted- and conventional-pyrolysis processes are compared. Unlike graphitized carbon shells from conventional heating, different carbon shell morphologies including nanotubes, nanoflakes and amorphous carbon were observed. Crystalline iron and cementite were observed in the magnetic core, different from a single cementite phase from the conventional process.

  4. Processing of Materials for Regenerative Medicine Using Supercritical Fluid Technology.

    PubMed

    García-González, Carlos A; Concheiro, Angel; Alvarez-Lorenzo, Carmen

    2015-07-15

    The increase in world demand for bone and cartilage replacement therapies urges the development of advanced synthetic scaffolds for regenerative purposes, not only providing mechanical support for tissue formation but also promoting and guiding tissue growth. Conventional manufacturing techniques impose severe restrictions on the design of these upgraded scaffolds, namely regarding the use of organic solvents, shearing forces, and high operating temperatures. In this context, supercritical fluid technology has emerged as an attractive solution for designing solvent-free scaffolds and scaffold ingredients under mild processing conditions. The state of the art in technological endeavors for scaffold production using supercritical fluids is presented in this work, with a critical review of the key processing parameters as well as the main advantages and limitations of each technique. Special emphasis is placed on the strategies suitable for the incorporation of bioactive agents (drugs, bioactive glasses, and growth factors) and on the in vitro and in vivo performance of supercritical CO2-processed scaffolds.

  5. Low cost MATLAB-based pulse oximeter for deployment in research and development applications.

    PubMed

    Shokouhian, M; Morling, R C S; Kale, I

    2013-01-01

    Problems such as motion artifacts and the effects of ambient light have forced developers to design different signal processing techniques and algorithms to increase the reliability and accuracy of the conventional pulse oximeter. To evaluate the robustness of these techniques, they are applied either to recorded data or implemented on chip to be applied to real-time data. Evaluation on recorded data is the most common approach; however, it is not as reliable as real-time measurement. On the other hand, hardware implementation can be both expensive and time-consuming. This paper presents a low-cost MATLAB-based pulse oximeter that can be used for rapid evaluation of newly developed signal processing techniques and algorithms. The flexibility to apply different signal processing techniques, the availability of both processed and unprocessed data, and the low implementation cost are the important features of this design, making it ideal for research and development purposes as well as commercial, hospital and healthcare applications.
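
    The abstract does not specify the SpO2 algorithm hosted on the platform. As an example of the kind of processing such a test bench would run on recorded red/infrared photoplethysmograms, the sketch below uses the conventional ratio-of-ratios estimate with illustrative (uncalibrated) constants and synthetic signals.

    ```python
    import numpy as np

    def spo2_ratio_of_ratios(red, infrared, a=110.0, b=25.0):
        """Conventional pulse-oximetry estimate: R = (AC/DC)_red / (AC/DC)_ir and
        SpO2 ~ a - b*R, where a and b come from empirical calibration (the values
        used here are illustrative only)."""
        def ac_over_dc(x):
            x = np.asarray(x, float)
            return (x.max() - x.min()) / x.mean()
        R = ac_over_dc(red) / ac_over_dc(infrared)
        return a - b * R

    # Synthetic photoplethysmogram segments standing in for recorded sensor data.
    t = np.linspace(0.0, 5.0, 500)
    red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)
    ir  = 1.2 + 0.03 * np.sin(2 * np.pi * 1.2 * t)
    print(f"Estimated SpO2: {spo2_ratio_of_ratios(red, ir):.1f} %")
    ```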

  6. Agile green process design for the intensified Kolbe-Schmitt synthesis by accompanying (simplified) life cycle assessment.

    PubMed

    Kressirer, Sabine; Kralisch, Dana; Stark, Annegret; Krtschil, Ulrich; Hessel, Volker

    2013-05-21

    In order to investigate the potential for process intensification, various reaction conditions were applied to the Kolbe-Schmitt synthesis starting from resorcinol. Different CO₂ precursors such as aqueous potassium hydrogencarbonate, hydrogencarbonate-based ionic liquids, DIMCARB, or sc-CO₂, the application of microwave irradiation for fast volumetric heating of the reaction mixture, and the effect of harsh reaction conditions were investigated. The experiments, carried out in conventional batch-wise as well as in continuously operated microstructured reactors, aimed at the development of an environmentally benign process for the preparation of 2,4-dihydroxybenzoic acid. To provide decision support toward a green process design, a research-accompanying simplified life cycle assessment (SLCA) was performed throughout the whole investigation. Following this approach, it was found that convective heating methods such as oil bath or electrical heating were more beneficial than the application of microwave irradiation. Furthermore, the consideration of workup procedures was crucial for a holistic view on the environmental burdens.

  7. Stability Analysis of Radial Turning Process for Superalloys

    NASA Astrophysics Data System (ADS)

    Jiménez, Alberto; Boto, Fernando; Irigoien, Itziar; Sierra, Basilio; Suarez, Alfredo

    2017-09-01

    Stability detection is an essential component of the design of efficient machining processes. Automatic methods are able to determine when instability is occurring and prevent possible machine failures. In this work, a variety of methods are proposed for detecting stability anomalies based on the forces measured in the radial turning of superalloys. Two different methods are proposed to determine instabilities. Each one is tested on real data obtained in the machining of Waspalloy, Haynes 282 and Inconel 718. Experimental data, in both conventional and High Pressure Coolant (HPC) environments, are grouped into four different states depending on material grain size and hardness (LGA, LGS, SGA and SGS). The results reveal that the PCA method is useful for visualizing the process and detecting anomalies in online processes.
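
    The abstract reports that PCA is useful for visualization and anomaly detection on the measured forces but does not detail the implementation. A minimal sketch of one common PCA-based monitoring scheme follows: fit PCA to force features from stable cuts and flag new windows whose reconstruction error exceeds a threshold. The feature layout, threshold rule, and data below are assumptions, and scikit-learn is assumed to be available.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def fit_pca_monitor(stable_features, n_components=2, percentile=99.0):
        """Fit PCA on feature vectors from stable cuts (e.g. mean, RMS and peak of
        each force component per time window) and set an anomaly threshold on the
        reconstruction error (Q statistic) at a chosen percentile."""
        pca = PCA(n_components=n_components).fit(stable_features)
        recon = pca.inverse_transform(pca.transform(stable_features))
        q = np.sum((stable_features - recon) ** 2, axis=1)
        return pca, np.percentile(q, percentile)

    def is_unstable(pca, q_limit, features):
        recon = pca.inverse_transform(pca.transform(features))
        return np.sum((features - recon) ** 2, axis=1) > q_limit

    # Illustrative data: 200 stable windows of 9 force features, then 5 new windows.
    rng = np.random.default_rng(1)
    stable = rng.normal(size=(200, 9))
    new = np.vstack([rng.normal(size=(4, 9)), rng.normal(loc=6.0, size=(1, 9))])
    pca, q_limit = fit_pca_monitor(stable)
    print(is_unstable(pca, q_limit, new))   # expect the last window to be flagged
    ```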

  8. Computational fluid modeling and performance analysis of a bidirectional rotating perfusion culture system.

    PubMed

    Kang, Chang-Wei; Wang, Yan; Tania, Marshella; Zhou, Huancheng; Gao, Yi; Ba, Te; Tan, Guo-Dong Sean; Kim, Sangho; Leo, Hwa Liang

    2013-01-01

    A myriad of bioreactor configurations have been investigated as extracorporeal medical support systems for temporary replacement of vital organ functions. In recent years, studies have demonstrated that the rotating bioreactors have the potential to be utilized as bioartificial liver assist devices (BLADs) owing to their advantage of ease of scalability of cell-culture volume. However, the fluid movement in the rotating chamber will expose the suspended cells to unwanted flow structures with abnormally high shear conditions that may result in poor cell stability and in turn lower the efficacy of the bioreactor system. In this study, we compared the hydrodynamic performance of our modified rotating bioreactor design with that of an existing rotating bioreactor design. Computational fluid dynamic analysis coupled with experimental results were employed in the optimization process for the development of the modified bioreactor design. Our simulation results showed that the modified bioreactor had lower fluid induced shear stresses and more uniform flow conditions within its rotating chamber than the conventional design. Experimental results revealed that the cells within the modified bioreactor also exhibited better cell-carrier attachment, higher metabolic activity, and cell viability compared to those in the conventional design. In conclusion, this study was able to provide important insights into the flow physics within the rotating bioreactors, and help enhanced the hydrodynamic performance of an existing rotating bioreactor for BLAD applications. © 2013 American Institute of Chemical Engineers.

  9. Design and fabrication of vertically-integrated CMOS image sensors.

    PubMed

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors.

  10. Design and Fabrication of Vertically-Integrated CMOS Image Sensors

    PubMed Central

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors. PMID:22163860

  11. Characterization and modeling of electrostatically actuated polysilicon micromechanical devices

    NASA Astrophysics Data System (ADS)

    Chan, Edward Keat Leem

    Sensors, actuators, transducers, microsystems and MEMS (MicroElectroMechanical Systems) are some of the terms describing technologies that interface information processing systems with the physical world. Electrostatically actuated micromechanical devices are important building blocks in many of these technologies. Arrays of these devices are used in video projection displays, fluid pumping systems, optical communications systems, tunable lasers and microwave circuits. Well-calibrated simulation tools are essential for propelling ideas from the drawing board into production. This work characterizes a fabrication process---the widely-used polysilicon MUMPs process---to facilitate the design of electrostatically actuated micromechanical devices. The operating principles of a representative device---a capacitive microwave switch---are characterized using a wide range of electrical and optical measurements of test structures along with detailed electromechanical simulations. Consistency in the extraction of material properties from measurements of both pull-in voltage and buckling amplitude is demonstrated. Gold is identified as an area-dependent source of nonuniformity in polysilicon thicknesses and stress. Effects of stress gradients, substrate curvature, and film coverage are examined quantitatively. Using well-characterized beams as in-situ surface probes, capacitance-voltage and surface profile measurements reveal that compressible surface residue modifies the effective electrical gap when the movable electrode contacts an underlying silicon nitride layer. A compressible contact surface model used in simulations improves the fit to measurements. In addition, the electric field across the nitride causes charge to build up in the nitride, increasing the measured capacitance over time. The rate of charging corresponds to charge injection through direct tunneling. A novel actuator that can travel stably beyond one-third of the initial gap (a trademark limitation of conventional actuators) is demonstrated. A "folded capacitor" design, requiring only minimal modifications to the layout of conventional devices, reduces the parasitic capacitances and modes of deformation that limit performance. This device, useful for optical applications, can travel almost twice the conventional range before succumbing to a tilting instability.
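
    The one-third-of-the-gap travel limit cited above comes from the textbook parallel-plate actuator model. The sketch below reproduces that limit numerically; the stiffness, gap, and electrode area are assumed illustrative values, not parameters of the MUMPs devices in the record.

```python
import numpy as np

# Textbook parallel-plate electrostatic actuator model (illustrative values only).
EPS0 = 8.854e-12      # vacuum permittivity, F/m
k    = 10.0           # suspension stiffness, N/m   (assumed)
A    = (100e-6)**2    # electrode area, m^2         (assumed 100 um x 100 um)
g0   = 2e-6           # initial gap, m              (assumed)

def electrostatic_force(V, x):
    """Attractive force between plates at displacement x toward the fixed electrode."""
    return EPS0 * A * V**2 / (2.0 * (g0 - x)**2)

def equilibrium_displacement(V, steps=200):
    """Iterate x = F_e(V, x)/k; returns None once the travel exceeds g0/3 (pull-in)."""
    x = 0.0
    for _ in range(steps):
        x = electrostatic_force(V, x) / k
        if x >= g0 / 3.0 + 1e-12:   # beyond the stable travel range
            return None
    return x

# Classical pull-in voltage: V_pi = sqrt(8 k g0^3 / (27 eps0 A)),
# reached when the movable plate has travelled one third of the gap.
V_pi = np.sqrt(8.0 * k * g0**3 / (27.0 * EPS0 * A))
print(f"pull-in voltage ~ {V_pi:.2f} V")

for V in (0.5 * V_pi, 0.9 * V_pi, 1.1 * V_pi):
    xeq = equilibrium_displacement(V)
    if xeq is None:
        print(f"V = {V:6.2f} V -> pulled in (no stable equilibrium)")
    else:
        print(f"V = {V:6.2f} V -> stable equilibrium at x = {xeq:.3g} m")
```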

  12. Suppression of Lateral Diffusion and Surface Leakage Currents in nBn Photodetectors Using an Inverted Design

    NASA Astrophysics Data System (ADS)

    Du, X.; Savich, G. R.; Marozas, B. T.; Wicks, G. W.

    2018-02-01

    Surface leakage and lateral diffusion currents in InAs-based nBn photodetectors have been investigated. Devices fabricated using a shallow etch processing scheme that etches through the top contact and stops at the barrier exhibited large lateral diffusion current but undetectably low surface leakage. Such large lateral diffusion current significantly increased the dark current, especially in small devices, and causes pixel-to-pixel crosstalk in detector arrays. To eliminate the lateral diffusion current, two different approaches were examined. The conventional solution utilized a deep etch process, which etches through the top contact, barrier, and absorber. This deep etch processing scheme eliminated lateral diffusion, but introduced high surface current along the device mesa sidewalls, increasing the dark current. High device failure rate was also observed in deep-etched nBn structures. An alternative approach to limit lateral diffusion used an inverted nBn structure that has its absorber grown above the barrier. Like the shallow etch process on conventional nBn structures, the inverted nBn devices were fabricated with a processing scheme that only etches the top layer (the absorber, in this case) but avoids etching through the barrier. The results show that inverted nBn devices have the advantage of eliminating the lateral diffusion current without introducing elevated surface current.

  13. Environmental assessment of mild bisulfite pretreatment of forest residues into fermentable sugars for biofuel production.

    PubMed

    Nwaneshiudu, Ikechukwu C; Ganguly, Indroneil; Pierobon, Francesca; Bowers, Tait; Eastin, Ivan

    2016-01-01

    Sugar production via pretreatment and enzymatic hydrolysis of cellulosic feedstock, in this case softwood harvest residues, is a critical step in the biochemical conversion pathway towards drop-in biofuels. Mild bisulfite (MBS) pretreatment is an emerging option for the breakdown and subsequent processing of biomass towards fermentable sugars. An environmental assessment of this process is critical to discern its future sustainability in the ever-changing biofuels landscape. The subsequent cradle-to-gate assessment of a proposed sugar production facility analyzes sugar made from woody biomass using MBS pretreatment across all seven impact categories (functional unit: 1 kg dry-mass sugar), with a specific focus on potential global warming and eutrophication impacts. The study found that the eutrophication impact (0.000201 kg N equivalent) is less than the impacts from conventional beet and cane sugars, while the global warming impact (0.353 kg CO2 equivalent) falls within the range of conventional processes. This work discusses some of the environmental impacts of designing and operating a sugar production facility that uses MBS as a method of treating cellulosic forest residuals. The impacts of each unit process in the proposed facility are highlighted. A comparison to other sugar-making processes is detailed and will inform the growing biofuels literature.

  14. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by the means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally performed. Results of all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space rather than a given condition for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. A data model of the Climate and Forecast metadata conventions (CF-1.6) with a software implementation (cf-python v2.1)

    NASA Astrophysics Data System (ADS)

    Hassell, David; Gregory, Jonathan; Blower, Jon; Lawrence, Bryan N.; Taylor, Karl E.

    2017-12-01

    The CF (Climate and Forecast) metadata conventions are designed to promote the creation, processing, and sharing of climate and forecasting data using Network Common Data Form (netCDF) files and libraries. The CF conventions provide a description of the physical meaning of data and of their spatial and temporal properties, but they depend on the netCDF file encoding which can currently only be fully understood and interpreted by someone familiar with the rules and relationships specified in the conventions documentation. To aid in development of CF-compliant software and to capture with a minimal set of elements all of the information contained in the CF conventions, we propose a formal data model for CF which is independent of netCDF and describes all possible CF-compliant data. Because such data will often be analysed and visualised using software based on other data models, we compare our CF data model with the ISO 19123 coverage model, the Open Geospatial Consortium CF netCDF standard, and the Unidata Common Data Model. To demonstrate that this CF data model can in fact be implemented, we present cf-python, a Python software library that conforms to the model and can manipulate any CF-compliant dataset.
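
    As a rough illustration of the kind of metadata the CF conventions standardize (not an example of the authors' cf-python library itself), the sketch below writes a small CF-1.6-style file with the widely used netCDF4 Python package; the file name, variable names, and values are invented for the example.

```python
# Minimal sketch of CF-1.6-style metadata written with the netCDF4 library
# (this is not the cf-python package described in the record; file and
# variable names are illustrative).
import numpy as np
from netCDF4 import Dataset

with Dataset("example_cf.nc", "w") as ds:
    ds.Conventions = "CF-1.6"

    ds.createDimension("time", None)
    ds.createDimension("lat", 3)
    ds.createDimension("lon", 4)

    time = ds.createVariable("time", "f8", ("time",))
    time.standard_name = "time"
    time.units = "days since 2000-01-01 00:00:00"

    lat = ds.createVariable("lat", "f4", ("lat",))
    lat.standard_name = "latitude"
    lat.units = "degrees_north"

    lon = ds.createVariable("lon", "f4", ("lon",))
    lon.standard_name = "longitude"
    lon.units = "degrees_east"

    tas = ds.createVariable("tas", "f4", ("time", "lat", "lon"))
    tas.standard_name = "air_temperature"
    tas.units = "K"
    tas.cell_methods = "time: mean"

    time[:] = [0.0]
    lat[:] = [-30.0, 0.0, 30.0]
    lon[:] = [0.0, 90.0, 180.0, 270.0]
    tas[0, :, :] = 288.0 + np.zeros((3, 4), dtype="f4")
```

    A CF-aware reader such as the cf-python library described above can then recover the physical meaning of the field from these attributes rather than from the particulars of the file layout.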

  16. An investigation of constraint-based component-modeling for knowledge representation in computer-aided conceptual design

    NASA Technical Reports Server (NTRS)

    Kolb, Mark A.

    1990-01-01

    Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific well-defined classes of objects became commonplace. However, due to the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. The use of symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitate such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function- and subroutine-calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side of the equation, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are utilized to implement a user-extensible database of design components. The mathematical relationships which model both geometry and physics of these components are managed via constraint propagation. In addition, to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. In order to investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane. The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information which is easier both to maintain and to extend.
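
    The multi-directional behaviour of constraint propagation described above can be illustrated with a toy product constraint that solves for whichever variable is still unknown. This is a minimal sketch of the idea, not code from Rubber Airplane; the lift-equation cell names and numbers are invented.

```python
# Toy illustration of multi-directional constraint propagation (not the
# Rubber Airplane implementation): a product constraint solves for whichever
# variable is still unknown once the others are set.
class ProductConstraint:
    """Enforces cells[result] = product of cells[factor] for the named cells."""
    def __init__(self, cells, result, *factors):
        self.cells, self.result, self.factors = cells, result, factors

    def propagate(self):
        names = (self.result, *self.factors)
        known = {n: self.cells[n] for n in names if self.cells[n] is not None}
        unknown = [n for n in names if n not in known]
        if len(unknown) != 1:
            return False                      # nothing to infer (or fully determined)
        target = unknown[0]
        if target == self.result:
            value = 1.0
            for f in self.factors:
                value *= known[f]
        else:
            value = known[self.result]
            for f in self.factors:
                if f != target:
                    value /= known[f]
        self.cells[target] = value
        return True

# Lift equation L = 0.5 * rho * V^2 * S * CL, written as a product constraint.
# Because the relationship is multi-directional, fixing the lift target lets
# the constraint solve "backwards" for the lift coefficient.
cells = {"L": 12000.0, "half_rho_V2": 0.5 * 1.225 * 50.0**2, "S": 16.0, "CL": None}
ProductConstraint(cells, "L", "half_rho_V2", "S", "CL").propagate()
print(f"inferred CL = {cells['CL']:.3f}")
```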

  17. Hollow Abutment Screw Design for Easy Retrieval in Case of Screw Fracture in Dental Implant System.

    PubMed

    Sim, Bo Kyun; Kim, Bongju; Kim, Min Jeong; Jeong, Guk Hyun; Ju, Kyung Won; Shin, Yoo Jin; Kim, Man Yong; Lee, Jong-Ho

    2017-01-01

    The prosthetic component of dental implant is attached on the abutment which is connected to the fixture with an abutment screw. The abutment screw fracture is not frequent; however, the retrieval of the fractured screw is not easy, and it poses complications. A retrieval kit was developed which utilizes screw removal drills to make a hole on the fractured screw that provides an engaging drill to unscrew it. To minimize this process, the abutment screw is modified with a prefabricated access hole for easy retrieval. This study aimed to introduce this modified design of the abutment screw, the concept of easy retrieval, and to compare the mechanical strengths of the conventional and hollow abutment screws by finite element analysis (FEA) and mechanical test. In the FEA results, both types of abutment screws showed similar stress distribution in the single artificial tooth system. A maximum load difference of about 2% occurred in the vertical load by a mechanical test. This study showed that the hollow abutment screw may be an alternative to the conventional abutment screws because this is designed for easy retrieval and that both abutment screws showed no significant difference in the mechanical tests and in the FEA.

  18. Effects of mesh type on a non-premixed model in a flameless combustion simulation

    NASA Astrophysics Data System (ADS)

    Komonhirun, Seekharin; Yongyingsakthavorn, Pisit; Nontakeaw, Udomkiat

    2018-01-01

    Flameless combustion is a recently developed combustion system, which provides zero emission product. This phenomenon requires auto-ignition by supplying high-temperature air with low oxygen concentration. The flame is vanished and colorless. Temperature of the flameless combustion is less than that of a conventional case, where NOx reactions can be well suppressed. To design a flameless combustor, the computational fluid dynamics (CFD) is employed. The designed air-and-fuel injection method can be applied with the turbulent and non-premixed models. Due to the fact that nature of turbulent non-premixed combustion is based on molecular randomness, inappropriate mesh type can lead to significant numerical errors. Therefore, this research aims to numerically investigate the effects of mesh type on flameless combustion characteristics, which is a primary step of design process. Different meshes, i.e. tetrahedral, hexagonal are selected. Boundary conditions are 5% of oxygen and 900 K of air-inlet temperature for the flameless combustion, and 21% of oxygen and 300 K of air-inlet temperature for the conventional case. The results are finally presented and discussed in terms of velocity streamlines, and contours of turbulent kinetic energy and viscosity, temperature, and combustion products.

  19. Mechanical properties of porous and cellular materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sieradzki, K.; Green, D.J.; Gibson, L.J.

    1991-01-01

    This symposium successfully brought scientists together from a wide variety of disciplines to focus on the mechanical behavior of porous and cellular solids composed of metals, ceramics, polymers, or biological materials. For cellular materials, papers ranged from processing techniques through microstructure-mechanical property relationships to design. In an overview talk, Mike Ashby (Cambridge Univ.) showed how porous cellular materials can be more efficient than dense materials in designs that require minimum weight. He indicated that many biological materials have been able to accomplish such efficiency but there exists an opportunity to design even more efficient, manmade materials by controlling microstructures at different scale levels. In the area of processing, James Aubert (Sandia National Laboratories) discussed techniques for manipulating polymer-solvent phase equilibria to control the microstructure of microcellular foams. Other papers on processing discussed the production of cellular ceramics by CVD, HIPing and sol-gel techniques. Papers on the mechanical behavior of cellular materials considered various ceramics, microcellular polymers, conventional polymer foams and apples. There were also contributions that considered optimum design procedures for cellular materials. Steven Cowin (City Univ. of New York) discussed procedures to match the discrete microstructural aspects of cellular materials with the continuum mechanics approach to their elastic behavior.

  20. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model, a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.
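
    A schematic of posing such an integrated design as a nonlinear program is sketched below with scipy.optimize; the surrogate control-power and pointing-error functions, the design variables, and their bounds are hypothetical placeholders rather than the CEM model from the record.

```python
# Schematic of an integrated controls-structures optimization posed as a
# nonlinear program (surrogate cost and constraint functions are hypothetical
# placeholders, not the CEM model from the record).
import numpy as np
from scipy.optimize import minimize

def control_power(x):
    """Surrogate control effort: struts sized by x[0:2], controller gain x[2]."""
    strut_area, damping_gain = x[:2], x[2]
    return damping_gain**2 / (0.1 + strut_area.sum())

def pointing_error(x):
    """Surrogate line-of-sight error under white-noise disturbance."""
    strut_area, damping_gain = x[:2], x[2]
    return 1.0 / (1.0 + 5.0 * damping_gain * strut_area.prod())

x0 = np.array([0.5, 0.5, 1.0])
result = minimize(
    control_power, x0, method="SLSQP",
    bounds=[(0.1, 2.0), (0.1, 2.0), (0.1, 10.0)],
    # pointing requirement: error must stay at or below 0.2 (arbitrary units)
    constraints=[{"type": "ineq", "fun": lambda x: 0.2 - pointing_error(x)}],
)
print("design variables:", np.round(result.x, 3))
print("control power   :", round(control_power(result.x), 4))
print("pointing error  :", round(pointing_error(result.x), 4))
```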

  1. Parametric study of a canard-configured transport using conceptual design optimization

    NASA Technical Reports Server (NTRS)

    Arbuckle, P. D.; Sliwa, S. M.

    1985-01-01

    Constrained-parameter optimization is used to perform optimal conceptual design of both canard and conventional configurations of a medium-range transport. A number of design constants and design constraints are systematically varied to compare the sensitivities of canard and conventional configurations to a variety of technology assumptions. Main-landing-gear location and canard surface high-lift performance are identified as critical design parameters for a statically stable, subsonic, canard-configured transport.

  2. Fuzzy image processing in sun sensor

    NASA Technical Reports Server (NTRS)

    Mobasser, S.; Liebe, C. C.; Howard, A.

    2003-01-01

    This paper describes how fuzzy image processing is implemented in the instrument. A comparison of the fuzzy image processing with a more conventional image processing algorithm is provided and shows that the fuzzy approach yields better accuracy than conventional image processing.
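
    The record does not give the instrument's algorithm, but the flavour of fuzzy image processing for centroiding can be illustrated by weighting pixels with a fuzzy membership function instead of a hard threshold; the synthetic sun spot, membership breakpoints, and noise level below are assumptions.

```python
# Illustrative comparison (not the flight algorithm from the record): a sun-spot
# centroid computed with a hard intensity threshold versus a fuzzy membership
# weighting of the same pixels.
import numpy as np

def synthetic_spot(size=32, cx=13.3, cy=18.7, sigma=2.0, noise=0.02, seed=0):
    """Gaussian spot plus noise; returns the image and the true centre."""
    rng = np.random.default_rng(seed)
    y, x = np.mgrid[0:size, 0:size]
    img = np.exp(-((x - cx)**2 + (y - cy)**2) / (2 * sigma**2))
    return np.clip(img + noise * rng.standard_normal(img.shape), 0, 1), (cx, cy)

def threshold_centroid(img, thresh=0.5):
    """Conventional approach: average the coordinates of pixels above a threshold."""
    y, x = np.nonzero(img > thresh)
    return x.mean(), y.mean()

def fuzzy_centroid(img, lo=0.1, hi=0.8):
    """Fuzzy approach: piecewise-linear membership (0 below lo, 1 above hi)."""
    mu = np.clip((img - lo) / (hi - lo), 0.0, 1.0)
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (mu * x).sum() / mu.sum(), (mu * y).sum() / mu.sum()

img, truth = synthetic_spot()
print("true centre     :", truth)
print("hard threshold  :", threshold_centroid(img))
print("fuzzy membership:", fuzzy_centroid(img))
```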

  3. Structural health monitoring feature design by genetic programming

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Todd, Michael D.

    2014-09-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.

  4. A Review on Adsorption of Fluoride from Aqueous Solution

    PubMed Central

    Habuda-Stanić, Mirna; Ergović Ravančić, Maja; Flanagan, Andrew

    2014-01-01

    Fluoride is one of the anionic contaminants which is found in excess in surface or groundwater because of geochemical reactions or anthropogenic activities such as the disposal of industrial wastewaters. Among various methods used for defluoridation of water such as coagulation, precipitation, membrane processes, electrolytic treatment, ion-exchange, the adsorption process is widely used. It offers satisfactory results and seems to be a more attractive method for the removal of fluoride in terms of cost, simplicity of design and operation. Various conventional and non-conventional adsorbents have been assessed for the removal of fluoride from water. In this review, a list of various adsorbents (oxides and hydroxides, biosorbents, geomaterials, carbonaceous materials and industrial products and by-products) and its modifications from literature are surveyed and their adsorption capacities under various conditions are compared. The effect of other impurities on fluoride removal has also been discussed. This survey showed that various adsorbents, especially binary and trimetal oxides and hydroxides, have good potential for the fluoride removal from aquatic environments. PMID:28788194

  5. A highly symmetrical 10 transistor 2-read/write dual-port static random access memory bitcell design in 28 nm high-k/metal-gate planar bulk CMOS technology

    NASA Astrophysics Data System (ADS)

    Ishii, Yuichiro; Tanaka, Miki; Yabuuchi, Makoto; Sawada, Yohei; Tanaka, Shinji; Nii, Koji; Lu, Tien Yu; Huang, Chun Hsien; Sian Chen, Shou; Tse Kuo, Yu; Lung, Ching Cheng; Cheng, Osbert

    2018-04-01

    We propose a highly symmetrical 10 transistor (10T) 2-read/write (2RW) dual-port (DP) static random access memory (SRAM) bitcell in 28 nm high-k/metal-gate (HKMG) planar bulk CMOS. It replaces the conventional 8T 2RW DP SRAM bitcell without any area overhead. It significantly improves the robustness of process variations and an asymmetric issue between the true and bar bitline pairs. Measured data show that read current (I read) and read static noise margin (SNM) are respectively boosted by +20% and +15 mV by introducing the proposed bitcell with enlarged pull-down (PD) and pass-gate (PG) N-channel MOSs (NMOSs). The minimum operating voltage (V min) of the proposed 256 kbit 10T DP SRAM is 0.53 V in the TT process, 25 °C under the worst access condition with read/write disturbances, and improved by 90 mV (15%) compared with the conventional one.

  6. Multi-band filter design with less total film thickness for short-wave infrared

    NASA Astrophysics Data System (ADS)

    Yan, Yung-Jhe; Chien, I.-Pen; Chen, Po-Han; Chen, Sheng-Hui; Tsai, Yi-Chun; Ou-Yang, Mang

    2017-08-01

    A multi-band pass filter array was proposed and designed for short-wave infrared applications. The central wavelengths of the multi-band pass filters are located at about 905 nm, 950 nm, 1055 nm and 1550 nm. In the simulation of an optical interference band pass filter, high spectral performance (a high transmittance ratio between the pass band and the stop band) relies on (1) the index gap between the selected high/low-index film materials, with a larger gap correlated with higher performance, and (2) a sufficient number of repeated periods of high/low-index thin-film layers. Once the high and low refractive index materials are chosen, spectral performance can be improved by increasing the number of repeated periods; consequently, the total film thickness increases rapidly. In some cases a thick film stack is difficult to process in practice, especially when photolithographic lift-off is incorporated: the maximum photoresist thickness that can be lifted off bounds the total film thickness of the band pass filter. For short-wave infrared applications in the wavelength range from 900 nm to 1700 nm, silicon was chosen as the high refractive index material. Unlike the dielectric materials used in the visible range, silicon has high absorptance in the visible but high transmission in the short-wave infrared. In other words, band pass filters designed with silicon as the high-index film not only achieve better spectral performance than conventional high-index materials such as TiO2 or Ta2O5, but also reduce the total film thickness to roughly half of that required with the conventional material TiO2. Through simulation and several experimental trials, a total film thickness below 4 um was found to be practicable and reasonable. The filters were fabricated with a dual electron-gun deposition system with ion-assisted deposition after the lithography process. By repeating the lithography and deposition process four times and adding a black matrix coating, the optical device was completed.
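
    The dependence of stop-band depth on index contrast and on the number of high/low periods, and hence on total film thickness, can be illustrated with a minimal transfer-matrix calculation. The indices below (an a-Si-like 3.6 versus a TiO2-like 2.3 against 1.45), the layer counts, and the lossless, non-dispersive, normal-incidence assumptions are illustrative, not the paper's design.

```python
# Minimal transfer-matrix sketch (normal incidence, non-dispersive, lossless)
# showing how index contrast and the number of high/low periods shape a
# quarter-wave stop band; indices and layer counts are illustrative only.
import numpy as np

def stack_transmittance(wavelengths, n_h, n_l, periods, design_wl,
                        n_in=1.0, n_sub=1.5):
    """Transmittance of an (HL)^periods quarter-wave stack between air and substrate."""
    T = np.empty_like(wavelengths, dtype=float)
    layers = [(n_h, design_wl / (4 * n_h)), (n_l, design_wl / (4 * n_l))] * periods
    for i, wl in enumerate(wavelengths):
        M = np.eye(2, dtype=complex)
        for n, d in layers:
            delta = 2 * np.pi * n * d / wl
            M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
        B, C = M @ np.array([1.0, n_sub])
        r = (n_in * B - C) / (n_in * B + C)
        T[i] = 1.0 - abs(r)**2          # lossless stack: T = 1 - R
    return T

wl = np.linspace(900e-9, 1700e-9, 400)
hi_contrast  = stack_transmittance(wl, n_h=3.6, n_l=1.45, periods=4, design_wl=1550e-9)
low_contrast = stack_transmittance(wl, n_h=2.3, n_l=1.45, periods=4, design_wl=1550e-9)
# With the same 4 periods, the higher-contrast pair rejects far more strongly,
# i.e. the lower-contrast pair needs more periods (more total thickness).
print("min T near 1550 nm, n_H = 3.6:", hi_contrast.min().round(3))
print("min T near 1550 nm, n_H = 2.3:", low_contrast.min().round(3))
```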

  7. A Systematic Approach of Employing Quality by Design Principles: Risk Assessment and Design of Experiments to Demonstrate Process Understanding and Identify the Critical Process Parameters for Coating of the Ethylcellulose Pseudolatex Dispersion Using Non-Conventional Fluid Bed Process.

    PubMed

    Kothari, Bhaveshkumar H; Fahmy, Raafat; Claycamp, H Gregg; Moore, Christine M V; Chatterjee, Sharmista; Hoag, Stephen W

    2017-05-01

    The goal of this study was to utilize risk assessment techniques and statistical design of experiments (DoE) to gain process understanding and to identify critical process parameters for the manufacture of controlled release multiparticulate beads using a novel disk-jet fluid bed technology. The material attributes and process parameters were systematically assessed using the Ishikawa fish bone diagram and failure mode and effect analysis (FMEA) risk assessment methods. The high risk attributes identified by the FMEA analysis were further explored using resolution V fractional factorial design. To gain an understanding of the processing parameters, a resolution V fractional factorial study was conducted. Using knowledge gained from the resolution V study, a resolution IV fractional factorial study was conducted; the purpose of this IV study was to identify the critical process parameters (CPP) that impact the critical quality attributes and understand the influence of these parameters on film formation. For both studies, the microclimate, atomization pressure, inlet air volume, product temperature (during spraying and curing), curing time, and percent solids in the coating solutions were studied. The responses evaluated were percent agglomeration, percent fines, percent yield, bead aspect ratio, median particle size diameter (d50), assay, and drug release rate. Pyrobuttons® were used to record real-time temperature and humidity changes in the fluid bed. The risk assessment methods and process analytical tools helped to understand the novel disk-jet technology and to systematically develop models of the coating process parameters like process efficiency and the extent of curing during the coating process.
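
    As a sketch of how a two-level resolution V fractional factorial like the one mentioned above can be constructed, the snippet below aliases a fifth factor with the four-factor interaction (generator E = ABCD); the factor names are placeholders and the coded run plan is not the study's actual design.

```python
# Sketch of building a two-level 2^(5-1) fractional factorial (resolution V,
# generator E = ABCD). Factor names are placeholders, not the exact coded
# run plan used in the record.
from itertools import product

factors = ["atomization_pressure", "inlet_air_volume",
           "product_temperature", "curing_time", "percent_solids"]

runs = []
for a, b, c, d in product((-1, +1), repeat=4):
    e = a * b * c * d                # generator: 5th factor aliased with ABCD
    runs.append((a, b, c, d, e))

print(f"{len(runs)} runs instead of {2**len(factors)} for the full factorial")
for run in runs[:4]:                 # show the first few coded runs
    print(dict(zip(factors, run)))
```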

  8. A study of digital gyro compensation loops. [data conversion routines and breadboard models

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The feasibility of replacing existing state-of-the-art analog gyro compensation loops with digital computations is discussed. This was accomplished by designing appropriate compensation loops for the dry-tuned TDF gyro, selecting appropriate data conversion and processing techniques and algorithms, and breadboarding the design for laboratory evaluation. A breadboard design was established in which one axis of a Teledyne tuned-gimbal TDF gyro was caged digitally while the other was caged using conventional analog electronics. The digital loop was designed analytically to closely resemble the analog loop in performance. The breadboard was subjected to various static and dynamic tests in order to establish the relative stability characteristics and frequency responses of the digital and analog loops. Several variations of the digital loop configuration were evaluated. The results were favorable.
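
    A typical step in replacing an analog compensation loop with digital computation is discretizing the analog compensator, for example with the bilinear (Tustin) transform; the first-order lead network, gain, and sample rate below are assumed for illustration and are not the Teledyne gyro-loop design.

```python
# Sketch of digitizing an analog compensator with the bilinear (Tustin)
# transform; the first-order lead network and sample rate are assumed for
# illustration and are not the gyro-loop design from the record.
import numpy as np
from scipy import signal

fs = 2000.0                                   # sample rate, Hz (assumed)
K, zero, pole = 10.0, 2 * np.pi * 20.0, 2 * np.pi * 200.0

# Analog lead compensator C(s) = K (s + zero) / (s + pole).
b_analog = [K, K * zero]
a_analog = [1.0, pole]

# Tustin mapping s -> 2*fs*(1 - z^-1)/(1 + z^-1) gives the digital coefficients,
# i.e. the difference equation the digital loop would execute each sample.
b_dig, a_dig = signal.bilinear(b_analog, a_analog, fs)
print("digital numerator  :", np.round(b_dig, 4))
print("digital denominator:", np.round(a_dig, 4))

# Run the digital compensator over a sampled pickoff signal (here a test step).
x = np.ones(64)
y = signal.lfilter(b_dig, a_dig, x)
print("step response (first 5 samples):", np.round(y[:5], 3))
```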

  9. The trade-off between morphology and control in the co-optimized design of robots.

    PubMed

    Rosendo, Andre; von Atzigen, Marco; Iida, Fumiya

    2017-01-01

    Conventionally, robot morphologies are developed through simulations and calculations, and different control methods are applied afterwards. Assuming that simulations and predictions are simplified representations of our reality, how sure can roboticists be that the chosen morphology is the most adequate for the possible control choices in the real-world? Here we study the influence of the design parameters in the creation of a robot with a Bayesian morphology-control (MC) co-optimization process. A robot autonomously creates child robots from a set of possible design parameters and uses Bayesian Optimization (BO) to infer the best locomotion behavior from real world experiments. Then, we systematically change from an MC co-optimization to a control-only (C) optimization, which better represents the traditional way that robots are developed, to explore the trade-off between these two methods. We show that although C processes can greatly improve the behavior of poor morphologies, such agents are still outperformed by MC co-optimization results with as few as 25 iterations. Our findings, on one hand, suggest that BO should be used in the design process of robots for both morphological and control parameters to reach optimal performance, and on the other hand, point to the downfall of current design methods in face of new search techniques.
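
    A compact sketch of the Bayesian optimization loop described above (Gaussian-process surrogate plus expected improvement) is given below on a toy one-dimensional "locomotion score"; the objective function, parameter range, and kernel choice are stand-ins for the real child-robot experiments.

```python
# Compact Bayesian-optimization loop (GP surrogate + expected improvement)
# on a toy 1-D "locomotion score"; the objective is a stand-in for the real
# child-robot experiments described in the record.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def locomotion_score(theta):               # hypothetical morphology parameter
    return np.sin(3 * theta) * (1 - theta) + 0.05 * np.random.randn()

def expected_improvement(candidates, gp, best_y, xi=0.01):
    mu, sigma = gp.predict(candidates.reshape(-1, 1), return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best_y - xi) / sigma
    return (mu - best_y - xi) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(1)
X = list(rng.uniform(0, 1, size=3))        # a few random seed "experiments"
Y = [locomotion_score(x) for x in X]
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-3, normalize_y=True)

for _ in range(25):                        # 25 iterations, as in the record
    gp.fit(np.array(X).reshape(-1, 1), np.array(Y))
    grid = np.linspace(0, 1, 200)
    x_next = grid[np.argmax(expected_improvement(grid, gp, max(Y)))]
    X.append(x_next)
    Y.append(locomotion_score(x_next))

best = int(np.argmax(Y))
print(f"best parameter {X[best]:.3f} with score {Y[best]:.3f}")
```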

  10. The trade-off between morphology and control in the co-optimized design of robots

    PubMed Central

    Iida, Fumiya

    2017-01-01

    Conventionally, robot morphologies are developed through simulations and calculations, and different control methods are applied afterwards. Assuming that simulations and predictions are simplified representations of our reality, how sure can roboticists be that the chosen morphology is the most adequate for the possible control choices in the real-world? Here we study the influence of the design parameters in the creation of a robot with a Bayesian morphology-control (MC) co-optimization process. A robot autonomously creates child robots from a set of possible design parameters and uses Bayesian Optimization (BO) to infer the best locomotion behavior from real world experiments. Then, we systematically change from an MC co-optimization to a control-only (C) optimization, which better represents the traditional way that robots are developed, to explore the trade-off between these two methods. We show that although C processes can greatly improve the behavior of poor morphologies, such agents are still outperformed by MC co-optimization results with as few as 25 iterations. Our findings, on one hand, suggest that BO should be used in the design process of robots for both morphological and control parameters to reach optimal performance, and on the other hand, point to the downfall of current design methods in face of new search techniques. PMID:29023482

  11. Configurable analog-digital conversion using the neural engineering framework

    PubMed Central

    Mayr, Christian G.; Partzsch, Johannes; Noack, Marko; Schüffny, Rene

    2014-01-01

    Efficient Analog-Digital Converters (ADC) are one of the mainstays of mixed-signal integrated circuit design. Besides the conventional ADCs used in mainstream ICs, there have been various attempts in the past to utilize neuromorphic networks to accomplish an efficient crossing between analog and digital domains, i.e., to build neurally inspired ADCs. Generally, these have suffered from the same problems as conventional ADCs, that is they require high-precision, handcrafted analog circuits and are thus not technology portable. In this paper, we present an ADC based on the Neural Engineering Framework (NEF). It carries out a large fraction of the overall ADC process in the digital domain, i.e., it is easily portable across technologies. The analog-digital conversion takes full advantage of the high degree of parallelism inherent in neuromorphic networks, making for a very scalable ADC. In addition, it has a number of features not commonly found in conventional ADCs, such as a runtime reconfigurability of the ADC sampling rate, resolution and transfer characteristic. PMID:25100933
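
    A software caricature of the NEF encode/decode idea (not the reported mixed-signal circuit) is sketched below: a population of rectified-linear tuning curves with random encoders represents an analog value, and regularized least-squares decoders recover it as a weighted sum that a digital back end could evaluate; all population parameters are invented.

```python
# Minimal NEF-style encode/decode sketch (a software caricature of the idea in
# the record, not the reported circuit): random tuning curves encode an analog
# value, and least-squares linear decoders reconstruct it from the rates.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 64
encoders = rng.choice([-1.0, 1.0], size=n_neurons)      # preferred directions
gains = rng.uniform(0.5, 2.0, size=n_neurons)
biases = rng.uniform(-1.0, 1.0, size=n_neurons)

def rates(x):
    """Rectified-linear tuning curves for scalar inputs x in [-1, 1]."""
    return np.maximum(0.0, gains * (encoders * np.atleast_1d(x)[:, None]) + biases)

# Solve for linear decoders d on training samples: A d ~= x (regularized lstsq).
x_train = np.linspace(-1, 1, 200)
A = rates(x_train)
reg = 0.01 * n_neurons * np.eye(n_neurons)
decoders = np.linalg.solve(A.T @ A + reg, A.T @ x_train)

# The "digital" reconstruction is just a weighted sum of the neuron activities.
x_test = np.array([-0.7, -0.2, 0.33, 0.9])
x_hat = rates(x_test) @ decoders
print(np.round(x_hat, 3))      # should track x_test closely
```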

  12. Comparative Kinetic Study and Microwaves Non-Thermal Effects on the Formation of Poly(amic acid) 4,4′-(Hexafluoroisopropylidene)diphthalic Anhydride (6FDA) and 4,4′-(Hexafluoroisopropylidene)bis(p-phenyleneoxy)dianiline (BAPHF). Reaction Activated by Microwave, Ultrasound and Conventional Heating

    PubMed Central

    Tellez, Hugo Mendoza; Alquisira, Joaquín Palacios; Alonso, Carlos Rius; Cortés, José Guadalupe López; Toledano, Cecilio Alvarez

    2011-01-01

    Green chemistry is the design of chemical processes that reduce or eliminate negative environmental impacts. The use and production of chemicals involve the reduction of waste products, non-toxic components, and improved efficiency. Green chemistry applies innovative scientific solutions in the use of new reagents, catalysts and non-classical modes of activation such as ultrasounds or microwaves. Kinetic behavior and non-thermal effect of poly(amic acid) synthesized from (6FDA) dianhydride and (BAPHF) diamine in a low microwave absorbing p-dioxane solvent at low temperature of 30, 50, 70 °C were studied, under conventional heating (CH), microwave (MW) and ultrasound irradiation (US). Results show that the polycondensation rate decreases (MW > US > CH) and that the increased rates observed with US and MW are due to decreased activation energies of the Arrhenius equation. Rate constant for a chemical process activated by conventional heating declines proportionally as the induction time increases, however, this behavior is not observed under microwave and ultrasound activation. We can say that in addition to the thermal microwave effect, a non-thermal microwave effect is present in the system. PMID:22072913
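
    The activation-energy comparison rests on the Arrhenius relation k = A·exp(-Ea/RT); the sketch below shows the standard ln k versus 1/T fit at the three study temperatures, with made-up rate constants rather than the paper's data.

```python
# Arrhenius fit ln k = ln A - Ea/(R T) from rate constants at 30, 50 and 70 degC.
# The k values below are made-up illustrative numbers, not the paper's data.
import numpy as np

R = 8.314                                   # gas constant, J mol^-1 K^-1
T = np.array([30.0, 50.0, 70.0]) + 273.15   # study temperatures, K
k = np.array([1.2e-4, 4.1e-4, 1.2e-3])      # rate constants (assumed), s^-1

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R                             # activation energy, J/mol
A = np.exp(intercept)                       # pre-exponential factor

print(f"Ea ~ {Ea/1000:.1f} kJ/mol, A ~ {A:.2e} s^-1")
# A lower fitted Ea under MW or US activation is what the record attributes
# the observed rate enhancement to.
```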

  13. Comparative kinetic study and microwaves non-thermal effects on the formation of poly(amic acid) 4,4'-(hexafluoroisopropylidene)diphthalic anhydride (6FDA) and 4,4'-(hexafluoroisopropylidene)bis(p-phenyleneoxy)dianiline (BAPHF). Reaction activated by microwave, ultrasound and conventional heating.

    PubMed

    Tellez, Hugo Mendoza; Alquisira, Joaquín Palacios; Alonso, Carlos Rius; Cortés, José Guadalupe López; Toledano, Cecilio Alvarez

    2011-01-01

    Green chemistry is the design of chemical processes that reduce or eliminate negative environmental impacts. The use and production of chemicals involve the reduction of waste products, non-toxic components, and improved efficiency. Green chemistry applies innovative scientific solutions in the use of new reagents, catalysts and non-classical modes of activation such as ultrasounds or microwaves. Kinetic behavior and non-thermal effect of poly(amic acid) synthesized from (6FDA) dianhydride and (BAPHF) diamine in a low microwave absorbing p-dioxane solvent at low temperature of 30, 50, 70 °C were studied, under conventional heating (CH), microwave (MW) and ultrasound irradiation (US). Results show that the polycondensation rate decreases (MW > US > CH) and that the increased rates observed with US and MW are due to decreased activation energies of the Arrhenius equation. Rate constant for a chemical process activated by conventional heating declines proportionally as the induction time increases, however, this behavior is not observed under microwave and ultrasound activation. We can say that in addition to the thermal microwave effect, a non-thermal microwave effect is present in the system.

  14. Biostereometric Data Processing In ERGODATA: Choice Of Human Body Models

    NASA Astrophysics Data System (ADS)

    Pineau, J. C.; Mollard, R.; Sauvignon, M.; Amphoux, M.

    1983-07-01

    The definition of the human body models was developed from anthropometric data in ERGODATA. The first model reduces the human body to a series of points and lines. The second model is well suited to representing the volume of each segmental element. The third is an original model built from conventional anatomical points: each segment is defined in space by a triangular plane located by its 3-D coordinates. This new model supports all the processing needs of computer-aided design (C.A.D.) in ergonomics, as well as in biomechanics and orthopaedics.
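
    The third model's triangular-plane representation can be sketched directly: three anatomical landmark coordinates define a plane whose centroid and unit normal locate the segment in space; the coordinates and landmark names below are illustrative, not ERGODATA values.

```python
# Sketch of the "triangular plane" segment model: three anatomical landmarks
# (illustrative coordinates, in metres) define a plane located in space by its
# centroid and unit normal.
import numpy as np

p1 = np.array([0.12, 0.30, 1.05])    # e.g. acromion            (illustrative)
p2 = np.array([0.10, 0.28, 0.78])    # e.g. lateral epicondyle  (illustrative)
p3 = np.array([0.16, 0.33, 0.79])    # e.g. medial epicondyle   (illustrative)

normal = np.cross(p2 - p1, p3 - p1)
unit_normal = normal / np.linalg.norm(normal)
centroid = (p1 + p2 + p3) / 3.0
area = 0.5 * np.linalg.norm(normal)

print("segment centroid:", np.round(centroid, 3))
print("unit normal     :", np.round(unit_normal, 3))
print("triangle area   :", round(float(area), 4), "m^2")
```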

  15. LLNL Partners with IBM on Brain-Like Computing Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Essen, Brian

    Lawrence Livermore National Laboratory (LLNL) will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.

  16. LLNL Partners with IBM on Brain-Like Computing Chip

    ScienceCinema

    Van Essen, Brian

    2018-06-25

    Lawrence Livermore National Laboratory (LLNL) will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.

  17. Metal Coatings

    NASA Technical Reports Server (NTRS)

    1994-01-01

    During the Apollo Program, General Magnaplate Corporation developed process techniques for bonding dry lubricant coatings to space metals. The coatings were not susceptible to outgassing and offered enhanced surface hardness and superior resistance to corrosion and wear. This development was necessary because conventional lubrication processes were inadequate for lightweight materials used in Apollo components. General Magnaplate built on the original technology and became a leader in development of high performance metallurgical surface enhancement coatings - "synergistic" coatings, - which are used in applications from pizza making to laser manufacture. Each of the coatings is designed to protect a specific metal or group of metals to solve problems encountered under operating conditions.

  18. Multifunctional two-stage riser fluid catalytic cracking process.

    PubMed

    Zhang, Jinhong; Shan, Honghong; Chen, Xiaobo; Li, Chunyi; Yang, Chaohe

    This paper described how several shortcomings of the conventional fluid catalytic cracking (FCC) process were identified and presented the proposed two-stage riser (TSR) FCC process for decreasing dry gas and coke yields and increasing light oil yield, which has been successfully applied in 12 industrial units. Furthermore, the multifunctional two-stage riser (MFT) FCC process, proposed on the basis of the TSR FCC process, was described. It is realized by optimizing the reaction conditions for fresh feedstock and cycle oil catalytic cracking, respectively, by coupling cycle oil cracking with light FCC naphtha upgrading in the second-stage riser, and by a specially designed reactor for further reducing the olefin content of gasoline. The pilot test showed that it can further improve product quality, increase the diesel yield, and enhance the conversion of heavy oil.

  19. Toward Higher QA: From Parametric Release of Sterile Parenteral Products to PAT for Other Pharmaceutical Dosage Forms.

    PubMed

    Hock, Sia Chong; Constance, Neo Xue Rui; Wah, Chan Lai

    2012-01-01

    Pharmaceutical products are generally subjected to end-product batch testing as a means of quality control. Due to the inherent limitations of conventional batch testing, this is not the most ideal approach for determining the pharmaceutical quality of the finished dosage form. In the case of terminally sterilized parenteral products, the limitations of conventional batch testing have been successfully addressed with the application of parametric release (the release of a product based on control of process parameters instead of batch sterility testing at the end of the manufacturing process). Consequently, there has been an increasing interest in applying parametric release to other pharmaceutical dosage forms, beyond terminally sterilized parenteral products. For parametric release to be possible, manufacturers must be capable of designing quality into the product, monitoring the manufacturing processes, and controlling the quality of intermediates and finished products in real-time. Process analytical technology (PAT) has been thought to be capable of contributing to these prerequisites. It is believed that the appropriate use of PAT tools can eventually lead to the possibility of real-time release of other pharmaceutical dosage forms, by-passing the need for end-product batch testing. Hence, this literature review attempts to present the basic principles of PAT, introduce the various PAT tools that are currently available, present their recent applications to pharmaceutical processing, and explain the potential benefits that PAT can bring to conventional ways of processing and quality assurance of pharmaceutical products. Last but not least, current regulations governing the use of PAT and the manufacturing challenges associated with PAT implementation are also discussed.

  20. Comparison of machinability of manganese alloyed austempered ductile iron produced using conventional and two step austempering processes

    NASA Astrophysics Data System (ADS)

    Hegde, Ananda; Sharma, Sathyashankara

    2018-05-01

    Austempered Ductile Iron (ADI) is a revolutionary material with high strength and hardness combined with optimum ductility and toughness. The discovery of the two-step austempering process has led to a superior combination of all the mechanical properties. However, because of the high strength and hardness of ADI, there is a concern regarding its machinability. In the present study, the machinability of ADI produced using conventional and two-step heat treatment processes is assessed using tool life and surface roughness. Speed, feed and depth of cut are considered as the machining parameters in the dry turning operation. The machinability results along with the mechanical properties are compared for ADI produced using both conventional and two-step austempering processes. The results show that the two-step austempering process produced better toughness with good hardness and strength without sacrificing ductility. The addition of 0.64 wt% manganese did not have any detrimental effect on the machinability of ADI in either the conventional or the two-step process. Marginal improvements in tool life and surface roughness were observed in the two-step process compared with the conventional process.

  1. The Effect of Surfactant Content over Cu-Ni Coatings Electroplated by the sc-CO2 Technique

    PubMed Central

    Chuang, Ho-Chiao; Sánchez, Jorge; Cheng, Hsiang-Yun

    2017-01-01

    Co-plating of Cu-Ni coatings by supercritical CO2 (sc-CO2) and conventional electroplating processes was studied in this work. 1,4-butynediol was chosen as the surfactant and the effects of adjusting the surfactant content were described. Although the sc-CO2 process displayed lower current efficiency, it effectively removed excess hydrogen that causes defects on the coating surface, refined grain size, reduced surface roughness, and increased electrochemical resistance. Surface roughness of coatings fabricated by the sc-CO2 process was reduced by an average of 10%, and a maximum of 55%, compared to conventional process at different fabrication parameters. Cu-Ni coatings produced by the sc-CO2 process displayed increased corrosion potential of ~0.05 V over Cu-Ni coatings produced by the conventional process, and 0.175 V over pure Cu coatings produced by the conventional process. For coatings ~10 µm thick, internal stress developed from the sc-CO2 process were ~20 MPa lower than conventional process. Finally, the preferred crystal orientation of the fabricated coatings remained in the (111) direction regardless of the process used or surfactant content. PMID:28772787

  2. Towards non-conventional methods of designing register-based epidemiological studies: An application to pediatric research.

    PubMed

    Gong, Tong; Brew, Bronwyn; Sjölander, Arvid; Almqvist, Catarina

    2017-07-01

    Various epidemiological designs have been applied to investigate the causes and consequences of fetal growth restriction in register-based observational studies. This review seeks to provide an overview of several conventional designs, including cohort, case-control and more recently applied non-conventional designs such as family-based designs. We also discuss some practical points regarding the application and interpretation of family-based designs. Definitions of each design, the study population, the exposure and the outcome measures are briefly summarised. Examples of study designs are taken from the field of low birth-weight research for illustrative purposes. Also examined are relative advantages and disadvantages of each design in terms of assumptions, potential selection and information bias, confounding and generalisability. Kinship data linkage, statistical models and result interpretation are discussed specific to family-based designs. When all information is retrieved from registers, there is no evident preference of the case-control design over the cohort design to estimate odds ratios. All conventional designs included in the review are prone to bias, particularly due to residual confounding. Family-based designs are able to reduce such bias and strengthen causal inference. In the field of low birth-weight research, family-based designs have been able to confirm a negative association not confounded by genetic or shared environmental factors between low birth weight and the risk of asthma. We conclude that there is a broader need for family-based design in observational research as evidenced by the meaningful contributions to the understanding of the potential causal association between low birth weight and subsequent outcomes.

  3. Heat sink structural design concepts for a hypersonic research airplane

    NASA Technical Reports Server (NTRS)

    Taylor, A. H.; Jackson, L. R.

    1977-01-01

    Hypersonic research aircraft design requires careful consideration of thermal stresses. This paper relates some of the problems in a heat sink structural design that can be avoided by appropriate selection of design options including material selection, design concepts, and load paths. Data on several thermal loading conditions are presented on various conventional designs including bulkheads, longerons, fittings, and frames. Results indicate that conventional designs are inadequate and that acceptable designs are possible by incorporating innovative design practices. These include nonintegral pressure compartments, ball-jointed links to distribute applied loads without restraining the thermal expansion, and material selections based on thermal compatibility.

  4. Integrated Software Development System/Higher Order Software Conceptual Description (ISDS/HOS)

    DTIC Science & Technology

    1976-11-01

    Excerpts only: the report covers structured flowchart conventions and design diagram notation; HIPO diagrams that reference the process steps as well as non-HIPO documentation such as flowcharts or decision tables; and a syntax that is easy to learn and prompts the novice to avoid classic beginner errors, together with desirable editing capabilities.

  5. A manufacturing database of advanced materials used in spacecraft structures

    NASA Technical Reports Server (NTRS)

    Bao, Han P.

    1994-01-01

    Cost savings opportunities over the life cycle of a product are highest in the early exploratory phase when different design alternatives are evaluated not only for their performance characteristics but also their methods of fabrication which really control the ultimate manufacturing costs of the product. In the past, Design-To-Cost methodologies for spacecraft design concentrated on the sizing and weight issues more than anything else at the early so-called 'Vehicle Level' (Ref: DOD/NASA Advanced Composites Design Guide). Given the impact of manufacturing cost, the objective of this study is to identify the principal cost drivers for each materials technology and propose a quantitative approach to incorporating these cost drivers into the family of optimization tools used by the Vehicle Analysis Branch of NASA LaRC to assess various conceptual vehicle designs. The advanced materials being considered include aluminum-lithium alloys, thermoplastic graphite-polyether etherketone composites, graphite-bismaleimide composites, graphite- polyimide composites, and carbon-carbon composites. Two conventional materials are added to the study to serve as baseline materials against which the other materials are compared. These two conventional materials are aircraft aluminum alloys series 2000 and series 7000, and graphite-epoxy composites T-300/934. The following information is available in the database. For each material type, the mechanical, physical, thermal, and environmental properties are first listed. Next the principal manufacturing processes are described. Whenever possible, guidelines for optimum processing conditions for specific applications are provided. Finally, six categories of cost drivers are discussed. They include, design features affecting processing, tooling, materials, fabrication, joining/assembly, and quality assurance issues. It should be emphasized that this database is not an exhaustive database. Its primary use is to make the vehicle designer aware of some of the most important aspects of manufacturing associated with his/her choice of the structural materials. The other objective of this study is to propose a quantitative method to determine a Manufacturing Complexity Factor (MCF) for each material being contemplated. This MCF is derived on the basis of the six cost drivers mentioned above plus a Technology Readiness Factor which is very closely related to the Technology Readiness Level (TRL) as defined in the Access To Space final report. Short of any manufacturing information, our MCF is equivalent to the inverse of TRL. As more manufacturing information is available, our MCF is a better representation (than TRL) of the fabrication processes involved. The most likely application for MCF is in cost modeling for trade studies. On-going work is being pursued to expand the potential applications of MCF.

  6. A manufacturing database of advanced materials used in spacecraft structures

    NASA Astrophysics Data System (ADS)

    Bao, Han P.

    1994-12-01

    Cost savings opportunities over the life cycle of a product are highest in the early exploratory phase when different design alternatives are evaluated not only for their performance characteristics but also their methods of fabrication which really control the ultimate manufacturing costs of the product. In the past, Design-To-Cost methodologies for spacecraft design concentrated on the sizing and weight issues more than anything else at the early so-called 'Vehicle Level' (Ref: DOD/NASA Advanced Composites Design Guide). Given the impact of manufacturing cost, the objective of this study is to identify the principal cost drivers for each materials technology and propose a quantitative approach to incorporating these cost drivers into the family of optimization tools used by the Vehicle Analysis Branch of NASA LaRC to assess various conceptual vehicle designs. The advanced materials being considered include aluminum-lithium alloys, thermoplastic graphite-polyether etherketone composites, graphite-bismaleimide composites, graphite- polyimide composites, and carbon-carbon composites. Two conventional materials are added to the study to serve as baseline materials against which the other materials are compared. These two conventional materials are aircraft aluminum alloys series 2000 and series 7000, and graphite-epoxy composites T-300/934. The following information is available in the database. For each material type, the mechanical, physical, thermal, and environmental properties are first listed. Next the principal manufacturing processes are described. Whenever possible, guidelines for optimum processing conditions for specific applications are provided. Finally, six categories of cost drivers are discussed. They include, design features affecting processing, tooling, materials, fabrication, joining/assembly, and quality assurance issues. It should be emphasized that this database is not an exhaustive database. Its primary use is to make the vehicle designer aware of some of the most important aspects of manufacturing associated with his/her choice of the structural materials. The other objective of this study is to propose a quantitative method to determine a Manufacturing Complexity Factor (MCF) for each material being contemplated. This MCF is derived on the basis of the six cost drivers mentioned above plus a Technology Readiness Factor which is very closely related to the Technology Readiness Level (TRL) as defined in the Access To Space final report. Short of any manufacturing information, our MCF is equivalent to the inverse of TRL. As more manufacturing information is available, our MCF is a better representation (than TRL) of the fabrication processes involved.

  7. Machinability of nickel based alloys using electrical discharge machining process

    NASA Astrophysics Data System (ADS)

    Khan, M. Adam; Gokul, A. K.; Bharani Dharan, M. P.; Jeevakarthikeyan, R. V. S.; Uthayakumar, M.; Thirumalai Kumaran, S.; Duraiselvam, M.

    2018-04-01

    High temperature materials such as nickel based alloys and austenitic steels are frequently used for manufacturing critical aero engine turbine components. Literature on the conventional and unconventional machining of steel materials has been abundant over the past three decades. However, machining studies on superalloys remain a challenging task because of their inherent properties and quality, and these materials are difficult to cut with conventional processes. This research therefore focuses on an unconventional machining process for nickel alloys. Inconel 718 and Monel 400 are the two candidate materials used for the electrical discharge machining (EDM) process. The investigation is to prepare a blind hole using a copper electrode of 6 mm diameter. Electrical parameters are varied to produce the plasma spark for the diffusion process, and the machining time is held constant so that the experimental results for both materials can be compared. The influence of the process parameters on the tool wear mechanism and material removal is considered in the proposed experimental design. While machining, the tool is prone to discharging more material due to the production of a high energy plasma spark and the eddy current effect. The surface morphology of the machined surface was observed with a high resolution FE-SEM; fused electrode material was found as spherical clumps over the machined surface. Surface roughness was also measured from the surface profile using a profilometer. It is confirmed that there is no deviation and that the precise roundness of the drilled hole is maintained.
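
    As a hypothetical post-processing step for the kind of trial described here, material removal rate and electrode wear rate can be estimated from the mass lost by the workpiece and by the tool over the fixed machining time. The masses, densities and time in the sketch below are assumed example values, not data from the study.

      # Hypothetical post-processing sketch for an EDM trial of this kind:
      # compute material removal rate and electrode (tool) wear rate from weighed
      # masses before and after a fixed machining time. All values are assumed.
      def removal_rate_mm3_per_min(mass_before_g, mass_after_g, density_g_cm3, time_min):
          """Volumetric removal rate in mm^3/min from a mass loss measurement."""
          volume_mm3 = (mass_before_g - mass_after_g) / density_g_cm3 * 1000.0
          return volume_mm3 / time_min

      # Example: Inconel 718 workpiece (density ~8.19 g/cm^3) machined for 20 min
      mrr = removal_rate_mm3_per_min(152.40, 151.95, 8.19, 20.0)
      twr = removal_rate_mm3_per_min(35.20, 35.14, 8.96, 20.0)   # copper electrode
      print(f"MRR ≈ {mrr:.2f} mm^3/min, tool wear rate ≈ {twr:.2f} mm^3/min")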

  8. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question is whether, when domain experts analyze their data, they can completely trust the outputs and operations on the machine side. Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on analyzing the relationships between familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists where we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust that is comparable with that in the conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on trustworthiness of visual analytic systems.

  9. An in-mold packaging process for plastic fluidic devices.

    PubMed

    Yoo, Y E; Lee, K H; Je, T J; Choi, D S; Kim, S K

    2011-01-01

    Micro or nanofluidic devices have many channel shapes to deliver chemical solutions, body fluids or any fluids. The channels in these devices should be covered to prevent the fluids from overflowing or leaking. A typical method to fabricate an enclosed channel is to bond or weld a cover plate to a channel plate. This solid-to-solid bonding process, however, takes a considerable amount of time for mass production. In this study, a new process for molding a cover layer that can enclose open micro or nanochannels without solid-to-solid bonding is proposed and its feasibility is estimated. First, based on the design of a model microchannel, a brass microchannel master core was machined and a plastic microchannel platform was injection-molded. Using this molded platform, a series of experiments was performed for four process or mold design parameters. Some feasible conditions were successfully found to enclose the channels without filling the microchannels during the injection molding of a cover layer over the plastic microchannel platform. In addition, the bond strength and seal performance were estimated in comparison with those obtained by conventional bonding or welding processes.

  10. Dry-grind processing using amylase corn and superior yeast to reduce the exogenous enzyme requirements in bioethanol production.

    PubMed

    Kumar, Deepak; Singh, Vijay

    2016-01-01

    The conventional corn dry-grind ethanol production process requires exogenous alpha-amylase and glucoamylase enzymes to break down starch into glucose, which is fermented to ethanol by yeast. This study evaluates the potential use of new genetically engineered corn and yeast, which can eliminate or minimize the use of these external enzymes, improve the economics and process efficiencies, and simplify the process. An approach of in situ ethanol removal during fermentation was also investigated for its potential to improve the efficiency of high-solid fermentation, which can significantly reduce the downstream ethanol and co-product recovery cost. The fermentation of amylase corn (producing endogenous α-amylase) using conventional yeast and no addition of exogenous α-amylase resulted in an ethanol concentration 4.1 % higher than the control treatment (conventional corn using exogenous α-amylase). Conventional corn processed with exogenous α-amylase and superior yeast (producing glucoamylase or GA) with no exogenous glucoamylase addition resulted in an ethanol concentration similar to the control treatment (conventional yeast with exogenous glucoamylase addition). The combination of amylase corn and superior yeast required only 25 % of the recommended glucoamylase dose to complete fermentation and achieve an ethanol concentration and yield similar to the control treatment (conventional corn with exogenous α-amylase, conventional yeast with exogenous glucoamylase). Use of superior yeast with 50 % GA addition resulted in similar yield increases of approximately 7 % for conventional or amylase corn compared to the control treatment. The combination of amylase corn, superior yeast, and in situ ethanol removal resulted in a process that allowed complete fermentation of 40 % slurry solids with only 50 % of the exogenous GA enzyme requirements and a 64.6 % higher ethanol yield compared to the conventional process. Use of amylase corn and superior yeast in the dry-grind processing industry can reduce the total external enzyme usage by more than 80 %, and combining their use with in situ removal of ethanol during fermentation allows efficient high-solid fermentation.

  11. DESIGN REPORT: LOW-NOX BURNERS FOR PACKAGE BOILERS

    EPA Science Inventory

    The report describes a low-NOx burner design, presented for residual-oil-fired industrial boilers and boilers cofiring conventional fuels and nitrated hazardous wastes. The burner offers lower NOx emission levels for these applications than conventional commercial burners. The bu...

  12. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
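
    A minimal sketch of the first step the record describes: threshold detection of discrete events followed by simple point-process statistics on the interevent intervals. The synthetic signal, sampling rate and threshold below are assumptions, and the stationarity tests mentioned in the record are not implemented.

      # Minimal sketch (not the authors' implementation): detect threshold crossings
      # in a sampled signal and summarize the resulting point process by its
      # interevent intervals, the kind of first-pass analysis the record describes.
      import numpy as np

      def detect_events(signal, threshold, fs):
          """Return event times (s) at upward threshold crossings of a sampled signal."""
          above = signal >= threshold
          crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1  # rising edges
          return crossings / fs

      def interval_stats(event_times):
          """Mean rate and coefficient of variation of the interevent intervals."""
          isi = np.diff(event_times)
          return {"rate_hz": 1.0 / isi.mean(), "cv": isi.std() / isi.mean()}

      # Synthetic demonstration: noisy spikes at roughly 10 Hz, sampled at 1 kHz
      fs = 1000.0
      t = np.arange(0, 5, 1 / fs)
      signal = np.random.normal(0, 0.1, t.size)
      signal[(np.arange(t.size) % 100) == 0] += 1.0        # one spike every 100 samples
      events = detect_events(signal, threshold=0.5, fs=fs)
      print(len(events), interval_stats(events))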

  13. An on-board processing satellite payload for European mobile communications

    NASA Astrophysics Data System (ADS)

    Evans, B. G.; Casewell, I. E.; Craig, A. D.

    1987-06-01

    An examination of the use of satellite on-board processing (OBP) for land mobile applications shows the feasibility of designing an OBP payload to satisfy the functional requirements of the land mobile system projected for the 1990s. Following a discussion of the proposed land mobile system, advantages of OBP over conventional transport payloads are considered. The use of digital signal processing techniques is shown to provide a solution for the merging of the routing and transmultiplexing functions into a single element, and such techniques are ideally suited to space applications. It is suggested that the projected power, mass, and size estimates are compatible with the payload capacity of one of the large Olympus satellites.

  14. High-performance polymer/layered silicate nanocomposites

    NASA Astrophysics Data System (ADS)

    Heidecker, Matthew J.

    High-performance layered-silicate nanocomposites of Polycarbonate (PC), poly(ethylene terephthalate) (PET), and their blends were produced via conventional melt-blending techniques. The focus of this thesis was on the fundamentals of dispersion, control of thermal stability, maintenance of melt-blending processing conditions, and on optimization of the composites' mechanical properties via the design of controlled and thermodynamically favorable nano-filler dispersions within the polymer matrices. PET and PC require high temperatures for melt-processing, rendering impractical the use of conventional/commercial organically-modified layered-silicates, since the thermal degradation temperatures of their ammonium surfactants lies below the typical processing temperatures. Thus, different surfactant chemistries must be employed in order to develop melt-processable nanocomposites, also accounting for polymer matrix degradation due to water (PET) or amine compounds (PC). Novel high thermal-stability surfactants were developed and employed in montmorillonite nanocomposites of PET, PC, and PC/PET blends, and were compared to the respective nanocomposites based on conventional quaternary-ammonium modified montmorillonites. Favorable dispersion was achieved in all cases, however, the overall material behavior -- i.e., the combination of crystallization, mechanical properties, and thermal degradation -- was better for the nanocomposites based on the thermally-stable surfactant fillers. Studies were also done to trace, and ultimately limit, the matrix degradation of Polycarbonate/montmorillonite nanocomposites, through varying the montmorillonite surfactant chemistry, processing conditions, and processing additives. Molecular weight degradation was, maybe surprisingly, better controlled in the conventional quaternary ammonium based nanocomposites -- even though the thermal stability of the organically modified montmorillonites was in most cases the lowest. Dependence of the resultant nanocomposites' mechanical properties on the preferential alignment of the montmorillonite nano-platelet was also evaluated. Highly aligned filler platelets did not result in an additional enhancement in mechanical properties. PC/PET blends and their respective PC/PET/montmorillonite nanocomposites were synthesized and compared. The dispersion of the organically modified nano-fillers in the PC/PET blends was controlled via thermodynamic considerations, realized through proper surfactant choice: Nanocomposites in which the layered silicate was preferentially sequestered in the PET phase were designed and synthesized. This preferential dispersion of the nano-filler in the PET phase of the PC/PET blend was insensitive to processing conditions, including approaches employing a master-batch (filler concentrate); regardless of the master-batch matrix, both PC and PET were employed, thermodynamics drove the layered silicate to preferentially migrate to the PET phase of the PC/PET blend. In a second approach, the development of a nanocomposite with controlled PC/PET compatibilization near the montmorillonite platelets, in absence of appreciable transesterification reactions, led to the formation of very high performance nanocomposites. 
These latter systems point to an exciting new avenue of future considerations for nanocomposite blends with selective nano-filler dispersions, where performance can be tailored via the controlled preferential dispersion of nano-fillers in one phase, or by filler-induced polymer compatibilization.

  15. Intrahospital teleradiology from the emergency room

    NASA Astrophysics Data System (ADS)

    Fuhrman, Carl R.; Slasky, B. S.; Gur, David; Lattner, Stefanie; Herron, John M.; Plunkett, Michael B.; Towers, Jeffrey D.; Thaete, F. Leland

    1993-09-01

    Off-hour operations of the modern emergency room present a challenge to conventional image management systems. To assess the utility of intrahospital teleradiology systems from the emergency room (ER), we installed a high-resolution film digitizer which was interfaced to a central archive and to a workstation at the main reading room. The system was designed to allow for digitization of images as soon as the films were processed. Digitized images were autorouted to both destinations, and digitized images could be laser printed (if desired). Almost real time interpretations of nonselected cases were performed at both locations (conventional film in the ER and a workstation in the main reading room), and an analysis of disagreements was performed. Our results demonstrate that in spite of a `significant' difference in reporting, `clinically significant differences' were found in less than 5% of cases. Folder management issues, preprocessing, image orientation, and setting reasonable lookup tables for display were identified as the main limitations to the systems' routine use in a busy environment. The main limitation of the conventional film was the identification of subtle abnormalities in the bright regions of the film. Once identified on either system (conventional film or soft display), all abnormalities were visible and detectable on both display modalities.

  16. Comparison of conventional rule based flow control with control processes based on fuzzy logic in a combined sewer system.

    PubMed

    Klepiszewski, K; Schmitt, T G

    2002-01-01

    While conventional rule based, real time flow control of sewer systems is in common use, control systems based on fuzzy logic have been used only rarely, but successfully. The intention of this study is to compare a conventional rule based control of a combined sewer system with a fuzzy logic control by using hydrodynamic simulation. The objective of both control strategies is to reduce the combined sewer overflow volume by an optimization of the utilized storage capacities of four combined sewer overflow tanks. The control systems affect the outflow of the four combined sewer overflow tanks depending on the water levels inside the structures. Both systems use an identical rule base. The developed control systems are tested and optimized for a single storm event that imposes heterogeneous hydraulic load conditions and local discharges. Finally, the efficiencies of the two different control systems are compared for two more storm events. The results indicate that the conventional rule based control and the fuzzy control reach the objective of the control strategy equally well. In spite of the higher expense of designing the fuzzy control system, its use provides no advantages in this case.
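
    As a much-simplified illustration of the two controller types compared in the paper (not the four-tank controllers themselves), the sketch below maps the fill level of a single hypothetical overflow tank to an outflow setting, once with crisp step rules and once with a fuzzy-style interpolation between the same rule points. Levels, rule points and outputs are assumed.

      # Simplified single-tank sketch (hypothetical, not the paper's controllers):
      # both controllers map the relative water level in an overflow tank to an
      # outflow setting; the crisp version uses step rules, the fuzzy-style version
      # interpolates smoothly between the same rule points.
      import numpy as np

      def crisp_rule_control(level):
          """Crisp rule base: level is the fill fraction of the tank (0..1)."""
          if level < 0.3:
              return 0.2   # low fill: throttle outflow, spare downstream capacity
          elif level < 0.7:
              return 0.6   # medium fill: moderate outflow
          return 1.0       # near full: maximum outflow to avoid overflow

      def fuzzy_style_control(level):
          """Fuzzy-style controller: smooth interpolation between the same rule points."""
          return float(np.interp(level, [0.0, 0.3, 0.7, 1.0], [0.2, 0.2, 0.6, 1.0]))

      for lvl in (0.1, 0.45, 0.69, 0.71, 0.95):
          print(f"level={lvl:.2f}  crisp={crisp_rule_control(lvl):.2f}  "
                f"fuzzy={fuzzy_style_control(lvl):.2f}")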

  17. Feedstock powder processing research needs for additive manufacturing development

    DOE PAGES

    Anderson, Iver E.; White, Emma M. H.; Dehoff, Ryan

    2018-02-01

    Additive manufacturing (AM) promises to redesign traditional manufacturing by enabling the ultimate in agility for rapid component design changes in commercial products and for fabricating complex integrated parts. Here, by significantly increasing quality and yield of metallic alloy powders, the pace for design, development, and deployment of the most promising AM approaches can be greatly accelerated, resulting in rapid commercialization of these advanced manufacturing methods. By successful completion of a critical suite of processing research tasks that are intended to greatly enhance gas atomized powder quality and the precision and efficiency of powder production, researchers can help promote continued rapid growth of AM. Finally, other powder-based or spray-based advanced manufacturing methods could also benefit from these research outcomes, promoting the next wave of sustainable manufacturing technologies for conventional and advanced materials.

  18. Feedstock powder processing research needs for additive manufacturing development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Iver E.; White, Emma M. H.; Dehoff, Ryan

    Additive manufacturing (AM) promises to redesign traditional manufacturing by enabling the ultimate in agility for rapid component design changes in commercial products and for fabricating complex integrated parts. Here, by significantly increasing quality and yield of metallic alloy powders, the pace for design, development, and deployment of the most promising AM approaches can be greatly accelerated, resulting in rapid commercialization of these advanced manufacturing methods. By successful completion of a critical suite of processing research tasks that are intended to greatly enhance gas atomized powder quality and the precision and efficiency of powder production, researchers can help promote continued rapid growth of AM. Finally, other powder-based or spray-based advanced manufacturing methods could also benefit from these research outcomes, promoting the next wave of sustainable manufacturing technologies for conventional and advanced materials.

  19. Neural Networks for Rapid Design and Analysis

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Maghami, Peiman G.

    1998-01-01

    Artificial neural networks have been employed for rapid and efficient dynamics and control analysis of flexible systems. Specifically, feedforward neural networks are designed to approximate nonlinear dynamic components over prescribed input ranges, and are used in simulations as a means to speed up the overall time response analysis process. To capture the recursive nature of dynamic components with artificial neural networks, recurrent networks, which use state feedback with the appropriate number of time delays, as inputs to the networks, are employed. Once properly trained, neural networks can give very good approximations to nonlinear dynamic components, and by their judicious use in simulations, allow the analyst the potential to speed up the analysis process considerably. To illustrate this potential speed up, an existing simulation model of a spacecraft reaction wheel system is executed, first conventionally, and then with an artificial neural network in place.
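
    The sketch below illustrates the general idea of this record with an assumed toy plant rather than the reaction-wheel model: a small feedforward network whose inputs include the delayed output (state feedback) is trained to approximate a nonlinear dynamic component and is then run recurrently on its own predictions. The plant equation, network size, learning rate and iteration count are all assumptions.

      # Hypothetical sketch of the idea in this record: approximate a nonlinear
      # dynamic component y[k] = f(y[k-1], u[k]) with a small feedforward network
      # whose inputs include the delayed output (state feedback), trained by plain
      # gradient descent. The reaction-wheel model itself is not reproduced here.
      import numpy as np

      rng = np.random.default_rng(0)

      def plant(y_prev, u):                        # "true" nonlinear dynamic component
          return 0.8 * np.tanh(y_prev) + 0.2 * u

      # Generate training data by driving the plant with random inputs
      u = rng.uniform(-1, 1, 2000)
      y = np.zeros_like(u)
      for k in range(1, u.size):
          y[k] = plant(y[k - 1], u[k])
      X = np.column_stack([y[:-1], u[1:]])         # inputs: delayed output + current input
      T = y[1:]                                    # target: next output

      # One hidden layer, tanh activation, trained with batch gradient descent
      W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
      W2 = rng.normal(0, 0.5, 16);      b2 = 0.0
      lr = 0.05
      for _ in range(3000):
          H = np.tanh(X @ W1 + b1)
          P = H @ W2 + b2
          err = P - T
          gW2 = H.T @ err / len(T); gb2 = err.mean()
          dH = np.outer(err, W2) * (1 - H**2)
          gW1 = X.T @ dH / len(T);  gb1 = dH.mean(axis=0)
          W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

      # Use the trained network recurrently: feed its own delayed output back in
      y_hat = 0.0
      for k in range(1, 50):
          h = np.tanh(np.array([y_hat, u[k]]) @ W1 + b1)
          y_hat = h @ W2 + b2
          if k % 10 == 0:
              print(f"k={k:2d}  plant={y[k]:+.3f}  network={y_hat:+.3f}")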

  20. Some inadequacies of the current human factors certification process of advanced aircraft technologies

    NASA Technical Reports Server (NTRS)

    Paries, Jean

    1994-01-01

    Automation related accidents or serious incidents are not limited to advanced technology aircraft. There is a full history of such accidents with conventional technology aircraft. However, this type of occurrence is far from sparing the newest 'glass cockpit' generation, and it even seems to be a growing contributor to its accident rate. Nevertheless, all these aircraft have been properly certificated according to the relevant airworthiness regulations. Therefore, there is a growing concern that with the technological advancement of air transport aircraft cockpits, the current airworthiness regulations addressing cockpit design and human factors may have reached some level of inadequacy. This paper reviews some aspects of the current airworthiness regulations and certification process related to human factors of cockpit design and focuses on questioning their ability to guarantee the intended safety objectives.

  1. The Importance of Considering Differences in Study Design in Network Meta-analysis: An Application Using Anti-Tumor Necrosis Factor Drugs for Ulcerative Colitis.

    PubMed

    Cameron, Chris; Ewara, Emmanuel; Wilson, Florence R; Varu, Abhishek; Dyrda, Peter; Hutton, Brian; Ingham, Michael

    2017-11-01

    Adaptive trial designs present a methodological challenge when performing network meta-analysis (NMA), as data from such adaptive trial designs differ from conventional parallel design randomized controlled trials (RCTs). We aim to illustrate the importance of considering study design when conducting an NMA. Three NMAs comparing anti-tumor necrosis factor drugs for ulcerative colitis were compared and the analyses replicated using Bayesian NMA. The NMA comprised 3 RCTs comparing 4 treatments (adalimumab 40 mg, golimumab 50 mg, golimumab 100 mg, infliximab 5 mg/kg) and placebo. We investigated the impact of incorporating differences in the study design among the 3 RCTs and presented 3 alternative methods on how to convert outcome data derived from one form of adaptive design to more conventional parallel RCTs. Combining RCT results without considering variations in study design resulted in effect estimates that were biased against golimumab. In contrast, using the 3 alternative methods to convert outcome data from one form of adaptive design to a format more consistent with conventional parallel RCTs facilitated more transparent consideration of differences in study design. This approach is more likely to yield appropriate estimates of comparative efficacy when conducting an NMA, which includes treatments that use an alternative study design. RCTs based on adaptive study designs should not be combined with traditional parallel RCT designs in NMA. We have presented potential approaches to convert data from one form of adaptive design to more conventional parallel RCTs to facilitate transparent and less-biased comparisons.

  2. Open Rotor Aeroacoustic Installation Effects for Conventional and Unconventional Airframes

    NASA Technical Reports Server (NTRS)

    Czech, Michael J.; Thomas, Russell H.

    2013-01-01

    An extensive experimental campaign was performed to study the aeroacoustic installation effects of an open rotor with respect to both a conventional tube and wing type airframe and an unconventional hybrid wing body airframe. The open rotor rig had two counter rotating rows of blades each with eight blades of a design originally flight tested in the 1980s. The aeroacoustic installation effects measured in an aeroacoustic wind tunnel included those from flow effects due to inflow distortion or wake interaction and acoustic propagation effects such as shielding and reflection. The objective of the test campaign was to quantify the installation effects for a wide range of parameters and configurations derived from the two airframe types. For the conventional airframe, the open rotor was positioned in increments in front of and then over the main wing and then in positions representative of tail mounted aircraft with a conventional tail, a T-tail and a U-tail. The interaction of the wake of the open rotor as well as acoustic scattering results in an increase of about 10 dB when the rotor is positioned in front of the main wing. When positioned over the main wing a substantial amount of noise reduction is obtained and this is also observed for tail-mounted installations with a large U-tail. For the hybrid wing body airframe, the open rotor was positioned over the airframe along the centerline as well as off-center representing a twin engine location. A primary result was the documentation of the noise reduction from shielding as a function of the location of the open rotor upstream of the trailing edge of the hybrid wing body. The effects from vertical surfaces and elevon deflection were also measured. Acoustic lining was specially designed and inserted flush with the elevon and airframe surface; the result was an additional reduction in open rotor noise propagating to the far field microphones. Even with the older blade design used, the experiment provided quantification of the aeroacoustic installation effects for a wide range of open rotor and airframe configurations and can be used with data processing methods to evaluate the aeroacoustic installation effects for open rotors with modern blade designs.

  3. Effect of oil content and kernel processing of corn silage on digestibility and milk production by dairy cows.

    PubMed

    Weiss, W P; Wyatt, D J

    2000-02-01

    Corn silages were produced from a high oil corn hybrid and from its conventional hybrid counterpart and were harvested with a standard silage chopper or a chopper equipped with a kernel processing unit. High oil silages had higher concentrations of fatty acids (5.5 vs. 3.4% of dry matter) and crude protein (8.4 vs. 7.5% of dry matter) than the conventional hybrid. Processed silage had larger particle size than unprocessed silage, but more starch was found in small particles for processed silage. Dry matter intake was not influenced by treatment (18.4 kg/d), but yield of fat-corrected milk (23.9 vs. 22.6 kg/d) was increased by feeding high oil silage. Overall, processing corn silage did not affect milk production, but cows fed processed conventional silage tended to produce more milk than did cows fed unprocessed conventional silage. Milk protein percent, but not yield, was reduced with high oil silage. Milk fat percent, but not yield, was higher with processed silage. Overall, processed silage had higher starch digestibility, but the response was much greater for the conventional silage hybrid. The concentration of total digestible nutrients (TDN) tended to be higher for diets with high oil silage (71.6 vs. 69.9%) and tended to be higher for processed silage than unprocessed silage (71.7 vs. 69.8%), but an interaction between variety and processing was observed. Processing conventional corn silage increased TDN to values similar to high oil corn silage but processing high oil corn silage did not influence TDN.

  4. First-order reliability application and verification methods for semistatic structures

    NASA Astrophysics Data System (ADS)

    Verderaime, V.

    1994-11-01

    Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored in conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments; stress audits are shown to be arbitrary and incomplete, and the concept compromises the performance of high-strength materials. A reliability method is proposed that combines first-order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety-index expression. The application is reduced to solving for a design factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this design factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and the development of semistatic structural designs.
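
    The abstract ties a specified reliability to a design factor through the classical safety-index expression. The worked example below uses the textbook first-order relations for normally distributed strength and stress with assumed coefficients of variation, and omits the uncertainty-error terms the paper adds; it illustrates the idea, not the paper's exact formulation.

      # Hedged worked example (textbook first-order relations, not the report's exact
      # formulation): with normally distributed strength R and stress S, the safety
      # index is beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2), and a target
      # reliability fixes beta through the standard normal CDF. The "design factor"
      # below is the central factor n = mu_R / mu_S needed to reach that beta, given
      # assumed coefficients of variation.
      import math

      def beta_from_reliability(reliability):
          """Safety index for a target reliability, via the inverse normal CDF."""
          # Newton iteration on Phi(beta) = reliability (avoids external libraries)
          beta = 3.0
          for _ in range(50):
              phi = 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))
              pdf = math.exp(-0.5 * beta * beta) / math.sqrt(2.0 * math.pi)
              beta -= (phi - reliability) / pdf
          return beta

      def design_factor(reliability, v_strength, v_stress):
          """Central safety factor n = mu_R/mu_S meeting the target reliability."""
          b = beta_from_reliability(reliability)
          a = 1.0 - (b * v_strength) ** 2
          c = 1.0 - (b * v_stress) ** 2
          return (1.0 + math.sqrt(1.0 - a * c)) / a

      # Example with assumed scatter: 5% strength and 10% stress coefficients of variation
      for rel in (0.999, 0.9999, 0.99999):
          print(rel, round(design_factor(rel, v_strength=0.05, v_stress=0.10), 3))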

  5. Solid Waste Management with Emphasis on Environmental Aspect

    NASA Astrophysics Data System (ADS)

    Sinha, Navin Kr.; Choudhary, Binod Kumar; Shree, Shalini

    2011-12-01

    This paper focuses on solid waste management, which comprises the purposeful and systematic control of the generation, storage, collection, transport, separation, processing, recycling, recovery and disposal of solid waste. Awareness of the Four R's and support from an environmental management system (EMS) also aid solid waste management. The Basel Convention on the Control of Transboundary Movements of Hazardous Wastes and their Disposal, usually known simply as the Basel Convention, is an international treaty designed to reduce the movement of hazardous waste between nations, and specifically to prevent the transfer of hazardous waste from developed to less developed countries (LDCs); it came into force on 5 May 1992. Under the Convention, wastes are "substances or objects which are disposed of or are intended to be disposed of or are required to be disposed of by the provisions of national law" (UNEP).

  6. Optimum processing of mammographic film.

    PubMed

    Sprawls, P; Kitts, E L

    1996-03-01

    Underprocessing of mammographic film can result in reduced contrast and visibility of breast structures and an unnecessary increase in radiation dose to the patient. Underprocessing can be caused by physical factors (low developer temperature, inadequate development time, insufficient developer agitation) or chemical factors (developer not optimized for film type; overdiluted, underreplenished, contaminated, or frequently changed developer). Conventional quality control programs are designed to produce consistent processing but do not address the issue of optimum processing. Optimum processing is defined as the level of processing that produces the film performance characteristics (contrast and sensitivity) specified by the film manufacturer. Optimum processing of mammographic film can be achieved by following a two-step protocol. The first step is to set up the processing conditions according to recommendations from the film and developer chemistry manufacturers. The second step is to verify the processing results by comparing them with sensitometric data provided by the film manufacturer.

  7. Digital vs. conventional implant prosthetic workflows: a cost/time analysis.

    PubMed

    Joda, Tim; Brägger, Urs

    2015-12-01

    The aim of this prospective cohort trial was to perform a cost/time analysis for implant-supported single-unit reconstructions in the digital workflow compared to the conventional pathway. A total of 20 patients were included for rehabilitation with 2 × 20 implant crowns in a crossover study design and treated consecutively each with customized titanium abutments plus CAD/CAM-zirconia-suprastructures (test: digital) and with standardized titanium abutments plus PFM-crowns (control: conventional). Starting with prosthetic treatment, the analysis covered clinical and laboratory work steps, including the measurement of costs in Swiss Francs (CHF), productivity rates and cost minimization for first-line therapy. Statistical calculations were performed with the Wilcoxon signed-rank test. Both protocols worked successfully for all test and control reconstructions. Direct treatment costs were significantly lower for the digital workflow (1815.35 CHF) compared to the conventional pathway (2119.65 CHF) [P = 0.0004]. For subprocess evaluation, total laboratory costs were calculated as 941.95 CHF for the test group and 1245.65 CHF for the control group, respectively [P = 0.003]. The clinical dental productivity rate amounted to 29.64 CHF/min (digital) and 24.37 CHF/min (conventional) [P = 0.002]. Overall, the cost minimization analysis exhibited an 18% cost reduction within the digital process. The digital workflow was more efficient than the established conventional pathway for implant-supported crowns in this investigation. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. First Order Reliability Application and Verification Methods for Semistatic Structures

    NASA Technical Reports Server (NTRS)

    Verderaime, Vincent

    1994-01-01

    Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored by conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments, its stress audits are shown to be arbitrary and incomplete, and it compromises high strength materials performance. A reliability method is proposed which combines first order reliability principles with deterministic design variables and conventional test technique to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety index expression. The application is reduced to solving for a factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and with the pace of semistatic structural designs.

  9. Personal manufacturing systems

    NASA Astrophysics Data System (ADS)

    Bailey, P.

    1992-04-01

    Personal Manufacturing Systems are the missing link in the automation of the design-to-manufacture process. A PMS will act as a CAD peripheral, closing the loop around the designer and enabling him to directly produce models, short production runs or soft tooling with as little fuss as he might otherwise plot a drawing. Whereas conventional 5-axis CNC machines are based on orthogonal axes and simple incremental movements, the PMS is based on a geodetic structure and complex co-ordinated 'spline' movements. The software employs a novel 3D pixel technique to give itself 'spatial awareness' and an expert system to determine the optimum machining conditions. A completely automatic machining strategy can then be determined.

  10. Resonant metamaterial detectors based on THz quantum-cascade structures

    PubMed Central

    Benz, A.; Krall, M.; Schwarz, S.; Dietze, D.; Detz, H.; Andrews, A. M.; Schrenk, W.; Strasser, G.; Unterrainer, K.

    2014-01-01

    We present the design, fabrication and characterisation of an intersubband detector employing a resonant metamaterial coupling structure. The semiconductor heterostructure relies on a conventional THz quantum-cascade laser design and is operated at zero bias for the detector operation. The same active region can be used to generate or detect light depending on the bias conditions and the vertical confinement. The metamaterial is processed directly into the top metal contact and is used to couple normal incidence radiation resonantly to the intersubband transitions. The device is capable of detecting light below and above the reststrahlenband of gallium-arsenide corresponding to the mid-infrared and THz spectral region. PMID:24608677

  11. Development of polymer nano composite patterns using fused deposition modeling for rapid investment casting process

    NASA Astrophysics Data System (ADS)

    Vivek, Tiwary; Arunkumar, P.; Deshpande, A. S.; Vinayak, Malik; Kulkarni, R. M.; Asif, Angadi

    2018-04-01

    Conventional investment casting is one of the oldest and most economical manufacturing techniques to produce intricate and complex part geometries. However, investment casting is considered economical only if the volume of production is large. Design iterations and design optimisations in this technique prove to be very costly due to the time and tooling cost of making dies for producing wax patterns. However, with the advent of additive manufacturing technology, plastic patterns show very good potential to replace wax patterns. This approach can be very useful for low volume production and lab requirements, since the cost and time required to incorporate changes in the design are very low. This research paper discusses the steps involved in developing polymer nanocomposite filaments and checking their suitability for investment casting. The process parameters of the 3D printer machine are also optimized using the DOE technique to obtain mechanically stronger plastic patterns. The study is done to develop a framework for rapid investment casting for lab as well as industrial requirements.

  12. Reverse Engineering Nature to Design Biomimetic Total Knee Implants.

    PubMed

    Varadarajan, Kartik Mangudi; Zumbrunn, Thomas; Rubash, Harry E; Malchau, Henrik; Muratoglu, Orhun K; Li, Guoan

    2015-10-01

    While contemporary total knee arthroplasty (TKA) provides tremendous clinical benefits, the normal feel and function of the knee is not fully restored. To address this, a novel design process was developed to reverse engineer "biomimetic" articular surfaces that are compatible with normal soft-tissue envelope and kinematics of the knee. The biomimetic articular surface is created by moving the TKA femoral component along in vivo kinematics of normal knees and carving out the tibial articular surface from a rectangular tibial block. Here, we describe the biomimetic design process. In addition, we utilize geometric comparisons and kinematic simulations to show that; (1) tibial articular surfaces of conventional implants are fundamentally incompatible with normal knee motion, and (2) the anatomic geometry of the biomimetic surface contributes directly to restoration of normal knee kinematics. Such biomimetic implants may enable us to achieve the long sought after goal of a "normal" knee post-TKA surgery. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  13. A framework for sustainable nanomaterial selection and design based on performance, hazard, and economic considerations.

    PubMed

    Falinski, Mark M; Plata, Desiree L; Chopra, Shauhrat S; Theis, Thomas L; Gilbertson, Leanne M; Zimmerman, Julie B

    2018-04-30

    Engineered nanomaterials (ENMs) and ENM-enabled products have emerged as potentially high-performance replacements to conventional materials and chemicals. As such, there is an urgent need to incorporate environmental and human health objectives into ENM selection and design processes. Here, an adapted framework based on the Ashby material selection strategy is presented as an enhanced selection and design process, which includes functional performance as well as environmental and human health considerations. The utility of this framework is demonstrated through two case studies, the design and selection of antimicrobial substances and conductive polymers, including ENMs, ENM-enabled products and their alternatives. Further, these case studies consider both the comparative efficacy and impacts at two scales: (i) a broad scale, where chemical/material classes are readily compared for primary decision-making, and (ii) within a chemical/material class, where physicochemical properties are manipulated to tailor the desired performance and environmental impact profile. Development and implementation of this framework can inform decision-making for the implementation of ENMs to facilitate promising applications and prevent unintended consequences.

  14. Multidisciplinary Design Optimization of a Full Vehicle with High Performance Computing

    NASA Technical Reports Server (NTRS)

    Yang, R. J.; Gu, L.; Tho, C. H.; Sobieszczanski-Sobieski, Jaroslaw

    2001-01-01

    Multidisciplinary design optimization (MDO) of a full vehicle under the constraints of crashworthiness, NVH (Noise, Vibration and Harshness), durability, and other performance attributes is one of the imperative goals for the automotive industry. However, it is often infeasible due to the lack of computational resources, robust simulation capabilities, and efficient optimization methodologies. This paper intends to move closer towards that goal by using parallel computers for the intensive computation and combining different approximations for dissimilar analyses in the MDO process. The MDO process presented in this paper is an extension of the previous work reported by Sobieski et al. In addition to the roof crush, two full vehicle crash modes are added: full frontal impact and 50% frontal offset crash. Instead of using an adaptive polynomial response surface method, this paper employs a DOE/RSM method for exploring the design space and constructing highly nonlinear crash functions. Two MDO strategies are used and the results are compared. This paper demonstrates that with high performance computing, a conventionally intractable real-world full vehicle multidisciplinary optimization problem considering all performance attributes with a large number of design variables becomes feasible.
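
    The DOE/RSM step can be illustrated with a toy surrogate: sample a small design of experiments, fit a quadratic response surface by least squares, and query the cheap surface in place of the expensive analysis. The response function, sample size and two design variables below are assumptions; the crash and NVH models themselves are not reproduced.

      # Minimal DOE/RSM sketch (hypothetical responses, not the vehicle crash model):
      # sample a small design of experiments, fit a quadratic response surface by
      # least squares, then query the cheap surrogate instead of the expensive solver.
      import numpy as np

      rng = np.random.default_rng(1)

      def expensive_simulation(x):
          """Stand-in for a crash/NVH analysis: any smooth nonlinear response."""
          return 3.0 + 2.0 * x[0] - 1.5 * x[1] + 0.8 * x[0] * x[1] + x[1] ** 2

      def quadratic_features(X):
          """[1, x1, x2, x1*x2, x1^2, x2^2] for a two-variable quadratic RSM."""
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

      # Design of experiments: random sampling of 2 normalized design variables
      X_doe = rng.uniform(-1, 1, (20, 2))
      y_doe = np.array([expensive_simulation(x) for x in X_doe])

      # Fit the response surface coefficients by linear least squares
      coeffs, *_ = np.linalg.lstsq(quadratic_features(X_doe), y_doe, rcond=None)

      # Use the surrogate for rapid evaluation inside an optimization loop
      x_new = np.array([[0.3, -0.4]])
      surrogate = quadratic_features(x_new) @ coeffs
      print("surrogate:", surrogate[0], " true:", expensive_simulation(x_new[0]))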

  15. Infrared processing of foods

    USDA-ARS?s Scientific Manuscript database

    Infrared (IR) processing of foods has been gaining popularity over conventional processing in several unit operations, including drying, peeling, baking, roasting, blanching, pasteurization, sterilization, disinfection, disinfestation, cooking, and popping . It has shown advantages over conventional...

  16. NK1 receptor antagonism and emotional processing in healthy volunteers.

    PubMed

    Chandra, P; Hafizi, S; Massey-Chase, R M; Goodwin, G M; Cowen, P J; Harmer, C J

    2010-04-01

    The neurokinin-1 (NK(1)) receptor antagonist, aprepitant, showed activity in several animal models of depression; however, its efficacy in clinical trials was disappointing. There is little knowledge of the role of NK(1) receptors in human emotional behaviour to help explain this discrepancy. The aim of the current study was to assess the effects of a single oral dose of aprepitant (125 mg) on models of emotional processing sensitive to conventional antidepressant drug administration in 38 healthy volunteers, randomly allocated to receive aprepitant or placebo in a between groups double blind design. Performance on measures of facial expression recognition, emotional categorisation, memory and attentional visual-probe were assessed following the drug absorption. Relative to placebo, aprepitant improved recognition of happy facial expressions and increased vigilance to emotional information in the unmasked condition of the visual probe task. In contrast, aprepitant impaired emotional memory and slowed responses in the facial expression recognition task suggesting possible deleterious effects on cognition. These results suggest that while antagonism of NK(1) receptors does affect emotional processing in humans, its effects are more restricted and less consistent across tasks than those of conventional antidepressants. Human models of emotional processing may provide a useful means of assessing the likely therapeutic potential of new treatments for depression.

  17. Design of Channel Type Indirect Blank Holder for Prevention of Wrinkling and Fracture in Hot Stamping Process

    NASA Astrophysics Data System (ADS)

    Choi, Hong-seok; Ha, Se-yoon; Cha, Seung-hoon; Kang, Chung-gil; Kim, Byung-min

    2011-08-01

    The hot stamping process has been used in the automotive industry to reduce the weight of the body-in-white and to increase passenger safety via improved crashworthiness. In this study, a new form die with a simple structure that can prevent defects such as wrinkling and fracture is proposed for the manufacture of hot stamped components. The wrinkling at the flange cannot be eliminated when using a conventional form die. It is known that the initiation of wrinkling is influenced by many factors such as the mechanical properties of the sheet material, the geometry of the sheet and tool, and other process parameters, including the blank holding force (BHF) and the contact conditions. In this research, a channel type indirect blank holder (CIBH) is introduced to replace the general blank holder for manufacturing the hot stamped center pillar. First, we investigate the tension force acting on the blank according to the channel shapes. We determine the appropriate range by comparing the tension force with the upper and lower BHFs in a conventional stamping process. We then use FE-analysis to study the influence of the slope angle and corner radius of the channel on the formability. Finally, the center pillar is manufactured using the form die with the selected channel.

  18. Optimizing surface finishing processes through the use of novel solvents and systems

    NASA Astrophysics Data System (ADS)

    Quillen, M.; Holbrook, P.; Moore, J.

    2007-03-01

    As the semiconductor industry continues to implement the ITRS (International Technology Roadmap for Semiconductors) node targets that go beyond 45nm [1], the need for improved cleanliness between repeated process steps continues to grow. Wafer cleaning challenges cover many applications such as Cu/low-K integration, where trade-offs must be made between dielectric damage and residue by plasma etching and CMP or moisture uptake by aqueous cleaning products. [2-5] Some surface sensitive processes use the Marangoni tool design [6], where a conventional solvent such as IPA (isopropanol) combines with water to provide improved physical properties such as reduced contact angle and surface tension. This paper introduces the use of alternative solvents and their mixtures compared to pure IPA in removing ionics, moisture, and particles using immersion bench-chemistry models of various processes. A novel Eastman proprietary solvent, Eastman methyl acetate, is observed to provide improvement in ionic, moisture capture, and particle removal, as compared to conventional IPA. [7] These benefits may be improved relative to pure IPA simply by the addition of various additives. Some physical properties of the mixtures were found to be relatively unchanged even as measured performance improved. This report presents our attempts to cite and optimize these benefits through the use of laboratory models.

  19. Directed self-assembly of block copolymer films on atomically-thin graphene chemical patterns

    DOE PAGES

    Chang, Tzu-Hsuan; Xiong, Shisheng; Jacobberger, Robert M.; ...

    2016-08-16

    Directed self-assembly of block copolymers is a scalable method to fabricate well-ordered patterns over the wafer scale with feature sizes below the resolution of conventional lithography. Typically, lithographically-defined prepatterns with varying chemical contrast are used to rationally guide the assembly of block copolymers. The directed self-assembly to obtain accurate registration and alignment is largely influenced by the assembly kinetics. Furthermore, a considerably broad processing window is favored for industrial manufacturing. Using an atomically-thin layer of graphene on germanium, after two simple processing steps, we create a novel chemical pattern to direct the assembly of polystyrene-block-poly(methyl methacrylate). Faster assembly kinetics are observed on graphene/germanium chemical patterns than on conventional chemical patterns based on polymer mats and brushes. This new chemical pattern allows for assembly on a wide range of guiding periods and along designed 90° bending structures. We also achieve density multiplication by a factor of 10, greatly enhancing the pattern resolution. Lastly, the rapid assembly kinetics, minimal topography, and broad processing window demonstrate the advantages of inorganic chemical patterns composed of hard surfaces.

  20. Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.
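
    For reference, the conventional baseline the paper modifies looks like the sketch below: standard simulated annealing with a Boltzmann acceptance rule and geometric cooling on a toy bit-string objective. The game-theoretic recasting with per-variable utility-maximizing players is not reproduced here; the objective, schedule and constants are all assumed.

      # Baseline sketch of conventional simulated annealing (the point of comparison
      # in this record); the COIN recasting with per-variable utility-maximizing
      # players is not reproduced. Objective and cooling schedule are illustrative.
      import math
      import random

      random.seed(0)

      def energy(state):
          """Toy objective: number of adjacent equal bits (to be minimized)."""
          return sum(1 for a, b in zip(state, state[1:]) if a == b)

      def simulated_annealing(n_bits=40, steps=20000, t0=2.0, cooling=0.9995):
          state = [random.randint(0, 1) for _ in range(n_bits)]
          e, temp = energy(state), t0
          for _ in range(steps):
              i = random.randrange(n_bits)            # propose flipping one variable
              state[i] ^= 1
              e_new = energy(state)
              # accept downhill moves always, uphill moves with Boltzmann probability
              if e_new <= e or random.random() < math.exp((e - e_new) / temp):
                  e = e_new
              else:
                  state[i] ^= 1                        # reject: undo the flip
              temp *= cooling                          # geometric cooling schedule
          return state, e

      best_state, best_energy = simulated_annealing()
      print("final energy:", best_energy)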

  1. Design of transplanting mechanism for system of rice intensification (SRI) transplanter in Kedah, Malaysia

    NASA Astrophysics Data System (ADS)

    Imran, M. S.; Manan, M. S. Abdul; Khalil, A. N. M.; MdNaim, M. K.; Ahmad, R. N.

    2017-08-01

    There is a demand to develop a transplanter specifically for system of rice intensification (SRI) cultivation in Malaysia. An SRI transplanter differs from a conventional transplanter because SRI imposes special transplanting requirements. The work focused on the design of a transplanting mechanism that can later be attached to an SRI transplanter. The mechanical design was established using a linkage mechanism with a wheel that acts as a timing wheel to control the distance between transplanted seedlings. The linkage mechanism also controls the opening of the flapper that allows the seedling, together with its nursery soil, to be dropped, and controls the stopper that prevents the next seedling from sliding down the tray. The use of a simple mechanism keeps the fabrication cost low. The design was analysed using motion analysis software. The results show that the design is sound and can be fabricated without difficulty; the animation confirms the intended movement of the mechanism and the transplanting process.

  2. How the Hilbert integral theorem inspired flow lines

    NASA Astrophysics Data System (ADS)

    Winston, Roland; Jiang, Lun

    2017-09-01

    Nonimaging Optics has been shown to achieve the theoretical limits constrained only by thermodynamic principles. The design principles of nonimaging optics allow a non-conventional way of thinking about and generating new optical devices. Compared to conventional imaging optics, which rarely utilizes the framework of thermodynamic arguments, nonimaging optics chooses to map etendue instead of rays. This fundamental shift of design paradigm frees the optics design from ray-based designs, which rely heavily on error tolerance analysis. Instead, the underlying thermodynamic principles guide the nonimaging design to be naturally constructed for extended light sources for illumination, non-tracking concentrators, and sensors that require sharp cut-off angles. We argue in this article that such optical devices, which have enabled a multitude of applications, depend on probabilities, the geometric flux field, and radiative heat transfer, while "optics" in the conventional sense recedes into the background.

  3. Radiation shielding design of a new tomotherapy facility.

    PubMed

    Zacarias, Albert; Balog, John; Mills, Michael

    2006-10-01

    It is expected that intensity modulated radiation therapy (IMRT) and image guided radiation therapy (IGRT) will replace a large portion of radiation therapy treatments currently performed with conventional MLC-based 3D conformal techniques. IGRT may become the standard of treatment in the future for prostate and head and neck cancer. Many established facilities may convert existing vaults to perform this treatment method using new or upgraded equipment. In the future, more facilities undoubtedly will be considering de novo designs for their treatment vaults. A reevaluation of the design principles used in conventional vault design is of benefit to those considering this approach with a new tomotherapy facility. This is made more imperative as the design of the TomoTherapy system is unique in several aspects and does not fit well into the formalism of NCRP 49 for a conventional linear accelerator.
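
    For context, the conventional NCRP 49-style primary-barrier estimate that the record says does not transfer cleanly to a TomoTherapy unit can be sketched as below. The dose limit, workload, use and occupancy factors, geometry, and the concrete tenth-value layer are assumed example values, not recommendations.

      # Illustrative NCRP 49-style primary barrier calculation for a conventional
      # linac vault (assumed workload and geometry) -- the formalism the record notes
      # does not transfer directly to a TomoTherapy unit. Values are examples only.
      import math

      def primary_barrier_tvls(P, d, W, U, T):
          """Number of tenth-value layers needed for a primary barrier.

          P: design dose limit beyond the barrier (Sv/week)
          d: distance from target to the point of interest (m)
          W: workload at 1 m (Gy/week), U: use factor, T: occupancy factor
          """
          B = P * d ** 2 / (W * U * T)          # required transmission factor
          return math.log10(1.0 / B)

      # Example: controlled area, 0.1 mSv/week limit, 6 m away, 1000 Gy/week workload
      n_tvl = primary_barrier_tvls(P=1e-4, d=6.0, W=1000.0, U=0.25, T=1.0)
      tvl_concrete_6mv = 0.37                    # assumed TVL for 6 MV in concrete (m)
      print(f"{n_tvl:.1f} TVLs  ≈ {n_tvl * tvl_concrete_6mv:.2f} m of concrete")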

  4. Intrinsic Hardware Evolution for the Design and Reconfiguration of Analog Speed Controllers for a DC Motor

    NASA Technical Reports Server (NTRS)

    Gwaltney, David A.; Ferguson, Michael I.

    2003-01-01

    Evolvable hardware provides the capability to evolve analog circuits to produce amplifier and filter functions. Conventional analog controller designs employ these same functions. Analog controllers for the control of the shaft speed of a DC motor are evolved on an evolvable hardware platform utilizing a second generation Field Programmable Transistor Array (FPTA2). The performance of an evolved controller is compared to that of a conventional proportional-integral (PI) controller. It is shown that hardware evolution is able to create a compact design that provides good performance, while using considerably less functional electronic components than the conventional design. Additionally, the use of hardware evolution to provide fault tolerance by reconfiguring the design is explored. Experimental results are presented showing that significant recovery of capability can be made in the face of damaging induced faults.
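
    The conventional point of comparison in this record, a proportional-integral speed controller, can be sketched digitally around a first-order DC motor model as below. The motor constants, gains, saturation limits and sample time are assumed for illustration and do not describe the evolved FPTA2 circuit or the authors' analog hardware.

      # Sketch of the conventional baseline named in this record: a discrete PI
      # speed controller around a simple first-order DC motor model. Motor constants
      # and gains are assumed for illustration.
      DT = 0.001                    # control period (s)
      TAU, K_MOTOR = 0.05, 2.0      # assumed motor time constant and gain

      def motor_step(speed, voltage):
          """First-order motor model: d(speed)/dt = (K*V - speed) / tau."""
          return speed + DT * (K_MOTOR * voltage - speed) / TAU

      def run_pi(setpoint=100.0, kp=0.8, ki=20.0, steps=1000):
          speed, integral = 0.0, 0.0
          for k in range(steps):
              error = setpoint - speed
              integral += error * DT
              voltage = max(-60.0, min(60.0, kp * error + ki * integral))  # saturate drive
              speed = motor_step(speed, voltage)
              if k % 200 == 0:
                  print(f"t={k*DT:.2f}s  speed={speed:7.2f}")
          return speed

      run_pi()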

  5. Microwave processing of gustatory tissues for immunohistochemistry

    PubMed Central

    Bond, Amanda; Kinnamon, John C.

    2013-01-01

    We use immunohistochemistry to study taste cell structure and function as a means to elucidate how taste receptor cells communicate with nerve fibers and adjacent taste cells. This conventional method, however, is time consuming. In the present study we used taste buds from rat circumvallate papillae to compare conventional immunohistochemical tissue processing with microwave processing for the colocalization of several biochemical pathway markers (PLCβ2, syntaxin-1, IP3R3, α-gustducin) and the nuclear stain, Sytox. The results of our study indicate that in microwave versus conventional immunocytochemistry: (1) fixation quality is improved; (2) the amount of time necessary for processing tissue is decreased; (3) antigen retrieval is no longer needed; (4) image quality is superior. In sum, microwave tissue processing of gustatory tissues is faster and superior to conventional immunohistochemical tissue processing for many applications. PMID:23473796

  6. High-throughput miniaturized bioreactors for cell culture process development: reproducibility, scalability, and control.

    PubMed

    Rameez, Shahid; Mostafa, Sigma S; Miller, Christopher; Shukla, Abhinav A

    2014-01-01

    Decreasing the timeframe for cell culture process development has been a key goal toward accelerating biopharmaceutical development. Advanced Microscale Bioreactors (ambr™) is an automated micro-bioreactor system with miniature single-use bioreactors with a 10-15 mL working volume controlled by an automated workstation. This system was compared to conventional bioreactor systems in terms of its performance for the production of a monoclonal antibody in a recombinant Chinese Hamster Ovary cell line. The miniaturized bioreactor system was found to produce cell culture profiles that matched across scales to 3 L, 15 L, and 200 L stirred tank bioreactors. The processes used in this article involve complex feed formulations, perturbations, and strict process control within the design space, which are in-line with processes used for commercial scale manufacturing of biopharmaceuticals. Changes to important process parameters in ambr™ resulted in predictable cell growth, viability and titer changes, which were in good agreement to data from the conventional larger scale bioreactors. ambr™ was found to successfully reproduce variations in temperature, dissolved oxygen (DO), and pH conditions similar to the larger bioreactor systems. Additionally, the miniature bioreactors were found to react well to perturbations in pH and DO through adjustments to the Proportional and Integral control loop. The data presented here demonstrates the utility of the ambr™ system as a high throughput system for cell culture process development. © 2014 American Institute of Chemical Engineers.

  7. Presence of enteric viruses in freshwater and their removal by the conventional drinking water treatment process.

    PubMed Central

    Hurst, C. J.

    1991-01-01

    A review of results published in English or French between 1980 and 1990 was carried out to determine the levels of indigenous human enteric viruses in untreated surface and subsurface freshwaters, as well as in drinking water that had undergone the complete conventional treatment process. For this purpose, the conventional treatment process was defined as an operation that included coagulation followed by sedimentation, filtration, and disinfection. Also assessed was the stepwise efficiency of the conventional treatment process, as practised at full-scale facilities, for removing indigenous viruses from naturally occurring freshwaters. A list was compiled of statistical correlations relating to the occurrence of indigenous viruses in water. PMID:1647273

  8. BLENDING LOW ENRICHED URANIUM WITH DEPLETED URANIUM TO CREATE A SOURCE MATERIAL ORE THAT CAN BE PROCESSED FOR THE RECOVERY OF YELLOWCAKE AT A CONVENTIONAL URANIUM MILL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schutt, Stephen M.; Hochstein, Ron F.; Frydenlund, David C.

    2003-02-27

    Throughout the United States Department of Energy (DOE) complex, there are a number of streams of low enriched uranium (LEU) that contain various trace contaminants. These surplus nuclear materials require processing in order to meet commercial fuel cycle specifications. To date, they have not been designated as waste for disposal at the DOE's Nevada Test Site (NTS). Currently, with no commercial outlet available, the DOE is evaluating treatment and disposal as the ultimate disposition path for these materials. This paper will describe an innovative program that will provide a solution to DOE that will allow disposition of these materials at a cost that will be competitive with treatment and disposal at the NTS, while at the same time recycling the material to recover a valuable energy resource (yellowcake) for reintroduction into the commercial nuclear fuel cycle. International Uranium (USA) Corporation (IUSA) and Nuclear Fuel Services, Inc. (NFS) have entered into a commercial relationship to pursue the development of this program. The program involves the design of a process and construction of a plant at NFS' site in Erwin, Tennessee, for the blending of contaminated LEU with depleted uranium (DU) to produce a uranium source material ore (USM Ore™). The USM Ore™ will then be further processed at IUC's White Mesa Mill, located near Blanding, Utah, to produce conventional yellowcake, which can be delivered to conversion facilities, in the same manner as yellowcake that is produced from natural ores or other alternate feed materials. The primary source of feed for the business will be the significant sources of trace contaminated materials within the DOE complex. NFS has developed a dry blending process (DRYSM Process) to blend the surplus LEU material with DU at its Part 70 licensed facility, to produce USM Ore™ with a U235 content within the range of U235 concentrations for source material. By reducing the U235 content to source material levels in this manner, the material will be suitable for processing at a conventional uranium mill under its existing Part 40 license to remove contaminants and enable the product to re-enter the commercial fuel cycle. The tailings from processing the USM Ore™ at the mill will be permanently disposed of in the mill's tailings impoundment as 11e.(2) byproduct material. Blending LEU with DU to make a uranium source material ore that can be returned to the nuclear fuel cycle for processing to produce yellowcake has never been accomplished before. This program will allow DOE to disposition its surplus LEU and DU in a cost effective manner, and at the same time provide for the recovery of valuable energy resources that would be lost through processing and disposal of the materials. This paper will discuss the nature of the surplus LEU and DU materials, the manner in which the LEU will be blended with DU to form a uranium source material ore, and the legal means by which this blending can be accomplished at a facility licensed under 10 CFR Part 70 to produce ore that can be processed at a conventional uranium mill licensed under 10 CFR Part 40.
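
    The blending step itself reduces to a two-stream U-235 mass balance: the required fraction of DU follows from the assays of the LEU, the DU, and the target source-material ore. The sketch below (Python) illustrates that arithmetic with assumed assay values; none of these numbers come from the program described above.

      def du_blend_fraction(x_leu, x_du, x_target):
          """Mass fraction of DU in the blend needed to hit the target U-235 assay,
          from the two-stream mass balance x_target = f_du*x_du + (1 - f_du)*x_leu."""
          return (x_leu - x_target) / (x_leu - x_du)

      # Illustrative assays in wt% U-235 (assumed, not from the paper)
      x_leu, x_du, x_target = 1.5, 0.2, 0.7
      f_du = du_blend_fraction(x_leu, x_du, x_target)
      print(f"blend: {f_du:.1%} DU, {1 - f_du:.1%} LEU by uranium mass")
      # -> roughly 61.5% DU and 38.5% LEU for these assumed assays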

  9. An introduction to the Astro Edge solar array

    NASA Technical Reports Server (NTRS)

    Spence, B. R.; Marks, G. W.

    1994-01-01

    The Astro Edge solar array is a new and innovative low concentrator power generating system which has been developed for applications requiring high specific power, high stiffness, low risk, light modular construction which utilizes conventional materials and technology, and standard photovoltaic solar cells and laydown processes. Mechanisms, restraint/release devices, wiring harnesses, substrates, and support structures are designed to be simple, functional, lightweight, and modular. A brief overview of the Astro Edge solar array is discussed.

  10. Mixed Oxidant Process for Control of Biological Growth in Cooling Towers

    DTIC Science & Technology

    2010-02-01

    • Concentration is < 1% (vs. 12.5% for bulk bleach)
    • Will not form chlorine gas
    • No transport or storage of hazardous chemicals
    • Uses only salt as...
    • Eliminates purchase, transport, and storage of hazardous biocide compounds such as hypochlorite or chlorine gas
    • Provides a constant dosage level of...
    • ...patented MIOX equipment design
    • Chemical and biocidal properties are more effective than conventional chlorine
    (Comparison table in the source: Bulk Bleach vs. On-Site Hypo vs. Mixed Oxidants)

  11. Cycle time improvement for plastic injection moulding process by sub groove modification in conformal cooling channel

    NASA Astrophysics Data System (ADS)

    Kamarudin, K.; Wahab, M. S.; Batcha, M. F. M.; Shayfull, Z.; Raus, A. A.; Ahmed, Aqeel

    2017-09-01

    Mould designers have long sought to improve cooling system performance, even though cooling channel complexity is physically limited by the fabrication capability of conventional tooling methods. The growth of Solid Free Form (SFF) technology, however, allows the mould designer to develop more than just a regular conformal cooling channel. Numerous researchers have demonstrated that conformal cooling channels give significant improvements in the productivity and quality of the plastic injection moulding process. This paper presents research that applies a passive enhancement method to a square-shaped cooling channel, improving cooling performance by adding sub grooves to the channel itself. A previous square-channel design was modified with varying numbers of sub grooves to find the configuration that best reduces the cooling time. The effect of sub groove design on cooling time was investigated with Autodesk Moldflow Insight software. The simulation results showed that the various sub groove designs yield different ejection times; Design 7 gave the lowest ejection time, a 24.3% improvement. The addition of sub grooves significantly increased the coolant velocity and the rate of heat transfer from the molten plastic to the coolant.

  12. Design and evaluation of a wireless sensor network based aircraft strength testing system.

    PubMed

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structure strength design and evaluation. However, the current overall ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities. The centralized data processing makes test programs lack efficiency and intelligence. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents a wireless sensor network based aircraft strength testing (AST) system design and its evaluation on a real aircraft specimen. In this paper, a miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and application layer interface are designed in detail. To verify the functionality of the designed wireless sensor network for strength testing capability, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and monitor results under different static test loads. This paper shows the efficiency of the wireless sensor network based AST system, compared to a conventional AST system.

  13. Design and Evaluation of a Wireless Sensor Network Based Aircraft Strength Testing System

    PubMed Central

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structure strength design and evaluation. However, the current overall ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities. The centralized data processing makes test programs lack efficiency and intelligence. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents a wireless sensor network based aircraft strength testing (AST) system design and its evaluation on a real aircraft specimen. In this paper, a miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and application layer interface are designed in detail. To verify the functionality of the designed wireless sensor network for strength testing capability, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and monitor results under different static test loads. This paper shows the efficiency of the wireless sensor network based AST system, compared to a conventional AST system. PMID:22408521

  14. Superconducting RF Linacs Driving Subcritical Reactors for Profitable Disposition of Surplus Weapons-grade Plutonium

    NASA Astrophysics Data System (ADS)

    Cummings, Mary Anne; Johnson, Rolland

    Acceptable capital and operating costs of high-power proton accelerators suitable for profitable commercial electric-power and process-heat applications have been demonstrated. However, studies have pointed out that even a few hundred trips of an accelerator lasting a few seconds would lead to unacceptable thermal stresses as each trip causes fission to be turned off in solid fuel structures found in conventional reactors. The newest designs based on the GEM*STAR concept take such trips in stride by using molten-salt fuel, where fuel pin fatigue is not an issue. Other aspects of the GEM*STAR concept which address all historical reactor failures include an internal spallation neutron target and high temperature molten salt fuel with continuous purging of volatile radioactive fission products such that the reactor contains less than a critical mass and almost a million times fewer volatile radioactive fission products than conventional reactors. GEM*STAR is a reactor that without redesign will burn spent nuclear fuel, natural uranium, thorium, or surplus weapons material. It will operate without the need for a critical core, fuel enrichment, or reprocessing making it an excellent candidate for export. As a first application, the design for a pilot plant is described for the profitable disposition of surplus weapons-grade plutonium by using process heat to produce green diesel fuel for the Department of Defense (DOD) from natural gas and renewable carbon.

  15. Experimental Investigation Nano Particles Influence in NPMEDM to Machine Inconel 800 with Electrolyte Copper Electrode

    NASA Astrophysics Data System (ADS)

    Karunakaran, K.; Chandrasekaran, M.

    2017-05-01

    Powder-mixed dielectric electrical discharge machining (PMEDM) is a recent technology for machining hard materials. This research investigates the influence of nano-sized (about 5 nm) powders on machining the nickel-based superalloy Inconel 800. The work is motivated by a practical need of a manufacturing industry that processes various kinds of Inconel 800 jobs. Conventional EDM machining was also included in the investigation as a baseline for measuring the performance of the nano powders. Aluminum, silicon, and multi-walled carbon nanotube (MWCNT) powders were considered, along with pulse-on time, pulse-off time, and input current, to analyze and optimize the responses of material removal rate (MRR), tool wear rate, and surface roughness. A Taguchi general full factorial design was used to design the experiments, and advanced equipment was employed for conducting the experiments and taking measurements to improve the accuracy of the results. The MWCNT powder mix outperformed the other powders, reducing tool wear rate by 22% to 50%, reducing surface roughness by 29.62% to 41.64%, and improving MRR by 42.91% to 53.51% compared with conventional EDM.

  16. Additive Manufactured Superconducting Cavities

    NASA Astrophysics Data System (ADS)

    Holland, Eric; Rosen, Yaniv; Woolleet, Nathan; Materise, Nicholas; Voisin, Thomas; Wang, Morris; Mireles, Jorge; Carosi, Gianpaolo; Dubois, Jonathan

    Superconducting radio frequency cavities provide an ultra-low dissipative environment, which has enabled fundamental investigations in quantum mechanics, materials properties, and the search for new particles in and beyond the standard model. However, resonator designs are constrained by limitations in conventional machining techniques. For example, current through a seam is a limiting factor in performance for many waveguide cavities. Development of highly reproducible methods for metallic parts through additive manufacturing, referred to colloquially as "3D printing", opens the possibility for novel cavity designs which cannot be implemented through conventional methods. We present preliminary investigations of superconducting cavities made through a selective laser melting process, which compacts a granular powder via a high-power laser according to a digitally defined geometry. Initial work suggests that assuming a loss model and numerically optimizing a geometry to minimize dissipation results in modest improvements in device performance. Furthermore, a subset of titanium alloys, particularly, a titanium, aluminum, vanadium alloy (Ti - 6Al - 4V) exhibits properties indicative of a high kinetic inductance material. This work is supported by LDRD 16-SI-004.

  17. GPU-accelerated FDTD modeling of radio-frequency field-tissue interactions in high-field MRI.

    PubMed

    Chi, Jieru; Liu, Feng; Weber, Ewald; Li, Yu; Crozier, Stuart

    2011-06-01

    The analysis of high-field RF field-tissue interactions requires high-performance finite-difference time-domain (FDTD) computing. Conventional CPU-based FDTD calculations offer limited computing performance in a PC environment. This study presents a graphics processing unit (GPU)-based parallel-computing framework, producing substantially boosted computing efficiency (with a two-order speedup factor) at a PC-level cost. Specific details of implementing the FDTD method on a GPU architecture have been presented and the new computational strategy has been successfully applied to the design of a novel 8-element transceive RF coil system at 9.4 T. Facilitated by the powerful GPU-FDTD computing, the new RF coil array offers optimized fields (averaging 25% improvement in sensitivity, and 20% reduction in loop coupling compared with conventional array structures of the same size) for small animal imaging with a robust RF configuration. The GPU-enabled acceleration paves the way for FDTD to be applied for both detailed forward modeling and inverse design of MRI coils, which were previously impractical.
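
    The kernel being accelerated is, at its core, a leapfrog update of interleaved field arrays; a GPU implementation maps the same update across grid cells in parallel. The 1D free-space sketch below (Python/NumPy) shows only the update structure; grid size, time steps, and the source are illustrative, and the actual solver is three-dimensional with tissue-dependent material coefficients.

      import numpy as np

      # Minimal 1D free-space FDTD leapfrog update, illustrating the loop structure
      # that GPU kernels parallelize over grid cells. All values are illustrative.
      nz, nt = 400, 800
      ez = np.zeros(nz)            # electric field samples
      hy = np.zeros(nz - 1)        # magnetic field samples (staggered half cell)
      courant = 0.5                # normalized update coefficient (stable in 1D)

      for n in range(nt):
          hy += courant * (ez[1:] - ez[:-1])               # H update (half step)
          ez[1:-1] += courant * (hy[1:] - hy[:-1])         # E update (next half step)
          ez[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)   # soft Gaussian source

      print("peak |Ez| after propagation:", float(np.abs(ez).max()))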

  18. Key issues in decomposing fMRI during naturalistic and continuous music experience with independent component analysis.

    PubMed

    Cong, Fengyu; Puoliväli, Tuomas; Alluri, Vinoo; Sipola, Tuomo; Burunat, Iballa; Toiviainen, Petri; Nandi, Asoke K; Brattico, Elvira; Ristaniemi, Tapani

    2014-02-15

    Independent component analysis (ICA) has often been used to decompose fMRI data, mostly for resting-state, block, and event-related designs, owing to its advantages. For fMRI data acquired during free-listening experiences, only a few exploratory studies have applied ICA. To process the fMRI data elicited by a 512-s modern tango, an FFT-based band-pass filter was used to further pre-process the data and remove sources of no interest and noise. Then, a fast model order selection method was applied to estimate the number of sources. Next, both individual ICA and group ICA were performed. Subsequently, ICA components whose temporal courses were significantly correlated with musical features were selected. Finally, for individual ICA, components common across the majority of participants were found by diffusion map and spectral clustering. The extracted spatial maps (by the new ICA approach) common across most participants evidenced slightly right-lateralized activity within and surrounding the auditory cortices, and they were found to be associated with the musical features. Compared with the conventional ICA approach, more participants were found to share the common spatial maps extracted by the new ICA approach. Conventional model order selection methods underestimated the true number of sources in the conventionally pre-processed fMRI data for the individual ICA. Pre-processing the fMRI data with a suitable band-pass digital filter can greatly benefit the subsequent model order selection and ICA for fMRI data from naturalistic paradigms. Diffusion map and spectral clustering are straightforward tools for finding common ICA spatial maps. Copyright © 2013 Elsevier B.V. All rights reserved.
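
    A compressed sketch of this processing chain (band-pass filtering, ICA, and correlation of component time courses with a stimulus feature) is shown below, using generic SciPy/scikit-learn tooling and synthetic data in place of the fMRI time series; the model order selection, group ICA, diffusion map, and spectral clustering steps of the paper are not reproduced, and all sizes and cutoffs are assumptions.

      import numpy as np
      from scipy.signal import butter, filtfilt
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      tr, n_time, n_voxels = 2.0, 256, 500               # assumed sampling and sizes
      data = rng.standard_normal((n_time, n_voxels))     # stand-in for voxel time series
      feature = np.sin(2 * np.pi * 0.03 * np.arange(n_time) * tr)  # stand-in musical feature
      data[:, :50] += feature[:, None]                   # embed a feature-driven source

      # Band-pass pre-filtering to suppress drift and high-frequency noise (cutoffs assumed)
      b, a = butter(4, [0.01, 0.1], btype="band", fs=1.0 / tr)
      data_f = filtfilt(b, a, data, axis=0)

      # ICA with a fixed number of sources (model order assumed rather than estimated)
      ica = FastICA(n_components=20, random_state=0, max_iter=1000)
      courses = ica.fit_transform(data_f)                # temporal courses, shape (n_time, 20)

      # Keep components whose time courses correlate with the musical feature
      r = np.array([np.corrcoef(courses[:, k], feature)[0, 1] for k in range(20)])
      selected = np.where(np.abs(r) > 0.3)[0]
      print("feature-correlated components:", selected, np.round(r[selected], 2))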

  19. Optimizing friction stir weld parameters of aluminum and copper using conventional milling machine

    NASA Astrophysics Data System (ADS)

    Manisegaran, Lohappriya V.; Ahmad, Nurainaa Ayuni; Nazri, Nurnadhirah; Noor, Amirul Syafiq Mohd; Ramachandran, Vignesh; Ismail, Muhammad Tarmizizulfika; Ahmad, Ku Zarina Ku; Daruis, Dian Darina Indah

    2018-05-01

    In friction stir welding (FSW), two work pieces are joined by a rotating tool whose friction against the work piece material generates heat, softening the region near the tool and mechanically intermixing the work pieces. The first objective of this study is to join aluminum plates and copper plates by the friction stir welding process using self-fabricated tools and a conventional milling machine. The study also aims to investigate the process parameters that produce the optimum mechanical properties of the welded joints for aluminum and copper plates. A suitable tool bit and a fixture were fabricated for the welding process, and a conventional milling machine was used to weld the aluminum and copper. The most important parameters for enabling the process are the speed and pressure of the tool (or the tool design and the alignment of the tool onto the work piece). The study showed that the best surface finish was produced at a speed of 1150 rpm with the tool bit tilted to 3°. For a 200 mm × 100 mm Aluminum 6061 plate of 2 mm thickness welded at a speed of 1 mm/s, the welding takes only 200 seconds, equivalent to 3 minutes and 20 seconds. The copper plates were successfully welded using FSW with tool rotation speeds of 500 rpm, 700 rpm, 900 rpm, 1150 rpm, and 1440 rpm and welding traverse rates of 30 mm/min, 60 mm/min, and 90 mm/min. In conclusion, FSW using a milling machine can be performed on both aluminum and copper plates; however, the weld parameters differ for the two materials.

  20. Large dynamic range terahertz spectrometers based on plasmonic photomixers (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Javadi, Hamid; Jarrahi, Mona

    2017-02-01

    Heterodyne terahertz spectrometers are highly in demand for space explorations and astrophysics studies. A conventional heterodyne terahertz spectrometer consists of a terahertz mixer that mixes a received terahertz signal with a local oscillator signal to generate an intermediate frequency signal in the radio frequency (RF) range, where it can be easily processed and detected by RF electronics. Schottky diode mixers, superconductor-insulator-superconductor (SIS) mixers and hot electron bolometer (HEB) mixers are the most commonly used mixers in conventional heterodyne terahertz spectrometers. While conventional heterodyne terahertz spectrometers offer high spectral resolution and high detection sensitivity levels at cryogenic temperatures, their dynamic range and bandwidth are limited by the low radiation power of existing terahertz local oscillators and narrow bandwidth of existing terahertz mixers. To address these limitations, we present a novel approach for heterodyne terahertz spectrometry based on plasmonic photomixing. The presented design replaces terahertz mixer and local oscillator of conventional heterodyne terahertz spectrometers with a plasmonic photomixer pumped by an optical local oscillator. The optical local oscillator consists of two wavelength-tunable continuous-wave optical sources with a terahertz frequency difference. As a result, the spectrometry bandwidth and dynamic range of the presented heterodyne spectrometer is not limited by radiation frequency and power restrictions of conventional terahertz sources. We demonstrate a proof-of-concept terahertz spectrometer with more than 90 dB dynamic range and 1 THz spectrometry bandwidth.
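
    The underlying heterodyne operation, whether performed in a Schottky, SIS, HEB, or plasmonic photomixer, is downconversion of the received signal to an intermediate frequency f_IF = |f_RF - f_LO|. The toy sketch below (Python) illustrates this with two tones at arbitrary scaled frequencies rather than actual terahertz values.

      import numpy as np

      # Toy heterodyne downconversion: mixing two tones yields sum and difference
      # frequencies; the difference (intermediate frequency) is what RF electronics
      # process. Frequencies are arbitrary scaled values, not real THz numbers.
      fs = 10_000.0                        # sample rate [Hz]
      t = np.arange(0, 1.0, 1.0 / fs)
      f_rf, f_lo = 1_200.0, 1_000.0        # "received" and "local oscillator" tones
      mixed = np.cos(2 * np.pi * f_rf * t) * np.cos(2 * np.pi * f_lo * t)

      spectrum = np.abs(np.fft.rfft(mixed))
      freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
      peaks = np.sort(freqs[np.argsort(spectrum)[-2:]])
      print("spectral peaks [Hz]:", peaks)  # ~200 Hz (difference/IF) and ~2200 Hz (sum)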

  1. Dental students' preferences and performance in crown design: conventional wax-added versus CAD.

    PubMed

    Douglas, R Duane; Hopp, Christa D; Augustin, Marcus A

    2014-12-01

    The purpose of this study was to evaluate dental students' perceptions of traditional waxing vs. computer-aided crown design and to determine the effectiveness of either technique through comparative grading of the final products. On one of two identical tooth preparations, second-year students at one dental school fabricated a wax pattern for a full contour crown; on the second tooth preparation, the same students designed and fabricated an all-ceramic crown using computer-aided design (CAD) and computer-aided manufacturing (CAM) technology. Projects were graded for occlusion and anatomic form by three faculty members. On completion of the projects, 100 percent of the students (n=50) completed an eight-question, five-point Likert scale survey, designed to assess their perceptions of and learning associated with the two design techniques. The average grades for the crown design projects were 78.3 (CAD) and 79.1 (wax design). The mean numbers of occlusal contacts were 3.8 (CAD) and 2.9 (wax design), which was significantly higher for CAD (p=0.02). The survey results indicated that students enjoyed designing a full contour crown using CAD as compared to using conventional wax techniques and spent less time designing the crown using CAD. From a learning perspective, students felt that they learned more about position and the size/strength of occlusal contacts using CAD. However, students recognized that CAD technology has limits in terms of representing anatomic contours and excursive occlusion compared to conventional wax techniques. The results suggest that crown design using CAD could be considered as an adjunct to conventional wax-added techniques in preclinical fixed prosthodontic curricula.

  2. Development of the platelet micro-orifice injector. [for liquid propellant rocket engines

    NASA Technical Reports Server (NTRS)

    La Botz, R. J.

    1984-01-01

    For some time to come, liquid rocket engines will continue to provide the primary means of propulsion for space transportation. The injector represents a key to the optimization of engine and system performance. The present investigation is concerned with a unique injector design and fabrication process which has demonstrated performance capabilities beyond that achieved with more conventional approaches. This process, which is called the 'platelet process', makes it feasible to fabricate injectors with a pattern an order of magnitude finer than that obtainable by drilling. The fine pattern leads to an achievement of high combustion efficiencies. Platelet injectors have been identified as one of the significant technology advances contributing to the feasibility of advanced dual-fuel booster engines. Platelet injectors are employed in the Space Shuttle Orbit Maneuvering System (OMS) engines. Attention is given to injector design theory as it relates to pattern fineness, a description of platelet injectors, and test data obtained with three different platelet injectors.

  3. Plasma Spray-Physical Vapor Deposition (PS-PVD) of Ceramics for Protective Coatings

    NASA Technical Reports Server (NTRS)

    Harder, Bryan J.; Zhu, Dongming

    2011-01-01

    In order to generate advanced multilayer thermal and environmental protection systems, a new deposition process is needed to bridge the gap between conventional plasma spray, which produces relatively thick coatings on the order of 125-250 microns, and conventional vapor phase processes such as electron beam physical vapor deposition (EB-PVD) which are limited by relatively slow deposition rates, high investment costs, and coating material vapor pressure requirements. The use of Plasma Spray - Physical Vapor Deposition (PS-PVD) processing fills this gap and allows thin (< 10 microns) single layers to be deposited and multilayer coatings of less than 100 microns to be generated with the flexibility to tailor microstructures by changing processing conditions. Coatings of yttria-stabilized zirconia (YSZ) were applied to NiCrAlY bond coated superalloy substrates using the PS-PVD coater at NASA Glenn Research Center. A design-of-experiments was used to examine the effects of process variables (Ar/He plasma gas ratio, the total plasma gas flow, and the torch current) on chamber pressure and torch power. Coating thickness, phase and microstructure were evaluated for each set of deposition conditions. Low chamber pressures and high power were shown to increase coating thickness and create columnar-like structures. Conversely, high chamber pressures and low power had lower growth rates, but resulted in flatter, more homogeneous layers.
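
    A design-of-experiments of this kind can be laid out as a factorial grid over the three process variables. The sketch below (Python) enumerates a hypothetical two-level full factorial for Ar/He ratio, total gas flow, and torch current; the level values are placeholders, not the settings used in the study.

      from itertools import product

      # Hypothetical two-level full factorial over the three PS-PVD process variables.
      # Level values are placeholders, not the settings used in the study.
      levels = {
          "Ar/He ratio":    (2.0, 4.0),
          "total gas flow": (100.0, 140.0),    # e.g. standard liters per minute
          "torch current":  (1500.0, 1800.0),  # amperes
      }

      runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
      for i, run in enumerate(runs, start=1):
          print(f"run {i}: {run}")
      # 2**3 = 8 runs; chamber pressure, torch power, coating thickness, phase, and
      # microstructure would be measured for each run and analyzed for main effects.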

  4. Comprehension Process of Second Language Indirect Requests.

    ERIC Educational Resources Information Center

    Takahashi, Satomi; Roitblat, Herbert L.

    1994-01-01

    Examines the comprehension of English conventional indirect requests by native English speakers and Japanese learners of English. Subjects read stories inducing either a conventional or a literal interpretation of a priming sentence. Results suggest that both native and nonnative speakers process both meanings of an ambiguous conventional request.…

  5. Taguchi's off line method and Multivariate loss function approach for quality management and optimization of process parameters -A review

    NASA Astrophysics Data System (ADS)

    Bharti, P. K.; Khan, M. I.; Singh, Harbinder

    2010-10-01

    Off-line quality control is considered to be an effective approach to improve product quality at a relatively low cost. The Taguchi method is one of the conventional approaches for this purpose. Through this approach, engineers can determine a feasible combination of design parameters such that the variability of a product's response is reduced and the mean is close to the desired target. The traditional Taguchi method focused on ensuring good performance at the parameter design stage for a single quality characteristic, but most products and processes have multiple quality characteristics, and the optimal parameter design should then minimize the total quality loss across all of them. Several studies have presented approaches for multiple quality characteristics, most of them concerned with finding the parameter combination that maximizes the signal-to-noise (SN) ratios. The results reveal two advantages of this approach: for a single quality characteristic, the optimal parameter design coincides with that of the traditional Taguchi method; for multiple quality characteristics, the optimal design maximizes the reduction of total quality loss. This paper presents a literature review on solving multi-response problems in the Taguchi method and its successful implementation in various industries.
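
    The quantities at the center of this literature are Taguchi's quadratic loss function and the signal-to-noise ratios maximized at the parameter design stage; the standard textbook forms are summarized below in generic notation (not tied to any one of the reviewed papers).

      L(y) = k\,(y - m)^2                                                        % quadratic loss about target m, cost coefficient k
      SN_{NB} = 10\,\log_{10}\left(\bar{y}^{\,2} / s^2\right)                    % nominal-the-best
      SN_{SB} = -10\,\log_{10}\left(\tfrac{1}{n}\sum_{i=1}^{n} y_i^2\right)      % smaller-the-better
      SN_{LB} = -10\,\log_{10}\left(\tfrac{1}{n}\sum_{i=1}^{n} 1/y_i^2\right)    % larger-the-better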

  6. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  7. Image data-processing system for solar astronomy

    NASA Technical Reports Server (NTRS)

    Wilson, R. M.; Teuber, D. L.; Watkins, J. R.; Thomas, D. T.; Cooper, C. M.

    1977-01-01

    The paper describes an image data processing system (IDAPS), its hardware/software configuration, and interactive and batch modes of operation for the analysis of the Skylab/Apollo Telescope Mount S056 X-Ray Telescope experiment data. Interactive IDAPS is primarily designed to provide on-line interactive user control of image processing operations for image familiarization, sequence and parameter optimization, and selective feature extraction and analysis. Batch IDAPS follows the normal conventions of card control and data input and output, and is best suited where the desired parameters and sequence of operations are known and when long image-processing times are required. Particular attention is given to the way in which this system has been used in solar astronomy and other investigations. Some recent results obtained by means of IDAPS are presented.

  8. The status of membrane bioreactor technology.

    PubMed

    Judd, Simon

    2008-02-01

    In this article, the current status of membrane bioreactor (MBR) technology for wastewater treatment is reviewed. Fundamental facets of the MBR process and membrane and process configurations are outlined and the advantages and disadvantages over conventional suspended growth-based biotreatment are briefly identified. Key process design and operating parameters are defined and their significance explained. The inter-relationships between these parameters are identified and their implications discussed, with particular reference to impacts on membrane surface fouling and channel clogging. In addition, current understanding of membrane surface fouling and identification of candidate foulants is appraised. Although much interest in this technology exists and its penetration of the market will probably increase significantly, there remains a lack of understanding of key process constraints such as membrane channel clogging, and of the science of membrane cleaning.

  9. Lost in the crowd? Using eye-tracking to investigate the effect of complexity on attribute non-attendance in discrete choice experiments.

    PubMed

    Spinks, Jean; Mortimer, Duncan

    2016-02-03

    The provision of additional information is often assumed to improve consumption decisions, allowing consumers to more accurately weigh the costs and benefits of alternatives. However, increasing the complexity of decision problems may prompt changes in information processing. This is particularly relevant for experimental methods such as discrete choice experiments (DCEs) where the researcher can manipulate the complexity of the decision problem. The primary aims of this study are (i) to test whether consumers actually process additional information in an already complex decision problem, and (ii) consider the implications of any such 'complexity-driven' changes in information processing for design and analysis of DCEs. A discrete choice experiment (DCE) is used to simulate a complex decision problem; here, the choice between complementary and conventional medicine for different health conditions. Eye-tracking technology is used to capture the number of times and the duration that a participant looks at any part of a computer screen during completion of DCE choice sets. From this we can analyse what has become known in the DCE literature as 'attribute non-attendance' (ANA). Using data from 32 participants, we model the likelihood of ANA as a function of choice set complexity and respondent characteristics using fixed and random effects models to account for repeated choice set completion. We also model whether participants are consistent with regard to which characteristics (attributes) they consider across choice sets. We find that complexity is the strongest predictor of ANA when other possible influences, such as time pressure, ordering effects, survey specific effects and socio-demographic variables (including proxies for prior experience with the decision problem) are considered. We also find that most participants do not apply a consistent information processing strategy across choice sets. Eye-tracking technology shows promise as a way of obtaining additional information from consumer research, improving DCE design, and informing the design of policy measures. With regards to DCE design, results from the present study suggest that eye-tracking data can identify the point at which adding complexity (and realism) to DCE choice scenarios becomes self-defeating due to unacceptable increases in ANA. Eye-tracking data therefore has clear application in the construction of guidelines for DCE design and during piloting of DCE choice scenarios. With regards to design of policy measures such as labelling requirements for CAM and conventional medicines, the provision of additional information has the potential to make difficult decisions even harder and may not have the desired effect on decision-making.
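
    As a rough illustration of the modelling step, the probability of attribute non-attendance (ANA) can be regressed on choice set complexity and respondent characteristics with a binary-outcome model. The sketch below (Python/statsmodels) uses synthetic data and made-up variable names, and fits a pooled logit only; it does not reproduce the fixed and random effects structure for repeated choice sets used in the study.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic stand-in data: one row per participant x choice set x attribute,
      # with a binary indicator of attribute non-attendance (ANA). Variable names
      # and the data-generating process are invented for illustration.
      rng = np.random.default_rng(1)
      n = 2000
      df = pd.DataFrame({
          "complexity": rng.integers(3, 9, n),       # e.g. number of attributes shown
          "time_pressure": rng.integers(0, 2, n),
          "prior_experience": rng.integers(0, 2, n),
      })
      logit_p = -3.0 + 0.45 * df["complexity"] + 0.3 * df["time_pressure"]
      df["ana"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

      # Pooled logit of ANA on complexity and respondent characteristics
      fit = smf.logit("ana ~ complexity + time_pressure + prior_experience", data=df).fit(disp=0)
      print(fit.params.round(2))   # complexity should carry the largest positive effect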

  10. Magnetic Gearing Versus Conventional Gearing in Actuators for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Puchhammer, Gregor

    2014-01-01

    Magnetic geared actuators (MGA) are designed to perform highly reliable, robust and precise motion on satellite platforms or aerospace vehicles. The design allows MGA to be used for various tasks in space applications. In contrast to conventional geared drives, the contact and lubrication free force transmitting elements lead to a considerable lifetime and range extension of drive systems. This paper describes the fundamentals of magnetic wobbling gears (MWG) and the deduced inherent characteristics, and compares conventional and magnetic gearing.

  11. Advances in Neutron Radiography: Application to Additive Manufacturing Inconel 718

    DOE PAGES

    Bilheux, Hassina Z; Song, Gian; An, Ke; ...

    2016-01-01

    Reactor-based neutron radiography is a non-destructive, non-invasive characterization technique that has been extensively used on engineering materials for tasks such as inspection of components, evaluation of porosity, and in-operando observations of engineering parts. Neutron radiography has flourished at reactor facilities for more than four decades and is relatively new to accelerator-based neutron sources. Recent advances in neutron source and detector technologies, such as the Spallation Neutron Source (SNS) at the Oak Ridge National Laboratory (ORNL) in Oak Ridge, TN, and the microchannel plate (MCP) detector, respectively, enable new contrast mechanisms using the neutron scattering Bragg features for crystalline information such as average lattice strain, crystalline plane orientation, and identification of phases in a neutron radiograph. Additive manufacturing (AM) processes or 3D printing have recently become very popular and have a significant potential to revolutionize the manufacturing of materials by enabling new designs with complex geometries that are not feasible using conventional manufacturing processes. However, the technique lacks standards for process optimization and control compared to conventional processes. Residual stresses are a common occurrence in materials that are machined, rolled, heat treated, welded, etc., and have a significant impact on a component's mechanical behavior and durability. They may also arise during the 3D printing process, and defects such as internal cracks can propagate over time as the component relaxes after being removed from its build plate (the base plate utilized to print materials on). Moreover, since access to the AM material is possible only after the component has been fully manufactured, it is difficult to characterize the material for defects a priori to minimize expensive re-runs. Currently, validation of the AM process and materials is mainly through expensive trial-and-error experiments at the component level, whereas in conventional processes the level of confidence in predictive computational modeling is high enough to allow process and materials optimization through computational approaches. Thus, there is a clear need for non-destructive characterization techniques and for the establishment of processing-microstructure databases that can be used for developing and validating predictive modeling tools for AM.

  12. Techno-economic analysis of decentralized biomass processing depots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamers, Patrick; Roni, Mohammad S.; Tumuluru, Jaya S.

    Decentralized biomass processing facilities, known as biomass depots, may be necessary to achieve feedstock cost, quantity, and quality required to grow the future U.S. bioeconomy. In this paper, we assess three distinct depot configurations for technical difference and economic performance. The depot designs were chosen to compare and contrast a suite of capabilities that a depot could perform ranging from conventional pelleting to sophisticated pretreatment technologies. Our economic analyses indicate that depot processing costs are likely to range from ~US$30 to US$63 per dry metric tonne (Mg), depending upon the specific technology implemented and the energy consumption for processing equipment such as grinders and dryers. We conclude that the benefits of integrating depots into the overall biomass feedstock supply chain will outweigh depot processing costs and that incorporation of this technology should be aggressively pursued.

  13. Numerical Simulation of Hydro-mechanical Deep Drawing — A Study on the Effect of Process Parameters on Drawability and Thickness Variation

    NASA Astrophysics Data System (ADS)

    Singh, Swadesh Kumar; Kumar, D. Ravi

    2005-08-01

    Hydro-mechanical deep drawing is a process for producing cup shaped parts with the assistance of a pressurized fluid. In the present work, numerical simulation of the conventional and counter pressure deep drawing processes has been done with the help of a finite element method based software. Simulation results were analyzed to study the improvement in drawability by using hydro-mechanical processes. The thickness variations in the drawn cups were analyzed and also the effect of counter pressure and oil gap on the thickness distribution was studied. Numerical simulations were also used for the die design, which combines both drawing and ironing processes in a single operation. This modification in the die provides high drawability, facilitates smooth material flow, gives more uniform thickness distribution and corrects the shape distortion.

  14. Techno-economic analysis of decentralized biomass processing depots

    DOE PAGES

    Lamers, Patrick; Roni, Mohammad S.; Tumuluru, Jaya S.; ...

    2015-07-08

    Decentralized biomass processing facilities, known as biomass depots, may be necessary to achieve feedstock cost, quantity, and quality required to grow the future U.S. bioeconomy. In this paper, we assess three distinct depot configurations for technical difference and economic performance. The depot designs were chosen to compare and contrast a suite of capabilities that a depot could perform ranging from conventional pelleting to sophisticated pretreatment technologies. Our economic analyses indicate that depot processing costs are likely to range from ~US$30 to US$63 per dry metric tonne (Mg), depending upon the specific technology implemented and the energy consumption for processing equipment such as grinders and dryers. We conclude that the benefits of integrating depots into the overall biomass feedstock supply chain will outweigh depot processing costs and that incorporation of this technology should be aggressively pursued.

  15. 3S (Safeguards, Security, Safety) based pyroprocessing facility safety evaluation plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ku, J.H.; Choung, W.M.; You, G.S.

    The big advantage of pyroprocessing for the management of spent fuel over conventional reprocessing technologies lies in its proliferation resistance, since pure plutonium cannot be separated from the spent fuel. The extracted materials can be used directly as metal fuel in a fast reactor, and pyroprocessing drastically reduces the volume and heat load of the spent fuel. KAERI has implemented the SBD (Safeguards-By-Design) concept in nuclear fuel cycle facilities. The goal of SBD is to integrate international safeguards into the entire facility design process from the very beginning of the design phase. This paper presents a safety evaluation plan using a conceptual design of a reference pyroprocessing facility, in which the 3S (Safeguards, Security, Safety)-By-Design (3SBD) concept is integrated from the early conceptual design phase. The purpose of this paper is to establish an advanced pyroprocessing hot cell facility design concept based on 3SBD for the successful realization of pyroprocessing technology with enhanced safety and proliferation resistance.

  16. Consumers' Kansei Needs Clustering Method for Product Emotional Design Based on Numerical Design Structure Matrix and Genetic Algorithms.

    PubMed

    Yang, Yan-Pu; Chen, Deng-Kai; Gu, Rong; Gu, Yu-Feng; Yu, Sui-Huai

    2016-01-01

    Consumers' Kansei needs reflect their perception about a product and always consist of a large number of adjectives. Reducing the dimension complexity of these needs to extract primary words not only enables the target product to be explicitly positioned, but also provides a convenient design basis for designers engaging in design work. Accordingly, this study employs a numerical design structure matrix (NDSM) by parameterizing a conventional DSM and integrating genetic algorithms to find optimum Kansei clusters. A four-point scale method is applied to assign link weights of every two Kansei adjectives as values of cells when constructing an NDSM. Genetic algorithms are used to cluster the Kansei NDSM and find optimum clusters. Furthermore, the process of the proposed method is presented. The details of the proposed approach are illustrated using an example of electronic scooter for Kansei needs clustering. The case study reveals that the proposed method is promising for clustering Kansei needs adjectives in product emotional design.
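
    A compact sketch of the clustering idea is given below (Python): an NDSM of pairwise link weights on a four-point scale is built for a handful of placeholder Kansei adjectives, and a small genetic algorithm searches for cluster assignments that maximize the total centred within-cluster link weight. The adjectives, weights, GA settings, and fitness definition are all illustrative assumptions, not the authors' exact formulation.

      import numpy as np

      rng = np.random.default_rng(42)

      # Placeholder Kansei adjectives and a symmetric NDSM of link weights on a
      # four-point scale (0-3); both are invented for illustration.
      words = ["sporty", "agile", "fast", "elegant", "refined", "sleek", "robust", "sturdy"]
      n = len(words)
      ndsm = rng.integers(0, 2, size=(n, n))          # weak background links
      for group in [(0, 1, 2), (3, 4, 5), (6, 7)]:    # planted clusters to recover
          for i in group:
              for j in group:
                  if i != j:
                      ndsm[i, j] = 3
      ndsm = np.maximum(ndsm, ndsm.T)                 # keep the matrix symmetric
      np.fill_diagonal(ndsm, 0)

      K = 3                                           # number of clusters (assumed)

      def fitness(assign):
          """Sum of centred link weights over pairs placed in the same cluster:
          strong links (above the scale midpoint) reward co-clustering, weak links
          penalize it, so lumping every adjective together is not optimal."""
          same = assign[:, None] == assign[None, :]
          np.fill_diagonal(same, False)
          return ((ndsm - 1.5)[same]).sum() / 2.0

      def evolve(pop_size=60, generations=200, p_mut=0.1):
          pop = rng.integers(0, K, size=(pop_size, n))
          for _ in range(generations):
              scores = np.array([fitness(ind) for ind in pop])
              parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = parents[rng.integers(0, len(parents), 2)]
                  cut = rng.integers(1, n)                               # one-point crossover
                  child = np.concatenate([a[:cut], b[cut:]])
                  mask = rng.random(n) < p_mut                           # random-reset mutation
                  child[mask] = rng.integers(0, K, mask.sum())
                  children.append(child)
              pop = np.vstack([parents, children])
          return pop[np.argmax([fitness(ind) for ind in pop])]

      best = evolve()
      for c in range(K):
          print(f"cluster {c}:", [w for w, g in zip(words, best) if g == c])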

  17. Consumers' Kansei Needs Clustering Method for Product Emotional Design Based on Numerical Design Structure Matrix and Genetic Algorithms

    PubMed Central

    Chen, Deng-kai; Gu, Rong; Gu, Yu-feng; Yu, Sui-huai

    2016-01-01

    Consumers' Kansei needs reflect their perception about a product and always consist of a large number of adjectives. Reducing the dimension complexity of these needs to extract primary words not only enables the target product to be explicitly positioned, but also provides a convenient design basis for designers engaging in design work. Accordingly, this study employs a numerical design structure matrix (NDSM) by parameterizing a conventional DSM and integrating genetic algorithms to find optimum Kansei clusters. A four-point scale method is applied to assign link weights of every two Kansei adjectives as values of cells when constructing an NDSM. Genetic algorithms are used to cluster the Kansei NDSM and find optimum clusters. Furthermore, the process of the proposed method is presented. The details of the proposed approach are illustrated using an example of electronic scooter for Kansei needs clustering. The case study reveals that the proposed method is promising for clustering Kansei needs adjectives in product emotional design. PMID:27630709

  18. Practical experience with full-scale structured sheet media (SSM) integrated fixed-film activated sludge (IFAS) systems for nitrification.

    PubMed

    Li, Hua; Zhu, Jia; Flamming, James J; O'Connell, Jack; Shrader, Michael

    2015-01-01

    Many wastewater treatment plants in the USA, which were originally designed as secondary treatment systems with no or partial nitrification requirements, are facing increased flows, loads, and more stringent ammonia discharge limits. Plant expansion is often not cost-effective due to either high construction costs or lack of land. Under these circumstances, integrated fixed-film activated sludge (IFAS) systems using both suspended growth and biofilms that grow attached to a fixed plastic structured sheet media are found to be a viable solution for solving the challenges. Multiple plants have been retrofitted with such IFAS systems in the past few years. The system has proven to be efficient and reliable in achieving not only consistent nitrification, but also enhanced bio-chemical oxygen demand removal and sludge settling characteristics. This paper presents long-term practical experiences with the IFAS system design, operation and maintenance, and performance for three full-scale plants with distinct processes; that is, a trickling filter/solids contact process, a conventional plug flow activated sludge process and an extended aeration process.

  19. Fatigue Strength Prediction for Titanium Alloy TiAl6V4 Manufactured by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Leuders, Stefan; Vollmer, Malte; Brenne, Florian; Tröster, Thomas; Niendorf, Thomas

    2015-09-01

    Selective laser melting (SLM), as a metalworking additive manufacturing technique, received considerable attention from industry and academia due to unprecedented design freedom and overall balanced material properties. However, the fatigue behavior of SLM-processed materials often suffers from local imperfections such as micron-sized pores. In order to enable robust designs of SLM components used in an industrial environment, further research regarding process-induced porosity and its impact on the fatigue behavior is required. Hence, this study aims at a transfer of fatigue prediction models, established for conventional process-routes, to the field of SLM materials. By using high-resolution computed tomography, load increase tests, and electron microscopy, it is shown that pore-based fatigue strength predictions for a titanium alloy TiAl6V4 have become feasible. However, the obtained accuracies are subjected to scatter, which is probably caused by the high defect density even present in SLM materials manufactured following optimized processing routes. Based on thorough examination of crack surfaces and crack initiation sites, respectively, implications for optimization of prediction accuracy of the models in focus are deduced.
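
    One widely used pore-based prediction of the kind referred to here is Murakami's square-root-of-area parameter model, which estimates the fatigue limit from hardness and the projected size of the critical defect. The sketch below (Python) implements that textbook relation with assumed hardness and pore-size values; it is offered as an example of the model class and is not necessarily the exact model adopted by the authors.

      def murakami_fatigue_limit(hv, sqrt_area_um, internal=True):
          """Estimate the fatigue limit [MPa] from Vickers hardness and the square root
          of the projected defect area [micrometers] using Murakami's empirical relation
          (coefficient 1.56 for internal defects, 1.43 for surface defects). Textbook
          form for fully reversed loading; the input values below are not from the paper."""
          c = 1.56 if internal else 1.43
          return c * (hv + 120.0) / (sqrt_area_um ** (1.0 / 6.0))

      # Illustrative inputs for SLM TiAl6V4 (assumed): HV ~ 360, internal pore ~ 60 um
      print(f"estimated fatigue limit: {murakami_fatigue_limit(360, 60):.0f} MPa")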

  20. Optimal control of the gear shifting process for shift smoothness in dual-clutch transmissions

    NASA Astrophysics Data System (ADS)

    Li, Guoqiang; Görges, Daniel

    2018-03-01

    The control of the transmission system in vehicles is significant for the driving comfort. In order to design a controller for smooth shifting and comfortable driving, a dynamic model of a dual-clutch transmission is presented in this paper. A finite-time linear quadratic regulator is proposed for the optimal control of the two friction clutches in the torque phase for the upshift process. An integral linear quadratic regulator is introduced to regulate the relative speed difference between the engine and the slipping clutch under the optimization of the input torque during the inertia phase. The control objective focuses on smoothing the upshift process so as to improve the driving comfort. Considering the available sensors in vehicles for feedback control, an observer design is presented to track the immeasurable variables. Simulation results show that the jerk can be reduced both in the torque phase and inertia phase, indicating good shift performance. Furthermore, compared with conventional controllers for the upshift process, the proposed control method can reduce shift jerk and improve shift quality.
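
    A minimal illustration of the LQR machinery used for such clutch control is given below (Python/SciPy), with an arbitrary two-state linear model and the continuous-time algebraic Riccati solver. The system matrices and weights are placeholders, not the dual-clutch transmission model from the paper, and the infinite-horizon gain stands in for the finite-time and integral formulations for brevity.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Placeholder two-state model (integrated clutch slip and clutch slip speed);
      # A, B, Q, R are illustrative, not the dual-clutch transmission model.
      A = np.array([[0.0, 1.0],
                    [0.0, -2.0]])
      B = np.array([[0.0],
                    [1.5]])
      Q = np.diag([10.0, 1.0])     # weights on [integrated slip, slip] for a smooth shift
      R = np.array([[0.1]])        # weight on clutch torque effort

      # Infinite-horizon LQR gain: u = -K x with K = R^{-1} B^T P
      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)
      print("LQR gain K:", np.round(K, 3))

      # Closed-loop check: eigenvalues of A - B K should have negative real parts
      print("closed-loop poles:", np.round(np.linalg.eigvals(A - B @ K), 3))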
