Sample records for process requires high

  1. 48 CFR 1352.237-70 - Security processing requirements-high or moderate risk contracts.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... requirements-high or moderate risk contracts. 1352.237-70 Section 1352.237-70 Federal Acquisition Regulations... Provisions and Clauses 1352.237-70 Security processing requirements—high or moderate risk contracts. As prescribed in 48 CFR 1337.110-70 (b), insert the following clause: Security Processing Requirements—High or...

  2. Command and Control Common Semantic Core Required to Enable Net-centric Operations

    DTIC Science & Technology

    2008-05-20

automated processing capability. A former US Marine Corps component C4 director during Operation Iraqi Freedom identified the problems of 1) uncertainty...interoperability improvements to warfighter community processes, thanks to ubiquitous automated processing, are likely high and somewhat easier to quantify. A...synchronized with the actions of other partners/warfare communities. This requires high-quality information, rapid sharing and automated processing – which

  3. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  4. A requirements index for information processing in hospitals.

    PubMed

    Ammenwerth, E; Buchauer, A; Haux, R

    2002-01-01

Reference models describing typical information processing requirements in hospitals do not currently exist. This leads to high hospital information system (HIS) management expenses, for example, during tender processes for the acquisition of software application programs. Our aim was, therefore, to develop a comprehensive, lasting, technology-independent, and sufficiently detailed index of requirements for information processing in hospitals in order to reduce respective expenses. Two dozen German experts established an index of requirements for information processing in university hospitals. This was done in a consensus-based, top-down, cyclic manner. Each functional requirement was derived from information processing functions and sub-functions of a hospital. The result is the first official German version of a requirements index, containing 233 functional requirements and 102 function-independent requirements, focusing on German needs. The functional requirements are structured according to the primary care process from admission to discharge and supplemented by requirements for handling patient records, work organization and resource planning, hospital management, research and education. Both the German version and its English translation are available on the Internet. The index of requirements contains general information processing requirements in hospitals which are formulated independently of information processing tools, or of HIS architectures. It aims at supporting HIS management, especially HIS strategic planning, HIS evaluation, and tender processes. The index can be regarded as a draft, which must, however, be refined according to the specific aims of a particular project. Although focused on German needs, we expect that it can also be useful in other countries. The high level of interest shown in the index supports its usefulness.

  5. HIGH-TEMPERATURE AND HIGH-PRESSURE PARTICULATE CONTROL REQUIREMENTS

    EPA Science Inventory

    The report reviews and evaluates high-temperature and high-pressure particulate cleanup requirements of existing and proposed energy processes. The study's aims are to define specific high-temperature and high-pressure particle removal problems, to indicate potential solutions, a...

  6. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, John R.; Stolz, Christopher J.

    1993-08-01

Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High-average-power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  7. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, J. R.; Stolz, C. J.

    1992-12-01

Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High-average-power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  8. Strawberry puree processed by thermal, high pressure, or power ultrasound: Process energy requirements and quality modeling during storage.

    PubMed

    Sulaiman, Alifdalino; Farid, Mohammed; Silva, Filipa Vm

    2017-06-01

Strawberry puree was processed for 15 min using thermal (65 ℃), high-pressure processing (600 MPa, 48 ℃), and ultrasound (24 kHz, 1.3 W/g, 33 ℃) treatments. These conditions were selected based on similar polyphenoloxidase inactivation (11%-18%). The specific energies required for the above-mentioned thermal, high-pressure processing, and power ultrasound processes were 240, 291, and 1233 kJ/kg, respectively. Then, the processed strawberry was stored at 3 ℃ and room temperature for 30 days. The constant pH (3.38 ± 0.03) and soluble solids content (9.03 ± 0.25 °Brix) during storage indicated microbiological stability. Polyphenoloxidase did not reactivate during storage. The high-pressure processing and ultrasound treatments retained the antioxidant activity (70%-74%) better than the thermal process (60%), and high-pressure processing was the best treatment after 30 days of ambient storage for preserving antioxidant activity. Puree treated with ultrasound showed better color retention after processing and after ambient storage than the other preservation methods. For the three treatments, the changes in antioxidant activity and total color difference during storage were described by the fractional conversion model, with rate constants k ranging between 0.03-0.09 and 0.06-0.22 day⁻¹, respectively. In summary, high-pressure processing and thermal processes required much less energy than ultrasound for the same polyphenoloxidase inactivation in strawberry. While high-pressure processing better retained the antioxidant activity of the strawberry puree during storage, the ultrasound treatment was better in terms of color retention.
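
    The fractional conversion model named in the abstract can be sketched numerically. The function below is the generic form of that model, c(t) = c_f + (c_0 − c_f)·e^(−kt); the initial value, plateau, and rate constant are illustrative assumptions, with k chosen from within the 0.03-0.09 day⁻¹ range the abstract reports for antioxidant activity:

    ```python
    import math

    def fractional_conversion(c0, cf, k, t):
        """Fractional conversion model: c(t) = cf + (c0 - cf) * exp(-k * t)."""
        return cf + (c0 - cf) * math.exp(-k * t)

    # Illustrative (assumed) parameters: 100% initial antioxidant activity,
    # a hypothetical 60% plateau, and k = 0.05 per day (within the reported
    # 0.03-0.09 day^-1 range).
    retention_day_30 = fractional_conversion(100.0, 60.0, 0.05, 30.0)  # ~68.9%
    ```

    With these assumed values, activity decays from 100% toward a 60% plateau, reaching about 69% after 30 days of storage.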

  9. Significantly reducing the processing times of high-speed photometry data sets using a distributed computing model

    NASA Astrophysics Data System (ADS)

    Doyle, Paul; Mtenzi, Fred; Smith, Niall; Collins, Adrian; O'Shea, Brendan

    2012-09-01

    The scientific community is in the midst of a data analysis crisis. The increasing capacity of scientific CCD instrumentation and their falling costs is contributing to an explosive generation of raw photometric data. This data must go through a process of cleaning and reduction before it can be used for high precision photometric analysis. Many existing data processing pipelines either assume a relatively small dataset or are batch processed by a High Performance Computing centre. A radical overhaul of these processing pipelines is required to allow reduction and cleaning rates to process terabyte sized datasets at near capture rates using an elastic processing architecture. The ability to access computing resources and to allow them to grow and shrink as demand fluctuates is essential, as is exploiting the parallel nature of the datasets. A distributed data processing pipeline is required. It should incorporate lossless data compression, allow for data segmentation and support processing of data segments in parallel. Academic institutes can collaborate and provide an elastic computing model without the requirement for large centralized high performance computing data centers. This paper demonstrates how a base 10 order of magnitude improvement in overall processing time has been achieved using the "ACN pipeline", a distributed pipeline spanning multiple academic institutes.
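
    The segment-parallel structure the abstract calls for can be sketched in a few lines. This is a minimal illustration, not the ACN pipeline itself: `clean_and_reduce` is a hypothetical stand-in for the real calibration step, and a local thread pool stands in for the distributed, multi-institute workers.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def clean_and_reduce(segment):
        # Placeholder for per-segment cleaning/reduction (e.g. bias
        # subtraction and flat-fielding of one chunk of CCD frames).
        return [pixel * 0.5 for pixel in segment]

    def run_pipeline(frames, n_segments=4):
        # Split the dataset into independent segments and process them
        # concurrently -- the parallelism the elastic model exploits
        # (a real pipeline would farm segments out to many hosts).
        size = max(1, len(frames) // n_segments)
        segments = [frames[i:i + size] for i in range(0, len(frames), size)]
        with ThreadPoolExecutor() as pool:
            results = list(pool.map(clean_and_reduce, segments))
        # Reassemble the cleaned segments in their original order.
        return [x for seg in results for x in seg]
    ```

    Because each segment is processed independently, worker capacity can grow or shrink with demand without changing the pipeline's logic.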

  10. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    NASA Technical Reports Server (NTRS)

    Rey, Charles A.

    1991-01-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  11. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    NASA Astrophysics Data System (ADS)

    Rey, Charles A.

    1991-03-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, F.N. Jr.

The Dynacracking process developed by Hydrocarbon Research, Inc., is a non-catalytic process capable of upgrading heavy oil whose sulfur, metal, and carbon contents may be high. It converts residual stocks to distillates with high naphtha yields, and to synthetic fuel gas of high quality (700-800 Btu/ft³). It has essentially no air pollution emissions and requires a relatively small amount of water and utilities. The process generates sufficient heat internally such that, except for start-up, no boilers, furnaces, or external heaters are required to operate the plant. Several aspects of the process are discussed: chemistry, hardware, feedstock, flexibility in the product mix, product quality, and economics.

  13. High efficiency solar cell processing

    NASA Technical Reports Server (NTRS)

    Ho, F.; Iles, P. A.

    1985-01-01

At the time of writing, cells made by several groups are approaching 19% efficiency. General aspects of the processing required for such cells are discussed. Most processing used for high efficiency cells is derived from space-cell or concentrator-cell technology, and recent advances have been obtained from improved techniques rather than from better understanding of the limiting mechanisms. Theory and modeling are fairly well developed, and adequate to guide further asymptotic increases in performance of near-conventional cells. There are several competitive cell designs with promise of higher performance (>20%), but for these designs further improvements are required. The available cell processing technology to fabricate high efficiency cells is examined.

  14. High Rate Digital Demodulator ASIC

    NASA Technical Reports Server (NTRS)

    Ghuman, Parminder; Sheikh, Salman; Koubek, Steve; Hoy, Scott; Gray, Andrew

    1998-01-01

The architecture of a High Rate (600 Megabits per second) Digital Demodulator (HRDD) ASIC capable of demodulating BPSK and QPSK modulated data is presented in this paper. The advantages of all-digital processing include increased flexibility and reliability with reduced reproduction costs. Conventional serial digital processing would require high processing rates, necessitating a hardware implementation in a technology other than CMOS, such as gallium arsenide (GaAs), which has high cost and power requirements. It is more desirable to use CMOS technology with its lower power requirements and higher gate density. However, digital demodulation of high data rates in CMOS requires parallel algorithms to process the sampled data at a rate lower than the data rate. The parallel processing algorithms described here were developed jointly by NASA's Goddard Space Flight Center (GSFC) and the Jet Propulsion Laboratory (JPL). The resulting all-digital receiver has the capability to demodulate BPSK, QPSK, OQPSK, and DQPSK at data rates in excess of 300 Megabits per second (Mbps) per channel. This paper provides an overview of the parallel architecture and features of the HRDD ASIC. In addition, this paper provides an overview of the implementation of the hardware architectures used to create flexibility over conventional high rate analog or hybrid receivers. This flexibility includes a wide range of data rates, modulation schemes, and operating environments. In conclusion, it is shown how this high rate digital demodulator can be used with an off-the-shelf A/D and a flexible analog front end, both of which are numerically computer controlled, to produce a very flexible, low-cost, high-rate digital receiver.

  15. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  16. Discovering system requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bahill, A.T.; Bentz, B.; Dean, F.F.

    1996-07-01

    Cost and schedule overruns are often caused by poor requirements that are produced by people who do not understand the requirements process. This report provides a high-level overview of the system requirements process, explaining types, sources, and characteristics of good requirements. System requirements, however, are seldom stated by the customer. Therefore, this report shows ways to help you work with your customer to discover the system requirements. It also explains terminology commonly used in the requirements development field, such as verification, validation, technical performance measures, and the various design reviews.

  17. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  18. Silicon-Germanium Fast Packet Switch Developed for Communications Satellites

    NASA Technical Reports Server (NTRS)

    Quintana, Jorge A.

    1999-01-01

Emerging multimedia applications and future satellite systems will require high-speed switching networks to accommodate high data-rate traffic among thousands of potential users. This will require advanced switching devices to enable communication between satellites. The NASA Lewis Research Center has been working closely with industry to develop a state-of-the-art fast packet switch (FPS) to fulfill this requirement. Recently, the Satellite Industry Task Force identified the need for high-capacity onboard processing switching components as one of the "grand challenges" for the satellite industry in the 21st century. In response to this challenge, future generations of onboard processing satellites will require low power and low mass components to enable transmission of services in the 100 gigabit (10^11 bits) per second (Gbps) range.

  19. RTM: Cost-effective processing of composite structures

    NASA Technical Reports Server (NTRS)

    Hasko, Greg; Dexter, H. Benson

    1991-01-01

Resin transfer molding (RTM) is a promising method for cost-effective fabrication of high-strength, low-weight composite structures from textile preforms. In this process, dry fibers are placed in a mold, resin is introduced either by vacuum infusion or pressure, and the part is cured. RTM has been used in many industries, including automotive, recreation, and aerospace. Each of these industries has different requirements for material strength, weight, reliability, environmental resistance, cost, and production rate. These requirements drive the selection of fibers and resins, fiber volume fractions, fiber orientations, mold design, and processing equipment. Research has been conducted on applying RTM to primary aircraft structures, which require high strength and stiffness at low density. The material requirements of various industries are discussed, along with methods of orienting and distributing fibers, mold configurations, and processing parameters. Processing and material parameters such as resin viscosity, preform compaction and permeability, and tool design concepts are discussed. Experimental methods to measure preform compaction and permeability are presented.

  20. Mass production of silicon pore optics for ATHENA

    NASA Astrophysics Data System (ADS)

    Wille, Eric; Bavdaz, Marcos; Collon, Maximilien

    2016-07-01

Silicon Pore Optics (SPO) provide high angular resolution with low effective area density as required for the Advanced Telescope for High Energy Astrophysics (Athena). The x-ray telescope consists of several hundreds of SPO mirror modules. During the development of the process steps of the SPO technology, specific requirements of a future mass production have been considered right from the beginning. The manufacturing methods heavily utilise off-the-shelf equipment from the semiconductor industry, robotic automation and parallel processing. This allows the present production flow to be upscaled in a cost-effective way to produce hundreds of mirror modules per year. Considering manufacturing predictions based on the current technology status, we present an analysis of the time and resources required for the Athena flight programme. This includes the full production process starting with Si wafers up to the integration of the mirror modules. We present the times required for the individual process steps and identify the equipment required to produce two mirror modules per day. A preliminary timeline for building and commissioning the required infrastructure, and for flight model production of about 1000 mirror modules, is presented.

  1. An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements

    PubMed Central

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches. PMID:24982987

  2. An approach for integrating the prioritization of functional and nonfunctional requirements.

    PubMed

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches.
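
    For readers unfamiliar with the AHP baseline the experiment compares against, a minimal sketch of how AHP derives priority weights from a pairwise comparison matrix follows. The row geometric mean approximation and the example matrix are illustrative assumptions, not material from the paper:

    ```python
    import math

    def ahp_priorities(matrix):
        # Approximate the AHP priority vector via row geometric means,
        # a standard approximation to the principal eigenvector.
        gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
        total = sum(gms)
        return [g / total for g in gms]

    # Hypothetical 3-requirement comparison matrix: matrix[i][j] states
    # how much more important requirement i is judged than requirement j
    # (reciprocals on the mirrored entries, 1s on the diagonal).
    pairwise = [
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ]
    weights = ahp_priorities(pairwise)  # normalized priorities, sum to 1
    ```

    The pairwise judgments are the source of AHP's cost: n requirements need on the order of n²/2 comparisons, which is why the abstract measures time consumption against it.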

  3. When unconscious rewards boost cognitive task performance inefficiently: the role of consciousness in integrating value and attainability information

    PubMed Central

    Zedelius, Claire M.; Veling, Harm; Aarts, Henk

    2012-01-01

Research has shown that high vs. low value rewards improve cognitive task performance independent of whether they are perceived consciously or unconsciously. However, efficient performance in response to high value rewards also depends on whether or not rewards are attainable. This raises the question of whether unconscious reward processing enables people to take such attainability information into account. Building on a theoretical framework according to which conscious reward processing is required to enable higher level cognitive processing, the present research tested the hypothesis that conscious but not unconscious reward processing enables integration of reward value with attainability information. In two behavioral experiments, participants were exposed to masked high and low value coins serving as rewards on a working memory (WM) task. The likelihood of conscious processing was manipulated by presenting the coins either relatively briefly (17 ms) or long enough to be clearly visible (300 ms). Crucially, rewards were expected to be attainable or unattainable. Requirements to integrate reward value with attainability information varied across experiments. Results showed that when integration of value and attainability was required (Experiment 1), long reward presentation led to efficient performance, i.e., selectively improved performance for high value attainable rewards. In contrast, in the short presentation condition, performance was increased for high value rewards even when these were unattainable. This difference between the effects of long and short presentation time disappeared when integration of value and attainability information was not required (Experiment 2). Together these findings suggest that unconsciously processed reward information is not integrated with attainability expectancies, causing inefficient effort investment. These findings are discussed in terms of a unique role of consciousness in efficient allocation of effort to cognitive control processes. PMID:22848198

  4. Mathematical Analysis of High-Temperature Co-electrolysis of CO2 and O2 Production in a Closed-Loop Atmosphere Revitalization System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael G. McKellar; Manohar S. Sohal; Lila Mulloth

    2010-03-01

NASA has been evaluating two closed-loop atmosphere revitalization architectures based on Sabatier and Bosch carbon dioxide, CO2, reduction technologies. The CO2 and steam, H2O, co-electrolysis process is another option that NASA has investigated. Utilizing recent advances in the fuel cell technology sector, the Idaho National Laboratory, INL, has developed a CO2 and H2O co-electrolysis process to produce oxygen and syngas (carbon monoxide, CO, and hydrogen, H2, mixture) for terrestrial (energy production) application. The technology is a combined process that involves steam electrolysis, CO2 electrolysis, and the reverse water gas shift (RWGS) reaction. A number of process models have been developed and analyzed to determine the theoretical power required to recover oxygen, O2, in each case. These models include the current Sabatier and Bosch technologies and combinations of those processes with high-temperature co-electrolysis. The cases of constant CO2 supply and constant O2 production were evaluated. In addition, a process model of the hydrogenation process with co-electrolysis was developed and compared. Sabatier processes require the least amount of energy input per kg of oxygen produced. If co-electrolysis replaces solid polymer electrolyte (SPE) electrolysis within the Sabatier architecture, the power requirement is reduced by over 10%, but only if heat recuperation is used. Sabatier processes, however, require external water to achieve the lower power results. Under conditions of constant incoming carbon dioxide flow, the Sabatier architectures require more power than the other architectures. The Bosch, Boudouard with co-electrolysis, and the hydrogenation with co-electrolysis processes require little or no external water. The Bosch and hydrogenation processes produce water within their reactors, which aids in reducing the power requirement for electrolysis. The Boudouard with co-electrolysis process has a higher electrolysis power requirement because carbon dioxide is split instead of water, which has a lower heat of formation. Hydrogenation with co-electrolysis offers the best overall power performance for two reasons: it requires no external water, and it produces its own water, which reduces the power requirement for co-electrolysis.
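
    The claim that splitting CO2 costs more than splitting water can be checked with a back-of-envelope calculation using textbook standard enthalpies of formation (general reference values, not figures from the report):

    ```python
    # Standard enthalpies of formation at 298 K, in kJ/mol (textbook values).
    # O2 and H2 are elements in their standard states, so their values are zero.
    dHf = {"CO2": -393.5, "CO": -110.5, "H2O_gas": -241.8}

    # Minimum enthalpy input per mole for each splitting reaction
    # (products minus reactants):
    split_co2 = dHf["CO"] - dHf["CO2"]  # CO2 -> CO + 1/2 O2, ~283 kJ/mol
    split_h2o = 0.0 - dHf["H2O_gas"]    # H2O(g) -> H2 + 1/2 O2, ~242 kJ/mol
    extra_fraction = split_co2 / split_h2o - 1.0  # ~0.17
    ```

    By this estimate, CO2 splitting requires roughly 17% more enthalpy input per mole than steam splitting, consistent with the higher electrolysis power attributed to the Boudouard route.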

  5. (abstract) A High Throughput 3-D Inner Product Processor

    NASA Technical Reports Server (NTRS)

    Daud, Tuan

    1996-01-01

A particularly challenging image processing application is real-time scene acquisition and object discrimination. It requires spatio-temporal recognition of point and resolved objects at high speeds with parallel processing algorithms. Neural network paradigms provide fine-grain parallelism and, when implemented in hardware, offer orders of magnitude speed-up. However, neural networks implemented on a VLSI chip are planar architectures capable of efficient processing of linear vector signals rather than 2-D images. Therefore, for processing of images, a 3-D stack of neural-net ICs receiving planar inputs and consuming minimal power is required. Details of the circuits and chip architectures will be described, along with the need to develop ultralow-power electronics. Further, use of the architecture in a system for high-speed processing will be illustrated.

  6. High-Flux Solar Furnace Facility | Concentrating Solar Power | NREL

    Science.gov Websites

    NREL's High-Flux Solar Furnace (HFSF) is a 10-kW optical furnace for testing high-temperature processes and applications requiring high solar flux. The facility supports a range of technologies with a diverse set of experimental requirements. The high heating rates create the...

  7. Evaluating the energy performance of a hybrid membrane-solvent process for flue gas carbon dioxide capture

    DOE PAGES

    Kusuma, Victor A.; Li, Zhiwei; Hopkinson, David; ...

    2016-10-13

    A particularly energy-intensive step in the conventional amine absorption process for removing carbon dioxide is solvent regeneration using a steam stripping column. An attractive alternative to reduce the energy requirement is gas pressurized stripping, in which a high pressure noncondensable gas is used to strip CO 2 off the rich solvent stream. The gas pressurized stripping column product, having CO 2 at high concentration and high partial pressure, can then be regenerated readily using membrane separation. In this study, we performed an energetic analysis in the form of total equivalent work and found that, for capturing CO 2 from flue gas, this hybrid stripping process consumes 49% less energy compared to the base case conventional MEA absorption/steam stripping process. We also found the amount of membrane required in this process is much less than required for direct CO 2 capture from the flue gas: approximately 100-fold less than a previously published two-stage cross-flow scheme, mostly due to the more favorable pressure ratio and CO 2 concentration. There is a trade-off between energy consumption and required membrane area that is most strongly affected by the gas pressurized stripper operating pressure. While the initial analysis looks promising in terms of both energy requirement and membrane unit capital cost, the viability of this hybrid process depends on the availability of advanced, next generation gas separation membranes to perform the stripping gas regeneration.
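    The "total equivalent work" metric used above converts the stripper's heat duty into equivalent electrical work so it can be summed with compression and pumping work. A minimal sketch of the standard Carnot-factor conversion follows; the temperatures, turbine efficiency, and duties are illustrative assumptions, not values from this study.

```python
def equivalent_work(q_reboiler_mj, t_steam_k, t_sink_k=313.0, eta_turbine=0.75):
    """Convert a reboiler heat duty (MJ per tonne CO2) to equivalent electrical
    work, via the Carnot factor of the steam that would otherwise drive a turbine."""
    carnot = (t_steam_k - t_sink_k) / t_steam_k
    return eta_turbine * carnot * q_reboiler_mj

# Illustrative numbers only (not from the paper):
q_strip = 3500.0  # reboiler heat duty, MJ per tonne CO2 captured
w_comp = 300.0    # compression + pumping work, MJ per tonne CO2
w_total = equivalent_work(q_strip, t_steam_k=393.0) + w_comp
print(f"total equivalent work: {w_total:.0f} MJ/t CO2")
```

    Because the Carnot factor grows with steam temperature, raising the stripper operating pressure (and hence the required steam temperature) increases equivalent work even when the raw heat duty is unchanged, which is one face of the energy/membrane-area trade-off noted above.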

  9. Process Performance of Optima XEx Single Wafer High Energy Implanter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J. H.; Yoon, Jongyoon; Kondratenko, S.

    2011-01-07

    To meet the process requirements for well formation in future CMOS memory production, high energy implanters require more robust angle, dose, and energy control while maintaining high productivity. The Optima XEx high energy implanter meets these requirements by integrating a traditional LINAC beamline with a robust single wafer handling system. To achieve beam angle control, Optima XEx can control both the horizontal and vertical beam angles to within 0.1 degrees using advanced beam angle measurement and correction. Accurate energy calibration and energy trim functions accelerate process matching by eliminating energy calibration errors. The large volume process chamber and UDC (upstream dose control) using Faraday cups outside of the process chamber precisely control implant dose regardless of any chamber pressure increase due to PR (photoresist) outgassing. An optimized RF LINAC accelerator improves reliability and enables singly charged phosphorus and boron energies up to 1200 keV and 1500 keV respectively with higher beam currents. A new single wafer endstation combined with increased beam performance leads to overall increased productivity. We report on the advanced performance of Optima XEx observed during tool installation and volume production at an advanced memory fab.

  10. High pressure processing and its application to the challenge of virus-contaminated foods

    USDA-ARS?s Scientific Manuscript database

    High pressure processing (HPP) is an increasingly popular non-thermal food processing technology. Study of HPP’s potential to inactivate foodborne viruses has defined general pressure levels required to inactivate hepatitis A virus, norovirus surrogates, and human norovirus itself within foods such...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erga, O.; Finborud, A.

    Cost-effective FGD processes with high SO{sub 2} removal efficiencies are required for fossil-fired power plants. With high-sulfur fuel, conventional limestone processes are less ideal, and regenerative processes with SO{sub 2} recovery may offer important advantages. The Elsorb process, which is being developed by the Norwegian company Elkem Technology a.s., is a regenerable SO{sub 2} recovery process which operates on the principle of chemical absorption followed by regeneration by evaporation. The process is based on the use of a chemically stable sodium phosphate buffer in high concentration. It combines high cleaning efficiency with high cyclic absorption capacity, moderate energy requirements, and very low oxidation losses. The process produces SO{sub 2} (g) which can be converted into liquid SO{sub 2}, sulfuric acid, or elemental sulfur. The Elsorb process has been pilot tested on flue gas from a coal-fired boiler with very promising results concerning cleaning efficiency and oxidation losses of SO{sub 2}. The first commercial Elsorb plant has been installed for treating incinerated Claus tail gas. Preliminary data regarding cleaning efficiency are in accordance with the pilot tests. However, unexpectedly high consumption of make-up chemicals was encountered. The existing incinerator is now to be modified. Complete data for the Elsorb plant should be available later this year.

  12. The Role of Independent V&V in Upstream Software Development Processes

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve

    1996-01-01

    This paper describes the role of Verification and Validation (V&V) during the requirements and high level design processes, and in particular the role of Independent V&V (IV&V). The job of IV&V during these phases is to ensure that the requirements are complete, consistent, and valid, and that the high level design meets the requirements. This contrasts with the role of Quality Assurance (QA), which ensures that appropriate standards and process models are defined and applied. This paper describes the current state of practice for IV&V, concentrating on the process model used in NASA projects. We describe a case study, showing the processes by which problem reporting and tracking take place, and how IV&V feeds into decision making by the development team. We then describe the problems faced in implementing IV&V. We conclude that despite a well-defined process model, and tools to support it, IV&V is still beset by communication and coordination problems.

  13. Data Processing Technology, A Suggested 2-Year Post High School Curriculum.

    ERIC Educational Resources Information Center

    Central Texas Coll., Killeen.

    This guide identifies technicians, states specific job requirements, and describes special problems in defining, initiating, and operating post-high school programs in data processing technology. The following are discussed: (1) the program (employment opportunities, the technician, work performed by data processing personnel, the faculty, student…

  14. Thermal analysis of heat and power plant with high temperature reactor and intermediate steam cycle

    NASA Astrophysics Data System (ADS)

    Fic, Adam; Składzień, Jan; Gabriel, Michał

    2015-03-01

    Thermal analysis of a heat and power plant with a high temperature gas cooled nuclear reactor is presented. The main aim of the considered system is to supply a technological process with heat at a suitably high temperature level; the unit is also used to produce electricity. The high temperature helium cooled nuclear reactor is the primary heat source in the system, which consists of the reactor cooling cycle, the steam cycle, and the gas heat pump cycle. Helium, used as the carrier in the first cycle (a classic Brayton cycle that includes the reactor), delivers heat in a steam generator to produce superheated steam with the required parameters of the intermediate cycle. The intermediate cycle transports energy from the reactor installation to the process installation requiring high temperature heat. The distance between the reactor and the process installation is assumed to be short and negligible, or alternatively equal to 1 km, in the analysis. The system is also equipped with a high temperature argon heat pump to obtain the heat-carrier temperature level required by a high temperature process. Thus, the steam of the intermediate cycle supplies a lower heat exchanger of the heat pump, a process heat exchanger at the medium temperature level, and a classical steam turbine system (Rankine cycle). The main purpose of the research was to evaluate the effectiveness of the system considered and to assess whether such a three-cycle cogeneration system is reasonable. Multivariant calculations have been carried out employing the developed mathematical model. The results have been presented in the form of the energy efficiency and exergy efficiency of the system as a function of the temperature drop in the high temperature process heat exchanger and the reactor pressure.
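    The two figures of merit reported, energy efficiency and exergy efficiency, can be sketched in their generic cogeneration form. This is an illustrative formulation with made-up operating numbers, not the paper's actual model or data.

```python
def cogen_efficiencies(w_el, q_heat, t_heat_k, q_reactor, t0_k=293.0):
    """Energy efficiency counts heat and electricity alike; exergy efficiency
    weights delivered heat by its Carnot factor (1 - T0/T)."""
    eta_energy = (w_el + q_heat) / q_reactor
    eta_exergy = (w_el + q_heat * (1.0 - t0_k / t_heat_k)) / q_reactor
    return eta_energy, eta_exergy

# Illustrative: 600 MW(t) reactor, 150 MW electricity, 250 MW process heat at 1100 K
eta_en, eta_ex = cogen_efficiencies(w_el=150.0, q_heat=250.0,
                                    t_heat_k=1100.0, q_reactor=600.0)
print(f"energy efficiency: {eta_en:.2f}, exergy efficiency: {eta_ex:.2f}")
```

    Because delivered heat is discounted by its Carnot factor, the exergy efficiency is always below the energy efficiency for the same operating point, and it rewards delivering the process heat at the highest temperature the heat exchanger allows.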

  15. Conversion of direct process high-boiling residue to monosilanes

    DOEpatents

    Brinson, Jonathan Ashley; Crum, Bruce Robert; Jarvis, Jr., Robert Frank

    2000-01-01

    A process for the production of monosilanes from the high-boiling residue resulting from the reaction of hydrogen chloride with silicon metalloid, in a process typically referred to as the "direct process." The process comprises contacting a high-boiling residue resulting from the reaction of hydrogen chloride and silicon metalloid with hydrogen gas in the presence of a catalytic amount of aluminum trichloride effective in promoting conversion of the high-boiling residue to monosilanes. At least a portion of the aluminum trichloride catalyst required for conduct of the process may be formed in situ during conduct of the direct process and isolation of the high-boiling residue.

  16. High Fidelity Tape Transfer Printing Based On Chemically Induced Adhesive Strength Modulation

    NASA Astrophysics Data System (ADS)

    Sim, Kyoseung; Chen, Song; Li, Yuhang; Kammoun, Mejdi; Peng, Yun; Xu, Minwei; Gao, Yang; Song, Jizhou; Zhang, Yingchun; Ardebili, Haleh; Yu, Cunjiang

    2015-11-01

    Transfer printing, a two-step process (i.e., picking up and printing) for heterogeneous integration, has been widely exploited for the fabrication of functional electronic systems. To ensure a reliable process, strong adhesion is required for picking up and weak or no adhesion for printing. However, it is challenging to meet these requirements for switchable stamp adhesion. Here we introduce a simple, high fidelity process, namely tape transfer printing (TTP), enabled by chemically induced dramatic modulation in tape adhesive strength. We describe the working mechanism of the adhesion modulation that governs this process and demonstrate the method by high fidelity tape transfer printing of several types of materials and devices, including Si pellet arrays, photodetector arrays, and electromyography (EMG) sensors, from their preparation substrates to various alien substrates. High fidelity tape transfer printing of components onto curvilinear surfaces is also illustrated.

  17. Alloy-steel nuts for bolting for high-pressure and high-temperature service (ASME SA-194 with additional requirements)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This standard covers alloy steel nuts for bolting for high-pressure and high-temperature service in nuclear and associated applications. This standard does not cover bar or other starting materials. The only implied special considerations for starting materials are that they be capable of passing the required tests when processed into finished products in accordance with this standard. Material shall conform to the requirements of ASME SA-194; to the requirements of the ASME Boiler and Pressure Vessel Code (ASME Code), Section III, Article NB-2000; to the requirements of NE E 8-18; and to the additional requirements of this standard.

  18. Optical design and performance of F-Theta lenses for high-power and high-precision applications

    NASA Astrophysics Data System (ADS)

    Yurevich, V. I.; Grimm, V. A.; Afonyushkin, A. A.; Yudin, K. V.; Gorny, S. G.

    2015-09-01

    F-Theta lenses are widely used in remote laser processing, and a large variety of scanning systems utilizing them are commercially available. In this paper, we demonstrate that seemingly trivial practical issues become nontrivial in the design of high-performance F-Theta scanning systems. Laser power scaling requires attention to thermally induced phenomena and ghost reflections, which considerably complicates optimization of the optical configuration of the system and primary aberration correction, even during preliminary design. Obtaining high positioning accuracy requires taking into consideration all probable causes of processing-field distortion. We briefly describe the key engineering relationships and invariants as well as the typical design of a scanner lens and the main field-flattening techniques. Specific emphasis is placed on the fundamental nonlinearity of two-mirror scanners; to the best of our knowledge, this issue has not yet been studied. We also demonstrate the benefits of our F-Theta lens optimization technique, which uses a plurality of entrance pupils. The problems of eliminating focused ghost reflections and the effects of thermally induced processes in high-power F-Theta lenses are considered. A set of multi-path 3D processing and laser cutting experiments was conducted and is presented herein to demonstrate the impact of laser beam degradation on process performance. A selection of our non-standard optical designs is presented.

  19. Low-temperature direct bonding of glass nanofluidic chips using a two-step plasma surface activation process.

    PubMed

    Xu, Yan; Wang, Chenxi; Dong, Yiyang; Li, Lixiao; Jang, Kihoon; Mawatari, Kazuma; Suga, Tadatomo; Kitamori, Takehiko

    2012-01-01

    Owing to the well-established nanochannel fabrication technology in 2D nanoscales with high resolution, reproducibility, and flexibility, glass is the leading, ideal, and essentially irreplaceable material for the fabrication of nanofluidic chips. However, a high temperature (~1,000 °C) and vacuum conditions are usually required in the conventional fusion bonding process, which unfortunately impedes nanofluidic applications and even the development of the whole field of nanofluidics. We present a direct bonding of fused silica glass nanofluidic chips at low temperature, around 200 °C in ambient air, through a two-step plasma surface activation process which consists of an O(2) reactive ion etching plasma treatment followed by a nitrogen microwave radical activation. The low-temperature bonded glass nanofluidic chips not only had high bonding strength but also could work continuously without leakage during liquid introduction driven by air pressure even at 450 kPa, a very high pressure which can meet the requirements of most nanofluidic operations. Owing to the mild conditions required in the bonding process, the method has the potential to allow the integration of a range of functional elements into nanofluidic chips during manufacture, which is nearly impossible in the conventional high-temperature fusion bonding process. Therefore, we believe that the developed low-temperature bonding will be very useful and will contribute to the field of nanofluidics.

  20. New Directions in Space Operations Services in Support of Interplanetary Exploration

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.

    2005-01-01

    To gain access to the necessary operational processes and data in support of NASA's Lunar/Mars Exploration Initiative, new services, adequate levels of computing cycles, and access to myriad forms of data must be provided to onboard spacecraft and ground-based personnel/systems (earth, lunar, and Martian) to enable interplanetary exploration by humans. These systems, cycles, and access to vast amounts of development, test, and operational data will be required to provide a new level of services not currently available to existing spacecraft, onboard crews, and other operational personnel. Although current voice, video, and data systems in support of current space-based operations have been adequate, new highly reliable and autonomous processes and services will be necessary for future space exploration activities. These services will range from the more mundane voice in LEO to voice in interplanetary travel, which, because of the high latencies, will require new voice processes and standards. New services, like component failure predictions based on data mining of significant quantities of data located at disparate locations, will be required. 3D or holographic representation of onboard components, systems, or family members will greatly improve maintenance, operations, and service restoration, not to mention crew morale. Current operational systems and standards, like the Internet Protocol, will not be able to provide the level of service required end to end, from an end point on the Martian surface such as a scientific instrument to a researcher at a university. Ground operations, whether earth, lunar, or Martian, and in-flight operations to the moon and especially to Mars will require significant autonomy, which in turn requires access to highly reliable processing capabilities and data storage based on network storage technologies. Significant processing cycles will be needed onboard but could be borrowed from other locations, either ground-based or onboard other spacecraft.
Reliability will be a key factor, with onboard and distributed backup processing an absolutely necessary requirement. Current cluster processing/Grid technologies may provide the basis for these services. An overview of existing services, the future services that will be required, and the technologies and standards that need to be developed will be presented. The purpose of this paper is to initiate a technological roadmap, albeit at a high level, from current voice, video, data, and network technologies and standards (which show promise for adaptation or evolution) to those that need to be redefined or adjusted, and the areas where new ones require development. The roadmap should begin to differentiate between unmanned and manned processes/services where applicable. The paper is based in part on the activities of the CCSDS Monitor and Control working group, which is beginning the process of standardizing these processes. Another element of the paper is based on an analysis of current technologies supporting space flight processes and services at JSC, MSFC, GSFC, and, to a lesser extent, KSC. Work being accomplished in areas such as Grid computing, data mining, and network storage at ARC, IBM, and the University of Alabama at Huntsville will be researched and analyzed.

  1. Engineering Design Elements of a Two-Phase Thermosyphon to Transfer NGNP Nuclear Thermal Energy to a Hydrogen Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piyush Sabharwal

    2009-07-01

    Two hydrogen production processes, both powered by a Next Generation Nuclear Plant (NGNP), are currently under investigation at Idaho National Laboratory. The first is high-temperature steam electrolysis, which uses both heat and electricity; the second is thermochemical production through the sulfur-iodine process, which primarily uses heat. Both processes require a high temperature (>850°C) for enhanced efficiency, temperatures characteristic of the NGNP. Safety and licensing mandates prudently dictate that the NGNP and the hydrogen production facility be physically isolated, perhaps requiring a separation of over 100 m.

  2. Progress in the development of the reverse osmosis process for spacecraft wash water recovery.

    NASA Technical Reports Server (NTRS)

    Pecoraro, J. N.; Podall, H. E.; Spurlock, J. M.

    1972-01-01

    Research work on ambient- and pasteurization-temperature reverse osmosis processes for wash water recovery in a spacecraft environment is reviewed, and the advantages and drawbacks of each are noted. A key requirement in each case is to provide a membrane of appropriate stability and semipermeability. Reverse osmosis systems intended for such use must also take into account the specific limitations and requirements imposed by the small volume of water to be processed and the high water recovery desired. The incorporation of advanced high-temperature membranes into specially designed modules is discussed.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, Steven Adriel

    The following discussion contains a high-level description of methods used to implement software for data processing. It describes the directory structures and file handling required to use Excel's Visual Basic for Applications programming language, and how to identify shot, test, and capture types so as to process data appropriately. It also describes how to interface with the software.

  4. Use of highly alkaline conditions to improve cost-effectiveness of algal biotechnology.

    PubMed

    Canon-Rubio, Karen A; Sharp, Christine E; Bergerson, Joule; Strous, Marc; De la Hoz Siegler, Hector

    2016-02-01

    Phototrophic microorganisms have been proposed as an alternative to capture carbon dioxide (CO2) and to produce biofuels and other valuable products. Low CO2 absorption rates, low volumetric productivities, and inefficient downstream processing, however, currently make algal biotechnology highly energy intensive, expensive, and economically uncompetitive for biofuel production. This mini-review summarizes advances made regarding the cultivation of phototrophic microorganisms at highly alkaline conditions, as well as other innovations oriented toward reducing the energy input into the cultivation and processing stages. An evaluation, in terms of energy requirements and energy return on energy invested, is performed for an integrated high-pH, high-alkalinity growth process that uses biofilms. Performance in terms of productivity and expected energy return on energy invested is presented for this process and compared to previously reported life cycle assessments (LCAs) for systems at near-neutral pH. The cultivation of alkaliphilic phototrophic microorganisms in biofilms is shown to have significant potential to reduce both energy requirements and capital costs.

  5. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    NASA Astrophysics Data System (ADS)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both a geospatial raster database management system and a raster data processing platform, from a domain-specific perspective as well as from a computing point of view. It also discusses the need for tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global-scale, high-performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. As a database management system, GeoRaster defines an integrated raster data model and supports image compression, data manipulation, general and spatial indices, content- and context-based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  6. Large Area Active Brazing of Multi-tile Ceramic-Metal Structures

    DTIC Science & Technology

    2012-05-01

    Active brazes wet metals (such as titanium alloys and stainless steels) and form strong, metallurgical bonds. However, the major disadvantage of using active brazing for metals and ceramics is the high processing temperature required, which results in large strain (stress) build-up from the inherent...

  7. Material requirements for the High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Stephens, Joseph R.; Hecht, Ralph J.; Johnson, Andrew M.

    1993-01-01

    Under NASA-sponsored High Speed Research (HSR) programs, the materials and processing requirements have been identified for overcoming the environmental and economic barriers of the next generation High Speed Civil Transport (HSCT) propulsion system. The long (2 to 5 hours) supersonic cruise portion of the HSCT cycle will place additional durability requirements on all hot section engine components. Low emissions combustor designs will require high temperature ceramic matrix composite liners to meet an emission goal of less than 5 g NO(x) per kg fuel burned. Large axisymmetric and two-dimensional exhaust nozzle designs are now under development to meet or exceed FAR 36 Stage III noise requirements, and will require lightweight, high temperature metallic, intermetallic, and ceramic matrix composites to reduce nozzle weight and meet structural and acoustic component performance goals. This paper describes and discusses the turbomachinery, combustor, and exhaust nozzle requirements of the High Speed Civil Transport propulsion system.

  8. Research on the tool holder mode in high speed machining

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High speed machining technology can improve processing efficiency and precision while also reducing processing cost, and is therefore widely valued in industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of chucks are used in high-speed precision machining, including the heat-shrink tool holder, the high-precision spring chuck, the hydraulic tool holder, and the three-rib deformation chuck. Among them, the heat-shrink tool holder has the advantages of high precision, high clamping force, high bending rigidity, and good dynamic balance, and is widely used. It is therefore of great significance to study the new requirements on the machining tool system. To meet the requirements of high-speed precision machining technology, this paper describes the common tool holder technologies of high-precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics and existing problems of these tool clamping systems are also analyzed.

  9. From sequencer to supercomputer: an automatic pipeline for managing and processing next generation sequencing data.

    PubMed

    Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun

    2012-01-01

    Next Generation Sequencing is highly resource intensive. NGS tasks related to data processing, management, and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis, built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.
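    The automation pattern described above, detecting a completed sequencer run and dispatching the processing pipeline without manual interaction, can be sketched generically. This is an illustrative sketch, not the QUEST/Automation Server implementation; the directory layout, the run-complete marker file, and the placeholder submit command are all assumptions.

```python
import shutil
import subprocess
from pathlib import Path

def find_completed_runs(incoming: Path, marker: str = "RunCompleted.txt"):
    """A run directory is considered ready once the sequencer writes its
    completion marker file into it."""
    return [d for d in incoming.iterdir() if d.is_dir() and (d / marker).exists()]

def dispatch(run_dir: Path, processed: Path):
    """Submit the processing pipeline for one run, then archive the inputs.
    The echo call stands in for a real cluster job-submission command."""
    subprocess.run(["echo", "submit-pipeline", run_dir.name], check=True)
    shutil.move(str(run_dir), str(processed / run_dir.name))

def poll_once(incoming: Path, processed: Path):
    """One polling pass: dispatch every run that has finished sequencing."""
    for run in find_completed_runs(incoming):
        dispatch(run, processed)
```

    A real automation server would run `poll_once` on a schedule, record each dispatch in the LIMS, and retry failed submissions, but the detect-then-dispatch loop is the core that removes the manual interaction the abstract mentions.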

  10. Space Debris Detection on the HPDP, a Coarse-Grained Reconfigurable Array Architecture for Space

    NASA Astrophysics Data System (ADS)

    Suarez, Diego Andres; Bretz, Daniel; Helfers, Tim; Weidendorfer, Josef; Utzmann, Jens

    2016-08-01

    Stream processing, widely used in communications and digital signal processing applications, requires high-throughput data processing that is achieved in most cases using Application-Specific Integrated Circuit (ASIC) designs. Lack of programmability is an issue, especially in space applications, which use on-board components with long life cycles requiring application updates. To this end, the High Performance Data Processor (HPDP) architecture integrates an array of coarse-grained reconfigurable elements to provide both flexible and efficient computational power suitable for stream-based data processing applications in space. In this work, the capabilities of the HPDP architecture are demonstrated with the implementation of a real-time image processing algorithm for space debris detection in a space-based space surveillance system. The implementation challenges and alternatives are described, making trade-offs that improve performance at the expense of negligible degradation of detection accuracy. The proposed implementation uses over 99% of the available computational resources. Performance estimations based on simulations show that the HPDP can amply match the application requirements.

  11. High pressure processing's potential to inactivate norovirus and other foodborne viruses

    USDA-ARS?s Scientific Manuscript database

    High pressure processing (HPP) can inactivate human norovirus. However, not all viruses are equally susceptible to HPP. Pressure treatment parameters such as required pressure levels, initial pressurization temperatures, and pressurization times substantially affect inactivation. How food matrix ...

  12. Turboexpander plant designs can provide high ethane recovery without inlet CO2 removal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkinson, J.D.; Hudson, H.M.

    1982-05-03

    New turboexpander plant designs can process natural gas streams containing moderate amounts of carbon dioxide (CO2) for high ethane recovery without inlet gas treating. The designs will handle a wide range of inlet ethane-plus fractions. They also offer reduced horsepower requirements compared to other processes. CO2 is a typical component of most natural gas streams. In many cases, processing of these gas streams in a turboexpander plant for high ethane recovery requires pre-treatment of the gas for CO2 removal. This is required to avoid the formation of solid CO2 (freezing) in the cold sections of the process and/or to meet necessary residue gas and liquid product CO2 specifications. Depending on the quantities involved, the CO2 removal system is generally a significant portion of both the installed cost and operating cost for the ethane recovery facility. Therefore, turboexpander plant designs that are capable of handling increased quantities of CO2 in the feed gas without freezing can offer the gas processor substantial economic benefits.

  13. EOS image data processing system definition study

    NASA Technical Reports Server (NTRS)

    Gilbert, J.; Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.

    1973-01-01

    The Image Processing System (IPS) requirements and configuration are defined for the NASA-sponsored advanced technology Earth Observatory System (EOS). The scope included investigation and definition of IPS operational, functional, and product requirements, considering overall system constraints and interfaces (sensor, etc.). The scope also included investigation of the technical feasibility and definition of a point design reflecting system requirements. The design phase required a survey of present and projected technology related to general and special-purpose processors, high-density digital tape recorders, and image recorders.

  14. Challenges of designing and testing a highly stable sensor platform: Cesic solves MTG star sensor bracket thermoelastic requirements

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias; Zauner, Christoph

    2017-09-01

    The Meteosat Third Generation's extreme pointing requirements call for a highly stable bracket for mounting the Star Trackers. HB-Cesic®, a chopped-fibre-reinforced silicon carbide, was selected as the base material for the sensor bracket. The high thermal conductivity and low thermal expansion of HB-Cesic® were the key properties to fulfil the demanding thermo-elastic pointing requirement of below 1 μrad/K for the Star Tracker mounting interfaces. Dominated by thermoelastic stability requirements, the design and analysis of the bracket required a multidisciplinary approach with a focus on thermal and thermo-elastic analyses. Dedicated modal and thermal post-processing strategies were applied in the scope of the lightweighting process. The experimental verification of this thermo-elastically stable system was a challenging task of its own. A thermo-elastic distortion measurement rig was developed with a stability of <0.1 μrad/K in all three rotational degrees of freedom.

  15. Performance Steel Castings

    DTIC Science & Technology

    2012-09-30

    Development of Sand Properties...Advanced Modeling Dataset...High Strength Low Alloy (HSLA) Steels...Steel Casting and Engineering Support...to achieve the performance goals required for new systems. The dramatic reduction in weight and increase in capability will require high performance...for improved weapon system reliability. SFSA developed innovative casting design and manufacturing processes for high performance parts. SFSA is

  16. Static and dynamic high power, space nuclear electric generating systems

    NASA Technical Reports Server (NTRS)

    Wetch, J. R.; Begg, L. L.; Koester, J. K.

    1985-01-01

    Space nuclear electric generating system concepts have been assessed for their potential to satisfy future spacecraft high power (several megawatt) requirements. Conceptual designs have been prepared for reactor power systems using the most promising static (thermionic) and dynamic conversion processes. Component and system layouts have been prepared, along with system mass and envelope requirements. Key development problems have been identified, and the impact of the conversion process selection upon thermal management and upon system and vehicle configuration is addressed.

  17. A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.

    2017-03-01

    Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.
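    The abstract's central claim, that a sequential model-based (Bayesian) design extracts more information per run than a full factorial sweep, can be illustrated with a toy calculation. This is a hypothetical sketch, not RODEo's actual model: it assumes an etch rate linear in a single parameter (RF power) with a Gaussian prior on the unknown slope, updated after each simulated experiment. All numbers are invented.

```python
import numpy as np

# Hypothetical illustration (not RODEo's actual model): etch rate assumed
# linear in RF power, r = k * p + noise, with unknown slope k.
rng = np.random.default_rng(0)
true_k = 2.5     # nm/min per watt (made-up "ground truth")
noise_sd = 5.0   # measurement noise, nm/min

# Physics-informed Gaussian prior on the slope k: mean 2.0, sd 1.0
mu, var = 2.0, 1.0 ** 2

# Four sequential experiments instead of a full factorial sweep
for p in [50.0, 100.0, 150.0, 200.0]:           # RF power settings, watts
    r = true_k * p + rng.normal(0.0, noise_sd)  # simulated measurement
    # Conjugate Bayesian update for a one-coefficient linear model
    var_new = 1.0 / (1.0 / var + p ** 2 / noise_sd ** 2)
    mu = var_new * (mu / var + p * r / noise_sd ** 2)
    var = var_new

print(f"posterior slope k = {mu:.3f} +/- {var ** 0.5:.3f}")
```

    After only four runs the posterior standard deviation on the slope shrinks from 1.0 to well under 0.1, which is the qualitative effect the abstract reports: model-based sequential design needs far fewer experiments than a factorial grid to map the process space.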

  18. Processes for metal extraction

    NASA Technical Reports Server (NTRS)

    Bowersox, David F.

    1992-01-01

    This report describes the processing of plutonium at Los Alamos National Laboratory (LANL), an operation illustrating concepts that may be applicable to the processing of lunar materials. The toxic nature of plutonium requires a highly closed system, as would the processing of lunar surface materials.

  19. Research on aspheric focusing lens processing and testing technology in the high-energy laser test system

    NASA Astrophysics Data System (ADS)

    Liu, Dan; Fu, Xiu-hua; Jia, Zong-he; Wang, Zhe; Dong, Huan

    2014-08-01

    In high-energy laser test systems, higher requirements are placed on the surface profile and finish of the optical elements. Taking a focusing aspherical Zerodur lens with a diameter of 100 mm as an example, the surface profile and surface quality of the lens were investigated using a combination of CNC and classical machining methods. With profilometer and high-power microscope measurement results as a guide, and through testing and simulation analysis, process parameters were continually improved during manufacturing. Mid- and high-frequency errors were trimmed and improved so that the surface form gradually converged to the required accuracy. The experimental results show that the final accuracy of the surface is less than 0.5 μm and the surface finish is □, which fulfils the accuracy requirement of the aspherical focusing lens in the optical system.

  20. GPU-Based High-performance Imaging for Mingantu Spectral RadioHeliograph

    NASA Astrophysics Data System (ADS)

    Mei, Ying; Wang, Feng; Wang, Wei; Chen, Linjie; Liu, Yingbo; Deng, Hui; Dai, Wei; Liu, Cuiyin; Yan, Yihua

    2018-01-01

    As a dedicated solar radio interferometer, the MingantU SpEctral RadioHeliograph (MUSER) generates massive observational data in the frequency range of 400 MHz-15 GHz. High-performance imaging is a critical part of MUSER's massive data processing requirements. In this study, we implement a practical high-performance imaging pipeline for MUSER data processing. First, the specifications of the MUSER are introduced and its imaging requirements are analyzed. Referring to the most commonly used radio astronomy software, such as CASA and MIRIAD, we then implement a high-performance imaging pipeline based on Graphics Processing Unit technology with respect to the current operational status of the MUSER. A series of critical algorithms and their pseudo codes, i.e., detection of the solar disk and sky brightness, automatic centering of the solar disk, and estimation of the number of iterations for clean algorithms, are presented in detail. The preliminary experimental results indicate that the proposed imaging approach significantly increases the processing performance of the MUSER and generates high-quality images that meet the requirements of MUSER data processing. Supported by the National Key Research and Development Program of China (2016YFE0100300), the Joint Research Fund in Astronomy (No. U1531132, U1631129, U1231205) under cooperative agreement between the National Natural Science Foundation of China (NSFC) and the Chinese Academy of Sciences (CAS), and the National Natural Science Foundation of China (Nos. 11403009 and 11463003).
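    The deconvolution step the abstract alludes to ("clean algorithms") can be sketched with a minimal Högbom CLEAN loop. This is a generic textbook sketch, not MUSER's GPU pipeline: the image, PSF, and parameters are made up, and the wrap-around `np.roll` shift is a toy simplification a real imager would not use.

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, threshold=0.05, max_iter=500):
    """Minimal Hogbom CLEAN: repeatedly subtract a scaled, shifted PSF
    at the residual peak. Assumes psf has the same shape as dirty and
    is centered; np.roll wraps at the edges (a toy simplification)."""
    residual = dirty.astype(float).copy()
    model = np.zeros_like(residual)
    cy, cx = psf.shape[0] // 2, psf.shape[1] // 2
    for _ in range(max_iter):
        y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[y, x]
        if abs(peak) < threshold:
            break
        model[y, x] += gain * peak          # accumulate clean components
        shifted = np.roll(np.roll(psf, y - cy, axis=0), x - cx, axis=1)
        residual -= gain * peak * shifted   # remove the beam's sidelobes
    return model, residual

# Toy demo: one point source of flux 3.0 observed through a Gaussian beam
n = 32
yy, xx = np.mgrid[0:n, 0:n]
psf = np.exp(-(((yy - n // 2) ** 2 + (xx - n // 2) ** 2) / 8.0))
dirty = 3.0 * np.roll(np.roll(psf, 10 - n // 2, axis=0), 20 - n // 2, axis=1)
model, residual = hogbom_clean(dirty, psf)
print(np.abs(residual).max())  # far below the dirty-map peak of 3.0
```

    The loop recovers nearly all of the source flux in the model at pixel (10, 20) while driving the residual toward the threshold; a production imager adds gridding, FFTs, and (as in the abstract) a principled estimate of how many iterations are needed.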

  1. Computer Output Microfilm and Library Catalogs.

    ERIC Educational Resources Information Center

    Meyer, Richard W.

    Early computers dealt with mathematical and scientific problems requiring very little input and not much output; therefore, high speed printing devices were not required. Today, with an increased variety of uses, high speed printing is necessary, and Computer Output Microfilm (COM) devices have been created to meet this need. This indirect process can…

  2. The Effect of Gravity on the Combustion Synthesis of Porous Biomaterials

    NASA Technical Reports Server (NTRS)

    Castillo, M.; Zhang, X.; Moore, J. J.; Schowengerdt, F. D.; Ayers, R. A.

    2003-01-01

    Production of highly porous composite materials by traditional materials processing is limited by difficult processing techniques. This work investigates the use of self-propagating high-temperature (combustion) synthesis (SHS) to create porous tricalcium phosphate (Ca3(PO4)2), TiB-Ti, and NiTi in low and microgravity. Combustion synthesis provides the ability to use set processing parameters to engineer the required porous structure suitable for bone repair or replacement. The processing parameters include green density, particle size, gasifying agents, composition, and gravity. The advantage of the TiB-Ti system is the high level of porosity achieved together with a modulus that can be controlled by both composition (TiB-Ti) and porosity. At the same time, NiTi exhibits shape memory properties. SHS of biomaterials allows the engineering of required porosity coupled with resorption properties and specific mechanical properties into the composite materials to allow for a better biomaterial.

  3. Application of advanced diffraction based optical metrology overlay capabilities for high-volume manufacturing

    NASA Astrophysics Data System (ADS)

    Chen, Kai-Hsiung; Huang, Guo-Tsai; Hsieh, Hung-Chih; Ni, Wei-Feng; Chuang, S. M.; Chuang, T. K.; Ke, Chih-Ming; Huang, Jacky; Rao, Shiuan-An; Cumurcu Gysen, Aysegul; d'Alfonso, Maxime; Yueh, Jenny; Izikson, Pavel; Soco, Aileen; Wu, Jon; Nooitgedagt, Tjitte; Ottens, Jeroen; Kim, Yong Ho; Ebert, Martin

    2017-03-01

    On-product overlay requirements are becoming more challenging with every next technology node due to the continued decrease of device dimensions and process tolerances. Therefore, current and future technology nodes require demanding metrology capabilities such as target designs that are robust to process variations and high overlay measurement density (e.g. for higher order process corrections) to enable advanced process control solutions. The impact of advanced control solutions based on YieldStar overlay data is presented in this paper. Multi-patterning techniques are applied for critical layers, leading to additional overlay measurement demands. The use of 1D process steps results in the need for overlay measurements relative to more than one layer. Dealing with the increased number of overlay measurements while keeping the high measurement density and metrology accuracy at the same time presents a challenge for high volume manufacturing (HVM). These challenges are addressed by the capability to measure multi-layer targets with the recently introduced YieldStar metrology tool, YS350. On-product overlay results of such multi-layer and standard targets are presented, including measurement stability performance.

  4. Simulation Assessment Validation Environment (SAVE). Software User’s Manual

    DTIC Science & Technology

    2000-09-01

    requirements and decisions are made. The integration is leveraging work from other DoD organizations so that high-end results are attainable much faster than...planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high...technologies. This tool is also used to perform "high level" factory process simulation prior to full CAD model development and help define feasible

  5. Alloy steel nuts for bolting for high-pressure and high-temperature service (ASME SA-194 with additional requirements)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This standard covers alloy steel nuts for bolting for high-pressure and high-temperature service in nuclear and associated applications. This standard does not cover bar or other starting materials. The only implied special considerations for starting materials are that they be capable of passing the required tests when processed into finished products in accordance with this standard.

  6. Harnessing QbD, Programming Languages, and Automation for Reproducible Biology.

    PubMed

    Sadowski, Michael I; Grant, Chris; Fell, Tim S

    2016-03-01

    Building robust manufacturing processes from biological components is a highly complex task that requires sophisticated tools to describe processes, inputs, and measurements and to administrate the management of knowledge, data, and materials. We argue that for bioengineering to fully access biological potential, it will require the application of statistically designed experiments to derive detailed empirical models of underlying systems. This requires the execution of large-scale structured experimentation, for which laboratory automation is necessary. It also requires the development of expressive, high-level languages that allow reusability of protocols, characterization of their reliability, and a change in focus from implementation details to functional properties. We review recent developments in these areas and identify what we believe is an exciting trend that promises to revolutionize biotechnology. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. A Study of Emotions in Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Colomo-Palacios, Ricardo; Hernández-López, Adrián; García-Crespo, Ángel; Soto-Acosta, Pedro

    Requirements engineering (RE) is a crucial activity in software development projects. This phase in the software development cycle is knowledge intensive, and thus, human capital intensive. From the human point of view, emotions play an important role in behavior and can even act as behavioral motivators. Thus, if we consider that RE represents a set of knowledge-intensive tasks, which include acceptance and negotiation activities, then the emotional factor represents a key element in these issues. However, the emotional factor in RE has not received the attention it deserves. This paper aims to integrate stakeholders' emotions into the requirements process, proposing to catalogue them like any other factor in the process, such as clarity or stability. Results show that high arousal and low pleasure levels are predictors of highly versioned requirements.

  8. Use of KRS-XE positive chemically amplified resist for optical mask manufacturing

    NASA Astrophysics Data System (ADS)

    Ashe, Brian; Deverich, Christina; Rabidoux, Paul A.; Peck, Barbara; Petrillo, Karen E.; Angelopoulos, Marie; Huang, Wu-Song; Moreau, Wayne M.; Medeiros, David R.

    2002-03-01

    The traditional mask making process uses chain-scission-type resists such as PBS, poly(butene-1-sulfone), and ZEP, poly(methyl α-chloroacrylate-co-α-methylstyrene), for making masks with dimensions greater than 180 nm. PBS resist requires a wet etch process to produce patterns in chrome. ZEP was employed for dry etch processing to meet the requirements of shrinking dimensions, optical proximity corrections, and phase shift masks. However, ZEP offers low contrast, marginal etch resistance, organic solvent development, and concerns regarding resist heating with its high dose requirements [1]. Chemically amplified resist (CAR) systems are a very good choice for dimensions less than 180 nm because of their high sensitivity and contrast, high resolution, dry etch resistance, aqueous development, and process latitude [2]. KRS-XE was developed as a high-contrast CA resist based on ketal protecting groups that eliminate the need for a post-exposure bake (PEB). This resist can be used for a variety of electron beam exposures, and improves the capability to fabricate masks for devices smaller than 180 nm. Many factors influence the performance of resists in mask making, such as post-apply bake, exposure dose, resist develop, and post-exposure bake. These items will be discussed, as well as the use of reactive ion etching (RIE) selectivity and pattern transfer.

  9. Fabrication High Resolution Metrology Target By Step And Repeat Method

    NASA Astrophysics Data System (ADS)

    Dusa, Mircea

    1983-10-01

    Based on the photolithography process generally used to generate high resolution masks for semiconductor ICs, we found a very useful industrial application of laser technology. First, we generated high resolution metrology targets which are used in industrial measurement laser interferometers as diffraction gratings. Second, we generated these targets using a step and repeat machine, with a He-Ne laser interferometer controlled stage, as a pattern generator, through suitable computer programming. In practice, a high resolution metrology target consists of two chromium plates, one of which is called the "rule" and the other the "vernier". In Fig. 1 we have the configuration of the rule and the vernier. The rule has a succession of 3 μm lines generated as a diffraction grating on a 4 x 4 inch chromium blank. The vernier has several exposed fields (areas) having 3-15 μm lines, fields placed at very precise positions on the chromium blank surface. The high degree of uniformity, tight CD tolerances, and low defect density required by the targets create specialised problems during processing. Details of the processing, together with experimental results, will be presented. Before we enter into process details, we have to point out that the dimensional requirements of the reticle target are quite similar to, or perhaps more strict than, those of LSI master masks. These requirements are presented in Fig. 2.

  10. Mesocell study area snow distributions for the Cold Land Processes Experiment (CLPX)

    Treesearch

    Glen E. Liston; Christopher A. Hiemstra; Kelly Elder; Donald W. Cline

    2008-01-01

    The Cold Land Processes Experiment (CLPX) had a goal of describing snow-related features over a wide range of spatial and temporal scales. This required linking disparate snow tools and datasets into one coherent, integrated package. Simulating realistic high-resolution snow distributions and features requires a snow-evolution modeling system (SnowModel) that can...

  11. General Recommendations on Fatigue Risk Management for the Canadian Forces

    DTIC Science & Technology

    2010-04-01

    missions performed in aviation require an individual(s) to process large amounts of information in a short period of time and to do this on a continuous...information processing required during sustained operations can deteriorate an individual's ability to perform a task. Given the high operational tempo...memory, which, in turn, is utilized to perform human thought processes (Baddeley, 2003). While various versions of this theory exist, they all share

  12. Pulp extrusion at ultra-high consistencies : selection of water soluble polymers for process optimization

    Treesearch

    C. Tim Scott

    2002-01-01

    Pulp extrusion at ultra-high consistencies (20% to 40% solids) is a new process developed at USDA Forest Service, Forest Products Laboratory (FPL) to convert recovered papers, wastepaper, and papermill residuals into solid sheets or profiles for compression molding. This process requires adding a water-soluble polymer (WSP) to alter the rheological properties of the...

  13. Microbial desulfurization of coal

    NASA Technical Reports Server (NTRS)

    Dastoor, M. N.; Kalvinskas, J. J.

    1978-01-01

    Experiments indicate that several sulfur-oxidizing bacteria strains have been very efficient in desulfurizing coal. The process occurs at room temperature and does not require large capital investments or high energy inputs. The process may expand the use of abundant reserves of high-sulfur bituminous coal, which is currently restricted due to environmental pollution. On a practical scale, the process may be integrated with modern coal-slurry transportation lines.

  14. Efficient High Performance Collective Communication for Distributed Memory Environments

    ERIC Educational Resources Information Center

    Ali, Qasim

    2009-01-01

    Collective communication allows efficient communication and synchronization among a collection of processes, unlike point-to-point communication that only involves a pair of communicating processes. Achieving high performance for both kernels and full-scale applications running on a distributed memory system requires an efficient implementation of…

  15. Mars Atmospheric Capture and Gas Separation

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony; Santiago-Maldonado, Edgardo; Gibson, Tracy; Devor, Robert; Captain, James

    2011-01-01

    The Mars atmospheric capture and gas separation project is selecting, developing, and demonstrating techniques to capture and purify Martian atmospheric gases for the production of hydrocarbons, oxygen, and water in ISRU systems. Trace gases will need to be separated from Martian atmospheric gases to provide pure CO2 to processing elements. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and should be captured as well. To achieve these goals, highly efficient gas separation processes will be required. These gas separation techniques are also required across various areas within the ISRU project to support various consumable production processes. The development of innovative gas separation techniques will evaluate the current state of the art for the gas separation required, with the objective to demonstrate and develop lightweight, low-power methods for gas separation. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2/CO) and hydrogen typical of a Sabatier-type process, (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process, (3) carbon oxides from oxygen from a trash/waste processing reaction, and (4) helium from hydrogen or oxygen from a propellant scavenging process. Potential technologies for the separations include freezers, selective membranes, selective solvents, polymeric sorbents, zeolites, and new technologies. This paper and presentation will summarize the results of an extensive literature review and laboratory evaluations of candidate technologies for the capture and separation of CO2 and other relevant gases.

  16. High Available COTS Based Computer for Space

    NASA Astrophysics Data System (ADS)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

    The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application's boundary conditions. Increasing quality requirements on data processing systems used in space flight applications are calling for new architectures to fulfill the availability and reliability demands as well as the increase in required data processing power. Alongside the increased quality requirements, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems was not always possible due to obsolescence of EEE parts, insufficient IO capabilities, or the fact that available data processing systems did not provide the required scalability and performance.
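    The trade the abstract describes, cheaper COTS parts versus a demanding availability factor, is commonly closed with redundancy. A hedged back-of-the-envelope sketch follows; the MTBF/MTTR figures are invented for illustration, not taken from the paper.

```python
# Hypothetical illustration: steady-state availability of a single COTS
# unit, and how duplicating it raises the system availability factor.
def availability(mtbf_h, mttr_h):
    """Steady-state availability of one repairable unit."""
    return mtbf_h / (mtbf_h + mttr_h)

def parallel(a, n):
    """Availability of n independent redundant units (any one suffices)."""
    return 1 - (1 - a) ** n

single = availability(mtbf_h=10_000, mttr_h=10)  # one modest COTS unit
dual = parallel(single, 2)                        # duplex configuration
print(single, dual)
```

    A single unit with a 10,000 h MTBF and 10 h MTTR sits near "three nines"; duplicating it pushes the figure close to "six nines", which is why duplex or triplex COTS architectures can meet availability targets that no single commercial part reaches on its own.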

  17. Annealing of gallium nitride under high-N2 pressure

    NASA Astrophysics Data System (ADS)

    Porowski, S.; Jun, J.; Krukowski, S.; Grzegory, I.; Leszczynski, M.; Suski, T.; Teisseyre, H.; Foxon, C. T.; Korakakis, D.

    1999-04-01

    GaN is the key material for blue and ultraviolet optoelectronics. It is a strongly bonded wurtzite-structure semiconductor with a direct energy gap of 3.5 eV. Due to the strong bonding, diffusion processes require high temperatures, above 1300 K. However, in this temperature range at ambient pressure, GaN becomes unstable and dissociates into Ga and N2. Therefore, high N2 pressure is required to study diffusion and other annealing-related processes. We studied annealing of bulk GaN single crystals grown under high pressure and also annealing of homo- and heteroepitaxial GaN layers grown by the MOCVD technique. Annealing at temperatures above 1300 K strongly influences the structural and optical properties of GaN crystals and layers. At these temperatures, diffusion of the Mg and Zn acceptors has been observed. In spite of very interesting experimental observations, the understanding of the microscopic mechanisms of these processes is limited.

  18. Application of high speed machining technology in aviation

    NASA Astrophysics Data System (ADS)

    Bałon, Paweł; Szostak, Janusz; Kiełbasa, Bartłomiej; Rejman, Edward; Smusz, Robert

    2018-05-01

    Aircraft structures are exposed to many loads during their working lifespan. Every particular action made during a flight is composed of a series of air movements which generate various aircraft loads. The most rigorous requirement which modern aircraft structures must fulfill is to maintain high durability and reliability. This requirement imposes many restrictions on the aircraft design process. The most important factor is the structure's overall mass, which has a crucial impact on both utility properties and cost-effectiveness; this makes aircraft one of the most complex products of modern technology. Additionally, there is an increasing utilization of high strength aluminum alloys, which requires the implementation of new manufacturing processes. High Speed Machining (HSM) technology is currently one of the most important machining technologies used in the aviation industry, especially in the machining of aluminium alloys. The primary difference between HSM and other milling techniques is the ability to select cutting parameters (depth of the cut layer, feed rate, and cutting speed) so as to simultaneously ensure high quality and precision of the machined surface and high machining efficiency, all of which shorten the manufacturing of integral components. In this paper, the authors explain the implementation of the HSM method in integral aircraft constructions. The paper presents the airframe manufacturing method and the final results. The HSM method is compared to the previous method, in which all subcomponents were manufactured by bending and forming processes and then joined by riveting.

  19. Bit error rate performance of Image Processing Facility high density tape recorders

    NASA Technical Reports Server (NTRS)

    Heffner, P.

    1981-01-01

    The Image Processing Facility at the NASA Goddard Space Flight Center uses High Density Tape Recorders (HDTRs) to transfer high-volume image data and ancillary information from one system to another. For ancillary information, very low bit error rates (BERs) must accompany the transfers. The facility processes about 10^11 bits of image data per day from many sensors, involving 15 independent processing systems that use HDTRs. When acquired, the 16 HDTRs offered state-of-the-art performance of 1 x 10^-6 BER, as specified. The BER requirement was later upgraded in two steps: (1) incorporating data-randomizing circuitry to yield a BER of 2 x 10^-7, and (2) further modifying to include a bit error correction capability to attain a BER of 2 x 10^-9. The total improvement factor was 500 to 1. Attention is given here to the background, technical approach, and final results of these modifications. Also discussed are the format of the data recorded by the HDTR, the magnetic tape format, the magnetic tape dropout characteristics as experienced in the Image Processing Facility, the head life history, and the reliability of the HDTRs.
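    The quoted improvement chain is easy to verify arithmetically: randomizing took the BER from 1 x 10^-6 to 2 x 10^-7 (a 5x step), error correction took it to 2 x 10^-9 (a further 100x step), and together the steps give the stated 500-to-1 factor.

```python
# Verifying the abstract's BER improvement chain numerically.
ber_spec = 1e-6        # as-delivered HDTR specification
ber_randomized = 2e-7  # after adding data-randomizing circuitry
ber_corrected = 2e-9   # after adding bit error correction

step1 = ber_spec / ber_randomized       # randomizing: 5x better
step2 = ber_randomized / ber_corrected  # error correction: 100x better
total = ber_spec / ber_corrected        # combined: 500x, as stated
print(step1, step2, total)
```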

  20. Rapid high-throughput cloning and stable expression of antibodies in HEK293 cells.

    PubMed

    Spidel, Jared L; Vaessen, Benjamin; Chan, Yin Yin; Grasso, Luigi; Kline, J Bradford

    2016-12-01

    Single-cell based amplification of immunoglobulin variable regions is a rapid and powerful technique for cloning antigen-specific monoclonal antibodies (mAbs) for purposes ranging from general laboratory reagents to therapeutic drugs. From the initial screening process involving small quantities of hundreds or thousands of mAbs, through in vitro characterization, and on to subsequent in vivo experiments requiring large quantities of only a few, having a robust system for generating mAbs from cloning through stable cell line generation is essential. A protocol was developed to decrease the time, cost, and effort required by traditional cloning and expression methods by eliminating bottlenecks in these processes. Removing the clonal selection steps from the cloning process using a highly efficient ligation-independent protocol, and from the stable cell line process by utilizing bicistronic plasmids to generate stable semi-clonal cell pools, facilitated an increased throughput of the entire process from plasmid assembly through transient transfections and selection of stable semi-clonal cell pools. Furthermore, the time required by a single individual to clone, express, and select stable cell pools in a high-throughput format was reduced from 4 to 6 months to only 4 to 6 weeks. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  1. Requirements Flowdown for Prognostics and Health Management

    NASA Technical Reports Server (NTRS)

    Goebel, Kai; Saxena, Abhinav; Roychoudhury, Indranil; Celaya, Jose R.; Saha, Bhaskar; Saha, Sankalita

    2012-01-01

    Prognostics and Health Management (PHM) principles have considerable promise to change the game of lifecycle cost of engineering systems at high safety levels by providing a reliable estimate of future system states. This estimate is a key for planning and decision making in an operational setting. While technology solutions have made considerable advances, the tie-in into the systems engineering process is lagging behind, which delays fielding of PHM-enabled systems. The derivation of specifications from high level requirements for algorithm performance to ensure quality predictions is not well developed. From an engineering perspective some key parameters driving the requirements for prognostics performance include: (1) maximum allowable Probability of Failure (PoF) of the prognostic system to bound the risk of losing an asset, (2) tolerable limits on proactive maintenance to minimize missed opportunity of asset usage, (3) lead time to specify the amount of advanced warning needed for actionable decisions, and (4) required confidence to specify when prognosis is sufficiently good to be used. This paper takes a systems engineering view towards the requirements specification process and presents a method for the flowdown process. A case study based on an electric Unmanned Aerial Vehicle (e-UAV) scenario demonstrates how top level requirements for performance, cost, and safety flow down to the health management level and specify quantitative requirements for prognostic algorithm performance.
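
    As a minimal sketch of how the quantitative requirements named above might be checked against a prognostic algorithm's output, assume the algorithm emits Monte Carlo samples of remaining useful life (RUL); all numbers below are hypothetical stand-ins for limits derived from the system-level safety and cost analysis.

```python
# Illustrative only: the RUL samples and thresholds are invented.
rul_samples = [120, 135, 150, 98, 160, 142, 110, 155, 147, 130]  # predicted RUL, hours
lead_time = 48    # (3) advance warning needed for an actionable decision, hours
max_pof = 0.05    # (1) maximum allowable probability of failure within lead time

# empirical probability that the asset fails before the decision horizon
pof = sum(r <= lead_time for r in rul_samples) / len(rul_samples)

# the prognosis is actionable only if the probability-of-failure bound holds
prognosis_usable = pof <= max_pof
```

    The flowdown process in the paper decides what `max_pof` and `lead_time` must be; a check of this shape is how an algorithm-level requirement could then be verified.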

  2. Materials Challenges in Space Exploration

    NASA Technical Reports Server (NTRS)

    Vickers, John; Shah, Sandeep

    2005-01-01

    The new vision of space exploration encompasses a broad range of human and robotic missions to the Moon, Mars and beyond. Extended human space travel requires high reliability and high performance systems for propulsion, vehicle structures, thermal and radiation protection, crew habitats and health monitoring. Advanced materials and processing technologies are necessary to meet the exploration mission requirements. Materials and processing technologies must be sufficiently mature before they can be inserted into a development program leading to an exploration mission. Exploration will be more affordable by in-situ utilization of materials on the Moon and Mars.

  3. Gear materials for high-production light-duty service

    NASA Technical Reports Server (NTRS)

    Townsend, D. P.

    1973-01-01

    The selection of a material for high-volume, low-cost gears requires careful consideration of all the requirements and the processes used to manufacture the gears. The wrong choice in material selection could very well mean the difference between success and failure. A summary of the cost that might be expected for different materials and processes is presented; it can be seen that the cost can span nearly three orders of magnitude, from the molded plastic gear to the machined gear, with stamped and powder-metal gears falling between these extremes.

  4. AltiVec performance increases for autonomous robotics for the MARSSCAPE architecture program

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.

    2002-02-01

    One of the main tall poles that must be overcome to develop a fully autonomous vehicle is the inability of the computer to understand its surrounding environment to the level required for the intended task. The military mission scenario requires a robot to interact in a complex, unstructured, dynamic environment (see 'A High Fidelity Multi-Sensor Scene Understanding System for Autonomous Navigation'). The Mobile Autonomous Robot Software Self Composing Adaptive Programming Environment (MarsScape) perception research addresses three aspects of the problem: sensor system design, processing architectures, and algorithm enhancements. A prototype perception system has been demonstrated on robotic High Mobility Multi-purpose Wheeled Vehicle and All Terrain Vehicle testbeds. This paper addresses the tall pole of processing requirements and the performance improvements based on the selected MarsScape processing architecture. The processor chosen is the Motorola AltiVec G4 PowerPC (PPC), a highly parallelized commercial Single Instruction Multiple Data processor. Both derived perception benchmarks and actual perception subsystem code are benchmarked and compared against previous Demo II Semi-autonomous Surrogate Vehicle processing architectures and against desktop personal computers (PCs). Performance gains are highlighted with progress to date, and lessons learned and future directions are described.

  5. Image data processing system requirements study. Volume 1: Analysis. [for Earth Resources Survey Program

    NASA Technical Reports Server (NTRS)

    Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.

    1973-01-01

    Digital image processing, image recorders, high-density digital data recorders, and data system element processing for use in an Earth Resources Survey image data processing system are studied. Loading to various ERS systems is also estimated by simulation.

  6. Overview of processing activities aimed at higher efficiencies and economical production

    NASA Technical Reports Server (NTRS)

    Bickler, D. B.

    1985-01-01

    An overview of processing activities aimed at higher efficiencies and economical production is presented. Present focus is on low-cost process technology for higher-efficiency cells of up to 18% or higher. Process development concerns center on the use of less-than-optimum silicon sheet, the control of production yields, and making uniformly efficient large-area cells. High-efficiency cell factors that require process development are bulk material perfection, very shallow junction formation, front-surface passivation, and finely detailed metallization. Better bulk properties of the silicon sheet, and retention of those qualities throughout large areas during cell processing, are required so that minority carrier lifetimes are maintained and cell performance is not degraded by high doping levels. When very shallow junctions are formed, the process must be sensitive to metallization punch-through, series resistance in the cell, and control of dopant leaching during surface passivation. There is a need to determine the sensitivity to processing by mathematical modeling and experimental activities.

  7. High-Level Radioactive Waste.

    ERIC Educational Resources Information Center

    Hayden, Howard C.

    1995-01-01

    Presents a method to calculate the amount of high-level radioactive waste by taking into consideration the following factors: the fission process that yields the waste, identification of the waste, the energy required to run a 1-GWe plant for one year, and the uranium mass required to produce that energy. Briefly discusses waste disposal and…
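
    The calculation outlined in this abstract can be carried through with standard physical constants. A back-of-envelope sketch, assuming roughly 200 MeV released per U-235 fission and a typical ~33% thermal-to-electric conversion efficiency:

```python
# Back-of-envelope version of the calculation described above.
SECONDS_PER_YEAR = 3.156e7
EV_TO_JOULE = 1.602e-19
AVOGADRO = 6.022e23

electric_power = 1.0e9                    # 1-GWe plant
thermal_efficiency = 0.33                 # assumed thermal-to-electric efficiency
energy_per_fission = 200e6 * EV_TO_JOULE  # ~200 MeV per fission, in joules

# thermal energy the core must release in one year
thermal_energy = electric_power * SECONDS_PER_YEAR / thermal_efficiency

# number of fissions, then the U-235 mass that many atoms represent
fissions = thermal_energy / energy_per_fission
mass_kg = fissions / AVOGADRO * 235.0 / 1000.0
# mass_kg comes out near 1.2e3 kg: roughly a tonne of U-235 fissioned per GWe-year
```

    The fission-product mass, and hence the high-level waste inventory the article estimates, is of the same order as this fissioned mass.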

  8. Driving Objectives and High-level Requirements for KP-Lab Technologies

    ERIC Educational Resources Information Center

    Lakkala, Minna; Paavola, Sami; Toikka, Seppo; Bauters, Merja; Markannen, Hannu; de Groot, Reuma; Ben Ami, Zvi; Baurens, Benoit; Jadin, Tanja; Richter, Christoph; Zoserl, Eva; Batatia, Hadj; Paralic, Jan; Babic, Frantisek; Damsa, Crina; Sins, Patrick; Moen, Anne; Norenes, Svein Olav; Bugnon, Alexandra; Karlgren, Klas; Kotzinons, Dimitris

    2008-01-01

    One of the central goals of the KP-Lab project is to co-design pedagogical methods and technologies for knowledge creation and practice transformation in an integrative and reciprocal manner. In order to facilitate this process user tasks, driving objectives and high-level requirements have been introduced as conceptual tools to mediate between…

  9. Convergent spray process for environmentally friendly coatings

    NASA Technical Reports Server (NTRS)

    Scarpa, Jack

    1995-01-01

    Conventional spray application processes have poor transfer efficiencies, resulting in an exorbitant loss of materials, solvents, and time. Also, with ever-tightening Environmental Protection Agency (EPA) regulations and Occupational Safety and Health Administration requirements, the low transfer efficiencies have a significant impact on the quantities of materials and solvents that are released into the environment. High-solids spray processes are also limited by material viscosities, thus requiring many passes over the surface to achieve a thickness in the 0.125-inch range. This results in high application costs and a negative impact on the environment. Until recently, requirements for a 100-percent-solids sprayable, environmentally friendly, lightweight thermal protection system that can be applied in a thick (greater than 0.125 inch) single-pass operation exceeded the capability of existing systems. Such coatings must be applied by hand lay-up techniques, especially for thermal and/or fire protection systems. The current formulation of these coatings has presented many problems such as worker safety, environmental hazards, waste, high cost, and application constraints. A system which can apply coatings without using hazardous materials would alleviate many of these problems. Potential applications include aerospace thermal-protective specialty coatings and fire-protection coatings for the chemical and petroleum industries that resist impact, chemicals, and weather. These markets can be penetrated by offering customized coatings applied by automated processes that are environmentally friendly.

  10. Development and Validation of High Precision Thermal, Mechanical, and Optical Models for the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles

    2006-01-01

    SIM Planetquest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars, and to detect Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9-meter baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risks and costs, and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical, and optical modeling process that will allow predictions of the system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper presents the description of the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full-scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.

  11. A framework supporting the development of a Grid portal for analysis based on ROI.

    PubMed

    Ichikawa, K; Date, S; Kaishima, T; Shimojo, S

    2005-01-01

    In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is whether or not the analysis targets data in the region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to the users on a Grid environment according to the users' two different types of processing requirements. We constructed a Grid portal which integrates interactive processing and batch processing by the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administers a ranking of data significance. The portal ensures a turn-around time for interactive processing through the priority-based job control mechanism and provides the users with quality of service (QoS) for interactive processing. The users can access the analysis results of interactive jobs in preference to the analysis results of batch jobs. The Grid portal has also achieved high-performance computation of MEG analysis with batch processing on the Grid environment. The priority-based job control mechanism freely assigns computing resources according to the users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for the users to flexibly apply large computational power to whatever they want to analyze.
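
    The priority-based job control described here can be sketched with a standard heap-based priority queue. This is a simplified stand-in for the portal's actual Grid middleware, and the job names are invented:

```python
import heapq
import itertools

counter = itertools.count()  # tie-breaker preserves submission order within a priority
queue = []

def submit(job_name, interactive):
    # interactive (ROI) jobs get priority 0 so they are dispatched before batch jobs
    priority = 0 if interactive else 1
    heapq.heappush(queue, (priority, next(counter), job_name))

def dispatch():
    # pop the highest-priority (lowest number) job; FIFO among equal priorities
    _, _, job_name = heapq.heappop(queue)
    return job_name

submit("batch: full-dataset MEG analysis", interactive=False)
submit("interactive: ROI analysis", interactive=True)
submit("batch: second dataset", interactive=False)

order = [dispatch() for _ in range(3)]
# the interactive job is dispatched first even though it was submitted second
```

    Bounding the turn-around time of interactive jobs, as the portal's QoS guarantee requires, reduces to exactly this kind of strict priority ordering at each scheduling decision.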

  12. Fluvial process and the establishment of bottomland trees

    USGS Publications Warehouse

    Scott, Michael L.; Friedman, Jonathan M.; Auble, Gregor T.

    1996-01-01

    The relation between streamflow and establishment of bottomland trees is conditioned by the dominant fluvial process or processes acting along a stream. For successful establishment, cottonwoods, poplars, and willows require bare, moist surfaces protected from disturbance. Channel narrowing, channel meandering, and flood deposition promote different spatial and temporal patterns of establishment. During channel narrowing, the site requirements are met on portions of the bed abandoned by the stream, and establishment is associated with a period of low flow lasting one to several years. During channel meandering, the requirements are met on point bars following moderate or higher peak flows. Following flood deposition, the requirements are met on flood deposits high above the channel bed. Flood deposition can occur along most streams, but where a channel is constrained by a narrow valley, this process may be the only mechanism that can produce a bare, moist surface high enough to be safe from future disturbance. Because of differences in local bedrock, tributary influence, or geologic history, two nearby reaches of the same stream may be dominated by different fluvial processes and have different spatial and temporal patterns of trees. We illustrate this phenomenon with examples from forests of plains cottonwood (Populus deltoides ssp. monilifera) along meandering and constrained reaches of the Missouri River in Montana.

  13. Robot design for a vacuum environment

    NASA Technical Reports Server (NTRS)

    Belinski, S.; Trento, W.; Imani-Shikhabadi, R.; Hackwood, S.

    1987-01-01

    The cleanliness requirements for many processing and manufacturing tasks are becoming ever stricter, resulting in a greater interest in the vacuum environment. Researchers discuss the importance of this special environment and the development of robots which are physically and functionally suited to vacuum processing tasks. Work is in progress at the Center for Robotic Systems in Microelectronics (CRSM) to provide a robot for the manufacture of a revolutionary new gyroscope in high vacuum. The need for vacuum in this and other processes is discussed, as well as the requirements for a vacuum-compatible robot. Finally, researchers present details on work done at the CRSM to modify an existing clean-room-compatible robot for use at high vacuum.

  14. The Impact of Comer's School Development Program's Student Staff Support Team Process on High-Incidence Special Education Referrals in One Elementary School

    ERIC Educational Resources Information Center

    Gibson-Robinson, Joi

    2010-01-01

    This study examines whether the Comer (1996) placement model process reduces the overrepresentation of certain student groups into high-incidence disabilities programs. High-incidence disabilities are those disabilities which require an extensive degree of "professional judgment" by the teacher in determining whether or not a disability exists…

  15. KENNEDY SPACE CENTER, FLA. - This bird's-eye view of a high bay in the Orbiter Processing Facility (OPF) shows Space Shuttle Atlantis surrounded by the standard platforms and equipment required to process a Space Shuttle orbiter for flight. The high bay is 197 feet (60 meters) long, 150 feet (46 meters) wide, 95 feet (29 meters) high, and encompasses a 29,000-square-foot (2,694-square-meter) area. Platforms, a main access bridge, and two rolling bridges with trucks provide access to various parts of the orbiter. The next mission scheduled for Atlantis is STS-114, a utilization and logistics flight to the International Space Station.

    NASA Image and Video Library

    2003-09-03


  16. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time, and the development of specialized procedures. In many National Mapping and Cadastral Agencies (NMCAs), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process involved comparing images from different periods. The success rates in identifying objects were low, and most detections were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technology, advances in image processing and computer vision algorithms, and the advent of digital aerial cameras with an NIR band and Very High Resolution satellites have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis, and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step toward updating the NTDB at the Survey of Israel.
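
    The Digital Surface Model analysis step of such a pipeline can be illustrated with a toy example: differencing two-epoch surface models and flagging cells whose height change exceeds a noise threshold. All values here are invented; the operational system layers MS classification, segmentation, and object analysis on top of this kind of primitive.

```python
# Toy two-epoch digital surface models (heights in metres); values invented.
dsm_old = [[10, 10, 10, 10],
           [10, 10, 10, 10],
           [10, 10, 10, 10],
           [10, 10, 10, 10]]
dsm_new = [[10, 10, 10, 10],
           [10, 18, 18, 10],   # a new building raises the surface by 8 m
           [10, 18, 18, 10],
           [10, 10, 10, 10]]
threshold = 2.5  # metres: ignore vegetation growth and sensor noise

# flag every cell whose height changed by more than the threshold
changed = [(r, c)
           for r in range(len(dsm_old))
           for c in range(len(dsm_old[0]))
           if abs(dsm_new[r][c] - dsm_old[r][c]) > threshold]
```

    The flagged cells then become change candidates for classification and editorial review rather than being written to the database directly, which is how the false-alarm problem described above is contained.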

  17. Russian Earth Science Research Program on ISS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armand, N. A.; Tishchenko, Yu. G.

    1999-01-22

    A version of the Russian Earth science research program on the Russian segment of ISS is proposed. The favored tasks are selected which may be solved with the use of space remote sensing methods and tools and which are worthwhile to realize. For solving these tasks, specialized device sets (submodules) corresponding to the specifics of the tasks are being worked out. They would be specialized modules transported to the ISS. Earth remote sensing research and ecological monitoring (high data rates and large volumes of spaceborne information, comparatively stringent requirements on processing time, etc.) place rather high requirements on the ground segment for receiving, processing, storing, and distributing space information in the interest of investigating Earth's natural resources. Creation of the ground segment has required the development of an interdepartmental data receiving and processing center. Main directions of work within the framework of the ISS program are determined.

  18. The integrated design and archive of space-borne signal processing and compression coding

    NASA Astrophysics Data System (ADS)

    He, Qiang-min; Su, Hao-hang; Wu, Wen-bo

    2017-10-01

    With the increasing demand from users for extraction of remote sensing image information, it is urgent to significantly enhance the whole system's imaging quality and imaging ability through an integrated design that achieves a compact structure, light weight, and higher attitude-maneuver ability. At present, the remote sensing camera's video signal processing unit and its image compression and coding unit are distributed in different devices. The volume, weight, and power consumption of these two units are relatively large, which fails to meet the requirements of a high-mobility remote sensing camera. Based on the high-mobility remote sensing camera's technical requirements, this paper designs a space-borne integrated signal processing and compression circuit by drawing on several technologies, such as high-speed, high-density analog-digital mixed PCB design, embedded DSP technology, and image compression based on special-purpose chips. This circuit lays a solid foundation for research on the high-mobility remote sensing camera.

  19. Advances in the Development of a WCl6 CVD System for Coating UO2 Powders with Tungsten

    NASA Technical Reports Server (NTRS)

    Mireles, Omar R.; Tieman, Alyssa; Broadway, Jeramie; Hickman, Robert

    2013-01-01

    Demonstrated viability and utilization of: a) a fluidized powder bed; b) the WCl6 CVD process; and c) tungsten coating of spherical particles. The highly corrosive nature of the solid WCl6 reagent limits the materials of construction. There are indications that identifying optimized process variables will require substantial effort, and the optimum values will likely vary with changes in fuel requirements.

  20. UV-LIGA microfabrication process for sub-terahertz waveguides utilizing multiple layered SU-8 photoresist

    NASA Astrophysics Data System (ADS)

    Malekabadi, Ali; Paoloni, Claudio

    2016-09-01

    A microfabrication process based on UV LIGA (German acronym of lithography, electroplating and molding) is proposed for the fabrication of relatively high aspect ratio sub-terahertz (100-1000 GHz) metal waveguides, to be used as a slow wave structure in sub-THz vacuum electron devices. The high accuracy and tight tolerances required to properly support frequencies in the sub-THz range can be only achieved by a stable process with full parameter control. The proposed process, based on SU-8 photoresist, has been developed to satisfy high planar surface requirements for metal sub-THz waveguides. It will be demonstrated that, for a given thickness, it is more effective to stack a number of layers of SU-8 with lower thickness rather than using a single thick layer obtained at lower spin rate. The multiple layer approach provides the planarity and the surface quality required for electroforming of ground planes or assembly surfaces and for assuring low ohmic losses of waveguides. A systematic procedure is provided to calculate soft and post-bake times to produce high homogeneity SU-8 multiple layer coating as a mold for very high quality metal waveguides. A double corrugated waveguide designed for 0.3 THz operating frequency, to be used in vacuum electronic devices, was fabricated as test structure. The proposed process based on UV LIGA will enable low cost production of high accuracy sub-THz 3D waveguides. This is fundamental for producing a new generation of affordable sub-THz vacuum electron devices, to fill the technological gap that still prevents a wide diffusion of numerous applications based on THz radiation.

  1. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    NASA Astrophysics Data System (ADS)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

    Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high aspect ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature scale model that incorporates surface chemical reactions [1]. When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate-fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after final profile tuning steps. A natural optimization required of these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films [2].

  2. Maneuver Recovery Analysis for the Magnetospheric Multiscale Mission

    NASA Technical Reports Server (NTRS)

    Gramling, Cheryl; Carpenter, Russell; Volle, Michael; Lee, Taesul; Long, Anne

    2007-01-01

    The use of spacecraft formations creates new and more demanding requirements for orbit determination accuracy. In addition to absolute navigation requirements, there are typically relative navigation requirements that are based on the size or shape of the formation. The difficulty in meeting these requirements is related to the relative dynamics of the spacecraft orbits and the frequency of the formation maintenance maneuvers. This paper examines the effects of bi-weekly formation maintenance maneuvers on the absolute and relative orbit determination accuracy for the four-spacecraft Magnetospheric Multiscale (MMS) formation. Results are presented from high fidelity simulations that include the effects of realistic orbit determination errors in the maneuver planning process. Solutions are determined using a high accuracy extended Kalman filter designed for onboard navigation. Three different solutions are examined, considering the effects of process noise and measurement rate on the solutions.
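
    As a minimal illustration of the filtering machinery involved (a scalar Kalman filter with identity dynamics, not the high-fidelity multi-spacecraft EKF used for MMS), the interplay of process noise Q and measurement noise R looks like:

```python
def kalman_step(x, P, z, Q, R):
    """One predict/update cycle of a scalar Kalman filter with identity dynamics."""
    P = P + Q                 # predict: uncertainty grows by the process noise
    K = P / (P + R)           # Kalman gain: how much to trust the new measurement
    x = x + K * (z - x)       # update the state estimate toward measurement z
    P = (1.0 - K) * P         # updated uncertainty shrinks after the measurement
    return x, P

x, P = 0.0, 1.0                    # poor initial estimate, large initial uncertainty
for z in [1.0, 1.1, 0.9, 1.05]:    # noisy measurements of a true value near 1.0
    x, P = kalman_step(x, P, z, Q=0.01, R=0.1)
# the estimate converges toward the measurements and P falls well below 1.0
```

    Raising Q makes the filter weight new measurements more heavily, which is the same trade-off that maneuver-induced dynamics errors force on the orbit determination solutions studied in this paper.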

  3. A Risk Management Framework to Characterize Black Swan Risks: A Case Study of Lightning Effects on Insensitive High Explosives

    NASA Astrophysics Data System (ADS)

    Sanders, Gary A.

    Effective and efficient risk management processes include the use of high fidelity modeling and simulation during the concept exploration phase as part of the technology and risk assessment activities, with testing and evaluation tasks occurring in later design development phases. However, some safety requirements and design architectures may be dominated by the low-probability/high-consequence "Black Swan" vulnerabilities that require very early testing to characterize and efficiently mitigate. Failure to address these unique risks has led to catastrophic system failures including the space shuttle Challenger, Deepwater Horizon, the Fukushima nuclear reactor, and the Katrina dike failures. Discovering and addressing these risks later in the design and development process can be very costly or even lead to project cancellation. This paper examines the need for risk management processes to adopt early hazard phenomenology testing to inform the technical risk assessment, requirements definition, and conceptual design. A case study of the lightning design vulnerability of the insensitive high explosives used in the construction, mining, demolition, and defense industries is presented to examine the impact of this vulnerability testing during the concept exploration phase of the design effort. While these insensitive high explosives are far less sensitive to accidental initiation by fire, impact, friction, or even electrical stimuli, their full range of sensitivities has not been characterized, and ensuring safe engineering design and operations during events such as lightning storms requires vulnerability testing during the risk assessment phase.

  4. Constellation Mission Operation Working Group: ESMO Maneuver Planning Process Review

    NASA Technical Reports Server (NTRS)

    Moyer, Eric

    2015-01-01

    The Earth Science Mission Operation (ESMO) Project created an Independent Review Board to review our conjunction risk evaluation process and maneuver planning process to identify improvements that safely manage mission conjunction risks, maintain ground track science requirements, and minimize overall hours expended on High Interest Events (HIEs). The Review Board is evaluating the current maneuver process, which requires support from multiple groups. In the past year there have been several changes to the processes, although many prior and new concerns exist. This presentation will discuss the maneuver process reviews and Board comments, the ESMO assessment and path forward, ESMO future plans, and recent changes and concerns.

  5. Technological Enhancements for Personal Computers

    DTIC Science & Technology

    1992-03-01

    quicker order processing, shortening the time required to obtain critical spare parts. Customer service and spare parts tracking are facilitated by...cards speed up order processing and filing. Bar code readers speed inventory control processing. D. DEPLOYMENT PLANNING. Many units with high mobility

  6. Solar Pumped Lasers and Their Applications

    NASA Technical Reports Server (NTRS)

    Lee, Ja H.

    1991-01-01

    Since 1980, NASA has been pursuing high power solar lasers as part of the space power beaming program. Materials in liquid, solid, and gas phases have been evaluated against the requirements for solar pumping. Two basic characteristics of solar insolation, namely its diffuse irradiance and 5800 K blackbody-like spectrum, impose rather stringent requirements for laser excitation. However, meeting these requirements is not insurmountable as solar thermal energy technology has progressed today, and taking advantage of solar-pumped lasers is becoming increasingly attractive. The high density photons of concentrated solar energy have been used mainly for electric power generation and thermal processing of materials by the DOE Solar Thermal Technologies Program. However, the photons can interact with materials through many other direct kinetic paths, and applications of the concentrated photons could be extended to processes requiring photolysis, photosynthesis, and photoexcitation. The use of solar-pumped lasers on Earth seems constrained by economics and sociopolitics. Therefore, prospective applications may be limited to those that require the quantum effects and coherency of the laser in order to generate extremely high value products and services when conventional and inexpensive means are ineffective or impossible. The new applications already proposed for concentrated solar photons, such as destruction of hazardous waste, production of renewable fuel, production of fertilizer, and air/water pollution controls, may benefit from the use of inexpensive solar-pumped lasers matched with the photochemical kinetics of these processes.

  7. High-precision laser microcutting and laser microdrilling using diffractive beam-splitting and high-precision flexible beam alignment

    NASA Astrophysics Data System (ADS)

    Zibner, F.; Fornaroli, C.; Holtkamp, J.; Shachaf, Lior; Kaplan, Natan; Gillner, A.

    2017-08-01

    High-precision laser micromachining is steadily gaining importance in industrial applications. Optical systems like the helical optics offer the highest quality together with controllable and adjustable drilling geometry, such as taper angle, aspect ratio, and heat-affected zone. The helical optics is based on a rotating Dove prism which is mounted in a hollow-shaft engine together with other optical elements like wedge prisms and plane plates. Although the achievable quality is extremely high, the low process efficiency is a main reason that this manufacturing technology has only limited demand within the industrial market. The objective of the research studies presented in this paper is to dramatically increase process efficiency as well as process flexibility. During the last years, the average power of commercial ultra-short pulsed laser sources has increased significantly. The efficient utilization of the high average laser power in the field of material processing requires an effective distribution of the laser power onto the work piece. One approach to increase the efficiency is the application of beam-splitting devices to enable parallel processing. Multi-beam processing is used to parallelize the fabrication of periodic structures, as most applications require only a fraction of the emitted ultra-short pulsed laser power. In order to achieve the highest flexibility while using multi-beam processing, the single beams are diverted and re-guided in a way that enables each partial beam to process spatially separated samples or semi-finished parts.

  8. A comprehensive review on utilization of wastewater from coffee processing.

    PubMed

    Rattan, Supriya; Parande, A K; Nagaraju, V D; Ghiwari, Girish K

    2015-05-01

    The coffee processing industry is one of the major agro-based industries, contributing significantly to national and international growth. Coffee fruits are processed by two methods, the wet and the dry process. In wet processing, coffee fruits generate enormous quantities of high-strength wastewater requiring systematic treatment prior to disposal. Several approaches have been used to treat this wastewater. Many researchers have attempted to assess the efficiency of batch aeration as post-treatment of coffee processing wastewater from an upflow anaerobic hybrid reactor (UAHR) under continuous and intermittent aeration. However, wet coffee processing requires a high degree of processing know-how and produces large amounts of effluents which have the potential to damage the environment. Wastewater from coffee processing has a biological oxygen demand (BOD) of up to 20,000 mg/l and a chemical oxygen demand (COD) of up to 50,000 mg/l, as well as high acidity, with pH below 4. In this review paper, various methods for treating coffee processing wastewater are discussed; the composition of the wastewater is presented, and technical solutions for its treatment are reviewed.

  9. A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Boyles, Carole A.

    2008-01-01

    The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.

  10. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
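
    As a toy illustration of the requirements-based coverage idea described above — assuming requirements can be modeled as predicates over observed test traces, which is a simplification of the report's formal metrics — coverage can be computed as the fraction of requirements exercised by at least one test:

    ```python
    from typing import Callable, Dict, List

    # Toy model: a requirement is a predicate over a single test trace
    # (a list of observed states). A requirement counts as covered when
    # at least one test in the suite exercises it. Names and structure
    # are illustrative, not taken from the report.

    Trace = List[dict]
    Requirement = Callable[[Trace], bool]

    def requirements_coverage(reqs: Dict[str, Requirement],
                              suite: List[Trace]) -> float:
        """Fraction of requirements exercised by at least one test trace."""
        covered = {name for name, req in reqs.items()
                   if any(req(trace) for trace in suite)}
        return len(covered) / len(reqs)

    # Example: two simple requirements over a hypothetical altitude-alert system.
    reqs = {
        "alert_when_low": lambda t: any(s["alt"] < 100 and s["alert"] for s in t),
        "silent_when_high": lambda t: any(s["alt"] >= 100 and not s["alert"] for s in t),
    }
    suite = [
        [{"alt": 50, "alert": True}, {"alt": 150, "alert": False}],
    ]
    print(requirements_coverage(reqs, suite))  # both requirements exercised -> 1.0
    ```

    A test suite that never drives the system above the threshold would leave the second requirement uncovered, flagging a validation gap that structural code coverage alone would not reveal.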

  11. A review on plasma-etch-process induced damage of HgCdTe

    NASA Astrophysics Data System (ADS)

    Liu, Lingfeng; Chen, Yiyu; Ye, Zhenhua; Ding, Ruijun

    2018-05-01

    Dry etching techniques with minimal etch-induced damage are required to develop the highly anisotropic etches needed for pixel delineation of HgCdTe infrared focal plane arrays (IRFPAs). High-density plasma processing has become the main etching technique for HgCdTe over the past twenty years. In this paper, high-density plasma electron cyclotron resonance (ECR) and inductively coupled plasma (ICP) etching of HgCdTe are summarized. Plasma-etch-process induced type conversion and its related mechanisms are reviewed in particular.

  12. Thermal design of the space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Bachrtel, F. D.; Vaniman, J. L.; Stuckey, J. M.; Gray, C.; Widofsky, B.

    1985-01-01

    The shuttle external tank thermal design presents many challenges in meeting the stringent requirements established by the structures, main propulsion systems, and Orbiter elements. The selected thermal protection design had to meet these requirements along with ease of application and suitability for mass production, while providing low weight, low cost, and high reliability. This development led to a spray-on foam insulation (SOFI) which covers the entire tank. The need and design for a SOFI material with a dual role of cryogenic insulation and ablator, and the development of the SOFI-over-SLA concept for high heating areas, are discussed. Further issues of minimum surface ice/frost, no debris, and the development of the TPS spray process considering the required quality and process control are examined.

  13. 78 FR 75571 - Independent Assessment of the Process for the Review of Device Submissions; High Priority...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... of performing the technical analysis, management assessment, and program evaluation tasks required to.... Analysis of elements of the review process (including the presubmission process, and investigational device... time to facilitate a more efficient process. This includes analysis of root causes for inefficiencies...

  14. The Use of Interactive Methods in the Educational Process of the Higher Education Institution

    ERIC Educational Resources Information Center

    Kutbiddinova, Rimma A.; Eromasova, Aleksandra A.; Romanova, Marina A.

    2016-01-01

    The modernization of higher education and the transition to the new Federal Education Standards require higher-quality training of graduates. The training of highly qualified specialists must meet strict requirements: a high level of professional competence, developed communication skills, the ability to predict the results of one's own…

  15. Design and fabrication of label-free biochip using a guided mode resonance filter with nano grating structures by injection molding process.

    PubMed

    Cho, E; Kim, B; Choi, S; Han, J; Jin, J; Han, J; Lim, J; Heo, Y; Kim, S; Sung, G Y; Kang, S

    2011-01-01

    This paper introduces technology to fabricate a guided mode resonance filter biochip using injection molding. Of the various nanofabrication processes that exist, injection molding is the most suitable for the mass production of polymer nanostructures. Fabrication of a nanograting pattern for guided mode resonance filters by injection molding requires a durable metal stamp, because of the high injection temperature and pressure. Careful consideration of the optimized process parameters is also required to achieve uniform sub-wavelength gratings with high fidelity. In this study, a metallic nanostructure pattern to be used as the stamp for the injection molding process was fabricated using electron beam lithography, a UV nanoimprinting process, and an electroforming process. A one-dimensional nanograting substrate was replicated by injection molding, during which the process parameters were controlled. To evaluate the geometric quality of the injection molded nanograting patterns, the surface profile of the fabricated nanograting for different processing conditions was analyzed using an atomic force microscope and a scanning electron microscope. Finally, to demonstrate the feasibility of the proposed process for fabricating guided mode resonance filter biochips, a high-refractive-index material was deposited on the polymer nanograting and its guided mode resonance characteristics were analyzed.

  16. HEVC real-time decoding

    NASA Astrophysics Data System (ADS)

    Bross, Benjamin; Alvarez-Mesa, Mauricio; George, Valeri; Chi, Chi Ching; Mayer, Tobias; Juurlink, Ben; Schierl, Thomas

    2013-09-01

    The new High Efficiency Video Coding Standard (HEVC) was finalized in January 2013. Compared to its predecessor H.264 / MPEG4-AVC, this new international standard is able to reduce the bitrate by 50% for the same subjective video quality. This paper investigates decoder optimizations that are needed to achieve HEVC real-time software decoding on a mobile processor. It is shown that HEVC real-time decoding up to high definition video is feasible using instruction extensions of the processor while decoding 4K ultra high definition video in real-time requires additional parallel processing. For parallel processing, a picture-level parallel approach has been chosen because it is generic and does not require bitstreams with special indication.
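
    The picture-level parallel approach can be sketched as follows. This is a hedged toy model: decode_picture is a stand-in for a real decoder, and in practice HEVC inter-predicted pictures have reference dependencies that constrain which pictures may run concurrently (the approach applies directly to pictures without mutual dependencies):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Toy sketch of picture-level parallel decoding: pictures with no
    # mutual reference dependencies are decoded concurrently, then
    # reordered into display order by picture order count (POC).

    def decode_picture(coded):
        poc, payload = coded             # picture order count + coded data
        return poc, payload.upper()      # pretend "decoding" transforms the payload

    def decode_parallel(coded_pictures, workers=4):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            decoded = list(pool.map(decode_picture, coded_pictures))
        # emit frames in display order regardless of completion order
        return [frame for _, frame in sorted(decoded)]

    stream = [(2, "c"), (0, "a"), (1, "b")]
    print(decode_parallel(stream))  # ['A', 'B', 'C']
    ```

    The appeal noted in the abstract carries over: the scheme is generic because each worker runs an unmodified single-picture decoder, with no special bitstream signaling required.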

  17. Review of the workshop on low-cost polysilicon for terrestrial photovoltaic solar cell applications

    NASA Technical Reports Server (NTRS)

    Lutwack, R.

    1986-01-01

    Topics reviewed include: polysilicon material requirements; effects of impurities; requirements for high-efficiency solar cells; economics; development of silane processes; fluidized-bed processor development; silicon purification; and marketing.

  18. Managing unexpected events in the manufacturing of biologic medicines.

    PubMed

    Grampp, Gustavo; Ramanan, Sundar

    2013-08-01

    The manufacturing of biologic medicines (biologics) requires robust process and facility design, rigorous regulatory compliance, and a well-trained workforce. Because of the complex attributes of biologics and their sensitivity to production and handling conditions, manufacturing of these medicines also requires a high-reliability manufacturing organization. As required by regulators, such an organization must monitor the state-of-control for the manufacturing process. A high-reliability organization also invests in an experienced and fully engaged technical support staff and fosters a management culture that rewards in-depth analysis of unexpected results, robust risk assessments, and timely and effective implementation of mitigation measures. Such a combination of infrastructure, technology, human capital, management, and a science-based operations culture does not occur without a strong organizational and financial commitment. These attributes of a high-reliability biologics manufacturer are difficult to achieve and may be differentiating factors as the supply of biologics diversifies in future years.

  19. Non-contact Measurement of Creep in Ultra-High-Temperature Materials

    DTIC Science & Technology

    2009-11-04

    Task 1: Process UHTC materials at the relevant temperatures in Electrostatic Levitation for extended periods. Task 2: Prepare the required high...Electrostatic Levitation. ITI: Industrial Tectonics, Inc. MSFC: NASA George C. Marshall Space Flight Center. NASA: National Aeronautics and Space...was divided into certain research questions: Can high-precision UHTC spheres be processed in Electrostatic Levitation (ESL) at the relevant

  20. Metrology: Calibration and measurement processes guidelines

    NASA Technical Reports Server (NTRS)

    Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.

    1994-01-01

    The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.

  1. Mass storage technology in networks

    NASA Astrophysics Data System (ADS)

    Ishii, Katsunori; Takeda, Toru; Itao, Kiyoshi; Kaneko, Reizo

    1990-08-01

    Trends and features of mass storage subsystems in networks are surveyed and their key technologies spotlighted. Storage subsystems are becoming increasingly important in new network systems in which communications and data processing are systematically combined. These systems require a new class of high-performance mass-information storage in order to effectively utilize their processing power. The requirements of high transfer rates, high transaction rates, and large storage capacities, coupled with high functionality, fault tolerance, and flexibility in configuration, are major challenges in storage subsystems. Recent progress in optical disk technology has improved the performance of optical disk drives as on-line external memories, which now compete with mid-range magnetic disks. Optical disks are more effective than magnetic disks for low-traffic random-access files storing multimedia data that require large capacity, such as in archival use and in information distribution by ROM disks. Finally, image-coded document file servers for local area network use that employ 130 mm rewritable magneto-optical disk subsystems are demonstrated.

  2. Cost-effective lightweight mirrors for aerospace and defense

    NASA Astrophysics Data System (ADS)

    Woodard, Kenneth S.; Comstock, Lovell E.; Wamboldt, Leonard; Roy, Brian P.

    2015-05-01

    The demand for high performance, lightweight mirrors was historically driven by aerospace and defense (A&D), but now we are also seeing similar requirements for commercial applications. These applications range from aerospace-like platforms such as small unmanned aircraft for agricultural, mineral, and pollutant aerial mapping to an eye-tracking gimbaled mirror for optometry offices. While aerospace and defense businesses can often justify the high cost of exotic, low-density materials, commercial products rarely can. Also, to obtain high performance with low overall optical system weight, aspheric surfaces are often prescribed. This may drive the manufacturing process to diamond machining, thus requiring the reflective side of the mirror to be a diamond-machinable material. This paper summarizes the diamond-machined finishing and coating of some high performance, lightweight designs using non-exotic substrates to achieve cost-effective mirrors. The results indicate that these processes can meet typical aerospace and defense requirements but may also be competitive in some commercial applications.

  3. Correlative microscopy including CLSM and SEM to improve high-speed, high-resolution laser-engraved print and embossing forms

    NASA Astrophysics Data System (ADS)

    Bohrer, Markus; Schweitzer, Michael; Nirnberger, Robert; Weinberger, Bernhard

    2015-10-01

    The industrial market for processing large-scale films has seen dramatic changes since the 1980s; film has almost completely been replaced by lasers and digital processes. A commonly used technology for engraving screens, print, and embossing forms in the printing industry, well known since then, is the use of RF-excited CO2 lasers with a beam power up to about 1 kW, modulated in accordance with the pattern to be engraved. Future needs for high-security printing (banknotes, security papers, passports, etc.) will require laser engraving of at least half a million or even more structured elements with depths from a few μm up to 500 μm. Industry now wants photorealistic pictures in packaging design, which requires similar performance. To ensure 'trusted pulses' from the digital process through to the print result, the use of correlative microscopy (CLSM and SEM) is demonstrated in this paper as a complete chain for a correlative print process.

  4. High Speed PC Based Data Acquisition and Instrumentation for Measurement of Simulated Low Earth Orbit Thermally Induced Disturbances

    NASA Technical Reports Server (NTRS)

    Sills, Joel W., Jr.; Griffin, Thomas J. (Technical Monitor)

    2001-01-01

    The Hubble Space Telescope (HST) Disturbance Verification Test (DVT) was conducted to characterize the responses of the Observatory's new set of rigid solar arrays (SA3) to thermally induced 'creak' or stiction releases. The data acquired in the DVT were used in verification of the HST Pointing Control System on-orbit performance, post-Servicing Mission 3B (SM3B). The test simulated the on-orbit environment on a deployed SA3 flight wing. Instrumentation for this test required pretest simulations in order to select the correct sensitivities. Vacuum-compatible, highly accurate accelerometers and force gages were used for this test. The complexity of the test, as well as a short planning schedule, required a data acquisition system that was easy to configure, highly flexible, and extremely robust. A PC Windows-oriented data acquisition system meets these requirements, allowing the test engineers to minimize the time required to plan and perform complex environmental tests. The SA3 DVT provided a direct, practical, and complex demonstration of the versatility that PC-based data acquisition systems provide. Two PC-based data acquisition systems were assembled to acquire, process, and distribute signals from several types of transducers used in the SA3 DVT, and to provide real-time processing. A high-sample-rate digital tape recorder was used to archive the sensor signals. The two systems provided multi-channel hardware and software architecture and were selected based on the test requirements. How these systems acquire and process multiple data rates from different transducer types is discussed, along with the system hardware and software architecture.

  5. Collaborative Manufacturing for Small-Medium Enterprises

    NASA Astrophysics Data System (ADS)

    Irianto, D.

    2016-02-01

    Manufacturing systems involve decisions concerning production processes, capacity, planning, and control. In an MTO manufacturing system, strategic decisions concerning fulfilment of customer requirements, manufacturing cost, and due date of delivery are the most important. In order to accelerate the decision making process, research on the decision making structure when receiving orders and sequencing activities under limited capacity is required. An effective decision making process is typically required by small-medium component and tool makers acting as supporting industries to large industries. On one side, metal small-medium enterprises are expected to produce parts, components, or tools (i.e. jigs, fixtures, molds, and dies) with high precision, low cost, and exact delivery time. On the other side, a metal small-medium enterprise may have a weak bargaining position due to aspects such as low production capacity, limited budget for material procurement, and limited high precision machines and equipment. Instead of receiving orders exclusively, a small-medium enterprise can collaborate with other small-medium enterprises in order to fulfill requirements of high quality, low manufacturing cost, and just-in-time delivery. Small-medium enterprises can share their best capabilities to form effective supporting industries. An independent body such as the community service at a university can take a role as collaboration manager. The Laboratory of Production Systems at Bandung Institute of Technology has implemented shared manufacturing systems for small-medium enterprise collaboration.

  6. Method and apparatus for energy efficient self-aeration in chemical, biochemical, and wastewater treatment processes

    DOEpatents

    Gao, Johnway [Richland, WA; Skeen, Rodney S [Pendleton, OR

    2002-05-28

    The present invention is a pulse spilling self-aerator (PSSA) that has the potential to greatly lower the installation, operation, and maintenance cost associated with aerating and mixing aqueous solutions. Currently, large quantities of low-pressure air are required in aeration systems to support many biochemical production processes and wastewater treatment plants. Oxygen is traditionally supplied and mixed by a compressor or blower and a mechanical agitator. These systems have high-energy requirements and high installation and maintenance costs. The PSSA provides a mixing and aeration capability that can increase operational efficiency and reduce overall cost.

  7. High-Speed Particle-in-Cell Simulation Parallelized with Graphic Processing Units for Low Temperature Plasmas for Material Processing

    NASA Astrophysics Data System (ADS)

    Hur, Min Young; Verboncoeur, John; Lee, Hae June

    2014-10-01

    Compared with fluid simulations, particle-in-cell (PIC) simulations offer high fidelity for plasma devices requiring transient kinetic modeling. They use fewer approximations of the plasma kinetics but require many particles and grid cells to obtain meaningful results, which means the simulation time grows in proportion to the number of particles. Therefore, PIC simulation needs high performance computing. In this research, a graphic processing unit (GPU) is adopted for high performance computing of PIC simulation for low temperature discharge plasmas. GPUs have many-core processors and high memory bandwidth compared with a central processing unit (CPU). NVIDIA GeForce GPUs with hundreds of cores, which offer cost-effective performance, were used for the tests. The PIC algorithm is divided into two modules: a field solver and a particle mover. The particle mover module is divided into four routines, named move, boundary, Monte Carlo collision (MCC), and deposit. Overall, the GPU code solves particle motions as well as the electrostatic potential in two-dimensional geometry almost 30 times faster than a single-CPU code. This work was supported by the Korea Institute of Science and Technology Information.
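
    A minimal sketch of the 'move' and 'boundary' routines named above, written as a plain Python loop over a 1D electrostatic model; in the GPU version each loop iteration would map to one thread. The field solver, MCC, and deposit routines are omitted, and all numerical values are illustrative:

    ```python
    # Minimal 1D electrostatic particle push with periodic boundaries.
    # Each particle is independent, which is exactly what makes this
    # loop map well onto one-thread-per-particle GPU kernels.

    def push(xs, vs, e_field, qm, dt, length):
        """Advance particles one step: accelerate, move, wrap at boundaries."""
        for i in range(len(xs)):
            vs[i] += qm * e_field(xs[i]) * dt       # accelerate (move kernel)
            xs[i] = (xs[i] + vs[i] * dt) % length   # advance + wrap (boundary kernel)
        return xs, vs

    xs, vs = [0.0], [0.0]
    qm = -1.76e11                 # electron charge-to-mass ratio, C/kg
    push(xs, vs, lambda x: 100.0, qm, 1e-12, 0.01)  # uniform 100 V/m field
    print(vs[0])                  # ≈ -17.6 m/s after one step
    ```

    In a real PIC cycle this push alternates with the deposit step (charge to grid) and the field solve (grid to forces), and the MCC step stochastically scatters particles between pushes.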

  8. HIRIS (High-Resolution Imaging Spectrometer): Science opportunities for the 1990s. Earth observing system. Volume 2C: Instrument panel report

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The high-resolution imaging spectrometer (HIRIS) is an Earth Observing System (EOS) sensor developed for high spatial and spectral resolution. It can acquire more information in the 0.4 to 2.5 micrometer spectral region than any other sensor yet envisioned. Its capability for critical sampling at high spatial resolution makes it an ideal complement to the MODIS (moderate-resolution imaging spectrometer) and HMMR (high-resolution multifrequency microwave radiometer), lower resolution sensors designed for repetitive coverage. With HIRIS it is possible to observe transient processes in a multistage remote sensing strategy for Earth observations on a global scale. The objectives, science requirements, and current sensor design of the HIRIS are discussed along with the synergism of the sensor with other EOS instruments and data handling and processing requirements.

  9. Thin film resonator technology.

    PubMed

    Lakin, Kenneth M

    2005-05-01

    Advances in wireless systems have placed increased demands on high performance frequency control devices for operation into the microwave range. With spectrum crowding, high bandwidth requirements, miniaturization, and low cost requirements as a background, the thin film resonator technology has evolved into the mainstream of applications. This technology has been under development for over 40 years in one form or another, but it required significant advances in integrated circuit processing to reach microwave frequencies and practical manufacturing for high-volume applications. This paper will survey the development of the thin film resonator technology and describe the core elements that give rise to resonators and filters for today's high performance wireless applications.

  10. Assessment of atmospheric moisture harvesting by direct cooling

    NASA Astrophysics Data System (ADS)

    Gido, Ben; Friedler, Eran; Broday, David M.

    2016-12-01

    The enormous amount of water vapor present in the atmosphere may serve as a potential water resource. An index is proposed for assessing the feasibility and energy requirements of atmospheric moisture harvesting by a direct cooling process. A climate-based analysis of different locations reveals the global potential of this process. We demonstrate that the Moisture Harvesting Index (MHI) can be used for assessing the energy requirements of atmospheric moisture harvesting. The efficiency of atmospheric moisture harvesting is highly weather and climate dependent, with the smallest estimated energy requirement found in the tropical regions of the Philippines (0.23 kWh/L). Less favorable locations have much higher energy demands for the operation of an atmospheric moisture harvesting device. In such locations, using the MHI to select the optimal operation time periods (during the day and the year) can reduce the specific energy requirements of the process dramatically. Still, using current technology the energy requirement of atmospheric moisture harvesting by a direct air cooling process is significantly higher than that of desalination by reverse osmosis.
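
    A back-of-envelope sketch of the energy floor for direct-cooling moisture harvesting: counting only the latent heat of condensation divided by an assumed heat-pump COP, and ignoring the sensible cooling of excess dry air, which dominates in unfavorable climates. The constants below are textbook assumptions, not values from the paper, though with COP = 3 the result happens to land in the same range as the best-case estimate above:

    ```python
    # Idealized electrical energy to condense 1 L (1 kg) of water from air.
    # Assumed constants (not from the paper):
    LATENT_HEAT_KJ_PER_KG = 2450.0   # heat of condensation near 20 degC
    KJ_PER_KWH = 3600.0

    def specific_energy_kwh_per_litre(cop):
        """Latent-heat floor per litre of condensate at a given heat-pump COP."""
        return LATENT_HEAT_KJ_PER_KG / cop / KJ_PER_KWH

    print(round(specific_energy_kwh_per_litre(3.0), 2))  # 0.23
    ```

    Real devices pay substantially more than this floor whenever the dew point is low, because large volumes of air must be sensibly cooled per litre of water recovered — which is why climate-aware scheduling via an index like the MHI matters.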

  11. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable a fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
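
    The scalability analysis can be illustrated with Amdahl's law, speedup = 1 / ((1 - p) + p/n) for parallel fraction p on n cores: a 12-fold speedup on 12 cores implies p ≈ 1 for this workload. A minimal sketch:

    ```python
    # Amdahl's-law speedup for a parallel fraction p on n cores.
    # Even a 1% serial fraction noticeably caps the speedup at 12 cores.

    def amdahl_speedup(p, n):
        """p: parallelizable fraction of runtime, n: number of cores."""
        return 1.0 / ((1.0 - p) + p / n)

    print(round(amdahl_speedup(0.99, 12), 1))   # 10.8
    print(round(amdahl_speedup(1.0, 12), 1))    # 12.0
    ```

    The gap between 10.8x (p = 0.99) and the reported 12x shows why the platform's block-volume parallelization and load balancing must leave essentially no serial residue in the pipeline.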

  12. Molybdenum-base cermet fuel development

    NASA Astrophysics Data System (ADS)

    Pilger, James P.; Gurwell, William E.; Moss, Ronald W.; White, George D.; Seifert, David A.

    Development of a multimegawatt (MMW) space nuclear power system requires identification and resolution of several technical feasibility issues before selecting one or more promising system concepts. Demonstration of reactor fuel fabrication technology is required for cermet-fueled reactor concepts. The MMW reactor fuel development activity at Pacific Northwest Laboratory (PNL) is focused on producing a molybdenum-matrix uranium-nitride (UN) fueled cermet. This cermet is to have a high matrix density (greater than or equal to 95 percent) for high strength and high thermal conductance, coupled with a high particle (UN) porosity (approximately 25 percent) for retention of released fission gas at high burnup. Fabrication process development involves the use of porous TiN microspheres as surrogate fuel material until porous UN microspheres become available. Process development was conducted in the areas of microsphere synthesis, particle sealing/coating, and high-energy-rate forming (HERF) and vacuum hot press consolidation techniques. This paper summarizes the status of these activities.

  13. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  14. Gamma-ray, neutron, and hard X-ray studies and requirements for a high-energy solar physics facility

    NASA Technical Reports Server (NTRS)

    Ramaty, R.; Dennis, B. R.; Emslie, A. G.

    1988-01-01

    The requirements for future high-resolution spatial, spectral, and temporal observation of hard X-rays, gamma rays and neutrons from solar flares are discussed in the context of current high-energy flare observations. There is much promise from these observations for achieving a deep understanding of processes of energy release, particle acceleration and particle transport in a complicated environment such as the turbulent and highly magnetized atmosphere of the active sun.

  15. High performance thermal imaging for the 21st century

    NASA Astrophysics Data System (ADS)

    Clarke, David J.; Knowles, Peter

    2003-01-01

    In recent years, IR detector technology has advanced well beyond the early short linear arrays. Such devices require high-performance signal processing electronics to meet today's thermal imaging requirements for military and paramilitary applications. This paper describes BAE SYSTEMS Avionics Group's Sensor Integrated Modular Architecture thermal imager, which has been developed alongside the group's Eagle 640×512 arrays to provide high-performance imaging capability. The electronics architecture also supports High-Definition TV format 2D arrays for future growth capability.

  16. Influence of using challenging tasks in biology classrooms on students' cognitive knowledge structure: an empirical video study

    NASA Astrophysics Data System (ADS)

    Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.

    2016-08-01

    Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons on the topic 'blood and circulatory system'. Two fundamental characteristics were used to analyze tasks: (1) the required cognitive level of processing (e.g. low-level information processing: repetition, summarizing, defining, classifying; high-level information processing: interpreting and analyzing data, formulating hypotheses) and (2) the complexity of the task content (e.g. whether tasks require use of factual, linking, or concept-level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. These findings encourage the use of high-level cognitive processing tasks in biology instruction.

  17. Numerical simulation of high speed incremental forming of aluminum alloy

    NASA Astrophysics Data System (ADS)

    Giuseppina, Ambrogio; Teresa, Citrea; Luigino, Filice; Francesco, Gagliardi

    2013-12-01

    In this study, an innovative process is analyzed with the aim of satisfying industrial requirements such as process flexibility, differentiation and customization of products, cost reduction, minimization of execution time, and sustainable production. The attention is focused on the incremental forming process, nowadays used in fields such as rapid prototyping, the medical sector, the architectural industry, aerospace and marine applications, and the production of molds and dies. Incremental forming consists of deforming only a small region of the workpiece through a punch driven by an NC machine. Single-point incremental forming (SPIF) is the variant of the process considered here, in which the punch produces local deformation without dies and molds; consequently, the final product geometry can be changed by the control of an actuator without requiring a set of different tools. The drawback of this process is its slowness. The aim of this study is to assess the feasibility of incremental forming at high speeds. An experimental campaign was performed on a high-speed CNC lathe to test process feasibility and the influence of speed on material formability, mainly for aluminum alloys. The first results show that the material presents the same performance as in conventional-speed incremental forming and, in some cases, better behavior due to the temperature field. An accurate numerical simulation has been performed to investigate the material behavior during the high-speed process, substantially confirming the experimental evidence.

  18. HALE UAS Command and Control Communications: Step 1 - Functional Requirements Document. Version 4.0

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The High Altitude Long Endurance (HALE) unmanned aircraft system (UAS) communicates with an off-board pilot-in-command in all flight phases via the C2 data link, making it a critical component for the UA to fly in the NAS safely and routinely. This is a new requirement in current FAA communications planning and monitoring processes. This document provides a set of comprehensive C2 communications functional requirements and performance guidelines to help facilitate the future FAA certification process for civil UAS to operate in the NAS. The objective of the guidelines is to provide the ability to validate the functional requirements and in future be used to develop performance-level requirements.

  19. Natural Resource Information System. Volume 1: Overall description

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.

  20. Magnetorheological finishing: a perfect solution to nanofinishing requirements

    NASA Astrophysics Data System (ADS)

    Sidpara, Ajay

    2014-09-01

    Finishing of optics for different applications is the most important, as well as the most difficult, step in meeting optical specifications. Conventional grinding and other polishing processes are not able to reduce surface roughness beyond a certain limit due to the high forces acting on the workpiece, embedded abrasive particles, limited control over the process, etc. The magnetorheological finishing (MRF) process provides a new, efficient, and innovative way to finish optical materials, as well as many metals, to their desired level of accuracy. This paper provides an overview of the MRF process for different applications, the important process parameters, the requirements on the magnetorheological fluid with respect to workpiece material, and some areas that need to be explored to extend the application of the MRF process.

  1. Explosive bonding of metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Reece, O. Y.

    1969-01-01

    Explosive bonding process produces sheet composites of aluminum alloy reinforced by high-strength stainless steel wires. The bonds are excellent metallurgically, no external heat is required, various metals can be bonded, and the process is inexpensive.

  2. A study of the very high order natural user language (with AI capabilities) for the NASA space station common module

    NASA Technical Reports Server (NTRS)

    Gill, E. N.

    1986-01-01

    The requirements are identified for a very high order natural language to be used by crew members on board the Space Station. The hardware facilities, databases, realtime processes, and software support are discussed. The operations and capabilities that will be required in both normal (routine) and abnormal (nonroutine) situations are evaluated. A structure and syntax for an interface (front-end) language to satisfy the above requirements are recommended.

  3. Implementation of ionizing radiation environment requirements for Space Station

    NASA Technical Reports Server (NTRS)

    Boeder, Paul A.; Watts, John W.

    1993-01-01

    Proper functioning of Space Station hardware requires that the effects of high-energy ionizing particles from the natural environment and (possibly) from man-made sources be considered during design. At the Space Station orbit of 28.5-deg inclination and 330-440 km altitude, geomagnetically trapped protons and electrons contribute almost all of the dose, while galactic cosmic rays and anomalous cosmic rays may produce Single Event Upsets (SEUs), latchups, and burnouts of microelectronic devices. Implementing ionizing radiation environment requirements for Space Station has been a two part process, including the development of a description of the environment for imposing requirements on the design and the development of a control process for assessing how well the design addresses the effects of the ionizing radiation environment. We will review both the design requirements and the control process for addressing ionizing radiation effects on Space Station.

  4. Robotic tape library system level testing at NSA: Present and planned

    NASA Technical Reports Server (NTRS)

    Shields, Michael F.

    1994-01-01

    In an era of declining Defense budgets, increased pressure has been placed on the DOD to utilize Commercial Off the Shelf (COTS) solutions to incrementally solve a wide variety of its computer processing requirements. With the rapid growth in processing power, the significant expansion of high-performance networking, and the increased complexity of application data sets, the requirement for high-performance, large-capacity, reliable, secure, and above all affordable robotic tape storage libraries has greatly increased. Additionally, the migration to a heterogeneous, distributed computing environment has further complicated the problem. With today's open-system compute servers approaching yesterday's supercomputer capabilities, the need for affordable, reliable, secure Mass Storage Systems (MSS) has taken on ever-increasing importance to our processing center's ability to satisfy operational mission requirements. To that end, NSA has established an in-house capability to acquire, test, and evaluate COTS products. Its goal is to qualify a set of COTS MSS libraries, thereby achieving a modicum of standardization for robotic tape libraries that can satisfy our low-, medium-, and high-performance file and volume serving requirements. In addition, NSA has established relations with other Government Agencies to complement this in-house effort and to maximize our research, testing, and evaluation work. While the preponderance of the effort is focused at the high end of the storage ladder, considerable effort will be extended this year and next at the server-class or mid-range storage systems.

  5. Performance characterization of water recovery and water quality from chemical/organic waste products

    NASA Technical Reports Server (NTRS)

    Moses, W. M.; Rogers, T. D.; Chowdhury, H.; Cullingford, H. S.

    1989-01-01

    The water reclamation subsystems currently being evaluated for Space Station Freedom are briefly reviewed, with emphasis on a waste water management system capable of processing wastes containing high concentrations of organic/inorganic materials. The process combines low temperature and pressure to vaporize water with high-temperature catalytic oxidation to decompose volatile organics. The reclaimed water is of potable quality and has high potential for maintenance under sterile conditions. Results from preliminary experiments and the modifications in process and equipment required to ensure reliability and repeatability of system operation are presented.

  6. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    PubMed

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  7. Principles of gas phase processing of ceramics during combustion

    NASA Technical Reports Server (NTRS)

    Zachariah, Michael R.

    1993-01-01

    In recent years, ceramic materials have found applications in an increasingly wider range of industrial processes, where their unique mechanical, electrical and optical properties are exploited. Ceramics are especially useful for applications in high temperature, corrosive environments, which impose particularly stringent requirements on mechanical reliability. One approach to provide such materials is the manufacture of submicron (and more recently nanometer scale) particles, which may subsequently be sintered to produce a material with extremely high mechanical integrity. However, high quality ceramic materials can only be obtained if particles of known size, polydispersity, shape and chemical purity can be produced consistently, under well controlled conditions. These requirements are the fundamental driving force for the renewed interest in studying particle formation and growth of such materials.

  8. Evaluation of Mars CO2 Capture and Gas Separation Technologies

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony C.; Santiago-Maldonado, Edgardo; Gibson, Tracy; Devor, Robert; Captain, James

    2011-01-01

    Recent national policy statements have established that the ultimate destination of NASA's human exploration program is Mars. In Situ Resource Utilization (ISRU) is a key technology required to enable such missions, so it is appropriate to review progress in this area and continue to advance the systems required to produce rocket propellant, oxygen, and other consumables on Mars using the carbon dioxide atmosphere and other potential resources. The Mars Atmospheric Capture and Gas Separation project is selecting, developing, and demonstrating techniques to capture and purify Martian atmospheric gases for the production of hydrocarbons, oxygen, and water in ISRU systems. Trace gases must be separated from Martian atmospheric gases to provide pure CO2 to processing elements. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and should be captured as well. To achieve these goals, highly efficient gas separation processes will be required. These gas separation techniques are also required across various areas within the ISRU project to support various consumable production processes. The development of innovative gas separation techniques will evaluate the current state of the art for the gas separations required, with the objective of demonstrating and developing lightweight, low-power methods for gas separation. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2, CO) and hydrogen, typical of a Sabatier-type process; (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process; (3) carbon oxides from oxygen from a trash/waste processing reaction; and (4) helium from hydrogen or oxygen from a propellant scavenging process.
Potential technologies for the separations include freezers, selective membranes, selective solvents, polymeric sorbents, zeolites, and new technologies. This paper summarizes the results of an extensive literature review of candidate technologies for the capture and separation of CO2 and other relevant gases. This information will be used to prioritize the technologies to be developed further during this and other ISRU projects.

  9. Corrosion of Highly Specular Vapor Deposited Aluminum (VDA) on Earthshade Door Sandwich Structure

    NASA Technical Reports Server (NTRS)

    Plaskon, Daniel; Hsieh, Cheng

    2003-01-01

    High-resolution infrared (IR) imaging requires spacecraft instrument design that is tightly coupled with overall thermal control design. The JPL Tropospheric Emission Spectrometer (TES) instrument measures the 3-dimensional distribution of ozone and its precursors in the lower atmosphere on a global scale. The TES earthshade must protect the 180-K radiator and the 230-K radiator from the Earth IR and albedo. Requirements for specularity, emissivity, and solar absorptance of inner surfaces could only be met with vapor deposited aluminum (VDA). Circumstances leading to corrosion of the VDA are described. Innovative materials and processing to meet the optical and thermal cycle requirements were developed. Examples of scanning electron microscope (SEM), atomic force microscope (AFM), and other surface analysis techniques used in failure analysis, problem solving, and process development are given. Materials and process selection criteria and development test results are presented in a decision matrix. Examples of conditions promoting and preventing galvanic corrosion between VDA and graphite fiber-reinforced laminates are provided.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Donald F.; Schulz, Carl; Konijnenburg, Marco

    High-resolution Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry imaging enables the spatial mapping and identification of biomolecules from complex surfaces. The need for long time-domain transients, and thus large raw file sizes, results in a large amount of raw data (“big data”) that must be processed efficiently and rapidly. This can be compounded by large-area imaging and/or high-spatial-resolution imaging. For FT-ICR, data processing and data reduction must not compromise the high mass resolution afforded by the mass spectrometer. The continuous-mode “Mosaic Datacube” approach allows high-mass-resolution visualization (0.001 Da) of mass spectrometry imaging data, but requires additional processing as compared to feature-based processing. We describe the use of distributed computing for processing of FT-ICR MS imaging datasets with generation of continuous-mode Mosaic Datacubes for high-mass-resolution visualization. An eight-fold improvement in processing time is demonstrated using a Dutch nationally available cloud service.

  11. High-performance image processing on the desktop

    NASA Astrophysics Data System (ADS)

    Jordan, Stephen D.

    1996-04-01

    The suitability of computers to the task of medical image visualization for the purposes of primary diagnosis and treatment planning depends on three factors: speed, image quality, and price. To be widely accepted the technology must increase the efficiency of the diagnostic and planning processes. This requires processing and displaying medical images of various modalities in real-time, with accuracy and clarity, on an affordable system. Our approach to meeting this challenge began with market research to understand customer image processing needs. These needs were translated into system-level requirements, which in turn were used to determine which image processing functions should be implemented in hardware. The result is a computer architecture for 2D image processing that is both high-speed and cost-effective. The architectural solution is based on the high-performance PA-RISC workstation with an HCRX graphics accelerator. The image processing enhancements are incorporated into the image visualization accelerator (IVX) which attaches to the HCRX graphics subsystem. The IVX includes a custom VLSI chip which has a programmable convolver, a window/level mapper, and an interpolator supporting nearest-neighbor, bi-linear, and bi-cubic modes. This combination of features can be used to enable simultaneous convolution, pan, zoom, rotate, and window/level control into 1 k by 1 k by 16-bit medical images at 40 frames/second.
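    The window/level mapping the IVX accelerator implements in hardware is a simple transfer function, sketched below in NumPy. This is an illustrative software model only, not the IVX implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def window_level(img16, window, level):
    """Map a 16-bit medical image to the 8-bit display range using a
    window/level (window width / window center) transfer function.
    Values below the window clip to 0, values above clip to 255."""
    lo = level - window / 2.0
    hi = level + window / 2.0
    out = (img16.astype(np.float64) - lo) / (hi - lo)  # normalize window to [0, 1]
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)

# Example: a synthetic 16-bit gradient windowed around level 2048, width 1024
img = np.arange(0, 4096, dtype=np.uint16).reshape(64, 64)
disp = window_level(img, window=1024, level=2048)
print(disp.min(), disp.max())  # full display range: 0 255
```

In hardware this lookup runs per pixel alongside convolution and interpolation, which is what allows simultaneous filtering, zoom, and window/level control at 40 frames/second.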

  12. Conductor requirements for high-temperature superconducting utility power transformers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pleva, E. F.; Mehrotra, V.; Schwenterly, S. W.

    High-temperature superconducting (HTS) coated conductors in utility power transformers must satisfy a set of operating requirements driven by two major considerations: HTS transformers must be economically competitive with conventional units, and the conductor must be robust enough to be used in a commercial manufacturing environment. The transformer design and manufacturing process is described in order to highlight the various requirements it imposes on the HTS conductor. Spreadsheet estimates of HTS transformer costs allow estimates of the conductor cost required for an HTS transformer to be competitive with a similarly performing conventional unit.

  13. Practical, Real-Time, and Robust Watermarking on the Spatial Domain for High-Definition Video Contents

    NASA Astrophysics Data System (ADS)

    Kim, Kyung-Su; Lee, Hae-Yeoun; Im, Dong-Hyuck; Lee, Heung-Kyu

    Commercial markets employ digital rights management (DRM) systems to protect valuable high-definition (HD) quality videos. DRM systems use watermarking to provide copyright protection and ownership authentication of multimedia content. We propose a real-time video watermarking scheme for HD video in the uncompressed domain. In particular, our approach takes a practical perspective, satisfying perceptual quality, real-time processing, and robustness requirements. We simplify and optimize a human visual system mask for real-time performance and also apply a dithering technique for invisibility. Extensive experiments are performed to prove that the proposed scheme satisfies the invisibility, real-time processing, and robustness requirements against video processing attacks. We concentrate on video processing attacks that commonly occur when HD-quality videos are displayed on portable devices. These attacks include not only scaling and low bit-rate encoding, but also malicious attacks such as format conversion and frame-rate change.
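    The general idea of spatial-domain video watermarking can be sketched with a generic additive spread-spectrum scheme: a key-seeded pseudorandom pattern is added to pixel values, then recovered by correlation. This is a textbook illustration under assumed names, not the paper's scheme, which additionally applies an HVS mask and dithering for invisibility.

```python
import numpy as np

def embed(frame, key, alpha=3.0):
    """Embed a key-seeded +/-1 pseudorandom pattern, scaled by strength
    alpha, into a grayscale frame (spatial domain, additive)."""
    rng = np.random.default_rng(key)
    w = rng.choice([-1.0, 1.0], size=frame.shape)
    marked = np.clip(frame.astype(np.float64) + alpha * w, 0, 255)
    return marked.astype(np.uint8), w

def detect(frame, w):
    """Correlate the (possibly attacked) frame with the watermark pattern;
    a score well above the unwatermarked baseline indicates presence."""
    f = frame.astype(np.float64)
    return float(np.mean((f - f.mean()) * w))

rng = np.random.default_rng(0)
host = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
marked, w = embed(host, key=42)
print(detect(marked, w) > detect(host, w))  # embedding raises the score
```

Correlation detection of this kind degrades gracefully under the attacks the paper considers (scaling, re-encoding, frame-rate change), which is why robustness must be verified experimentally rather than assumed.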

  14. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  15. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space, and high-performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation covers work conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, into an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes.
Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
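    The NDVI and NDMI functions mentioned above are standard normalized-difference band ratios, easy to express as array math. The sketch below is a generic NumPy illustration (the band arrays and the epsilon guard are assumptions, not details from the project):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)  # guard against divide-by-zero

def ndmi(nir, swir):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    nir = nir.astype(np.float64)
    swir = swir.astype(np.float64)
    return (nir - swir) / np.maximum(nir + swir, 1e-9)

# Tiny 2x2 example bands (digital numbers, 0-255)
nir = np.array([[200, 180], [90, 60]], dtype=np.uint8)
red = np.array([[40, 50], [80, 70]], dtype=np.uint8)
print(np.round(ndvi(nir, red), 2))  # values in [-1, 1]; vegetation pixels near +0.7
```

Because these are per-pixel operations with no cross-pixel dependencies, they tile trivially, which is what makes them good candidates for bulk processing on cloud infrastructure.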

  16. Applications of massively parallel computers in telemetry processing

    NASA Technical Reports Server (NTRS)

    El-Ghazawi, Tarek A.; Pritchard, Jim; Knoble, Gordon

    1994-01-01

    Telemetry processing refers to the reconstruction of full-resolution raw instrumentation data with the artifacts of space and ground recording and transmission removed. Being the first processing phase of satellite data, this process is also referred to as level-zero processing. This study is aimed at investigating the use of massively parallel computing technology in providing level-zero processing for spaceflights that adhere to the recommendations of the Consultative Committee on Space Data Systems (CCSDS). The workload characteristics of level-zero processing are used to identify processing requirements in high-performance computing systems. An example of level-zero functions on a SIMD MPP, such as the MasPar, is discussed. The requirements in this paper are based in part on the Earth Observing System (EOS) Data and Operations System (EDOS).

  17. Pragmatic Comprehension of High and Low Level Language Learners

    ERIC Educational Resources Information Center

    Garcia, Paula

    2004-01-01

    This study compares the performances of 16 advanced and 19 beginning English language learners on a listening comprehension task that focused on linguistic and pragmatic processing. Processing pragmatic meaning differs from processing linguistic meaning because pragmatic meaning requires the listener to understand not only linguistic information,…

  18. r-process nucleosynthesis in the high-entropy supernova bubble

    NASA Technical Reports Server (NTRS)

    Meyer, B. S.; Mathews, G. J.; Howard, W. M.; Woosley, S. E.; Hoffman, R. D.

    1992-01-01

    We show that the high-temperature, high-entropy evacuated region outside the nascent neutron star in a core-collapse supernova may be an ideal r-process site. In this high-entropy environment it is possible that most nucleons are in the form of free neutrons or bound into alpha particles. Thus, there can be many neutrons per seed nucleus even though the material is not particularly neutron rich. The predicted amount of r-process material ejected per event from this environment agrees well with that required by simple galactic evolution arguments. When averaged over regions of different neutron excess in the supernova ejecta, the calculated r-process abundance curve can give a good representation of the solar-system r-process abundances as long as the entropy per baryon is sufficiently high. Neutrino irradiation may aid in smoothing the final abundance distribution.

  19. Mapping of MPEG-4 decoding on a flexible architecture platform

    NASA Astrophysics Data System (ADS)

    van der Tol, Erik B.; Jaspers, Egbert G.

    2001-12-01

    In the field of consumer electronics, the advent of new features such as the Internet, games, video conferencing, and mobile communication has triggered the convergence of television and computer technologies. This requires a generic media-processing platform that enables simultaneous execution of very diverse tasks, such as high-throughput stream-oriented data processing and highly data-dependent irregular processing with complex control flows. As a representative application, this paper presents the mapping of an MPEG-4 Main Visual profile decoder for High-Definition (HD) video onto a flexible architecture platform. A stepwise approach is taken, going from the decoder application toward an implementation proposal. First, the application is decomposed into separate tasks with self-contained functionality, clear interfaces, and distinct characteristics. Next, a hardware-software partitioning is derived by analyzing the characteristics of each task, such as the amount of inherent parallelism, the throughput requirements, the complexity of control processing, and the reuse potential over different applications and different systems. Finally, a feasible implementation is proposed that includes, among others, a very-long-instruction-word (VLIW) media processor, one or more RISC processors, and some dedicated processors. The mapping study of the MPEG-4 decoder proves the flexibility and extensibility of the media-processing platform. This platform enables effective HW/SW co-design, yielding a high performance density.

  20. Explosive Welding in the 1990's

    NASA Technical Reports Server (NTRS)

    Lalwaney, N. S.; Linse, V. D.

    1985-01-01

    Explosive bonding is a unique joining process with the potential to produce composite materials capable of fulfilling many of the high-performance materials needs of the 1990's. The process has the technological versatility to provide a true, high-quality metallurgical bond between both metallurgically compatible and incompatible systems. Metals routinely explosively bonded include a wide variety of combinations of reactive and refractory metals, low- and high-density metals and their alloys, corrosion-resistant and high-strength alloys, and common steels. The major advantage of the process is its ability to custom design and engineer composites with physical and/or mechanical properties that meet a specific or unusual performance requirement. Explosive bonding offers the designer unique opportunities in materials selection, with unique combinations of properties and high-integrity bonds that cannot be achieved by any other metal joining process. The process and some applications are discussed.

  1. Exascale computing and big data

    DOE PAGES

    Reed, Daniel A.; Dongarra, Jack

    2015-06-25

    Scientific discovery and engineering innovation requires unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.

  2. Exascale computing and big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, Daniel A.; Dongarra, Jack

    Scientific discovery and engineering innovation requires unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.

  3. Standard High Solids Vessel Design De-inventory Simulant Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiskum, Sandra K.; Burns, Carolyn A.M.; Gauglitz, Phillip A.

    The Hanford Tank Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant be developed that would represent the de-inventory (residual high-density tank solids cleanout) process. Its basis and target characteristics are defined in 24590-WTP-ES-ENG-16-021 and implemented through PNNL Test Plan TP-WTPSP-132 Rev. 1.0. This document describes the de-inventory Newtonian carrier fluid (DNCF) simulant composition that will satisfy the basis requirement to mimic the density (1.18 g/mL ± 0.1 g/mL) and viscosity (2.8 cP ± 0.5 cP) of 5 M NaOH at 25 °C. The simulant viscosity changes significantly with temperature. Therefore, various solution compositions may be required, dependent on the test stand process temperature range, to meet these requirements. Table ES.1 provides DNCF compositions at selected temperatures that will meet the density and viscosity specifications, as well as the temperature range at which the solution will meet the acceptable viscosity tolerance.
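
    The density and viscosity targets quoted above can be expressed as a simple acceptance check. The sketch below is an illustration, not part of the test plan; it flags whether a hypothetical candidate batch falls inside both tolerance bands.

```python
# Acceptance check against the targets quoted in the abstract:
# density 1.18 +/- 0.1 g/mL, viscosity 2.8 +/- 0.5 cP (mimicking
# 5 M NaOH at 25 C). Batch values below are invented examples.

DENSITY_TARGET, DENSITY_TOL = 1.18, 0.1      # g/mL
VISCOSITY_TARGET, VISCOSITY_TOL = 2.8, 0.5   # cP

def meets_spec(density_g_ml, viscosity_cp):
    """Return True if the batch is within both tolerance bands."""
    return (abs(density_g_ml - DENSITY_TARGET) <= DENSITY_TOL
            and abs(viscosity_cp - VISCOSITY_TARGET) <= VISCOSITY_TOL)

print(meets_spec(1.20, 2.6))   # in-spec batch
print(meets_spec(1.20, 3.5))   # viscosity out of tolerance
```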

  4. Enhancement of Lipid Extraction from Marine Microalga, Scenedesmus Associated with High-Pressure Homogenization Process

    PubMed Central

    Cho, Seok-Cheol; Choi, Woon-Yong; Oh, Sung-Ho; Lee, Choon-Geun; Seo, Yong-Chang; Kim, Ji-Seon; Song, Chi-Ho; Kim, Ga-Vin; Lee, Shin-Young; Kang, Do-Hyung; Lee, Hyeon-Yong

    2012-01-01

    The marine microalga Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with a high-pressure homogenization pretreatment process at 1200 psi and 35°C. Algal lipid yield was about 24.9% through this process, whereas only 19.8% lipid could be obtained by following a conventional lipid extraction procedure using the solvent chloroform : methanol (2 : 1, v/v). The present approach requires a 30 min process time and a moderate working temperature of 35°C, compared to the conventional extraction method, which usually requires >5 hrs and a 65°C temperature. It was found that this combined extraction process followed second-order reaction kinetics, meaning most of the cellular lipids were extracted during the initial period of extraction, mostly within 30 min. In contrast, during the conventional extraction process, the cellular lipids were slowly and continuously extracted for >5 hrs, following first-order kinetics. Confocal and scanning electron microscopy revealed the altered texture of algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily disrupt the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good quality of the lipids for biodiesel production. PMID:22969270
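
    The contrast between the two kinetic regimes described above can be sketched numerically. The model below uses the standard first-order and second-order extraction forms; the rate constants are illustrative (not fitted values from the study), and the final yields are set to the 24.9% and 19.8% figures quoted in the abstract.

```python
# Sketch of the two kinetic forms: second-order extraction (fast initial
# release, as in the homogenization-assisted process) versus first-order
# extraction (slow continuous release, as in the plain Folch method).
# Rate constants k are illustrative only.
import math

def first_order(t, y_max, k):
    """Lipid yield (%) under first-order kinetics: y = y_max*(1 - exp(-k*t))."""
    return y_max * (1.0 - math.exp(-k * t))

def second_order(t, y_max, k):
    """Lipid yield (%) under second-order kinetics:
    y = y_max^2*k*t / (1 + y_max*k*t)."""
    return (y_max**2 * k * t) / (1.0 + y_max * k * t)

# Final yields 24.9% and 19.8% taken from the abstract; k values invented.
for t in (5, 30, 120, 300):  # minutes
    print(f"t={t:3d} min  second-order: {second_order(t, 24.9, 0.05):5.1f}%"
          f"  first-order: {first_order(t, 19.8, 0.005):5.1f}%")
```

With these parameters the second-order curve is already near its plateau at 30 min, while the first-order curve is still far from it, mirroring the behavior reported above.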

  5. A Fault-Tolerant Radiation-Robust Mass Storage Concept for Highly Scaled Flash Memory

    NASA Astrophysics Data System (ADS)

    Fuchs, Cristian M.; Trinitis, Carsten; Appel, Nicolas; Langer, Martin

    2015-09-01

    Future space missions will require vast amounts of data to be stored and processed aboard spacecraft. While satisfying operational mission requirements, storage systems must guarantee data integrity and recover damaged data throughout the mission. NAND-flash memories have become popular for space-borne high performance mass memory scenarios, though future storage concepts will rely upon highly scaled flash or other memory technologies. With modern flash memory, single-bit erasure coding and RAID-based concepts are insufficient. Thus, a fully run-time configurable, high performance, dependable storage concept is presented that requires only a minimal set of logic or software. The solution is based on composite erasure coding and can be adjusted for altered mission duration or changing environmental conditions.

  6. Theoretical study of thermodynamic properties and reaction rates of importance in the high-speed research program

    NASA Technical Reports Server (NTRS)

    Langhoff, Stephen; Bauschlicher, Charles; Jaffe, Richard

    1992-01-01

    One of the primary goals of NASA's high-speed research program is to determine the feasibility of designing an environmentally safe commercial supersonic transport airplane. The largest environmental concern is focused on the amount of ozone-destroying nitrogen oxides (NO(x)) that would be injected into the lower stratosphere during the cruise portion of the flight. The limitations placed on NO(x) emission require more than an order of magnitude reduction over current engine designs. Developing strategies to meet this goal requires first gaining a fundamental understanding of the combustion chemistry. Accurately modeling the combustor requires a computational fluid dynamics approach that includes both turbulence and chemistry. Since many of the important chemical processes in this regime involve highly reactive radicals, an experimental determination of the required thermodynamic data and rate constants is often very difficult. Unlike experimental approaches, theoretical methods are as applicable to highly reactive species as to stable ones. Also, our approximation of treating the dynamics classically becomes more accurate with increasing temperature. In this article we review recent progress in generating the thermodynamic properties and rate constants that are required to understand NO(x) formation in the combustion process. We also describe our one-dimensional modeling efforts to validate an NH3 combustion reaction mechanism. We have been working in collaboration with researchers at LeRC to ensure that our theoretical work is focused on the most important thermodynamic quantities and rate constants required in the chemical database.
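
    A worked example of the kind of rate-constant expression at stake here is the Arrhenius form k(T) = A·exp(-Ea/RT), which underlies most combustion rate databases. The pre-exponential factor and activation energy below are illustrative, not values from the article.

```python
# Arrhenius rate constant k(T) = A * exp(-Ea / (R*T)), with illustrative
# A and Ea. Shows the steep rise of k with temperature in the combustor
# regime relevant to NOx formation.
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius(A, Ea, T):
    """Rate constant at temperature T (K), given pre-exponential factor A
    and activation energy Ea (J/mol)."""
    return A * math.exp(-Ea / (R * T))

for T in (1000, 1500, 2000):  # K
    print(f"T={T} K  k={arrhenius(1e13, 3.0e5, T):.3e}")
```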

  7. Flow Asymmetric Propargylation: Development of Continuous Processes for the Preparation of a Chiral β-Amino Alcohol.

    PubMed

    Li, Hui; Sheeran, Jillian W; Clausen, Andrew M; Fang, Yuan-Qing; Bio, Matthew M; Bader, Scott

    2017-08-01

    The development of a flow chemistry process for asymmetric propargylation using allene gas as a reagent is reported. The connected continuous process of allene dissolution, lithiation, Li-Zn transmetallation, and asymmetric propargylation provides homopropargyl β-amino alcohol 1 with high regio- and diastereoselectivity in high yield. This flow process enables practical use of an unstable allenyllithium intermediate. The process uses the commercially available and recyclable (1S,2R)-N-pyrrolidinyl norephedrine as a ligand to promote the highly diastereoselective (32:1) propargylation. Judicious selection of mixers based on the chemistry requirement and real-time monitoring of the process using process analytical technology (PAT) enabled stable and scalable flow chemistry runs. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Process for making boron nitride using sodium cyanide and boron

    DOEpatents

    Bamberger, Carlos E.

    1990-02-06

    This is a very simple process for making boron nitride by mixing sodium cyanide and boron phosphate and heating the mixture in an inert atmosphere until a reaction takes place. The product is a white powder of boron nitride that can be used in applications that require compounds that are stable at high temperatures and that exhibit high electrical resistance.

  9. Process for making boron nitride using sodium cyanide and boron

    DOEpatents

    Bamberger, Carlos E.

    1990-01-01

    This is a very simple process for making boron nitride by mixing sodium cyanide and boron phosphate and heating the mixture in an inert atmosphere until a reaction takes place. The product is a white powder of boron nitride that can be used in applications that require compounds that are stable at high temperatures and that exhibit high electrical resistance.

  10. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
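
    The grid-style parallelism that Image Harvest exploits, with each image processed independently by a separate worker, can be sketched with a plain process pool. The per-image analysis below is a stand-in, not IH's actual API.

```python
# Sketch of embarrassingly parallel per-image processing: a worker pool
# maps an analysis function over a list of image files. The "trait" here
# is a placeholder value, not a real phenotypic measurement.
from multiprocessing import Pool

def extract_traits(image_name):
    """Placeholder per-image analysis; returns (name, trait value)."""
    return image_name, len(image_name) * 10  # stand-in for a measured trait

if __name__ == "__main__":
    images = [f"plot_{i:03d}.png" for i in range(8)]   # hypothetical files
    with Pool(4) as pool:
        results = pool.map(extract_traits, images)
    for name, trait in results:
        print(name, trait)
```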

  11. The Legacy of Space Shuttle Flight Software

    NASA Technical Reports Server (NTRS)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.
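
    The statistical control of software processes argued for here can be illustrated with a c-chart over per-release defect counts: a release whose count falls outside the 3-sigma control limits signals a process anomaly. The defect counts below are invented for illustration and are not PASS data.

```python
# c-chart sketch for defect counts (Poisson-distributed attribute data):
# center line is the mean count, control limits are +/- 3*sqrt(mean).
# Counts below are fabricated; release 8 is deliberately anomalous.
import math

def c_chart_limits(counts):
    """Center line and 3-sigma limits for a c-chart."""
    c_bar = sum(counts) / len(counts)
    sigma = math.sqrt(c_bar)
    return c_bar, max(0.0, c_bar - 3 * sigma), c_bar + 3 * sigma

defects = [4, 6, 5, 3, 7, 5, 4, 18, 5, 4]
center, lcl, ucl = c_chart_limits(defects)
for i, c in enumerate(defects):
    flag = "OUT OF CONTROL" if not (lcl <= c <= ucl) else "ok"
    print(f"release {i}: {c:2d} defects  {flag}")
```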

  12. Facilitating NASA's Use of GEIA-STD-0005-1, Performance Standard for Aerospace and High Performance Electronic Systems Containing Lead-Free Solder

    NASA Technical Reports Server (NTRS)

    Plante, Jeannete

    2010-01-01

    GEIA-STD-0005-1 defines the objectives of, and requirements for, documenting processes that assure customers and regulatory agencies that AHP electronic systems containing lead-free solder, piece parts, and boards will satisfy the applicable requirements for performance, reliability, airworthiness, safety, and certifiability throughout the specified life of performance. It communicates requirements for a Lead-Free Control Plan (LFCP) to assist suppliers in the development of their own Plans. The Plan documents the Plan Owner's (supplier's) processes that assure their customer and all other stakeholders that the Plan Owner's products will continue to meet their requirements. The presentation reviews quality assurance requirements traceability and LFCP template instructions.

  13. Industrial applications of high-average power high-peak power nanosecond pulse duration Nd:YAG lasers

    NASA Astrophysics Data System (ADS)

    Harrison, Paul M.; Ellwi, Samir

    2009-02-01

    Within the vast range of laser materials processing applications, every type of successful commercial laser has been driven by a major industrial process. For high average power, high peak power, nanosecond pulse duration Nd:YAG DPSS lasers, the enabling process is high speed surface engineering. This includes applications such as thin film patterning and selective coating removal in markets such as the flat panel displays (FPD), solar and automotive industries. Applications such as these tend to require working spots that have uniform intensity distribution using specific shapes and dimensions, so a range of innovative beam delivery systems have been developed that convert the gaussian beam shape produced by the laser into a range of rectangular and/or shaped spots, as required by demands of each project. In this paper the authors will discuss the key parameters of this type of laser and examine why they are important for high speed surface engineering projects, and how they affect the underlying laser-material interaction and the removal mechanism. Several case studies will be considered in the FPD and solar markets, exploring the close link between the application, the key laser characteristics and the beam delivery system that link these together.

  14. CZT sensors for Computed Tomography: from crystal growth to image quality

    NASA Astrophysics Data System (ADS)

    Iniewski, K.

    2016-12-01

    Recent advances in Traveling Heater Method (THM) growth and in device fabrication requiring additional processing steps have dramatically improved hole transport properties and reduced polarization effects in Cadmium Zinc Telluride (CZT) material. As a result, high-flux operation of CZT sensors at rates in excess of 200 Mcps/mm2 is now possible and has enabled multiple medical imaging companies to start building prototype Computed Tomography (CT) scanners. CZT sensors are also finding new commercial applications in non-destructive testing (NDT) and baggage scanning. In order to prepare for high volume commercial production, we are moving from individual tile processing to whole-wafer processing using silicon methodologies, such as waxless processing and cassette-based/touchless wafer handling. We have been developing parametric-level screening at the wafer stage to ensure high wafer quality before detector fabrication, in order to maximize production yields. These process improvements enable us, and other CZT manufacturers who pursue similar developments, to provide high volume production for photon counting applications in an economically feasible manner. CZT sensors are capable of delivering both high count rates and high-resolution spectroscopic performance, although it is challenging to achieve both of these attributes simultaneously. The paper discusses material challenges, detector design trade-offs, and the ASIC architectures required to build cost-effective CZT-based detection systems. Photon counting ASICs are an essential part of the integrated module platforms, as the charge-sensitive electronics needs to deal with charge-sharing and pile-up effects.

  15. Overview of Characterization Techniques for High Speed Crystal Growth

    NASA Technical Reports Server (NTRS)

    Ravi, K. V.

    1984-01-01

    Features of characterization requirements for crystals, devices and completed products are discussed. Key parameters of interest in semiconductor processing are presented. Characterization as it applies to process control, diagnostics and research needs is discussed with appropriate examples.

  16. Digital imaging technology assessment: Digital document storage project

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An ongoing technical assessment and requirements definition project is examining the potential role of digital imaging technology at NASA's STI facility. The focus is on the basic components of imaging technology in today's marketplace as well as the components anticipated in the near future. Presented is a requirement specification for a prototype project, an initial examination of current image processing at the STI facility, and an initial summary of image processing projects at other sites. Operational imaging systems incorporate scanners, optical storage, high resolution monitors, processing nodes, magnetic storage, jukeboxes, specialized boards, optical character recognition gear, pixel addressable printers, communications, and complex software processes.

  17. Development of practical high temperature superconducting wire for electric power application

    NASA Technical Reports Server (NTRS)

    Hawsey, Robert A.; Sokolowski, Robert S.; Haldar, Pradeep; Motowidlo, Leszek R.

    1995-01-01

    The technology of high temperature superconductivity has moved beyond mere scientific curiosity into the manufacturing environment. Single lengths of multifilamentary wire are now produced that are over 200 meters long and that carry over 13 amperes at 77 K. Short-sample critical current densities approach 5 x 10^4 A/sq cm at 77 K. Conductor requirements such as high critical current density in a magnetic field, strain-tolerant sheathing materials, and other engineering properties are addressed. A new process for fabricating round BSCCO-2212 wire has produced wires with critical current densities as high as 165,000 A/sq cm at 4.2 K and 53,000 A/sq cm at 40 K. This process eliminates the costly, multiple pressing and rolling steps that are commonly used to develop texture in the wires. New multifilamentary wires with strengthened sheathing materials have shown yield strengths improved by up to a factor of five over those made with pure silver. Many electric power devices require the wire to be formed into coils for production of strong magnetic fields. Requirements for coils and magnets for electric power applications are described.

  18. Mapping High Dimensional Sparse Customer Requirements into Product Configurations

    NASA Astrophysics Data System (ADS)

    Jiao, Yao; Yang, Yu; Zhang, Hongshan

    2017-10-01

    Mapping customer requirements into product configurations is a crucial step in product design, yet customers express their needs ambiguously and locally due to a lack of domain knowledge. The data mining process for customer requirements can thus yield fragmentary information with high dimensional sparsity, exposing the mapping procedure to uncertainty and complexity. Expert Judgment is widely applied against that background, since it imposes no formal requirements for systematic or structured data. However, there are concerns about the repeatability of, and bias in, Expert Judgment. In this study, an integrated method combining an adjusted Locally Linear Embedding (LLE) with a Naïve Bayes (NB) classifier is proposed to map high dimensional sparse customer requirements to product configurations. The integrated method adjusts classical LLE to preprocess the high dimensional sparse dataset so that it satisfies the prerequisites of NB for classifying different customer requirements into corresponding product configurations. Compared with Expert Judgment, the adjusted LLE with NB performs much better in a real-world Tablet PC design case, both in accuracy and robustness.
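
    A minimal sketch of the proposed pipeline can be built from stock components: reduce the sparse requirement vectors to a low-dimensional embedding with LLE, then classify them with Naive Bayes. This uses scikit-learn's standard LLE rather than the paper's adjusted variant, and the data are synthetic, so it illustrates the pipeline shape only.

```python
# LLE + Naive Bayes pipeline sketch on synthetic sparse data.
# Assumes scikit-learn; not the paper's adjusted LLE.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
# 200 "customers", 50 sparse requirement features (~80% zeros),
# 3 product-configuration classes -- all invented for illustration.
X = rng.random((200, 50)) * (rng.random((200, 50)) < 0.2)
y = rng.integers(0, 3, size=200)

# Step 1: embed the high dimensional sparse vectors in 5 dimensions.
emb = LocallyLinearEmbedding(n_components=5, n_neighbors=10)
X_low = emb.fit_transform(X)

# Step 2: classify the embedded requirements into configurations.
clf = GaussianNB().fit(X_low, y)
print("predicted configs:", clf.predict(X_low[:5]))
```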

  19. Processing of high performance (LRE)-Ba-Cu-O large, single-grain bulk superconductors in air

    NASA Astrophysics Data System (ADS)

    Hari Babu, N.; Iida, K.; Shi, Y.; Cardwell, D. A.

    2006-10-01

    We report the fabrication of large (LRE)BCO single grains with improved superconducting properties for LRE = Nd, Sm and Gd using a practical process via both conventional top seeded melt growth (TSMG) and seeded infiltration-growth (SIG). This process uses a new generic seed crystal that promotes heterogeneous grain nucleation in the required orientation and suppresses the formation of solid solution in a controlled manner within individual grains by the addition of excess BaO2 to the precursor powder. The spatial distributions of the superconducting properties of LRE bulk superconductors as a function of BaO2 addition for large (LRE)BCO grains fabricated in air by TSMG and SIG for LRE = Gd, Sm and Nd are compared. The optimum BaO2 content required to fabricate single-grain (LRE)BCO with high and homogeneous Tc is determined from these experiments for each LRE system. The irreversibility fields of (LRE)BCO bulk superconductors processed in air are as high as those processed in reduced PO2. Critical current densities in excess of 10^5 A/cm2 at 77 K and higher trapped fields have been achieved in optimized (LRE)BCO superconductors fabricated in air for the first time.

  20. Review of Manganese Processing for Production of TRIP/TWIP Steels, Part 1: Current Practice and Processing Fundamentals

    NASA Astrophysics Data System (ADS)

    Elliott, R.; Coley, K.; Mostaghel, S.; Barati, M.

    2018-02-01

    The increasing demand for high-performance steel alloys has led to development of transformation-induced plasticity (TRIP) and twinning-induced plasticity (TWIP) alloys over the past three decades. These alloys offer exceptional combinations of high tensile strength and ductility. Thus, the mechanical behavior of these alloys has been a subject of significant work in recent years. However, the challenge of economically providing Mn in the quantity and purity required by these alloys has received considerably less attention. To enable commercial implementation of ultrahigh-Mn alloys, it is desirable to lower the high material costs associated with their production. Therefore, the present work reviews Mn processing routes in the context of the chemical requirements of these alloys. The aim of this review is to assess the current state of the art regarding reduction of manganese ores and provide a comprehensive reference for researchers working to mitigate material processing costs associated with Mn production. The review is presented in two parts: Part 1 introduces TRIP and TWIP alloys, current industrial practice, and pertinent thermodynamic fundamentals; Part 2 addresses available literature regarding reduction of Mn ores and oxides, and seeks to identify opportunities for future process development.

  1. NASA Case Sensitive Review and Audit Approach

    NASA Astrophysics Data System (ADS)

    Lee, Arthur R.; Bacus, Thomas H.; Bowersox, Alexandra M.; Newman, J. Steven

    2005-12-01

    As an Agency involved in high-risk endeavors NASA continually reassesses its commitment to engineering excellence and compliance to requirements. As a component of NASA's continual process improvement, the Office of Safety and Mission Assurance (OSMA) established the Review and Assessment Division (RAD) [1] to conduct independent audits to verify compliance with Agency requirements that impact safe and reliable operations. In implementing its responsibilities, RAD benchmarked various approaches for conducting audits, focusing on organizations that, like NASA, operate in high-risk environments - where seemingly inconsequential departures from safety, reliability, and quality requirements can have catastrophic impact to the public, NASA personnel, high-value equipment, and the environment. The approach used by the U.S. Navy Submarine Program [2] was considered the most fruitful framework for the invigorated OSMA audit processes. Additionally, the results of benchmarking activity revealed that not all audits are conducted using just one approach or even with the same objectives. This led to the concept of discrete, unique "audit cases."

  2. High resolution weather data for urban hydrological modelling and impact assessment, ICT requirements and future challenges

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-claire; van Riemsdijk, Birna

    2013-04-01

    Hydrological analysis of urban catchments requires high resolution rainfall and catchment information because of the small size of these catchments, the high spatial variability of the urban fabric, fast runoff processes, and the related short response times. Rainfall information available from traditional radar and rain gauge networks does not meet the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high resolution weather data augments requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions.
This presentation will highlight ICT-related requirements and limitations in high resolution urban hydrological modelling and analysis. Further ICT challenges arise in the provision of high resolution radar data for diverging information needs, as well as in combination with other data sources in the urban environment. Different types of information are required for such diverse activities as operational flood protection, traffic management, large event organisation, business planning in shopping districts and restaurants, and the timing of family activities. These different information needs may require different configurations and data processing for radars and other data sources. An ICT challenge is to develop techniques for deciding how to automatically respond to these diverging information needs (e.g., through (semi-)automated negotiation). Diverse activities also provide a wide variety of information resources that can supplement traditional networks of weather sensors, such as rain sensors on cars and social media. Another ICT challenge is how to combine data from these different sources to answer a particular information need. Examples will be presented of solutions that are currently being explored.

  3. Switch contact device for interrupting high current, high voltage, AC and DC circuits

    DOEpatents

    Via, Lester C.; Witherspoon, F. Douglas; Ryan, John M.

    2005-01-04

    A high voltage switch contact structure capable of interrupting high voltage, high current AC and DC circuits. The contact structure confines the arc created when the contacts open to the thin area between two insulating surfaces in intimate contact. This forces the arc into the shape of a thin sheet, which loses heat energy far more rapidly than an arc column with a circular cross-section. These high heat losses dramatically increase the voltage required to maintain the arc, extinguishing it when the required voltage exceeds the available voltage. The arc extinguishing process with this invention is not dependent on the occurrence of a current zero crossing and, consequently, is capable of rapidly interrupting both AC and DC circuits. The contact structure achieves its high performance without the use of sulfur hexafluoride.

  4. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  5. TruMicro Series 2000 sub-400 fs class industrial fiber lasers: adjustment of laser parameters to process requirements

    NASA Astrophysics Data System (ADS)

    Kanal, Florian; Kahmann, Max; Tan, Chuong; Diekamp, Holger; Jansen, Florian; Scelle, Raphael; Budnicki, Aleksander; Sutter, Dirk

    2017-02-01

    The matchless properties of ultrashort laser pulses, such as cold processing and non-linear absorption, pave the way to numerous novel applications. Ultrafast lasers arrived in the last decade at a level of reliability suitable for the industrial environment.1 Within the next few years, many industrial manufacturing processes in several markets will be replaced by laser-based processes because of their well-known benefits: non-contact, wear-free processing; higher process accuracy; increased processing speed; and often improved economic efficiency compared to conventional processes. Furthermore, new processes will arise with novel sources, addressing previously unsolved challenges. One technical requirement for these new applications will be to optimize the large number of available parameters to the requirements of each application. In this work we present an ultrafast laser system distinguished by its capability to combine high flexibility and real-time, process-inherent adjustment of parameters with industry-ready reliability. This reliability is ensured by long experience in designing and building ultrashort-pulse lasers, combined with rigorous optimization of the mechanical construction, the optical components, and the entire laser head for continuous performance. By introducing a new generation of mechanical design in the last few years, TRUMPF enabled its ultrashort-laser platforms to fulfill the very demanding requirements for passively coupling high-energy single-mode radiation into a hollow-core transport fiber. The laser architecture presented here is based on all-fiber MOPA (master oscillator power amplifier) CPA (chirped pulse amplification) technology. The pulses are generated in a high-repetition-rate mode-locked fiber oscillator that also enables flexible pulse bursts (groups of multiple pulses) with 20 ns intra-burst pulse separation. 
An external acousto-optic modulator (XAOM) enables linearization and multi-level quad-loop stabilization of the laser's output power.2 In addition to this well-established platform, the latest developments addressed single-pulse energies up to 50 μJ and made femtosecond pulse durations available for the TruMicro Series 2000. Beyond these stabilization aspects, this laser architecture, together with other optical modules and smart laser control software, enables process-driven adjustment of parameters (e.g., repetition rate, multi-pulse functionality, pulse energy, pulse duration) by external signals, which will be presented in this work.

  6. Parallel Processing Systems for Passive Ranging During Helicopter Flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Bavavar; Suorsa, Raymond E.; Showman, Robert D. (Technical Monitor)

    1994-01-01

    The complexity of rotorcraft missions involving operations close to the ground results in high pilot workload. To allow the pilot time to perform mission-oriented tasks, sensor aiding and automation of some of the guidance and control functions are highly desirable. Images from an electro-optical sensor provide a covert way of detecting objects in the flight path of a low-flying helicopter. Passive ranging consists of processing a sequence of images using techniques based on optical flow computation and recursive estimation. The passive ranging algorithm has to extract obstacle information from imagery at rates varying from five to thirty or more frames per second, depending on the helicopter speed. We have implemented and tested the passive ranging algorithm off-line using helicopter-collected images. However, the real-time data and computation requirements of the algorithm are beyond the capability of any off-the-shelf microprocessor or digital signal processor. This paper describes the computational requirements of the algorithm and uses parallel processing technology to meet them. Various issues in the selection of a parallel processing architecture are discussed, and four different computer architectures are evaluated for their suitability to process the algorithm in real time. Based on this evaluation, we conclude that real-time passive ranging is a realistic goal and can be achieved within a short time.
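    The real-time requirement described above can be made concrete with a rough throughput estimate. The following is only a sketch: the 5-30 fps range comes from the abstract, while the image size and operations-per-pixel figure are illustrative assumptions, not values from the paper.

```python
# Rough compute budget for real-time passive ranging.
# Frame rates (5-30 fps) are from the abstract; the 512x512 image size and
# ~200 ops/pixel (optical flow + recursive estimation) are assumptions.
def ops_per_second(width, height, ops_per_pixel, fps):
    """Total operations per second needed to process every pixel at a given rate."""
    return width * height * ops_per_pixel * fps

for fps in (5, 30):
    gops = ops_per_second(512, 512, 200, fps) / 1e9
    print(f"{fps} fps -> {gops:.2f} GOPS")
```

    Even under these modest assumptions the budget reaches the GOPS range at 30 fps, well beyond a single 1990s-era microprocessor, which is why the paper turns to parallel architectures.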

  7. Nursing, Pharmacy, and Prescriber Knowledge and Perceptions of High-Alert Medications in a Large, Academic Medical Hospital

    PubMed Central

    Engels, Melanie J.

    2015-01-01

    Background: High-alert medications pose a greater risk of causing significant harm to patients if used in error. The Joint Commission requires that hospitals define institution-specific high-alert medications and implement processes to ensure safe medication use. Method: Nursing staff, pharmacy staff, and prescribers were asked to voluntarily complete a 34-question survey to assess their knowledge, experience, and perceptions regarding high-alert medications in an academic hospital. Results: The majority of respondents identified the organization’s high-alert medications, the consequences of an error involving a high-alert medication, and the reversal agent. Most of the risk-reduction strategies within the institution were viewed as effective by respondents. Forty-five percent of respondents had utilized a high-alert medication in the previous 24 hours. Only 14.2% had experienced an error with a high-alert medication in the previous 12 months, with 46% of these being near misses. The survey found that the 5 rights of medication administration were not being applied consistently. Respondents indicated that work experience or hospital orientation is the preferred learning experience for high-alert medications. Conclusions: This study assessed all disciplines involved in the medication use process. Perceptions about high-alert medications differ between disciplines. Ongoing discipline-specific education is required to ensure that individuals accept accountability in the medication use process and to close knowledge gaps on high-alert medications and risk-reduction strategies. PMID:26446747

  8. Professional or administrative value patterns? Clinical pathways in medical problem-solving processes.

    PubMed

    Holmberg, Leif

    2007-11-01

    A health-care organization simultaneously belongs to two different institutional value patterns: a professional and an administrative value pattern. At the administrative level, medical problem-solving processes are generally perceived as the efficient application of familiar chains of activities to well-defined problems; and a low task uncertainty is therefore assumed at the work-floor level. This assumption is further reinforced through clinical pathways and other administrative guidelines. However, studies have shown that in clinical practice such administrative guidelines are often considered inadequate and difficult to implement mainly because physicians generally perceive task uncertainty to be high and that the guidelines do not cover the scope of encountered deviations. The current administrative level guidelines impose uniform structural features that meet the requirement for low task uncertainty. Within these structural constraints, physicians must organize medical problem-solving processes to meet any task uncertainty that may be encountered. Medical problem-solving processes with low task uncertainty need to be organized independently of processes with high task uncertainty. Each process must be evaluated according to different performance standards and needs to have autonomous administrative guideline models. Although clinical pathways seem appropriate when there is low task uncertainty, other kinds of guidelines are required when the task uncertainty is high.

  9. Measuring the Creative Process: A Psychometric Examination of Creative Ideation and Grit

    ERIC Educational Resources Information Center

    Rojas, Joanne P.; Tyler, Kenneth M.

    2018-01-01

    Within the investment theory of creativity (Sternberg & Lubart, 1996), creativity is defined as a 2-part process of "buying low" by investing in unusual ideas and then "selling high" by convincing others of the value or usefulness of these new ideas. This process requires both creative ideation and perseverance. The purpose…

  10. Water-based binary polyol process for the controllable synthesis of silver nanoparticles inhibiting human and foodborne pathogenic bacteria

    USDA-ARS?s Scientific Manuscript database

    The polyol process is a widely used strategy for producing nanoparticles from various reducible metallic precursors; however it requires a bulk polyol liquid reaction with additional protective agents at high temperatures. Here, we report a water-based binary polyol process using low concentrations ...

  11. Industrial Photogrammetry - Accepted Metrology Tool or Exotic Niche

    NASA Astrophysics Data System (ADS)

    Bösemann, Werner

    2016-06-01

    New production technologies like 3D printing and other additive manufacturing technologies have changed the industrial manufacturing process, a change often referred to as the next industrial revolution or, in short, Industry 4.0. Such cyber-physical production systems combine the virtual and real worlds through digitization, model building, process simulation, and optimization. It is commonly understood that measurement technologies are the key to combining the real and virtual worlds (e.g., [Schmitt 2014]). This change from measurement as a quality control tool to a fully integrated step in the production process has also changed the requirements for 3D metrology solutions. Keywords like MAA (Measurement Assisted Assembly) illustrate this new position of metrology in the industrial production process. At the same time, it is obvious that these processes not only require more measurements but also systems that deliver the required information at high density in a short time. Here, optical solutions, including photogrammetry for 3D measurements, have big advantages over traditional mechanical CMMs. The paper describes the relevance of different photogrammetric solutions, including the state of the art, industry requirements, and application examples.

  12. Business Process Modelling is an Essential Part of a Requirements Analysis. Contribution of EFMI Primary Care Working Group.

    PubMed

    de Lusignan, S; Krause, P; Michalakidis, G; Vicente, M Tristan; Thompson, S; McGilchrist, M; Sullivan, F; van Royen, P; Agreus, L; Desombre, T; Taweel, A; Delaney, B

    2012-01-01

    To perform a requirements analysis of the barriers to conducting research linking primary care, genetic, and cancer data, we extended our initial data-centric approach to include socio-cultural and business requirements. We created reference models of core data requirements common to most studies using unified modelling language (UML), dataflow diagrams (DFD), and business process modelling notation (BPMN). We conducted a stakeholder analysis and constructed DFD and UML diagrams for use cases based on simulated research studies, using research output as a sensitivity analysis. Differences between the reference model and the use cases identified study-specific data requirements. The stakeholder analysis identified tensions, changes in specification, some indifference from data providers, and enthusiastic informaticians urging inclusion of socio-cultural context. We identified requirements to collect information at three levels: micro (data items, which need to be semantically interoperable), meso (the medical record and data extraction), and macro (the health system and socio-cultural issues). BPMN clarified complex business requirements among data providers and vendors, as well as additional geographical requirements for patients to be represented in both linked datasets. High-quality research output was the norm for most repositories. Reference models provide high-level schemata of the core data requirements; however, business requirements modelling identifies stakeholder issues and what needs to be addressed to enable participation.

  13. High and Dry

    ERIC Educational Resources Information Center

    Johnson, Robert L.

    2005-01-01

    High-performance schools are facilities that improve the learning environment while saving energy, resources and money. Creating a high-performance school requires an integrated design approach. Key systems--including lighting, HVAC, electrical and plumbing--must be considered from the beginning of the design process. According to William H.…

  14. Development of high strength, high temperature ceramics

    NASA Technical Reports Server (NTRS)

    Hall, W. B.

    1982-01-01

    Improvements in the high-pressure turbopumps, both fuel and oxidizer, of the Space Shuttle main engine were considered. The operation of these pumps is limited by temperature restrictions on their metallic components. Ceramic materials that retain strength at high temperatures and appear to be promising candidates for use as turbine blades and impellers are discussed. These high-strength materials are sensitive to many related processing parameters, such as impurities, sintering aids, reaction aids, particle size, processing temperature, and post-thermal treatment. The specific objectives of the study were to: (1) identify and define the processing parameters that affect the properties of Si3N4 ceramic materials, (2) design and assemble the equipment required for processing high-strength ceramics, (3) design and assemble test apparatus for evaluating the high-temperature properties of Si3N4, and (4) conduct a research program of manufacturing and evaluating Si3N4 materials as applicable to rocket engine applications.

  15. Integrated Analysis Tools for Determination of Structural Integrity and Durability of High temperature Polymer Matrix Composites

    DTIC Science & Technology

    2008-08-18

    …fidelity will be used to reduce the massive experimental testing and associated time required for qualification of new materials. Tools and … developing a model of the thermo-oxidative process for polymer systems that incorporates the effects of reaction rates, Fickian diffusion, and time-varying degradation processes.

  16. Telescience - Optimizing aerospace science return through geographically distributed operations

    NASA Technical Reports Server (NTRS)

    Rasmussen, Daryl N.; Mian, Arshad M.

    1990-01-01

    The paper examines the objectives and requirements of telescience, defined as the means and process for scientists, NASA operations personnel, and astronauts to conduct payload operations as if they were colocated. This process is described in terms of Space Station era platforms. Some of the enabling technologies are discussed, including open-architecture workstations, distributed computing, transaction management, expert systems, and high-speed networks. Recent testbedding experiments are surveyed to highlight some of the human factors requirements.

  17. Description of waste pretreatment and interfacing systems dynamic simulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garbrick, D.J.; Zimmerman, B.D.

    1995-05-01

    The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high-level and low-level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). To achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low-level and high-level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates chosen so that approximately 6 new DSTs are required, and one with facility rates chosen so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. An appendix presents the results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both recycling and the retrieval schedule appear to have a significant impact on overall tank usage.
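    The tank-count tradeoff in these simulation cases can be illustrated with a toy buffer balance. This is only a sketch under assumed placeholder rates and tank capacity, not the TWRS model or its data: when retrieval outpaces pretreatment, the cumulative surplus must be staged in new double-shell tanks (DSTs).

```python
# Toy buffer-tank balance: if waste is retrieved faster than pretreatment
# consumes it, the surplus volume must be staged in new double-shell tanks.
# All rates and the tank capacity below are illustrative placeholders.
def new_tanks_needed(retrieval_rate, processing_rate, months, tank_capacity):
    """Number of extra tanks to hold the cumulative surplus volume."""
    surplus = max(0.0, (retrieval_rate - processing_rate) * months)
    return int(-(-surplus // tank_capacity))  # ceiling division

# Matching the processing rate to the retrieval rate eliminates new-tank demand.
print(new_tanks_needed(120.0, 100.0, 60, 200.0))  # surplus 1200 -> 6 tanks
print(new_tanks_needed(120.0, 120.0, 60, 200.0))  # balanced   -> 0 tanks
```

    The two print lines mirror the report's "approximately 6 new DSTs" and "no new DSTs" cases: the outcome is driven entirely by the gap between retrieval and processing rates.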

  18. Dynamic balancing of super-critical rotating structures using slow-speed data via parametric excitation

    NASA Astrophysics Data System (ADS)

    Tresser, Shachar; Dolev, Amit; Bucher, Izhak

    2018-02-01

    High-speed machinery is often designed to pass through several "critical speeds", where vibration levels can be very high. To reduce vibrations, rotors usually undergo a mass balancing process in which the machine is rotated over its full speed range so that the dynamic response near the critical speeds can be measured. The high sensitivity required for a successful balancing process is achieved near the critical speeds, where a single deflection mode shape becomes dominant and is excited by the projection of the imbalance on it. The requirement to rotate the machine at high speeds is an obstacle in many cases where measurements at high speed are impossible, owing to harsh conditions such as high temperatures and inaccessibility (e.g., jet engines). This paper proposes a novel balancing method for flexible rotors that does not require the machine to be rotated at high speeds. With this method, the rotor is spun at low speeds while being subjected to a set of externally controlled forces. The external forces comprise a set of tuned, response-dependent parametric excitations and nonlinear stiffness terms. The parametric excitation can isolate any desired mode while keeping the response directly linked to the imbalance, and a software-controlled nonlinear stiffness term limits the response, preventing the rotor from becoming unstable. These forces provide the sensitivity required to detect the projection of the imbalance on any desired mode without rotating the machine at high speeds. Analytical, numerical, and experimental results are shown to validate and demonstrate the method.
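    The kind of tuned parametric excitation described above can be sketched in a single-mode model. This is a hedged illustration only, not the paper's actual equations; all symbols are assumptions:

```latex
% Single-mode sketch of a parametrically excited, stiffness-limited rotor mode
% (illustrative form only; notation is assumed, not taken from the paper).
\ddot{q} + 2\zeta\omega_n\dot{q}
  + \omega_n^2\bigl[1 + \gamma\cos(\omega_p t)\bigr]\,q
  + \alpha q^3
  = f_{\mathrm{imb}}(t)
```

    Here $\gamma\cos(\omega_p t)$ is the controlled parametric excitation (tuned near $\omega_p \approx 2\omega_n$ to amplify the chosen mode), $\alpha q^3$ stands in for the software-controlled nonlinear stiffness term that bounds the response, and $f_{\mathrm{imb}}$ is the imbalance forcing at low rotation speed.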

  19. Challenges and requirements of mask data processing for multi-beam mask writer

    NASA Astrophysics Data System (ADS)

    Choi, Jin; Lee, Dong Hyun; Park, Sinjeung; Lee, SookHyun; Tamamushi, Shuichi; Shin, In Kyun; Jeon, Chan Uk

    2015-07-01

    To overcome the resolution and throughput limits of current mask writers for advanced lithography technologies, e-beam writer platforms have evolved through hardware and software developments. In particular, aggressive optical proximity correction (OPC) for the unprecedented extension of optical lithography, together with the need for low-sensitivity resists for high resolution, pushes the variable-shaped-beam writer, which is widely used for mass production, to its limits. The multi-beam mask writer is an attractive candidate for photomask writing of sub-10nm devices because of its high speed and its large degree of freedom, which enables a high dose and dose modulation for each pixel. However, the higher dose and almost unlimited appetite for dose modulation challenge mask data processing (MDP) in terms of extreme data volume and correction methods. Here, we discuss the requirements of mask data processing for the multi-beam mask writer and present new challenges in the data format, data flow, and correction method for users and suppliers of MDP tools.

  20. Conformal doping of topographic silicon structures using a radial line slot antenna plasma source

    NASA Astrophysics Data System (ADS)

    Ueda, Hirokazu; Ventzek, Peter L. G.; Oka, Masahiro; Horigome, Masahiro; Kobayashi, Yuuki; Sugimoto, Yasuhiro; Nozawa, Toshihisa; Kawakami, Satoru

    2014-06-01

    Fin extension doping for 10 nm front-end-of-line technology requires ultra-shallow, high-dose conformal doping. In this paper, we demonstrate a new radial line slot antenna plasma-source-based doping process that meets these requirements. Critical to reaching true conformality while maintaining fin integrity is that the ion energy be low and controllable while the dose absorption is self-limited. The saturated dopant layer is rendered conformal by concurrent amorphization and deposition of a dopant-containing capping layer, followed by a stabilization anneal. Dopant segregation assists in driving dopants from the capping layer into the silicon subsurface. True conformality was confirmed by very high resolution transmission electron microscopy with energy-dispersive X-ray spectroscopy. We demonstrate these results using an n-type, arsenic-based plasma doping process on 10 to 40 nm high-aspect-ratio fin structures. The results are discussed in terms of the different types of clusters that form during the plasma doping process.

  1. High-density patterned media fabrication using jet and flash imprint lithography

    NASA Astrophysics Data System (ADS)

    Ye, Zhengmao; Ramos, Rick; Brooks, Cynthia; Simpson, Logan; Fretwell, John; Carden, Scott; Hellebrekers, Paul; LaBrake, Dwayne; Resnick, Douglas J.; Sreenivasan, S. V.

    2011-04-01

    The Jet and Flash Imprint Lithography (J-FIL®) process uses drop dispensing of UV-curable resists for high-resolution patterning. Several applications, including patterned media, are better and more economically served by a full-substrate patterning process, since the alignment requirements are minimal. Patterned media is particularly challenging because of the aggressive feature sizes necessary to achieve the storage densities required for manufacturing beyond the current technology of perpendicular recording. In this paper, the key process steps in the application of J-FIL to patterned media fabrication are reviewed, with special attention to substrate cleaning, vapor deposition of the adhesion layer, and imprint performance at >300 disks per hour. Also discussed are recent results for imprinting discrete track patterns at half pitches of 24 nm and bit-patterned media at densities of 1 Tb/in2.

  2. A distributed computing approach to mission operations support. [for spacecraft

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1975-01-01

    Computing support for mission operations includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale, third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.

  3. Integrated in situ gas stripping-salting-out process for high-titer acetone-butanol-ethanol production from sweet sorghum bagasse.

    PubMed

    Wen, Hao; Chen, Huidong; Cai, Di; Gong, Peiwen; Zhang, Tao; Wu, Zhichao; Gao, Heting; Li, Zhuangzhuang; Qin, Peiyong; Tan, Tianwei

    2018-01-01

    The production of biobutanol from renewable biomass resources is attractive, but the energy-intensive separation process and low-titer solvent production are the key constraints on economically feasible acetone-butanol-ethanol (ABE) production by fermentation. To decrease energy consumption and increase the solvent concentration, a novel two-stage gas stripping-salting-out system was established for effective ABE separation from fermentation broth using sweet sorghum bagasse as feedstock. The ABE condensate (143.6 g/L) after gas stripping, the first-stage separation, was recovered and introduced to the salting-out process as the second stage. K4P2O7 and K2HPO4 were used, respectively, and the effect of the saturated-salt-solution temperature on the final ABE concentration was also investigated. The results showed high ABE recovery (99.32%) and a high ABE concentration (747.58 g/L) when adding saturated K4P2O7 solution at 323.15 K and a salting-out factor of 3.0. Under these conditions, the energy requirement of the downstream distillation process was 3.72 MJ/kg of ABE. High-titer cellulosic ABE was thus separated from the fermentation broth by the novel two-stage gas stripping-salting-out process, which significantly reduced the downstream energy requirement.
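    The reported numbers imply the concentration gain delivered by the second-stage salting-out step; a quick check, using only figures stated in the abstract:

```python
# Concentration factor achieved by the second-stage salting-out step,
# computed directly from the concentrations reported in the abstract.
condensate_g_per_L = 143.6   # ABE after first-stage gas stripping
salted_out_g_per_L = 747.58  # ABE after K4P2O7 salting-out

factor = salted_out_g_per_L / condensate_g_per_L
print(f"concentration factor ≈ {factor:.1f}x")  # ≈ 5.2x
```

    A roughly five-fold concentration before distillation is what drives the low 3.72 MJ/kg downstream energy figure.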

  4. Mechanical and biological behavior of ultrafine-grained Ti alloy aneurysm clip processed using high-pressure torsion.

    PubMed

    Um, Ho Yong; Park, Byung Ho; Ahn, Dong-Hyun; Abd El Aal, Mohamed Ibrahim; Park, Jaechan; Kim, Hyoung Seop

    2017-04-01

    Severe plastic deformation (SPD) has recently been advanced as the main process for fabricating bulk ultrafine-grained or nanocrystalline metallic materials, which present much higher strength and better biocompatibility than their coarse-grained counterparts. Medical devices such as aneurysm clips and dental implants require high mechanical and biological performance (e.g., stiffness, yield strength, fatigue resistance, and biocompatibility), requirements that match well the characteristics of SPD-processed materials. Typical aneurysm clips are made of a commercial Ti-6Al-4V alloy, which has higher yield strength than Ti. In this work, Ti and Ti-6Al-4V workpieces were processed by high-pressure torsion (HPT) to enhance their mechanical properties. Tensile tests and hardness tests were performed to evaluate their mechanical properties, and their microstructure was investigated. The hardness and yield stress of the HPT-processed Ti are comparable to those of the initial Ti-6Al-4V owing to its significantly refined microstructure. Finite element analyses evaluating the opening performance of a specific geometry of the YASARGIL aneurysm clip were carried out using the mechanical properties of the initial and HPT-processed Ti and Ti-6Al-4V. These results indicate that SPD-processed Ti could be a good candidate to substitute for Ti-6Al-4V in aneurysm clips.

  5. Materials Research for High Speed Civil Transport and Generic Hypersonics-Metals Durability

    NASA Technical Reports Server (NTRS)

    Schulz, Paul; Hoffman, Daniel

    1996-01-01

    This report covers a portion of an ongoing investigation of the durability of titanium alloys for the High Speed Civil Transport (HSCT). Candidate alloys need to possess an acceptable combination of properties including strength and toughness as well as fatigue and corrosion resistance when subjected to the HSCT operational environment. These materials must also be capable of being processed into required product forms while maintaining their properties. Processing operations being considered for this airplane include forming, welding, adhesive bonding, and superplastic forming with or without diffusion bonding. This program was designed to develop the material properties database required to lower the risk of using advanced titanium alloys on the HSCT.

  6. Solution-Processed Cu2Se Nanocrystal Films with Bulk-Like Thermoelectric Performance.

    PubMed

    Forster, Jason D; Lynch, Jared J; Coates, Nelson E; Liu, Jun; Jang, Hyejin; Zaia, Edmond; Gordon, Madeleine P; Szybowski, Maxime; Sahu, Ayaskanta; Cahill, David G; Urban, Jeffrey J

    2017-06-05

    Thermoelectric power generation can play a key role in a sustainable energy future by converting waste heat from power plants and other industrial processes into usable electrical power. Current thermoelectric devices, however, require energy-intensive manufacturing processes such as alloying and spark plasma sintering. Here, we describe the fabrication of a p-type thermoelectric material, copper selenide (Cu2Se), utilizing solution processing and thermal annealing to produce a thin film that achieves a figure of merit, ZT, as high as that of its traditionally processed counterpart: 0.14 at room temperature. This is the first report of a fully solution-processed nanomaterial achieving performance equivalent to its bulk form and represents a general strategy for reducing the energy required to manufacture advanced energy conversion and harvesting materials.
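    For context, the dimensionless figure of merit mentioned above is conventionally ZT = S²σT/κ. A minimal sketch follows; the formula is standard, but the property values below are illustrative placeholders, not measured Cu2Se data from this work.

```python
# Thermoelectric figure of merit: ZT = S^2 * sigma * T / kappa.
# The formula is standard; the numbers used in the example are
# illustrative placeholders, not the paper's measured values.
def figure_of_merit(seebeck_V_per_K, sigma_S_per_m, kappa_W_per_mK, T_K):
    """ZT from Seebeck coefficient, electrical and thermal conductivity."""
    return seebeck_V_per_K**2 * sigma_S_per_m * T_K / kappa_W_per_mK

# Example: S = 100 uV/K, sigma = 5e4 S/m, kappa = 1 W/(m*K), T = 300 K
zt = figure_of_merit(100e-6, 5e4, 1.0, 300.0)
print(f"ZT = {zt:.2f}")  # ZT = 0.15
```

    The interplay of the three properties is why matching a bulk ZT in a solution-processed film is notable: processing changes typically degrade at least one of them.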

  7. Feasibility study on the use of groupware support for NASA source evaluation boards

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Yoes, Cissy

    1991-01-01

    Groupware is a class of computer-based systems that support groups engaged in a common task (or goal) and provide an interface to a shared environment. A potential application for groupware is the source evaluation board (SEB) process used in the procurement of government contracts. This study was undertaken to (1) identify parts of the SEB process that are candidates for groupware support, and (2) identify tools that could be used to support the candidate processes. Two SEB processes were identified as good candidates for groupware support: (1) document generation, a coordination and communication process required to present and document the findings of an SEB; and (2) group decision making, a highly analytical and integrative decision process requiring a clear and supportable outcome.

  8. Discovery of 100K SNP array and its utilization in sugarcane

    USDA-ARS?s Scientific Manuscript database

    Next-generation sequencing (NGS) enables us to identify thousands of single nucleotide polymorphism (SNP) markers for genotyping and fingerprinting. However, the process requires very precise bioinformatics analysis and filtering. High throughput SNP array with predefined genomic location co...

  9. Guayule (Parthenium argentatum)pyrolysis and analysis by PY-GC/MS

    USDA-ARS?s Scientific Manuscript database

    Economic and sustainable biofuel production requires high process efficiency. The choice of biomass and the conversion technology employed to produce renewable fuels determines the product yields, fuel quality and consequently the process efficiency. Guayule, a perennial shrub native to the southwes...

  10. A microdisplay-based HUD for automotive applications: Backplane design, planarization, and optical implementation

    NASA Astrophysics Data System (ADS)

    Schuck, Miller Harry

    Automotive head-up displays require compact, bright, and inexpensive imaging systems. In this thesis, a compact head-up display (HUD) utilizing liquid-crystal-on-silicon microdisplay technology is presented from concept to implementation. The thesis comprises three primary areas of HUD research: the specification, design, and implementation of a compact HUD optical system; the development of a wafer planarization process to enhance reflective device brightness and light immunity; and the design, fabrication, and testing of an inexpensive 640 x 512 pixel active matrix backplane intended to meet the HUD requirements. The thesis addresses the HUD problem at three levels: the systems level, the device level, and the materials level. At the systems level, the optical design of an automotive HUD must meet several competing requirements, including high image brightness, compact packaging, video-rate performance, and low cost. An optical system design that meets these competing requirements has been developed utilizing a fully reconfigurable reflective microdisplay. The design consists of two optical stages: a projector stage, which magnifies the display, and a second stage, which forms the virtual image eventually seen by the driver. A key component of the optical system is a diffraction grating/field lens that forms a large viewing eyebox while reducing the optical system complexity. Image quality, biocular disparity, and luminous efficacy were analyzed, and results of the optical implementation are presented. At the device level, the automotive HUD requires a reconfigurable, video-rate, high-resolution image source for applications such as navigation and night vision. The design of a 640 x 512 pixel active matrix backplane that meets the requirements of the HUD is described. The backplane was designed to produce digital field-sequential color images at video rates utilizing fast-switching liquid crystal as the modulation layer. 
The design methodology is discussed, and the example of a clock generator is described from design to implementation. Electrical and optical test results of the fabricated backplane are presented. At the materials level, a planarization method was developed to meet the stringent brightness requirements of automotive HUDs. The research efforts described here have resulted in a simple, low-cost post-processing method for planarizing microdisplay substrates based on a spin-cast polymeric resin, benzocyclobutene (BCB). Six-fold reductions in substrate step height were accomplished with a single coating. Via masking and dry etching methods were developed. High-reflectivity metal was deposited and patterned over the planarized substrate to produce high-aperture pixel mirrors. The process is simple, rapid, and results in microdisplays better able to meet the stringent requirements of high-brightness display systems. Methods and results of the post-processing are described.

  11. Superoxide radical and UV irradiation in ultrasound assisted oxidative desulfurization (UAOD): A potential alternative for greener fuels

    NASA Astrophysics Data System (ADS)

    Chan, Ngo Yeung

This study is aimed at improving the current ultrasound assisted oxidative desulfurization (UAOD) process by utilizing superoxide radical as oxidant. Research was also conducted to investigate the feasibility of ultraviolet (UV) irradiation-assisted desulfurization. These modifications can enhance the process with the following achievements: (1) Meet the upcoming sulfur standards on various fuels including diesel fuel oils and residual oils; (2) More efficient oxidant with significantly lower consumption in accordance with stoichiometry; (3) Energy savings of 90%; (4) Greater selectivity in petroleum composition. Currently, the UAOD process and subsequent modifications developed at the University of Southern California by Professor Yen's research group have demonstrated high desulfurization efficiencies towards various fuels with the application of 30% wt. hydrogen peroxide as oxidant. The UAOD process has demonstrated more than 50% desulfurization of refractory organic sulfur compounds with the use of Venturello-type catalysts. Application of quaternary ammonium fluoride as phase transfer catalyst has significantly improved the desulfurization efficiency to 95%. Recent modifications incorporating ionic liquids have shown that the modified UAOD process can produce ultra-low sulfur, or near-zero sulfur, diesels under mild conditions of 70°C and atmospheric pressure. Nevertheless, the UAOD process is considered not to be particularly efficient with respect to oxidant and energy consumption. Batch studies have demonstrated that the UAOD process requires 100-fold more oxidant than the stoichiometric requirement to achieve high desulfurization yield. The expected high costs of purchasing, shipping and storage of the oxidant would reduce the practicability of the process. The excess use of oxidant is not economically desirable, and it also causes environmental and safety issues. 
Post treatments would be necessary to stabilize the unspent oxidant residual to prevent the waste stream from becoming reactive or even explosive. High energy consumption is another drawback of the UAOD process. A typical 10-minute ultrasonication applied in the UAOD process to achieve 95% desulfurization for 20 g of diesel requires 450 kJ of energy, which is equivalent to approximately 50% of the energy that can be provided by the treated diesel. This great expenditure of energy is impractical for industries to adopt. In this study, modifications of the UAOD process, including the application of superoxide and selection of catalysts, were applied to lower the oxidant dosage and to improve the applicability towards heavy distillates such as residual oil. The results demonstrated that the new system required 80% less oxidant as compared to previous generations of the UAOD process without loss of desulfurization efficiency. The new system demonstrated its suitability towards desulfurizing commercial mid-distillates including jet fuels, marine gas oil and sour diesel. This process also demonstrated a new method to desulfurize residual oil with high desulfurization yields. The new process development has been supported by Eco Energy Solutions Inc., Reno, Nevada and Intelligent Energy Inc., Long Beach, California. A feasibility study on UV-assisted desulfurization, replacing ultrasound with UV irradiation, was also conducted. The study demonstrated that the UV-assisted desulfurization process consumes 90% less energy than the comparable process using ultrasonication. These process modifications demonstrated over 98% desulfurization efficiency on diesel oils and more than 75% on residual oils with significantly less oxidant and energy consumption. The feasibility of desulfurizing commercial sour heavy oil was also demonstrated. 
Based on the UAOD process and the commercialized modifications by Wan and Cheng, the feasible applications of superoxide and UV irradiation in the UAOD process could provide deep-desulfurization on various fuels with practical cost.
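The abstract's energy-balance claim (450 kJ of ultrasonication to treat 20 g of diesel, roughly half the fuel's own energy content) can be sanity-checked with simple arithmetic. The lower heating value used below is an assumed typical figure for diesel (~45 MJ/kg), not a number taken from the study:

```python
# Sanity check of the energy-balance claim, assuming a typical
# lower heating value (LHV) for diesel fuel of about 45 MJ/kg.
diesel_mass_kg = 0.020          # 20 g sample treated per batch
lhv_j_per_kg = 45e6             # assumed LHV of diesel fuel
ultrasound_energy_j = 450e3     # energy for 10 min of ultrasonication

fuel_energy_j = diesel_mass_kg * lhv_j_per_kg   # energy content of the sample
fraction = ultrasound_energy_j / fuel_energy_j  # share consumed by treatment
print(f"Treatment uses {fraction:.0%} of the fuel's energy content")  # 50%
```

With the assumed heating value, the claimed "approximately 50%" checks out exactly, which is why the authors call the energy expenditure impractical at scale.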

  12. Experimental evaluation of tool wear throughout a continuous stroke blanking process of quenched 22MnB5 ultra-high-strength steel

    NASA Astrophysics Data System (ADS)

    Vogt, S.; Neumayer, F. F.; Serkyov, I.; Jesner, G.; Kelsch, R.; Geile, M.; Sommer, A.; Golle, R.; Volk, W.

    2017-09-01

Steel is the most common material used in vehicles’ chassis, which makes its research an important topic for the automotive industry. Recently developed ultra-high-strength steels (UHSS) provide extreme tensile strength up to 1,500 MPa and combine great crashworthiness with good weight reduction potential. However, in order to reach the final shape of sheet metal parts, additional cutting steps such as trimming and piercing are often required. The final trimming of quenched metal sheets presents a huge challenge to a conventional process, mainly because of the extreme cutting force required. The high cutting impact, due to the material’s brittleness, causes excessive tool wear or even sudden tool failure. Therefore, a laser is commonly used for the cutting process, which is time- and energy-consuming. The purpose of this paper is to demonstrate the capability of a conventional blanking tool design in a continuous stroke piercing process using boron steel 22MnB5 sheets. Two different types of tool steel were tested for their suitability as active cutting elements: electro-slag remelted (ESR) cold work tool steel Bohler K340 ISODUR and powder-metallurgic (PM) high speed steel Bohler S390 MICROCLEAN. A FEM study provided information about an optimized punch design, which withstands buckling under high cutting forces. The wear behaviour of the process was assessed by the tool wear of the active cutting elements as well as the quality of the cut surfaces.

  13. Method for materials deposition by ablation transfer processing

    DOEpatents

    Weiner, Kurt H.

    1996-01-01

A method in which a thin layer of semiconducting, insulating, or metallic material is transferred by ablation from a source substrate, coated uniformly with a thin layer of said material, to a target substrate, where said material is desired, with a pulsed, high intensity, patternable beam of energy. The use of a patternable beam allows area-selective ablation from the source substrate, resulting in additive deposition of the material onto the target substrate, which may require a very low percentage of the area to be covered. Since material is placed only where it is required, material waste can be minimized by reusing the source substrate for depositions on multiple target substrates. Due to the use of a pulsed, high intensity energy source, the target substrate remains at low temperature during the process, and thus low-temperature, low-cost transparent glass or plastic can be used as the target substrate. The method can be carried out at atmospheric pressure and room temperature, thus eliminating the vacuum systems normally required in materials deposition processes. This invention has particular application in the flat panel display industry, where it can minimize materials waste and associated costs.

  14. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low-level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.

  15. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
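To illustrate the kind of "digital trait" such a pipeline extracts, the sketch below computes a crude projected-shoot-area proxy by green-dominance thresholding. This is a hypothetical example of the general technique, not the actual Image Harvest implementation; the function name and threshold are made up for illustration:

```python
import numpy as np

def projected_plant_area(image, green_threshold=1.1):
    """Illustrative digital trait: count pixels where the green channel
    dominates both red and blue, a crude proxy for projected shoot area.
    image: (H, W, 3) RGB array. Hypothetical, not Image Harvest code."""
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    mask = (g > green_threshold * r) & (g > green_threshold * b)
    return int(mask.sum())

# Tiny synthetic RGB image: two "plant" (green-dominant) pixels out of four.
img = np.array([[[10, 200, 10], [200, 10, 10]],
                [[10, 180, 20], [30, 30, 30]]], dtype=float)
print(projected_plant_area(img))  # 2
```

Traits like this, computed per image across thousands of plants, are what feed the genome-wide association mapping described in the abstract.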

  16. Functional Laser Trimming Of Thin Film Resistors On Silicon ICs

    NASA Astrophysics Data System (ADS)

    Mueller, Michael J.; Mickanin, Wes

    1986-07-01

Modern Laser Wafer Trimming (LWT) technology achieves exceptional analog circuit performance and precision while maintaining the advantages of high production throughput and yield. Microprocessor-driven instrumentation has both emphasized the role of data conversion circuits and demanded sophisticated signal conditioning functions. Advanced analog semiconductor circuits with bandwidths over 1 GHz, and high precision, trimmable, thin-film resistors meet many of today's emerging circuit requirements. Critical to meeting these requirements are optimum choices of laser characteristics, proper materials, trimming process control, accurate modeling of trimmed resistor performance, and appropriate circuit design. Once limited exclusively to hand-crafted, custom integrated circuits, designs are now available in semi-custom circuit configurations. These are similar to those provided for digital designs and supported by computer-aided design (CAD) tools. Integrated with fully automated measurement and trimming systems, these quality circuits can now be produced in quantity to meet the requirements of communications, instrumentation, and signal processing markets.

  17. A parallel architecture of interpolated timing recovery for high- speed data transfer rate and wide capture-range

    NASA Astrophysics Data System (ADS)

    Higashino, Satoru; Kobayashi, Shoei; Yamagami, Tamotsu

    2007-06-01

High data transfer rates have been demanded of data storage devices along with increasing storage capacity. In order to increase the transfer rate, high-speed data processing techniques in read-channel devices are required. Generally, a parallel architecture is utilized for high-speed digital processing. We have developed a new architecture of Interpolated Timing Recovery (ITR) to achieve a high data transfer rate and wide capture-range in read-channel devices for information storage channels. It facilitates parallel implementation on large-scale-integration (LSI) devices.

  18. Research and Development of Micro-Alloying High-Strength Shipbuilding Plate

    NASA Astrophysics Data System (ADS)

    Chen, Zhenye

Based on the technological requirements and market demand, Nb micro-alloying D36 grade high-strength shipbuilding plate has been successfully developed at HBIS. In this paper, the rational chemical composition design, smelting and rolling process of Nb micro-alloying D36 grade high-strength shipbuilding plate are introduced. Its various performance figures not only comply with the rules of nine classification societies (CCS, LR, ABS, NK, DNV, BV, GL, KR and RINA) but also meet users' requirements. This indicates that HBIS has the capacity to produce Nb micro-alloying D36 grade high-strength shipbuilding plate.

  19. Improved process robustness by using closed loop control in deep drawing applications

    NASA Astrophysics Data System (ADS)

Barthau, M.; Liewald, M.; Held, Christian

    2017-09-01

The production of irregularly shaped deep-drawing parts with high quality requirements, which are common in today’s automotive production, permanently challenges production processes. High requirements on lightweight construction of passenger car bodies, following European regulations up to 2020, have substantially increased the use of high-strength steels for years and are also leading to bigger challenges in sheet metal part production. The increasingly complex shapes of today’s car body shells intensify the issue further under modern and future design criteria. The metal forming technology tries to meet these challenges by developing a highly sophisticated layout of deep-drawing dies that considers part quality requirements, process robustness and controlled material flow during the deep or stretch drawing process phase. A new method for controlling material flow using a closed-loop system was developed at the IFU Stuttgart. In contrast to previous approaches, this new method allows a control intervention during the deep-drawing stroke. The blank holder force around the outline of the drawn part is used as the control variable. The closed loop is designed as a trajectory follow-up with feed-forward control. The command variable used is the part-wall stress, which is measured with a piezo-electric measuring pin. In this paper the control loop used is described in detail. The experimental tool that was built for testing the new control approach is explained here with its features. A method for obtaining the follow-up trajectories from simulation is also presented. Furthermore, experimental results concerning the robustness of the deep-drawing process and the gain in process performance with the developed control loop are shown. 
Finally, a new procedure for the industrial application of the new control method for deep drawing is presented, using a new kind of active element to influence the local blank holder pressure on the part flange.
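The control structure the abstract describes, trajectory follow-up with feed-forward on the blank holder force, can be sketched in miniature. The plant model below (a static gain from blank holder force to part-wall stress) and all gains are illustrative assumptions, not values or models from the study:

```python
def follow_trajectory(ref, ff, kp=10.0, plant_gain=0.05):
    """Trajectory follow-up with feed-forward (toy illustration).
    ref: reference part-wall stress over the stroke (e.g. from simulation)
    ff:  feed-forward blank-holder-force trajectory
    kp:  proportional feedback gain
    The 'plant' is an assumed static gain from force to wall stress."""
    stresses, error = [], 0.0
    for r, f in zip(ref, ff):
        force = f + kp * error        # feed-forward plus feedback correction
        stress = plant_gain * force   # toy plant response
        stresses.append(stress)
        error = r - stress            # tracking error used at the next step
    return stresses

# With an exact feed-forward model (force = ref / plant_gain), the feedback
# term never has to act and the reference stress is tracked directly.
ref = [1.0, 2.0, 3.0]
ff = [r / 0.05 for r in ref]
print(follow_trajectory(ref, ff))
```

The design rationale mirrors the paper's: the feed-forward trajectory (obtained from simulation) does the bulk of the work, while the feedback term corrects in-stroke deviations measured at the part wall.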

  20. High-Performance Flexible Perovskite Solar Cells by Using a Combination of Ultrasonic Spray-Coating and Low Thermal Budget Photonic Curing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Das, Sanjib; Yang, Bin; Gu, Gong

Realizing the commercialization of high-performance and robust perovskite solar cells urgently requires the development of economically scalable processing techniques. Here we report a high-throughput ultrasonic spray-coating (USC) process capable of fabricating perovskite film-based solar cells on glass substrates with power conversion efficiency (PCE) as high as 13.04%. Perovskite films with high uniformity, crystallinity, and surface coverage are obtained in a single step. Moreover, we report USC processing on TiOx/ITO-coated polyethylene terephthalate (PET) substrates to realize flexible perovskite solar cells with PCE as high as 8.02% that are robust under mechanical stress. In this case, an optical curing technique was used to achieve a highly-conductive TiOx layer on flexible PET substrates for the first time. The high device performance and reliability obtained by this combination of USC processing with optical curing appears very promising for roll-to-roll manufacturing of high-efficiency, flexible perovskite solar cells.

  1. Reframing as a Best Practice: The Priority of Process in Highly Adaptive Decision Making

    ERIC Educational Resources Information Center

    Peters, Gary B.

    2008-01-01

    The development and practice of a well-defined process in which decisions are fully contemplated is needed in education today. The complexity of societal issues requires new depths of understanding, appreciation, and communication. Framing refers to the way a situation is described or viewed; reframing is the process of expanding and enriching the…

  2. A Different Approach to Studying the Charge and Discharge of a Capacitor without an Oscilloscope

    ERIC Educational Resources Information Center

    Ladino, L. A.

    2013-01-01

    A different method to study the charging and discharging processes of a capacitor is presented. The method only requires a high impedance voltmeter. The charging and discharging processes of a capacitor are usually studied experimentally using an oscilloscope and, therefore, both processes are studied as a function of time. The approach presented…
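The physics the method relies on is the standard RC exponential law. The sketch below evaluates the charge and discharge curves; the component values are an assumption chosen so the time constant is slow enough for a hand-read voltmeter to follow, not values from the article:

```python
import math

def v_charge(t, v0, r, c):
    """Capacitor voltage at time t while charging toward supply voltage v0
    through resistance r: V(t) = V0 * (1 - exp(-t / RC))."""
    return v0 * (1.0 - math.exp(-t / (r * c)))

def v_discharge(t, v0, r, c):
    """Capacitor voltage at time t while discharging from initial voltage v0:
    V(t) = V0 * exp(-t / RC)."""
    return v0 * math.exp(-t / (r * c))

# Assumed example: R = 1 MΩ, C = 10 µF gives a 10 s time constant -- slow
# enough to follow with a high-impedance voltmeter, whose input impedance
# must be much larger than R so the meter itself does not load the circuit.
tau = 1e6 * 10e-6   # R*C = 10 s
print(v_charge(tau, 5.0, 1e6, 10e-6))     # ~63% of 5 V after one tau
print(v_discharge(tau, 5.0, 1e6, 10e-6))  # ~37% of 5 V after one tau
```

After one time constant the capacitor has charged to about 63% of the supply voltage (or discharged to about 37% of its initial voltage), which is the usual way these curves are checked experimentally.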

  3. The way to zeros: The future of semiconductor device and chemical mechanical polishing technologies

    NASA Astrophysics Data System (ADS)

    Tsujimura, Manabu

    2016-06-01

For the last 60 years, the development of cutting-edge semiconductor devices has strongly emphasized scaling; the effort to scale down current CMOS devices may well achieve the target of 5 nm nodes by 2020. Planarization by chemical mechanical polishing (CMP) is one technology essential for supporting scaling. This paper summarizes the history of CMP transitions in the planarization process as well as the changing degree of planarity required, and, finally, introduces innovative technologies to meet the requirements. The use of CMP was triggered by the replacement of local oxidation of silicon (LOCOS) as the element isolation technology by shallow trench isolation (STI) in the 1980s. Then, CMP’s use expanded to improving the embeddability of aluminum wiring, tungsten (W) contacts, and Cu wiring, and, more recently, to its adoption in high-k metal gate (HKMG) and FinFET (FF) processes. Initially, the required degree of planarity was 50 nm, but now 0 nm is required. Further, zero defects on a post-CMP wafer is now the goal, and it is possible that zero psi CMP loading pressure will be required going forward. Soon, it seems, everything will have to be “zero” and perfect. Although the process is also chemical in nature, the CMP process is essentially mechanical, with a load applied using slurry particles several tens of nm in diameter. Zero load in the loading process, zero nm planarity with no trace of processing, and zero residual foreign material, including the very slurry particles used in the process, are all required. This article will provide an overview of how to achieve these new requirements and what technologies should be employed.

  4. Computer simulations in the high school: students' cognitive stages, science process skills and academic achievement in microbiology

    NASA Astrophysics Data System (ADS)

    Huppert, J.; Michal Lomask, S.; Lazarowitz, R.

    2002-08-01

Computer-assisted learning, including simulated experiments, has great potential to address the problem-solving process, which is a complex activity. It requires a highly structured approach in order to understand the use of simulations as an instructional device. This study is based on a computer simulation program, 'The Growth Curve of Microorganisms', which required tenth-grade biology students to use problem-solving skills whilst simultaneously manipulating three independent variables in one simulated experiment. The aims were to investigate the computer simulation's impact on students' academic achievement and on their mastery of science process skills in relation to their cognitive stages. The results indicate that the concrete and transition operational students in the experimental group achieved significantly higher academic achievement than their counterparts in the control group. The higher the cognitive operational stage, the higher students' achievement was, except in the control group, where students in the concrete and transition operational stages did not differ. Girls achieved equally with the boys in the experimental group. Students' academic achievement may indicate the potential impact a computer simulation program can have, enabling students with low reasoning abilities to cope successfully with learning concepts and principles in science which require high cognitive skills.
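A minimal sketch of the kind of model such a simulation might run, assuming a logistic growth law; the abstract does not specify the program's actual model, so the functional form and parameter values here are illustrative assumptions:

```python
import math

def growth_curve(t, n0, k, r):
    """Logistic growth of a microbial population (assumed model form).
    t:  elapsed time (hours)
    n0: initial population size
    k:  carrying capacity of the medium
    r:  intrinsic growth rate (per hour)"""
    return k / (1.0 + (k / n0 - 1.0) * math.exp(-r * t))

# Example run: start from 100 cells, capacity 1e6, r = 0.5 per hour.
# Students could vary n0, k, and r as the three independent variables.
for t in (0, 10, 20, 40):
    print(t, round(growth_curve(t, 100.0, 1e6, 0.5)))
```

Varying the three parameters and observing the resulting curves is the sort of multi-variable manipulation the simulated experiment asked students to reason about.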

  5. Pupillometric evidence for the decoupling of attention from perceptual input during offline thought.

    PubMed

    Smallwood, Jonathan; Brown, Kevin S; Tipper, Christine; Giesbrecht, Barry; Franklin, Michael S; Mrazek, Michael D; Carlson, Jean M; Schooler, Jonathan W

    2011-03-25

Accumulating evidence suggests that the brain can efficiently process both external and internal information. The processing of internal information is a distinct "offline" cognitive mode that requires not only spontaneously generated mental activity but has also been hypothesized to require a decoupling of attention from perception in order to separate competing streams of internal and external information. This process of decoupling is potentially adaptive because it could prevent unimportant external events from disrupting an internal train of thought. Here, we use measurements of pupil diameter (PD) to provide concrete evidence for the role of decoupling during spontaneous cognitive activity. First, during periods conducive to offline thought but not during periods of task focus, PD exhibited spontaneous activity decoupled from task events. Second, periods requiring external task focus were characterized by large task-evoked changes in PD; in contrast, encoding failures were preceded by episodes of high spontaneous baseline PD activity. Finally, high spontaneous PD activity also occurred prior to only the slowest 20% of correct responses, suggesting that high baseline PD indexes a distinct mode of cognitive functioning. Together, these data are consistent with the decoupling hypothesis, which suggests that the capacity for spontaneous cognitive activity depends upon minimizing disruptions from the external world.

  6. Electric power processing, distribution, management and energy storage

    NASA Astrophysics Data System (ADS)

    Giudici, R. J.

    1980-07-01

Power distribution subsystems are required for three elements of the SPS program: (1) orbiting satellite, (2) ground rectenna, and (3) Electric Orbiting Transfer Vehicle (EOTV). Power distribution subsystems receive electrical power from the energy conversion subsystem and provide the power busses, rotary power transfer devices, switchgear, power processing, energy storage, and power management required for power delivery and control. High-voltage plasma interactions, electric thruster interactions, and spacecraft charging of the SPS and the EOTV are also included as part of the power distribution subsystem design.

  7. Electric power processing, distribution, management and energy storage

    NASA Technical Reports Server (NTRS)

    Giudici, R. J.

    1980-01-01

Power distribution subsystems are required for three elements of the SPS program: (1) orbiting satellite, (2) ground rectenna, and (3) Electric Orbiting Transfer Vehicle (EOTV). Power distribution subsystems receive electrical power from the energy conversion subsystem and provide the power busses, rotary power transfer devices, switchgear, power processing, energy storage, and power management required for power delivery and control. High-voltage plasma interactions, electric thruster interactions, and spacecraft charging of the SPS and the EOTV are also included as part of the power distribution subsystem design.

  8. Method for exfoliation of hexagonal boron nitride

    NASA Technical Reports Server (NTRS)

    Lin, Yi (Inventor); Connell, John W. (Inventor)

    2012-01-01

    A new method is disclosed for the exfoliation of hexagonal boron nitride into mono- and few-layered nanosheets (or nanoplatelets, nanomesh, nanoribbons). The method does not necessarily require high temperature or vacuum, but uses commercially available h-BN powders (or those derived from these materials, bulk crystals) and only requires wet chemical processing. The method is facile, cost efficient, and scalable. The resultant exfoliated h-BN is dispersible in an organic solvent or water thus amenable for solution processing for unique microelectronic or composite applications.

  9. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  10. Energy-Performance-Based Design-Build Process: Strategies for Procuring High-Performance Buildings on Typical Construction Budgets: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheib, J.; Pless, S.; Torcellini, P.

NREL experienced a significant increase in employees and facilities on our 327-acre main campus in Golden, Colorado over the past five years. To support this growth, researchers developed and demonstrated a new building acquisition method that successfully integrates energy efficiency requirements into the design-build requests for proposals and contracts. We piloted this energy performance based design-build process with our first new construction project in 2008. We have since replicated and evolved the process for large office buildings, a smart grid research laboratory, a supercomputer, a parking structure, and a cafeteria. Each project incorporated aggressive efficiency strategies using contractual energy use requirements in the design-build contracts, all on typical construction budgets. We have found that when energy efficiency is a core project requirement as defined at the beginning of a project, innovative design-build teams can integrate the most cost effective and high performance efficiency strategies on typical construction budgets. When the design-build contract includes measurable energy requirements and is set up to incentivize design-build teams to focus on achieving high performance in actual operations, owners can now expect their facilities to perform. As NREL completed the new construction in 2013, we have documented our best practices in training materials and a how-to guide so that other owners and owner's representatives can replicate our successes and learn from our experiences in attaining market viable, world-class energy performance in the built environment.

  11. Dimension Reduction of Multivariable Optical Emission Spectrometer Datasets for Industrial Plasma Processes

    PubMed Central

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

    A new data dimension-reduction method, called Internal Information Redundancy Reduction (IIRR), is proposed for application to Optical Emission Spectroscopy (OES) datasets obtained from industrial plasma processes. For example in a semiconductor manufacturing environment, real-time spectral emission data is potentially very useful for inferring information about critical process parameters such as wafer etch rates, however, the relationship between the spectral sensor data gathered over the duration of an etching process step and the target process output parameters is complex. OES sensor data has high dimensionality (fine wavelength resolution is required in spectral emission measurements in order to capture data on all chemical species involved in plasma reactions) and full spectrum samples are taken at frequent time points, so that dynamic process changes can be captured. To maximise the utility of the gathered dataset, it is essential that information redundancy is minimised, but with the important requirement that the resulting reduced dataset remains in a form that is amenable to direct interpretation of the physical process. To meet this requirement and to achieve a high reduction in dimension with little information loss, the IIRR method proposed in this paper operates directly in the original variable space, identifying peak wavelength emissions and the correlative relationships between them. A new statistic, Mean Determination Ratio (MDR), is proposed to quantify the information loss after dimension reduction and the effectiveness of IIRR is demonstrated using an actual semiconductor manufacturing dataset. As an example of the application of IIRR in process monitoring/control, we also show how etch rates can be accurately predicted from IIRR dimension-reduced spectral data. PMID:24451453
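The core idea, reducing dimension in the original wavelength space by discarding channels whose information is already carried by retained ones, can be sketched with a greedy correlation filter. This illustrates the principle only; it is not the published IIRR algorithm, and the paper's peak-selection step and MDR statistic are not reproduced here:

```python
import numpy as np

def reduce_redundant_channels(spectra, corr_threshold=0.95):
    """Greedy correlation-based reduction in the original variable space:
    keep a wavelength channel only if it is not strongly correlated with
    any channel kept so far. Illustrative stand-in for IIRR's redundancy
    reduction; threshold and greedy order are assumptions.
    spectra: (n_samples, n_wavelengths) array of emission intensities."""
    corr = np.corrcoef(spectra, rowvar=False)
    kept = []
    for j in range(spectra.shape[1]):
        if all(abs(corr[j, k]) < corr_threshold for k in kept):
            kept.append(j)
    return kept

# Synthetic demo: channels 0 and 1 are nearly identical (redundant),
# channel 2 is independent noise.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
noise = rng.normal(size=(100, 3))
data = np.hstack([base, base + 0.01 * noise[:, :1], noise[:, 1:2]])
print(reduce_redundant_channels(data))  # [0, 2]
```

Because the reduced set is a subset of the original wavelength channels, each retained variable still maps directly to a physical emission line, which is the interpretability property the abstract emphasizes.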

  12. Requirements for high-efficiency solar cells

    NASA Technical Reports Server (NTRS)

    Sah, C. T.

    1986-01-01

Minimum recombination and low injection level are essential for high efficiency. Twenty percent AM1 efficiency requires a dark recombination current density of 2 × 10⁻¹³ A/cm² and a recombination center density of less than 10¹⁰ cm⁻³. Recombination mechanisms at thirteen locations in a conventional single-crystalline silicon cell design are reviewed. Three additional recombination locations are described at grain boundaries in polycrystalline cells. Material perfection and fabrication process optimization requirements for high efficiency are outlined. Innovative device designs to reduce recombination in the bulk and interfaces of single-crystalline cells and in the grain boundaries of polycrystalline cells are reviewed.

  13. Teaching High-Accuracy Global Positioning System to Undergraduates Using Online Processing Services

    ERIC Educational Resources Information Center

    Wang, Guoquan

    2013-01-01

    High-accuracy Global Positioning System (GPS) has become an important geoscientific tool used to measure ground motions associated with plate movements, glacial movements, volcanoes, active faults, landslides, subsidence, slow earthquake events, as well as large earthquakes. Complex calculations are required in order to achieve high-precision…

  14. HOLE-BLOCKING LAYERS FOR SILICON/ORGANIC HETEROJUNCTIONS: A NEW CLASS OF HIGH-EFFICIENCY LOW-COST PV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturm, James

This project is the first investigation of the use of thin titanium dioxide layers on silicon as a hole-blocking / electron-transparent selective contact to silicon. The work was motivated by the goal of high-efficiency, low-cost silicon-based solar cells that could be processed entirely at low temperature (300 degrees Celsius or less), without requiring plasma processing.

  15. FPGA cluster for high-performance AO real-time control system

    NASA Astrophysics Data System (ADS)

    Geng, Deli; Goodsell, Stephen J.; Basden, Alastair G.; Dipper, Nigel A.; Myers, Richard M.; Saunter, Chris D.

    2006-06-01

Whilst the high throughput and low latency requirements for the next generation of AO real-time control systems have posed a significant challenge to von Neumann architecture processor systems, the Field Programmable Gate Array (FPGA) has emerged as a long-term solution with high performance on throughput and excellent predictability on latency. Moreover, FPGA devices have highly capable programmable interfacing, which leads to more highly integrated systems. Nevertheless, a single FPGA is still not enough: multiple FPGA devices need to be clustered to perform the required subaperture processing and the reconstruction computation. In an AO real-time control system, the memory bandwidth is often the bottleneck of the system, simply because a vast amount of supporting data, e.g. pixel calibration maps and the reconstruction matrix, needs to be accessed within a short period. The cluster, as a general computing architecture, has excellent scalability in processing throughput, memory bandwidth, memory capacity, and communication bandwidth. Problems such as task distribution, node communication, and system verification are discussed.
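The memory-bandwidth bottleneck mentioned above is easy to see with back-of-envelope arithmetic. The system sizes and loop rate below are illustrative assumptions, not figures from the paper:

```python
# Why memory bandwidth dominates an AO real-time controller:
# the full reconstruction matrix must be streamed every frame.
n_subaps   = 64 * 64          # wavefront-sensor subapertures (assumed)
n_acts     = 64 * 64          # deformable-mirror actuators (assumed)
frame_rate = 1000             # AO loop rate in Hz (assumed)
bytes_per  = 4                # 32-bit matrix entries

# Two slope measurements (x and y) per subaperture drive the
# matrix-vector reconstruction, so the matrix has 2*n_subaps columns.
matrix_bytes  = 2 * n_subaps * n_acts * bytes_per
bandwidth_gbs = matrix_bytes * frame_rate / 1e9

print(f"reconstruction matrix: {matrix_bytes / 1e6:.0f} MB")
print(f"required streaming bandwidth: {bandwidth_gbs:.1f} GB/s")
```

Even this modest configuration demands on the order of 100 GB/s of sustained reads, which is why the matrix (and the calibration maps) must be partitioned across the memory banks of several FPGAs rather than held behind a single memory interface.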

  16. Chemical Fabrication Used to Produce Thin-Film Materials for High Power-to- Weight-Ratio Space Photovoltaic Arrays

    NASA Technical Reports Server (NTRS)

    Hepp, Aloysius F.; Rybicki, George C.; Raffaelle, Ryne P.; Harris, Jerry D.; Hehemann, David G.; Junek, William; Gorse, Joseph; Thompson, Tracy L.; Hollingsworth, Jennifer A.; Buhro, William E.

    2000-01-01

The key to achieving high specific power (watts per kilogram) space solar arrays is the development of a high-efficiency, thin-film solar cell that can be fabricated directly on a flexible, lightweight, space-qualified durable substrate such as Kapton (DuPont) or other polyimide or suitable polymer film. Cell efficiencies approaching 20 percent at AM0 (air mass zero) are required. Current thin-film cell fabrication approaches are limited by either (1) the ultimate efficiency that can be achieved with the device material and structure or (2) the requirement for high-temperature deposition processes that are incompatible with all presently known flexible polyimide or other polymer substrate materials. Cell fabrication processes must be developed that will produce high-efficiency cells at temperatures below 400 degrees Celsius, and preferably below 300 degrees Celsius, to minimize the problems associated with the difference between the coefficients of thermal expansion of the substrate and thin-film solar cell and/or the decomposition of the substrate.

  17. High Temperature Polybenzimidazole Hollow Fiber Membranes for Hydrogen Separation and Carbon Dioxide Capture from Synthesis Gas

    DOE PAGES

    Singh, Rajinder P.; Dahe, Ganpat J.; Dudeck, Kevin W.; ...

    2014-12-31

Sustainable reliance on hydrocarbon feedstocks for energy generation requires CO₂ separation technology development for energy-efficient carbon capture from industrial mixed gas streams. High temperature H₂-selective glassy polymer membranes are an attractive option for energy-efficient H₂/CO₂ separations in advanced power production schemes with integrated carbon capture. They enable high overall process efficiencies by providing energy-efficient CO₂ separations at process-relevant operating conditions and, correspondingly, minimized parasitic energy losses. Polybenzimidazole (PBI)-based materials have demonstrated commercially attractive H₂/CO₂ separation characteristics and exceptional tolerance to hydrocarbon-fuel-derived synthesis gas (syngas) operating conditions and chemical environments. To realize a commercially attractive carbon capture technology based on these PBI materials, development of high performance, robust PBI hollow fiber membranes (HFMs) is required. In this work, we discuss outcomes of our recent efforts to demonstrate and optimize the fabrication and performance of PBI HFMs for use in pre-combustion carbon capture schemes. These efforts have resulted in PBI HFMs with commercially attractive fabrication protocols, defect-minimized structures, and commercially attractive permselectivity characteristics at IGCC syngas process-relevant conditions. The H₂/CO₂ separation performance of these PBI HFMs at realistic process conditions exceeds that of any other polymeric system reported to date.

  18. Particulate generation and control in the PREPP (Process Experimental Pilot Plant) incinerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stermer, D.L.; Gale, L.G.

    1989-03-01

Particulate emissions in radioactive incineration systems using a wet scrubbing system are generally ultimately controlled by flowing the process offgas stream through a high-efficiency filter, such as a High Efficiency Particulate Air (HEPA) filter. Because HEPA filters reduce particulate emissions to more than an order of magnitude below regulatory limits, they are vulnerable to high loading rates. This becomes a serious handicap in radioactive systems when filter change-out is required at an unacceptably high rate. The Process Experimental Pilot Plant (PREPP) incineration system is designed for processing retrieved low-level mixed hazardous waste. It has a wet offgas treatment system consisting of a Quencher, Venturi Scrubber, Entrainment Eliminator, Mist Eliminator, two stages of HEPA filters, and induced draft fans. During previous tests, it was noted that the offgas filters loaded with particulate at a rate requiring replacement as often as every four hours. During 1988, PREPP conducted a series of tests which included an investigation of the causes of heavy particulate accumulation on the offgas filters in relation to various operating parameters. This was done by measuring the particulate concentrations in the offgas system, primarily as a function of scrub solution salt concentration, waste feed rate, and offgas flow rate. 2 figs., 9 tabs.

  19. Development Challenges and Opportunities Confronting Economies in Transition

    ERIC Educational Resources Information Center

    Estes, Richard J.

    2007-01-01

    "Economies in Transition" (hereafter EIT or EITs) are countries in the process of shifting from "command" to "more open", liberalized, free market economic systems. In addition to achieving major structural adjustments to their economies, the transformational process requires the introduction of a high degree of…

  20. The Power of Imaging.

    ERIC Educational Resources Information Center

    Haapaniemi, Peter

    1990-01-01

    Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to tiny fraction of space required by originals and discusses current applications. Highlights include image processing system at National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…

  1. In vitro Perturbations of Targets in Cancer Hallmark Processes Predict Rodent Chemical Carcinogenesis

    EPA Science Inventory

    Thousands of untested chemicals in the environment require efficient characterization of carcinogenic potential in humans. A proposed solution is rapid testing of chemicals using in vitro high-throughput screening (HTS) assays for targets in pathways linked to disease processes ...

  2. Way Forward for High Performance Payload Processing Development

    NASA Astrophysics Data System (ADS)

    Notebaert, Olivier; Franklin, John; Lefftz, Vincent; Moreno, Jose; Patte, Mathieu; Syed, Mohsin; Wagner, Arnaud

    2012-08-01

Payload processing is facing technological challenges due to the large increase in performance requirements of future scientific, observation, and telecom missions, as well as future instrument technologies capturing much larger amounts of data. For several years, with the perspective of higher performance together with the planned obsolescence of solutions covering current needs, ESA and the European space industry have been developing several technology solutions. Silicon technologies, radiation mitigation techniques, and innovative functional architectures are developed with the goal of designing future space-qualified processing devices with a much higher level of performance than today. The fast-growing commercial market has developed very attractive technologies, but these are not fully suitable with respect to their tolerance of the space environment. Without the financial capacity to explore and develop all possible technology paths, a specific and global approach is required to cover future mission needs and their necessary performance targets effectively. The next sections describe the main issues and priorities and provide further details relevant to this approach, covering high performance processing technology.

  3. Laser beam temporal and spatial tailoring for laser shock processing

    DOEpatents

    Hackel, Lloyd; Dane, C. Brent

    2001-01-01

    Techniques are provided for formatting laser pulse spatial shape and for effectively and efficiently delivering the laser energy to a work surface in the laser shock process. An appropriately formatted pulse helps to eliminate breakdown and generate uniform shocks. The invention uses a high power laser technology capable of meeting the laser requirements for a high throughput process, that is, a laser which can treat many square centimeters of surface area per second. The shock process has a broad range of applications, especially in the aerospace industry, where treating parts to reduce or eliminate corrosion failure is very important. The invention may be used for treating metal components to improve strength and corrosion resistance. The invention has a broad range of applications for parts that are currently shot peened and/or require peening by means other than shot peening. Major applications for the invention are in the automotive and aerospace industries for components such as turbine blades, compressor components, gears, etc.

  4. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum amount of quantitative data possible will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.
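A QFD evaluation matrix of the kind described above reduces, at its core, to weighted scoring of concepts against customer-derived attributes. The sketch below is a minimal illustration; the attribute names, weights, and scores are purely hypothetical and not drawn from the NTP program:

```python
# Hypothetical QFD-style scoring of NTP concepts against weighted
# attributes (all names and numbers invented for illustration).
weights = {"thrust_to_weight": 0.40,
           "specific_impulse": 0.35,
           "development_risk": 0.25}

concepts = {
    "reference": {"thrust_to_weight": 6, "specific_impulse": 7,
                  "development_risk": 8},
    "concept_a": {"thrust_to_weight": 8, "specific_impulse": 6,
                  "development_risk": 5},
}

def weighted_score(scores):
    # Sum of attribute score times customer-derived weight.
    return sum(weights[a] * scores[a] for a in weights)

ranked = sorted(concepts, key=lambda c: weighted_score(concepts[c]),
                reverse=True)
print(ranked)
```

In an actual QFD matrix the weights come from the voice of the customer and the scores from validated concept data, but the ranking mechanics are exactly this weighted sum.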

  5. Facilities Requirements for Archives and Special Collections Department.

    ERIC Educational Resources Information Center

    Brown, Charlotte B.

    The program of the Archives and Special Collections Department at Franklin and Marshall College requires the following function areas to be located in the Shadek-Fackenthal Library: (1) Reading Room; (2) Conservation Laboratory; (3) Isolation Room; (4) storage for permanent collection; (5) storage for high security materials; (6) Processing Room;…

  6. Laser ablation and competitive technologies in paint stripping of heavy anticorrosion coatings

    NASA Astrophysics Data System (ADS)

    Schuöcker, Georg D.; Bielak, Robert

    2007-05-01

During the last years, surface preparation prior to coating operations became an important research and development task, since tightened environmental regulations have to be faced in view of the release of hazardous compounds from coatings. Ship-yards especially come more and more under pressure, because the environmental commitment of their Asian competitors is fairly limited. Therefore, in the US and in Europe, several technology evaluation projects have been launched to face this challenge. The majority of coating service providers and ship yards use grit blasting; this process causes heavy emissions of dust and enormous amounts of waste such as polluted sand. Coating removal without any blasting material would reduce the environmental impact, and laser processing offers ecological advantages. Therefore thermal processes like laser ablation have been studied thoroughly in several published projects and also in this study. Many of these studies have focused on the maintenance of airplanes, but not on de-coating of heavy protective coatings, for which the required laser power is extremely high. This study is focused on the maintenance of heavy anti-corrosion coatings and compares the industrial requirements with the opportunities of the innovative laser processes. Based on the results of this analysis, similar approaches such as plasma jet coating ablation have also been studied. It was concluded that none of these methods can compete economically with conventional processes such as grit blasting and water jetting, since the required ablation rate is very high (>60 m²/h). A new process is required that is not based on any blasting operation and does not depend strongly on the coating's characteristics. Delamination, in which the coating is removed not by evaporation but in small pieces of the complete coating system, meets these requirements.
The delamination can be accomplished by thermal destruction of the primer coating with an intense heat pulse generated by inductive heating of the substrate's surface. After this operation the coating can be peeled off.

  7. The software development process at the Chandra X-ray Center

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Evans, Ian N.; Fabbiano, Giuseppina

    2008-08-01

    Software development for the Chandra X-ray Center Data System began in the mid 1990's, and the waterfall model of development was mandated by our documents. Although we initially tried this approach, we found that a process with elements of the spiral model worked better in our science-based environment. High-level science requirements are usually established by scientists, and provided to the software development group. We follow with review and refinement of those requirements prior to the design phase. Design reviews are conducted for substantial projects within the development team, and include scientists whenever appropriate. Development follows agreed upon schedules that include several internal releases of the task before completion. Feedback from science testing early in the process helps to identify and resolve misunderstandings present in the detailed requirements, and allows review of intangible requirements. The development process includes specific testing of requirements, developer and user documentation, and support after deployment to operations or to users. We discuss the process we follow at the Chandra X-ray Center (CXC) to develop software and support operations. We review the role of the science and development staff from conception to release of software, and some lessons learned from managing CXC software development for over a decade.

  8. Shielding NSLS-II light source: Importance of geometry for calculating radiation levels from beam losses

    NASA Astrophysics Data System (ADS)

    Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; Wahl, W.

    2016-11-01

Third generation high brightness light sources are designed to have low emittance and high current beams, which contribute to higher beam loss rates that will be compensated by Top-Off injection. Shielding for these higher loss rates will be critical to protect the projected higher occupancy factors for the users. Top-Off injection requires a full energy injector, which will demand greater consideration of the potential abnormal beam mis-steering and localized losses that could occur. The high energy electron injection beam produces a significantly higher neutron component dose to the experimental floor than lower energy beam injection and ramped operations. Minimizing this dose will require adequate knowledge of where the mis-steered beam losses can occur and sufficient EM shielding close to the loss point, in order to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV); this spreads the incident energy over the bulk shield walls and thereby reduces the dose penetrating them. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because of its lack of geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation shielding Monte-Carlo codes, like FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the dose rates expected and the ability to expose weaknesses in the design before a high radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced with the implementation of the graphical interface of FLAIR to FLUKA. This made the effective shielding process for NSLS-II quite accurate and reliable.
The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.

  9. System simulation of direct-current speed regulation based on Simulink

    NASA Astrophysics Data System (ADS)

    Yang, Meiying

    2018-06-01

In modern industrial production, many machines require smooth speed adjustment over a certain range, together with good steady-state and dynamic performance. A direct-current speed regulation system offers a wide speed regulation range, small relative speed variation, good stability, and large overload capacity; it can withstand frequent impact loads and realize stepless rapid starting, braking, and reversing, and can therefore meet the varied special operating requirements of automated production processes. For a long time, direct-current power drives have dominated the field of high-performance drive technology.

  10. Etch bias inversion during EUV mask ARC etch

    NASA Astrophysics Data System (ADS)

    Lajn, Alexander; Rolff, Haiko; Wistrom, Richard

    2017-07-01

The introduction of EUV lithography to high volume manufacturing is now within reach for the 7nm technology node and beyond (1), at least for some steps, and its scheduling is in transition from a long-term to a mid-term horizon. Thus, all contributors need to focus their efforts on the production requirements. For the photo mask industry, these requirements include the control of defectivity, CD performance, and lifetime of their masks. The mask CD performance, including CD uniformity, CD targeting, and CD linearity/resolution, is predominantly determined by the photo resist performance and by the litho and etch processes. State-of-the-art chemically amplified resists exhibit an asymmetric resolution for directly and indirectly written features, which usually results in a similarly asymmetric resolution performance on the mask. This resolution gap may reach as high as multiple tens of nanometers at the mask level, depending on the chosen processes. Depending on the printing requirements of the wafer process, a reduction or even an increase of this gap may be required. A potential way of tuning via the etch process is to control the lateral CD contribution during etch. Aside from process tuning knobs like pressure, RF powers, and gases, which usually also affect CD linearity and CD uniformity, the simplest knob is the etch time itself. In the normal case, an increased over-etch time results in an increased CD contribution. We found, however, that the etch CD contribution of the ARC layer etch on EUV photo masks is reduced by longer over-etch times. Moreover, this effect can be demonstrated to be present for different etch chambers and photo resists.

  11. Mixed-signal 0.18μm CMOS and SiGe BiCMOS foundry technologies for ROIC applications

    NASA Astrophysics Data System (ADS)

    Kar-Roy, Arjun; Howard, David; Racanelli, Marco; Scott, Mike; Hurwitz, Paul; Zwingman, Robert; Chaudhry, Samir; Jordan, Scott

    2010-10-01

Today's readout integrated-circuits (ROICs) require a high level of integration of high performance analog and low power digital logic. TowerJazz offers a commercial 0.18μm CMOS technology platform for mixed-signal, RF, and high performance analog applications which can be used for ROIC applications. The commercial CA18HD dual gate oxide 1.8V/3.3V and CA18HA dual gate oxide 1.8V/5V RF/mixed-signal processes, consisting of six layers of metallization, have high density stacked linear MIM capacitors, high-value resistors, triple-well isolation, and thick top aluminum metal. The CA18HA process also has scalable drain-extended LDMOS devices, up to 40V Vds, for high-voltage sensor applications, and high-performance bipolars for low noise requirements in ROICs. Also discussed are the available features of the commercial SBC18 SiGe BiCMOS platform, with SiGe NPNs operating up to 200/200 GHz (fT/fMAX) in manufacturing and demonstrated to 270 GHz fT, for reduced noise and integrated RF capabilities that could be used in ROICs. Implementation of these technologies in a thick film SOI process for integrated RF switch and power management, and the availability of high-fT vertical PNPs to enable complementary BiCMOS (CBiCMOS) for RF-enabled ROICs, are also described in this paper.

  12. Active chatter suppression with displacement-only measurement in turning process

    NASA Astrophysics Data System (ADS)

    Ma, Haifeng; Wu, Jianhua; Yang, Liuqing; Xiong, Zhenhua

    2017-08-01

Regenerative chatter is a major hindrance to achieving high quality and high production rates in machining processes. Various active controllers have been proposed to mitigate chatter. However, most existing controllers were developed on the basis of multi-state feedback of the system, and state observers were usually needed. Moreover, model parameters of the machining process (mass, damping, and stiffness) were required by existing active controllers. In this study, an active sliding mode controller, which employs a dynamic output feedback sliding surface for the unmatched condition and an adaptive law for disturbance estimation, is designed, analyzed, and validated for chatter suppression in turning. Only displacement measurement is required by this approach; other sensors and state observers are not needed. Moreover, it facilitates rapid implementation, since the designed controller is established without using model parameters of the turning process. Theoretical analysis, numerical simulations, and experiments on a computer numerical control (CNC) lathe are presented. They show that chatter can be substantially attenuated and the chatter-free region significantly expanded with the presented method.
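The displacement-only idea can be illustrated with a toy simulation: velocity is recovered by finite-differencing the measured displacement, and a boundary-layer sliding-mode law acts on the resulting surface. This is a generic sliding-mode sketch on a one-degree-of-freedom model with invented parameters, not the paper's dynamic-output-feedback controller:

```python
import math

def mean_abs_disp(control_on, T=2.0, dt=0.001):
    # 1-DOF tool vibration model; all parameters below are
    # illustrative assumptions, not taken from the paper.
    m, b, k = 1.0, 1.0, 100.0            # mass, damping, stiffness
    lam, k_s, eta, phi = 10.0, 50.0, 10.0, 0.01  # surface slope, gains
    x, v, x_prev = 0.01, 0.0, 0.01
    total, n = 0.0, int(T / dt)
    for i in range(n):
        t = i * dt
        d = 5.0 * math.sin(10.0 * t)     # periodic cutting-force disturbance
        v_est = (x - x_prev) / dt        # velocity from displacement only
        s = lam * x + v_est              # sliding surface
        sat = max(-1.0, min(1.0, s / phi))   # boundary-layer sign()
        u = -(k_s * s + eta * sat) if control_on else 0.0
        x_prev = x
        v += (-k * x - b * v + u + d) / m * dt   # semi-implicit Euler
        x += v * dt
        if i >= n - 500:                 # average |x| over the final 0.5 s
            total += abs(x)
    return total / 500

# Forcing sits at the model's resonance, so uncontrolled vibration is large.
print(mean_abs_disp(False) > 10 * mean_abs_disp(True))
```

The controller here needs no force or velocity sensor and no observer, which is the practical appeal of displacement-only schemes; the actual controller in the paper additionally handles the unmatched-disturbance structure that this sketch ignores.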

  13. Minimization of energy and surface roughness of the products machined by milling

    NASA Astrophysics Data System (ADS)

    Belloufi, A.; Abdelkrim, M.; Bouakba, M.; Rezgui, I.

    2017-08-01

Metal cutting represents a large portion of activity in the manufacturing industries, which makes this process the largest consumer of energy. Energy consumption is an indirect source of carbon footprint, since CO2 emissions come from the production of energy; high energy consumption therefore implies high cost and a large amount of CO2 emissions. To date, much research has been done on metal cutting, but the environmental problems of the processes are rarely discussed. The right selection of cutting parameters is an effective way to reduce energy consumption because of the direct relationship between energy consumption and cutting parameters in machining processes. Therefore, one of the objectives of this research is to propose an optimization strategy suitable for machining processes (milling) to achieve the optimum cutting conditions based on the criterion of the energy consumed during milling. In this paper the problem of the energy consumed in milling is solved by a chosen optimization method. The optimization is done according to the different requirements of the roughing and finishing processes under various technological constraints.
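The parameter-selection problem described above can be posed as a small constrained search over cutting conditions. The energy model, roughness constraint, and parameter grids below are hypothetical stand-ins for the models used in the paper, kept only to show the structure of the optimization:

```python
import itertools

def energy(vc, f, ap):
    # Toy energy model (invented): a fixed-energy term that shrinks as
    # material-removal rate grows, plus a term that grows with speed.
    mrr = vc * f * ap            # vc: m/min, f: mm/tooth, ap: mm
    return 2.0e3 / mrr + 0.5 * vc

def roughness_ok(f):
    # Illustrative finishing constraint: surface roughness limits feed.
    return f <= 0.15

best = min(
    ((vc, f, ap) for vc, f, ap in itertools.product(
        [100, 150, 200, 250],        # candidate cutting speeds
        [0.05, 0.10, 0.15, 0.20],    # candidate feeds per tooth
        [0.5, 1.0, 2.0])             # candidate depths of cut
     if roughness_ok(f)),
    key=lambda p: energy(*p))
print(best)
```

Swapping in a calibrated energy model and the real roughing/finishing constraints turns this grid search into the optimization the paper performs; for larger parameter spaces a metaheuristic would replace the exhaustive enumeration.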

  14. Integrated resource scheduling in a distributed scheduling environment

    NASA Technical Reports Server (NTRS)

    Zoch, David; Hall, Gardiner

    1988-01-01

    The Space Station era presents a highly-complex multi-mission planning and scheduling environment exercised over a highly distributed system. In order to automate the scheduling process, customers require a mechanism for communicating their scheduling requirements to NASA. A request language that a remotely-located customer can use to specify his scheduling requirements to a NASA scheduler, thus automating the customer-scheduler interface, is described. This notation, Flexible Envelope-Request Notation (FERN), allows the user to completely specify his scheduling requirements such as resource usage, temporal constraints, and scheduling preferences and options. The FERN also contains mechanisms for representing schedule and resource availability information, which are used in the inter-scheduler inconsistency resolution process. Additionally, a scheduler is described that can accept these requests, process them, generate schedules, and return schedule and resource availability information to the requester. The Request-Oriented Scheduling Engine (ROSE) was designed to function either as an independent scheduler or as a scheduling element in a network of schedulers. When used in a network of schedulers, each ROSE communicates schedule and resource usage information to other schedulers via the FERN notation, enabling inconsistencies to be resolved between schedulers. Individual ROSE schedules are created by viewing the problem as a constraint satisfaction problem with a heuristically guided search strategy.
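The request-driven scheduling with backtracking that ROSE-style schedulers perform can be sketched as a toy constraint-satisfaction search: each request asks for a duration and a resource amount, and the scheduler assigns start times so that per-slot demand never exceeds capacity. The request fields and solver below are invented for illustration; they are not the FERN notation or the ROSE implementation:

```python
def schedule(requests, horizon, capacity):
    """Assign each request a start slot so per-slot demand <= capacity."""
    usage = [0] * horizon        # resource committed in each time slot
    assignment = {}

    def fits(start, req):
        return (start + req["dur"] <= horizon and
                all(usage[t] + req["amount"] <= capacity
                    for t in range(start, start + req["dur"])))

    def place(i):
        if i == len(requests):
            return True
        req = requests[i]
        for start in range(horizon):     # heuristic: earliest slot first
            if fits(start, req):
                for t in range(start, start + req["dur"]):
                    usage[t] += req["amount"]
                assignment[req["name"]] = start
                if place(i + 1):
                    return True
                for t in range(start, start + req["dur"]):
                    usage[t] -= req["amount"]     # backtrack
                del assignment[req["name"]]
        return False

    return assignment if place(0) else None

reqs = [{"name": "downlink", "dur": 3, "amount": 2},
        {"name": "imaging",  "dur": 2, "amount": 2},
        {"name": "calib",    "dur": 1, "amount": 1}]
print(schedule(reqs, horizon=5, capacity=3))
```

In a distributed setting, each scheduler would additionally publish its `usage` profile (the "resource availability information" of the abstract) so that peers can resolve inconsistencies without seeing each other's full request sets.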

  15. Small business innovation research: Program solicitation

    NASA Technical Reports Server (NTRS)

    1989-01-01

This, the seventh annual SBIR solicitation by NASA, describes the program, identifies eligibility requirements, outlines the required proposal format and content, states proposal preparation and submission requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in NASA's SBIR program. It also identifies the Technical Topics and Subtopics in which SBIR Phase 1 proposals are solicited in 1989. These Topics and Subtopics cover a broad range of current NASA interests, but do not necessarily include all areas in which NASA plans or currently conducts research. High-risk, high-payoff innovations are desired.

  16. Hot working behavior of selective laser melted and laser metal deposited Inconel 718

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Sizova, Irina

    2018-05-01

The production of Nickel-based high-temperature components is of great importance for the transport and energy sector. Forging of high-temperature alloys often requires expensive dies and multiple forming steps, and leads to forged parts with tolerances that require machining to create the final shape, along with a large amount of scrap. Additive manufacturing offers the possibility to print the desired shapes directly as net-shape components, requiring only little additional effort in machining. Especially for high-temperature alloys carrying a large amount of energy per unit mass, additive manufacturing could be more energy-efficient than forging if the energy contained in the machining scrap exceeds the energy needed for powder production and laser processing. However, the microstructure and performance of 3D-printed parts will not reach the level of forged material unless further expensive processes such as hot-isostatic pressing are used. Using the design freedom and the possibility to locally engineer material, additive manufacturing could be combined with forging operations into novel process chains, offering the possibility to reduce the number of forging steps and to create near-net-shape forgings with desired local properties. Some innovative process chains combining additive manufacturing and forging have been patented recently, but almost no scientific knowledge on the workability of 3D-printed preforms exists. The present study investigates the flow stress and microstructure evolution during hot working of preforms produced by laser powder deposition and selective laser melting (Figure 1) and puts forward a model for the flow stress.

  17. High-κ gate dielectrics: Current status and materials properties considerations

    NASA Astrophysics Data System (ADS)

    Wilk, G. D.; Wallace, R. M.; Anthony, J. M.

    2001-05-01

    Many materials systems are currently under consideration as potential replacements for SiO2 as the gate dielectric material for sub-0.1 μm complementary metal-oxide-semiconductor (CMOS) technology. A systematic consideration of the required properties of gate dielectrics indicates that the key guidelines for selecting an alternative gate dielectric are (a) permittivity, band gap, and band alignment to silicon, (b) thermodynamic stability, (c) film morphology, (d) interface quality, (e) compatibility with the current or expected materials to be used in processing for CMOS devices, (f) process compatibility, and (g) reliability. Many dielectrics appear favorable in some of these areas, but very few materials are promising with respect to all of these guidelines. A review of current work and literature in the area of alternate gate dielectrics is given. Based on reported results and fundamental considerations, the pseudobinary materials systems offer large flexibility and show the most promise toward successful integration into the expected processing conditions for future CMOS technologies, especially due to their tendency to form at interfaces with Si (e.g. silicates). These pseudobinary systems also thereby enable the use of other high-κ materials by serving as an interfacial high-κ layer. While work is ongoing, much research is still required, as it is clear that any material which is to replace SiO2 as the gate dielectric faces a formidable challenge. The requirements for process integration compatibility are remarkably demanding, and any serious candidates will emerge only through continued, intensive investigation.
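    The permittivity guideline in (a) can be made concrete with the standard equivalent-oxide-thickness (EOT) relation, sketched below; the κ value of 20 is only an illustrative figure for a typical high-κ candidate such as HfO2.

```python
# Equivalent oxide thickness (EOT): a high-k film of physical thickness t and
# permittivity k gives the same gate capacitance as an SiO2 film of thickness
# EOT = t * (3.9 / k), where 3.9 is the relative permittivity of SiO2.
K_SIO2 = 3.9

def eot_nm(t_phys_nm, k):
    return t_phys_nm * K_SIO2 / k

# A 4 nm film with k ~ 20 behaves electrically like sub-1-nm SiO2 while
# remaining physically thick enough to suppress direct-tunneling leakage.
print(round(eot_nm(4.0, 20.0), 2))  # -> 0.78
```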

  18. The research of PSD location method in micro laser welding fields

    NASA Astrophysics Data System (ADS)

    Zhang, Qiue; Zhang, Rong; Dong, Hua

    2010-11-01

    In micro laser welding, accurate location of the weld points is as important as the laser parameters themselves. This work adopts a position sensitive detector (PSD) as the core sensing element and combines it with an optical system, signal-conditioning circuits, a PC, and software processing to determine the location of the weld points. The weak-signal detection circuitry is built around the H-2476, an integrated circuit for high-speed, high-sensitivity optical range finding with strong noise immunity; combined with a digital filtering algorithm, it compensates for non-ideal factors and increases measurement precision. A programmable LTC6915 amplifier provides gain control. A two-axis stepping-motor stage drives the workbench, and the computer with its software determines the spot-weld location; clamps are designed to suit different workpieces. The system monitors the PSD output on-line while the stage moves: as the workbench travels in the X direction, the filament offset is detected dynamically, and analysis of the X-axis sampling signal is used to estimate the Y-axis direction of motion and adjust the Y-axis travel. The workbench is driven by the A3979, a stepping-motor driver with a built-in translator that is simple to operate and supports real-time adjustment by the computer. On the whole, the system meets a 20 μm positioning requirement for micro laser welding, and laser powder cladding technology is used to achieve inter-penetration welds of high quality and reliability.

  19. Dry-grind processing using amylase corn and superior yeast to reduce the exogenous enzyme requirements in bioethanol production.

    PubMed

    Kumar, Deepak; Singh, Vijay

    2016-01-01

    The conventional corn dry-grind ethanol production process requires exogenous alpha-amylase and glucoamylase enzymes to break down starch into glucose, which is fermented to ethanol by yeast. This study evaluates the potential use of new genetically engineered corn and yeast, which can eliminate or minimize the use of these external enzymes, improve the economics and process efficiencies, and simplify the process. An approach of in situ ethanol removal during fermentation was also investigated for its potential to improve the efficiency of high-solid fermentation, which can significantly reduce the downstream ethanol and co-product recovery cost. Fermentation of amylase corn (producing endogenous α-amylase) using conventional yeast and no exogenous α-amylase resulted in an ethanol concentration 4.1% higher than the control treatment (conventional corn using exogenous α-amylase). Conventional corn processed with exogenous α-amylase and superior yeast (producing glucoamylase, GA) with no exogenous glucoamylase addition resulted in an ethanol concentration similar to the control treatment (conventional yeast with exogenous glucoamylase addition). The combination of amylase corn and superior yeast required only 25% of the recommended glucoamylase dose to complete fermentation and achieve ethanol concentration and yield similar to the control treatment (conventional corn with exogenous α-amylase, conventional yeast with exogenous glucoamylase). Use of superior yeast with 50% GA addition increased yield by approximately 7% over the control treatment for both conventional and amylase corn. Combining amylase corn, superior yeast, and in situ ethanol removal resulted in a process that allowed complete fermentation of 40% slurry solids with only 50% of the exogenous GA enzyme requirement and a 64.6% higher ethanol yield than the conventional process. Use of amylase corn and superior yeast in the dry-grind processing industry can reduce total external enzyme usage by more than 80%, and combining their use with in situ removal of ethanol during fermentation allows efficient high-solid fermentation.

  20. Lightning Protection Certification for High Explosives Facilities at Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, T J; Brown, C G; Ong, M M

    2006-01-11

    Presented here is an innovation in lightning safety certification, and a description of its implementation for high-explosives processing and storage facilities at Lawrence Livermore National Laboratory. Lightning rods have proven useful in the protection of wooden structures; however, modern structures made of rebar, concrete, and the like require fresh thinking. Our process involves a rigorous and unique approach to lightning safety for modern buildings, in which the internal voltages and currents are quantified and the risk assessed. To follow are the main technical aspects of lightning protection for modern structures; these methods comply with the requirements of the National Fire Protection Association, the National Electrical Code, and the Department of Energy [1][2]. At the date of this release, we have certified over 70 HE processing and storage cells at our Site 300 facility.

  1. Noncontact conductivity and dielectric measurement for high throughput roll-to-roll nanomanufacturing

    NASA Astrophysics Data System (ADS)

    Orloff, Nathan D.; Long, Christian J.; Obrzut, Jan; Maillaud, Laurent; Mirri, Francesca; Kole, Thomas P.; McMichael, Robert D.; Pasquali, Matteo; Stranick, Stephan J.; Alexander Liddle, J.

    2015-11-01

    Advances in roll-to-roll processing of graphene and carbon nanotubes have at last led to the continuous production of high-quality coatings and filaments, ushering in a wave of applications for flexible and wearable electronics, woven fabrics, and wires. These applications often require specific electrical properties, and hence precise control over material micro- and nanostructure. While such control can be achieved, in principle, by closed-loop processing methods, there are relatively few noncontact and nondestructive options for quantifying the electrical properties of materials on a moving web at the speed required in modern nanomanufacturing. Here, we demonstrate a noncontact microwave method for measuring the dielectric constant and conductivity (or geometry for samples of known dielectric properties) of materials in a millisecond. Such measurement times are compatible with current and future industrial needs, enabling real-time materials characterization and in-line control of processing variables without disrupting production.

  2. Studies for determining thermal ion extraction potential for aluminium plasma generated by electron beam evaporator

    NASA Astrophysics Data System (ADS)

    Dileep Kumar, V.; Barnwal, Tripti A.; Mukherjee, Jaya; Gantayet, L. M.

    2010-02-01

    For effective evaporation of refractory metals, an electron beam is found to be the most suitable vapour-generation source. Using electron beams, high-throughput laser-based purification processes are carried out. But because the electron beam is highly concentrated, the vapour becomes partially ionised, and these ions lead to dilution of the pure product of the laser-based separation process. To estimate the concentration of these ions and the extraction potential required to remove them from the vapour stream, experiments were conducted using aluminium as the evaporant. The aluminium ingots were placed in a water-cooled copper crucible. Inserts were used to hold the evaporant, in order to attain a higher number density in the vapour-processing zone and also to confine the liquid metal. Parametric studies with beam power, number density, and extraction potential were conducted. In this paper we discuss the trend of thermal-ion generation and the electrostatic-field requirement for extraction.

  3. Noncontact conductivity and dielectric measurement for high throughput roll-to-roll nanomanufacturing

    PubMed Central

    Orloff, Nathan D.; Long, Christian J.; Obrzut, Jan; Maillaud, Laurent; Mirri, Francesca; Kole, Thomas P.; McMichael, Robert D.; Pasquali, Matteo; Stranick, Stephan J.; Alexander Liddle, J.

    2015-01-01

    Advances in roll-to-roll processing of graphene and carbon nanotubes have at last led to the continuous production of high-quality coatings and filaments, ushering in a wave of applications for flexible and wearable electronics, woven fabrics, and wires. These applications often require specific electrical properties, and hence precise control over material micro- and nanostructure. While such control can be achieved, in principle, by closed-loop processing methods, there are relatively few noncontact and nondestructive options for quantifying the electrical properties of materials on a moving web at the speed required in modern nanomanufacturing. Here, we demonstrate a noncontact microwave method for measuring the dielectric constant and conductivity (or geometry for samples of known dielectric properties) of materials in a millisecond. Such measurement times are compatible with current and future industrial needs, enabling real-time materials characterization and in-line control of processing variables without disrupting production. PMID:26592441

  4. Dense wavelength division multiplexing devices for metropolitan-area datacom and telecom networks

    NASA Astrophysics Data System (ADS)

    DeCusatis, Casimer M.; Priest, David G.

    2000-12-01

    Large data processing environments in use today can require multi-gigabyte or terabyte capacity in the data communication infrastructure; these requirements are being driven by storage area networks with access to petabyte databases, new architectures for parallel processing that require high-bandwidth optical links, and rapidly growing network applications such as electronic commerce over the Internet or virtual private networks. These datacom applications require high availability, fault tolerance, security, and the capacity to recover from any single point of failure without relying on traditional SONET-based networking. These requirements, coupled with fiber exhaust in metropolitan areas, are driving the introduction of dense optical wavelength division multiplexing (DWDM) in data communication systems, particularly for large enterprise servers or mainframes. In this paper, we examine the technical requirements for emerging next-generation DWDM systems. Protocols for storage area networks and computer architectures such as Parallel Sysplex are presented, including their fiber bandwidth requirements. We then describe two commercially available DWDM solutions, a first-generation 10-channel system and a recently announced next-generation 32-channel system. Technical requirements, network management and security, fault-tolerant network designs, new network topologies enabled by DWDM, and the role of time division multiplexing in the network are all discussed. Finally, we present a description of testing conducted on these networks and future directions for this technology.

  5. Electrically Conductive Polyimide Films Containing Gold Surface

    NASA Technical Reports Server (NTRS)

    Caplan, Maggie L.; Stoakley, Diane M.; St. Clair, Anne K.

    1994-01-01

    Polyimide films exhibiting high thermo-oxidative stability and including electrically conductive surface layers containing gold made by casting process. Many variations of basic process conditions, ingredients, and sequence of operations possible, and not all resulting versions of process yield electrically conductive films. Gold-containing layer formed on film surface during cure. These metallic gold-containing polyimides used in film and coating applications requiring electrical conductivity, high reflectivity, exceptional thermal stability, and/or mechanical integrity. They also find commercial potential in areas ranging from thin films for satellite antennas to decorative coatings and packaging.

  6. The Importance of Water for High Fidelity Information Processing and for Life

    NASA Technical Reports Server (NTRS)

    Hoehler, Tori M.; Pohorille, Andrew

    2011-01-01

    Is water an absolute prerequisite for life? Life depends on a variety of non-covalent interactions among molecules, the nature of which is determined as much by the solvent in which they occur as by the molecules themselves. Catalysis and information processing, two essential functions of life, require non-covalent molecular recognition with very high specificity. For example, to correctly reproduce a string consisting of 600,000 units of information (e.g., 600 kilobases, equivalent to the genome of the smallest free-living terrestrial organisms) with a 90% success rate requires specificity > 10^7:1 for the target molecule vs. incorrect alternatives. Such specificity requires (i) that the correct molecular association is energetically stabilized by at least 40 kJ/mol relative to alternatives, and (ii) that the system is able to sample among possible states (alternative molecular associations) rapidly enough to allow the system to fall under thermodynamic control and express the energetic stabilization. We argue that electrostatic interactions are required to confer the necessary energetic stabilization vs. a large library of molecular alternatives, and that a solvent with polarity and dielectric properties comparable to water is required for the system to sample among possible states and express thermodynamic control. Electrostatic associations can be made in non-polar solvents, but the resulting complexes are too stable to be "unmade" with sufficient frequency to confer thermodynamic control on the system. An electrostatic molecular complex representing 3 units of information (e.g., 3 base pairs) with specificity > 10^7 per unit has a stability in non-polar solvent comparable to that of a carbon-carbon bond at room temperature. These considerations suggest that water, or a solvent with properties very like water, is necessary to support high-fidelity information processing, and can therefore be considered a critical prerequisite for life.
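    The abstract's arithmetic can be checked directly; this is a minimal sketch assuming only the stated figures (600,000 units, 90% overall success rate, room temperature).

```python
import math

# Copying N = 600,000 units with 90% overall success bounds the tolerable
# per-unit error rate, and hence the required molecular discrimination.
N = 600_000
per_unit_fidelity = 0.9 ** (1 / N)
error_rate = 1 - per_unit_fidelity
specificity = 1 / error_rate           # required correct:incorrect ratio
print(f"{specificity:.2e}")            # -> ~5.7e6, i.e. on the order of 10^7

# The corresponding free-energy gap at room temperature, dG = RT ln(specificity):
R, T = 8.314, 298.0                    # J/(mol*K), K
dG_kJ = R * T * math.log(specificity) / 1000
print(round(dG_kJ, 1))                 # -> ~38.5, close to the 40 kJ/mol quoted
```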

  7. Faculty Return to Industry. Final Report.

    ERIC Educational Resources Information Center

    Hulse, David R.

    Lagging productivity and increased competition from abroad are forcing U.S. companies to adopt new management styles that promote efficiency and quality in production. High on the agenda is the incorporation of Statistical Process Control (SPC). SPC requires definition of process and product specifications to facilitate their…

  8. A 48Cycles/MB H.264/AVC Deblocking Filter Architecture for Ultra High Definition Applications

    NASA Astrophysics Data System (ADS)

    Zhou, Dajiang; Zhou, Jinjia; Zhu, Jiayi; Goto, Satoshi

    In this paper, a highly parallel deblocking filter architecture for H.264/AVC is proposed to process one macroblock in 48 clock cycles and give real-time support to QFHD@60fps sequences at less than 100MHz. 4 edge filters organized in 2 groups for simultaneously processing vertical and horizontal edges are applied in this architecture to enhance its throughput. While parallelism increases, pipeline hazards arise owing to the latency of edge filters and data dependency of deblocking algorithm. To solve this problem, a zig-zag processing schedule is proposed to eliminate the pipeline bubbles. Data path of the architecture is then derived according to the processing schedule and optimized through data flow merging, so as to minimize the cost of logic and internal buffer. Meanwhile, the architecture's data input rate is designed to be identical to its throughput, while the transmission order of input data can also match the zig-zag processing schedule. Therefore no intercommunication buffer is required between the deblocking filter and its previous component for speed matching or data reordering. As a result, only one 24×64 two-port SRAM as internal buffer is required in this design. When synthesized with SMIC 130nm process, the architecture costs a gate count of 30.2k, which is competitive considering its high performance.
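    The clock-frequency claim follows from simple arithmetic, sketched here under the stated assumptions (QFHD = 3840×2160, 16×16 macroblocks, 48 cycles per macroblock, 60 fps).

```python
# Back-of-envelope check of the throughput claim: QFHD (3840x2160) at 60 fps,
# with one 16x16 macroblock deblocked every 48 clock cycles.
mb_per_frame = (3840 // 16) * (2160 // 16)     # 240 * 135 = 32,400 macroblocks
cycles_per_sec = mb_per_frame * 60 * 48
print(f"{cycles_per_sec / 1e6:.1f} MHz")       # -> 93.3 MHz, under the 100 MHz budget
```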

  9. Solution-Processed Cu 2Se Nanocrystal Films with Bulk-Like Thermoelectric Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, Jason D.; Lynch, Jared J.; Coates, Nelson E.

    Thermoelectric power generation can play a key role in a sustainable energy future by converting waste heat from power plants and other industrial processes into usable electrical power. Current thermoelectric devices, however, require energy intensive manufacturing processes such as alloying and spark plasma sintering. Here, we describe the fabrication of a p-type thermoelectric material, copper selenide (Cu2Se), utilizing solution-processing and thermal annealing to produce a thin film that achieves a figure of merit, ZT, which is as high as its traditionally processed counterpart, a value of 0.14 at room temperature. This is the first report of a fully solution-processed nanomaterial achieving performance equivalent to its bulk form and represents a general strategy to reduce the energy required to manufacture advanced energy conversion and harvesting materials.

  10. Solution-Processed Cu 2Se Nanocrystal Films with Bulk-Like Thermoelectric Performance

    DOE PAGES

    Forster, Jason D.; Lynch, Jared J.; Coates, Nelson E.; ...

    2017-06-05

    Thermoelectric power generation can play a key role in a sustainable energy future by converting waste heat from power plants and other industrial processes into usable electrical power. Current thermoelectric devices, however, require energy intensive manufacturing processes such as alloying and spark plasma sintering. Here, we describe the fabrication of a p-type thermoelectric material, copper selenide (Cu2Se), utilizing solution-processing and thermal annealing to produce a thin film that achieves a figure of merit, ZT, which is as high as its traditionally processed counterpart, a value of 0.14 at room temperature. This is the first report of a fully solution-processed nanomaterial achieving performance equivalent to its bulk form and represents a general strategy to reduce the energy required to manufacture advanced energy conversion and harvesting materials.

  11. A Highly Flexible, Automated System Providing Reliable Sample Preparation in Element- and Structure-Specific Measurements.

    PubMed

    Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin

    2016-10-01

    Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element- and structure-specific measurements, automated platforms are not suitable for analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high flexibility in performance. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality was confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise than the manual procedure, and it was as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.

  12. Influences of diesel pilot injection on ethanol autoignition - a numerical analysis

    NASA Astrophysics Data System (ADS)

    Burnete, N. V.; Burnete, N.; Jurchis, B.; Iclodean, C.

    2017-10-01

    The aim of this study is to highlight the influences of the diesel pilot quantity, as well as its timing, on the autoignition of ethanol and the pollutant emissions resulting from the combustion process. The combustion concept presented in this paper requires the injection of a small quantity of diesel fuel in order to create the required autoignition conditions for ethanol. The combustion of the diesel droplets injected into the combustion chamber leads to the creation of high-temperature locations that favour the autoignition of ethanol. However, due to the high vaporization enthalpy of ethanol and its better distribution inside the combustion chamber, the peak temperature values are reduced. Because of these lower temperatures and the high burning velocity of ethanol (combined with the fact that there are multiple ignition sources), the conditions required for the formation of nitric oxides are no longer achieved, leading to significantly lower NOx emissions. In this way the benefits of the diesel engine and of constant-volume combustion are combined to enable a more efficient and environmentally friendly combustion process.

  13. Rapid doubling of the critical current of YBa 2Cu 3O 7-δ coated conductors for viable high-speed industrial processing

    DOE PAGES

    Leroux, M.; Kihlstrom, K. J.; Holleis, S.; ...

    2015-11-09

    Here, we demonstrate that 3.5-MeV oxygen irradiation can markedly enhance the in-field critical current of commercial second-generation superconducting tapes with an exposure time of just 1 s per 0.8 cm². This speed is at the level required for industrial reel-to-reel post-processing. The irradiation is made on production-line samples through the protective silver coating and does not require any modification of the growth process. From TEM imaging, we identify small clusters as the main source of increased vortex pinning.

  14. Low-Energy Water Recovery from Subsurface Brines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Young Chul; Kim, Gyu Dong; Hendren, Zachary

    A novel non-aqueous phase solvent (NAS) desalination process was proposed and developed in this research project. The NAS desalination process uses less energy than thermal processes, doesn't require any additional chemicals for precipitation, and can be utilized to treat high-TDS brine. In this project, our experimental work determined that water solubility changes and selective absorption are the key characteristics of NAS technology for successful desalination. Three NAS desalination mechanisms were investigated: (1) CO2 switchable, (2) high-temp absorption to low-temp desorption (thermally switchable), and (3) low-temp absorption to high-temp desorption (thermally switchable). Among these mechanisms, thermally switchable (low-temp absorption to high-temp desorption) showed the highest water recovery and relatively high salt rejection. A test procedure for a semi-continuous, bench-scale NAS desalination process was also developed and used to assess performance under a range of conditions.

  15. Higher biodiversity is required to sustain multiple ecosystem processes across temperature regimes

    PubMed Central

    Perkins, Daniel M; Bailey, R A; Dossena, Matteo; Gamfeldt, Lars; Reiss, Julia; Trimmer, Mark; Woodward, Guy

    2015-01-01

    Biodiversity loss is occurring rapidly worldwide, yet it is uncertain whether few or many species are required to sustain ecosystem functioning in the face of environmental change. The importance of biodiversity might be enhanced when multiple ecosystem processes (termed multifunctionality) and environmental contexts are considered, yet no studies have quantified this explicitly to date. We measured five key processes and their combined multifunctionality at three temperatures (5, 10 and 15 °C) in freshwater aquaria containing different animal assemblages (1–4 benthic macroinvertebrate species). For single processes, biodiversity effects were weak and were best predicted by additive-based models, i.e. polyculture performances represented the sum of their monoculture parts. There were, however, significant effects of biodiversity on multifunctionality at the low and the high (but not the intermediate) temperature. Variation in the contribution of species to processes across temperatures meant that greater biodiversity was required to sustain multifunctionality across different temperatures than was the case for single processes. This suggests that previous studies might have underestimated the importance of biodiversity in sustaining ecosystem functioning in a changing environment. PMID:25131335

  16. Combined heat treatment and acid hydrolysis of cassava grate waste (CGW) biomass for ethanol production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agu, R.C.; Amadife, A.E.; Ude, C.M.

    1997-12-31

    The effect of combined heat treatment and acid hydrolysis (various concentrations) on cassava grate waste (CGW) biomass for ethanol production was investigated. At high concentrations of H2SO4 (1-5 M), hydrolysis of the CGW biomass was achieved but with excessive charring or dehydration reaction. At lower acid concentrations, hydrolysis of CGW biomass was also achieved with 0.3-0.5 M H2SO4, while partial hydrolysis was obtained below 0.3 M H2SO4 (the lowest acid concentration that hydrolyzed CGW biomass) at 120 °C and 1 atm pressure for 30 min. A 60% process efficiency was achieved with 0.3 M H2SO4 in hydrolyzing the cellulose and lignin materials present in the CGW biomass. High acid concentration is therefore not required for CGW biomass hydrolysis. The low acid concentration required for CGW biomass hydrolysis, as well as the minimal cost required for detoxification of CGW biomass because of its low hydrogen cyanide content, would seem to make this process very economical. From three liters of the CGW biomass hydrolysate obtained from hydrolysis with 0.3 M H2SO4, the ethanol yield was 3.5% (v/v) after yeast fermentation. However, although the process resulted in gainful utilization of CGW biomass, additional costs would be required to effectively dispose of new by-products generated from CGW biomass processing.

  17. Design analysis of levitation facility for space processing applications. [Skylab program, space shuttles

    NASA Technical Reports Server (NTRS)

    Frost, R. T.; Kornrumpf, W. P.; Napaluch, L. J.; Harden, J. D., Jr.; Walden, J. P.; Stockhoff, E. H.; Wouch, G.; Walker, L. H.

    1974-01-01

    Containerless processing facilities for the space laboratory and space shuttle are defined. Materials process examples representative of the most severe requirements for the facility in terms of electrical power, radio frequency equipment, and the use of an auxiliary electron beam heater were used to discuss matters having the greatest effect upon the space shuttle pallet payload interfaces and envelopes. Improved weight, volume, and efficiency estimates for the RF generating equipment were derived. The results are particularly significant because of the reduced requirements for heat rejection from electrical equipment, one of the principal envelope problems for shuttle pallet payloads. It is shown that although experiments on containerless melting of high-temperature refractory materials make it desirable to consider the highest peak powers which can be made available on the pallet, total energy requirements are kept relatively low by the very fast processing times typical of containerless experiments; this allows heat-rejection capabilities lower than the peak power demand to be considered if energy storage in system heat capacitances is taken into account. Batteries are considered to avoid a requirement for fuel cells capable of furnishing this brief peak power demand.

  18. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
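    The Petri-net semantics underlying such process models can be sketched with a minimal place/transition net. This is a generic illustration, not the XML-net formalism itself; the "order"/"clerk" process names are hypothetical.

```python
# Minimal place/transition Petri net: a transition is a (pre, post) pair of
# place->token-count maps; it may fire when every input place holds enough tokens.
def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    pre, post = transition
    assert enabled(marking, transition), "transition not enabled"
    m = dict(marking)
    for p, n in pre.items():        # consume input tokens
        m[p] -= n
    for p, n in post.items():       # produce output tokens
        m[p] = m.get(p, 0) + n
    return m

# Toy business process: an order and a free clerk are consumed;
# a processed order is produced and the clerk is released.
process_order = ({"order": 1, "clerk": 1}, {"done": 1, "clerk": 1})
m0 = {"order": 2, "clerk": 1}
m1 = fire(m0, process_order)
print(m1)  # -> {'order': 1, 'clerk': 1, 'done': 1}
```

XML nets extend this basic scheme by letting tokens carry structured XML documents rather than being indistinguishable marks, which is what makes them suitable for modeling document flows in POIS.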

  19. In-Space Propulsion Assessment Processes and Criteria for Affordable Systems

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar; Rhodes, Russel

    1999-01-01

    In a world of high launch costs to Low Earth Orbit (LEO), and of costs nearly twice as high to Geosynchronous Earth Orbit (GEO), it is clear that processes and criteria are required which will surface the path to greater affordability. Further, with propulsion systems making up a major part of the systems placed into multiple orbits, or beyond, it is clear that addressing propulsion systems for in-space propulsion (ISP) is a key part of breaking the barriers to affordable systems. While multitudes of Earth-to-orbit transportation system efforts focus on reduced costs, the often neglected costs and related interactions of the in-space system equally require improvements that will enable broad end-to-end customer affordability.

  20. Process for fabrication of large titanium diboride ceramic bodies

    DOEpatents

    Moorhead, Arthur J.; Bomar, E. S.; Becher, Paul F.

    1989-01-01

    A process for manufacturing large, fully dense, high purity TiB2 articles by pressing powders with a sintering aid at relatively low temperatures to reduce grain growth. The process requires stringent temperature and pressure applications in the hot-pressing step to ensure maximum removal of sintering aid and to avoid damage to the fabricated article or the die.

  1. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1995-09-01

    vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems

  2. Laboratory Powder Metallurgy Makes Tough Aluminum Sheet

    NASA Technical Reports Server (NTRS)

    Royster, D. M.; Thomas, J. R.; Singleton, O. R.

    1993-01-01

    Aluminum alloy sheet exhibits high tensile and Kahn tear strengths. Rapid solidification of aluminum alloys in powder form and subsequent consolidation and fabrication processes used to tailor parts made of these alloys to satisfy such specific aerospace design requirements as high strength and toughness.

  3. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  4. Development and Testing of High Surface Area Iridium Anodes for Molten Oxide Electrolysis

    NASA Technical Reports Server (NTRS)

    Shchetkovskiy, Anatoliy; McKechnie, Timothy; Sadoway, Donald R.; Paramore, James; Melendez, Orlando; Curreri, Peter A.

    2010-01-01

    Processing of lunar regolith into oxygen for habitat and propulsion is needed to support future space missions. Direct electrochemical reduction of molten regolith is an attractive method of processing, because no additional chemical reagents are needed. The electrochemical processing of molten oxides requires high-surface-area, inert anodes. Such electrodes need to be structurally robust at elevated temperatures (1400-1600 °C), be resistant to thermal shock, have good electrical conductivity, be resistant to attack by molten oxide (silicate), be electrochemically stable, and support high current density. Iridium, with its high melting point, good oxidation resistance, and superior high-temperature strength and ductility, is the most promising candidate for anodes in high-temperature electrochemical processes. Several innovative concepts for manufacturing such anodes by electrodeposition of iridium from a molten salt electrolyte (the EL-Form process) were evaluated. Iridium electrodeposition to form complex-shaped components and coatings was investigated. Iridium-coated graphite, porous iridium structures, and solid iridium anodes were fabricated. Testing of electroformed iridium anodes shows no visible degradation. The results of the development, manufacture, and testing of high-surface-area, inert iridium anodes will be presented.

  5. Development and Testing of High Surface Area Iridium Anodes for Molten Oxide Electrolysis

    NASA Technical Reports Server (NTRS)

    Shchetkovskiy, Anatoliy; McKechnie, Timothy; Sadoway, Donald R.; Paramore, James; Melendez, Orlando; Curreri, Peter A.

    2010-01-01

    Processing of lunar regolith into oxygen for habitat and propulsion is needed to support future space missions. Direct electrochemical reduction of molten regolith is an attractive method of processing, because no additional chemical reagents are needed. The electrochemical processing of molten oxides requires high-surface-area, inert anodes. Such electrodes need to be structurally robust at elevated temperatures (1400-1600 °C), be resistant to thermal shock, have good electrical conductivity, be resistant to attack by molten oxide (silicate), be electrochemically stable, and support high current density. Iridium, with its high melting point, good oxidation resistance, and superior high-temperature strength and ductility, is the most promising candidate for anodes in high-temperature electrochemical processes. Several innovative concepts for manufacturing such anodes by electrodeposition of iridium from a molten salt electrolyte (the EL-Form process) were evaluated. Iridium electrodeposition to form complex-shaped components and coatings was investigated. Iridium-coated graphite, porous iridium structures, and solid iridium anodes were fabricated. Testing of electroformed iridium anodes shows no visible degradation. The results of the development, manufacture, and testing of high-surface-area, inert iridium anodes will be presented.

  6. Optimization of dual-energy subtraction chest radiography by use of a direct-conversion flat-panel detector system.

    PubMed

    Fukao, Mari; Kawamoto, Kiyosumi; Matsuzawa, Hiroaki; Honda, Osamu; Iwaki, Takeshi; Doi, Tsukasa

    2015-01-01

    We aimed to optimize the exposure conditions in the acquisition of soft-tissue images using dual-energy subtraction chest radiography with a direct-conversion flat-panel detector system. Two separate chest images were acquired at high- and low-energy exposures with standard or thick chest phantoms. The high-energy exposure was fixed at 120 kVp with the use of an auto-exposure control technique. For the low-energy exposure, the tube voltages ranged from 40 to 80 kVp and the entrance surface doses from 20% to 100% of the dose required for the high-energy exposure. Further, a repetitive processing algorithm was used to reduce the image noise generated by the subtraction process. Seven radiology technicians ranked soft-tissue images, and these results were analyzed using the normalized-rank method. Images acquired at 60 kVp were of acceptable quality regardless of the entrance surface dose and phantom size. Using the repetitive processing algorithm, the minimum acceptable dose was reduced from 75% to 40% for the standard phantom and to 50% for the thick phantom. We determined that the optimum low-energy exposure was 60 kVp at 50% of the dose required for the high-energy exposure. This allowed the simultaneous acquisition of standard radiographs and soft-tissue images at 1.5 times the dose required for a standard radiograph, which is significantly lower than the values reported previously.
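Dual-energy soft-tissue imaging is conventionally a weighted log subtraction of the two exposures. The sketch below assumes that standard formula; the weight, the random arrays standing in for detector images, and the iterated box smoothing (a crude stand-in for the repetitive noise-reduction algorithm) are all illustrative, not the paper's actual processing:

```python
import numpy as np

# Weighted log subtraction: S = log(I_high) - w * log(I_low).
# In practice w is calibrated so the bone signal cancels; 0.5 is illustrative.
rng = np.random.default_rng(0)
high = rng.uniform(0.2, 1.0, size=(64, 64))   # stand-in high-energy image
low = rng.uniform(0.2, 1.0, size=(64, 64))    # stand-in low-energy image

w = 0.5
soft_tissue = np.log(high) - w * np.log(low)

# Crude stand-in for repetitive noise reduction: iterated 3x3 box smoothing
# of the subtraction image (which is noisier than either input).
img = soft_tissue
for _ in range(3):
    padded = np.pad(img, 1, mode="edge")
    img = sum(padded[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9.0
```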

  7. The distinguishing signature of Magnetic Penrose Process

    NASA Astrophysics Data System (ADS)

    Dadhich, Naresh; Tursunov, Arman; Ahmedov, Bobomurat; Stuchlík, Zdeněk

    2018-04-01

    In this Letter, we wish to point out that the distinguishing feature of the Magnetic Penrose process (MPP) is its super-high efficiency, exceeding 100% (established in the mid-1980s for discrete particle accretion), in electromagnetically extracting the rotational energy of a rotating black hole for a magnetic field of milligauss order. Another similar process, also driven by the electromagnetic field, is the Blandford-Znajek mechanism (BZ), which can be envisaged as the high-magnetic-field limit of MPP, as it requires a threshold magnetic field of order 10^4 G. Recent simulation studies of fully relativistic magnetohydrodynamic flows have borne out the super-high-efficiency signature of the process in the high-magnetic-field regime, viz. BZ. We would like to make a clear prediction that similar simulation studies of MHD flows in the low-magnetic-field regime, where BZ would be inoperative, would also show super efficiency.
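For contrast, the purely mechanical Penrose process (no electromagnetic field) has a well-known efficiency bound for an extremal Kerr black hole, which is why efficiencies exceeding 100% are the distinguishing signature of the magnetic variant:

```latex
% Standard bound on the purely mechanical Penrose process for an
% extremal Kerr black hole (quoted here only for contrast with MPP):
\eta_{\max}^{\mathrm{PP}} \;=\; 1 - \frac{1}{\sqrt{2}} \;\approx\; 0.207
```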

  8. Fabrication and processing of high-strength densely packed carbon nanotube yarns without solution processes.

    PubMed

    Liu, Kai; Zhu, Feng; Liu, Liang; Sun, Yinghui; Fan, Shoushan; Jiang, Kaili

    2012-06-07

    Defects in carbon nanotubes, weak tube-tube interactions, and weak carbon nanotube joints are bottlenecks for obtaining high-strength carbon nanotube yarns, and solution processes are usually required to overcome these drawbacks. Here we fabricate ultra-long and densely packed pure carbon nanotube yarns with a two-rotator twisting setup aided by tensioning rods. The densely packed structure enhances the tube-tube interactions, raising the tensile strength of the yarns to 1.6 GPa. We further use a sweeping laser to thermally treat the as-produced yarns, healing defects in the carbon nanotubes and possibly welding carbon nanotube joints, which improves their Young's modulus by up to ∼70%. The spinning and laser-sweeping processes are solution-free and can be assembled together to produce high-strength yarns continuously as desired.

  9. High-School Buildings and Grounds. Bulletin, 1922, No. 23

    ERIC Educational Resources Information Center

    Bureau of Education, Department of the Interior, 1922

    1922-01-01

    The success of any high school depends largely upon the planning of its building. The wise planning of a high-school building requires familiarity with school needs and processes, knowledge of the best approved methods of safety, lighting, sanitation, and ventilation, and ability to solve the educational, structural, and architectural problems…

  10. Rheology and extrusion of high-solids biomass

    Treesearch

    Tim Scott; Joseph R. Samaniuk; Daniel J. Klingenberg

    2011-01-01

    Economical biorefining of lignocellulosic biomass (LCB) requires processing high-solids particulate streams. We have developed new techniques and testing protocols to measure the rheological properties of high-solids LCB using a modified torque rheometer (TR). The flow field in the TR is similar to that of a twin-screw extruder and for modeling purposes can be...

  11. A new sono-electrochemical method for enhanced detoxification of hydrophilic chloroorganic pollutants in water.

    PubMed

    Yasman, Yakov; Bulatov, Valery; Gridin, Vladimir V; Agur, Sabina; Galil, Noah; Armon, Robert; Schechter, Israel

    2004-09-01

    A new method for detoxification of hydrophilic chloroorganic pollutants in effluent water was developed, using a combination of ultrasound waves, electrochemistry, and Fenton's reagent. The advantages of the method are exemplified using two target compounds: the common herbicide 2,4-dichlorophenoxyacetic acid (2,4-D) and its derivative 2,4-dichlorophenol (2,4-DCP). The high degradation power of this process is due to the large production of oxidizing hydroxyl radicals and the high mass transfer induced by sonication. Application of this sono-electrochemical Fenton (SEF) treatment (at 20 kHz), with quite a small current density, accomplished almost 50% oxidation of a 2,4-D solution (300 ppm, 1.2 mM) in just 60 s. Similar treatments run for 600 s resulted in practically full degradation of the herbicide; sizable oxidation of 2,4-DCP also occurs. The main intermediate compounds produced in the SEF process were identified, their kinetic profiles were measured, and a chemical reaction scheme was suggested. The efficiency of the SEF process is considerably higher than that of the reference degradation methods, and the time required for full degradation is considerably shorter. The SEF process maintains high performance up to concentrations higher than those handled by reference methods. The optimum concentration of Fe2+ ions required for this process was found to be about 2 mM, which is lower than that in reference techniques. These findings indicate that the SEF process may be an effective method for detoxification of environmental water.
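Reading the two reported data points with an assumed pseudo-first-order rate law (the abstract does not state the kinetic order) reproduces the observed near-complete degradation at 600 s:

```python
import math

# Assumption: pseudo-first-order kinetics for 2,4-D oxidation.
# ~50% conversion at 60 s fixes the rate constant k = ln(2)/60.
k = math.log(2) / 60.0                 # s^-1

# Extrapolating the same rate law to the 600 s treatment:
remaining_600s = math.exp(-k * 600.0)  # fraction of 2,4-D left (~0.1%)
degraded_600s = 1.0 - remaining_600s   # ~99.9%, i.e. practically full degradation
```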

  12. A high-throughput system for high-quality tomographic reconstruction of large datasets at Diamond Light Source

    PubMed Central

    Atwood, Robert C.; Bodey, Andrew J.; Price, Stephen W. T.; Basham, Mark; Drakopoulos, Michael

    2015-01-01

    Tomographic datasets collected at synchrotrons are becoming very large and complex, and, therefore, need to be managed efficiently. Raw images may have high pixel counts, and each pixel can be multidimensional and associated with additional data such as those derived from spectroscopy. In time-resolved studies, hundreds of tomographic datasets can be collected in sequence, yielding terabytes of data. Users of tomographic beamlines are drawn from various scientific disciplines, and many are keen to use tomographic reconstruction software that does not require a deep understanding of reconstruction principles. We have developed Savu, a reconstruction pipeline that enables users to rapidly reconstruct data to consistently create high-quality results. Savu is designed to work in an ‘orthogonal’ fashion, meaning that data can be converted between projection and sinogram space throughout the processing workflow as required. The Savu pipeline is modular and allows processing strategies to be optimized for users' purposes. In addition to the reconstruction algorithms themselves, it can include modules for identification of experimental problems, artefact correction, general image processing and data quality assessment. Savu is open source, open licensed and ‘facility-independent’: it can run on standard cluster infrastructure at any institution. PMID:25939626
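For a parallel-beam dataset, the projection/sinogram duality that Savu's 'orthogonal' design exploits amounts to re-slicing one 3-D array along different axes. The array layout below is a common convention, not necessarily Savu's internal one:

```python
import numpy as np

# Synthetic stack indexed as (angle, detector_row, detector_column):
# slicing along axis 0 gives projections (radiographs); slicing along
# axis 1 gives sinograms (one per reconstructed slice).
n_angles, n_rows, n_cols = 180, 32, 64
data = np.arange(n_angles * n_rows * n_cols, dtype=float).reshape(
    n_angles, n_rows, n_cols)

projection_0 = data[0]        # one radiograph: shape (rows, cols)
sinogram_0 = data[:, 0, :]    # sinogram of the top slice: shape (angles, cols)
```

No data is copied in either slice; "converting between projection and sinogram space" for such a stack is a change of view, which is what makes mixing plugins from both spaces in one pipeline cheap.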

  13. Context-Sensitive Adjustment of Cognitive Control in Dual-Task Performance

    ERIC Educational Resources Information Center

    Fischer, Rico; Gottschalk, Caroline; Dreisbach, Gesine

    2014-01-01

    Performing 2 highly similar tasks at the same time requires an adaptive regulation of cognitive control to shield prioritized primary task processing from between-task (cross-talk) interference caused by secondary task processing. In the present study, the authors investigated how implicitly and explicitly delivered information promotes the…

  14. Hydrogen suppression of 'ductile' processes

    NASA Technical Reports Server (NTRS)

    Sisson, R. D., Jr.; Wilson, J. H.; Adler, T. A.; Mcnitt, R. P.; Louthan, M. R., Jr.

    1980-01-01

    Experimental results are reported for torsional fatigue specimens of high-strength steel 4370 and tensile bars of mild steel A-106 which present evidence of a hydrogen-induced strain-aided hardening effect. These results are consistent with the postulate that hydrogen suppresses ductile processes required for crack initiation at large plastic strains.

  15. A Winning Transition Plan

    ERIC Educational Resources Information Center

    Moeder-Chandler, Markus

    2014-01-01

    Helping high school athletes navigate the college recruitment process requires some extra steps. This article assists school counselors in the athletic identification process with support provided for both the student and parents. Also covered is how the recruitment criteria for a college and team works. The role of counselor places them on the…

  16. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida : [summary].

    DOT National Transportation Integrated Search

    2014-03-01

    Similar to an ill patient, road safety issues can also be diagnosed, if the right tools are available. Statistics on roadway incidents can locate areas that have a high rate of incidents and require a solution, such as better signage, lightin...

  17. Integrated tools and techniques applied to the TES ground data system

    NASA Technical Reports Server (NTRS)

    Morrison, B. A.

    2000-01-01

    The author of this paper will discuss the selection of CASE tools, a decision-making process, requirements tracking, and a review mechanism that lead to a highly integrated approach to software development, which must deal with the constant pressure to change software requirements and design that is associated with research and development.

  18. Perceived Challenges to Integrating Reading Strategies in Content Areas: A Single Case Study

    ERIC Educational Resources Information Center

    Pezzolla, Karen

    2017-01-01

    An alarming percentage of middle and high school students find themselves unable to read their textbooks at grade level proficiency; lacking the necessary skills to access and process information, and read critically. The Common Core State Standards require students to apply reading strategies across the curriculum, therefore requiring teachers to…

  19. Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI

    NASA Technical Reports Server (NTRS)

    Gulkis, S.

    1989-01-01

    The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster-scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film, which is unsuitable for the real-time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.
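The Folded Spectrum idea can be sketched numerically: raster-scan the long 1-D signal into a 2-D array and take a 2-D FFT, whereupon a single tone concentrates into a localized peak whose row and column indices together encode its frequency. The sizes and test tone below are illustrative, far smaller than SETI's 500 MHz / 20 Hz requirement:

```python
import numpy as np

# Raster-scan a length rows*cols signal into a (rows, cols) array and take
# a 2-D FFT. For a complex tone at bin k of the full 1-D FFT, the peak's
# row index is k mod rows (fine frequency) and its column index is near
# k / rows (coarse frequency).
rows, cols = 64, 128
n = rows * cols
k = 1000                              # tone at bin 1000 of the 1-D spectrum
t = np.arange(n)
signal = np.exp(2j * np.pi * k * t / n)

folded = np.fft.fft2(signal.reshape(rows, cols))
peak = np.unravel_index(np.argmax(np.abs(folded)), folded.shape)
# peak[0] == k % rows; peak[1] falls at the nearest bin to k / rows.
```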

  20. Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI.

    PubMed

    Gulkis, S

    1989-01-01

    The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster-scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film, which is unsuitable for the real-time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.

  1. Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI

    NASA Astrophysics Data System (ADS)

    Gulkis, Samuel

    The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster-scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film, which is unsuitable for the real-time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.

  2. Using IT to improve quality at NewYork-Presbyterian Hospital: a requirements-driven strategic planning process.

    PubMed

    Kuperman, Gilad J; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality.

  3. Recent Advances in Food Processing Using High Hydrostatic Pressure Technology.

    PubMed

    Wang, Chung-Yi; Huang, Hsiao-Wen; Hsu, Chiao-Ping; Yang, Binghuei Barry

    2016-01-01

    High hydrostatic pressure is an emerging non-thermal technology that can achieve the same standards of food safety as those of heat pasteurization and meet consumer requirements for fresher tasting, minimally processed foods. Applying high-pressure processing can inactivate pathogenic and spoilage microorganisms and enzymes, as well as modify structures with little or no effects on the nutritional and sensory quality of foods. The U.S. Food and Drug Administration (FDA) and the U.S. Department of Agriculture (USDA) have approved the use of high-pressure processing (HPP), which is a reliable technological alternative to conventional heat pasteurization in food-processing procedures. This paper presents the current applications of HPP in processing fruits, vegetables, meats, seafood, dairy, and egg products; such applications include the combination of pressure and biopreservation to generate specific characteristics in certain products. In addition, this paper describes recent findings on the microbiological, chemical, and molecular aspects of HPP technology used in commercial and research applications.

  4. Using IT to Improve Quality at NewYork-Presbyterian Hospital: A Requirements-Driven Strategic Planning Process

    PubMed Central

    Kuperman, Gilad J.; Boyer, Aurelia; Cole, Curt; Forman, Bruce; Stetson, Peter D.; Cooper, Mary

    2006-01-01

    At NewYork-Presbyterian Hospital, we are committed to the delivery of high quality care. We have implemented a strategic planning process to determine the information technology initiatives that will best help us improve quality. The process began with the creation of a Clinical Quality and IT Committee. The Committee identified 2 high priority goals that would enable demonstrably high quality care: 1) excellence at data warehousing, and 2) optimal use of automated clinical documentation to capture encounter-related quality and safety data. For each high priority goal, a working group was created to develop specific recommendations. The Data Warehousing subgroup has recommended the implementation of an architecture management process and an improved ability for users to get access to aggregate data. The Structured Documentation subgroup is establishing recommendations for a documentation template creation process. The strategic planning process at times is slow, but assures that the organization is focusing on the information technology activities most likely to lead to improved quality. PMID:17238381

  5. A user-system interface agent

    NASA Technical Reports Server (NTRS)

    Wakim, Nagi T.; Srivastava, Sadanand; Bousaidi, Mehdi; Goh, Gin-Hua

    1995-01-01

    Agent-based technologies answer to several challenges posed by additional information processing requirements in today's computing environments. In particular, (1) users desire interaction with computing devices in a mode which is similar to that used between people, (2) the efficiency and successful completion of information processing tasks often require a high-level of expertise in complex and multiple domains, (3) information processing tasks often require handling of large volumes of data and, therefore, continuous and endless processing activities. The concept of an agent is an attempt to address these new challenges by introducing information processing environments in which (1) users can communicate with a system in a natural way, (2) an agent is a specialist and a self-learner and, therefore, it qualifies to be trusted to perform tasks independent of the human user, and (3) an agent is an entity that is continuously active performing tasks that are either delegated to it or self-imposed. The work described in this paper focuses on the development of an interface agent for users of a complex information processing environment (IPE). This activity is part of an on-going effort to build a model for developing agent-based information systems. Such systems will be highly applicable to environments which require a high degree of automation, such as, flight control operations and/or processing of large volumes of data in complex domains, such as the EOSDIS environment and other multidisciplinary, scientific data systems. The concept of an agent as an information processing entity is fully described with emphasis on characteristics of special interest to the User-System Interface Agent (USIA). Issues such as agent 'existence' and 'qualification' are discussed in this paper. 
Based on a definition of an agent and its main characteristics, we propose an architecture for the development of interface agents for users of an IPE that is agent-oriented and whose resources are likely to be distributed and heterogeneous in nature. The architecture of USIA is outlined in two main components: (1) the user interface which is concerned with issues as user dialog and interaction, user modeling, and adaptation to user profile, and (2) the system interface part which deals with identification of IPE capabilities, task understanding and feasibility assessment, and task delegation and coordination of assistant agents.

  6. Plasmonic nanobubbles for target cell-specific gene and drug delivery and multifunctional processing of heterogeneous cell systems

    NASA Astrophysics Data System (ADS)

    Lukianova-Hleb, Ekaterina Y.; Huye, Leslie E.; Brenner, Malcolm K.; Lapotko, Dmitri O.

    2014-03-01

    Cell and gene cancer therapies require ex vivo processing of human grafts. Such processing requires at least three steps - cell enrichment, cell separation (destruction), and gene transfer - each of which requires a separate technology. While these technologies may be satisfactory for research use, they are of limited usefulness in the clinical treatment setting because they have a low processing rate, as well as low transfection and separation efficacy and specificity in heterogeneous human grafts. Most problematic, because current technologies are administered in multiple steps - rather than in a single, multifunctional, and simultaneous procedure - they lengthen the treatment process and introduce an unnecessary level of complexity, labor, and resources into clinical treatment; all these limitations result in high losses of valuable cells. We report a universal, high-throughput, and multifunctional technology that simultaneously (1) injects free external cargo into target cells, (2) destroys unwanted cells, and (3) preserves valuable non-target cells in heterogeneous grafts. Each of these functions has single-target-cell specificity in a heterogeneous cell system, with a processing rate above 45 million cells/min, an injection efficacy of 90% at 96% viability of the injected cells, a target-cell destruction efficacy above 99%, and viability of non-target cells above 99%. The developed technology employs novel cellular agents called plasmonic nanobubbles (PNBs). PNBs are not particles but transient intracellular events: vapor nanobubbles that expand and collapse in mere nanoseconds under optical excitation of gold nanoparticles with short picosecond laser pulses. PNBs of different, cell-specific sizes (1) inject free external cargo with small PNBs, (2) destroy target cells mechanically with large PNBs, and (3) preserve non-target cells.
The multi-functionality, precision, and high throughput of all-in-one PNB technology will tremendously impact cell and gene therapies and other clinical applications that depend on ex vivo processing of heterogeneous cell systems.

  7. Lubricant Coating Process

    NASA Technical Reports Server (NTRS)

    1989-01-01

    "Peen Plating," a NASA developed process for applying molybdenum disulfide, is the key element of Techniblast Co.'s SURFGUARD process for applying high strength solid lubricants. The process requires two machines -- one for cleaning and one for coating. The cleaning step allows the coating to be bonded directly to the substrate to provide a better "anchor." The coating machine applies a half a micron thick coating. Then, a blast gun, using various pressures to vary peening intensities for different applications, fires high velocity "media" -- peening hammers -- ranging from plastic pellets to steel shot. Techniblast was assisted by Rural Enterprises, Inc. Coating service can be performed at either Techniblast's or a customer's facility.

  8. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems; they are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria, and an application procedure, in contrast to other methods. It addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation, and helps in the selection of highly critical stakeholders for VBS systems with less judgmental error. PMID:25799490
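A minimal sketch of metric-based stakeholder quantification in the spirit described above: score each candidate on weighted attributes and keep those above a cut-off. The attributes, weights, candidates, and threshold are all invented for illustration; StakeMeter defines its own metrics and criteria:

```python
# Hypothetical attribute weights (StakeMeter's actual metrics differ).
WEIGHTS = {"influence": 0.4, "interest": 0.3, "domain_knowledge": 0.3}

def score(stakeholder):
    """Weighted sum of 0-10 attribute ratings."""
    return sum(WEIGHTS[a] * stakeholder[a] for a in WEIGHTS)

candidates = {
    "product_owner": {"influence": 9, "interest": 9, "domain_knowledge": 8},
    "end_user":      {"influence": 5, "interest": 8, "domain_knowledge": 6},
    "bystander":     {"influence": 2, "interest": 1, "domain_knowledge": 2},
}

THRESHOLD = 5.0  # hypothetical cut-off for "critical" stakeholders
critical = [name for name, attrs in candidates.items()
            if score(attrs) >= THRESHOLD]
```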

  9. Process for CO.sub.2 capture using zeolites from high pressure and moderate temperature gas streams

    DOEpatents

    Siriwardane, Ranjani V [Morgantown, WV; Stevens, Robert W [Morgantown, WV

    2012-03-06

A method for separating CO.sub.2 from a gas stream comprised of CO.sub.2 and other gaseous constituents using a zeolite sorbent in a swing-adsorption process, producing a high-temperature CO.sub.2 stream at a higher CO.sub.2 pressure than the input gas stream. The method utilizes CO.sub.2 desorption in a CO.sub.2 atmosphere and effectively integrates heat transfers to optimize overall efficiency. H.sub.2O adsorption does not preclude effective operation of the sorbent. The cycle may be incorporated in an IGCC plant for efficient pre-combustion CO.sub.2 capture. A particular application operates on shifted syngas at a temperature exceeding 200.degree. C. and produces a dry CO.sub.2 stream at low temperature and high CO.sub.2 pressure, greatly reducing any subsequent compression energy requirements.

  10. Fluorine-Based DRIE of Fused Silica

    NASA Technical Reports Server (NTRS)

    Yee, Karl; Shcheglov, Kirill; Li, Jian; Choi, Daniel

    2007-01-01

    A process of deep reactive-ion etching (DRIE) using a fluorine-based gas mixture enhanced by induction-coupled plasma (ICP) has been demonstrated to be effective in forming high-aspect-ratio three-dimensional patterns in fused silica. The patterns are defined in part by an etch mask in the form of a thick, high-quality aluminum film. The process was developed to satisfy a need to fabricate high-aspect-ratio fused-silica resonators for vibratory microgyroscopes, and could be used to satisfy similar requirements for fabricating other fused-silica components.

  11. Thermionic cogeneration burner design

    NASA Astrophysics Data System (ADS)

    Miskolczy, G.; Goodale, D.; Moffat, A. L.; Morgan, D. T.

    Since thermionic converters receive heat at very high temperatures (approximately 1800 K) and reject heat at moderately high temperatures (approximately 800 K), they are useful for cogeneration applications involving high temperature processes. The electric power from thermionic converters is produced as a high amperage, low-voltage direct current. An ideal cogeneration application would be to utilize the reject heat at the collector temperature and the electricity without power conditioning. A cogeneration application in the edible oil industry fulfills both of these requirements since both direct heat and hydrogen gas are required in the hydrogenation of the oils. In this application, the low-voltage direct current would be used in a hydrogen electrolyzer.

  12. Advanced sensors and instrumentation

    NASA Technical Reports Server (NTRS)

    Calloway, Raymond S.; Zimmerman, Joe E.; Douglas, Kevin R.; Morrison, Rusty

    1990-01-01

    NASA is currently investigating the readiness of Advanced Sensors and Instrumentation to meet the requirements of new initiatives in space. The following technical objectives and technologies are briefly discussed: smart and nonintrusive sensors; onboard signal and data processing; high capacity and rate adaptive data acquisition systems; onboard computing; high capacity and rate onboard storage; efficient onboard data distribution; high capacity telemetry; ground and flight test support instrumentation; power distribution; and workstations, video/lighting. The requirements for high fidelity data (accuracy, frequency, quantity, spatial resolution) in hostile environments will continue to push the technology developers and users to extend the performance of their products and to develop new generations.

  13. Automatic Coregistration and orthorectification (ACRO) and subsequent mosaicing of NASA high-resolution imagery over the Mars MC11 quadrangle, using HRSC as a baseline

    NASA Astrophysics Data System (ADS)

    Sidiropoulos, Panagiotis; Muller, Jan-Peter; Watson, Gillian; Michael, Gregory; Walter, Sebastian

    2018-02-01

This work presents the coregistered, orthorectified and mosaiced high-resolution products of the MC11 quadrangle of Mars, which have been processed using novel, fully automatic techniques. We discuss the development of a pipeline that achieves fully automatic, parameter-independent geometric alignment of high-resolution planetary images, starting from raw input images in NASA PDS format and following all required steps to produce a coregistered geotiff image, a corresponding footprint and useful metadata. Additionally, we describe the development of a radiometric calibration technique that post-processes coregistered images to make them radiometrically consistent. Finally, we present a batch-mode application of the developed techniques over the MC11 quadrangle to validate their potential, as well as to generate end products, which are released to the planetary science community, thus assisting in the analysis of Mars' static and dynamic features. This case study is a step towards the full automation of signal processing tasks that are essential to increase the usability of planetary data but currently require the extensive use of human resources.

  14. Impact of Energy Gain and Subsystem Characteristics on Fusion Propulsion Performance

    NASA Technical Reports Server (NTRS)

    Chakrabarti, S.; Schmidt, G. R.

    2001-01-01

Rapid transport of large payloads and human crews throughout the solar system requires propulsion systems having very high specific impulse (I(sub sp) > 10(exp 4) to 10(exp 5) s). It also calls for systems with extremely low mass-power ratios (alpha < 10(exp -1) kg/kW). Such low alpha values are beyond the reach of conventional power-limited propulsion, but may be attainable with fusion and other nuclear concepts that produce energy within the propellant. The magnitude of energy gain must be large enough to sustain the nuclear process while still providing a high jet power relative to the massive energy-intensive subsystems associated with these concepts. This paper evaluates the impact of energy gain and subsystem characteristics on alpha. Central to the analysis are general parameters that embody the essential features of any 'gain-limited' propulsion power balance. Results show that the gains required to achieve alpha = 10(exp -1) kg/kW with foreseeable technology range from approximately 100 to over 2000, which is three to five orders of magnitude greater than the current fusion state of the art. Sensitivity analyses point to the parameters exerting the most influence for either: (1) lowering alpha and improving mission performance or (2) relaxing gain requirements and reducing demands on the fusion process. The greatest impact comes from reducing mass and increasing efficiency of the thruster and subsystems downstream of the fusion process. High relative gain, through enhanced fusion processes or more efficient drivers and processors, is also desirable. There is a benefit in improving driver and subsystem characteristics upstream of the fusion process, but it diminishes at relative gains > 100.
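The trade the abstract describes, where the system mass-power ratio falls toward a floor set by the downstream hardware as energy gain rises, can be sketched with a toy power balance. All names and numbers here (driver and downstream specific masses, conversion efficiency) are illustrative assumptions, not figures from the paper:

```python
def mass_power_ratio(gain, alpha_driver=10.0, alpha_downstream=0.05, eff=0.5):
    """Toy 'gain-limited' power balance (illustrative numbers only).

    Jet power scales as eff * gain * P_driver, while system mass is the
    driver mass (alpha_driver kg per kW of driver power) plus downstream
    hardware mass (alpha_downstream kg per kW of jet power).  The system
    kg/kW therefore falls as 1/gain toward the downstream floor.
    """
    return alpha_driver / (eff * gain) + alpha_downstream

# Returns diminish past gain ~ 100: gain 10 -> ~2.05 kg/kW,
# gain 100 -> ~0.25 kg/kW, gain 1000 -> ~0.07 kg/kW.
for g in (10, 100, 1000):
    print(g, mass_power_ratio(g))
```

Under these assumed values, improving the downstream hardware (lowering `alpha_downstream`, raising `eff`) moves the floor itself, matching the paper's conclusion that downstream mass and efficiency dominate.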

  15. A real-time spectrum acquisition system design based on quantum dots-quantum well detector

    NASA Astrophysics Data System (ADS)

    Zhang, S. H.; Guo, F. M.

    2016-01-01

In this paper, we studied the structural characteristics of a quantum dots-quantum well photodetector with a response wavelength range from 400 nm to 1000 nm. It has the characteristics of high sensitivity, low dark current and high conductance gain. According to the properties of quantum dots-quantum well photodetectors, we designed a new type of capacitive transimpedance amplifier (CTIA) readout circuit structure with the advantages of adjustable gain, wide bandwidth and high driving ability. We implemented the chip packaging between the CTIA-CDS readout circuit and the quantum dots detector and tested the readout response characteristics. According to the timing signal requirements of our readout circuit, we designed a real-time spectral data acquisition system based on FPGA and ARM. The parallel processing mode of the programmable devices gives the system high sensitivity and a high transmission rate. In addition, we realized blind-pixel compensation and smoothing-filter processing of the real-time spectrum data in C++. By using the signal acquisition system and the computer software to collect, process and analyze the fluorescence spectrum of carbon quantum dots, we verified the excellent characteristics of the detector. The system meets the design requirements of a quantum dot spectrum acquisition system, offering short integration time, real-time operation and portability.
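The blind-pixel compensation and smoothing steps mentioned above were implemented in C++ in the original system; as a rough illustration of the idea only (the function names and the neighbor-averaging rule are assumptions, not the authors' algorithm), they might look like:

```python
def compensate_blind_pixels(spectrum, blind):
    """Replace each flagged (blind) pixel with the mean of its nearest
    non-blind neighbors on either side."""
    out = list(spectrum)
    for i in blind:
        left = next((out[j] for j in range(i - 1, -1, -1) if j not in blind), None)
        right = next((out[j] for j in range(i + 1, len(out)) if j not in blind), None)
        vals = [v for v in (left, right) if v is not None]
        out[i] = sum(vals) / len(vals)
    return out

def smooth(spectrum, window=5):
    """Centered moving-average filter; the window shrinks at the edges."""
    half = window // 2
    n = len(spectrum)
    return [sum(spectrum[max(0, i - half):min(n, i + half + 1)]) /
            (min(n, i + half + 1) - max(0, i - half)) for i in range(n)]
```

For example, `compensate_blind_pixels([1, 0, 3], blind={1})` fills the dead pixel with the neighbor average 2.0 before smoothing.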

  16. Scalable, High-performance 3D Imaging Software Platform: System Architecture and Application to Virtual Colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2013-01-01

One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around times often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that need to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, clusters, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10-fold performance improvement on an 8-core workstation over the original sequential implementation of the system. PMID:23366803
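The block-volume idea can be illustrated with a minimal sketch (the names and structure here are assumptions for illustration, not the platform's actual API): tile the volume's index space into blocks, then map a per-block kernel over the tiles in parallel.

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

def block_ranges(shape, block):
    """Tile a 3D index space into (start, stop) ranges per axis;
    edge blocks are clipped to the volume boundary."""
    starts = [range(0, n, b) for n, b in zip(shape, block)]
    for z0, y0, x0 in itertools.product(*starts):
        yield ((z0, min(z0 + block[0], shape[0])),
               (y0, min(y0 + block[1], shape[1])),
               (x0, min(x0 + block[2], shape[2])))

def map_blocks(shape, block, kernel, workers=8):
    """Apply `kernel` to every block in parallel; the kernel receives
    the block's ((z0,z1),(y0,y1),(x0,x1)) ranges."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(kernel, block_ranges(shape, block)))
```

A 64^3 volume with 32^3 blocks yields 8 independent work items; shrinking the block size trades scheduling overhead for finer load balancing, which is the "size-adaptive" lever the abstract refers to.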

  17. Challenges for the registration of vaccines in emerging countries: Differences in dossier requirements, application and evaluation processes.

    PubMed

    Dellepiane, Nora; Pagliusi, Sonia

    2018-06-07

The divergence of regulatory requirements and processes in developing and emerging countries hampers vaccine registration, and therefore delays access to high-quality, safe and efficacious vaccines for their respective populations. This report focuses on providing insights on the heterogeneity of registration requirements, in terms of numbering structure and overall content of dossiers, for marketing authorisation applications for vaccines in different areas of the world. While it also illustrates the divergence of regulatory processes in general, as well as the need to avoid redundant reviews, it does not claim to provide a comprehensive view of all processes or existing facilitating mechanisms, nor is it intended to touch upon the differences in assessments made by different regulatory authorities. This report describes the analysis performed by regulatory experts from vaccine manufacturing companies, during a meeting held in Geneva in May 2017, to identify and quantify differences in the requirements for vaccine registration in three aspects: the dossier numbering structure and contents, the application forms, and the evaluation procedures, in different countries and regions. Module 1 of the Common Technical Document (CTD) was compared across 10 countries. Modules 2-5 of the CTDs of two regions and three countries were compared to the CTD of the US FDA. The application forms of eight countries and the registration procedures of 134 importing countries were compared as well. The analysis indicates a high degree of divergence in numbering structure and content requirements. Possible interventions that would lead to significant improvements in registration efficiency include alignment of the CTD numbering structure, a standardised model application form, and better convergence of evaluation procedures. Copyright © 2018.

  18. Supportability Technologies for Future Exploration Missions

    NASA Technical Reports Server (NTRS)

    Watson, Kevin; Thompson, Karen

    2007-01-01

Future long-duration human exploration missions will be challenged by resupply limitations and mass and volume constraints. Consequently, it will be essential that the logistics footprint required to support these missions be minimized and that capabilities be provided to make them highly autonomous from a logistics perspective. Strategies to achieve these objectives include broad implementation of commonality and standardization at all hardware levels and across all systems, repair of failed hardware at the lowest possible hardware level, and manufacture of structural and mechanical replacement components as needed. Repair at the lowest hardware levels will require the availability of compact, portable systems for diagnosis of failures in electronic systems and verification of system functionality following repair. Rework systems will be required that enable the removal and replacement of microelectronic components with minimal human intervention, to minimize skill requirements and training demand for crews. Materials used in the assembly of electronic systems (e.g. solders, fluxes, conformal coatings) must be compatible with the available repair methods and the spacecraft environment. Manufacturing of replacement parts for structural and mechanical applications will require additive manufacturing systems that can generate near-net-shape parts from the range of engineering alloys employed in the spacecraft structure and in the parts utilized in other surface systems. These additive manufacturing processes will need to be supported by real-time non-destructive evaluation during layer-additive processing, providing on-the-fly quality control and potentially an input for closed-loop process control. Additionally, non-destructive methods should be available for material property determination, incorporated with the additive manufacturing process to give an in-process capability to ensure that deposited material meets required property criteria.

  19. Mobil process converts methanol to high-quality synthetic gasoline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, A.

    1978-12-11

If production of gasoline from coal becomes commercially attractive in the United States, a process under development at the Mobil Research and Development Corp. may compete with better known coal liquefaction processes. The Mobil process converts methanol to high-octane, unleaded gasoline; methanol can be produced commercially from coal. If gasoline is the desired product, the Mobil process offers strong technical and cost advantages over H-coal, Exxon donor solvent, solvent-refined coal, and Fischer--Tropsch processes. The cost analysis, contained in a report to the Dept. of Energy, concludes that the Mobil process produces more-expensive liquid products than any other liquefaction process except Fischer--Tropsch. But Mobil's process produces ready-to-use gasoline, while the others produce oils which require further expensive refining to yield gasoline. Disadvantages and advantages are discussed.

  20. Techno-economic analysis of extraction-based separation systems for acetone, butanol, and ethanol recovery and purification.

    PubMed

    Grisales Díaz, Víctor Hugo; Olivar Tost, Gerard

    2017-01-01

Dual extraction, high-temperature extraction, mixture extraction, and oleyl alcohol extraction have been proposed in the literature for acetone, butanol, and ethanol (ABE) production. However, energy and economic evaluations of extraction-based separation systems under similar assumptions are necessary. Hence, the new process proposed in this work for regenerating high-boiling extractants, direct steam distillation (DSD), was compared with several extraction-based separation systems. The evaluation was performed under similar assumptions through simulation in Aspen Plus V7.3 ® software. Two end distillation systems (number of non-ideal stages between 70 and 80) were studied. Heat integration and vacuum operation of some units were proposed, reducing the energy requirements. The energy requirement of the hybrid processes, at a substrate concentration of 200 g/l, was between 6.4 and 8.3 MJ-fuel/kg-ABE. The minimum energy requirements of the extraction-based separation systems, feeding a water concentration in the substrate equivalent to the extractant selectivity and under ideal assumptions, were between 2.6 and 3.5 MJ-fuel/kg-ABE. The efficiencies of the recovery systems for the baseline case and the ideal evaluation were 0.53-0.57 and 0.81-0.84, respectively. The main advantages of DSD were the operation of the regeneration column at atmospheric pressure, the utilization of low-pressure steam, and the low energy requirements of preheating. The in situ recovery processes, DSD, and mixture extraction with conventional regeneration were the approaches with the lowest energy requirements and total annualized costs.

  1. High voltage requirements and issues for the 1990's. [for spacecraft power supplies

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.; Faymon, K. A.

    1984-01-01

The development of high-power, high-voltage space systems will require advances in power generation and processing. The systems must be reliable, adaptable, and durable for space mission success. The issues that must be resolved in order to produce a high-power system are weight and volume reduction of components and modules and the creation of a reliable high-repetition pulse power processor. Capacitor energy density must be doubled, and packaging volume must be reduced by a factor of 10 to 20. The packaging must also protect the system from interaction with the natural space environment and with the induced environment produced by interactions between spacecraft systems and their surroundings.

  2. Gradient Tempering Of Bearing Races

    NASA Technical Reports Server (NTRS)

    Parr, Richardson A.

    1991-01-01

Gradient-tempering process increases fracture toughness and resistance to stress-corrosion cracking of ball-bearing races made of hard, strong steels and subject to high installation stresses and operation in corrosive media. Also used in other applications in which local toughening of high-strength/low-toughness materials is required.

  3. FPGA-Based Smart Sensor for Online Displacement Measurements Using a Heterodyne Interferometer

    PubMed Central

    Vera-Salas, Luis Alberto; Moreno-Tapia, Sandra Veronica; Garcia-Perez, Arturo; de Jesus Romero-Troncoso, Rene; Osornio-Rios, Roque Alfredo; Serroukh, Ibrahim; Cabal-Yepez, Eduardo

    2011-01-01

The measurement of small displacements on the nanometric scale demands metrological systems of high accuracy and precision. In this context, interferometer-based displacement measurements have become the main tools used for traceable dimensional metrology. The different industrial applications in which small displacement measurements are employed require online measurement, high-speed processes, open-architecture control systems, and good adaptability to specific process conditions. The main contribution of this work is the development of a smart sensor for large displacement measurement based on phase measurement, which achieves high accuracy and resolution and is designed to be used with a commercial heterodyne interferometer. The system is based on a low-cost Field Programmable Gate Array (FPGA), allowing the integration of several functions in a single portable device. This system is optimal for high-speed applications where online measurement is needed, and its reconfigurability allows the addition of different modules for error compensation, as might be required by a specific application. PMID:22164040
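For reference, the phase-to-displacement conversion at the heart of such a system is simple: each 2π of accumulated measurement phase corresponds to one optical wavelength of path change, divided by the number of passes the beam makes to the target. A minimal sketch (the 633 nm HeNe wavelength and the double-pass geometry are assumptions about a typical commercial setup, not details from the paper):

```python
import math

def displacement_from_phase(delta_phi_rad, wavelength_m=633e-9, passes=2):
    """Convert accumulated interferometer phase to target displacement.

    Each 2*pi of measured phase corresponds to wavelength_m / passes of
    motion (passes=2 models a typical double-pass heterodyne configuration).
    """
    return (delta_phi_rad / (2 * math.pi)) * (wavelength_m / passes)
```

Under these assumptions, one full fringe (2π) of phase maps to 316.5 nm of motion, so resolving the phase to a fraction of a degree puts the displacement resolution well below a nanometre.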

  4. High performance dielectric materials development

    NASA Technical Reports Server (NTRS)

    Piche, Joe; Kirchner, Ted; Jayaraj, K.

    1994-01-01

    The mission of polymer composites materials technology is to develop materials and processing technology to meet DoD and commercial needs. The following are outlined in this presentation: high performance capacitors, high temperature aerospace insulation, rationale for choosing Foster-Miller (the reporting industry), the approach to the development and evaluation of high temperature insulation materials, and the requirements/evaluation parameters. Supporting tables and diagrams are included.

  5. High performance dielectric materials development

    NASA Astrophysics Data System (ADS)

    Piche, Joe; Kirchner, Ted; Jayaraj, K.

    1994-09-01

    The mission of polymer composites materials technology is to develop materials and processing technology to meet DoD and commercial needs. The following are outlined in this presentation: high performance capacitors, high temperature aerospace insulation, rationale for choosing Foster-Miller (the reporting industry), the approach to the development and evaluation of high temperature insulation materials, and the requirements/evaluation parameters. Supporting tables and diagrams are included.

  6. Simulated Single Tooth Bending of High Temperature Alloys

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert, F.; Burke, Christopher

    2012-01-01

Future unmanned space missions will require mechanisms to operate at extreme conditions in order to be successful. In some of these mechanisms, very high gear reductions will be needed to permit very small motors to drive other components at low rotational speed with high output torque. Therefore gearing components are required that can meet the mission requirements. In mechanisms such as this, the bending fatigue strength of the gears is very important. The bending fatigue capacity of a high-temperature, nickel-based alloy typically used for turbine disks in gas turbine engines, and of two tool steel materials with high vanadium content, was compared to that of a typical aerospace alloy, AISI 9310. Test specimens were fabricated by electro-discharge machining without post-machining processing. Tests were run at 24 and at 490 C. As the test temperature increased from 24 to 490 C, the bending fatigue strength was reduced by a factor of five.

  7. Transmitter experiment package for the communications technology satellite

    NASA Technical Reports Server (NTRS)

    Farber, B.; Goldin, D. S.; Marcus, B.; Mock, P.

    1977-01-01

    The operating requirements, system design characteristics, high voltage packaging considerations, nonstandard components development, and test results for the transmitter experiment package (TEP) are described. The TEP is used for broadcasting power transmission from the Communications Technology Satellite. The TEP consists of a 12 GHz, 200-watt output stage tube (OST), a high voltage processing system that converts the unregulated spacecraft solar array power to the regulated voltages required for OST operation, and a variable conductance heat pipe system that is used to cool the OST body.

  8. CNC Machining Of The Complex Copper Electrodes

    NASA Astrophysics Data System (ADS)

    Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina

    2015-07-01

This paper presents the machining process of complex copper electrodes. Machining of complex shapes in copper is difficult because the material is soft and sticky. This research presents the main steps for producing these copper electrodes with high dimensional accuracy and good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for accurate CNC equipment using smart CAD/CAM software.

  9. Color image processing and vision system for an automated laser paint-stripping system

    NASA Astrophysics Data System (ADS)

    Hickey, John M., III; Hise, Lawson

    1994-10-01

Color image processing in machine vision systems has not gained general acceptance; most machine vision systems use images that are shades of gray. The Laser Automated Decoating System (LADS) required a vision system which could discriminate between substrates of various colors and textures and paints ranging from semi-gloss grays to high-gloss red, white and blue (Air Force Thunderbirds). The changing lighting levels produced by the pulsed CO2 laser mandated a vision system that did not require constant color-temperature lighting for reliable image analysis.

  10. IONAC-Lite

    NASA Technical Reports Server (NTRS)

    Torgerson, Jordan L.; Clare, Loren P.; Pang, Jackson

    2011-01-01

The Interplanetary Overlay Networking Protocol Accelerator (IONAC) described previously in "The Interplanetary Overlay Networking Protocol Accelerator" (NPO-45584), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), p. 106 (http://www.techbriefs.com/component/content/article/3317) provides functions that implement the Delay Tolerant Networking (DTN) bundle protocol. New missions that require high-speed downlink-only use of DTN can now be accommodated by the unidirectional IONAC-Lite to support high-data-rate downlink mission applications. Due to constrained energy resources, a conventional software implementation of the DTN protocol can provide only limited throughput for any given reasonable energy consumption rate. The IONAC-Lite DTN Protocol Accelerator reduces this energy consumption by an order of magnitude and increases the throughput capability by two orders of magnitude. In addition, a conventional DTN implementation requires a bundle database with a considerable storage requirement. In very high downlink data-rate missions, such as near-Earth radar science missions, storage space utilization needs to be maximized for science data and minimized for communications protocol-related needs. The IONAC-Lite DTN Protocol Accelerator is implemented in a reconfigurable hardware device to accomplish exactly what's needed for high-throughput DTN downlink-only scenarios. The salient features of the IONAC-Lite implementation are: an implementation of the Bundle Protocol for an environment that requires a very high bundle egress data rate, where the command and data handling (C&DH) subsystem is also expected to be very constrained, so interaction with the C&DH processor and temporary storage is minimized; a fully pipelined design, so that a bundle-processing database is not required; a lookup-table-based approach that eliminates the multi-pass processing requirement imposed by the Bundle Protocol header's length-field structure and the SDNV (self-delimiting numeric value) data-field formatting; an 8-bit parallel datapath to support high data-rate missions; and a reduced-resource-utilization implementation for missions that do not require custody transfer features. There was no known implementation of the DTN protocol in a field-programmable gate array (FPGA) device prior to the current implementation. The combination of energy and performance optimization embodied in this design makes the work novel.
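The SDNV format the accelerator has to parse is the standard Bundle Protocol encoding (RFC 5050): an unsigned integer split into 7-bit groups, with the high bit set on every byte except the last. The text says the FPGA handles this with a lookup table; the Python below is merely a functional reference for the format, not the hardware design:

```python
def sdnv_encode(value):
    """Encode a non-negative integer as an SDNV byte string (RFC 5050)."""
    groups = [value & 0x7F]          # last byte: continuation bit clear
    value >>= 7
    while value:
        groups.append((value & 0x7F) | 0x80)  # earlier bytes: bit set
        value >>= 7
    return bytes(reversed(groups))

def sdnv_decode(data):
    """Decode one SDNV from the front of `data`; return (value, bytes_used)."""
    value = 0
    for i, b in enumerate(data):
        value = (value << 7) | (b & 0x7F)
        if not (b & 0x80):           # continuation bit clear: last byte
            return value, i + 1
    raise ValueError("truncated SDNV")
```

Because the length of an SDNV is only known once its last byte is seen, a naive parser needs a second pass over the header; precomputing continuation-bit patterns in a table is what lets the hardware stay single-pass.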

  11. Comparison of lignin extraction processes: Economic and environmental assessment.

    PubMed

    Carvajal, Juan C; Gómez, Álvaro; Cardona, Carlos A

    2016-08-01

This paper presents the technical-economic and environmental assessment of four lignin extraction processes applied to two different raw materials (sugarcane bagasse and rice husks). The processes are divided into two categories: in the first, lignin extraction is evaluated with a prior acid hydrolysis step, while in the second the extraction processes are evaluated standalone, for a total of 16 scenarios. Profitability indicators such as the net present value (NPV) and environmental indicators such as the potential environmental impact (PEI) are used through a process engineering approach to understand and select the best lignin extraction process. The results show that, both economically and environmentally, the process using sulfites and soda on rice husks gives the best results; however, the quality of lignin obtained with sulfites is not suitable for high value-added products. Soda extraction is therefore an interesting option when high-quality lignin is required for high value-added products at low cost. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. When does reading dirty words impede picture processing? Taboo interference with verbal and manual responses.

    PubMed

    Mädebach, Andreas; Markuske, Anna-Maria; Jescheniak, Jörg D

    2018-05-22

    Picture naming takes longer in the presence of socially inappropriate (taboo) distractor words compared with neutral distractor words. Previous studies have attributed this taboo interference effect to increased attentional capture by taboo words or verbal self-monitoring-that is, control processes scrutinizing verbal responses before articulation. In this study, we investigated the cause and locus of the taboo interference effect by contrasting three tasks that used the same target pictures, but systematically differed with respect to the processing stages involved: picture naming (requiring conceptual processing, lexical processing, and articulation), phoneme decision (requiring conceptual and lexical processing), and natural size decision (requiring conceptual processing only). We observed taboo interference in picture naming and phoneme decision. In size decision, taboo interference was not reliably observed under the same task conditions in which the effect arose in picture naming and phoneme decision, but it emerged when the difficulty of the size decision task was increased by visually degrading the target pictures. Overall, these results suggest that taboo interference cannot be exclusively attributed to verbal self-monitoring operating over articulatory responses. Instead, taboo interference appears to arise already prior to articulatory preparation, during lexical processing and-at least with sufficiently high task difficulty-during prelexical processing stages.

  13. Effects of high-dose ethanol intoxication and hangover on cognitive flexibility.

    PubMed

    Wolff, Nicole; Gussek, Philipp; Stock, Ann-Kathrin; Beste, Christian

    2018-01-01

The effects of high-dose ethanol intoxication on cognitive flexibility processes are not well understood, and processes related to hangover after intoxication have remained even more elusive. Similarly, it is unknown to what extent the complexity of cognitive flexibility processes is affected by intoxication and hangover effects. We performed a neurophysiological study applying high-density electroencephalography (EEG) recording to analyze event-related potentials (ERPs) and perform source localization in a task switching paradigm which varied the complexity of task switching by means of memory demands. The results show that high-dose ethanol intoxication only affects task switching (i.e. cognitive flexibility processes) when memory processes are required to control task switching mechanisms, suggesting that even high doses of ethanol compromise cognitive processes only when they are highly demanding. The EEG and source localization data show that these effects unfold by modulating response selection processes in the anterior cingulate cortex. Perceptual and attentional selection processes as well as working memory processes were only unspecifically modulated. In all subprocesses examined, there were no differences between the sober and hangover states, thus suggesting a fast recovery of cognitive flexibility after high-dose ethanol intoxication. We assume that the gamma-aminobutyric acid (GABAergic) system accounts for the observed effects, while they can hardly be explained by the dopaminergic system. © 2016 Society for the Study of Addiction.

  14. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-01-12

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  15. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-05-19

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide sub-samples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  16. Square Kilometre Array Science Data Processing

    NASA Astrophysics Data System (ADS)

    Nikolic, Bojan; SDP Consortium, SKA

    2014-04-01

    The Square Kilometre Array (SKA) is planned to be, by a large factor, the largest and most sensitive radio telescope ever constructed. The first phase of the telescope (SKA1), now in the design phase, will in itself represent a major leap in capabilities compared to current facilities. These advances are to a large extent being made possible by advances in available computer processing power, so that larger numbers of smaller, simpler and cheaper receptors can be used. As a result of greater reliance and demands on computing, ICT is becoming an ever more integral part of the telescope. The Science Data Processor (SDP) is the part of the SKA system responsible for imaging, calibration, pulsar timing, confirmation of pulsar candidates, derivation of further data products, archiving, and providing the data to the users. It will accept visibilities at data rates of several TB/s and will require processing power for imaging in the range of 100 petaFLOPS to ~1 exaFLOPS, putting SKA1 into the regime of exascale radio astronomy. In my talk I will present the overall SKA system requirements and how they drive these high data throughput and processing requirements. Some of the key challenges for the design of the SDP are: identifying sufficient parallelism to utilise the very large numbers of separate compute cores required to provide exascale computing throughput; managing the high internal data flow rates efficiently; a conceptual architecture and software engineering approach that will allow adaptation of the algorithms as we learn about the telescope and the atmosphere during the commissioning and operational phases; and system management that deals gracefully with (inevitably frequent) failures of individual units of the processing system. I will also present possible initial architectures for the SDP system that attempt to address these and other challenges.
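
    The TB/s-scale data rates quoted above follow from the combinatorics of an interferometer: correlator output grows with the square of the number of receptors. As a rough illustration, a back-of-envelope sizing might look like the sketch below; all parameters are made-up illustrative values, not actual SKA1 design numbers.

```python
# Back-of-envelope sizing of an interferometer's visibility data rate.
# All parameters here are illustrative assumptions, not SKA1 design values.

def visibility_data_rate(n_antennas, n_channels, n_polarisations,
                         bytes_per_sample, dump_time_s):
    """Bytes per second of correlator output (visibilities)."""
    n_baselines = n_antennas * (n_antennas - 1) // 2   # grows as N^2
    samples_per_dump = n_baselines * n_channels * n_polarisations
    return samples_per_dump * bytes_per_sample / dump_time_s

# Hypothetical configuration: 512 antennas, 64k frequency channels,
# 4 polarisation products, 8-byte complex samples, 0.1 s integration time.
rate = visibility_data_rate(512, 65536, 4, 8, 0.1)
print(f"{rate / 1e12:.2f} TB/s")  # → 2.74 TB/s
```

    Even these modest assumed numbers land in the "several TB/s" regime the abstract describes, which is why the processing requirement scales into exaFLOPS territory.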

  17. A digital gigapixel large-format tile-scan camera.

    PubMed

    Ben-Ezra, M

    2011-01-01

    Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications for cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and thus can provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
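
    The focal-stack processing mentioned above can be illustrated in miniature: for each pixel, keep the value from the slice of the stack that is locally sharpest. The sketch below uses a simple 4-neighbour contrast measure as the sharpness proxy; it shows the general idea only and is not the camera's actual algorithm, which also compensates for magnification variation between focus slices.

```python
# Minimal focal-stack merge: for each pixel, keep the value from the image
# in the stack with the highest local contrast (a simple sharpness proxy).

def local_contrast(img, x, y):
    """Sum of absolute differences to the 4-neighbourhood."""
    h, w = len(img), len(img[0])
    c = img[y][x]
    nbrs = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return sum(abs(c - img[ny][nx]) for nx, ny in nbrs
               if 0 <= nx < w and 0 <= ny < h)

def merge_focal_stack(stack):
    """stack: list of equally sized 2-D lists (grayscale images)."""
    h, w = len(stack[0]), len(stack[0][0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Pick the focus slice that is sharpest at this pixel.
            best = max(stack, key=lambda im: local_contrast(im, x, y))
            out[y][x] = best[y][x]
    return out
```

    A real implementation would smooth the per-pixel selection map to avoid seams, but the principle of selecting the in-focus slice per region is the same.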

  18. Validating the Inactivation Effectiveness of Chemicals on Ebola Virus.

    PubMed

    Haddock, Elaine; Feldmann, Friederike

    2017-01-01

    While viruses such as Ebola virus must be handled in high-containment laboratories, there remains the need to process virus-infected samples for downstream research testing. This processing often includes removal to lower containment areas and therefore requires assurance of complete viral inactivation within the sample before removal from high-containment. Here we describe methods for the removal of chemical reagents used in inactivation procedures, allowing for validation of the effectiveness of various inactivation protocols.

  19. High-performance wavelet engine

    NASA Astrophysics Data System (ADS)

    Taylor, Fred J.; Mellot, Jonathon D.; Strom, Erik; Koren, Iztok; Lewis, Michael P.

    1993-11-01

    Wavelet processing has shown great promise for a variety of image and signal processing applications. Wavelets are also among the most computationally expensive techniques in signal processing. It is demonstrated that a wavelet engine constructed with residue number system arithmetic elements offers significant advantages over commercially available wavelet accelerators based upon conventional arithmetic elements. An analysis is presented predicting the dynamic range requirements of the reported residue-number-system-based wavelet accelerator.
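
    The residue number system at the heart of such an engine represents an integer by its residues modulo a set of pairwise-coprime moduli, so additions and multiplications decompose into small, carry-free parallel channels. A minimal software sketch, using illustrative moduli rather than the accelerator's actual ones:

```python
# Residue number system (RNS) sketch: an integer is represented by its
# residues modulo pairwise-coprime moduli; arithmetic then proceeds
# channel-by-channel with no carries between channels.

from math import prod

MODULI = (7, 11, 13, 15)  # pairwise coprime; dynamic range = 7*11*13*15 = 15015

def to_rns(x):
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    # Each channel multiplies independently modulo its own small modulus.
    return tuple((ai * bi) % m for ai, bi, m in zip(a, b, MODULI))

def from_rns(residues):
    """Chinese Remainder Theorem reconstruction."""
    M = prod(MODULI)
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # modular inverse of Mi mod m
    return x % M

# A multiplication carried out channel-wise in RNS:
print(from_rns(rns_mul(to_rns(123), to_rns(45))))  # → 5535
```

    In hardware, each residue channel maps to a small independent arithmetic unit, which is the source of the speed advantage; the dynamic range analysis the paper reports amounts to choosing moduli whose product safely bounds the wavelet coefficients.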

  20. Assessment of Advanced Coal Gasification Processes

    NASA Technical Reports Server (NTRS)

    McCarthy, John; Ferrall, Joseph; Charng, Thomas; Houseman, John

    1981-01-01

    This report represents a technical assessment of the following advanced coal gasification processes: AVCO High Throughput Gasification (HTG) Process; Bell Single-Stage High Mass Flux (HMF) Process; Cities Service/Rockwell (CS/R) Hydrogasification Process; Exxon Catalytic Coal Gasification (CCG) Process. Each process is evaluated for its potential to produce SNG from a bituminous coal. In addition to identifying the new technology these processes represent, key similarities/differences, strengths/weaknesses, and potential improvements to each process are identified. The AVCO HTG and the Bell HMF gasifiers share similarities with respect to: short residence time (SRT), high throughput rate, slagging operation, and syngas as the initial raw product gas. The CS/R Hydrogasifier is also SRT but is non-slagging and produces a raw gas high in methane content. The Exxon CCG gasifier is a long residence time, catalytic, fluid-bed reactor producing all of the raw product methane in the gasifier. The report makes the following assessments: 1) while each process has significant potential as a coal gasifier, the CS/R and Exxon processes are better suited for SNG production; 2) the Exxon process is the closest to a commercial level for near-term SNG production; and 3) the SRT processes require significant development including scale-up and turndown demonstration, char processing and/or utilization demonstration, and reactor control and safety features development.

  1. Method for materials deposition by ablation transfer processing

    DOEpatents

    Weiner, K.H.

    1996-04-16

    A method in which a thin layer of semiconducting, insulating, or metallic material is transferred by ablation from a source substrate, coated uniformly with a thin layer of said material, to a target substrate, where said material is desired, with a pulsed, high-intensity, patternable beam of energy. The use of a patternable beam allows area-selective ablation from the source substrate, resulting in additive deposition of the material onto the target substrate, which may require a very low percentage of the area to be covered. Since material is placed only where it is required, material waste can be minimized by reusing the source substrate for depositions on multiple target substrates. Due to the use of a pulsed, high-intensity energy source, the target substrate remains at low temperature during the process, and thus low-temperature, low-cost transparent glass or plastic can be used as the target substrate. The method can be carried out at atmospheric pressure and at room temperature, thus eliminating the vacuum systems normally required in materials deposition processes. This invention has particular application in the flat panel display industry, as well as in minimizing materials waste and associated costs. 1 fig.

  2. Route to one-step microstructure mold fabrication for PDMS microfluidic chip

    NASA Astrophysics Data System (ADS)

    Lv, Xiaoqing; Geng, Zhaoxin; Fan, Zhiyuan; Wang, Shicai; Su, Yue; Fang, Weihao; Pei, Weihua; Chen, Hongda

    2018-04-01

    Microstructure mold fabrication for PDMS microfluidic chips remains a complex and time-consuming process requiring special equipment and protocols: photolithography and etching. Thus, a rapid and cost-effective method is highly needed. Compared with the traditional microfluidic chip fabrication process based on micro-electromechanical systems (MEMS), this method is simple and easy to implement, and the whole fabrication process requires only 1-2 h. Microstructures of different sizes, from 100 to 1000 μm, were fabricated and used to culture four breast cancer cell lines. Cell viability and morphology were assessed when the cells were cultured in the micro straight channels, micro square holes, and the bonded PDMS-glass microfluidic chip. The experimental results indicate that the microfluidic chip performs well and meets the experimental requirements. This method can greatly reduce the processing time and cost of microfluidic chips, and provides a simple and effective route for structure design in the field of biological microfabrication and microfluidic chips.

  3. The two-pore channel TPC1 is required for efficient protein processing through early and recycling endosomes.

    PubMed

    Castonguay, Jan; Orth, Joachim H C; Müller, Thomas; Sleman, Faten; Grimm, Christian; Wahl-Schott, Christian; Biel, Martin; Mallmann, Robert Theodor; Bildl, Wolfgang; Schulte, Uwe; Klugbauer, Norbert

    2017-08-30

    Two-pore channels (TPCs) are localized in endo-lysosomal compartments and assumed to play an important role in vesicular fusion and endosomal trafficking. Recently, it has been shown that both TPC1 and TPC2 were required for host cell entry and pathogenicity of Ebola viruses. Here, we investigate the cellular function of TPC1 using protein toxins as model substrates for distinct endosomal processing routes. Toxin uptake and activation through early endosomes, but not processing through other compartments, were reduced in TPC1 knockout cells. Detailed co-localization studies with subcellular markers confirmed predominant localization of TPC1 to early and recycling endosomes. Proteomic analysis of native TPC1 channels finally identified direct interaction with a distinct set of syntaxins involved in fusion of intracellular vesicles. Together, our results demonstrate a general role of TPC1 for uptake and processing of proteins in early and recycling endosomes, likely by providing the high local Ca2+ concentrations required for SNARE-mediated vesicle fusion.

  4. Pressure profiles of the BRing based on the simulation used in the CSRm

    NASA Astrophysics Data System (ADS)

    Wang, J. C.; Li, P.; Yang, J. C.; Yuan, Y. J.; Wu, B.; Chai, Z.; Luo, C.; Dong, Z. Q.; Zheng, W. H.; Zhao, H.; Ruan, S.; Wang, G.; Liu, J.; Chen, X.; Wang, K. D.; Qin, Z. M.; Yin, B.

    2017-07-01

    HIAF-BRing, a new multipurpose accelerator of the High Intensity heavy-ion Accelerator Facility project, requires an extremely high vacuum, lower than 10^-11 mbar, to fulfill the requirements of radioactive beam physics and high energy density physics. To achieve the required process pressure, the benchmarked codes VAKTRAK and Molflow+ are used to simulate the pressure profiles of the BRing system. To ensure the accuracy of the VAKTRAK implementation, the computational results are verified against measured pressure data and compared with a new simulation code, BOLIDE, on the current synchrotron CSRm. With VAKTRAK thus verified, the pressure profiles of the BRing are calculated with different parameters such as conductance, outgassing rates and pumping speeds. According to the computational results, the optimal parameters are selected to achieve the required pressure for the BRing.
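
    The roles of the parameters varied in the study (outgassing rates and pumping speeds) can be seen in the standard zero-dimensional steady-state balance P = Q/S: at equilibrium the total outgassing throughput Q equals the throughput S·P removed by the pumps. This simple balance ignores the conductance-limited pressure profile along the beam pipe that codes like VAKTRAK and Molflow+ actually compute, and the numbers below are illustrative, not BRing parameters.

```python
# Zero-dimensional steady-state vacuum estimate: P = Q / S.
# Ignores conductance limitations along the pipe; illustrative numbers only.

def steady_state_pressure(area_cm2, specific_outgassing, pumping_speed_l_s):
    """Pressure in mbar from a specific outgassing rate (mbar*l/s/cm^2)."""
    q_total = area_cm2 * specific_outgassing   # total throughput, mbar*l/s
    return q_total / pumping_speed_l_s         # mbar

# Hypothetical chamber: 1e5 cm^2 of well-baked stainless steel outgassing
# at 1e-13 mbar*l/s/cm^2, pumped at a net speed of 1000 l/s.
p = steady_state_pressure(1e5, 1e-13, 1000.0)
print(f"{p:.1e} mbar")  # → 1.0e-11 mbar
```

    Even this crude balance shows why reaching below 10^-11 mbar forces simultaneous optimization of surface outgassing and distributed pumping, which is what the full profile simulations quantify.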

  5. Coating and Patterning Functional Materials for Large Area Electrofluidic Arrays

    PubMed Central

    Wu, Hao; Tang, Biao; Hayes, Robert A.; Dou, Yingying; Guo, Yuanyuan; Jiang, Hongwei; Zhou, Guofu

    2016-01-01

    Industrialization of electrofluidic devices requires both high performance coating laminates and efficient material utilization on large area substrates. Here we show that screen printing can be effectively used to provide homogeneous pin-hole free patterned amorphous fluoropolymer dielectric layers to provide both the insulating and fluidic reversibility required for devices. Subsequently, we over-coat photoresist using slit coating on this normally extremely hydrophobic layer. In this way, we are able to pattern the photoresist by conventional lithography to provide the chemical contrast required for liquids dosing by self-assembly and highly-reversible electrofluidic switching. Materials, interfacial chemistry, and processing all contribute to the provision of the required engineered substrate properties. Coating homogeneity as characterized by metrology and device performance data are used to validate the methodology, which is well-suited for transfer to high volume production in existing LCD cell-making facilities. PMID:28773826

  6. Coating and Patterning Functional Materials for Large Area Electrofluidic Arrays.

    PubMed

    Wu, Hao; Tang, Biao; Hayes, Robert A; Dou, Yingying; Guo, Yuanyuan; Jiang, Hongwei; Zhou, Guofu

    2016-08-19

    Industrialization of electrofluidic devices requires both high performance coating laminates and efficient material utilization on large area substrates. Here we show that screen printing can be effectively used to provide homogeneous pin-hole free patterned amorphous fluoropolymer dielectric layers to provide both the insulating and fluidic reversibility required for devices. Subsequently, we over-coat photoresist using slit coating on this normally extremely hydrophobic layer. In this way, we are able to pattern the photoresist by conventional lithography to provide the chemical contrast required for liquids dosing by self-assembly and highly-reversible electrofluidic switching. Materials, interfacial chemistry, and processing all contribute to the provision of the required engineered substrate properties. Coating homogeneity as characterized by metrology and device performance data are used to validate the methodology, which is well-suited for transfer to high volume production in existing LCD cell-making facilities.

  7. Automatic Synthesis of UML Designs from Requirements in an Iterative Process

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Whittle, Jon; Clancy, Daniel (Technical Monitor)

    2001-01-01

    The Unified Modeling Language (UML) is gaining wide popularity for the design of object-oriented systems. UML combines various object-oriented graphical design notations under one common framework. A major factor for the broad acceptance of UML is that it can be conveniently used in a highly iterative, Use Case (or scenario-based) process (although the process is not a part of UML). Here, the (pre-) requirements for the software are specified rather informally as Use Cases and a set of scenarios. A scenario can be seen as an individual trace of a software artifact. Besides first sketches of a class diagram to illustrate the static system breakdown, scenarios are a favorite way of communication with the customer, because scenarios describe concrete interactions between entities and are thus easy to understand. Scenarios with a high level of detail are often expressed as sequence diagrams. Later in the design and implementation stage (elaboration and implementation phases), a design of the system's behavior is often developed as a set of statecharts. From there (and the full-fledged class diagram), actual code development is started. Current commercial UML tools support this phase by providing code generators for class diagrams and statecharts. In practice, it can be observed that the transition from requirements to design to code is a highly iterative process. In this talk, a set of algorithms is presented which perform reasonable synthesis and transformations between different UML notations (sequence diagrams, Object Constraint Language (OCL) constraints, statecharts). More specifically, we will discuss the following transformations: Statechart synthesis, introduction of hierarchy, consistency of modifications, and "design-debugging".

  8. SpaceCubeX: A Framework for Evaluating Hybrid Multi-Core CPU FPGA DSP Architectures

    NASA Technical Reports Server (NTRS)

    Schmidt, Andrew G.; Weisz, Gabriel; French, Matthew; Flatley, Thomas; Villalpando, Carlos Y.

    2017-01-01

    The SpaceCubeX project is motivated by the need for high performance, modular, and scalable on-board processing to help scientists answer critical 21st century questions about global climate change, air quality, ocean health, and ecosystem dynamics, while adding new capabilities such as low-latency data products for extreme event warnings. These goals translate into on-board processing throughput requirements that are on the order of 100 to 1,000 times greater than those of previous Earth Science missions for standard processing, compression, storage, and downlink operations. To study possible future architectures to achieve these performance requirements, the SpaceCubeX project provides an evolvable testbed and framework that enables a focused design space exploration of candidate hybrid CPU/FPGA/DSP processing architectures. The framework includes ArchGen, an architecture generator tool populated with candidate architecture components, performance models, and IP cores, that allows an end user to specify the type, number, and connectivity of a hybrid architecture. The framework requires minimal extensions to integrate new processors, such as the anticipated High Performance Spaceflight Computer (HPSC), reducing time to initiate benchmarking by months. To evaluate the framework, we leverage a wide suite of high performance embedded computing benchmarks and Earth science scenarios to ensure robust architecture characterization. We report on our project's Year 1 efforts and demonstrate the capabilities across four simulation testbed models: a baseline SpaceCube 2.0 system, a dual ARM A9 processor system, a hybrid quad ARM A53 and FPGA system, and a hybrid quad ARM A53 and DSP system.

  9. Investigation of coherent receiver designs in high-speed optical inter-satellite links using digital signal processing

    NASA Astrophysics Data System (ADS)

    Schaefer, S.; Gregory, M.; Rosenkranz, W.

    2017-09-01

    Due to higher data rates, better data security and unlicensed spectral usage, optical inter-satellite links (OISLs) offer an attractive alternative to conventional RF communication. However, the very long transmission distances necessitate an optical receiver design enabling high receiver sensitivity, which requires careful carrier synchronization and a quasi-coherent detection scheme.
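
    One standard feedforward carrier-synchronization scheme available to a DSP-based coherent receiver is the Viterbi-Viterbi (4th-power) phase estimator. The abstract does not state which scheme the compared receiver designs use, so the sketch below is purely illustrative of the kind of processing involved.

```python
# Feedforward carrier phase recovery for QPSK via the Viterbi-Viterbi
# (4th-power) estimator: raising each symbol to the 4th power removes the
# pi/2-spaced QPSK modulation, leaving 4x the carrier phase offset.
# Illustrative sketch only; parameters are not taken from the paper.

import cmath

def viterbi_viterbi_phase(symbols):
    """Estimate a common carrier phase offset (|offset| < pi/4) of QPSK symbols."""
    s = sum(z ** 4 for z in symbols)
    # For a pi/4-offset QPSK constellation, z**4 = -exp(4j*theta),
    # hence the sign flip before taking the argument.
    return cmath.phase(-s) / 4.0

# Ideal QPSK constellation rotated by a 0.2 rad carrier offset:
qpsk = [cmath.exp(1j * (cmath.pi / 4 + k * cmath.pi / 2)) for k in range(4)]
received = [z * cmath.exp(1j * 0.2) for z in qpsk]
print(round(viterbi_viterbi_phase(received), 3))  # → 0.2
```

    Averaging over a block of symbols before taking the argument suppresses noise, which is how such feedforward estimators contribute to the high receiver sensitivity the paper targets.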

  10. Overview and development of EDA tools for integration of DSA into patterning solutions

    NASA Astrophysics Data System (ADS)

    Torres, J. Andres; Fenger, Germain; Khaira, Daman; Ma, Yuansheng; Granik, Yuri; Kapral, Chris; Mitra, Joydeep; Krasnova, Polina; Ait-Ferhat, Dehia

    2017-03-01

    Directed Self-Assembly is the method by which a self-assembly polymer is forced to follow a desired geometry defined or influenced by a guiding pattern. Such a guiding pattern uses surface potentials, confinement or both to achieve polymer configurations that result in circuit-relevant topologies, which can be patterned onto a substrate. Chemo- and grapho-epitaxy of line and space structures are now routinely inspected at full wafer level to understand the defectivity limits of the materials and their maximum resolution. In the same manner, there is a deeper understanding of the formation of cylinders using grapho-epitaxy processes. Academia has also contributed by developing methods that help reduce the number of masks in advanced nodes by "combining" DSA-compatible groups, thus reducing the total cost of the process. From the point of view of EDA, new tools are required when a technology is adopted, and most technologies are adopted when they show a clear cost-benefit over alternative techniques. In addition, years of EDA development have led to the creation of very flexible toolkits that permit rapid prototyping and evaluation of new process alternatives. With the development of high-chi materials, the move away from the well-characterized PS-PMMA systems, and novel integrations in the substrates that work in tandem with diblock copolymer systems, it is necessary to assess any new requirements that may or may not need custom tools to support such processes. Hybrid DSA processes (which contain both chemo and grapho elements) are currently being investigated as possible contenders for sub-5nm process techniques. Because such processes permit the re-distribution of discontinuities in the regular arrays between the substrate and a cut operation, they have the potential to extend the number of applications for DSA. This paper illustrates why some DSA processes can be supported by existing rules and technology, while others require the development of highly customized correction tools and models. It also shows that DSA development cannot be done in isolation: it requires the full collaboration of EDA, materials suppliers, manufacturing equipment and metrology vendors, and electronics manufacturers.

  11. Extending enterprise architecture modelling with business goals and requirements

    NASA Astrophysics Data System (ADS)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available, however, for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  12. A combined electron beam/optical lithography process step for the fabrication of sub-half-micron-gate-length MMIC chips

    NASA Technical Reports Server (NTRS)

    Sewell, James S.; Bozada, Christopher A.

    1994-01-01

    Advanced radar and communication systems rely heavily on state-of-the-art microelectronics. Systems such as the phased-array radar require many transmit/receive (T/R) modules which are made up of many millimeter-wave/microwave integrated circuits (MMICs). The heart of an MMIC chip is the Gallium Arsenide (GaAs) field-effect transistor (FET). The transistor gate length is the critical feature that determines the operating frequency of the radar system. A smaller gate length will typically result in a higher frequency. In order to make a phased array radar system economically feasible, manufacturers must be capable of producing very large quantities of small-gate-length MMIC chips at a relatively low cost per chip. This requires the processing of a large number of wafers with a large number of chips per wafer, minimum processing time, and a very high chip yield. One of the bottlenecks in the fabrication of MMIC chips is the transistor gate definition. The definition of sub-half-micron gates for GaAs-based field-effect transistors is generally performed by direct-write electron beam lithography (EBL). Because of the throughput limitations of EBL, the gate-layer fabrication is conventionally divided into two lithographic processes where EBL is used to generate the gate fingers and optical lithography is used to generate the large-area gate pads and interconnects. As a result, two complete sequences of resist application, exposure, development, metallization and lift-off are required for the entire gate structure. We have baselined a hybrid process, referred to as EBOL (electron beam/optical lithography), in which a single application of a multi-level resist is used for both exposures. The entire gate structure (gate fingers, interconnects, and pads) is then formed with a single metallization and lift-off process. 
The EBOL process thus retains the advantages of the high-resolution E-beam lithography and the high throughput of optical lithography while essentially eliminating an entire lithography/metallization/lift-off process sequence. This technique has been proven to be reliable for both trapezoidal and mushroom gates and has been successfully applied to metal-semiconductor and high-electron-mobility field-effect transistor (MESFET and HEMT) wafers containing devices with gate lengths down to 0.10 micron and 75 x 75 micron gate pads. The yields and throughput of these wafers have been very high with no loss in device performance. We will discuss the entire EBOL process technology including the multilayer resist structure, exposure conditions, process sensitivities, metal edge definition, device results, comparison to the standard gate-layer process, and its suitability for manufacturing.

  13. A combined electron beam/optical lithography process step for the fabrication of sub-half-micron-gate-length MMIC chips

    NASA Astrophysics Data System (ADS)

    Sewell, James S.; Bozada, Christopher A.

    1994-02-01

    Advanced radar and communication systems rely heavily on state-of-the-art microelectronics. Systems such as the phased-array radar require many transmit/receive (T/R) modules which are made up of many millimeter-wave/microwave integrated circuits (MMICs). The heart of an MMIC chip is the Gallium Arsenide (GaAs) field-effect transistor (FET). The transistor gate length is the critical feature that determines the operating frequency of the radar system. A smaller gate length will typically result in a higher frequency. In order to make a phased array radar system economically feasible, manufacturers must be capable of producing very large quantities of small-gate-length MMIC chips at a relatively low cost per chip. This requires the processing of a large number of wafers with a large number of chips per wafer, minimum processing time, and a very high chip yield. One of the bottlenecks in the fabrication of MMIC chips is the transistor gate definition. The definition of sub-half-micron gates for GaAs-based field-effect transistors is generally performed by direct-write electron beam lithography (EBL). Because of the throughput limitations of EBL, the gate-layer fabrication is conventionally divided into two lithographic processes where EBL is used to generate the gate fingers and optical lithography is used to generate the large-area gate pads and interconnects. As a result, two complete sequences of resist application, exposure, development, metallization and lift-off are required for the entire gate structure. We have baselined a hybrid process, referred to as EBOL (electron beam/optical lithography), in which a single application of a multi-level resist is used for both exposures. The entire gate structure (gate fingers, interconnects, and pads) is then formed with a single metallization and lift-off process. 
The EBOL process thus retains the advantages of the high-resolution E-beam lithography and the high throughput of optical lithography while essentially eliminating an entire lithography/metallization/lift-off process sequence. This technique has been proven to be reliable for both trapezoidal and mushroom gates and has been successfully applied to metal-semiconductor and high-electron-mobility field-effect transistor (MESFET and HEMT) wafers containing devices with gate lengths down to 0.10 micron and 75 x 75 micron gate pads. The yields and throughput of these wafers have been very high with no loss in device performance. We will discuss the entire EBOL process technology including the multilayer resist structure, exposure conditions, process sensitivities, metal edge definition, device results, comparison to the standard gate-layer process, and its suitability for manufacturing.

  14. In-situ quality monitoring during laser brazing

    NASA Astrophysics Data System (ADS)

    Ungers, Michael; Fecker, Daniel; Frank, Sascha; Donst, Dmitri; Märgner, Volker; Abels, Peter; Kaierle, Stefan

    Laser brazing of zinc-coated steel is a widely established manufacturing process in the automotive sector, where high quality requirements must be fulfilled. The strength, impermeability, and surface appearance of the joint are particularly important for judging its quality. The development of an on-line quality control system is highly desired by the industry. This paper presents recent work on the development of such a system, which consists of two cameras operating in different spectral ranges. For the evaluation of the system, seam imperfections are created artificially during experiments. Finally, image processing algorithms for monitoring process parameters based on the captured images are presented.

  15. Fabrication of High-Resolution Gamma-Ray Metallic Magnetic Calorimeters with Ag:Er Sensor and Thick Electroplated Absorbers

    NASA Astrophysics Data System (ADS)

    Hummatov, Ruslan; Hall, John A.; Kim, Geon-Bo; Friedrich, Stephan; Cantor, Robin; Boyd, S. T. P.

    2018-05-01

    We are developing metallic magnetic calorimeters for high-resolution gamma-ray spectroscopy for non-destructive assay of nuclear materials. Absorbers for these higher-energy photons can require substantial thickness to achieve adequate stopping power. We developed a new absorber fabrication process using dry-film photoresists to electroform cantilevered, thick absorbers. Gamma detectors with these absorbers have an energy resolution of 38 eV FWHM at 60 keV. In this report, we summarize modifications to STARCryo's "Delta 1000" process for our devices and describe the new absorber fabrication process.

  16. Neural manufacturing: a novel concept for processing modeling, monitoring, and control

    NASA Astrophysics Data System (ADS)

    Fu, Chi Y.; Petrich, Loren; Law, Benjamin

    1995-09-01

    Semiconductor fabrication lines have become extremely costly, and achieving a good return from such a high capital investment requires efficient utilization of these expensive facilities. It is highly desirable to shorten process development time, increase fabrication yield, enhance flexibility, improve quality, and minimize downtime. We propose that these ends can be achieved by applying recent advances in the areas of artificial neural networks, fuzzy logic, machine learning, and genetic algorithms. We use the term neural manufacturing to describe such applications. This paper describes our use of artificial neural networks to improve the monitoring and control of semiconductor processes.

  17. A fast, programmable hardware architecture for spaceborne SAR processing

    NASA Technical Reports Server (NTRS)

    Bennett, J. R.; Cumming, I. G.; Lim, J.; Wedding, R. M.

    1983-01-01

    The launch of spaceborne SARs during the 1980's is discussed. The satellite SARs require high-quality and high-throughput ground processors. Compression ratios in range and azimuth of greater than 500 and 150, respectively, lead to frequency-domain processing and data computation rates in excess of 2000 million real operations per second for the C-band SARs under consideration. Various hardware architectures are examined, two promising candidates are selected, and a fast, programmable hardware architecture for spaceborne SAR processing is recommended. Modularity and programmability are introduced as desirable attributes for the purpose of HTSP hardware selection.
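    The frequency-domain processing mentioned above rests on pulse compression by matched filtering. As a minimal illustrative sketch (not the paper's processor; all parameter values below are arbitrary assumptions), a linear FM chirp buried in a range line can be compressed by multiplying the line's spectrum with the conjugate spectrum of the reference chirp:

```python
import numpy as np

# Arbitrary, illustrative radar parameters (not from the paper).
fs = 100e6          # sample rate (Hz)
T = 10e-6           # transmitted pulse duration (s)
B = 30e6            # chirp bandwidth (Hz)
n = int(fs * T)
t = np.arange(n) / fs
chirp = np.exp(1j * np.pi * (B / T) * (t - T / 2) ** 2)  # linear FM pulse

# Simulated received range line: the chirp embedded at a known delay.
line = np.zeros(4096, dtype=complex)
delay = 1000
line[delay:delay + n] = chirp

# Frequency-domain matched filter: multiply by the conjugate reference
# spectrum, then inverse transform (circular cross-correlation).
N = len(line)
H = np.conj(np.fft.fft(chirp, N))
compressed = np.fft.ifft(np.fft.fft(line) * H)

peak = int(np.argmax(np.abs(compressed)))
print(peak)  # the dispersed pulse collapses to a sharp peak at the delay
```

    The n-sample pulse collapses to a peak roughly one resolution cell wide, which is the sense in which range compression ratios of several hundred to one arise.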

  18. Fabrication of X-ray Microcalorimeter Focal Planes Composed of Two Distinct Pixel Types.

    PubMed

    Wassell, E J; Adams, J S; Bandler, S R; Betancourt-Martinez, G L; Chiao, M P; Chang, M P; Chervenak, J A; Datesman, A M; Eckart, M E; Ewin, A J; Finkbeiner, F M; Ha, J Y; Kelley, R; Kilbourne, C A; Miniussi, A R; Sakai, K; Porter, F; Sadleir, J E; Smith, S J; Wakeham, N A; Yoon, W

    2017-06-01

    We are developing superconducting transition-edge sensor (TES) microcalorimeter focal planes for versatility in meeting specifications of X-ray imaging spectrometers including high count-rate, high energy resolution, and large field-of-view. In particular, a focal plane composed of two sub-arrays: one of fine-pitch, high count-rate devices and the other of slower, larger pixels with similar energy resolution, offers promise for the next generation of astrophysics instruments, such as the X-ray Integral Field Unit (X-IFU) instrument on the European Space Agency's Athena mission. We have based the sub-arrays of our current design on successful pixel designs that have been demonstrated separately. Pixels with an all gold X-ray absorber on 50 and 75 micron scales where the Mo/Au TES sits atop a thick metal heatsinking layer have shown high resolution and can accommodate high count-rates. The demonstrated larger pixels use a silicon nitride membrane for thermal isolation, thinner Au and an added bismuth layer in a 250 micron square absorber. To tune the parameters of each sub-array requires merging the fabrication processes of the two detector types. We present the fabrication process for dual production of different X-ray absorbers on the same substrate, thick Au on the small pixels and thinner Au with a Bi capping layer on the larger pixels to tune their heat capacities. The process requires multiple electroplating and etching steps, but the absorbers are defined in a single ion milling step. We demonstrate methods for integrating heatsinking of the two types of pixel into the same focal plane consistent with the requirements for each sub-array, including the limiting of thermal crosstalk. We also discuss fabrication process modifications for tuning the intrinsic transition temperature (Tc) of the bilayers for the different device types through variation of the bilayer thicknesses. The latest results on these "hybrid" arrays will be presented.

  19. Fabrication of X-ray Microcalorimeter Focal Planes Composed of Two Distinct Pixel Types

    PubMed Central

    Wassell, E. J.; Adams, J. S.; Bandler, S. R.; Betancourt-Martinez, G. L.; Chiao, M. P.; Chang, M. P.; Chervenak, J. A.; Datesman, A. M.; Eckart, M. E.; Ewin, A. J.; Finkbeiner, F. M.; Ha, J. Y.; Kelley, R.; Kilbourne, C. A.; Miniussi, A. R.; Sakai, K.; Porter, F.; Sadleir, J. E.; Smith, S. J.; Wakeham, N. A.; Yoon, W.

    2017-01-01

    We are developing superconducting transition-edge sensor (TES) microcalorimeter focal planes for versatility in meeting specifications of X-ray imaging spectrometers including high count-rate, high energy resolution, and large field-of-view. In particular, a focal plane composed of two sub-arrays: one of fine-pitch, high count-rate devices and the other of slower, larger pixels with similar energy resolution, offers promise for the next generation of astrophysics instruments, such as the X-ray Integral Field Unit (X-IFU) instrument on the European Space Agency’s Athena mission. We have based the sub-arrays of our current design on successful pixel designs that have been demonstrated separately. Pixels with an all gold X-ray absorber on 50 and 75 micron scales where the Mo/Au TES sits atop a thick metal heatsinking layer have shown high resolution and can accommodate high count-rates. The demonstrated larger pixels use a silicon nitride membrane for thermal isolation, thinner Au and an added bismuth layer in a 250 micron square absorber. To tune the parameters of each sub-array requires merging the fabrication processes of the two detector types. We present the fabrication process for dual production of different X-ray absorbers on the same substrate, thick Au on the small pixels and thinner Au with a Bi capping layer on the larger pixels to tune their heat capacities. The process requires multiple electroplating and etching steps, but the absorbers are defined in a single ion milling step. We demonstrate methods for integrating heatsinking of the two types of pixel into the same focal plane consistent with the requirements for each sub-array, including the limiting of thermal crosstalk. We also discuss fabrication process modifications for tuning the intrinsic transition temperature (Tc) of the bilayers for the different device types through variation of the bilayer thicknesses. The latest results on these “hybrid” arrays will be presented. PMID:28804229

  20. Fabrication of X-ray Microcalorimeter Focal Planes Composed of Two Distinct Pixel Types

    NASA Technical Reports Server (NTRS)

    Wassell, Edward J.; Adams, Joseph S.; Bandler, Simon R.; Betancour-Martinez, Gabriele L; Chiao, Meng P.; Chang, Meng Ping; Chervenak, James A.; Datesman, Aaron M.; Eckart, Megan E.; Ewin, Audrey J.

    2016-01-01

    We develop superconducting transition-edge sensor (TES) microcalorimeter focal planes for versatility in meeting the specifications of X-ray imaging spectrometers, including high count rate, high energy resolution, and large field of view. In particular, a focal plane composed of two subarrays: one of fine pitch, high count-rate devices and the other of slower, larger pixels with similar energy resolution, offers promise for the next generation of astrophysics instruments, such as the X-ray Integral Field Unit Instrument on the European Space Agency's ATHENA mission. We have based the subarrays of our current design on successful pixel designs that have been demonstrated separately. Pixels with an all-gold X-ray absorber on 50 and 75 micron pitch, where the Mo/Au TES sits atop a thick metal heatsinking layer, have shown high resolution and can accommodate high count rates. The demonstrated larger pixels use a silicon nitride membrane for thermal isolation, thinner Au, and an added bismuth layer in a 250 micron square absorber. To tune the parameters of each subarray requires merging the fabrication processes of the two detector types. We present the fabrication process for dual production of different X-ray absorbers on the same substrate, thick Au on the small pixels and thinner Au with a Bi capping layer on the larger pixels to tune their heat capacities. The process requires multiple electroplating and etching steps, but the absorbers are defined in a single ion milling step. We demonstrate methods for integrating the heatsinking of the two types of pixel into the same focal plane consistent with the requirements for each subarray, including the limiting of thermal crosstalk. We also discuss fabrication process modifications for tuning the intrinsic transition temperature (T(sub c)) of the bilayers for the different device types through variation of the bilayer thicknesses. The latest results on these 'hybrid' arrays will be presented.

  1. Computational study on a puzzle in the biosynthetic pathway of anthocyanin: Why is an enzymatic oxidation/reduction process required for a simple tautomerization?

    PubMed Central

    Sato, Hajime; Wang, Chao; Yamazaki, Mami; Saito, Kazuki; Uchiyama, Masanobu

    2018-01-01

    In the late stage of anthocyanin biosynthesis, dihydroflavonol reductase (DFR) and anthocyanidin synthase (ANS) mediate a formal tautomerization. However, such an oxidation/reduction process requires high energy and appears to be unnecessary, as the oxidation state does not change during the transformation. Thus, a non-enzymatic pathway of tautomerization has also been proposed. To resolve the long-standing issue of whether this non-enzymatic pathway is the main contributor to the biosynthesis, we carried out density functional theory (DFT) calculations to examine this non-enzymatic pathway from dihydroflavonol to anthocyanidin. We show here that the activation barriers for the proposed non-enzymatic tautomerization are too high to enable the reaction to proceed under normal aqueous conditions in plants. The calculations also explain the experimentally observed requirement for acidic conditions during the final step of conversion of 2-flaven-3,4-diol to anthocyanidin; a thermodynamically and kinetically favorable concerted pathway can operate under these conditions. PMID:29897974

  2. Computational study on a puzzle in the biosynthetic pathway of anthocyanin: Why is an enzymatic oxidation/reduction process required for a simple tautomerization?

    PubMed

    Sato, Hajime; Wang, Chao; Yamazaki, Mami; Saito, Kazuki; Uchiyama, Masanobu

    2018-01-01

    In the late stage of anthocyanin biosynthesis, dihydroflavonol reductase (DFR) and anthocyanidin synthase (ANS) mediate a formal tautomerization. However, such an oxidation/reduction process requires high energy and appears to be unnecessary, as the oxidation state does not change during the transformation. Thus, a non-enzymatic pathway of tautomerization has also been proposed. To resolve the long-standing issue of whether this non-enzymatic pathway is the main contributor to the biosynthesis, we carried out density functional theory (DFT) calculations to examine this non-enzymatic pathway from dihydroflavonol to anthocyanidin. We show here that the activation barriers for the proposed non-enzymatic tautomerization are too high to enable the reaction to proceed under normal aqueous conditions in plants. The calculations also explain the experimentally observed requirement for acidic conditions during the final step of conversion of 2-flaven-3,4-diol to anthocyanidin; a thermodynamically and kinetically favorable concerted pathway can operate under these conditions.

  3. Narrative writing: Effective ways and best practices

    PubMed Central

    Ledade, Samir D.; Jain, Shishir N.; Darji, Ankit A.; Gupta, Vinodkumar H.

    2017-01-01

    A narrative is a brief summary of specific events experienced by a patient during the course of a clinical trial. Narrative writing involves multiple activities such as generation of patient profiles, review of data sources, and identification of events for which narratives are required. A sponsor outsources narrative writing activities to leverage the expertise of service providers, which in turn requires effective management of resources, cost, time, and quality, as well as overall project management. Narratives are included as an appendix to the clinical study report and are submitted to the regulatory authorities as part of the dossier. Narratives aid in the evaluation of the safety profile of the investigational drug under study. Delivery of high-quality narratives to the sponsor within the specified timeframe can be achieved by standardizing processes, increasing efficiency, optimizing working capacity, implementing automation, and reducing cost. This paper focuses on effective ways to design the narrative writing process and suggests best practices that enable timely delivery of high-quality narratives to fulfill the regulatory requirement. PMID:28447014

  4. Certifying leaders? high-quality management practices and healthy organisations: an ISO-9000 based standardisation approach

    PubMed Central

    MONTANO, Diego

    2016-01-01

    The present study proposes a set of quality requirements for management practices by taking into account the empirical evidence on their potential effects on health, the systemic nature of social organisations, and the current conceptualisations of management functions within the framework of comprehensive quality management systems. Systematic reviews and meta-analyses focusing on the associations between leadership and/or supervision and health in occupational settings are evaluated, and the core elements of an ISO 9001 standardisation approach are presented. Six major occupational health requirements for high-quality management practices are identified, pertaining to communication processes, organisational justice, role clarity, decision making, social influence processes, and management support. It is concluded that the quality of management practices may be improved by developing a quality management system of management practices that ensures conformity not only to product requirements but also to occupational safety and health requirements. Further research may evaluate the practicability of the proposed approach. PMID:26860787

  5. Certifying leaders? high-quality management practices and healthy organisations: an ISO-9000 based standardisation approach.

    PubMed

    Montano, Diego

    2016-08-05

    The present study proposes a set of quality requirements for management practices by taking into account the empirical evidence on their potential effects on health, the systemic nature of social organisations, and the current conceptualisations of management functions within the framework of comprehensive quality management systems. Systematic reviews and meta-analyses focusing on the associations between leadership and/or supervision and health in occupational settings are evaluated, and the core elements of an ISO 9001 standardisation approach are presented. Six major occupational health requirements for high-quality management practices are identified, pertaining to communication processes, organisational justice, role clarity, decision making, social influence processes, and management support. It is concluded that the quality of management practices may be improved by developing a quality management system of management practices that ensures conformity not only to product requirements but also to occupational safety and health requirements. Further research may evaluate the practicability of the proposed approach.

  6. Narrative writing: Effective ways and best practices.

    PubMed

    Ledade, Samir D; Jain, Shishir N; Darji, Ankit A; Gupta, Vinodkumar H

    2017-01-01

    A narrative is a brief summary of specific events experienced by a patient during the course of a clinical trial. Narrative writing involves multiple activities such as generation of patient profiles, review of data sources, and identification of events for which narratives are required. A sponsor outsources narrative writing activities to leverage the expertise of service providers, which in turn requires effective management of resources, cost, time, and quality, as well as overall project management. Narratives are included as an appendix to the clinical study report and are submitted to the regulatory authorities as part of the dossier. Narratives aid in the evaluation of the safety profile of the investigational drug under study. Delivery of high-quality narratives to the sponsor within the specified timeframe can be achieved by standardizing processes, increasing efficiency, optimizing working capacity, implementing automation, and reducing cost. This paper focuses on effective ways to design the narrative writing process and suggests best practices that enable timely delivery of high-quality narratives to fulfill the regulatory requirement.

  7. A programmable computational image sensor for high-speed vision

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian

    2013-08-01

    In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array, and a RISC core. The pixel-parallel PE array is responsible for transferring, storing, and processing image raw data in a SIMD fashion with its own programming language. The RPs are a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. The PE array and RP array can finish large amounts of computation in a few instruction cycles and therefore satisfy low- and middle-level high-speed image processing requirements. The RISC core controls the whole system operation and executes some high-level image processing algorithms. We utilize a simplified AHB bus as the system bus to connect our major components. A programming language and corresponding tool chain for this computational image sensor are also developed.
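    The essence of the pixel-parallel SIMD model is that every PE applies the same instruction to its own pixel and its neighbors' pixels simultaneously. As a hedged sketch (whole-array numpy operations standing in for the PE array's instruction stream; the function and threshold below are hypothetical, not from the paper), a typical low-level task such as edge detection looks like this:

```python
import numpy as np

def simd_edge_detect(img, thresh=64):
    """3x3 gradient magnitude + threshold, written as array-wide shifts.

    Each shifted copy models a neighbor-to-neighbor PE transfer; the
    arithmetic is one identical instruction executed by every pixel's PE.
    Edge-replicate padding stands in for the array's boundary handling.
    """
    img = img.astype(np.int32)
    left  = np.pad(img, ((0, 0), (1, 0)), mode='edge')[:, :-1]
    right = np.pad(img, ((0, 0), (0, 1)), mode='edge')[:, 1:]
    up    = np.pad(img, ((1, 0), (0, 0)), mode='edge')[:-1, :]
    down  = np.pad(img, ((0, 1), (0, 0)), mode='edge')[1:, :]
    grad = np.abs(right - left) + np.abs(down - up)  # same op in every PE
    return (grad > thresh).astype(np.uint8)          # 1-bit result per pixel

# Tiny demo: 8x8 frame with a vertical step edge between columns 3 and 4.
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4:] = 255
out = simd_edge_detect(img)
print(out[:, 3:6])  # columns 3 and 4 (straddling the step) are flagged
```

    The point of the sketch is the data-flow pattern, not the operator: a few neighbor transfers plus one arithmetic instruction produce a full-frame result, which is why such arrays finish low-level kernels in a handful of instruction cycles.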

  8. The distinguishing signature of magnetic Penrose process

    NASA Astrophysics Data System (ADS)

    Dadhich, Naresh; Tursunov, Arman; Ahmedov, Bobomurat; Stuchlík, Zdeněk

    2018-07-01

    In this Letter, we wish to point out that the distinguishing feature of the magnetic Penrose process (MPP) is its super-high efficiency, exceeding 100 per cent (established in the mid-1980s for discrete particle accretion), of electromagnetic extraction of the rotational energy of a rotating black hole for a magnetic field of milligauss order. Another similar process, also driven by the electromagnetic field, is the Blandford-Znajek mechanism (BZ), which could be envisaged as the high-magnetic-field limit of MPP, as it requires a threshold magnetic field of order 10^4 G. Recent simulation studies of fully relativistic magnetohydrodynamic (MHD) flows have borne out the super-high-efficiency signature of the process for the high-magnetic-field regime, viz. BZ. We would like to make a clear prediction that similar simulation studies of MHD flows for the low-magnetic-field regime, where BZ would be inoperative, would also show super efficiency.

  9. OPERATOR BURDEN IN METAL ADDITIVE MANUFACTURING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Amy M; Love, Lonnie J

    2016-01-01

    Additive manufacturing (AM) is an emerging manufacturing process that creates usable machine parts via layer-by-layer joining of a stock material. With this layer-wise approach, high-performance geometries can be created which are impossible with traditional manufacturing methods. Metal AM technology has the potential to significantly reduce the manufacturing burden of developing custom hardware; however, a major consideration in choosing a metal AM system is the required amount of operator involvement (i.e., operator burden) in the manufacturing process. The operator burden not only determines the amount of operator training and specialization required but also the usability of the system in a facility. As operators of several metal AM processes, the Manufacturing Demonstration Facility (MDF) at Oak Ridge National Laboratory is uniquely poised to provide insight into requirements for operator involvement in each of the three major metal AM processes. The paper covers an overview of each of the three metal AM technologies, focusing on the burden on the operator to complete the build cycle, process the part for final use, and reset the AM equipment for future builds.

  10. Alternative process for thin layer etching: Application to nitride spacer etching stopping on silicon germanium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Posseme, N., E-mail: nicolas.posseme@cea.fr; Pollet, O.; Barnola, S.

    2014-08-04

    Silicon nitride spacer etching is considered today one of the most challenging etch processes for new device realization. For this step, atomic etch precision is required to stop on silicon or silicon germanium with perfect anisotropy (no foot formation). The situation is that none of the current plasma technologies can meet all these requirements. To overcome these issues and meet the highly complex requirements imposed by device fabrication processes, we recently proposed an alternative etching process to the current plasma etch chemistries. This process is based on thin-film modification by light-ion implantation followed by selective removal of the modified layer with respect to the non-modified material. In this Letter, we demonstrate the benefit of this alternative etch method in terms of film damage control (the silicon germanium recess obtained is less than 6 Å), anisotropy (no foot formation), and its compatibility with other integration steps such as epitaxy. The etch mechanisms of this approach are also addressed.

  11. Quadratic Polynomial Regression using Serial Observation Processing: Implementation within DART

    NASA Astrophysics Data System (ADS)

    Hodyss, D.; Anderson, J. L.; Collins, N.; Campbell, W. F.; Reinecke, P. A.

    2017-12-01

    Many Ensemble-Based Kalman filtering (EBKF) algorithms process the observations serially. Serial observation processing views the data assimilation process as an iterative sequence of scalar update equations. What is useful about this data assimilation algorithm is that it has very low memory requirements and does not need complex methods to perform the typical high-dimensional inverse calculation of many other algorithms. Recently, the push has been towards the prediction, and therefore the assimilation of observations, for regions and phenomena for which high resolution is required and/or highly nonlinear physical processes are operating. For these situations, a basic hypothesis is that the use of the EBKF is sub-optimal and performance gains could be achieved by accounting for aspects of the non-Gaussianity. To this end, we develop here a new component of the Data Assimilation Research Testbed (DART) to allow a wide variety of users to test this hypothesis. This new version of DART allows one to run several variants of the EBKF as well as several variants of the quadratic polynomial filter using the same forecast model and observations. Differences between the results of the two systems will then highlight the degree of non-Gaussianity in the system being examined. We will illustrate in this work the differences between the performance of linear versus quadratic polynomial regression in a hierarchy of models from Lorenz-63 to a simple general circulation model.
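    The scalar update sequence described above can be sketched in a few lines. This is a minimal stochastic (perturbed-observation) EnKF written in serial form, not DART's actual implementation; the function name, the tiny 3-variable demo, and all numbers are illustrative assumptions. Each scalar observation yields a scalar innovation variance and a gain vector, so no matrix inverse is ever formed:

```python
import numpy as np

rng = np.random.default_rng(0)

def serial_enkf_update(ens, obs, obs_var, H):
    """Assimilate scalar observations one at a time (stochastic EnKF).

    ens: (n_state, n_members) prior ensemble; H rows map state to obs.
    Each loop iteration is one scalar update equation: a scalar
    innovation variance s and a gain vector k, with no matrix inverse.
    """
    ens = ens.copy()
    n_mem = ens.shape[1]
    for h, y, r in zip(H, obs, obs_var):
        hx = h @ ens                        # observed ensemble, (n_members,)
        s = np.var(hx, ddof=1) + r          # scalar innovation variance
        # Covariance of every state variable with the observed quantity.
        k = ((ens - ens.mean(1, keepdims=True)) @
             (hx - hx.mean())) / ((n_mem - 1) * s)      # gain, (n_state,)
        y_pert = y + rng.normal(0.0, np.sqrt(r), n_mem)  # perturbed obs
        ens += np.outer(k, y_pert - hx)     # rank-one scalar update
    return ens

# Tiny demo: 3-variable state, 200 members, observe variables 0 and 2.
truth = np.array([1.0, -0.5, 2.0])
ens = truth[:, None] + rng.normal(0.0, 1.0, (3, 200))
H = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
post = serial_enkf_update(ens, H @ truth, np.array([0.01, 0.01]), H)
print(np.abs(post.mean(1) - truth))  # observed components pull toward truth
```

    The memory footprint is just the ensemble itself plus a few length-n_state vectors, which is the low-memory property the abstract highlights; a quadratic polynomial filter would replace the linear regression in the gain with a quadratic fit.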

  12. Pathways for Energization of Ca in Mercury's Exosphere

    NASA Technical Reports Server (NTRS)

    Killen, Rosemary M.

    2015-01-01

    We investigate the possible pathways to produce the extreme energy observed in the calcium exosphere of Mercury. Any mechanism must explain the facts that Ca in Mercury's exosphere is extremely hot, that it is seen almost exclusively on the dawnside of the planet, and that its content varies seasonally, not sporadically. Simple diatomic molecules or their clusters are considered, focusing on calcium oxides while acknowledging that Ca sulfides may also be the precursor molecules. We first discuss impact vaporization to justify the assumption that CaO and Ca-oxide clusters are expected from impacts on Mercury. Then we discuss processes by which the atomic Ca is energized to a 70,000 K gas. The processes considered are (1) electron-impact dissociation of CaO molecules, (2) spontaneous dissociation of Ca-bearing molecules following impact vaporization, (3) shock-induced dissociative ionization, (4) photodissociation, and (5) sputtering. We conclude that electron-impact dissociation cannot produce the required abundance of Ca, and sputtering cannot reproduce the observed spatial and temporal variation. Spontaneous dissociation is unlikely to result in the high energy that is seen. Of the two remaining processes, shock-induced dissociative ionization produces the required energy and comes close to producing the required abundance, but rates are highly dependent on the incoming velocity distribution of the impactors. Photodissociation probably can produce the required abundance of Ca, but simulations show that it cannot reproduce the observed spatial distribution.

  13. Vacuum Brazing of Accelerator Components

    NASA Astrophysics Data System (ADS)

    Singh, Rajvir; Pant, K. K.; Lal, Shankar; Yadav, D. P.; Garg, S. R.; Raghuvanshi, V. K.; Mundra, G.

    2012-11-01

    Commonly used materials for accelerator components are those which are vacuum compatible and thermally conductive. Stainless steel, aluminum, and copper are common among them. Stainless steel is a poor heat conductor and is not commonly used where good thermal conductivity is required. Aluminum and copper and their alloys meet the above requirements and are frequently used for this purpose. Fabrication of accelerator components from aluminum and its alloys using welding processes has become common practice nowadays. The use of copper and its various grades is mandatory in the RF devices required for accelerators. Beam-line and front-end components of the accelerators are fabricated from stainless steel and OFHC copper. Fabrication of copper components using welding processes is very difficult and in most cases impossible. Fabrication and joining in such cases is possible using brazing, especially under vacuum or inert-gas atmosphere. Several accelerator components have been vacuum brazed for the Indus projects at the Raja Ramanna Centre for Advanced Technology (RRCAT), Indore, using the vacuum brazing facility available at RRCAT. This paper presents details regarding the development of the above-mentioned high-value and strategic components/assemblies, including the basics of vacuum brazing, details of the vacuum brazing facility, joint design, fixturing of the jobs, selection of filler alloys, optimization of brazing parameters to obtain high-quality brazed joints, and a brief description of vacuum-brazed accelerator components.

  14. Space and Missile Systems Center Standard: Technical Requirements for Electronic Parts, Materials, and Processes used in Space Vehicles

    DTIC Science & Technology

    2013-04-12

    MIL-DTL-38999 Connector, Electrical, Circular, Miniature, High Density, Quick Disconnect (Bayonet, Threaded, and Breech Coupling), Environment Resistant ... Table 1160-1. Resistance Tolerance and Required Derating ... MIL-DTL-5015 Connector, Electrical, Circular Threaded, AN Type, General Specification for ... MIL-H-6088G(1) Heat Treatment of Aluminum Alloys

  15. 37 CFR 2.54 - Requirements for drawings submitted on paper.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... OFFICE, DEPARTMENT OF COMMERCE RULES OF PRACTICE IN TRADEMARK CASES Drawing § 2.54 Requirements for... black ink, or in color if color is claimed as a feature of the mark. (e) Drawings must be typed or made with a pen or by a process that will provide high definition when copied. A photolithographic, printer...

  16. New alloys for electroformed replicated x-ray optics

    NASA Astrophysics Data System (ADS)

    Engelhaupt, Darell E.; Ramsey, Brian D.; O'Dell, Stephen L.; Jones, William D.; Russell, J. Kevin

    2000-11-01

    The process of electroforming nickel x-ray mirror shells from superpolished mandrels has been widely used. The recently launched XMM mission by the European Space Agency (ESA) is an excellent example, containing 174 such mirror shells of diameters ranging from 0.3 - 0.7 meters and with a thickness range of 0.47 - 1.07 mm. To continue to utilize this technique for the next generation of x-ray observatories, where larger collecting areas will be required within the constraints of tight weight budgets, demands that new alloys be developed that can withstand the large stresses imposed on very thin shells by the replication, handling and launch processes. Towards this end, we began a development program in late 1997 to produce a high-strength alloy suitable for electroforming very thin high-resolution x-ray optics for the proposed Constellation-X project. Requirements for this task are quite severe; not only must the electroformed deposit be very strong, it must also have very low residual stresses to prevent serious figure distortions in large thin-walled shells. Further, the processing must be done reasonably near room temperature, as large temperature changes will modify the figure of the mandrel. Also the environment must not be corrosive or otherwise damaging to the mandrel during the processing. The results of the development program are presented, showing the evolution of our plating processes and materials through to the present 'glassy' nickel alloy that satisfies the above requirements.

  17. "Assessment Drives Learning": Do Assessments Promote High-Level Cognitive Processing?

    ERIC Educational Resources Information Center

    Bezuidenhout, M. J.; Alt, H.

    2011-01-01

    Students tend to learn in the way they know, or think, they will be assessed. Therefore, to ensure deep, meaningful learning, assessments must be geared to promote cognitive processing that requires complex, contextualised thinking to construct meaning and create knowledge. Bloom's taxonomy of cognitive levels is used worldwide to assist in…

  18. Developing the Multicultural Personality of a Senior High School Student in the Process of Foreign Language Learning

    ERIC Educational Resources Information Center

    Khairutdinova, Milyausha R.; Lebedeva, Olga V.

    2016-01-01

    The relevance of the research problem is determined by intensification of integration processes in all spheres of life, which results in broadening international cooperation and cultural interaction between different nations and countries. The modern contradictory and heterogeneous world requires serious rethinking of the existing traditions of…

  19. 40 CFR Appendix B to Subpart B of... - Standard for Recover Equipment

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... process it to ARI (Air-Conditioning and Refrigeration Institute) standard 700-93 as a minimum. It is not... equipment capability is required which shall process contaminated refrigerant samples at specific... flare male thread connection as identified in SAE J639 CFC-12 High Pressure Charging Valve Figure 2. 6.3...

  20. 40 CFR Appendix B to Subpart B of... - Standard for Recover Equipment

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... process it to ARI (Air-Conditioning and Refrigeration Institute) standard 700-93 as a minimum. It is not... equipment capability is required which shall process contaminated refrigerant samples at specific... flare male thread connection as identified in SAE J639 CFC-12 High Pressure Charging Valve Figure 2. 6.3...

  1. 40 CFR Appendix B to Subpart B of... - Standard for Recover Equipment

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... process it to ARI (Air-Conditioning and Refrigeration Institute) standard 700-93 as a minimum. It is not... equipment capability is required which shall process contaminated refrigerant samples at specific... flare male thread connection as identified in SAE J639 CFC-12 High Pressure Charging Valve Figure 2. 6.3...

  2. 40 CFR Appendix B to Subpart B of... - Standard for Recover Equipment

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... process it to ARI (Air-Conditioning and Refrigeration Institute) standard 700-93 as a minimum. It is not... equipment capability is required which shall process contaminated refrigerant samples at specific... flare male thread connection as identified in SAE J639 CFC-12 High Pressure Charging Valve Figure 2. 6.3...

  3. 40 CFR Appendix B to Subpart B of... - Standard for Recover Equipment

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... process it to ARI (Air-Conditioning and Refrigeration Institute) standard 700-93 as a minimum. It is not... equipment capability is required which shall process contaminated refrigerant samples at specific... flare male thread connection as identified in SAE J639 CFC-12 High Pressure Charging Valve Figure 2. 6.3...

  4. Effect of starch source in pelleted concentrates on fecal bacterial communities in Thoroughbred mares

    USDA-ARS?s Scientific Manuscript database

    High starch concentrates are often added to equine diets to meet digestible energy requirements of some horses, such as broodmares. Starch source has been shown to affect fecal bacterial communities of horses when fed cereal grains with little to no processing. Others suggest that grain processing, ...

  5. Joining precipitation-hardened nickel-base alloys by friction welding

    NASA Technical Reports Server (NTRS)

    Moore, T. J.

    1972-01-01

    A solid-state deformation welding process, friction welding, has been developed for joining precipitation-hardened nickel-base alloys and other gamma-prime-strengthened materials that heretofore have been virtually unweldable. The method requires rotation of one of the parts to be welded but, where applicable, is an ideal process for high-volume production jobs.

  6. Active Parent Consent for Health Surveys with Urban Middle School Students: Processes and Outcomes

    ERIC Educational Resources Information Center

    Secor-Turner, Molly; Sieving, Renee; Widome, Rachel; Plowman, Shari; Vanden Berk, Eric

    2010-01-01

    Background: To achieve high participation rates and a representative sample, active parent consent procedures require a significant investment of study resources. The purpose of this article is to describe processes and outcomes of utilizing active parent consent procedures with sixth-grade students from urban, ethnically diverse, economically…

  7. Intact Spectral but Abnormal Temporal Processing of Auditory Stimuli in Autism

    ERIC Educational Resources Information Center

    Groen, Wouter B.; van Orsouw, Linda; ter Huurne, Niels; Swinkels, Sophie; van der Gaag, Rutger-Jan; Buitelaar, Jan K.; Zwiers, Marcel P.

    2009-01-01

    The perceptual pattern in autism has been related to either a specific localized processing deficit or a pathway-independent, complexity-specific anomaly. We examined auditory perception in autism using an auditory disembedding task that required spectral and temporal integration. 23 children with high-functioning-autism and 23 matched controls…

  8. High-level waste tank farm set point document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREAS listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which include the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  9. A high-speed linear algebra library with automatic parallelism

    NASA Technical Reports Server (NTRS)

    Boucher, Michael L.

    1994-01-01

    Parallel or distributed processing is key to getting the highest performance from workstations. However, designing and implementing efficient parallel algorithms is difficult and error-prone. It is even more difficult to write code that is both portable to and efficient on many different computers. Finally, it is harder still to satisfy the above requirements and include the reliability and ease of use required of commercial software intended for use in a production environment. As a result, the application of parallel processing technology to commercial software has been extremely limited, even though there are numerous computationally demanding programs that would significantly benefit from it. This paper describes DSSLIB, a library of subroutines that perform many of the time-consuming computations in engineering and scientific software. DSSLIB combines the high efficiency and speed of parallel computation with a serial programming model that eliminates many undesirable side-effects of typical parallel code. The result is a simple way to incorporate the power of parallel processing into commercial software without compromising maintainability, reliability, or ease of use. This gives significant advantages over less powerful non-parallel entries in the market.
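    The "serial programming model over parallel execution" idea that the abstract describes can be sketched in a few lines. The sketch below is illustrative only and is not the DSSLIB API: a caller invokes what looks like an ordinary serial function, while the library parallelizes the work internally.

```python
from concurrent.futures import ThreadPoolExecutor

def matmul(A, B):
    """Looks like an ordinary serial call; rows are computed in parallel.

    Illustrative sketch of a serial-programming-model library routine;
    the name and structure are hypothetical, not taken from DSSLIB.
    """
    Bt = list(zip(*B))  # transpose B once so each task scans columns cheaply
    def row(a):
        return [sum(x * y for x, y in zip(a, col)) for col in Bt]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(row, A))  # map preserves row order for the caller

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

    Because the parallelism is confined inside the function, the caller sees none of the usual side-effects (shared state, ordering hazards) of hand-written parallel code.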

  10. High performance wire grid polarizers using Jet and Flash™ imprint lithography

    NASA Astrophysics Data System (ADS)

    Ahn, Sean; Yang, Jack; Miller, Mike; Ganapathisubramanian, Maha; Menezes, Marlon; Choi, Jin; Xu, Frank; Resnick, Douglas J.; Sreenivasan, S. V.

    2013-03-01

    The ability to pattern materials at the nanoscale can enable a variety of applications ranging from high-density data storage, displays, photonic devices and CMOS integrated circuits to emerging applications in the biomedical and energy sectors. These applications require varying levels of pattern control and short- and long-range order, and have varying cost tolerances. Extremely large area roll-to-roll (R2R) manufacturing on flexible substrates is ubiquitous for applications such as paper and plastic processing. It combines the benefits of high speed and inexpensive substrates to deliver a commodity product at low cost. The challenge is to extend this approach to the realm of nanopatterning and realize similar benefits. The cost of manufacturing is typically driven by speed (or throughput), tool complexity, cost of consumables (materials used, mold or master cost, etc.), substrate cost, and the downstream processing required (annealing, deposition, etching, etc.). In order to achieve low-cost nanopatterning, it is imperative to move towards high-speed imprinting, less complex tools, near-zero waste of consumables and low-cost substrates. The Jet and Flash Imprint Lithography (J-FIL™) process uses drop dispensing of UV-curable resists to assist high-resolution patterning for subsequent dry-etch pattern transfer. The technology is actively being used to develop solutions for memory markets including Flash memory and patterned media for hard disk drives. In this paper we have developed a roll-based J-FIL process and applied it to a technology demonstrator tool, the LithoFlex 100, to fabricate large-area flexible bilayer wire grid polarizers (WGPs) and high-performance WGPs on rigid glass substrates. Extinction ratios of better than 10000 were obtained for the glass-based WGPs. Two simulation packages were also employed to understand the effects of pitch, aluminum thickness and pattern defectivity on the optical performance of the WGP devices.
It was determined that the WGPs can be influenced by both clear and opaque defects in the gratings; however, the defect-density requirements are relaxed relative to those of a high-density semiconductor device.

  11. Getting to the point: Rapid point selection and variable density InSAR time series for urban deformation monitoring

    NASA Astrophysics Data System (ADS)

    Spaans, K.; Hooper, A. J.

    2017-12-01

    The short revisit time and high data acquisition rates of current satellites have resulted in increased interest in the development of deformation monitoring and rapid disaster response capability, using InSAR. Fast, efficient data processing methodologies are required to deliver the timely results necessary for this, and also to limit computing resources required to process the large quantities of data being acquired. Contrary to volcano or earthquake applications, urban monitoring requires high resolution processing, in order to differentiate movements between buildings, or between buildings and the surrounding land. Here we present Rapid time series InSAR (RapidSAR), a method that can efficiently update high resolution time series of interferograms, and demonstrate its effectiveness over urban areas. The RapidSAR method estimates the coherence of pixels on an interferogram-by-interferogram basis. This allows for rapid ingestion of newly acquired images without the need to reprocess the earlier acquired part of the time series. The coherence estimate is based on ensembles of neighbouring pixels with similar amplitude behaviour through time, which are identified on an initial set of interferograms, and need be re-evaluated only occasionally. By taking into account scattering properties of points during coherence estimation, a high quality coherence estimate is achieved, allowing point selection at full resolution. The individual point selection maximizes the amount of information that can be extracted from each interferogram, as no selection compromise has to be reached between high and low coherence interferograms. In other words, points do not have to be coherent throughout the time series to contribute to the deformation time series. We demonstrate the effectiveness of our method over urban areas in the UK. 
We show how the algorithm successfully extracts high-density time series from full-resolution Sentinel-1 interferograms, distinguishing clearly between buildings and surrounding vegetation or streets. The fact that new interferograms can be processed separately from the remainder of the time series helps manage the high data volumes, both in space and time, generated by current missions.
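    The per-ensemble coherence estimate that underpins this kind of point selection is the standard sample coherence magnitude, |Σ s₁s₂*| / √(Σ|s₁|² Σ|s₂|²), computed over the pixels of an ensemble. A minimal pure-Python sketch (variable names are illustrative, not from the RapidSAR code):

```python
def coherence(s1, s2):
    """Sample coherence magnitude of two complex pixel ensembles (0..1)."""
    num = abs(sum(a * b.conjugate() for a, b in zip(s1, s2)))
    den = (sum(abs(a) ** 2 for a in s1) * sum(abs(b) ** 2 for b in s2)) ** 0.5
    return num / den if den else 0.0

s = [1 + 1j, 2 - 1j, 0.5 + 0j]
print(round(coherence(s, s), 6))  # identical ensembles give coherence 1.0
```

    Estimating this per interferogram over amplitude-similar neighbour ensembles is what lets each new acquisition be ingested without reprocessing the earlier part of the stack.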

  12. Monitoring techniques for high accuracy interference fit assembly processes

    NASA Astrophysics Data System (ADS)

    Liuti, A.; Vedugo, F. Rodriguez; Paone, N.; Ungaro, C.

    2016-06-01

    In the automotive industry, many assembly processes require high geometric accuracy, in the micrometer range; open-loop controllers generally cannot meet these requirements, resulting in an increased defect rate and high production costs. This paper presents an experimental study of the interference fit process, aimed at evaluating the aspects that have the most impact on the uncertainty in the final positioning. The press-fitting process considered consists of a press machine operating with a piezoelectric actuator to press a plug into a sleeve. Plug and sleeve are designed and machined to obtain a known interference fit. Differential displacement and velocity measurements of the plug with respect to the sleeve are made by a fiber-optic differential laser Doppler vibrometer. Different driving signals of the piezo actuator provide insight into the differences between a linear and a pulsating press action. The paper highlights how the press-fit assembly process is characterized by two main phases: the first is an elastic deformation of the plug and sleeve, which produces a reversible displacement; the second is a sliding of the plug with respect to the sleeve, which results in an irreversible displacement and finally realizes the assembly. The simultaneous measurement of displacement and force has made it possible to define characteristic features in the signal that identify the start of the irreversible movement. These indicators could be used to develop a control logic for a press assembly process.

  13. Environmental Benign Process for Production of Molybdenum Metal from Sulphide Based Minerals

    NASA Astrophysics Data System (ADS)

    Rajput, Priyanka; Janakiram, Vangada; Jayasankar, Kalidoss; Angadi, Shivakumar; Bhoi, Bhagyadhar; Mukherjee, Partha Sarathi

    2017-10-01

    Molybdenum is a strategic, high-temperature refractory metal that is not found in nature in the free state; it occurs in the earth's crust predominantly as MoO3/MoS2. The main disadvantage of the industrial treatment of Mo concentrate is that the process involves many stages and requires very high temperatures. In almost every step, gaseous, liquid, and solid chemical substances are formed which require further treatment. To overcome these drawbacks, a new one-step process has been developed for the treatment of sulphide and trioxide molybdenum concentrates. This paper presents the results of investigations on molybdenite (MoS2) dissociation using a microwave-assisted plasma unit as well as a transferred-arc thermal plasma torch. It is a single-step process for the preparation of pure molybdenum metal from MoS2 by hydrogen reduction in thermal plasma. Process variables such as H2 gas flow, Ar gas flow, input current, voltage, and time were examined. A molybdenum recovery of the order of 95% was achieved. The XRD results confirm the phases of molybdenum metal, and chemical analysis of the end product indicates the formation of metallic molybdenum (Mo 98%).

  14. Language and culture modulate online semantic processing.

    PubMed

    Ellis, Ceri; Kuipers, Jan R; Thierry, Guillaume; Lovett, Victoria; Turnbull, Oliver; Jones, Manon W

    2015-10-01

    Language has been shown to influence non-linguistic cognitive operations such as colour perception, object categorization and motion event perception. Here, we show that language also modulates higher level processing, such as semantic knowledge. Using event-related brain potentials, we show that highly fluent Welsh-English bilinguals require significantly less processing effort when reading sentences in Welsh that contain factually correct information about Wales than when reading sentences containing the same information presented in English. Crucially, culturally irrelevant information was processed similarly in both Welsh and English. Our findings show that even in highly proficient bilinguals, language interacts with factors associated with personal identity, such as culture, to modulate online semantic processing. © The Author (2015). Published by Oxford University Press.

  15. Implementing high-temperature short-time media treatment in commercial-scale cell culture manufacturing processes.

    PubMed

    Pohlscheidt, Michael; Charaniya, Salim; Kulenovic, Fikret; Corrales, Mahalia; Shiratori, Masaru; Bourret, Justin; Meier, Steven; Fallon, Eric; Kiss, Robert

    2014-04-01

    The production of therapeutic proteins by mammalian cell culture is complex and sets high requirements for process, facility, and equipment design, as well as rigorous regulatory and quality standards. One particular point of concern, and a significant risk to the supply chain, is the susceptibility to contamination by bacteria, fungi, mycoplasma, and viruses. Several technologies have been developed to create barriers against these agents entering the process, e.g. filtration, UV inactivation, and temperature inactivation. However, if not implemented during development of the manufacturing process, these types of process changes can have a significant impact on process performance unless managed appropriately. This article describes the implementation of high-temperature short-time (HTST) treatment of cell culture media as an additional safety barrier against adventitious agents during the transfer of a large-scale commercial cell culture manufacturing process. The necessary steps and experiments, as well as subsequent results during qualification runs and routine manufacturing, are shown.

  16. Investigation of Space Interferometer Control Using Imaging Sensor Output Feedback

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse A.; Cheng, Victor H. L.

    2003-01-01

    Numerous space interferometry missions are planned for the next decade to verify different enabling technologies towards very-long-baseline interferometry to achieve high-resolution imaging and high-precision measurements. These objectives will require coordinated formations of spacecraft separately carrying optical elements comprising the interferometer. High-precision sensing and control of the spacecraft and the interferometer-component payloads are necessary to deliver sub-wavelength accuracy to achieve the scientific objectives. For these missions, the primary scientific product of interferometer measurements may be the only source of data available at the precision required to maintain the spacecraft and interferometer-component formation. A concept is studied for detecting the interferometer's optical configuration errors based on information extracted from the interferometer sensor output. It enables precision control of the optical components, and, in cases of space interferometers requiring formation flight of spacecraft that comprise the elements of a distributed instrument, it enables the control of the formation-flying vehicles because independent navigation or ranging sensors cannot deliver the high-precision metrology over the entire required geometry. Since the concept can act on the quality of the interferometer output directly, it can detect errors outside the capability of traditional metrology instruments, and provide the means needed to augment the traditional instrumentation to enable enhanced performance. Specific analyses performed in this study include the application of signal-processing and image-processing techniques to solve the problems of interferometer aperture baseline control, interferometer pointing, and orientation of multiple interferometer aperture pairs.

  17. Color line scan camera technology and machine vision: requirements to consider

    NASA Astrophysics Data System (ADS)

    Paernaenen, Pekka H. T.

    1997-08-01

    Color machine vision has shown a strong uptrend in use within the past few years, as the introduction of new cameras and scanner technologies underscores. In the future, the movement from monochrome imaging to color will accelerate as machine vision users demand more knowledge about their product stream. As color has come to machine vision, certain requirements apply to the equipment used to digitize color images. Color machine vision needs not only good color separation but also a high dynamic range and a good linear response from the camera used. The importance of these features becomes even greater when the image is converted to another color space, because some information is always lost when converting integer data to another form. Color image processing has traditionally been much slower than gray-level image processing because of the three times greater data volume per image; the same applies to the three times more memory needed. Advances in computers, memory, and processing units have made it possible to handle even large color images cost-efficiently. In some cases image analysis can in fact be easier and faster in color than in a comparable gray-level image because of the greater information content per pixel. Color machine vision also sets new requirements for lighting: high-intensity white light is required to acquire images good enough for further image processing or analysis. New developments in lighting technology are bringing solutions for color imaging.

  18. High-throughput electrical characterization for robust overlay lithography control

    NASA Astrophysics Data System (ADS)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput, and robust overlay measurement is a challenge at the current 14 nm node and in upcoming advanced nodes, with the transition to 300 mm and upcoming 450 mm semiconductor manufacturing, where slight deviations in overlay have a significant impact on reliability and yield1). The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications2). Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements, due to sensitivity, throughput, and low contrast3). We demonstrate a new electrical-measurement-based technique in which resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by fitting a parabolic model to the resistance, from which the minima and inflection points are extracted to characterize overlay control and the process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and the overlay obtained from the fitting. Additionally, excellent correlation of the overlay from electrical measurements with existing image- and diffraction-based techniques is found. We also discuss the challenges of integrating an electrical-measurement-based approach into semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for simultaneously assessing overlay as well as process window and margins with a robust, high-throughput electrical measurement approach.
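    The parabolic-fit step can be illustrated with the three-point vertex formula for equally spaced programmed misalignments: the offset at which resistance is minimal is the built-in overlay error. The function name and the numbers below are hypothetical, not taken from the paper.

```python
def overlay_from_resistance(x0, y0, x1, y1, x2, y2):
    """Vertex of the parabola through three equally spaced (offset, resistance)
    points; the minimum locates the overlay error. Illustrative sketch only."""
    h = x1 - x0  # spacing of the programmed misalignments (assumed uniform)
    return x1 + h * (y0 - y2) / (2.0 * (y0 - 2.0 * y1 + y2))

# synthetic data: R = (x - 0.3)^2, i.e. resistance minimal at overlay 0.3 (a.u.)
print(overlay_from_resistance(-1, 1.69, 0, 0.09, 1, 0.49))  # ≈ 0.3
```

    In practice many offsets would be measured and fitted by least squares; three points are the minimum needed to pin down the parabola's vertex.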

  19. Business Process Management

    NASA Astrophysics Data System (ADS)

    Hantry, Francois; Papazoglou, Mike; van den Heuvel, Willem-Jan; Haque, Rafique; Whelan, Eoin; Carroll, Noel; Karastoyanova, Dimka; Leymann, Frank; Nikolaou, Christos; Lammersdorf, Winfried; Hacid, Mohand-Said

    Business process management is one of the core drivers of business innovation; it rests on strategic technology capable of creating and successfully executing end-to-end business processes. The trend will be to move from relatively stable, organization-specific applications to more dynamic, high-value ones in which business process interactions and trends are examined closely to understand an application's requirements more accurately. Such collaborative, complex end-to-end service interactions give rise to the concept of Service Networks (SNs).

  20. Simplified dichromated gelatin hologram recording process

    NASA Technical Reports Server (NTRS)

    Georgekutty, Tharayil G.; Liu, Hua-Kuang

    1987-01-01

    A simplified method for making dichromated gelatin (DCG) holographic optical elements (HOE) has been discovered. The method is much less tedious and it requires a period of processing time comparable with that for processing a silver halide hologram. HOE characteristics including diffraction efficiency (DE), linearity, and spectral sensitivity have been quantitatively investigated. The quality of the holographic grating is very high. Ninety percent or higher diffraction efficiency has been achieved in simple plane gratings made by this process.

  1. Application and Prospects of High-strength Lightweight Materials used in Coal mine

    NASA Astrophysics Data System (ADS)

    He, Pan

    2017-09-01

    This paper describes high-strength lightweight materials used in coal mines and considers whether their performance can meet underground safety requirements for explosion protection, anti-static behaviour, and the suppression of friction sparks. It also reviews the types, characteristics, and preparation processes of high-strength lightweight materials, showing how modifying or changing the synthesis route can yield the light weight and high strength required of coal mine equipment.

  2. Design of signal reception and processing system of embedded ultrasonic endoscope

    NASA Astrophysics Data System (ADS)

    Li, Ming; Yu, Feng; Zhang, Ruiqiang; Li, Yan; Chen, Xiaodong; Yu, Daoyin

    2009-11-01

    The Embedded Ultrasonic Endoscope, based on an embedded microprocessor and an embedded real-time operating system, sends a micro ultrasonic probe into a body cavity through the biopsy channel of an electronic endoscope, obtaining the histological features of digestive organs by rotary scanning and acquiring pictures of the alimentary canal mucosal surface. At the same time, the ultrasonic signals are handled by the signal reception and processing system, forming images of the full histology of the digestive organs. The signal reception and processing system is an important component of the Embedded Ultrasonic Endoscope. However, the traditional design, which uses multi-level amplifiers and dedicated digital processing circuits, no longer satisfies the high-performance, miniaturization, and low-power requirements of an embedded system, and the high noise introduced by multi-level amplifiers makes the extraction of small signals difficult. Therefore, this paper presents a method of signal reception and processing based on a double variable-gain amplifier and an FPGA, increasing the flexibility and dynamic range of the signal reception and processing system, improving the system noise level, and reducing power consumption. Finally, we set up an embedded experimental system, using a transducer with a center frequency of 8 MHz to scan membrane samples, and display the image of the ultrasonic echo reflected by each layer of the membrane at a frame rate of 5 Hz, verifying the correctness of the system.

  3. Lasers for industrial production processing: tailored tools with increasing flexibility

    NASA Astrophysics Data System (ADS)

    Rath, Wolfram

    2012-03-01

    High-power fiber lasers are the newest generation of diode-pumped solid-state lasers. Thanks to their all-fiber design they are compact, efficient, and robust. Rofin's fiber lasers are available with the highest beam qualities, and the use of different process-fiber core sizes additionally enables the user to adapt the beam quality, focus size, and Rayleigh length to the requirements of the application for best processing results. Multi-mode fibers from 50 μm to 600 μm, with corresponding beam qualities of 2.5 mm·mrad to 25 mm·mrad, are typically used. The integrated beam-switching modules can make the laser power available to four different manufacturing systems or can share the power between two processing heads for parallel processing. CO2 slab lasers likewise combine high power with either single-mode beam quality or higher-order modes. This well-established technology is in use for a large number of industrial applications, processing either metals or non-metallic materials. For many of these applications CO2 lasers remain the best choice of laser source, driven either by the specific requirements of the application or by its cost structure. The technical properties of these lasers will be presented, including an overview of the wavelength-driven differences in application results and examples of current industrial practice such as cutting, welding, and surface processing, including the flexible use of scanners and classical optics processing heads.

  4. Solar cell efficiency and high temperature processing of n-type silicon grown by the noncontact crucible method

    DOE PAGES

    Jensen, Mallory A.; LaSalvia, Vincenzo; Morishige, Ashley E.; ...

    2016-08-01

    The capital expense (capex) of conventional crystal growth methods is a barrier to sustainable growth of the photovoltaic industry. It is challenging for innovative techniques to displace conventional growth methods due to the low dislocation density and high lifetime required for high-efficiency devices. One promising innovation in crystal growth is the noncontact crucible method (NOC-Si), which combines aspects of Czochralski (Cz) growth and conventional casting. This material has the potential to satisfy the dual requirements, with capex likely between that of Cz (high capex) and multicrystalline silicon (mc-Si, low capex). In this contribution, we observe a strong dependence of solar cell efficiency on ingot height, correlated with the evolution of swirl-like defects, for single-crystalline n-type silicon grown by the NOC-Si method. We posit that these defects are similar to those observed in Cz, and we explore the response of NOC-Si to high-temperature treatments including phosphorous diffusion gettering (PDG) and Tabula Rasa (TR). The highest lifetimes (2033 µs for the top of the ingot and 342 µs for the bottom) are achieved for TR followed by a PDG process comprising a standard plateau and a low-temperature anneal. Further improvements can be gained by tailoring the time-temperature profiles of each process. Lifetime analysis after the PDG process indicates the presence of a getterable impurity in the as-grown material, while analysis after TR points to the presence of oxide precipitates, especially at the bottom of the ingot. Uniform lifetime degradation is observed after TR, which we assign to a presently unknown defect. Future work includes additional TR processing to uncover the nature of this defect, microstructural characterization of suspected oxide precipitates, and optimization of the TR process to achieve the dual goals of high lifetime and spatial homogenization.

  5. Enhanced LAW Glass Correlation - Phase 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, Isabelle S.; Matlack, Keith S.; Pegg, Ian L.

    About 50 million gallons of high-level mixed waste is currently stored in underground tanks at the United States Department of Energy’s (DOE’s) Hanford site in the State of Washington. The Hanford Tank Waste Treatment and Immobilization Plant (WTP) will provide DOE’s Office of River Protection (ORP) with a means of treating this waste by vitrification for subsequent disposal. The tank waste will be separated into low- and high-activity waste fractions, which will then be vitrified respectively into Immobilized Low Activity Waste (ILAW) and Immobilized High Level Waste (IHLW) products. The ILAW product will be disposed of in an engineered facility on the Hanford site while the IHLW product is designed for acceptance into a national deep geological disposal facility for high-level nuclear waste. The ILAW and IHLW products must meet a variety of requirements with respect to protection of the environment before they can be accepted for disposal. Acceptable glass formulations for vitrification of Hanford low activity waste (LAW) must meet a variety of product quality, processability, and waste loading requirements. To this end, The Vitreous State Laboratory (VSL) at The Catholic University of America (CUA) developed and tested a number of glass formulations during Part A, Part B1 and Part B2 of the WTP development program. The testing resulted in the selection of target glass compositions for the processing of eight of the Phase I LAW tanks. The selected glass compositions were tested at the crucible scale to confirm their compliance with ILAW performance requirements. Duramelter 100 (DM100) and LAW Pilot Melter tests were then conducted to demonstrate the viability of these glass compositions for LAW vitrification at high processing rates.

  6. Possibilities for specific utilization of material properties for an optimal part design

    NASA Astrophysics Data System (ADS)

    Beier, T.; Gerlach, J.; Roettger, R.; Kuhn, P.

    2017-09-01

    High-strength, cold-formable steels offer great potential for meeting cost and safety requirements in the automotive industry. With strengths of up to 1200 MPa now attainable, certain aspects of using these materials need to be analysed and evaluated early in the development process. In addition to early assessment of crash properties, it is also highly important to adapt the forming process to match the material potential. Steelmaking companies have widened their portfolios of cold-rolled dual-phase steels well beyond the conventional high-strength steels. New grades have been added that offer a customized selection of high energy absorption, deformation resistance, or enhanced cold-forming properties. In this article, the necessary components of material modelling for finite element simulation are discussed. Additionally, the required tests for material model calibration are presented, and the potential of the thyssenkrupp Steel material database is introduced. Besides classical tensile tests at different angles to the rolling direction and the forming limit curve, the hydraulic bulge test is now available for a wide range of modern steel grades. Using the conventional DP-K®60/98 and the higher-yield-strength DP-K®700Y980T, the method for calibrating the yield locus, hardening, and formability is given. With reference to the examples of an A-pillar reinforcement and different crash tests, it is shown how a customer can evaluate the optimal steel grade for specific requirements. Although the investigated materials have different yield strengths, no large differences in the forming process between the two steel grades can be found. However, some advantages of the high-yield grade can be detected in crash performance, depending on the specific boundary and loading conditions.

  7. Alkaline treatment of high-solids sludge and its application to anaerobic digestion.

    PubMed

    Li, Chenchen; Li, Huan; Zhang, Yuyao

    2015-01-01

    High-solids anaerobic digestion is a promising new process for sludge reduction and bioenergy recovery, requiring smaller digestion tanks and less energy for heating, but a longer digestion time, than traditional low-solids anaerobic digestion. To accelerate this process, alkaline sludge disintegration was tested as a pretreatment method for anaerobic digestion of high-solids sludge. The results showed that alkaline treatment effectively disintegrated both low-solids and high-solids sludge, and a treatment duration of 30 min was the most efficient. The relation between sludge disintegration degree and NaOH dose can be described by a transmutative power function model. At NaOH doses lower than 0.2 mol/L, the sludge disintegration degree remained virtually unchanged when sludge total solids (TS) content increased from 2.0 to 11.0%, and decreased only slightly when sludge TS increased to 14.2%. Although high-solids sludge required a slightly higher molarity of NaOH to reach the same disintegration level as low-solids sludge, the required mass of NaOH actually decreased due to sludge thickening. In terms of NaOH consumption, a sludge TS of 8-12% and a NaOH dose of 0.05 mol/L were the optimum conditions for alkaline pretreatment, which resulted in a slight increase in accumulative biogas yield and a 24-29% decrease in digestion time during the subsequent anaerobic digestion.
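The power-function relation between disintegration degree and NaOH dose described above can be sketched numerically. The dose-response pairs below are illustrative placeholders, not the study's data; the fit is a simple log-log linear regression of the form DD = a·C^b, which linearizes to ln DD = ln a + b·ln C.

```python
import numpy as np

# Hypothetical (dose, disintegration degree %) pairs illustrating the trend
# in the abstract; not the paper's actual measurements.
dose = np.array([0.02, 0.05, 0.10, 0.15, 0.20])   # NaOH dose, mol/L
dd   = np.array([18.0, 30.0, 42.0, 50.0, 57.0])   # disintegration degree, %

# Fit DD = a * dose^b by linear regression in log-log space.
b, log_a = np.polyfit(np.log(dose), np.log(dd), 1)
a = np.exp(log_a)

def predict(c):
    """Predicted disintegration degree (%) at NaOH dose c (mol/L)."""
    return a * c ** b

print(round(a, 1), round(b, 2), round(predict(0.05), 1))
```

The same two fitted coefficients can then be used to compare the dose needed at different solids contents, as the abstract does when weighing molarity against total NaOH mass.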

  8. Solid Waste Management Requirements Definition for Advanced Life Support Missions: Results

    NASA Technical Reports Server (NTRS)

    Alazraki, Michael P.; Hogan, John; Levri, Julie; Fisher, John; Drysdale, Alan

    2002-01-01

    Prior to determining what Solid Waste Management (SWM) technologies should be researched and developed by the Advanced Life Support (ALS) Project for future missions, there is a need to define SWM requirements. Because future waste streams will be highly mission-dependent, missions need to be defined prior to developing SWM requirements. The SWM Working Group has used the mission architecture outlined in the System Integration, Modeling and Analysis (SIMA) Element Reference Missions Document (RMD) as a starting point in the requirement development process. The missions examined include the International Space Station (ISS), a Mars Dual Lander mission, and a Mars Base. The SWM Element has also identified common SWM functionalities needed for future missions. These functionalities include: acceptance, transport, processing, storage, monitoring and control, and disposal. Requirements in each of these six areas are currently being developed for the selected missions. This paper reviews the results of this ongoing effort and identifies mission-dependent resource recovery requirements.

  9. Study on loading path optimization of internal high pressure forming process

    NASA Astrophysics Data System (ADS)

    Jiang, Shufeng; Zhu, Hengda; Gao, Fusheng

    2017-09-01

    In the internal high pressure forming process, no analytical formula relates the process parameters to the forming results. In this article, numerical simulation is used to obtain sets of input parameters and their corresponding outputs, and a BP (back-propagation) neural network is trained to capture the mapping between them. A weighted sum of the individual evaluation parameters is then used to construct a single formula for forming quality. The trained BP neural network is embedded in a particle swarm optimization, with the quality formula serving as the fitness function, and the optimization is carried out over the admissible range of each parameter. The results show that the process parameters obtained by combining the BP neural network and particle swarm optimization algorithms meet practical requirements. The method can thus solve the optimization of process parameters in the internal high pressure forming process.
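The surrogate-plus-swarm idea in this abstract can be sketched as follows. The `quality` function below is a hypothetical stand-in for the trained BP network (in practice it would be fitted to finite-element simulation results), and the parameter names, weights, and optimum location are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in surrogate for the trained BP network: maps two hypothetical,
# normalized process parameters (internal pressure, axial feed) to a
# weighted quality score (lower is better), mimicking the weighted-sum
# evaluation formula described in the abstract.
def quality(x):
    p, f = x[..., 0], x[..., 1]
    thinning = (p - 0.6) ** 2          # wall-thinning penalty term
    wrinkling = (f - 0.4) ** 2         # wrinkling penalty term
    return 0.7 * thinning + 0.3 * wrinkling

# Minimal particle swarm optimization over the normalized range [0, 1].
n, iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.random((n, 2))
vel = np.zeros((n, 2))
pbest, pbest_val = pos.copy(), quality(pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    val = quality(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(np.round(gbest, 2))  # should approach the surrogate's optimum near (0.6, 0.4)
```

Replacing the analytic surrogate with a network trained on simulation samples gives the workflow the abstract describes: the swarm queries the cheap surrogate instead of re-running the forming simulation at each candidate.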

  10. System Definition Document

    DOT National Transportation Integrated Search

    1996-06-12

    The Gary-Chicago-Milwaukee (GCM) Corridor Transportation Information Center : (C-TIC) System Definition Document describes the C-TIC concept and defines the : high level processes and dataflows. The Requirements Specification together : with the Inte...

  11. Decreased reproductive rates in sheep fed a high selenium diet

    USDA-ARS?s Scientific Manuscript database

    High Se-containing forages grow on seleniferous soils in many parts of the United States and throughout the world. Selenium is an essential trace element that is required for many physiological processes but can also be either acutely or chronically toxic to livestock. Anecdotal reports of decrease...

  12. High School Student Information Access and Engineering Design Performance

    ERIC Educational Resources Information Center

    Mentzer, Nathan

    2014-01-01

    Developing solutions to engineering design problems requires access to information. Research has shown that appropriately accessing and using information in the design process improves solution quality. This quasi-experimental study provides two groups of high school students with a design problem in a three hour design experience. One group has…

  13. Self-Consciousness, Evaluation of Physical Characteristics, and Physical Attractiveness.

    ERIC Educational Resources Information Center

    Turner, Robert G.; Gilliland, LuNell

    1981-01-01

    Investigated the relationship between public self-consciousness and speed of processing information about the self. Results indicated that subjects high in public self-consciousness required less time to report evaluations of their physical features. In a second study, high public self-consciousness was shown to be positively related to judged physical…

  14. Synthesis and processing of nanostructured BN and BN/Ti composites

    NASA Astrophysics Data System (ADS)

    Horvath, Robert Steven

    Superhard materials, such as cubic-BN, are widely used in machine tools, grinding wheels, and abrasives. Low density combined with high hardness makes c-BN and its composites attractive candidate materials for personnel and vehicular armor. However, improvements in toughness, and ballistic-impact performance, are needed to meet anticipated performance requirements. To achieve such improvements, we have targeted for development nanostructured c-BN, and its composites with Ti. Current research utilizes an experimental high pressure/high temperature (HPHT) method to produce these materials on a laboratory scale. Results from this work should transfer well into the industrial arena, utilizing high-tonnage presses used in the production of synthetic diamond and c-BN. Progress has been made in: (1) HPHT synthesis of cBN powder using Mg as catalyst; (2) HPHT consolidation of cBN powder to produce nanostructured cBN; (3) reactive-HPHT consolidation of mixed cBN/Ti powder to produce nanostructured Ti- or TiB2/TiN-bonded cBN; and (4) reactive-HPHT consolidation of mixed hBN/Ti powder to produce nanostructured Ti-bonded TiB2/TiN or TiB2/TiN. Even so, much remains to be done to lay a firm scientific foundation to enable the reproducible fabrication of large-area panels for armor applications. To this end, Rutgers has formed a partnership with a major producer of hard and superhard materials. The ability to produce hard and superhard nanostructured composites by reacting cBN or hBN with Ti under high pressure also enables multi-layered structures to be developed. Such structures may be designed to satisfy impedance-mismatch requirements for high performance armor, and possibly provide a multi-hit capability. A demonstration has been made of reactive-HPHT processing of multi-layered composites, consisting of alternating layers of superhard Ti-bonded cBN and tough Ti. 
It is noteworthy that the pressure requirements for processing Ti-bonded cBN, Ti-bonded TiB2/TiN, and their corresponding multi-layered structures are in the 0.1-1.0 GPa range, well within the capabilities of today's hot-pressing technologies; thus scaling this new reactive-HPHT processing technology seems assured. Future research will focus on establishing mechanisms and kinetics of the various phase transformations observed during reactive-HPHT processing, with the objective of being able to optimize processing parameters to generate nanostructured cBN-based and TiB2/TiN-based composites that display superior mechanical properties, particularly under high-strain-rate conditions.

  15. High-Si content BARC for dual-BARC systems such as trilayer patterning

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph; Xie, Song-Yuan; Wu, Ze-Yu; Katsanes, Ron; Flanigan, Kyle; Lee, Kevin; Slezak, Mark; Liu, Zhi; Lin, Shang-Ho

    2009-03-01

    This work discusses the requirements and performance of Honeywell's middle layer material, UVAS, for tri-layer patterning. UVAS is a high Si content polymer synthesized directly from Si containing starting monomer components. The monomers are selected to produce a film that meets the requirements as a middle layer for tri-layer patterning (TLP) and gives us a level of flexibility to adjust the properties of the film to meet the customer's specific photoresist and patterning requirements. Results of simulations of the substrate reflectance versus numerical aperture, UVAS thickness, and under layer film are presented. ArF photoresist line profiles and process latitude versus UVAS bake at temperatures as low as 150ºC are presented and discussed. Immersion lithographic patterning of ArF photoresist line space and contact hole features will be presented. A sequence of SEM images detailing the plasma etch transfer of line space photoresist features through the middle and under layer films comprising the TLP film stack will be presented. Excellent etch selectivity between the UVAS and the organic under layer film exists as no edge erosion or faceting is observed as a result of the etch process. A detailed study of the impact of a PGMEA solvent photoresist rework process on the lithographic process window of a TLP film stack was performed with the results indicating that no degradation to the UVAS film occurs.

  16. Low Melt Viscosity Resins for Resin Transfer Molding

    NASA Technical Reports Server (NTRS)

    Harris, Frank W.

    2002-01-01

    In recent years, resin transfer molding (RTM) has become one of the methods of choice for high performance composites. Cost effectiveness and ease of fabrication are major advantages of RTM. The RTM process usually requires resins with very low melt viscosity (less than 10 poise). Optimum RTM resins also need to display high thermal-oxidative stability, a high glass transition temperature (T(sub g)), and good toughness. The traditional PMR-type polyimides (e.g. PMR-15) do not meet this requirement, because their viscosities are too high and the nadic endcap cures too fast. High-T(sub g), low-melt-viscosity resins are highly desirable for aerospace applications and NASA's Reusable Launch Vehicle (RLV) program. The objective of this work is to prepare low-melt-viscosity polyimide resins for RTM or resin film infusion (RFI) processes. The approach involves the synthesis of phenylethynyl-terminated imide oligomers. These materials have been designed to minimize their melt viscosity so that they can be readily processed. During the cure, the oligomers undergo both chain extension and crosslinking via the thermal polymerization of the phenylethynyl groups. The phenylethynyl endcap is preferred over the nadic group due to its high curing temperature, which provides broader processing windows. This work involved the synthesis and polymerization of oligomers containing zig-zag backbones and twisted biphenyl structures. Some A-B type precursors, which possessed both nitro and anhydride functionality or both nitro and amine functionality, were also synthesized in order to obtain well-defined oligomers. The resulting zig-zag structured oligomers were then end-capped with 4-phenylethynylphthalic anhydride (PEPA) for further cure. The properties of these novel imide oligomers are evaluated.

  17. Optimization of an innovative approach involving mechanical activation and acid digestion for the extraction of lithium from lepidolite

    NASA Astrophysics Data System (ADS)

    Vieceli, Nathália; Nogueira, Carlos A.; Pereira, Manuel F. C.; Durão, Fernando O.; Guimarães, Carlos; Margarido, Fernanda

    2018-01-01

    The recovery of lithium from hard rock minerals has received increased attention given the high demand for this element. Therefore, this study optimized an innovative process, which does not require a high-temperature calcination step, for lithium extraction from lepidolite. Mechanical activation and acid digestion were suggested as crucial process parameters, and experimental design and response-surface methodology were applied to model and optimize the proposed lithium extraction process. The promoting effect of amorphization and the formation of lithium sulfate hydrate on lithium extraction yield were assessed. Several factor combinations led to extraction yields that exceeded 90%, indicating that the proposed process is an effective approach for lithium recovery.
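The experimental-design and response-surface step can be illustrated with a minimal quadratic fit. All design points and yields below are hypothetical, chosen only to show the mechanics of fitting a full second-order model and querying it for high-yield factor combinations; the study's actual factors and data are not reproduced.

```python
import numpy as np

# Hypothetical central-composite-style design: x1 = mechanical activation
# level, x2 = acid dose (both coded -1..1), y = lithium extraction yield (%).
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0],
              [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], float)
y = np.array([62, 80, 70, 91, 88, 87, 75, 90, 78, 85], float)

# Design matrix for the full quadratic response-surface model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def yield_hat(x1, x2):
    """Predicted extraction yield (%) at coded factor levels (x1, x2)."""
    return coef @ [1, x1, x2, x1**2, x2**2, x1 * x2]

# Evaluate the fitted surface on a coarse grid to locate high-yield regions.
grid = [(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)]
best = max(grid, key=lambda p: yield_hat(*p))
print(best, round(float(yield_hat(*best)), 1))
```

In a real response-surface study the fitted coefficients would also be screened for significance before the surface is used to pick operating conditions.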

  18. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    PubMed

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, demanding substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence of a particular sequence of images containing a fence crossing event. This resulted in a reduction of 54.8% of images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
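The background-subtraction idea behind such a program can be sketched in a few lines: a frame is flagged as a candidate event when enough of its pixels differ from an empty-scene reference image. The threshold values and synthetic images below are illustrative assumptions, not the program's actual rules.

```python
import numpy as np

def is_candidate(frame, background, pixel_thresh=25, frac_thresh=0.02):
    """Flag a frame when the fraction of changed pixels exceeds frac_thresh.
    Both thresholds are arbitrary placeholders for illustration."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    changed = (diff > pixel_thresh).mean()   # fraction of changed pixels
    return changed > frac_thresh

rng = np.random.default_rng(1)
background = rng.integers(80, 120, (120, 160), dtype=np.uint8)

# An "empty" frame differs from the reference only by small sensor noise;
# an "animal" frame additionally contains a bright foreground blob.
empty = background + rng.integers(-3, 4, background.shape)
animal = background.copy()
animal[40:80, 60:110] = 220

print(is_candidate(empty, background), is_candidate(animal, background))
# prints: False True
```

A production tool would add the histogram rules and sequence logic the abstract mentions on top of this per-frame test to assign confidence categories.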

  19. Performance of electrodialysis reversal and reverse osmosis for reclaiming wastewater from high-tech industrial parks in Taiwan: A pilot-scale study.

    PubMed

    Yen, Feng-Chi; You, Sheng-Jie; Chang, Tien-Chin

    2017-02-01

    Wastewater reclamation is considered an absolute necessity in Taiwan, as numerous industrial parks experience water shortage. However, the water quality of secondary treated effluents from sewage treatment plants generally does not meet the requirements of industrial water use because of the high inorganic constituents. This paper reports experimental data from a pilot-plant study of two treatment processes: (i) fiber filtration (FF)-ultrafiltration (UF)-reverse osmosis (RO) and (ii) sand filtration (SF)-electrodialysis reversal (EDR), for treating industrial high-conductivity effluents from the Xianxi wastewater treatment plant in Taiwan. The results demonstrated that FF-UF was excellent for turbidity removal and was a suitable pretreatment process for RO. The influence of two membrane materials on the operating characteristics and process stability of the UF process was determined. The treatment performance of FF-UF-RO was higher than that of SF-EDR, with an average desalination rate of 97%, a permeate conductivity of 272.7 ± 32.0, a turbidity of 0.183 ± 0.02 NTU, and a chemical oxygen demand of <4.5 mg/L. The cost analysis for both processes in a water reclamation plant of 4000 m3/d capacity revealed that FF-UF-RO had a lower treatment cost than SF-EDR, which required activated carbon filtration as a post-treatment process. On the basis of the results in this study, the FF-UF-RO system is recommended as a potential process for additional applications.

  20. Preliminary results from the High Speed Airframe Integration Research project

    NASA Technical Reports Server (NTRS)

    Coen, Peter G.; Sobieszczanski-Sobieski, Jaroslaw; Dollyhigh, Samuel M.

    1992-01-01

    A review is presented of the accomplishment of the near term objectives of developing an analysis system and optimization methods during the first year of the NASA Langley High Speed Airframe Integration Research (HiSAIR) project. The characteristics of a Mach 3 HSCT transport have been analyzed utilizing the newly developed process. In addition to showing more detailed information about the aerodynamic and structural coupling for this type of vehicle, this exercise aided in further refining the data requirements for the analysis process.

  1. Retooling the nurse executive for 21st century practice: decision support systems.

    PubMed

    Fralic, M F; Denby, C B

    2000-01-01

    Health care financing and care delivery systems are changing at almost warp speed. This requires new responses and new capabilities from contemporary nurse executives and calls for new approaches to the preparation of the next generation of nursing leaders. The premise of this article is that, in these highly unstable environments, the nurse executive faces the need to make high-impact decisions in relatively short time frames. A standardized process for objective decision making becomes essential. This article describes that process.

  2. Anaerobic digestion of municipal solid waste: Technical developments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivard, C.J.

    1996-01-01

    The anaerobic biogasification of organic wastes generates two useful products: a medium-Btu fuel gas and a compost-quality organic residue. Although commercial-scale digestion systems are used to treat municipal sewage wastes, the disposal of solid organic wastes, including municipal solid wastes (MSW), requires a more cost-efficient process. Modern biogasification systems employ high-rate, high-solids fermentation methods to improve process efficiency and reduce capital costs. The design criteria and development stages are discussed. These systems are also compared with conventional low-solids fermentation technology.

  3. Process design and economic analysis of the zinc selenide thermochemical hydrogen cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otsuki, H.H.; Krikorian, O.H.

    1978-09-06

    A detailed preliminary design for a hydrogen production plant has been developed based on an improved version of the ZnSe thermochemical cycle for decomposing water. In the latest version of the cycle, ZnCl2 is converted directly to ZnO through high-temperature steam hydrolysis. This eliminates the need for first converting ZnCl2 to ZnSO4 and also slightly reduces the overall heat requirement. Moreover, it broadens the temperature range over which prime heat is required and improves the coupling of the cycle with a nuclear reactor heat source. The ZnSe cycle is driven by a very-high-temperature nuclear reactor (VHTR) proposed by Westinghouse that provides a high-temperature (1283 K) helium working gas for process heat and power. The plant is sized to produce 27.3 Mg H2/h (60,000 lb H2/h) and requires specially designed equipment to perform the critical reaction steps in the cycle. We have developed conceptual designs for several of the important process steps to make cost estimates, and have obtained a cycle efficiency of about 40% and a hydrogen production cost of about $14/GJ. We believe that the cost is high because input data on reaction rates and equipment lifetimes have been conservatively estimated and the cycle parameters have not been optimized. Nonetheless, this initial analysis serves an important function in delineating areas in the cycle where additional research is needed to increase efficiency and reduce costs in a more advanced version of the cycle.

  4. Processing Conjugated-Diene-Containing Polymers

    NASA Technical Reports Server (NTRS)

    Bell, Vernon L.; Havens, Stephen J.

    1987-01-01

    Diels-Alder reaction used to cross-linked thermoplastics. Process uses Diels-Alder reaction to cross-link and/or extend conjugated-diene-containing polymers by reacting them with bis-unsaturated dienophiles results in improved polymer properties. Quantities of diene groups required for cross-linking varies from very low to very high concentrations. Process also used to extend, or build up molecular weights of, low-molecular-weight linear polymers with terminal conjugated dienic groups.

  5. Low Cost Process for Manufacture of Oxide Dispersion Strengthened (ODS) Turbine Nozzle Components.

    DTIC Science & Technology

    1979-12-01

    Low Cost Process for Manufacture of Oxide Dispersion Strengthened (ODS) Turbine Nozzle Components, General Electric Company Aircraft Engine Group ... machining processes for low pressure turbine (LPT) vanes, high pressure turbine (HPT) vanes, and HPT band segments for the F101 engine. The primary intent ... for aircraft turbine nozzle components. These processes were shown capable of maintaining required microstructures and properties for the vane and

  6. Holistic face representation is highly orientation-specific.

    PubMed

    Rosenthal, Gideon; Levakov, Gidon; Avidan, Galia

    2017-09-29

    It has long been argued that face processing requires disproportionate reliance on holistic processing (HP) relative to that required for nonface object recognition. Nevertheless, whether the holistic nature of face perception is achieved via a unique internal representation or by the employment of an automated attention mechanism is still debated. Previous studies have used the face inversion effect (FIE), a unique face-processing marker, or the face composite task, a gold-standard paradigm for measuring holistic processing, to examine the validity of these two hypotheses, with some studies combining the two paradigms. However, the results of such studies remain inconclusive, particularly with respect to the two proposed HP mechanisms: an internal representation as opposed to an automated attention mechanism. Here, using the complete composite paradigm design, we aimed to examine whether face rotation yields a nonlinear or a linear drop in HP, thus testing whether face processing is based on an orientation-dependent internal representation or on automated attention. Our results reveal that even a relatively small perturbation in face orientation (30 deg away from upright) already causes a sharp decline in HP. These findings support the internal representation hypothesis and the notion that the holistic processing of faces is highly orientation-specific.

  7. Atomic layer deposition and etching methods for far ultraviolet aluminum mirrors

    NASA Astrophysics Data System (ADS)

    Hennessy, John; Moore, Christopher S.; Balasubramanian, Kunjithapatham; Jewell, April D.; Carter, Christian; France, Kevin; Nikzad, Shouleh

    2017-09-01

    High-performance aluminum mirrors at far-ultraviolet wavelengths require transparent dielectric materials as protective coatings to prevent oxidation. Reducing the thickness of this protective layer can yield additional performance gains by minimizing absorption losses, and provides a path toward high Al reflectance in the challenging wavelength range of 90 to 110 nm. We have pursued the development of new atomic layer deposition (ALD) processes for the metal fluoride materials MgF2, AlF3, and LiF. Using anhydrous hydrogen fluoride as a reactant, these films can be deposited at the low temperatures required for large-area surface-finished optics and polymeric diffraction gratings. We also report on the development and application of an atomic layer etching (ALE) procedure to controllably etch native aluminum oxide. Our ALE process utilizes the same chemistry used in the ALD of AlF3 thin films, allowing for a combination of high-performance evaporated Al layers and ultrathin ALD encapsulation without requiring vacuum transfer. Progress in demonstrating the scalability of this approach, as well as the environmental stability of ALD/ALE Al mirrors, is discussed in the context of possible future applications for the NASA LUVOIR and HabEx mission concepts.

  8. The hyperthermophilic α-amylase from Thermococcus sp. HJ21 does not require exogenous calcium for thermostability because of high-binding affinity to calcium.

    PubMed

    Cheng, Huaixu; Luo, Zhidan; Lu, Mingsheng; Gao, Song; Wang, Shujun

    2017-05-01

    The hyperthermophilic α-amylase from Thermococcus sp. HJ21 does not require exogenous calcium ions for thermostability, and is a promising alternative to commercially available α-amylases to increase the efficiency of industrial processes like the liquefaction of starch. We analyzed the amino acid sequence of this α-amylase by sequence alignments and structural modeling, and found that it closely resembles the α-amylase from Pyrococcus woesei. The gene of this α-amylase was cloned in Escherichia coli, and the recombinant α-amylase was overexpressed and purified with a combined renaturation-purification procedure. We confirmed the thermostability and exogenous calcium ion independence of the recombinant α-amylase and further investigated the mechanism of this independence using biochemical approaches. The results suggested that the α-amylase has a high calcium ion binding affinity that traps a calcium ion that does not dissociate at high temperatures, providing a direct explanation as to why the addition of calcium ions is not required for thermostability. Understanding of the mechanism offers a strong base on which to further engineer properties of this α-amylase for better potential applications in industrial processes.

  9. Localization of multiple defects using the compact phased array (CPA) method

    NASA Astrophysics Data System (ADS)

    Senyurek, Volkan Y.; Baghalian, Amin; Tashakori, Shervin; McDaniel, Dwayne; Tansel, Ibrahim N.

    2018-01-01

    Array systems of transducers have found numerous applications in detection and localization of defects in structural health monitoring (SHM) of plate-like structures. Different types of array configurations and analysis algorithms have been used to improve the process of localization of defects. For accurate and reliable monitoring of large structures by array systems, a high number of actuator and sensor elements are often required. In this study, a compact phased array system consisting of only three piezoelectric elements is used in conjunction with an updated total focusing method (TFM) for localization of single and multiple defects in an aluminum plate. The accuracy of the localization process was greatly improved by including wave propagation information in TFM. Results indicated that the proposed CPA approach can locate single and multiple defects with high accuracy while decreasing the processing costs and the number of required transducers. This method can be utilized in critical applications such as aerospace structures where the use of a large number of transducers is not desirable.
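The delay-and-sum principle underlying the total focusing method (TFM) can be sketched for a small array like the three-element system described above. The geometry, wave speed, and synthetic pulses below are illustrative assumptions, not the study's configuration: each transmit-receive pair contributes its signal amplitude at every pixel's theoretical round-trip delay, so amplitudes add coherently only at true scatterer locations.

```python
import numpy as np

c = 5000.0                    # assumed group velocity in the plate, m/s
fs = 1e6                      # sampling rate, Hz
sensors = np.array([[0.0, 0.0], [0.1, 0.0], [0.05, 0.1]])   # 3 elements, m
defect = np.array([0.06, 0.05])                             # true scatterer, m

# Synthesize one signal per transmit-receive pair: a single Gaussian pulse
# at the round-trip time of flight via the defect.
t = np.arange(2000) / fs
signals = {}
for i, tx in enumerate(sensors):
    for j, rx in enumerate(sensors):
        tof = (np.linalg.norm(defect - tx) + np.linalg.norm(defect - rx)) / c
        signals[i, j] = np.exp(-((t - tof) * fs / 5) ** 2)

# Delay-and-sum image: for each pixel, sum each pair's signal amplitude at
# that pixel's theoretical round-trip delay.
xs = np.linspace(0, 0.1, 51)
ys = np.linspace(0, 0.1, 51)
image = np.zeros((len(ys), len(xs)))
for iy, yv in enumerate(ys):
    for ix, xv in enumerate(xs):
        p = np.array([xv, yv])
        for (i, j), s in signals.items():
            tof = (np.linalg.norm(p - sensors[i]) + np.linalg.norm(p - sensors[j])) / c
            k = int(round(tof * fs))
            if k < len(s):
                image[iy, ix] += s[k]

iy, ix = np.unravel_index(image.argmax(), image.shape)
print(round(xs[ix], 3), round(ys[iy], 3))   # peak should fall near the defect
```

The study's refinement, incorporating measured wave-propagation information into the delay computation, amounts to replacing the constant velocity `c` here with a dispersion-aware model.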

  10. Remote Earth Sciences data collection using ACTS

    NASA Technical Reports Server (NTRS)

    Evans, Robert H.

    1992-01-01

    Given the focus on global change and the attendant scope of such research, we anticipate significant growth in requirements for investigator interaction, processing system capabilities, and availability of data sets. The increased complexity of global processes requires interdisciplinary teams to address them; the investigators will need to interact on a regular basis, yet it is unlikely that a single institution will house enough investigators with the required breadth of skills. The complexity of the computations may also require resources beyond those located within a single institution; this lack of sufficient computational resources leads to a distributed system located at geographically dispersed institutions. Finally, the combination of long-term data sets such as the Pathfinder data sets and the data to be gathered by new generations of satellites such as SeaWiFS and MODIS-N yields extraordinarily large amounts of data. All of these factors combine to increase demands on the available communications facilities; these demands are generating requirements for highly flexible, high-capacity networks. We have been examining the applicability of the Advanced Communications Technology Satellite (ACTS) to address the scientific, computational, and, primarily, communications questions arising from global change research. As part of this effort, three scenarios for oceanographic use of ACTS have been developed; a full discussion is contained in Appendix B.

  11. Laser processes and system technology for the production of high-efficient crystalline solar cells

    NASA Astrophysics Data System (ADS)

    Mayerhofer, R.; Hendel, R.; Zhu, Wenjie; Geiger, S.

    2012-10-01

    The laser as an industrial tool is an essential part of today's solar cell production. Due to the solar industry's ongoing efforts to increase cell efficiency, more and more laser-based processes that have been discussed and tested at lab scale for many years are now being implemented in mass production lines. In order to cope with throughput requirements, standard laser concepts have to be improved continuously with respect to available average power levels, repetition rates, and beam profile. Some of the laser concepts that showed high potential in the past couple of years will be substituted by other, more economical laser types. Furthermore, requirements for processing with smaller heat-affected zones fuel the development of industry-ready ultrashort pulsed lasers with pulse widths even below the picosecond range. In 2011, the German Ministry of Education and Research (BMBF) launched the program "PV-Innovation Alliance", with the aim of supporting the rapid transfer of high-efficiency processes out of development departments and research institutes into solar cell production lines. Here, lasers play an important role as production tools, allowing the fast implementation of high-performance solar cell concepts. We will report on the results achieved within the joint project FUTUREFAB, where efficiency optimization, throughput enhancement, and cost reduction are the main goals. The presentation will focus on laser processes such as selective emitter doping and ablation of dielectric layers. An indispensable part of the efforts towards cost reduction in solar cell production is the improvement of wafer handling and of the throughput capabilities of the laser processing system. Therefore, the presentation will also elaborate on new developments in the design of complete production machines.

  12. Ultrahigh-efficiency solution-processed simplified small-molecule organic light-emitting diodes using universal host materials

    PubMed Central

    Han, Tae-Hee; Choi, Mi-Ri; Jeon, Chan-Woo; Kim, Yun-Hi; Kwon, Soon-Ki; Lee, Tae-Woo

    2016-01-01

    Although solution processing of small-molecule organic light-emitting diodes (OLEDs) has been considered a promising alternative to standard vacuum deposition, which entails high material and processing costs, solution-processed devices have suffered from low luminous efficiency and the difficulty of multilayer solution processing. High efficiency must therefore be achieved in simple-structured small-molecule OLEDs fabricated using a solution process. We report very efficient solution-processed simple-structured small-molecule OLEDs that use novel universal electron-transporting host materials based on tetraphenylsilane with pyridine moieties. These materials have wide band gaps, high triplet energy levels, and good solution processability; they provide balanced charge transport in a mixed-host emitting layer. Orange-red (~97.5 cd/A, ~35.5% photons per electron), green (~101.5 cd/A, ~29.0% photons per electron), and white (~74.2 cd/A, ~28.5% photons per electron) phosphorescent OLEDs exhibited the highest electroluminescent efficiencies of solution-processed OLEDs reported to date. We also demonstrate a solution-processed flexible solid-state lighting device as a potential application of our devices. PMID:27819053

  13. Gas Chromatograph/Mass Spectrometer

    NASA Technical Reports Server (NTRS)

    Wey, Chowen

    1995-01-01

    Gas chromatograph/mass spectrometer (GC/MS) used to measure and identify combustion species present in trace concentrations. Advanced extractive diagnostic method measures to parts per billion (ppb) and differentiates between different types of hydrocarbons. Applicable for petrochemical, waste incinerator, diesel transportation, and electric utility companies in accurately monitoring types of hydrocarbon emissions generated by fuel combustion, in order to meet stricter environmental requirements. Other potential applications include manufacturing processes requiring precise detection of toxic gaseous chemicals, biomedical applications requiring precise identification of accumulated gaseous species, and gas utility operations requiring high-sensitivity leak detection.

  14. Process Studies on Laser Welding of Copper with Brilliant Green and Infrared Lasers

    NASA Astrophysics Data System (ADS)

    Engler, Sebastian; Ramsayer, Reiner; Poprawe, Reinhart

    Copper materials are classified as difficult to weld with state-of-the-art lasers. High thermal conductivity in combination with low absorption at room temperature requires high intensities to reach a deep penetration welding process. The low absorption also causes high sensitivity to variations in surface conditions. Green laser radiation shows considerably higher absorption at room temperature, which significantly reduces the threshold intensity for deep penetration welding. The influence of the green wavelength on energy coupling during heat conduction welding and deep penetration welding, as well as its influence on the weld shape, has been investigated.

  15. Shielding NSLS-II light source: Importance of geometry for calculating radiation levels from beam losses

    DOE PAGES

    Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; ...

    2016-08-10

    Third generation high brightness light sources are designed to have low emittance and high current beams, which contribute to higher beam loss rates that will be compensated by Top-Off injection. Shielding for these higher loss rates will be critical to protect the projected higher occupancy factors for the users. Top-Off injection requires a full energy injector, which will demand greater consideration of the potential abnormal beam mis-steering and localized losses that could occur. The high energy electron injection beam produces a significantly higher neutron component dose to the experimental floor than a lower energy beam injection and ramped operations. Minimizing this dose will require adequate knowledge of where the mis-steered beam can occur and sufficient EM shielding close to the loss point, in order to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV); this spreads the incident energy over the bulk shield walls and thereby reduces the dose penetrating them. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because of its lack of geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation shielding Monte Carlo codes, such as FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the expected dose rates and the ability to reveal weaknesses in the design before a high radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced with the implementation of the FLAIR graphical interface to FLUKA, which made the shielding process for NSLS-II accurate and reliable. The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.

  16. Functional Requirements for an Electronic Work Package System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna H.

    This document provides a set of high level functional requirements for a generic electronic work package (eWP) system. The requirements have been identified by the U.S. nuclear industry as part of the Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative. The functional requirements mainly apply to eWP systems supporting Basic and Moderate types of smart documents, i.e., documents that have fields for recording input such as text, dates, numbers, and equipment status, and documents that incorporate additional functionality such as validation of the "type" of data entered in form fields (e.g., date, text, number, and signature) and/or self-population of basic document information (usually from existing host application metadata) when the user first opens the form. All requirements are categorized by role: Planner, Supervisor, Craft, Work Package Approval Reviewer, Operations, Scheduling/Work Control, and Supporting Functions. The categories Statistics, Records, and Information Technology are also used to group the requirements. All requirements are presented in Section 2 through Section 11. Examples of more detailed requirements are provided for the majority of the high level requirements. These examples are meant as inspiration as each utility goes through the process of identifying its specific requirements. The report's table of contents provides a summary of the high level requirements.

  17. Life Out of Chaos

    NASA Technical Reports Server (NTRS)

    Arrhenius, Gustaf

    2002-01-01

    Doctrinaire overlays on the definition of life can effectively be avoided by focusing discussion on microorganisms, their vital processes, and their genetic pedigree. To reach beyond these present, highly advanced forms of life and to inquire about its origin, it is necessary to consider the requirements imposed by the environment. These requirements include geophysically and geochemically acceptable conjectures for the generation of source compounds, their concentration from dilute solution, and their selective combination into functional biomolecules. For vital function these macromolecules require programming in the form of specific sequence motifs. This critical programming constitutes the scientifically least understood process in the origin of life. Once this stage has been passed, the laws of Darwinian evolution can operate in ways that are understood and experimentally demonstrated.

  18. High Power Laser Processing Of Materials

    NASA Astrophysics Data System (ADS)

    Martyr, D. R.; Holt, T.

    1987-09-01

    The first practical demonstration of a laser device was in 1960, and in the following years the high power carbon dioxide laser matured into an industrial machine tool. Modern carbon dioxide gas lasers can be used for cutting, welding, heat treatment, drilling, scribing and marking. Since their invention over 25 years ago they have become recognised as highly reliable devices capable of achieving large savings in production costs in many situations. This paper introduces the basic laser processing techniques of cutting, welding and heat treatment as they apply to the most common engineering materials. Typical processing speeds achieved with a wide range of laser powers are reported. Achievable accuracies and required fit-up tolerances are presented. Methods of integrating lasers with machine tools are described, and their suitability in a wide range of manufacturing industries is illustrated by reference to recent installations. Examples from small batch manufacturing, high volume production using dedicated laser welding equipment, and high volume manufacturing using 'flexible' automated laser welding equipment are described. Future applications of laser processing are suggested by reference to current process developments.

  19. An acetate precursor process for BSCCO (2223) thin films and coprecipitated powders

    NASA Technical Reports Server (NTRS)

    Haertling, Gene H.

    1992-01-01

    Since the discovery of high temperature superconducting oxides, much attention has been paid to finding better and more useful ways to take advantage of the special properties exhibited by these materials. One such avenue is the development of thin films for engineering applications; another is the coprecipitation route to producing superconducting powders. An acetate precursor process for use in thin film fabrication and a chemical coprecipitation route to bismuth-based superconducting materials have been developed. Data obtained from the thin film process are inconclusive to date and require further study. The chemical coprecipitation method of producing bulk material is viable and is preferred over the previously used solid state route. This method of powder production appears to be an excellent route to producing thin-section tape cast material and screen printed devices, as it requires fewer calcination steps than the oxide route to produce quality powders.

  20. New signal processing technique for density profile reconstruction using reflectometry.

    PubMed

    Clairet, F; Ricaud, B; Briolle, F; Heuraux, S; Bottereau, C

    2011-08-01

    Reflectometry profile measurement requires an accurate determination of the plasma reflected signal. Along with good resolution and a high signal-to-noise ratio of the phase measurement, adequate data analysis is required. A new data processing method based on a time-frequency tomographic representation is used. It provides a clearer separation between multiple components and improves isolation of the relevant signals. In this paper, this data processing technique is applied to two sets of signals coming from two different reflectometer devices used on the Tore Supra tokamak. For standard density profile reflectometry, it improves the initialization process and its reliability, providing a more accurate profile determination in the far scrape-off layer with density measurements as low as 10^16 m^-3. For a second reflectometer, which provides measurements in front of a lower hybrid launcher, this method improves the separation of the relevant plasma signal from multi-reflection processes due to the proximity of the plasma.
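To illustrate why a time-frequency view helps, the sketch below separates a strong beat component from a weaker multi-reflection ridge by simple short-time FFT ridge extraction. This is not the tomographic representation used in the paper, and all signal parameters (frequencies, amplitudes, sampling rate) are hypothetical placeholders.

```python
import numpy as np

def dominant_ridge(sig, fs, nper=256, hop=128):
    """Track the dominant instantaneous frequency over short-time FFT frames."""
    freqs = np.fft.rfftfreq(nper, 1.0 / fs)
    window = np.hanning(nper)
    ridge = []
    for start in range(0, len(sig) - nper + 1, hop):
        spectrum = np.abs(np.fft.rfft(sig[start:start + nper] * window))
        ridge.append(freqs[np.argmax(spectrum)])
    return np.array(ridge)

# Hypothetical reflectometer-like signal: a main beat component plus a
# weaker multi-reflection component at twice the beat frequency.
fs = 1e6
t = np.arange(0, 0.01, 1.0 / fs)
beat = np.sin(2 * np.pi * 50e3 * t)
multi = 0.3 * np.sin(2 * np.pi * 100e3 * t)
ridge = dominant_ridge(beat + multi, fs)
```

Frame by frame, the extracted ridge stays on the stronger 50 kHz component even though the 100 kHz reflection is present throughout; a tomographic time-frequency representation refines this kind of separation further for overlapping, chirped components.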

  1. Limiting factors in the production of deep microstructures

    NASA Astrophysics Data System (ADS)

    Tolfree, David W. L.; O'Neill, William; Tunna, Leslie; Sutcliffe, Christopher

    1999-10-01

    Microsystems increasingly require precision deep microstructures that can be cost-effectively designed and manufactured. New products must be able to meet the demands of the rapidly growing markets for microfluidic, micro-optical and micromechanical devices in industrial sectors which include chemicals, pharmaceuticals, biosciences, medicine and food. The realization of such products first requires an effective process to design and manufacture prototypes. Two process methods used for the fabrication of high aspect-ratio microstructures are based on X-ray beam lithography with electroforming processes and on direct micromachining with a frequency-multiplied Nd:YAG laser using nanosecond pulse widths. Factors which limit the efficiency and precision obtainable with such processes are important parameters when deciding on the best fabrication method to use. A basic microstructure with narrow channels suitable for a microfluidic mixer has been fabricated using both techniques, and comparisons have been made of the limitations and suitability of the processes with respect to fast prototyping and the manufacture of working devices.

  2. Improvement of the System of Training of Specialists by University for Coal Mining Enterprises

    NASA Astrophysics Data System (ADS)

    Mikhalchenko, Vadim; Seredkina, Irina

    2017-11-01

    In this article the Quality Function Deployment technique is considered with reference to the process of training specialists with higher education at a university. The method is based on the step-by-step conversion of customer requirements into specific organizational, content-related and functional transformations of the university's technological process. A fully deployed quality function includes four stages of tracking customer requirements while creating a product: product planning, product design, process design, and production design. Quality Function Deployment can be considered one of the methods for optimizing the technological processes of training specialists with higher education under current economic conditions. Implemented at the initial stages of the life cycle of the technological process, it ensures not only the high quality of the "product" of higher education but also the fullest possible satisfaction of consumers' requests and expectations.

  3. High dislocation density-induced large ductility in deformed and partitioned steels

    NASA Astrophysics Data System (ADS)

    He, B. B.; Hu, B.; Yen, H. W.; Cheng, G. J.; Wang, Z. K.; Luo, H. W.; Huang, M. X.

    2017-09-01

    A wide variety of industrial applications require materials with high strength and ductility. Unfortunately, the strategies for increasing material strength, such as processing to create line defects (dislocations), tend to decrease ductility. We developed a strategy to circumvent this in inexpensive, medium manganese steel. Cold rolling followed by low-temperature tempering developed steel with metastable austenite grains embedded in a highly dislocated martensite matrix. This deformed and partitioned (D and P) process produced dislocation hardening but retained high ductility, both through the glide of intensive mobile dislocations and by allowing us to control martensitic transformation. The D and P strategy should apply to any other alloy with deformation-induced martensitic transformation and provides a pathway for the development of high-strength, high-ductility materials.

  4. The Bendability of Ultra High strength Steels

    NASA Astrophysics Data System (ADS)

    Hazra, S. K.; Efthymiadis, P.; Alamoudi, A.; Kumar, R. L. V.; Shollock, B.; Dashwood, R.

    2016-08-01

    Automotive manufacturers have been reducing the weight of their vehicles to meet increasingly stringent environmental legislation that reflects public demand. One strategy is to use higher strength materials for parts with reduced cross-sections. However, such materials are less formable than traditional grades, and the frequent result is increased processing and piece costs. 3D roll forming is a novel and flexible process: it is estimated that a quarter of the structure of a vehicle can be made with a single set of tooling. Unlike stamping, this process requires material with a low work hardening rate. In this paper, we present results on ultra high strength steels that have low elongation in tension but display high formability in bending through the suppression of the necking response.

  5. Lunar oxygen and metal for use in near-earth space - Magma electrolysis

    NASA Technical Reports Server (NTRS)

    Colson, Russell O.; Haskin, Larry A.

    1990-01-01

    The unique conditions on the moon, such as vacuum, absence of many reagents common on the earth, and presence of very nontraditional 'ores', suggest that a unique and nontraditional process for extracting materials from the ores may prove the most practical. An investigation has begun into unfluxed silicate electrolysis as a method for extracting oxygen, Fe, and Si from lunar regolith. The advantages of the process include simplicity of concept, absence of need to supply reagents from the earth, and low power and mass requirements for the processing plant. Disadvantages include the need for uninterrupted high temperature and the highly corrosive nature of the high-temperature silicate melts, which has made identifying suitable electrode and container materials difficult.

  6. Indigenous Manufacturing realization of TWIN Source

    NASA Astrophysics Data System (ADS)

    Pandey, R.; Bandyopadhyay, M.; Parmar, D.; Yadav, R.; Tyagi, H.; Soni, J.; Shishangiya, H.; Sudhir Kumar, D.; Shah, S.; Bansal, G.; Pandya, K.; Parmar, K.; Vuppugalla, M.; Gahlaut, A.; Chakraborty, A.

    2017-04-01

    The TWIN source is a two RF driver based negative ion source planned to bridge the gap between the single driver based ROBIN source (currently operational) and the eight driver based DNB source (to be operated under the IN-TF test facility). TWIN source experiments have been planned at IPR with the long term objective, under the domestic fusion programme, of gaining operational experience on vacuum immersed multi driver RF based negative ion sources. The high vacuum compatible components of the TWIN source were designed at IPR with an emphasis on indigenous manufacture. These components are mainly stainless steel and OFC-Cu. Being components that receive high heat flux, one of their major functional requirements is continuous heat removal with water as the cooling medium. For this purpose the stainless steel parts are provided with externally milled cooling lines, covered with a layer of OFC-Cu on the side receiving the high heat flux. Manufacture of the TWIN source components requires joining these dissimilar materials via processes such as electrodeposition, electron beam welding and vacuum brazing. Any of these manufacturing processes must give a vacuum tight joint with adequate strength at operating temperature and pressure. As part of the indigenous development effort, vacuum brazing (in a non-nuclear environment) was chosen for joining the dissimilar materials of the TWIN source, being one of the most reliable joining techniques and commercially feasible among suppliers in the country. The manufacturing design of the components was improved to suit the vacuum brazing process and to ease some of the machining without compromising the functional and operational requirements. This paper describes the indigenous development effort, the design improvements made to suit manufacturability, the basics of vacuum brazing, and the brazing procedures for the TWIN source components.

  7. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  8. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  9. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  10. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  11. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  12. Introduction to Session 5

    NASA Astrophysics Data System (ADS)

    Zullo, Luca; Snyder, Seth W.

    Production of bio-based products that are cost competitive in the marketplace requires well-developed operations that include innovative process and separation solutions. Separations costs can make the difference between an interesting laboratory project and a successful commercial process. Bioprocessing and separations research and development addresses some of the most significant cost barriers in the production of biofuels and bio-based chemicals. Models of integrated biorefineries indicate that success will require production of higher volume fuels in conjunction with high margin chemical products. Addressing the bioprocessing and separations cost barriers will be critical to the overall success of the integrated biorefinery.

  13. Precision manufacturing for clinical-quality regenerative medicines.

    PubMed

    Williams, David J; Thomas, Robert J; Hourd, Paul C; Chandra, Amit; Ratcliffe, Elizabeth; Liu, Yang; Rayment, Erin A; Archer, J Richard

    2012-08-28

    Innovations in engineering applied to healthcare make a significant difference to people's lives. Market growth is guaranteed by demographics. Regulation and requirements for good manufacturing practice-extreme levels of repeatability and reliability-demand high-precision process and measurement solutions. Emerging technologies using living biological materials add complexity. This paper presents some results of work demonstrating the precision automated manufacture of living materials, particularly the expansion of populations of human stem cells for therapeutic use as regenerative medicines. The paper also describes quality engineering techniques for precision process design and improvement, and identifies the requirements for manufacturing technology and measurement systems evolution for such therapies.

  14. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    PubMed

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers. Biotechnol. Prog., 34:130-140, 2018.

  15. MOEX: Solvent extraction approach for recycling enriched 98Mo/ 100Mo material

    DOE PAGES

    Tkac, Peter; Brown, M. Alex; Momen, Abdul; ...

    2017-03-20

    Several promising pathways exist for the production of 99Mo/99mTc using enriched 98Mo or 100Mo. Use of Mo targets requires a major change in current generator technology and necessitates an efficient recycle pathway to recover the valuable enriched Mo material. High recovery yields, purity, and suitable chemical form and particle size are required. Results on the development of the MOEX (molybdenum solvent extraction) approach to recycling enriched Mo material are presented. The advantages of the MOEX process are very high decontamination factors from potassium and other elements, high throughput, easy scalability, automation, and minimal waste generation.

  16. Development and flight test of an experimental maneuver autopilot for a highly maneuverable aircraft

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Jones, Frank P.; Roncoli, Ralph B.

    1986-01-01

    This report presents the development of an experimental flight test maneuver autopilot (FTMAP) for a highly maneuverable aircraft. The essence of this technique is the application of an autopilot to provide precise control during required flight test maneuvers. This newly developed flight test technique is being applied at the Dryden Flight Research Facility of NASA Ames Research Center. The FTMAP is designed to increase the quantity and quality of data obtained in test flight. The technique was developed and demonstrated on the highly maneuverable aircraft technology (HiMAT) vehicle. This report describes the HiMAT vehicle systems, maneuver requirements, FTMAP development process, and flight results.

  18. Aspheres for high speed cine lenses

    NASA Astrophysics Data System (ADS)

    Beder, Christian

    2005-09-01

    To fulfil the requirements of today's high-performance cine lenses, aspheres are an indispensable part of lens design. Beyond making aspheres manageable in shape and size, tolerancing them is an essential part of the development process. The traditional method of tolerancing individual aspherical coefficients yields only theoretical figures of no practical use. In order to obtain viable parameters that can easily be handled in a production line, more advanced techniques are required. In this presentation, a method of simulating characteristic manufacturing errors and deducing surface-deviation and slope-error tolerances will be shown.

  19. Arsine flow requirement for the flow modulation growth of high purity GaAs using adduct-grade triethylgallium

    NASA Astrophysics Data System (ADS)

    Pitts, B. L.; Emerson, D. T.; Shealy, J. R.

    1992-10-01

    Using arsine and triethylgallium with flow modulation, organometallic vapor phase epitaxy can produce high purity GaAs layers with V/III molar ratios near unity. We have estimated that under appropriate growth conditions the arsine incorporation efficiency into epitaxial GaAs can exceed 30%. The arsine flow requirement for obtaining good morphology has been identified over a range of substrate temperatures using adduct-grade triethylgallium. The process described reduces the environmental impact and life safety risk of the hydride based organometallic vapor phase epitaxial method.
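    The incorporation-efficiency figure above is a simple mass-balance estimate: moles of As locked into the grown GaAs film divided by moles of arsine supplied. The sketch below illustrates that arithmetic with standard handbook values for GaAs; the growth rate, wafer area, and arsine flow are purely illustrative assumptions, not figures from the paper.

```python
# Standard handbook values for GaAs
GAAS_DENSITY = 5.32      # g/cm^3
GAAS_MOLAR_MASS = 144.6  # g/mol

def arsine_incorporation_efficiency(growth_um_per_h, area_cm2, arsine_umol_per_min):
    """Fraction of supplied arsine whose As ends up in the epitaxial film.

    One mole of AsH3 supplies one mole of As, and each mole of GaAs
    incorporates exactly one mole of As.
    """
    volume_cm3_per_h = growth_um_per_h * 1e-4 * area_cm2      # film volume grown per hour
    gaas_mol_per_h = volume_cm3_per_h * GAAS_DENSITY / GAAS_MOLAR_MASS
    arsine_mol_per_h = arsine_umol_per_min * 1e-6 * 60
    return gaas_mol_per_h / arsine_mol_per_h

# Illustrative conditions: 1 um/h growth over 20 cm^2 at 4 umol/min AsH3
eff = arsine_incorporation_efficiency(1.0, 20.0, 4.0)  # ~0.31, consistent with the >30% figure
```

At near-unity V/III ratios almost every As atom delivered must be incorporated, which is what the efficiency above expresses.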

  20. Interference effects of vocalization on dual task performance

    NASA Astrophysics Data System (ADS)

    Owens, J. M.; Goodman, L. S.; Pianka, M. J.

    1984-09-01

    Voice command and control systems have been proposed as a potential means of off-loading the typically overburdened visual information processing system. However, prior to introducing novel human-machine interfacing technologies in high workload environments, consideration must be given to the integration of the new technologies within existing task structures to ensure that no new sources of workload or interference are systematically introduced. This study examined the use of voice interactive systems technology in the joint performance of two cognitive information processing tasks requiring continuous memory and choice reaction, wherein a basis for intertask interference might be expected. Stimuli for the continuous memory task were presented aurally, and either voice or keyboard responding was required in the choice reaction task. Performance was significantly degraded in each task when voice responding was required in the choice reaction task. Performance degradation was evident in higher error scores for both the choice reaction and continuous memory tasks. Performance decrements observed under conditions of high intertask stimulus similarity were not statistically significant. The results signal the need to consider further the task requirements for verbal short-term memory when applying speech technology in multitask environments.

  1. Miniaturized Power Processing Unit Study: A Cubesat Electric Propulsion Technology Enabler Project

    NASA Technical Reports Server (NTRS)

    Ghassemieh, Shakib M.

    2014-01-01

    This study evaluates High Voltage Power Processing Unit (PPU) technology and driving requirements necessary to enable the Microfluidic Electric Propulsion technology research and development by NASA and university partners. This study provides an overview of the state of the art PPU technology with recommendations for technology demonstration projects and missions for NASA to pursue.

  2. Grinding and classification of pine bark for use as plywood adhesive filler

    Treesearch

    Thomas L. Eberhardt; Karen G. Reed

    2005-01-01

    Prior efforts to incorporate bark or bark extracts into composites have met with only limited success because of poor performance relative to existing products and/or economic barriers stemming from high levels of processing. We are currently investigating applications for southern yellow pine (SYP) bark that require intermediate levels of processing, one being the use...

  3. Production and cost of harvesting, processing, and transporting small-diameter (< 5 inches) trees for energy

    Treesearch

    Fei Pan; Han-Sup Han; Leonard R. Johnson; William J. Elliot

    2008-01-01

    Dense, small-diameter stands generally require thinning from below to improve fire-tolerance. The resulting forest biomass can be used for energy production. The cost of harvesting, processing, and transporting small-diameter trees often exceeds revenues due to high costs associated with harvesting and transportation and low market values for forest biomass....

  4. Plasma separation

    NASA Technical Reports Server (NTRS)

    Steurer, Wolfgang

    1992-01-01

    This process employs a thermal plasma for the separation and production of oxygen and metals. It is a continuous process that requires no consumables and relies entirely on space resources. The almost complete absence of waste renders it relatively clean. It can be turned on or off without any undesirable side effects or residues. The prime disadvantage is its high power consumption.

  5. Study for Identification of Beneficial Uses of Space (BUS). Volume 2: Technical report. Book 4: Development and business analysis of space processed surface acoustic wave devices

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Preliminary development plans, analysis of required R and D and production resources, the costs of such resources, and, finally, the potential profitability of a commercial space processing opportunity for the production of very high frequency surface acoustic wave devices are presented.

  6. Onboard FPGA-based SAR processing for future spaceborne systems

    NASA Technical Reports Server (NTRS)

    Le, Charles; Chan, Samuel; Cheng, Frank; Fang, Winston; Fischman, Mark; Hensley, Scott; Johnson, Robert; Jourdan, Michael; Marina, Miguel; Parham, Bruce; hide

    2004-01-01

    We present a real-time, high-performance, fault-tolerant FPGA-based hardware architecture for the processing of synthetic aperture radar (SAR) images in future spaceborne systems. In particular, we discuss the integrated design approach, from top-level algorithm specifications and system requirements, design methodology, functional verification, and performance validation down to hardware design and implementation.

  7. Critical and Creative Thinking as Learning Processes at Top-Ranking Chinese Middle Schools: Possibilities and Required Improvements

    ERIC Educational Resources Information Center

    Liu, Z. K.; He, J.; Li, B.

    2015-01-01

    Fostering and enabling critical and creative thinking of students is considered an important goal, and it is assumed that in particular, talented students have considerable potential for applying such high-level cognitive processes for learning in classrooms. However, Chinese students are often considered as rote learners, and that learning…

  8. The Validation of Vapor Phase Hydrogen Peroxide Microbial Reduction for Planetary Protection and a Proposed Vacuum Process Specification

    NASA Technical Reports Server (NTRS)

    Chung, Shirley; Barengoltz, Jack; Kern, Roger; Koukol, Robert; Cash, Howard

    2006-01-01

    The Jet Propulsion Laboratory, in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide sterilization process for continued development as a NASA approved sterilization technique for spacecraft subsystems and systems. The goal is to include this technique, with an appropriate specification, in NPR 8020.12C as a low temperature complementary technique to the dry heat sterilization process. To meet microbial reduction requirements for all Mars in-situ life detection and sample return missions, various planetary spacecraft subsystems will have to be exposed to a qualified sterilization process. This process could be the elevated temperature dry heat sterilization process (115 °C for 40 hours) which was used to sterilize the Viking lander spacecraft. However, with utilization of such elements as highly sophisticated electronics and sensors in modern spacecraft, this process presents significant materials challenges and is thus an undesirable bioburden reduction method to design engineers. The objective of this work is to introduce vapor hydrogen peroxide (VHP) as an alternative to dry heat microbial reduction to meet planetary protection requirements. The VHP process is widely used by the medical industry to sterilize surgical instruments and biomedical devices, but high doses of VHP may degrade the performance of flight hardware, or compromise material properties. Our goal for this study was to determine the minimum VHP process conditions to achieve microbial reduction levels acceptable for planetary protection.
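    Microbial reduction levels of the kind discussed above are conventionally described with first-order inactivation kinetics, where a process's D-value is the exposure time producing a 10-fold (1-log) population reduction. The sketch below shows only that standard textbook model with hypothetical numbers; it is not data from the VHP study.

```python
def surviving_population(n0, minutes, d_value_min):
    """First-order microbial inactivation.

    n0: initial bioburden (CFU); d_value_min: minutes per 1-log reduction.
    Each D-value of exposure cuts the surviving population tenfold.
    """
    return n0 * 10 ** (-minutes / d_value_min)

# Hypothetical example: 10^6 CFU, 30 min exposure, D = 5 min -> a 6-log reduction
survivors = surviving_population(1e6, 30, 5)  # 1.0 CFU
```

Finding the "minimum process conditions" then amounts to choosing the shortest exposure whose log reduction meets the planetary-protection bioburden target.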

  9. A flow-through chromatography process for influenza A and B virus purification.

    PubMed

    Weigel, Thomas; Solomaier, Thomas; Peuker, Alessa; Pathapati, Trinath; Wolff, Michael W; Reichl, Udo

    2014-10-01

    Vaccination is still the most efficient measure to protect against influenza virus infections. Besides the seasonal wave of influenza, pandemic outbreaks of bird or swine flu represent a high threat to human population. With the establishment of cell culture-based processes, there is a growing demand for robust, economic and efficient downstream processes for influenza virus purification. This study focused on the development of an economic flow-through chromatographic process avoiding virus strain sensitive capture steps. Therefore, a three-step process consisting of anion exchange chromatography (AEC), Benzonase® treatment, and size exclusion chromatography with a ligand-activated core (LCC) was established, and tested for purification of two influenza A virus strains and one influenza B virus strain. The process resulted in high virus yields (≥68%) with protein contamination levels fulfilling requirements of the European Pharmacopeia for production of influenza vaccines for human use. DNA was depleted by ≥98.7% for all strains. The measured DNA concentrations per dose were close to the required limits of 10 ng DNA per dose set by the European Pharmacopeia. In addition, the added Benzonase® could be successfully removed from the product fraction. Overall, the presented downstream process could potentially represent a simple, robust and economic platform technology for production of cell culture-derived influenza vaccines. Copyright © 2014 Elsevier B.V. All rights reserved.
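    Whether a given depletion level clears the 10 ng-per-dose limit cited above is simple arithmetic on the starting DNA load. The sketch below illustrates that check; the 600 ng starting load is an assumed, hypothetical figure, not a value from the study.

```python
def dna_per_dose(initial_ng, depletion_fraction):
    """Residual host-cell DNA per dose after a purification train.

    initial_ng: DNA per dose-equivalent entering purification
    depletion_fraction: fraction removed (e.g. 0.987 for >=98.7% depletion)
    """
    return initial_ng * (1.0 - depletion_fraction)

# Hypothetical 600 ng starting load at the 98.7% depletion reported:
residual = dna_per_dose(600.0, 0.987)        # ~7.8 ng
meets_limit = residual <= 10.0               # European Pharmacopeia limit cited above
```

The example shows why the abstract calls the measured values "close to" the limit: at that depletion level, modest changes in the feed-stream DNA load move the residual across the 10 ng threshold.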

  10. High-density plasma deposition manufacturing productivity improvement

    NASA Astrophysics Data System (ADS)

    Olmer, Leonard J.; Hudson, Chris P.

    1999-09-01

    High Density Plasma (HDP) deposition provides a means to deposit high quality dielectrics meeting submicron gap fill requirements. But, compared to traditional PECVD processing, HDP is relatively expensive due to the higher capital cost of the equipment. In order to keep processing costs low, it became necessary to maximize the wafer throughput of HDP processing without degrading the film properties. The approach taken was to optimize the post-deposition microwave in-situ clean efficiency. A regression model, based on actual data, indicated that the number of wafers processed before a chamber clean was the dominant factor. Furthermore, a design change in the ceramic hardware surrounding the electrostatic chuck provided thermal isolation, resulting in an enhanced clean rate of the chamber process kit. An infra-red detector located in the chamber exhaust line provided a means to endpoint the clean, and in-film particle data confirmed the infra-red results. The combination of increased chamber clean frequency, optimized clean time, and improved process hardware maximized wafer throughput without degrading film properties.

  11. Processing and damage recovery of intrinsic self-healing glass fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Sordo, Federica; Michaud, Véronique

    2016-08-01

    Glass fiber reinforced composites with a self-healing, supramolecular hybrid network matrix were produced using a modified vacuum assisted resin infusion moulding process adapted to high temperature processing. The quality and fiber volume fraction (50%) of the obtained materials were assessed through microscopy and matrix burn-off methods. The thermo-mechanical properties were quantified by means of dynamic mechanical analysis, revealing very high damping properties compared to traditional epoxy-based glass fiber reinforced composites. Self-healing properties were assessed by three-point bending tests. A high recovery of the flexural properties, around 72% for the elastic modulus and 65% for the maximum flexural stress, was achieved after a resting period of 24 h at room temperature. Recovery after low velocity impact events was also visually observed. Applications for this intrinsic and autonomic self-healing highly reinforced composite material point towards semi-structural applications where high damping and/or integrity recovery after impact are required.

  12. High Temperature Transfer Molding Resins Based on 2,3,3',4'-Biphenyltetracarboxylic Dianhydride

    NASA Technical Reports Server (NTRS)

    Smith, J. G., Jr.; Connell, J. W.; Hergenrother, P. M.; Yokota, R.; Criss, J. M.

    2002-01-01

    As part of an ongoing effort to develop materials for resin transfer molding (RTM) processes to fabricate high performance/high temperature composite structures, phenylethynyl containing imides have been under investigation. New phenylethynyl containing imide compositions were prepared using 2,3,3',4'-biphenyltetracarboxylic dianhydride (a-BPDA) and evaluated for cured glass transition temperature (Tg), melt flow behavior, and for processability into flat composite panels via RTM. The a-BPDA imparts a unique combination of properties that are desirable for high temperature transfer molding resins. In comparison to its symmetrical counterpart (i.e. 3,3',4,4'-biphenyltetracarboxylic dianhydride), a-BPDA affords oligomers with lower melt viscosities and when cured, higher Tgs. Several candidates exhibited the appropriate combination of properties such as a low and stable melt viscosity required for RTM processes, high cured Tg, and moderate toughness. The chemistry, physical, and composite properties of select resins will be discussed.

  13. An Overview of Natural Gas Conversion Technologies for Co-Production of Hydrogen and Value-Added Solid Carbon Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dagle, Robert A.; Dagle, Vanessa; Bearden, Mark D.

    This report was prepared in response to the U.S. Department of Energy Fuel Cell Technologies Office Congressional Appropriation language to support research on carbon-free production of hydrogen using new chemical processes that utilize natural gas to produce solid carbon and hydrogen. The U.S. produces 9-10 million tons of hydrogen annually with more than 95% of the hydrogen produced by steam-methane reforming (SMR) of natural gas. SMR is attractive because of its high hydrogen yield; but it also converts the carbon to carbon dioxide. Non-oxidative thermal decomposition of methane to carbon and hydrogen is an alternative to SMR and produces CO2-free hydrogen. The produced carbon can be sold as a co-product, thus providing economic credit that reduces the delivered net cost of hydrogen. The combination of producing hydrogen with potentially valuable carbon byproducts has market value in that this allows greater flexibility to match the market prices of hydrogen and carbon. That is, the higher value product can subsidize the other in pricing decisions. In this report we highlight the relevant technologies reported in the literature—primarily thermochemical and plasma conversion processes—and recent research progress and commercial activities. Longstanding technical challenges include the high energetic requirements (e.g., high temperatures and/or electricity requirements) necessary for methane activation and, for some catalytic processes, the separation of solid carbon product from the spent catalyst. We assess current and new carbon product markets that could be served given technological advances, and we discuss technical barriers and potential areas of research to address these needs. We provide preliminary economic analysis for these processes and compare to other emerging (e.g., electrolysis) and conventional (e.g., SMR) processes for hydrogen production.
The overarching conclusion of this study is that the cost of hydrogen can potentially be reduced to target levels of $2/kg with the co-production and sale of a sufficiently high-value carbon product. Technological advances are required to understand the reaction conditions and design reactor systems that can achieve high yields of the select carbon products, segregate or separate the high-value carbon products, and optimize the production process for both hydrogen and carbon.
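    The carbon-credit economics above follow directly from the stoichiometry of methane pyrolysis, CH4 → C(s) + 2 H2: 16 g of CH4 yields 12 g of carbon and 4 g of hydrogen, i.e. about 3 kg of solid carbon per kg of H2. The sketch below shows that credit arithmetic; the dollar figures are hypothetical placeholders, not numbers from the report.

```python
# Methane pyrolysis: CH4 -> C(s) + 2 H2
# Molar masses: CH4 = 16 g/mol, C = 12 g/mol, 2 H2 = 4 g/mol
C_PER_KG_H2 = 12.0 / 4.0  # kg of solid carbon co-produced per kg of H2

def net_h2_cost(gross_cost_per_kg_h2, carbon_price_per_kg):
    """Net hydrogen cost after crediting sale of the solid-carbon co-product."""
    return gross_cost_per_kg_h2 - C_PER_KG_H2 * carbon_price_per_kg

# Hypothetical figures: $3.50/kg gross production cost, carbon sold at $0.50/kg
cost = net_h2_cost(3.50, 0.50)  # 2.0 -> hits the $2/kg target
```

The 3:1 carbon-to-hydrogen mass ratio is why even a modestly priced carbon product moves the net hydrogen cost substantially.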

  14. Rare targets are less susceptible to attention capture once detection has begun.

    PubMed

    Hon, Nicholas; Ng, Gavin; Chan, Gerald

    2016-04-01

    Rare or low probability targets are detected more slowly and/or less accurately than higher probability counterparts. Various proposals have implicated perceptual and response-based processes in this deficit. Recent evidence, however, suggests that it is attentional in nature, with low probability targets requiring more attentional resources than high probability ones to detect. This difference in attentional requirements, in turn, suggests the possibility that low and high probability targets may have different susceptibilities to attention capture, which is also known to be resource-dependent. Supporting this hypothesis, we found that, once attentional resources have begun to be engaged by detection processes, low, but not high, probability targets have a reduced susceptibility to capture. Our findings speak to several issues. First, they indicate that the likelihood of attention capture occurring when a given task-relevant stimulus is being processed is dependent, to some extent, on how said stimulus is represented within mental task sets. Second, they provide added support for the idea that the behavioural deficit associated with low probability targets is attention-based. Finally, the current data point to reduced top-down biasing of target templates as a likely mechanism underlying the attentional locus of the deficit in question.

  15. Micro-assembly of three-dimensional rotary MEMS mirrors

    NASA Astrophysics Data System (ADS)

    Wang, Lidai; Mills, James K.; Cleghorn, William L.

    2009-02-01

    We present a novel approach to construct three-dimensional rotary micro-mirrors, which are fundamental components for building 1×N or N×M optical switching systems. A rotary micro-mirror consists of two microparts: a rotary micro-motor and a micro-mirror. Both microparts are fabricated with PolyMUMPs, a surface micromachining process. A sequential robotic microassembly process is developed to join the two microparts together to construct a three-dimensional device. In order to achieve high positioning accuracy and a strong mechanical connection, the micro-mirror is joined to the micro-motor using an adhesive mechanical fastener. The mechanical fastener has self-alignment ability and provides a temporary joint between the two microparts. The adhesive bonding creates a strong permanent connection, which does not require extra supporting plates for the micro-mirror. A hybrid manipulation strategy, which includes pick-and-place and pushing-based manipulations, is utilized to manipulate the micro-mirror. The pick-and-place manipulation has the ability to globally position the micro-mirror in six degrees of freedom. The pushing-based manipulation can achieve high positioning accuracy. This microassembly approach has great flexibility and high accuracy; furthermore, it does not require extra supporting plates, which greatly simplifies the assembly process.

  16. Trace detection of tetrahydrocannabinol (THC) with a SERS-based capillary platform prepared by the in situ microwave synthesis of AgNPs.

    PubMed

    Yüksel, Sezin; Schwenke, Almut M; Soliveri, Guido; Ardizzone, Silvia; Weber, Karina; Cialla-May, Dana; Hoeppener, Stephanie; Schubert, Ulrich S; Popp, Jürgen

    2016-10-05

    In the present study, an ultra-sensitive and highly reproducible novel SERS-based capillary platform was developed and utilized for the trace detection of tetrahydrocannabinol (THC). The approach combines the advantages of microwave-assisted nanoparticle synthesis, plasmonics, and capillary forces. By employing a microwave-assisted preparation method, glass capillaries were reproducibly coated with silver nanoparticles in a batch fabrication process requiring only 3 min of processing time, without any surface pre-modification or added surfactants. The coated capillaries exhibited excellent SERS activity with high reproducibility and enabled the detection of low concentrations of target molecules. At the same time, only a small amount of analyte and a short, simple incubation process were required. The developed platform was applied to the spectroscopic characterization of tetrahydrocannabinol (THC) and its identification at concentration levels down to 1 nM. Thus, a highly efficient detection system for practical applications, e.g., in drug monitoring/detection, is introduced, which can be fabricated at low cost by using microwave-assisted batch synthesis techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Repairable chip bonding/interconnect process

    DOEpatents

    Bernhardt, Anthony F.; Contolini, Robert J.; Malba, Vincent; Riddle, Robert A.

    1997-01-01

    A repairable, chip-to-board interconnect process which addresses cost and testability issues in multi-chip modules. This process can be carried out using a chip-on-sacrificial-substrate technique involving laser processing. This process avoids the curing/solvent evolution problems encountered in prior approaches, as well as resolving prior plating problems and the requirements for fillets. For repairable high speed chip-to-board connection, transmission lines can be formed on the sides of the chip from chip bond pads, ending in a gull wing at the bottom of the chip for subsequent soldering.

  18. Turboexpanders with pressurized magnetic bearings for off-shore applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agahi, R.R.; Ershaghi, B.; Baudelocque, L.

    1995-12-31

    There are two primary parameters that encourage the use of magnetic bearings in turbomachinery: oil-free process and space requirements. For cryogenic processes such as hydrogen purification and ethylene plants, an oil-free process is the primary objective. In the case of off-shore platforms for oil and gas production, the occupied space and weight are of prime concern. In off-shore operations, the process gas density is usually higher than in normal process plants because the gas is untreated and at high pressure. High density process gas generates more windage loss and may also cause excessive radial load to journal bearings. The bearing assembly design should be suitable for sour gas environments as well. Furthermore, the thrust bearing system should withstand process fluctuations which are more severe due to high pressure. In this paper, the authors explain their experience of designing a turboexpander-compressor with magnetic bearings for an off-shore oil production platform. They will present side load analysis and their solutions for heat dissipation and coping with process fluctuations.

  19. Coincidence timing of a soccer pass: effects of stimulus velocity and movement distance.

    PubMed

    Williams, L R

    2000-08-01

    The effect of stimulus velocity and movement extent on coincidence timing and spatial accuracy of a soccer pass was investigated. A Bassin anticipation timer provided light stimulus velocities of 1.79 or 2.68 m/sec (designated as "Low" and "High", respectively), and subjects were required to kick a stationary soccer ball so that it struck a target in coincidence with the arrival of the light stimulus at the end of the runway. Two kick types were used. The "Short" condition began with the subject 70 cm from the ball and required a single forward step with the nonkicking leg before making the kick. The "Long" condition began 140 cm from the ball and required two steps before the kick. Twenty male subjects were given 16 trials under each of the four combinations of stimulus velocity and kick type. The expectation that the faster stimulus velocity would be associated with lower coincidence timing scores for both absolute error (AE) and variable error (VE) and with late responding for constant error (CE) was upheld, with the exception that for the Long Kick-High Velocity condition, AE was highest. The index of preprogramming (IP) was used to test the hypothesis that a two-stage control process would characterise coincidence anticipation performance involving whole-body movements. Results showed that the preparatory phase of responding produced zero-order IPs, signifying reliance on feedback control. Also, while the striking phase produced high IPs and suggested reliance on preprogrammed control, the possibility that the High Velocity conditions may have limited the responses was recognised. As a consequence, the role of open-loop processes remained equivocal. The findings are, however, in agreement with the view that the sensorimotor and movement-execution phases of responding require a process that is characterised by adaptability to regulatory features of the environment via closed-loop mechanisms involving perception-action coupling.
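    The timing scores used in this study are standard statistics over a subject's signed trial errors: constant error (CE) is the mean signed error (positive meaning late), absolute error (AE) is the mean error magnitude, and variable error (VE) is the trial-to-trial variability. A minimal sketch with made-up trial data:

```python
from statistics import mean, stdev

def timing_scores(errors_ms):
    """Standard coincidence-timing scores from signed trial errors (ms).

    CE: mean signed error (positive = late responding)
    AE: mean magnitude of error
    VE: trial-to-trial variability (sample standard deviation)
    """
    ce = mean(errors_ms)
    ae = mean(abs(e) for e in errors_ms)
    ve = stdev(errors_ms)
    return ce, ae, ve

# Four hypothetical trials (ms): two late, two early
ce, ae, ve = timing_scores([20, -10, 30, -20])  # CE = 5 ms, AE = 20 ms
```

Note that a subject can have near-zero CE (early and late errors cancelling) while AE and VE remain large, which is why all three scores are reported.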

  20. Flat-plate solar array project. Volume 3: Silicon sheet: Wafers and ribbons

    NASA Technical Reports Server (NTRS)

    Briglio, A.; Dumas, K.; Leipold, M.; Morrison, A.

    1986-01-01

    The primary objective of the Silicon Sheet Task of the Flat-Plate Solar Array (FSA) Project was the development of one or more low cost technologies for producing silicon sheet suitable for processing into cost-competitive solar cells. Silicon sheet refers to high purity crystalline silicon of size and thickness for fabrication into solar cells. Areas covered in the project were ingot growth and casting, wafering, ribbon growth, and other sheet technologies. The task made and fostered significant improvements in silicon sheet including processing of both ingot and ribbon technologies. An additional important outcome was the vastly improved understanding of the characteristics associated with high quality sheet, and the control of the parameters required for higher efficiency solar cells. Although significant sheet cost reductions were made, the technology advancements required to meet the task cost goals were not achieved.

  1. A parallel implementation of a multisensor feature-based range-estimation method

    NASA Technical Reports Server (NTRS)

    Suorsa, Raymond E.; Sridhar, Banavar

    1993-01-01

    There are many proposed vision-based methods to perform obstacle detection and avoidance for autonomous or semi-autonomous vehicles. All methods, however, will require very high processing rates to achieve real-time performance. A system capable of supporting autonomous helicopter navigation will need to extract obstacle information from imagery at rates varying from ten frames per second to thirty or more frames per second, depending on the vehicle speed. Such a system will need to sustain billions of operations per second. To reach such high processing rates using current technology, a parallel implementation of the obstacle detection/ranging method is required. This paper describes an efficient and flexible parallel implementation of a multisensor feature-based range-estimation algorithm, targeted for helicopter flight, realized on both a distributed-memory and shared-memory parallel computer.
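    The "billions of operations per second" claim is back-of-envelope arithmetic on frame rate, frame size, and per-pixel work. The sketch below illustrates it; the 512×512 frame size and 500 operations per pixel are assumed illustrative figures, not numbers from the paper.

```python
def required_ops_per_sec(frames_per_sec, pixels_per_frame, ops_per_pixel):
    """Sustained compute rate needed to process every frame in real time."""
    return frames_per_sec * pixels_per_frame * ops_per_pixel

# 30 fps on assumed 512x512 imagery at an assumed 500 operations per pixel
rate = required_ops_per_sec(30, 512 * 512, 500)  # ~3.9e9 ops/s
```

Even these modest assumptions land in the billions-per-second range, which is the motivation for the parallel implementation the paper describes.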

  2. Rapid Tooling for Functional Prototype of Metal Mold Processes Final Report CRADA No. TC-1032-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heestand, G.; Jaskolski, T.

    Production inserts for die-casting were generally fabricated from materials with sufficient strength and good wear properties at casting temperatures for long life. Frequently tool steels were used, and machining was done with a combination of conventional and Electric Discharge Machining (EDM) with some handwork, an expensive and time consuming process, particularly for prototype work. We proposed electron beam physical vapor deposition (EBPVD) as a process for rapid fabrication of dies. Metals, ranging from low melting point to refractory metals (Ta, Mo, etc.), would be evaporated and deposited at high rates (~2 mm/hr). Alloys could be easily evaporated and deposited if their constituent vapor pressures were similar, and with more difficulty if they were not. Of course, layering of different materials was possible if required for a specific application. For example, a hard surface layer followed by a tough steel and backed by a high thermal conductivity (possibly cooled) copper layer could be fabricated. Electron-beam deposits exhibited 100% density and full strength when deposited at a substrate (mandrel) temperature that was a substantial fraction of the deposited material's melting point. There were several materials that could have the required high temperature properties and ease of fabrication required for such a mandrel. We had successfully used graphite, machined from free-formed objects with a replicator, to produce aluminum-bronze test molds. There were several parting layer materials of interest, but the ideal material depended upon the specific application.

  3. Rapid visual grouping and figure-ground processing using temporally structured displays.

    PubMed

    Cheadle, Samuel; Usher, Marius; Müller, Hermann J

    2010-08-23

    We examine the time course of visual grouping and figure-ground processing. Figure (contour) and ground (random-texture) elements were flickered with different phases (i.e., contour and background are alternated), requiring the observer to group information within a pre-specified time window. It was found that this grouping has a high temporal resolution: less than 20 ms for smooth contours, and less than 50 ms for line conjunctions with sharp angles. Furthermore, the grouping process takes place without explicit knowledge of the phase of the elements, and it requires a cumulative build-up of information. The results are discussed in relation to the neural mechanism for visual grouping and figure-ground segregation. Copyright 2010 Elsevier Ltd. All rights reserved.

  4. Regelation and ice segregation

    NASA Technical Reports Server (NTRS)

    Miller, Robert D.

    1988-01-01

    Macroscopic processes can have an important effect on the state of regolith water. The two primary mechanisms responsible for the formation of segregated ice on Earth, thermally induced regelation and hydraulic fracturing, are reviewed while their potential importance on Mars is examined. While regelation is the dominant terrestrial process, it requires a warmer and wetter environment than currently exists on Mars. In this respect, the conditions required for hydraulic fracturing are less demanding. In assessing its potential importance on Mars, it is noted that hydraulic fracturing can produce a localized zone of high pressure water that could readily disrupt an overburden of frozen ground. Such a process, it is concluded, may have triggered the release of groundwater that led to the formation of the major outflow channels.

  5. New levels of language processing complexity and organization revealed by granger causation.

    PubMed

    Gow, David W; Caplan, David N

    2012-01-01

    Granger causation analysis of high spatiotemporal resolution reconstructions of brain activation offers a new window on the dynamic interactions between brain areas that support language processing. Premised on the observation that causes both precede and uniquely predict their effects, this approach provides an intuitive, model-free means of identifying directed causal interactions in the brain. It requires the analysis of all non-redundant potentially interacting signals, and has shown that even "early" processes such as speech perception involve interactions of many areas in a strikingly large network that extends well beyond traditional left-hemisphere perisylvian cortex, playing out over hundreds of milliseconds. In this paper we describe this technique and review several general findings that reframe the way we think about language processing and brain function in general. These include the extent and complexity of language processing networks, the central role of interactive processing dynamics, the role of processing hubs where the input from many distinct brain regions is integrated, and the degree to which task requirements and stimulus properties influence processing dynamics and inform our understanding of "language-specific" localized processes.
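    The core idea, that a cause both precedes and uniquely predicts its effect, can be sketched with a minimal bivariate example: x "Granger-causes" y if past values of x improve prediction of y beyond what y's own past provides. The synthetic data, lag order, and coefficients below are illustrative, not the multivariate source-space models used in the paper.

```python
import numpy as np

# Minimal bivariate Granger-causation sketch. Synthetic data: y depends
# on its own past plus lagged x, so x should Granger-cause y.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

def residual_var(target, predictors):
    """Least-squares residual variance of target regressed on predictors."""
    A = np.column_stack(predictors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ coef)

# Restricted model: y[t] ~ y[t-1].  Full model adds lagged x.
restricted = residual_var(y[1:], [y[:-1]])
full = residual_var(y[1:], [y[:-1], x[:-1]])
gc_index = np.log(restricted / full)  # > 0 suggests directed x -> y influence
print(gc_index)
```

In this toy case the restricted model's residual variance is dominated by the omitted 0.8·x[t−1] term, so the index comes out far above zero; real analyses additionally test significance and condition on all other recorded signals.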

  6. Laser dimpling process parameters selection and optimization using surrogate-driven process capability space

    NASA Astrophysics Data System (ADS)

    Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz

    2017-08-01

    Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, remote laser welding of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized, and the vapour can be trapped within the melt pool, leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness, along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in the lap joint and allows the zinc vapour to escape during the welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of the dimpling process for effective implementation in a real manufacturing system, taking into consideration the inherent variability of process parameters. This paper introduces a methodology to develop (i) a surrogate model for dimpling process characterization, considering a multiple-input (i.e., key control characteristics) and multiple-output (i.e., key performance indicators) system, by conducting physical experimentation and using multivariate adaptive regression splines; (ii) a process capability space (Cp-Space), based on the developed surrogate model, that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and (iii) selection and optimization of the process parameters based on the process capability space. The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by the manufacturing process; (ii) model multiple and coupled quality requirements; and (iii) optimize process parameters under competing quality requirements, such as maximizing the dimple height while minimizing the dimple lower surface area.
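    The fallout-rate idea in step (ii) can be sketched as a Monte Carlo evaluation of a surrogate under stochastic parameter variation; the quadratic response surface, nominal settings, variation levels, and specification limits below are all invented stand-ins for the paper's fitted MARS model and real process data.

```python
import numpy as np

# Monte Carlo estimate of process fallout from a surrogate model under
# stochastic input variation. The "surrogate" here is a hypothetical
# quadratic response surface, not the fitted MARS model from the paper.
rng = np.random.default_rng(1)

def surrogate_dimple_height(power_w, speed_mm_s):
    # invented response surface for illustration only
    return 0.04 * power_w - 0.02 * speed_mm_s - 0.0001 * power_w ** 2

nominal_power, nominal_speed = 150.0, 30.0   # assumed nominal settings
sigma_power, sigma_speed = 5.0, 2.0          # assumed process variation

samples = 100_000
power = rng.normal(nominal_power, sigma_power, samples)
speed = rng.normal(nominal_speed, sigma_speed, samples)
height = surrogate_dimple_height(power, speed)

lower_spec, upper_spec = 3.0, 3.3            # assumed height limits (mm)
fallout = np.mean((height < lower_spec) | (height > upper_spec))
print(f"estimated fallout rate: {fallout:.3%}")
```

Sweeping the nominal settings over a grid and repeating this estimate is one way to trace out the capability space in which the fallout rate stays below a target.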

  7. Working capital management in the process of financial support of investment and construction projects and of the construction material industry

    NASA Astrophysics Data System (ADS)

    Danilochkina, Nadezhda; Lukmanova, Inessa; Roshchina, Olga; Voytolovskiy, Nikolay

    2018-03-01

    The article presents an analysis of working capital in the process of financial support of high-rise construction investment projects. The factors influencing the choice of the working capital management model were analyzed, and the reasons for changes in current asset requirements during the construction of high-rise facilities were determined. The authors developed a scheme of interrelation between the production, operational, and financial activity cycles of enterprises implementing investment projects for unique buildings and structures, and made a comparative description of their financing sources.

  8. Applying a punch with microridges in multistage deep drawing processes.

    PubMed

    Lin, Bor-Tsuen; Yang, Cheng-Yu

    2016-01-01

    The developers of high aspect ratio components aim to minimize the processing stages in deep drawing processes. This study elucidates the application of microridge punches in multistage deep drawing processes. A microridge punch improves drawing performance, thereby reducing the number of stages required in deep forming processes. As an example, the original eight-stage deep forming process for a copper cylindrical cup with a high aspect ratio was analyzed by finite element simulation. Microridge punch designs were introduced in Stages 4 and 7 to replace the original punches. In addition, Stages 3 and 6 were eliminated. Finally, these changes were verified through experiments. The results showed that the microridge punches reduced the number of deep drawing stages yielding similar thickness difference percentages. Further, the numerical and experimental results demonstrated good consistency in the thickness distribution.

  9. Improving conversion yield of fermentable sugars into fuel ethanol in 1st generation yeast-based production processes.

    PubMed

    Gombert, Andreas K; van Maris, Antonius J A

    2015-06-01

    Current fuel ethanol production using yeasts and starch or sucrose-based feedstocks is referred to as 1st generation (1G) ethanol production. These processes are characterized by the high contribution of sugar prices to the final production costs, by high production volumes, and by low profit margins. In this context, small improvements in the ethanol yield on sugars have a large impact on process economy. Three types of strategies used to achieve this goal are discussed: engineering free-energy conservation, engineering redox-metabolism, and decreasing sugar losses in the process. Whereas the two former strategies lead to decreased biomass and/or glycerol formation, the latter requires increased process and/or yeast robustness. Copyright © 2014 Elsevier Ltd. All rights reserved.
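    The leverage of small yield gains can be illustrated with simple stoichiometry: fermentation of glucose yields at most 0.511 g ethanol per g sugar (2 ethanol + 2 CO2 per glucose, 92/180), so even a two-point gain in the fraction of that maximum matters at plant scale. The plant throughput and baseline yield below are illustrative assumptions.

```python
# Impact of a small ethanol-yield improvement in a 1G plant.
# 0.511 g ethanol / g glucose is the stoichiometric maximum
# (2 x 46 g ethanol per 180 g glucose); plant scale and baseline
# yield fractions are illustrative assumptions.
STOICH_MAX = 0.511  # g ethanol per g glucose

def annual_extra_ethanol_tonnes(sugar_tonnes, baseline_frac, improved_frac):
    """Extra ethanol from raising yield from baseline to improved
    fraction of the stoichiometric maximum, at a given sugar intake."""
    gain = (improved_frac - baseline_frac) * STOICH_MAX
    return sugar_tonnes * gain

# e.g. a plant fermenting 500,000 t sugar/yr moving from 90% to 92% of max
print(annual_extra_ethanol_tonnes(500_000, 0.90, 0.92))  # ~5110 t/yr extra
```

At typical ethanol prices this order-of-magnitude gain illustrates why the abstract stresses that small yield improvements dominate 1G process economics.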

  10. Optical sectioning microscopy using two-frame structured illumination and Hilbert-Huang data processing

    NASA Astrophysics Data System (ADS)

    Trusiak, M.; Patorski, K.; Tkaczyk, T.

    2014-12-01

    We propose a fast, simple, and experimentally robust method for reconstructing background-rejected optically-sectioned microscopic images using a two-shot structured illumination approach. The data demodulation technique requires two grid-illumination images mutually phase shifted by π (half a grid period), but the precise phase displacement value is not critical. Upon subtraction of the two frames, an input pattern with increased grid modulation is computed. The proposed demodulation procedure comprises: (1) two-dimensional data processing based on the enhanced fast empirical mode decomposition (EFEMD) method for object spatial frequency selection (noise reduction and bias term removal), and (2) calculation of a high-contrast optically-sectioned image using the two-dimensional spiral Hilbert transform (HS). The effectiveness of the proposed algorithm is compared with results obtained for the same input data using conventional structured-illumination microscopy (SIM) and HiLo microscopy methods. The input data were collected from highly scattering tissue samples in reflectance mode. In comparison with the conventional three-frame SIM technique, we need one frame less, and no stringent requirement on the exact phase shift between recorded frames is imposed. The HiLo algorithm outcome is strongly dependent on a set of parameters chosen manually by the operator (cut-off frequencies for low-pass and high-pass filtering, and the η parameter value for optically-sectioned image reconstruction), whereas the proposed method is parameter-free. Moreover, the very short processing time required to demodulate the input pattern makes the proposed method well suited for real-time in-vivo studies. The current implementation completes full processing in 0.25 s on a medium-class PC (Intel i7 2.1 GHz processor and 8 GB RAM). A simple modification that extracts only the first two BIMFs with a fixed filter window size reduces the computing time to 0.11 s (8 frames/s).
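    A minimal sketch of the two-frame demodulation idea: subtract the two π-shifted frames, then recover the fringe envelope with a 2D spiral-phase (vortex) Hilbert filter. The EFEMD pre-filtering stage is omitted, and the object and grid pattern are synthetic stand-ins for real microscope frames.

```python
import numpy as np

# Two-frame structured-illumination demodulation sketch: subtracting two
# pi-shifted grid frames doubles the modulation and cancels common bias;
# the spiral-phase (vortex) filter then supplies the quadrature component
# needed to compute the fringe envelope (the sectioned image).

def fringe_envelope(pattern):
    """Envelope of a 2D fringe pattern via the spiral-phase filter."""
    ny, nx = pattern.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    spiral = np.exp(1j * np.arctan2(fy, fx))
    spiral[0, 0] = 0.0  # zero the DC singularity
    quad = np.fft.ifft2(spiral * np.fft.fft2(pattern))
    return np.sqrt(pattern ** 2 + np.abs(quad) ** 2)

# synthetic in-focus object modulated by a grid; two frames pi apart
y, x = np.mgrid[0:128, 0:128]
obj = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / 400.0)
grid1 = obj * np.sin(2 * np.pi * x / 8)
grid2 = obj * np.sin(2 * np.pi * x / 8 + np.pi)
sectioned = fringe_envelope(grid1 - grid2)  # background-rejected image
```

The envelope peaks where the object is in focus and falls to near zero away from it, which is the optical-sectioning effect; real data would additionally pass through EFEMD to suppress noise and residual bias before the Hilbert step.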

  11. KENNEDY SPACE CENTER, FLA. - This bird's-eye view of a high bay in the Orbiter Processing Facility (OPF) shows the open payload bay of Space Shuttle Discovery surrounded by the standard platforms and equipment required to process a Space Shuttle orbiter. The high bay is 197 feet (60 meters) long, 150 feet (46 meters) wide, 95 feet (29 meters) high, and encompasses a 29,000-square-foot (2,694-meter) area. The 30-ton (27-metric-ton) bridge crane (yellow device, right) has a hook height of approximately 66 feet (20 meters). Platforms, a main access bridge, and two rolling bridges with trucks provide access to various parts of the orbiter. In addition to routine servicing and checkout, the inspections and modifications made to enhance Discovery's performance and upgrade its systems were performed in the OPF during its recently completed Orbiter Major Modification (OMM) period.

    NASA Image and Video Library

    2003-09-02


  12. Metrology requirements for the serial production of ELT primary mirror segments

    NASA Astrophysics Data System (ADS)

    Rees, Paul C. T.; Gray, Caroline

    2015-08-01

    The manufacture of the next generation of large astronomical telescopes, the extremely large telescopes (ELTs), requires the rapid production of more than 500 1.44 m hexagonal segments for the primary mirror of each telescope. Both leading projects, the Thirty Meter Telescope (TMT) and the European Extremely Large Telescope (E-ELT), have set highly demanding technical requirements for each fabricated segment. These technical requirements, when combined with the anticipated construction schedule for each telescope, suggest that more than one optical fabricator will be involved in the delivery of the primary mirror segments in order to meet the project schedule. For a single supplier, the technical specification is challenging and requires highly consistent control of metrology, in close coordination with the polishing technologies used, in order to optimize production rates. For production using multiple suppliers, however the supply chain is structured, consistent control of metrology along that chain will be needed, which demands a broader pattern of independent verification than in the single-supplier case. This paper outlines the metrology requirements for a single supplier throughout all stages of the fabrication process. We identify and outline those areas where metrology accuracy and duration have a significant impact on production efficiency. We use the challenging ESO E-ELT technical specification as an example of our treatment, including actual process data. We further develop this model for the case of a supply chain consisting of multiple suppliers. Here, we emphasize the need to control metrology throughout the supply chain in order to optimize net production efficiency.

  13. HIPAA: update on rule revisions and compliance requirements.

    PubMed

    Maddox, P J

    2002-01-01

    Due to the highly technical requirements for HIPAA compliance and the numerous administrative and clinical functions and processes involved, guidance from experts who are knowledgeable about designing and using systems to secure private data is necessary. In health care organizations, this will require individuals who are knowledgeable about clinical processes and those who understand health information technology, security, and privacy to work together to establish an entity's compliance plans and revise operations and practices accordingly. As a precondition of designing such systems, it is essential that covered entities understand HIPAA's statutory requirements and timeline for compliance. An organization's success in preparing for HIPAA will depend upon an active program of assessment, planning, and implementation. Compliance with security and privacy standards can be expected to increase costs initially; however, greater use of EDI is expected to reduce costs and enhance revenues in the long run if processes and systems are improved. NOTE: Special protection for psychotherapy notes holds them to a higher standard of protection. Notes used only by a psychotherapist are not intended to be shared with anyone and are not considered part of the medical record.

  14. Noncontact temperature measurement: Requirements and applications for metals and alloys research

    NASA Technical Reports Server (NTRS)

    Perepezko, J. H.

    1988-01-01

    Temperature measurement is an essential capability for almost all areas of metals and alloys research. In the microgravity environment many of the science priorities that have been identified for metals and alloys also require noncontact temperature measurement capability. For example, in order to exploit the full potential of containerless processing, it is critical to have available a suitable noncontact temperature measurement system. This system is needed to track continuously the thermal history, including melt undercooling and rapid recalescence, of relatively small metal spheres during free-fall motion in drop tube systems. During containerless processing with levitation-based equipment, accurate noncontact temperature measurement is required to monitor one or more quasi-static samples with sufficient spatial and thermal resolution to follow the progress of solidification fronts originating in undercooled melts. In crystal growth, thermal migration, coarsening and other experiments high resolution thermal maps would be a valuable asset in the understanding and modeling of solidification processes, fluid flows and microstructure development. The science and applications requirements place several constraints on the spatial resolution, response time and accuracy of suitable instrumentation.

  15. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. 
In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
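    The chunked-processing idea behind the second application can be sketched generically: tile the index space, process each tile independently so memory stays bounded, and reassemble. The tile size and the per-tile operation below are illustrative placeholders for OCGIS-style spatial subsetting and regridding.

```python
import numpy as np

# Chunk-wise processing sketch: split a large 2D field into tiles so each
# step touches only a bounded slice of memory, then reassemble the result.
# The per-tile doubling stands in for an expensive regridding operation.

def iter_chunks(shape, tile):
    """Yield (row_slice, col_slice) tiles covering a 2D index space."""
    rows, cols = shape
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            yield (slice(r, min(r + tile, rows)),
                   slice(c, min(c + tile, cols)))

field = np.random.default_rng(0).random((1000, 1200))
out = np.empty_like(field)
for rs, cs in iter_chunks(field.shape, tile=256):
    out[rs, cs] = field[rs, cs] * 2.0  # stand-in for per-chunk regridding

print(np.allclose(out, field * 2.0))  # chunked result matches whole-array op
```

For operations with spatial support wider than a point, such as conservative interpolation, each tile would be read with a halo of overlap cells; the bookkeeping grows, but the memory ceiling argument is the same.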

  16. High Bar Swing Performance in Novice Adults: Effects of Practice and Talent

    ERIC Educational Resources Information Center

    Busquets, Albert; Marina, Michel; Irurtia, Alfredo; Ranz, Daniel; Angulo-Barroso, Rosa M.

    2011-01-01

    An individual's a priori talent can affect movement performance during learning. Also, task requirements and motor-perceptual factors are critical to the learning process. This study describes changes in high bar swing performance after a 2-month practice period. Twenty-five novice participants were divided by a priori talent level…

  17. Approaches for geospatial processing of field-based high-throughput plant phenomics data from ground vehicle platforms

    USDA-ARS?s Scientific Manuscript database

    Understanding the genetic basis of complex plant traits requires connecting genotype to phenotype information, known as the “G2P question.” In the last three decades, genotyping methods have become highly developed. Much less innovation has occurred for measuring plant traits (phenotyping), particul...

  18. Super-emitters in natural gas infrastructure are caused by abnormal process conditions

    NASA Astrophysics Data System (ADS)

    Zavala-Araiza, Daniel; Alvarez, Ramón A.; Lyon, David R.; Allen, David T.; Marchese, Anthony J.; Zimmerle, Daniel J.; Hamburg, Steven P.

    2017-01-01

    Effectively mitigating methane emissions from the natural gas supply chain requires addressing the disproportionate influence of high-emitting sources. Here we use a Monte Carlo simulation to aggregate methane emissions from all components on natural gas production sites in the Barnett Shale production region (Texas). Our total emission estimates are two-thirds of those derived from independent site-based measurements. Although some high-emitting operations occur by design (condensate flashing and liquid unloadings), they occur more than an order of magnitude less frequently than required to explain the reported frequency at which high site-based emissions are observed. We conclude that the occurrence of abnormal process conditions (for example, malfunctions upstream of the point of emissions; equipment issues) cause additional emissions that explain the gap between component-based and site-based emissions. Such abnormal conditions can cause a substantial proportion of a site's gas production to be emitted to the atmosphere and are the defining attribute of super-emitting sites.
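    The component-to-site aggregation can be sketched as a Monte Carlo sum of skewed component-level draws, which naturally produces a small fraction of high-emitting sites; the lognormal parameters and component counts below are illustrative, not the Barnett Shale distributions fitted in the study.

```python
import numpy as np

# Monte Carlo aggregation sketch: simulate site-level methane emissions as
# sums of heavy-tailed (lognormal) component-level draws and measure how
# much of the total comes from the highest-emitting sites. All parameters
# are illustrative, not the study's fitted distributions.
rng = np.random.default_rng(42)

n_sites, components_per_site = 10_000, 25
component_draws = rng.lognormal(mean=-2.0, sigma=1.8,
                                size=(n_sites, components_per_site))
site_totals = component_draws.sum(axis=1)  # kg/hr per simulated site

# share of total emissions contributed by the top 5% of sites
top5_cut = np.quantile(site_totals, 0.95)
share_top5 = site_totals[site_totals >= top5_cut].sum() / site_totals.sum()
print(f"top 5% of sites emit {share_top5:.0%} of the total")
```

Comparing such component-based totals against independent site-based measurements is the study's diagnostic: a persistent gap points to abnormal process conditions not captured by the component distributions.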

  19. Supervisory control and diagnostics system for the mirror fusion test facility: overview and status 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGoldrick, P.R.

    1981-01-01

    The Mirror Fusion Test Facility (MFTF) is a complex facility requiring a highly computerized Supervisory Control and Diagnostics System (SCDS) to monitor and provide control over ten subsystems, three of which require true process control. SCDS will provide physicists with a method of studying machine and plasma behavior by acquiring and processing up to four megabytes of plasma diagnostic information every five minutes. A high degree of availability and throughput is provided by a distributed computer system (nine 32-bit minicomputers on shared memory). Data, distributed across SCDS, is managed by a high-bandwidth Distributed Database Management System. The MFTF operators' control room consoles use color television monitors with touch-sensitive screens; this is a totally new approach. The method of handling deviations from normal machine operation, and how the operator should be notified and assisted in the resolution of problems, has been studied and a system designed.

  20. Energy efficient engine high-pressure turbine single crystal vane and blade fabrication technology report

    NASA Technical Reports Server (NTRS)

    Giamei, A. F.; Salkeld, R. W.; Hayes, C. W.

    1981-01-01

    The objective of the High-Pressure Turbine Fabrication Program was to demonstrate the application and feasibility of Pratt & Whitney Aircraft-developed two-piece, single crystal casting and bonding technology on the turbine blade and vane configurations required for the high-pressure turbine in the Energy Efficient Engine. During the first phase of the program, casting feasibility was demonstrated. Several blade and vane halves were made for the bonding trials, plus solid blades and vanes were successfully cast for materials evaluation tests. Specimens exhibited the required microstructure and chemical composition. Bonding feasibility was demonstrated in the second phase of the effort. Bonding yields of 75 percent for the vane and 30 percent for the blade were achieved, and methods for improving these yield percentages were identified. A bond process was established for PWA 1480 single crystal material which incorporated a transient liquid phase interlayer. Bond properties were substantiated and sensitivities determined. Tooling die materials were identified, and an advanced differential thermal expansion tooling concept was incorporated into the bond process.
