Sample records for improved processing techniques

  1. Improving Service Delivery in a County Health Department WIC Clinic: An Application of Statistical Process Control Techniques

    PubMed Central

    Boe, Debra Thingstad; Parsons, Helen

    2009-01-01

    Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964
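    A minimal sketch of the statistical process control idea behind this study: an individuals (I) control chart for clinic waiting times, with control limits derived from the moving-range estimate of sigma. The waiting-time values below are hypothetical, not data from the WIC clinic.

```python
# Individuals (I) control chart for waiting times: a hedged sketch of
# the SPC technique the abstract describes.  Data are hypothetical.

def i_chart_limits(samples):
    """Return (center, lcl, ucl) using the moving-range estimate of sigma."""
    n = len(samples)
    center = sum(samples) / n
    moving_ranges = [abs(samples[i] - samples[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128          # d2 constant for subgroups of size 2
    return center, center - 3 * sigma, center + 3 * sigma

waits = [22, 25, 19, 30, 24, 27, 21, 26, 23, 28]  # minutes (hypothetical)
center, lcl, ucl = i_chart_limits(waits)
out_of_control = [w for w in waits if not (lcl <= w <= ucl)]
```

Points falling outside (lcl, ucl) would signal special-cause variation worth investigating before and after an intervention.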

  2. Applications of process improvement techniques to improve workflow in abdominal imaging.

    PubMed

    Tamm, Eric Peter

    2016-03-01

    Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.

  3. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the Software Engineering Institute's (SEI) Capability Maturity Model Integration (CMMI) models as de facto process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study provides a detailed discussion of the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
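    A hedged illustration of the Monte Carlo element described above: simulating a customer-satisfaction index from uncertain driver scores and averaging over many trials to establish a baseline. The driver names, weights, and distributions are assumptions for illustration only, not the study's ACSI model.

```python
# Monte Carlo sketch: predict a satisfaction-index baseline from
# uncertain inputs.  Weights and distributions are illustrative.
import random

random.seed(42)

def simulate_acsi(n_trials=10_000):
    """Draw driver scores, combine with assumed weights, return mean index."""
    scores = []
    for _ in range(n_trials):
        expectations = random.gauss(80, 5)   # hypothetical driver means
        quality = random.gauss(78, 4)
        value = random.gauss(75, 6)
        index = 0.3 * expectations + 0.5 * quality + 0.2 * value
        scores.append(index)
    return sum(scores) / len(scores)

baseline = simulate_acsi()
```

A sensitivity analysis, as mentioned in the abstract, would vary one driver's distribution at a time and observe the effect on the baseline.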

  4. Employee empowerment through team building and use of process control methods.

    PubMed

    Willems, S

    1998-02-01

    The article examines the use of statistical process control and performance improvement techniques in employee empowerment. The focus is how these techniques provide employees with information to improve their productivity and become involved in the decision-making process. Findings suggest that at one Mississippi hospital employee improvement has had a positive effect on employee productivity, morale, and quality of work.

  5. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained through BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML and the implementation is carried out with UML techniques, an improvement in the efficiency of information system implementation can be expected. In this paper, we describe a system development method that converts the process model obtained by BPM into UML, and we evaluate the method by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by the conventional UML technique without going via BPM.

  6. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  7. Ensemble analyses improve signatures of tumour hypoxia and reveal inter-platform differences

    PubMed Central

    2014-01-01

    Background The reproducibility of transcriptomic biomarkers across datasets remains poor, limiting clinical application. We and others have suggested that this is caused in part by differential error structure between datasets and its incomplete removal by pre-processing algorithms. Methods To test this hypothesis, we systematically assessed the effects of pre-processing on biomarker classification using 24 different pre-processing methods and 15 distinct signatures of tumour hypoxia in 10 datasets (2,143 patients). Results We confirm strong pre-processing effects for all datasets and signatures, and find that these differ between microarray versions. Importantly, exploiting different pre-processing techniques in an ensemble improved classification for a majority of signatures. Conclusions Assessing biomarkers using an ensemble of pre-processing techniques shows clear value across multiple diseases, datasets and biomarkers. Importantly, ensemble classification improves biomarkers with initially good results but does not result in spuriously improved performance for poor biomarkers. While further research is required, this approach has the potential to become a standard for transcriptomic biomarkers. PMID:24902696
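    A sketch of the ensemble idea in this abstract: classify the same sample under several pre-processing variants and take a majority vote over the per-variant calls. The pre-processing variants and the toy classifier below are stand-ins for illustration, not the signatures or algorithms from the paper.

```python
# Ensemble over pre-processing variants with majority voting (sketch).
from collections import Counter

def ensemble_classify(sample, preprocess_fns, classify):
    """Apply each pre-processing method, classify, and majority-vote."""
    votes = [classify(fn(sample)) for fn in preprocess_fns]
    return Counter(votes).most_common(1)[0][0]

# Hypothetical pre-processing variants: raw, mean-centred, max-rescaled.
methods = [
    lambda xs: xs,
    lambda xs: [x - sum(xs) / len(xs) for x in xs],
    lambda xs: [x / max(xs) for x in xs],
]

# Toy classifier: "hypoxic" if more than half the values are positive.
def label(xs):
    return "hypoxic" if sum(x > 0 for x in xs) > len(xs) / 2 else "normoxic"

call = ensemble_classify([0.4, 0.9, -0.1, 0.7], methods, label)
```

The vote makes the final call less sensitive to any single pre-processing choice, which is the behaviour the study reports.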

  8. Flash X-ray with image enhancement applied to combustion events

    NASA Astrophysics Data System (ADS)

    White, K. J.; McCoy, D. G.

    1983-10-01

    Flow visualization of interior ballistic processes by use of X-rays has placed more stringent requirements on flash X-ray techniques. The problem of improving radiographic contrast of propellants in X-ray transparent chambers was studied by devising techniques for evaluating, measuring and reducing the effects of scattering from both the test object and structures in the test area. X-ray film and processing is reviewed and techniques for evaluating and calibrating these are outlined. Finally, after X-ray techniques were optimized, the application of image enhancement processing which can improve image quality is described. This technique was applied to X-ray studies of the combustion of very high burning rate (VHBR) propellants and stick propellant charges.

  9. MORS Workshop on Improving Defense Analysis through Better Data Practices, held in Alexandria, Virginia on March 25, 26 and 27, 2003

    DTIC Science & Technology

    2004-12-03

    other process improvements could also enhance DoD data practices. These include the incorporation of library science techniques as well as processes to...coalition communities as well as adapting the approaches and lessons of the library science community. Second, there is a need to generate a plan of...Best Practices (2 of 2) - Processes - Incorporate library science techniques in repository design - Improve visibility and accessibility of DoD data

  10. Improved Photoresist Coating for Making CNT Field Emitters

    NASA Technical Reports Server (NTRS)

    Toda, Risaku; Manohara, Harish

    2009-01-01

    An improved photoresist-coating technique for use in the fabrication of carbon-nanotube (CNT)-based field emitters is described. The improved technique overcomes what has heretofore been a major difficulty in the fabrication process.

  11. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
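    A simple illustration of the kind of noise-removal filtering mentioned above: a 3x3 mean filter applied to a small grayscale image, written in plain Python so the mechanics are visible. Real imagery pipelines would use dedicated image processing libraries; the "image" here is a toy array.

```python
# 3x3 mean filter for noise removal (sketch; toy data).

def mean_filter(img):
    """Replace each interior pixel with the mean of its 3x3 neighbourhood
    (border pixels are left unchanged)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sum(window) / 9.0
    return out

noisy = [
    [10, 10, 10, 10],
    [10, 90, 10, 10],   # a single bright noise spike
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
smoothed = mean_filter(noisy)
```

The spike at (1, 1) is spread over its neighbourhood and strongly attenuated, which is the basic effect exploited when cleaning deep-sea photographs or sonar imagery.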

  12. Densification of powder metallurgy billets by a roll consolidation technique

    NASA Technical Reports Server (NTRS)

    Sellman, W. H.; Weinberger, W. R.

    1973-01-01

    Container design is used to convert partially densified powder metallurgy compacts into fully densified slabs in one processing step. The technique improves product yield, lowers costs, and provides great flexibility in process scale-up. It is applicable to all types of fabricable metallic materials produced by the powder metallurgy process.

  13. Applied in situ product recovery in ABE fermentation

    PubMed Central

    Lalander, Carl‐Axel; Lee, Jonathan G. M.; Davies, E. Timothy; Harvey, Adam P.

    2017-01-01

    The production of biobutanol is hindered by the product's toxicity to the bacteria, which limits the productivity of the process. In situ product recovery of butanol can improve the productivity by removing the source of inhibition. This paper reviews in situ product recovery techniques applied to the acetone butanol ethanol fermentation in a stirred tank reactor. Methods of in situ recovery include gas stripping, vacuum fermentation, pervaporation, liquid–liquid extraction, perstraction, and adsorption, all of which have been investigated for the acetone, butanol, and ethanol fermentation. All techniques have shown an improvement in substrate utilization, yield, productivity or both. Different fermentation modes favored different techniques. For batch processing gas stripping and pervaporation were most favorable, but in fed‐batch fermentations gas stripping and adsorption were most promising. During continuous processing perstraction appeared to offer the best improvement. The use of hybrid techniques can increase the final product concentration beyond that of single‐stage techniques. Therefore, the selection of an in situ product recovery technique would require comparable information on the energy demand and economics of the process. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:563–579, 2017 PMID:28188696

  14. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  15. Lean principles optimize on-time vascular surgery operating room starts and decrease resident work hours.

    PubMed

    Warner, Courtney J; Walsh, Daniel B; Horvath, Alexander J; Walsh, Teri R; Herrick, Daniel P; Prentiss, Steven J; Powell, Richard J

    2013-11-01

    Lean process improvement techniques are used in industry to improve efficiency and quality while controlling costs. These techniques are less commonly applied in health care. This study assessed the effectiveness of Lean principles on first case on-time operating room starts and quantified effects on resident work hours. Standard process improvement techniques (DMAIC methodology: define, measure, analyze, improve, control) were used to identify causes of delayed vascular surgery first case starts. Value stream maps and process flow diagrams were created. Process data were analyzed with Pareto and control charts. High-yield changes were identified and simulated in computer and live settings prior to implementation. The primary outcome measure was the proportion of on-time first case starts; secondary outcomes included hospital costs, resident rounding time, and work hours. Data were compared with existing benchmarks. Prior to implementation, 39% of first cases started on time. Process mapping identified late resident arrival in preoperative holding as a cause of delayed first case starts. Resident rounding process inefficiencies were identified and changed through the use of checklists, standardization, and elimination of nonvalue-added activity. Following implementation of process improvements, first case on-time starts improved to 71% at 6 weeks (P = .002). Improvement was sustained with an 86% on-time rate at 1 year (P < .001). Resident rounding time was reduced by 33% (from 70 to 47 minutes). At 9 weeks following implementation, these changes generated an opportunity cost potential of $12,582. Use of Lean principles allowed rapid identification and implementation of perioperative process changes that improved efficiency and resulted in significant cost savings. This improvement was sustained at 1 year. Downstream effects included improved resident efficiency with decreased work hours. Copyright © 2013 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
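    A hedged sketch of how before/after on-time proportions like those reported above could be compared with a two-proportion z-test. The denominators of 100 cases per period are hypothetical, chosen only to match the reported 39% and 71% rates; the study's actual counts are not given here.

```python
# Two-proportion z-test on before/after on-time start rates (sketch).
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

z = two_proportion_z(39, 100, 71, 100)   # hypothetical denominators
significant = z > 1.96                    # two-sided 5% level
```

A z well above 1.96 is consistent with the small P values the study reports for the improvement.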

  16. A continuous quality improvement project to improve the quality of cervical Papanicolaou smears.

    PubMed

    Burkman, R T; Ward, R; Balchandani, K; Kini, S

    1994-09-01

    To improve the quality of cervical Papanicolaou smears by continuous quality improvement techniques. The study used a Papanicolaou smear database of over 200,000 specimens collected between June 1988 and December 1992. A team approach employing techniques such as process flow-charting, cause and effect diagrams, run charts, and a randomized trial of collection methods was used to evaluate potential causes of Papanicolaou smear reports with the notation "inadequate" or "less than optimal" due to too few or absent endocervical cells. Once a key process variable (method of collection) was identified, the proportion of Papanicolaou smears with inadequate or absent endocervical cells was determined before and after employment of a collection technique using a spatula and Cytobrush. We measured the rate of less than optimal Papanicolaou smears due to too few or absent endocervical cells. Before implementing the new collection technique fully by June 1990, the overall rate of less than optimal cervical Papanicolaou smears ranged from 20-25%; by December 1993, it had stabilized at about 10%. Continuous quality improvement can be used successfully to study a clinical process and implement change that will lead to improvement.

  17. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    PubMed Central

    Hernandez, Wilmar

    2007-01-01

    This paper surveys recent applications of optimal signal processing techniques for improving the performance of mechanical sensors. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be made overnight, because there are open research issues that have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
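    A minimal scalar Kalman filter, one member of the "optimal filtering" family that surveys like this contrast with classical fixed filters. The process and measurement noise variances and the sensor readings below are assumed values for illustration, not parameters from the paper.

```python
# Scalar Kalman filter for a noisy constant-state sensor signal (sketch).

def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Filter a roughly constant signal observed with noise variance r;
    q is the assumed process-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                    # predict (state assumed constant)
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with measurement z
        p = (1 - k) * p
        estimates.append(x)
    return estimates

readings = [5.2, 4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.1]  # hypothetical sensor data
est = kalman_1d(readings)
```

Unlike a fixed low-pass filter, the gain k adapts as the error covariance p shrinks, weighting new measurements less once the estimate has settled.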

  18. Application of off-line image processing for optimization in chest computed radiography using a low cost system.

    PubMed

    Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato

    2015-03-08

    The objective of this study was to improve the visibility of anatomical details by applying off-line post-image processing in chest computed radiography (CR). Four spatial domain-based external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were applied to sample images, and their visual appearances were confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with another 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The means and ranges of the average scores of the three radiologists were characterized for each of the developed techniques and imaging systems. The Mann-Whitney U-test was used to test the difference in detail visibility between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity value adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but it should be implemented in consultation with the radiologists.
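    A sketch of one simple intensity-value-adjustment step of the kind the study combined with spatial linear filtering: a linear contrast stretch that maps a narrow pixel range onto the full display range. Implemented in plain Python for clarity; clinical use would rely on validated tools, and the pixel values here are hypothetical.

```python
# Linear contrast stretch: one spatial-domain intensity adjustment (sketch).

def contrast_stretch(pixels, lo=0, hi=255):
    """Linearly map the observed pixel range [min, max] onto [lo, hi]."""
    pmin, pmax = min(pixels), max(pixels)
    if pmax == pmin:
        return [lo] * len(pixels)
    scale = (hi - lo) / (pmax - pmin)
    return [lo + (p - pmin) * scale for p in pixels]

flat_image = [60, 80, 100, 120, 140]   # hypothetical narrow-range pixels
stretched = contrast_stretch(flat_image)
```

After stretching, the darkest pixel sits at 0 and the brightest at 255, widening the grey-level separation between adjacent anatomical details.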

  19. Application of off‐line image processing for optimization in chest computed radiography using a low cost system

    PubMed Central

    Msaki, Peter; Padovani, Renato

    2015-01-01

    The objective of this study was to improve the visibility of anatomical details by applying off-line post-image processing in chest computed radiography (CR). Four spatial domain-based external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were applied to sample images, and their visual appearances were confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with another 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The means and ranges of the average scores of the three radiologists were characterized for each of the developed techniques and imaging systems. The Mann-Whitney U-test was used to test the difference in detail visibility between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity value adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but it should be implemented in consultation with the radiologists. PACS numbers: 87.59.−e, 87.59.−B, 87.59.−bd PMID:26103165

  20. A new data processing technique for Rayleigh-Taylor instability growth experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan, Yongteng; Tu, Shaoyong; Miao, Wenyong

    Typical face-on experiments for Rayleigh-Taylor instability studies involve time-resolved radiography of an accelerated foil, with the line-of-sight of the radiography along the direction of motion. The usual method, which derives perturbation amplitudes from the face-on images, reverses the actual image transmission procedure, so the obtained results have a large error in the case of large optical depth. In order to improve the accuracy of data processing, a new technique has been developed to process the face-on images. This technique is based on the convolution theorem; refined solutions for the optical depth can be achieved by solving equations. Furthermore, we discuss both techniques for image processing, including the influence of the modulation transfer function of the imaging system and the backlighter spatial profile. Finally, we use the two methods to process experimental results from the Shenguang-II laser facility, and the comparison shows that the new method effectively improves the accuracy of data processing.
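    The method above rests on the convolution theorem: the DFT of a (circular) convolution equals the pointwise product of the DFTs. A tiny numerical check with a naive DFT, purely to illustrate the identity the technique exploits (the sequences are arbitrary toy data):

```python
# Numerical check of the convolution theorem with a naive DFT (sketch).
import cmath

def dft(xs):
    """Naive discrete Fourier transform of a real or complex sequence."""
    n = len(xs)
    return [sum(xs[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n))
            for j in range(n)]

def circular_convolve(a, b):
    """Circular convolution of two equal-length sequences."""
    n = len(a)
    return [sum(a[k] * b[(i - k) % n] for k in range(n)) for i in range(n)]

a, b = [1.0, 2.0, 3.0, 4.0], [0.5, 0.25, 0.0, 0.25]
lhs = dft(circular_convolve(a, b))              # DFT of the convolution
rhs = [x * y for x, y in zip(dft(a), dft(b))]   # product of the DFTs
ok = all(abs(l - r) < 1e-9 for l, r in zip(lhs, rhs))
```

Solving for a refined optical-depth profile then amounts to inverting this relation, i.e. dividing in the frequency domain instead of deconvolving directly in image space.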

  1. Parametric and Generative Design Techniques for Digitalization in Building Industry: the Case Study of Glued- Laminated-Timber Industry

    NASA Astrophysics Data System (ADS)

    Pasetti Monizza, G.; Matt, D. T.; Benedetti, C.

    2016-11-01

    According to the Wortmann classification, the Building Industry (BI) can be defined as an engineer-to-order (ETO) industry: the engineering process starts only when an order is acquired. This definition implies that every final product (building) is almost unique and processes cannot be easily standardized or automated. Because of this, BI is one of the less efficient industries today, mostly led by craftsmanship. In recent years, several improvements in process efficiency have been made, focusing on manufacturing and installation processes only. In order to improve the efficiency of design and engineering processes as well, the scientific community agrees that the most fruitful strategy should be Front-End Design (FED). Nevertheless, effective techniques and tools are missing. This paper discusses the outcomes of a research activity that aims at highlighting whether Parametric and Generative Design techniques allow reducing wastes of resources and improving the overall efficiency of the BI by pushing the digitalization of the design and engineering processes of products. Focusing on the Glued-Laminated-Timber industry, the authors show how Parametric and Generative Design techniques can be introduced in a standard supply-chain system, highlighting potentials and criticisms of the supply-chain system as a whole.

  2. Application of ultrasound to improve lees ageing processes in red wines.

    PubMed

    Del Fresno, Juan Manuel; Loira, Iris; Morata, Antonio; González, Carmen; Suárez-Lepe, Jose Antonio; Cuerda, Rafael

    2018-09-30

    Ageing on lees (AOL) is a technique that increases volatile compounds, promotes colour stability, improves mouthfeel and reduces astringency in red wines. The main drawback is that it is a slow process. Several months are necessary to obtain perceptible effects in wines. Different authors have studied the application of new techniques to accelerate the AOL process. Ultrasound (US) has been used to improve different food industry processes; it could be interesting to accelerate the yeast autolysis during AOL. This work evaluates the use of the US technique together with AOL and oak chips for this purpose studying the effects of different oenological parameters of red wines. The results obtained indicate an increase of polysaccharides content when US is applied in wine AOL. In addition, total polyphenol index (TPI) and volatile acidity were not affected. However, this treatment increases the dissolved oxygen affecting the volatile compounds and total anthocyanins. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Applied in situ product recovery in ABE fermentation.

    PubMed

    Outram, Victoria; Lalander, Carl-Axel; Lee, Jonathan G M; Davies, E Timothy; Harvey, Adam P

    2017-05-01

    The production of biobutanol is hindered by the product's toxicity to the bacteria, which limits the productivity of the process. In situ product recovery of butanol can improve the productivity by removing the source of inhibition. This paper reviews in situ product recovery techniques applied to the acetone butanol ethanol fermentation in a stirred tank reactor. Methods of in situ recovery include gas stripping, vacuum fermentation, pervaporation, liquid-liquid extraction, perstraction, and adsorption, all of which have been investigated for the acetone, butanol, and ethanol fermentation. All techniques have shown an improvement in substrate utilization, yield, productivity or both. Different fermentation modes favored different techniques. For batch processing gas stripping and pervaporation were most favorable, but in fed-batch fermentations gas stripping and adsorption were most promising. During continuous processing perstraction appeared to offer the best improvement. The use of hybrid techniques can increase the final product concentration beyond that of single-stage techniques. Therefore, the selection of an in situ product recovery technique would require comparable information on the energy demand and economics of the process. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:563-579, 2017. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.

  4. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement.

    PubMed

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence they have inferior image quality compared to X-rays. Theoretically, global histogram equalization (GHE) can improve the contrast of a given image, though the practical benefits of doing so have had only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 is for very poor and 5 is for the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. GHE can be used on low-contrast bone scan images. In some cases, a histogram equalization technique in combination with some other postprocessing technique is useful.
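    A plain-Python sketch of global histogram equalization on an 8-bit image, the technique evaluated in the abstract above: each grey level is remapped through the normalized cumulative histogram, spreading a narrow intensity range over the full scale. The pixel values are a toy example, not scan data.

```python
# Global histogram equalization for an 8-bit image (sketch; toy data).

def equalize(pixels, levels=256):
    """Map each grey level through the normalized cumulative histogram."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for c in hist:
        total += c
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)   # first nonzero CDF value
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
            for p in pixels]

low_contrast = [100, 100, 101, 101, 102, 102, 103, 103]  # narrow range
equalized = equalize(low_contrast)
```

The four original grey levels (100-103) are spread to 0, 85, 170, and 255, which illustrates both the contrast gain and the saturation at the extremes that the study reports.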

  5. Defective Reduction in Frozen Pie Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Nooted, Oranuch; Tangjitsitcharoen, Somkiat

    2017-06-01

    Frozen pie production has many defects, resulting in high production costs. The failure mode and effects analysis (FMEA) technique has been applied to improve the frozen pie process, and a Pareto chart is used to determine the major defects of frozen pie. Three main processes cause the defects: the 1st freezing to glazing process, the forming process, and the folding process. The Risk Priority Number (RPN) obtained from FMEA is analyzed to reduce the defects. If the RPN of a cause exceeds 45, the process is considered for improvement and selected for corrective and preventive actions. The results showed that RPN values decreased after the correction. Therefore, the implementation of the FMEA technique can help to improve the performance of the frozen pie process and reduce the defects by approximately 51.9%.
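    A sketch of the FMEA scoring step described above: RPN = severity × occurrence × detection, with causes whose RPN exceeds the study's threshold of 45 flagged for corrective action. The failure modes and scores below are illustrative, not taken from the paper.

```python
# FMEA Risk Priority Number screening (sketch; hypothetical failure modes).

def flag_for_action(failure_modes, threshold=45):
    """Return (mode, rpn) pairs whose RPN exceeds the threshold,
    sorted from highest to lowest risk."""
    flagged = []
    for name, (severity, occurrence, detection) in failure_modes.items():
        rpn = severity * occurrence * detection
        if rpn > threshold:
            flagged.append((name, rpn))
    return sorted(flagged, key=lambda t: t[1], reverse=True)

modes = {
    "cracked crust after glazing": (7, 4, 3),   # hypothetical 1-10 scores
    "underweight filling":         (5, 2, 2),
    "misfolded edge":              (6, 3, 4),
}
actions = flag_for_action(modes)
```

Only the causes above the threshold are carried forward, which matches the paper's rule for selecting corrective and preventive actions.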

  6. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, perform systematic appraisal, and identify areas of improvement of a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase, parallel processing of data and correctly positioned process controls - should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  7. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    NASA Astrophysics Data System (ADS)

    Kuntoro, Hadiyan Yusuf; Hudaya, Akhmad Zidni; Dinaryanto, Okto; Majid, Akmal Irfan; Deendarlianto

    2016-06-01

    Owing to the importance of two-phase flow research for industrial safety analysis, many researchers have developed methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum, and nuclear industries. One such developing method is the image processing technique. It is widely used in two-phase flow research because of its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, it can capture direct visual information about the flow that is difficult to obtain with other methods and techniques. The main objective of this paper is to present an improved image processing algorithm, building on a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also intended to build a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.
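The core measurement in this record, extracting the film thickness hL from a flow image, can be illustrated with a minimal sketch. This is not the authors' algorithm: the thresholding rule, calibration factor, and synthetic frame are assumptions made for the example.

```python
import numpy as np

# Illustrative film-thickness extraction: binarize a grayscale frame,
# count liquid pixels per column, and convert pixels to millimetres
# with an assumed calibration factor.

def film_thickness(frame, threshold=128, mm_per_px=0.1):
    """frame: 2D array, row 0 at the top, liquid (dark) at the bottom.
    Returns h_L in mm for each image column."""
    liquid = frame < threshold       # boolean mask of liquid pixels
    film_px = liquid.sum(axis=0)     # liquid column height in pixels
    return film_px * mm_per_px

# Synthetic 6x4 frame: the bottom two rows are dark liquid.
frame = np.full((6, 4), 200)
frame[4:, :] = 50
```

A real implementation would add interface smoothing and per-column wave statistics; the column-wise profile returned here is the quantity from which interfacial-wave geometry would be derived.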

  8. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuntoro, Hadiyan Yusuf, E-mail: hadiyan.y.kuntoro@mail.ugm.ac.id; Majid, Akmal Irfan; Deendarlianto, E-mail: deendarlianto@ugm.ac.id

    Owing to the importance of two-phase flow research for industrial safety analysis, many researchers have developed methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum, and nuclear industries. One such developing method is the image processing technique. It is widely used in two-phase flow research because of its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, it can capture direct visual information about the flow that is difficult to obtain with other methods and techniques. The main objective of this paper is to present an improved image processing algorithm, building on a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also intended to build a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.

  9. Powder-Metallurgy Process And Product

    NASA Technical Reports Server (NTRS)

    Paris, Henry G.

    1988-01-01

    Rapid-solidification processing yields alloys with improved properties. This study was undertaken to extend the favorable property combinations of I/M 2XXX alloys through the recently developed technique of rapid-solidification processing using powder metallurgy (P/M). Rapid-solidification processing involves impingement of a molten metal stream onto a rapidly spinning chill block, or passage through a gas medium using the gas-atomization technique.

  10. Combination of process and vibration data for improved condition monitoring of industrial systems working under variable operating conditions

    NASA Astrophysics Data System (ADS)

    Ruiz-Cárcel, C.; Jaramillo, V. H.; Mba, D.; Ottewill, J. R.; Cao, Y.

    2016-01-01

    The detection and diagnosis of faults in industrial processes is a very active field of research due to the reduction in maintenance costs achieved by the implementation of process monitoring algorithms such as Principal Component Analysis, Partial Least Squares or more recently Canonical Variate Analysis (CVA). Typically the condition of rotating machinery is monitored separately using vibration analysis or other specific techniques. Conventional vibration-based condition monitoring techniques are based on the tracking of key features observed in the measured signal. Typically steady-state loading conditions are required to ensure consistency between measurements. In this paper, a technique based on merging process and vibration data is proposed with the objective of improving the detection of mechanical faults in industrial systems working under variable operating conditions. The capabilities of CVA for detection and diagnosis of faults were tested using experimental data acquired from a compressor test rig where different process faults were introduced. Results suggest that the combination of process and vibration data can effectively improve the detectability of mechanical faults in systems working under variable operating conditions.
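The record above merges process and vibration measurements into one multivariate record before monitoring. CVA itself is more involved; as a minimal illustration of the merged-data idea, the sketch below concatenates the two feature sets and computes a Hotelling T-squared statistic against a healthy baseline. All data and channel names are synthetic assumptions.

```python
import numpy as np

# Merge process and vibration features, then score a new sample against
# the healthy-baseline distribution with Hotelling's T^2 statistic.

def t2_statistic(baseline, sample):
    """T^2 = (x - mu)^T S^-1 (x - mu) for one merged feature vector."""
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    diff = sample - mu
    return float(diff @ np.linalg.inv(cov) @ diff)

rng = np.random.default_rng(0)
process = rng.normal(0.0, 1.0, size=(200, 3))    # e.g. pressure, flow, temp
vibration = rng.normal(0.0, 1.0, size=(200, 2))  # e.g. RMS level, kurtosis
baseline = np.hstack([process, vibration])       # merged healthy data

healthy = baseline.mean(axis=0)                  # on-model sample
faulty = healthy + 5.0                           # large shift on all channels
```

A fault shifts the merged vector away from the baseline mean and inflates T-squared; monitoring thresholds would normally come from the statistic's reference distribution rather than by eye.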

  11. A technique for improved maxillary record base adaptation through controlled polymerization of light-activated dental resins.

    PubMed

    Hopkins, D S; Phoenix, R D; Abrahamsen, T C

    1997-09-01

    A technique for the fabrication of light-activated maxillary record bases is described. The use of a segmental polymerization process provides improved palatal adaptation by minimizing the effects of polymerization shrinkage. Utilization of this technique results in record bases that are well adapted to the corresponding master casts.

  12. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement

    PubMed Central

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have gained only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. Materials and Methods: A set of 89 low-contrast 99mTc-MDP whole-body bone scan images were included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 denotes very poor and 5 the best image quality. A statistical test was applied to assess the significance of the difference between the mean scores assigned to the input and processed images. Results: The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference between the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. Conclusion: The GHE technique can be used on low-contrast bone scan images. In some cases, the histogram equalization technique is useful in combination with some other postprocessing technique. PMID:29142344
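Global histogram equalization as studied in this record has a compact standard form: the cumulative histogram serves as a monotone intensity mapping that spreads pixel values over the full 8-bit range. A minimal sketch on a synthetic low-contrast image (not the study's data):

```python
import numpy as np

def equalize(img):
    """Global histogram equalization of a 2D uint8 image
    (assumes the image is not constant-valued)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                      # first nonzero CDF value
    scale = (cdf - cdf_min) / (cdf[-1] - cdf_min)  # map CDF to [0, 1]
    lut = np.round(scale * 255).astype(np.uint8)   # lookup table
    return lut[img]

# A low-contrast image occupying only the narrow band [100, 120):
low = np.tile(np.arange(100, 120, dtype=np.uint8), (20, 1))
out = equalize(low)
```

Because the mapping is driven entirely by the global histogram, bright hot-spots can saturate, which is consistent with the oversaturation the study reports.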

  13. A FMEA clinical laboratory case study: how to make problems and improvements measurable.

    PubMed

    Capunzo, Mario; Cavallo, Pierpaolo; Boccia, Giovanni; Brunetti, Luigi; Pizzuti, Sante

    2004-01-01

    The authors experimented with the application of the Failure Mode and Effect Analysis (FMEA) technique in a clinical laboratory. The FMEA technique allows one to: a) evaluate and measure the hazards of a process malfunction, b) decide where to execute improvement actions, and c) measure the outcome of those actions. A small sample of analytes was studied: the causes of possible malfunctions of the analytical process were determined, and the risk probability index (RPI) was calculated, with a value between 1 and 1,000. Only for cases with RPI > 400 were improvement actions implemented; these allowed a reduction of RPI values of between 25% and 70% with a cost increase of < 1%. The FMEA technique can be applied to the processes of a clinical laboratory, even one of small dimensions, and offers high potential for improvement. Nevertheless, such activity needs thorough planning because it is complex, even if the laboratory already operates an ISO 9000 Quality Management System.

  14. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people, and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis and concurrent engineering are important elements of the systematic planning and analysis needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other U.S. and foreign industries. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
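The Statistical Process Control named in this record centres on control charts. As a hedged sketch of one common chart, the individuals chart, limits are set at the mean plus or minus three sigma, with sigma estimated from the average moving range divided by the d2 constant 1.128. The data below are invented.

```python
# Individuals (X) control chart: a point outside mean +/- 3*sigma signals
# a special cause; sigma is estimated from the mean moving range / 1.128.

def control_limits(xs):
    mean = sum(xs) / len(xs)
    mr = [abs(b - a) for a, b in zip(xs, xs[1:])]  # moving ranges
    sigma = (sum(mr) / len(mr)) / 1.128            # d2 constant for n=2
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(xs):
    lo, hi = control_limits(xs)
    return [x for x in xs if x < lo or x > hi]
```

A stable series yields no signals; a sudden shift (e.g. a reading of 25 in a process running near 11) falls outside the limits and would trigger investigation.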

  15. Burn-injured tissue detection for debridement surgery through the combination of non-invasive optical imaging techniques.

    PubMed

    Heredia-Juesas, Juan; Thatcher, Jeffrey E; Lu, Yang; Squiers, John J; King, Darlene; Fan, Wensheng; DiMaio, J Michael; Martinez-Lorenzo, Jose A

    2018-04-01

    The process of burn debridement is a challenging technique requiring significant skill to identify the regions that need excision and their appropriate excision depths. In order to assist surgeons, a machine learning tool is being developed to provide a quantitative assessment of burn-injured tissue. This paper presents three non-invasive optical imaging techniques capable of distinguishing four kinds of tissue (healthy skin, viable wound bed, shallow burn, and deep burn) during serial burn debridement in a porcine model. All combinations of these three techniques have been studied through a k-fold cross-validation method. In terms of global performance, the combination of all three techniques significantly improves the classification accuracy with respect to just one technique, from 0.42 up to more than 0.76. Furthermore, a non-linear spatial filtering based on the mode of a small neighborhood has been applied as a post-processing technique in order to improve the performance of the classification. Using this technique, the global accuracy reaches a value close to 0.78 and, for some particular tissues and combinations of techniques, the accuracy improves by 13%.
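The mode-based spatial filter described in this record replaces each pixel's class label with the most common label in its neighborhood, suppressing isolated misclassifications in the tissue map. A minimal sketch with a 3x3 window (label values and map are invented; the paper's window size is not stated here):

```python
import numpy as np
from collections import Counter

def mode_filter(labels):
    """Replace each label with the mode of its 3x3 neighborhood
    (clipped at the image borders)."""
    out = labels.copy()
    rows, cols = labels.shape
    for r in range(rows):
        for c in range(cols):
            patch = labels[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            out[r, c] = Counter(patch.ravel()).most_common(1)[0][0]
    return out

# One isolated 'shallow burn' pixel (2) inside 'healthy skin' (0):
tissue = np.zeros((5, 5), dtype=int)
tissue[2, 2] = 2
```

The isolated pixel is outvoted by its eight healthy neighbors and removed, which is how such filtering lifts per-pixel classification accuracy.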

  16. Steam generator tubing NDE performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, G.; Welty, C.S. Jr.

    1997-02-01

    Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique, and the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g. system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE, and identifies several important issues to be addressed.

  17. A review of micro-powder injection moulding as a microfabrication technique

    NASA Astrophysics Data System (ADS)

    Attia, Usama M.; Alcock, Jeffrey R.

    2011-04-01

    Micro-powder injection moulding (µPIM) is a fast-developing micro-manufacturing technique for the production of metal and ceramic components. Shape complexity, dimensional accuracy, replication fidelity, material variety combined with high-volume capabilities are some of the key advantages of the technology. This review assesses the capabilities and limitations of µPIM as a micro-manufacturing technique by reviewing the latest developments in the area and by considering potential improvements. The basic elements of the process chain, variant processes and simulation attempts are discussed and evaluated. Challenges and research gaps are highlighted, and potential areas for improvement are presented.

  18. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Credille, Jennifer; Owens, Elizabeth

    This capstone offers the introduction of Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements as applied to an industrial atmosphere. However, this paper will demonstrate that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System. This innovative concept was designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates: system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, which ensures the process is thoroughly dissected and can be achieved for any process in any work environment.

  19. Search Radar Track-Before-Detect Using the Hough Transform.

    DTIC Science & Technology

    1995-03-01

    …improved target detection scheme, applicable to search radars, using the Hough transform image processing technique. The system concept involves a track-before-detect processing method which allows previous data to help in target detection. The technique provides many advantages compared to…
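The track-before-detect idea in this record can be illustrated with a toy Hough transform: sub-threshold detections from successive scans are treated as points in a (scan-time, range) image, and votes in a (theta, rho) accumulator reveal a straight-line track that no single scan would declare. All parameters and data below are illustrative assumptions, not from the report.

```python
import numpy as np

def hough_peak(points, n_theta=90, n_rho=64, rho_max=20.0):
    """Vote points (x, y) into a (theta, rho) accumulator;
    return the maximum vote count (line support)."""
    acc = np.zeros((n_theta, n_rho), dtype=int)
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)   # normal form
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.arange(n_theta)[ok], idx[ok]] += 1
    return int(acc.max())

# Five scans with a target moving one range cell per scan (a straight
# line in the time-range plane), plus two random clutter hits:
track = [(t, 3 + t) for t in range(5)]
clutter = [(1, 9), (3, 0)]
```

The collinear track points concentrate their votes in one accumulator cell, while clutter votes stay scattered, so thresholding the accumulator rather than the raw scans is what lets weak targets be detected.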

  20. The Application of Collaborative Business Intelligence Technology in the Hospital SPD Logistics Management Model.

    PubMed

    Liu, Tongzhu; Shen, Aizong; Hu, Xiaojian; Tong, Guixian; Gu, Wei

    2017-06-01

    We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD- and BI-related theories and recent research status. For the application of collaborative BI technology in the hospital SPD logistics management model, we achieved this by leveraging data mining techniques to discover knowledge from complex data and collaborative techniques to improve the theories of business process. For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched the collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Proper combination of the SPD model and BI system will improve the management of logistics in hospitals. The successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments, including the information, logistics, nursing, medical and financial departments; (iii) timely response of external suppliers.

  1. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Technical Reports Server (NTRS)

    Raiman, Laura B.

    1992-01-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  2. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Astrophysics Data System (ADS)

    Raiman, Laura B.

    1992-12-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  3. The Applicability of Lean and Six Sigma Techniques to Clinical and Translational Research

    PubMed Central

    Schweikhart, Sharon A.; Dembe, Allard E

    2010-01-01

    Background Lean and Six Sigma are business management strategies commonly used in production industries to improve process efficiency and quality. During the past decade, these process improvement techniques increasingly have been applied outside of the manufacturing sector, for example, in health care and in software development. This article concerns the potential use of Lean and Six Sigma to improve the processes involved in clinical and translational research. Improving quality, avoiding delays and errors, and speeding up the time to implementation of biomedical discoveries are prime objectives of the NIH Roadmap for Biomedical Research and the NIH Clinical and Translational Science Award (CTSA) program. Methods This article presents a description of the main principles, practices, and methodologies used in Lean and Six Sigma. Available literature involving applications of Lean and Six Sigma to health care, laboratory science, and clinical and translational research is reviewed. Specific issues concerning the use of these techniques in different phases of translational research are identified. Results Examples are provided of Lean and Six Sigma applications that are being planned at a current CTSA site, which could potentially be replicated elsewhere. We describe how different process improvement approaches are best adapted for particular translational research phases. Conclusions Lean and Six Sigma process improvement methodologies are well suited to help achieve NIH’s goal of making clinical and translational research more efficient and cost-effective, enhancing the quality of the research, and facilitating the successful adoption of biomedical research findings into practice. PMID:19730130

  4. The applicability of Lean and Six Sigma techniques to clinical and translational research.

    PubMed

    Schweikhart, Sharon A; Dembe, Allard E

    2009-10-01

    Lean and Six Sigma are business management strategies commonly used in production industries to improve process efficiency and quality. During the past decade, these process improvement techniques increasingly have been applied outside the manufacturing sector, for example, in health care and in software development. This article concerns the potential use of Lean and Six Sigma in improving the processes involved in clinical and translational research. Improving quality, avoiding delays and errors, and speeding up the time to implementation of biomedical discoveries are prime objectives of the National Institutes of Health (NIH) Roadmap for Medical Research and the NIH's Clinical and Translational Science Award program. This article presents a description of the main principles, practices, and methods used in Lean and Six Sigma. Available literature involving applications of Lean and Six Sigma to health care, laboratory science, and clinical and translational research is reviewed. Specific issues concerning the use of these techniques in different phases of translational research are identified. Examples of Lean and Six Sigma applications that are being planned at a current Clinical and Translational Science Award site are provided, which could potentially be replicated elsewhere. We describe how different process improvement approaches are best adapted for particular translational research phases. Lean and Six Sigma process improvement methods are well suited to help achieve NIH's goal of making clinical and translational research more efficient and cost-effective, enhancing the quality of the research, and facilitating the successful adoption of biomedical research findings into practice.

  5. Study of Variable Frequency Induction Heating in Steel Making Process

    NASA Astrophysics Data System (ADS)

    Fukutani, Kazuhiko; Umetsu, Kenji; Itou, Takeo; Isobe, Takanori; Kitahara, Tadayuki; Shimada, Ryuichi

    Induction heating technologies have been the standard technologies employed in steel making processes because they are clean, have a high energy density, and are highly controllable. However, there is a problem in using them: in general, the frequencies of the electric circuits have to be kept fixed to improve their power factors, and this constraint makes the processes inflexible. In order to overcome this problem, we have developed a new heating technique: a variable-frequency power supply with magnetic energy recovery switching. This technique helps improve the quality of steel products as well as productivity. We have also performed numerical calculations and experiments to evaluate its effect on the temperature distributions of heated steel plates. The obtained results indicate that the application of the technique in steel making processes would be advantageous.

  6. The use of artificial intelligence techniques to improve the multiple payload integration process

    NASA Technical Reports Server (NTRS)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  7. An Information Filtering and Control System to Improve the Decision Making Process Within Future Command Information Centres

    DTIC Science & Technology

    2001-04-01

    Part of the following report: New Information Processing Techniques for Military Systems [les Nouvelles techniques de traitement de l'information…], papers presented at the RTO IST Symposium on "New Information Processing Techniques for Military Systems", held in Istanbul, Turkey, 9-11 October 2000, and published in RTO MP-049.

  8. Yield enhancement with DFM

    NASA Astrophysics Data System (ADS)

    Paek, Seung Weon; Kang, Jae Hyun; Ha, Naya; Kim, Byung-Moo; Jang, Dae-Hyun; Jeon, Junsu; Kim, DaeWook; Chung, Kun Young; Yu, Sung-eun; Park, Joo Hyun; Bae, SangMin; Song, DongSup; Noh, WooYoung; Kim, YoungDuck; Song, HyunSeok; Choi, HungBok; Kim, Kee Sup; Choi, Kyu-Myung; Choi, Woonhyuk; Jeon, JoongWon; Lee, JinWoo; Kim, Ki-Su; Park, SeongHo; Chung, No-Young; Lee, KangDuck; Hong, YoungKi; Kim, BongSeok

    2012-03-01

    A set of design for manufacturing (DFM) techniques has been developed and applied to 45nm, 32nm and 28nm logic process technologies. A novel technology combined a number of potentially conflicting DFM techniques into a comprehensive solution. These techniques work in three phases for design optimization and one phase for silicon diagnostics. In the DFM prevention phase, foundation IP such as standard cells, IO, and memory, as well as the P&R tech file, are optimized. In the DFM solution phase, which happens during the ECO step, auto-fixing of process-weak patterns and advanced RC extraction are performed. In the DFM polishing phase, post-layout tuning is done to improve manufacturability. DFM analysis enables prioritization of random and systematic failures. The DFM technique presented in this paper has been silicon-proven with three successful tape-outs in Samsung 32nm processes; about 5% improvement in yield was achieved without any notable side effects. Visual inspection of silicon also confirmed the positive effect of the DFM techniques.

  9. Intermittent sizing on carbon fiber for composite application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norris, Jr, Robert E.; Paulauskas, Felix L.; Ozcan, Soydan

    Intermittent sizing is a technique designed to improve the bonding of carbon fiber to a resin when manufacturing composite parts. The purpose of this technique is to improve Sheet Molding Composites (SMC) made of non-continuous carbon fibers while using regular material. At the end of the project, tests showed that improved mechanical properties were achieved using this technique compared to the conventional process. Mechanical properties were improved by 110% for the peak tensile stress and by 60% for the modulus at the laboratory scale. In this project, Continental Structural Plastics and ORNL worked to demonstrate the scalability and viability of commercialization of this technique.

  10. Effects of visual feedback-induced variability on motor learning of handrim wheelchair propulsion.

    PubMed

    Leving, Marika T; Vegter, Riemer J K; Hartog, Johanneke; Lamoth, Claudine J C; de Groot, Sonja; van der Woude, Lucas H V

    2015-01-01

    It has been suggested that a higher intra-individual variability benefits the motor learning of wheelchair propulsion. The present study evaluated whether feedback-induced variability on wheelchair propulsion technique variables would also enhance the motor learning process. Learning was operationalized as an improvement in mechanical efficiency and propulsion technique, which are thought to be closely related during the learning process. 17 Participants received visual feedback-based practice (feedback group) and 15 participants received regular practice (natural learning group). Both groups received equal practice dose of 80 min, over 3 weeks, at 0.24 W/kg at a treadmill speed of 1.11 m/s. To compare both groups the pre- and post-test were performed without feedback. The feedback group received real-time visual feedback on seven propulsion variables with instruction to manipulate the presented variable to achieve the highest possible variability (1st 4-min block) and optimize it in the prescribed direction (2nd 4-min block). To increase motor exploration the participants were unaware of the exact variable they received feedback on. Energy consumption and the propulsion technique variables with their respective coefficient of variation were calculated to evaluate the amount of intra-individual variability. The feedback group, which practiced with higher intra-individual variability, improved the propulsion technique between pre- and post-test to the same extent as the natural learning group. Mechanical efficiency improved between pre- and post-test in the natural learning group but remained unchanged in the feedback group. These results suggest that feedback-induced variability inhibited the improvement in mechanical efficiency. 
Moreover, since both groups improved propulsion technique but only the natural learning group improved mechanical efficiency, it can be concluded that the improvement in mechanical efficiency and propulsion technique do not always appear simultaneously during the motor learning process. Their relationship is most likely modified by other factors such as the amount of the intra-individual variability.
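
    The abstract above quantifies intra-individual variability as the coefficient of variation of each propulsion variable. A minimal sketch of that calculation, with hypothetical push-time data (the variable names and values are illustrative, not from the study):

```python
from statistics import mean, stdev

def coefficient_of_variation(samples):
    # CV = sample standard deviation / mean; a higher CV means more
    # stroke-to-stroke variability for that propulsion variable
    return stdev(samples) / mean(samples)

# Hypothetical push times (s) for one participant over five strokes
push_times = [0.42, 0.45, 0.40, 0.47, 0.43]
print(f"CV = {coefficient_of_variation(push_times):.1%}")  # CV = 6.2%
```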

  11. Effects of Visual Feedback-Induced Variability on Motor Learning of Handrim Wheelchair Propulsion

    PubMed Central

    Leving, Marika T.; Vegter, Riemer J. K.; Hartog, Johanneke; Lamoth, Claudine J. C.; de Groot, Sonja; van der Woude, Lucas H. V.

    2015-01-01

Background: It has been suggested that a higher intra-individual variability benefits the motor learning of wheelchair propulsion. The present study evaluated whether feedback-induced variability on wheelchair propulsion technique variables would also enhance the motor learning process. Learning was operationalized as an improvement in mechanical efficiency and propulsion technique, which are thought to be closely related during the learning process. Methods: Seventeen participants received visual feedback-based practice (feedback group) and 15 participants received regular practice (natural learning group). Both groups received an equal practice dose of 80 min, over 3 weeks, at 0.24 W/kg at a treadmill speed of 1.11 m/s. To compare both groups, the pre- and post-tests were performed without feedback. The feedback group received real-time visual feedback on seven propulsion variables with instruction to manipulate the presented variable to achieve the highest possible variability (1st 4-min block) and optimize it in the prescribed direction (2nd 4-min block). To increase motor exploration the participants were unaware of the exact variable they received feedback on. Energy consumption and the propulsion technique variables with their respective coefficients of variation were calculated to evaluate the amount of intra-individual variability. Results: The feedback group, which practiced with higher intra-individual variability, improved the propulsion technique between pre- and post-test to the same extent as the natural learning group. Mechanical efficiency improved between pre- and post-test in the natural learning group but remained unchanged in the feedback group. Conclusion: These results suggest that feedback-induced variability inhibited the improvement in mechanical efficiency.
Moreover, since both groups improved propulsion technique but only the natural learning group improved mechanical efficiency, it can be concluded that the improvement in mechanical efficiency and propulsion technique do not always appear simultaneously during the motor learning process. Their relationship is most likely modified by other factors such as the amount of the intra-individual variability. PMID:25992626

  12. A comparative analysis of frequency modulation threshold extension techniques

    NASA Technical Reports Server (NTRS)

    Arndt, G. D.; Loch, F. J.

    1970-01-01

    FM threshold extension for system performance improvement, comparing impulse noise elimination, correlation detection and delta modulation signal processing techniques implemented at demodulator output

  13. Ignition and monitoring technique for plasma processing of multicell superconducting radio-frequency cavities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doleans, Marc

In this study, an in-situ plasma processing technique has been developed at the Spallation Neutron Source (SNS) to improve the performance of the superconducting radio-frequency (SRF) cavities in operation. The technique uses a low-density reactive neon-oxygen plasma at room temperature to improve the surface work function, to help remove adsorbed gases on the RF surface and to reduce its secondary emission yield. SNS SRF cavities are six-cell elliptical cavities and the plasma typically ignites in the cell where the electric field is the highest. This article will detail a technique that was developed to ignite and monitor the plasma in each cell of the SNS cavities.

  14. Ignition and monitoring technique for plasma processing of multicell superconducting radio-frequency cavities

    DOE PAGES

    Doleans, Marc

    2016-12-27

In this study, an in-situ plasma processing technique has been developed at the Spallation Neutron Source (SNS) to improve the performance of the superconducting radio-frequency (SRF) cavities in operation. The technique uses a low-density reactive neon-oxygen plasma at room temperature to improve the surface work function, to help remove adsorbed gases on the RF surface and to reduce its secondary emission yield. SNS SRF cavities are six-cell elliptical cavities and the plasma typically ignites in the cell where the electric field is the highest. This article will detail a technique that was developed to ignite and monitor the plasma in each cell of the SNS cavities.

  15. Application of CRAFT (complete reduction to amplitude frequency table) in nonuniformly sampled (NUS) 2D NMR data processing.

    PubMed

    Krishnamurthy, Krish; Hari, Natarajan

    2017-09-15

The recently published CRAFT (complete reduction to amplitude frequency table) technique converts the raw FID data (i.e., time domain data) into a table of frequencies, amplitudes, decay rate constants, and phases. It offers an alternate approach to decimate time-domain data, with a minimal preprocessing step. It has been shown that application of the CRAFT technique to process the t1 dimension of the 2D data significantly improved the detectable resolution by its ability to analyze without the use of ubiquitous apodization of extensively zero-filled data. It was noted earlier that CRAFT did not resolve sinusoids that were not already resolvable in the time domain (i.e., t1max-dependent resolution). We present a combined NUS-IST-CRAFT approach wherein the NUS acquisition technique (sparse sampling technique) increases the intrinsic resolution in the time domain (by increasing t1max), IST fills the gaps in the sparse sampling, and CRAFT processing extracts the information without loss due to any severe apodization. NUS and CRAFT are thus complementary techniques to improve intrinsic and usable resolution. We show that significant improvement can be achieved with this combination over conventional NUS-IST processing. With reasonable sensitivity, the models can be extended to significantly higher t1max to generate an indirect-DEPT spectrum that rivals the direct observe counterpart. Copyright © 2017 John Wiley & Sons, Ltd.
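
    The IST step described above fills the gaps left by nonuniform sampling by alternating between enforcing spectral sparsity and consistency with the measured points. A toy illustration of that idea (not the CRAFT algorithm itself), using a hypothetical single-component decaying sinusoid, a naive O(n^2) DFT, and illustrative threshold parameters:

```python
import cmath
import random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def ist_reconstruct(samples, n, iters=25, shrink=0.9):
    # samples: dict {time index: measured complex value} (the NUS schedule)
    x = [samples.get(t, 0j) for t in range(n)]
    for _ in range(iters):
        X = dft(x)
        thr = shrink * max(abs(v) for v in X)
        # soft threshold: shrink every spectral bin toward zero by thr
        X = [(v - thr * v / abs(v)) if abs(v) > thr else 0j for v in X]
        x = idft(X)
        for t, v in samples.items():    # re-impose the measured points
            x[t] = v
    return dft(x)

# Hypothetical FID: one decaying sinusoid at bin 7, 50% random sampling
n, k0 = 64, 7
true_fid = [cmath.exp((-0.02 + 2j * cmath.pi * k0 / n) * t) for t in range(n)]
random.seed(1)
kept = random.sample(range(n), n // 2)
spectrum = ist_reconstruct({t: true_fid[t] for t in kept}, n)
peak = max(range(n), key=lambda k: abs(spectrum[k]))
print("dominant frequency bin:", peak)
```

    Despite only half of the points being measured, the gap-filled spectrum recovers the correct dominant frequency; in the paper this reconstruction is followed by CRAFT's Bayesian decimation rather than a plain DFT.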

  16. Photo-reconnaissance applications of computer processing of images.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1972-01-01

Discussion of image processing techniques for enhancement and calibration of Jet Propulsion Laboratory imaging experiment pictures returned from NASA space vehicles such as Ranger, Mariner and Surveyor. Particular attention is given to data transmission, resolution vs recognition, and color aspects of digital data processing. The effectiveness of these techniques in applications to images from a wide variety of sources is noted. It is anticipated that the use of computer processing for enhancement of imagery will increase with the improvement and cost reduction of these techniques in the future.

  17. Toward energy harvesting using active materials and conversion improvement by nonlinear processing.

    PubMed

    Guyomar, Daniel; Badel, Adrien; Lefeuvre, Elie; Richard, Claude

    2005-04-01

This paper presents a new technique of electrical energy generation using mechanically excited piezoelectric materials and a nonlinear process. This technique, called synchronized switch harvesting (SSH), is derived from synchronized switch damping (SSD), a nonlinear technique previously developed to address the problem of vibration damping in mechanical structures. This technique results in a significant increase in the electromechanical conversion capability of piezoelectric materials. Compared with the standard technique, the harvested electrical power may be increased by more than 900%. The performance of the nonlinear processing is demonstrated on structures excited at their resonance frequency as well as out of resonance.

  18. Hybrid 3D printing by bridging micro/nano processes

    NASA Astrophysics Data System (ADS)

    Yoon, Hae-Sung; Jang, Ki-Hwan; Kim, Eunseob; Lee, Hyun-Taek; Ahn, Sung-Hoon

    2017-06-01

    A hybrid 3D printing process was developed for multiple-material/freeform nano-scale manufacturing. The process consisted of aerodynamically focused nanoparticle (AFN) printing, micro-machining, focused ion beam milling, and spin-coating. Theoretical and experimental investigations were carried out to improve the compatibility of each of the processes, enabling bridging of various different techniques. The resulting hybrid process could address the limitations of individual processes, enabling improved process scaling and dimensional degrees of freedom, without losing the advantages of the existing processes. The minimum structure width can be reduced to 50 nm using undercut structures. In addition, AFN printing employs particle impact for adhesion, and various inorganic materials are suitable for printing, including metals and functional ceramics. Using the developed system, we fabricated bi-material cantilevers for applications as a thermal actuator. The mechanical and thermal properties of the structure were investigated using an in situ measurement system, and irregular thermal phenomena due to the fabrication process were analyzed. We expect that this work will lead to improvements in the area of customized nano-scale manufacturing, as well as further improvements in manufacturing technology by combining different fabrication techniques.

  19. New signal processing technique for density profile reconstruction using reflectometry.

    PubMed

    Clairet, F; Ricaud, B; Briolle, F; Heuraux, S; Bottereau, C

    2011-08-01

Reflectometry profile measurement requires an accurate determination of the plasma reflected signal. Along with a good resolution and a high signal-to-noise ratio of the phase measurement, adequate data analysis is required. A new data processing method based on time-frequency tomographic representation is used. It provides a clearer separation between multiple components and improves isolation of the relevant signals. In this paper, this data processing technique is applied to two sets of signals coming from two different reflectometer devices used on the Tore Supra tokamak. For the standard density profile reflectometry, it improves the initialization process and its reliability, providing a more accurate profile determination in the far scrape-off layer with density measurements as low as 10^16 m^-3. For a second reflectometer, which provides measurements in front of a lower hybrid launcher, this method improves the separation of the relevant plasma signal from multi-reflection processes due to the proximity of the plasma.

  20. Developments in Signature Process Control

    NASA Astrophysics Data System (ADS)

    Keller, L. B.; Dominski, Marty

    1993-01-01

Developments in the adaptive process control technique known as Signature Process Control for Advanced Composites (SPCC) are described. This computer control method for autoclave processing of composites was used to develop an optimum cure cycle for AFR 700B polyimide and for an experimental poly-isoimide. An improved process cycle was developed for Avimid N polyimide. The potential for extending the SPCC technique to pre-preg quality control, press molding, pultrusion and RTM is briefly discussed.

  1. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.

  2. Application of Six Sigma towards improving surgical outcomes.

    PubMed

    Shukla, P J; Barreto, S G; Nadkarni, M S

    2008-01-01

Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving key steps of a process. It is ripe for application in health care, since almost all health care processes require near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a Sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% vs. the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.
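
    The Sigma score quoted above can be reproduced from the reported 73% success rate using the conventional long-term sigma calculation (the normal quantile of the process yield plus the customary 1.5-sigma shift); a minimal sketch, assuming the study used this standard convention:

```python
from statistics import NormalDist

def sigma_score(yield_fraction, shift=1.5):
    # Long-term sigma level: normal quantile of the process yield plus
    # the conventional 1.5-sigma long-term shift used in Six Sigma practice
    return NormalDist().inv_cdf(yield_fraction) + shift

# 73% sphincter preservation, as reported in the abstract above
print(round(sigma_score(0.73), 2))        # 2.11, matching the reported 2.10
# The classic benchmark: 3.4 defects per million opportunities -> 6 sigma
print(round(sigma_score(1 - 3.4e-6), 1))  # 6.0
```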

  3. Recombinant organisms for production of industrial products

    PubMed Central

    Adrio, Jose-Luis

    2010-01-01

A revolution in industrial microbiology was sparked by the discovery of the double-stranded structure of DNA and the development of recombinant DNA technology. Traditional industrial microbiology was merged with molecular biology to yield improved recombinant processes for the industrial production of primary and secondary metabolites, protein biopharmaceuticals and industrial enzymes. Novel genetic techniques such as metabolic engineering, combinatorial biosynthesis and molecular breeding, together with their modifications, are contributing greatly to the development of improved industrial processes. In addition, functional genomics, proteomics and metabolomics are being exploited for the discovery of novel valuable small molecules for medicine as well as enzymes for catalysis. The sequencing of industrial microbial genomes is being carried out, which bodes well for future process improvement and the discovery of new industrial products. PMID:21326937

  4. A novel, two-step top seeded infiltration and growth process for the fabrication of single grain, bulk (RE)BCO superconductors

    NASA Astrophysics Data System (ADS)

    Namburi, Devendra K.; Shi, Yunhua; Palmer, Kysen G.; Dennis, Anthony R.; Durrell, John H.; Cardwell, David A.

    2016-09-01

    A fundamental requirement of the fabrication of high performing, (RE)-Ba-Cu-O bulk superconductors is achieving a single grain microstructure that exhibits good flux pinning properties. The top seeded melt growth (TSMG) process is a well-established technique for the fabrication of single grain (RE)BCO bulk samples and is now applied routinely by a number of research groups around the world. The introduction of a buffer layer to the TSMG process has been demonstrated recently to improve significantly the general reliability of the process. However, a number of growth-related defects, such as porosity and the formation of micro-cracks, remain inherent to the TSMG process, and are proving difficult to eliminate by varying the melt process parameters. The seeded infiltration and growth (SIG) process has been shown to yield single grain samples that exhibit significantly improved microstructures compared to the TSMG technique. Unfortunately, however, SIG leads to other processing challenges, such as the reliability of fabrication, optimisation of RE2BaCuO5 (RE-211) inclusions (size and content) in the sample microstructure, practical oxygenation of as processed samples and, hence, optimisation of the superconducting properties of the bulk single grain. In the present paper, we report the development of a near-net shaping technique based on a novel two-step, buffer-aided top seeded infiltration and growth (BA-TSIG) process, which has been demonstrated to improve greatly the reliability of the single grain growth process and has been used to fabricate successfully bulk, single grain (RE)BCO superconductors with improved microstructures and superconducting properties. A trapped field of ˜0.84 T and a zero field current density of 60 kA cm-2 have been measured at 77 K in a bulk, YBCO single grain sample of diameter 25 mm processed by this two-step BA-TSIG technique. 
To the best of our knowledge, this value of trapped field is the highest value ever reported for a sample fabricated by an infiltration and growth process. In this study we report the successful fabrication of 14 YBCO samples, with diameters of up to 32 mm, by this novel technique with a success rate of greater than 92%.

  5. The Application of Collaborative Business Intelligence Technology in the Hospital SPD Logistics Management Model

    PubMed Central

    LIU, Tongzhu; SHEN, Aizong; HU, Xiaojian; TONG, Guixian; GU, Wei

    2017-01-01

Background: We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. Methods: We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD and BI related theories and recent research status. For the application of collaborative BI technology in the hospital SPD logistics management model, we leveraged data mining techniques to discover knowledge from complex data and collaborative techniques to improve business process theory. Results: For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Conclusion: Proper combination of the SPD model and the BI system will improve the management of logistics in hospitals. The successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments, including the information, logistics, nursing, medical and financial departments; (iii) timely response of external suppliers. PMID:28828316

  6. Device design and signal processing for multiple-input multiple-output multimode fiber links

    NASA Astrophysics Data System (ADS)

    Appaiah, Kumar; Vishwanath, Sriram; Bank, Seth R.

    2012-01-01

    Multimode fibers (MMFs) are limited in data rate capabilities owing to modal dispersion. However, their large core diameter simplifies alignment and packaging, and makes them attractive for short and medium length links. Recent research has shown that the use of signal processing and techniques such as multiple-input multiple-output (MIMO) can greatly improve the data rate capabilities of multimode fibers. In this paper, we review recent experimental work using MIMO and signal processing for multimode fibers, and the improvements in data rates achievable with these techniques. We then present models to design as well as simulate the performance benefits obtainable with arrays of lasers and detectors in conjunction with MIMO, using channel capacity as the metric to optimize. We also discuss some aspects related to complexity of the algorithms needed for signal processing and discuss techniques for low complexity implementation.
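
    The channel-capacity metric mentioned above is the standard equal-power MIMO Shannon capacity, C = log2 det(I + (SNR/Nt) H H^H), where H couples the launched modes to the detected ones. A minimal 2x2 sketch with hypothetical mode-coupling matrices (the matrix values are illustrative, not measured fiber data):

```python
import math

def mimo_capacity_2x2(h, snr):
    # Equal-power 2x2 MIMO capacity (bits/s/Hz):
    #   C = log2 det(I + (snr/2) * H H^H)
    g = [[sum(h[i][k] * h[j][k].conjugate() for k in range(2))
          for j in range(2)] for i in range(2)]      # Gram matrix H H^H
    a = 1 + (snr / 2) * g[0][0]
    b = (snr / 2) * g[0][1]
    c = (snr / 2) * g[1][0]
    d = 1 + (snr / 2) * g[1][1]
    return math.log2((a * d - b * c).real)           # det of the 2x2 matrix

# Hypothetical mode-coupling matrices at SNR = 10 (linear scale)
h_diag = [[1 + 0j, 0j], [0j, 1 + 0j]]            # two independent mode groups
h_rank1 = [[1 + 0j, 1 + 0j], [1 + 0j, 1 + 0j]]   # fully coupled (rank 1)
print(round(mimo_capacity_2x2(h_diag, 10), 2))   # 5.17 = 2 * log2(6)
print(round(mimo_capacity_2x2(h_rank1, 10), 2))  # 4.39 = log2(21)
```

    The rank-1 channel carries less than the full-rank one even though its total gain is larger, which is why laser/detector array placement (and hence the rank of H) is the quantity the authors optimize.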

  7. Mechanical impedance measurements for improved cost-effective process monitoring

    NASA Astrophysics Data System (ADS)

    Clopet, Caroline R.; Pullen, Deborah A.; Badcock, Rodney A.; Ralph, Brian; Fernando, Gerard F.; Mahon, Steve W.

    1999-06-01

The aerospace industry has seen considerable growth in composite usage over the past ten years, especially with the development of cost-effective manufacturing techniques such as Resin Transfer Molding and Resin Infusion under Flexible Tooling. The relatively high cost of raw materials and conservative processing schedules have limited further growth in non-aerospace technologies. In-situ process monitoring has been explored for some time as a means of improving the cost efficiency of manufacturing, with dielectric spectroscopy and optical fiber sensors being the two primary techniques developed to date. An emerging technique is discussed here that makes use of piezoelectric wafers, with the ability to sense not only aspects of resin flow but also the change in properties of the resin as it cures. Experimental investigations to date have shown a correlation between mechanical impedance measurements and the mechanical properties of cured epoxy systems, with potential for full process monitoring.

  8. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
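
    The statistical quality assurance tools referred to above center on control charts that flag out-of-control points. A minimal sketch of an individuals (I-MR) chart with hypothetical cycle-time data; the moving-range sigma estimate and the d2 = 1.128 constant follow standard SPC practice, and nothing here reflects the P.I. EXPERT product itself:

```python
from statistics import mean

def individuals_chart_limits(samples, k=3):
    # Individuals (I-MR) chart: sigma is estimated from the average moving
    # range divided by the d2 constant for subgroups of size 2 (1.128),
    # then limits are set at the center line +/- k estimated sigmas
    center = mean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = mean(moving_ranges) / 1.128
    return center - k * sigma, center, center + k * sigma

def out_of_control(samples, k=3):
    lo, _, hi = individuals_chart_limits(samples, k)
    return [i for i, x in enumerate(samples) if not lo <= x <= hi]

# Hypothetical process cycle times (min); index 7 is a special-cause spike
times = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.0, 14.9, 10.1, 9.7]
print(out_of_control(times))  # [7]
```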

  9. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  10. Continuous Quality Improvement Tools for Effective Teaching.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.

    This manual presents 15 Continuous Quality Improvement (CQI) tools and techniques necessary for effective teaching. By using the tools and techniques of CQI, teachers will be able to help themselves and their students to focus on the classroom processes. This will permit the teacher and students to plan, organize, implement, and make decisions…

  11. How Students Learn: Improving Teaching Techniques for Business Discipline Courses

    ERIC Educational Resources Information Center

    Cluskey, Bob; Elbeck, Matt; Hill, Kathy L.; Strupeck, Dave

    2011-01-01

    The focus of this paper is to familiarize business discipline faculty with cognitive psychology theories of how students learn together with teaching techniques to assist and improve student learning. Student learning can be defined as the outcome from the retrieval (free recall) of desired information. Student learning occurs in two processes.…

  12. Edge printability: techniques used to evaluate and improve extreme wafer edge printability

    NASA Astrophysics Data System (ADS)

    Roberts, Bill; Demmert, Cort; Jekauc, Igor; Tiffany, Jason P.

    2004-05-01

    The economics of semiconductor manufacturing have forced process engineers to develop techniques to increase wafer yield. Improvements in process controls and uniformities in all areas of the fab have reduced film thickness variations at the very edge of the wafer surface. This improved uniformity has provided the opportunity to consider decreasing edge exclusions, and now the outermost extents of the wafer must be considered in the yield model and expectations. These changes have increased the requirements on lithography to improve wafer edge printability in areas that previously were not even coated. This has taxed all software and hardware components used in defining the optical focal plane at the wafer edge. We have explored techniques to determine the capabilities of extreme wafer edge printability and the components of the systems that influence this printability. We will present current capabilities and new detection techniques and the influence that the individual hardware and software components have on edge printability. We will show effects of focus sensor designs, wafer layout, utilization of dummy edge fields, the use of non-zero overlay targets and chemical/optical edge bead optimization.

  13. Development of low-cost technology for the next generation of high efficiency solar cells composed of earth abundant elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Rakesh

    2014-09-28

The development of renewable, affordable, and environmentally conscious means of generating energy on a global scale represents a grand challenge of our time. Due to the “permanence” of radiation from the sun, solar energy promises to remain a viable and sustainable power source far into the future. Established single-junction photovoltaic technologies achieve high power conversion efficiencies (pce) near 20% but require complicated manufacturing processes that prohibit the marriage of large-scale throughput (e.g. on the GW scale), profitability, and quality control. Our approach to this problem begins with the synthesis of nanocrystals of semiconductor materials comprising earth abundant elements and characterized by material and optoelectronic properties ideal for photovoltaic applications, namely Cu2ZnSn(S,Se)4 (CZTSSe). Once synthesized, such nanocrystals are formulated into an ink, coated onto substrates, and processed into completed solar cells in such a way that enables scale-up to high throughput, roll-to-roll manufacturing processes. This project aimed to address the major limitation to CZTSSe solar cell pce’s – the low open-circuit voltage (Voc) reported throughout literature for devices comprised of this material. Throughout the project significant advancements have been made in fundamental understanding of the CZTSSe material and device limitations associated with this material system. Additionally, notable improvements have been made to our nanocrystal based processing technique to alleviate performance limitations due to the identified device limitations.
Notably, (1) significant improvements have been made in reducing intra- and inter-nanoparticle heterogeneity, (2) improvements in device performance have been realized with novel cation substitution in Ge-alloyed CZTGeSSe absorbers, (3) systematic analysis of absorber sintering has been conducted to optimize the selenization process for large grain CZTSSe absorbers, (4) novel electrical characterization analysis techniques have been developed to identify significant limitations to traditional electrical characterization of CZTSSe devices, and (5) the developed electrical analysis techniques have been used to identify the role that band gap and electrostatic potential fluctuations have in limiting device performance for this material system. The device modeling and characterization of CZTSSe undertaken with this project have significant implications for the CZTSSe research community, as the identified limitations due to potential fluctuations are expected to be a performance limitation to high-efficiency CZTSSe devices fabricated from all current processing techniques. Additionally, improvements realized through enhanced absorber processing conditions to minimize nanoparticle and large-grain absorber heterogeneity are suggested to be beneficial processing improvements which should be applied to CZTSSe devices fabricated from all processing techniques. Ultimately, our research has indicated that improved performance for CZTSSe will be achieved through novel absorber processing which minimizes defect formation, elemental losses, secondary phase formation, and compositional uniformity in CZTSSe absorbers; we believe this novel absorber processing can be achieved through nanocrystal based processing of CZTSSe which is an active area of research at the conclusion of this award. 
While significant fundamental understanding of CZTSSe and the performance limitations associated with this material system, as well as notable improvements in the processing of nanocrystal based CZTSSe absorbers, have been achieved under this project, the limitation of two years of research funding towards our goals prevents further significant advancements directly identified through pce improvements relative to those reported herein. As the characterization and modeling subtask of this project has been the main driving force for understanding device limitations, the conclusions of this analysis have just recently been applied to the processing of nanocrystal based CZTSSe absorbers -- with notable success. We expect the notable fundamental understanding of device limitations and absorber sintering achieved under this project will lead to significant improvements in device performance for CZTSSe devices in the near future for devices fabricated from a variety of processing techniques.

  14. Multimodal system planning technique : an analytical approach to peak period operation

    DOT National Transportation Integrated Search

    1995-11-01

    The multimodal system planning technique described in this report is an improvement of the methodology used in the Dallas System Planning Study. The technique includes a spreadsheet-based process to identify the costs of congestion, construction, and...

  15. Improvement of the reliability of laser beam microwelding as interconnection technique

    NASA Astrophysics Data System (ADS)

    Glasmacher, Mathias; Pucher, Hans-Joerg; Geiger, Manfred

    1996-04-01

The requirements of current trends in joining for modern electronics production can be met with the technique of laser beam microwelding, which is the topic of this paper. In this technique, component leads are welded directly to the conducting tracks of the circuit board. The technique is not limited to electronics, because fine mechanical parts can be joined with the same equipment, too. Advantages such as high-temperature strength, reduced manufacturing time and simplified material separation at the end of the life cycle are noted. Furthermore, the drawbacks of laser beam microwelding as a joining technique competitive with soldering are discussed. The reasons for the unstable process behavior of different welding scenarios can be understood by taking the changes of some process parameters into account. Since the process reliability can be improved by proper process design as well as by closed-loop control, results of finite-element calculations of the temperature field, as well as an experimental setup for determining the melting point, are presented. Future work aims to broaden the applicability of this joining technique and to develop an on-line control for high-performance welding of locally restricted structures.

  16. Improvement of Storm Forecasts Using Gridded Bayesian Linear Regression for Northeast United States

    NASA Astrophysics Data System (ADS)

    Yang, J.; Astitha, M.; Schwartz, C. S.

    2017-12-01

Bayesian linear regression (BLR) is a post-processing technique in which regression coefficients are derived and used to correct raw forecasts based on pairs of observation-model values. This study presents the development and application of a gridded Bayesian linear regression (GBLR) as a new post-processing technique to improve numerical weather prediction (NWP) of rain and wind storm forecasts over the northeastern United States. Ten controlled variables produced from ten ensemble members of the National Center for Atmospheric Research (NCAR) real-time prediction system are used for a GBLR model. In the GBLR framework, leave-one-storm-out cross-validation is utilized to study the performance of the post-processing technique in a database composed of 92 storms. To estimate the regression coefficients of the GBLR, optimization procedures that minimize the systematic and random error of predicted atmospheric variables (wind speed, precipitation, etc.) are implemented for the modeled-observed pairs of training storms. The regression coefficients calculated for meteorological stations of the National Weather Service are interpolated back to the model domain. An analysis of forecast improvements based on error reductions during the storms demonstrates the value of the GBLR approach. This presentation also illustrates how the variances are optimized for the training partition in GBLR and discusses the verification strategy for grid points where no observations are available. The new post-processing technique is successful in improving wind speed and precipitation storm forecasts using past event-based data and has the potential to be implemented in real time.
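As a minimal sketch of the core idea (not the authors' GBLR code), the following fits a Bayesian linear regression posterior over bias-correction coefficients for a single variable from synthetic modeled-observed pairs; the prior and noise precisions `alpha` and `beta`, and all numbers, are assumptions for illustration only:

```python
import numpy as np

def fit_blr(X, y, alpha=1.0, beta=25.0):
    """Bayesian linear regression posterior for y ~ X @ w with an
    isotropic Gaussian prior on w (precision alpha) and Gaussian
    observation noise (precision beta)."""
    d = X.shape[1]
    S = np.linalg.inv(alpha * np.eye(d) + beta * X.T @ X)  # posterior covariance
    m = beta * S @ X.T @ y                                 # posterior mean
    return m, S

# Synthetic modeled-observed pairs: raw wind-speed forecasts with a
# systematic bias against "station observations" (numbers invented).
rng = np.random.default_rng(0)
raw = rng.uniform(5.0, 25.0, size=200)              # raw forecasts (m/s)
obs = 0.8 * raw + 1.5 + rng.normal(0.0, 1.0, 200)   # paired observations

X = np.column_stack([np.ones_like(raw), raw])       # [bias term, forecast]
m, S = fit_blr(X, obs)

# Correct a new raw forecast using the posterior-mean coefficients.
corrected = m @ np.array([1.0, 20.0])
print(m, corrected)   # intercept ~ 1.5, slope ~ 0.8 (up to noise)
```

A real GBLR adds per-station coefficients interpolated back to the model grid and leave-one-storm-out validation; this sketch only shows the coefficient estimation and correction step.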

  17. Use of Process Improvement Tools in Radiology.

    PubMed

    Rawson, James V; Kannan, Amogha; Furman, Melissa

    2016-01-01

    Process improvement techniques are common in manufacturing and industry. Over the past few decades these principles have been slowly introduced in select health care settings. This article reviews the Plan, Do, Study, and Act cycle, Six Sigma, the System of Profound Knowledge, Lean, and the theory of constraints. Specific process improvement tools in health care and radiology are presented in the order the radiologist is likely to encounter them in an improvement project. Copyright © 2015 Mosby, Inc. All rights reserved.

  18. Micro-scale experimental study of Microbial-Induced Carbonate Precipitation (MICP) by using microfluidic devices

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Soga, K.; DeJong, J. T.; Kabla, A.

    2017-12-01

Microbial-induced carbonate precipitation (MICP), one of the bio-mineralization processes, is an innovative subsurface improvement technique for enhancing the strength and stiffness of soils and controlling their hydraulic conductivity. These macro-scale engineering properties of MICP-treated soils are controlled by micro-scale factors of the precipitated carbonate, such as its content, amount, and distribution in the soil matrix. The precipitation process itself is affected by bacterial amount, reaction kinetics, porous-medium geometry, and flow distribution in the soils. Accordingly, to better understand MICP at the pore scale, a new experimental technique that can observe the entire process was developed. In this study, a 2-D transparent microfluidic chip made of polydimethylsiloxane (PDMS), representing the soil matrix, was designed and fabricated. A staged-injection MICP treatment procedure was simulated inside the microfluidic chip while being continuously monitored using microscopic techniques. The procedure started with the injection of bacterial suspension, followed by a settling period for bacterial attachment, and ended with multiple injections of cementation liquid. The main MICP processes visualized during this procedure included bacterial transport and attachment during the bacterial injection, bacterial attachment and growth during settling, bacterial detachment during the cementation-liquid injections, and cementation development both during and after those injections. It is suggested that visualization of the main MICP processes using the microfluidic technique can improve understanding of the fundamental mechanisms of MICP and consequently help improve the treatment technique for in situ implementation.

  19. Contract management techniques for improving construction quality

    DOT National Transportation Integrated Search

    1997-07-01

    Efforts to improve quality in highway construction embrace many aspects of the construction process. Quality goals include enhanced efficiency and productivity, optimal cost and delivery time, improved performance, and changes in attitude-promoting a...

  20. Hot spot variability and lithography process window investigation by CDU improvement using CDC technique

    NASA Astrophysics Data System (ADS)

    Thamm, Thomas; Geh, Bernd; Djordjevic Kaufmann, Marija; Seltmann, Rolf; Bitensky, Alla; Sczyrba, Martin; Samy, Aravind Narayana

    2018-03-01

In this paper, we concentrate on the well-known CDC technique from Carl Zeiss to improve the CD distribution on the wafer by improving the reticle CDU, and on its impact on hotspots and the litho process window. The CDC technique uses an ultra-short-pulse laser to generate micro-level shade-in elements (also known as "pixels") inside the mask quartz bulk material. These scatter centers selectively attenuate certain areas of the reticle at higher resolution than other methods and thus improve the CD uniformity. In a first section, we compare the CDC technique with scanner dose-correction schemes. It becomes obvious that the CDC technique has unique advantages over scanner correction schemes with respect to spatial resolution and intra-field flexibility; however, because of the scanner's flexibility across the wafer, the two methods are complementary rather than competing. In a second section we show that a reference-feature-based correction scheme can be used to improve the CDU of a full chip with multiple different features that have different MEEF and dose sensitivities. In detail, we discuss the impact of forward-scattered light originating from the CDC pixels on the illumination source and the related proximity signature. We show that the impact on proximity is small compared to the CDU benefit of the CDC technique. We then show to what extent the reduced variability across the reticle results in a better common electrical process window for a whole chip design across the whole reticle field on the wafer. Finally, we discuss electrical verification results comparing masks with purposely poor CDU that were repaired by the CDC technique against inherently good "golden" masks on a complex logic device. No yield difference is observed between the repaired masks and the masks with good CDU.

  1. The National Shipbuilding Research Program: Implementation of Past NSRP Research Through Education and Training

    DTIC Science & Technology

    1999-01-05

The waste-minimization techniques defined in each chapter are improved operation management, material substitution, process substitution, recycling, and treatment, in line with the 1994 goal of reducing the quantity and toxicity of waste.

  2. Overlay metrology for double patterning processes

    NASA Astrophysics Data System (ADS)

    Leray, Philippe; Cheng, Shaunee; Laidler, David; Kandel, Daniel; Adel, Mike; Dinu, Berta; Polli, Marco; Vasconi, Mauro; Salski, Bartlomiej

    2009-03-01

The double patterning (DPT) process is foreseen by the industry to be the main solution for the 32 nm technology node and even beyond. Meanwhile, process compatibility has to be maintained and the performance of overlay metrology has to improve. To achieve this for Image Based Overlay (IBO), usually the optics of overlay tools are improved. It was also demonstrated that these requirements are achievable with a Diffraction Based Overlay (DBO) technique named SCOL™ [1]. In addition, we believe that overlay measurements with respect to a reference grid are required to achieve the required overlay control [2]. This implies at least a three-fold increase in the number of measurements (2 for the double patterned layers to the reference grid and 1 between the double patterned layers). The requirements of process compatibility, enhanced performance, and a large number of measurements make the choice of overlay metrology for DPT very challenging. In this work we use different flavors of the standard overlay metrology technique (IBO) as well as the new technique (SCOL) to address these three requirements. The compatibility of the corresponding overlay targets with double patterning processes (Litho-Etch-Litho-Etch (LELE); Litho-Freeze-Litho-Etch (LFLE); spacer-defined) is tested. The process impact on different target types is discussed (CD bias for LELE, contrast for LFLE). We compare the standard imaging overlay metrology with non-standard imaging techniques dedicated to double patterning processes (multilayer imaging targets allowing one overlay target instead of three, and very small imaging targets). In addition to standard designs already discussed [1], we investigate SCOL target designs specific to double patterning processes. The feedback to the scanner is determined using the different techniques, and the final overlay results are compared accordingly. We conclude with the pros and cons of each technique and suggest the optimal metrology strategy for overlay control in double patterning processes.

  3. Improving patient safety and optimizing nursing teamwork using crew resource management techniques.

    PubMed

    West, Priscilla; Sculli, Gary; Fore, Amanda; Okam, Nwoha; Dunlap, Cleveland; Neily, Julia; Mills, Peter

    2012-01-01

    This project describes the application of the "sterile cockpit rule," a crew resource management (CRM) technique, targeted to improve efficacy and safety for nursing assistants in the performance of patient care duties. Crew resource management techniques have been successfully implemented in the aviation industry to improve flight safety. Application of these techniques can improve patient safety in medical settings. The Veterans Affairs (VA) National Center for Patient Safety conducted a CRM training program in select VA nursing units. One unit developed a novel application of the sterile cockpit rule to create protected time for certified nursing assistants (CNAs) while they collected vital signs and blood glucose data at the beginning of each shift. The typical nursing authority structure was reversed, with senior nurses protecting CNAs from distractions. This process led to improvements in efficiency and communication among nurses, with the added benefit of increased staff morale. Crew resource management techniques can be used to improve efficiency, morale, and patient safety in the healthcare setting.

  4. Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn

    This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewart or Deming Cycle, a method that aids in continuous analysis and improvement through a…

  5. Recommended techniques for effective maintainability. A continuous improvement initiative of the NASA Reliability and Maintainability Steering Committee

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.

  6. A study of FM threshold extension techniques

    NASA Technical Reports Server (NTRS)

    Arndt, G. D.; Loch, F. J.

    1972-01-01

    The characteristics of three postdetection threshold extension techniques are evaluated with respect to the ability of such techniques to improve the performance of a phase lock loop demodulator. These techniques include impulse-noise elimination, signal correlation for the detection of impulse noise, and delta modulation signal processing. Experimental results from signal to noise ratio data and bit error rate data indicate that a 2- to 3-decibel threshold extension is readily achievable by using the various techniques. This threshold improvement is in addition to the threshold extension that is usually achieved through the use of a phase lock loop demodulator.

  7. Introduction of novel 3D-printed superficial applicators for high-dose-rate skin brachytherapy.

    PubMed

    Jones, Emma-Louise; Tonino Baldion, Anna; Thomas, Christopher; Burrows, Tom; Byrne, Nick; Newton, Victoria; Aldridge, Sarah

    Custom-made surface mold applicators often allow more flexibility when carrying out skin brachytherapy, particularly for small treatment areas with high surface obliquity. They can, however, be difficult to manufacture, particularly if there is a lack of experience in superficial high-dose-rate brachytherapy techniques or with limited resources. We present a novel method of manufacturing superficial brachytherapy applicators utilizing three-dimensional (3D)-printing techniques. We describe the treatment planning process and the process of applicator manufacture. The treatment planning process, with the introduction of a pre-plan, allows for an "ideal" catheter arrangement within an applicator to be determined, exploiting varying catheter orientations, heights, and curvatures if required. The pre-plan arrangement is then 3D printed to the exact specifications of the pre-plan applicator design. This results in improved target volume coverage and improved sparing of organs at risk. Using a pre-plan technique for ideal catheter placement followed by automated 3D-printed applicator manufacture has greatly improved the entire process of superficial high-dose-rate brachytherapy treatment. We are able to design and manufacture flexible, well-fitting, superior quality applicators resulting in a more efficient and improved patient pathway and patient experience. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  8. Continuous fiber-reinforced titanium aluminide composites

    NASA Technical Reports Server (NTRS)

    Mackay, R. A.; Brindley, P. K.; Froes, F. H.

    1991-01-01

    An account is given of the fabrication techniques, microstructural characteristics, and mechanical behavior of a lightweight, high service temperature SiC-reinforced alpha-2 Ti-14Al-21Nb intermetallic-matrix composite. Fabrication techniques under investigation to improve the low-temperature ductility and environmental resistance of this material system, while reducing manufacturing costs to competitive levels, encompass powder-cloth processing, foil-fiber-foil processing, and thermal-spray processing. Attention is given to composite microstructure problems associated with fiber distribution and fiber-matrix interfaces, as well as with mismatches of thermal-expansion coefficient; major improvements are noted to be required in tensile properties, thermal cycling effects, mechanical damage, creep, and environmental effects.

  9. Meeting the needs of an ever-demanding market.

    PubMed

    Rigby, Richard

    2002-04-01

    Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.

  10. Recombinant organisms for production of industrial products.

    PubMed

    Adrio, Jose-Luis; Demain, Arnold L

    2010-01-01

A revolution in industrial microbiology was sparked by the discovery of the double-stranded structure of DNA and the development of recombinant DNA technology. Traditional industrial microbiology was merged with molecular biology to yield improved recombinant processes for the industrial production of primary and secondary metabolites, protein biopharmaceuticals, and industrial enzymes. Novel genetic techniques such as metabolic engineering, combinatorial biosynthesis, and molecular breeding, along with their modifications, are contributing greatly to the development of improved industrial processes. In addition, functional genomics, proteomics, and metabolomics are being exploited for the discovery of novel valuable small molecules for medicine as well as enzymes for catalysis. The sequencing of industrial microbial genomes is being carried out, which bodes well for future process improvement and the discovery of new industrial products. © 2010 Landes Bioscience

  11. Improvements in the efficiency of turboexpanders in cryogenic applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agahi, R.R.; Lin, M.C.; Ershaghi, B.

    1996-12-31

Process designers have utilized turboexpanders in cryogenic processes because of their higher thermal efficiencies compared with conventional refrigeration cycles. Process design and equipment performance have improved substantially through the utilization of modern technologies. Turboexpander manufacturers have also adopted computational fluid dynamics software, computer numerical control technology, and holography techniques to further improve an already impressive turboexpander efficiency. In this paper, the authors explain the design process of a turboexpander utilizing modern technology. Two cases of turboexpanders processing helium (4.35 K) and hydrogen (56 K) are presented.
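The record does not give its efficiency calculations, but the standard isentropic-efficiency definition such analyses rely on can be sketched for an ideal gas with constant heat capacity; all numbers below are hypothetical, not from the paper:

```python
# Illustrative only: isentropic efficiency of an expander for an ideal
# gas with constant cp, eta = (T_in - T_out) / (T_in - T_out_isentropic).

def isentropic_outlet_T(T_in, p_ratio, gamma):
    """Outlet temperature of an ideal isentropic expansion:
    T_out_s = T_in * (p_out / p_in) ** ((gamma - 1) / gamma)."""
    return T_in * p_ratio ** ((gamma - 1.0) / gamma)

def expander_efficiency(T_in, T_out_actual, p_ratio, gamma=1.66):
    """Isentropic efficiency from measured inlet/outlet temperatures.
    gamma defaults to ~1.66 for monatomic gases such as helium."""
    T_out_s = isentropic_outlet_T(T_in, p_ratio, gamma)
    return (T_in - T_out_actual) / (T_in - T_out_s)

# Hypothetical helium expander: 10 K inlet, 2:1 expansion ratio.
T_in = 10.0                   # inlet temperature (K)
p_ratio = 0.5                 # p_out / p_in
T_out_s = isentropic_outlet_T(T_in, p_ratio, 1.66)
T_out_actual = T_in - 0.85 * (T_in - T_out_s)   # 85%-efficient machine
eta = expander_efficiency(T_in, T_out_actual, p_ratio)
print(round(eta, 2))   # 0.85 by construction
```

Real cryogenic analyses use fluid property tables rather than the ideal-gas relation, since helium near 4 K departs strongly from ideal behavior; the sketch only shows the definition.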

  12. Bootstrapping Process Improvement Metrics: CMMI Level 4 Process Improvement Metrics in a Level 3 World

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Lewicki, Scott; Morgan, Scott

    2011-01-01

The measurement techniques for organizations that have achieved the Software Engineering Institute's CMMI Maturity Levels 4 and 5 are well documented. On the other hand, how to measure effectively when an organization is at Maturity Level 3 is less well understood, especially when there is no consistency in tool use and there is extensive tailoring of the organizational software processes. Most organizations fail in their attempts to generate, collect, and analyze standard process improvement metrics under these conditions. But at JPL, NASA's prime center for deep space robotic exploration, we have a long history of proving there is always a solution: it just may not be what you expected. In this paper we describe the wide variety of qualitative and quantitative techniques we have been implementing over the last few years, including the various approaches used to communicate the results to both software technical managers and senior managers.

  13. A green desulfurization technique: utilization of flue gas SO2 to produce H2 via a photoelectrochemical process based on Mo-doped BiVO4

    NASA Astrophysics Data System (ADS)

    Han, Jin; Li, Kejian; Cheng, Hanyun; Zhang, Liwu

    2017-12-01

A green photoelectrochemical (PEC) process with simultaneous SO2 removal and H2 production has attracted increasing attention. The proposed process uses flue gas SO2 to improve H2 production, and improving the efficiency of this process is necessary before it can become industrially viable. Herein, we report a Mo-modified BiVO4 photocatalyst for simultaneous SO2 removal and H2 production. The PEC performance could be significantly improved with doping and flue gas removal. The evolution rate of H2 and removal of SO2 could be enhanced almost 3-fold after Mo doping as compared with pristine BiVO4. The enhanced H2 production and SO2 removal are attributed to improved bulk charge-carrier transport after Mo doping and to greatly enhanced oxidation-reaction kinetics on the photoanode due to the formation of SO32- after SO2 absorption by the electrolyte. Because SO2 is utilized to improve the production of H2, the proposed PEC process may become a profitable desulfurization technique.

  14. A Green Desulfurization Technique: Utilization of Flue Gas SO2 to Produce H2 via a Photoelectrochemical Process Based on Mo-Doped BiVO4

    PubMed Central

    Han, Jin; Li, Kejian; Cheng, Hanyun; Zhang, Liwu

    2017-01-01

A green photoelectrochemical (PEC) process with simultaneous SO2 removal and H2 production has attracted increasing attention. The proposed process uses flue gas SO2 to improve H2 production, and improving the efficiency of this process is necessary before it can become industrially viable. Herein, we report a Mo-modified BiVO4 photocatalyst for simultaneous SO2 removal and H2 production. The PEC performance could be significantly improved with doping and flue gas removal. The evolution rate of H2 and removal of SO2 could be enhanced almost three-fold after Mo doping as compared with pristine BiVO4. The enhanced H2 production and SO2 removal are attributed to improved bulk charge-carrier transport after Mo doping and to greatly enhanced oxidation-reaction kinetics on the photoanode due to the formation of SO32− after SO2 absorption by the electrolyte. Because SO2 is utilized to improve the production of H2, the proposed PEC process may become a profitable desulfurization technique. PMID:29312924

  15. A Green Desulfurization Technique: Utilization of Flue Gas SO2 to Produce H2 via a Photoelectrochemical Process Based on Mo-Doped BiVO4.

    PubMed

    Han, Jin; Li, Kejian; Cheng, Hanyun; Zhang, Liwu

    2017-01-01

A green photoelectrochemical (PEC) process with simultaneous SO2 removal and H2 production has attracted increasing attention. The proposed process uses flue gas SO2 to improve H2 production, and improving the efficiency of this process is necessary before it can become industrially viable. Herein, we report a Mo-modified BiVO4 photocatalyst for simultaneous SO2 removal and H2 production. The PEC performance could be significantly improved with doping and flue gas removal. The evolution rate of H2 and removal of SO2 could be enhanced almost three-fold after Mo doping as compared with pristine BiVO4. The enhanced H2 production and SO2 removal are attributed to improved bulk charge-carrier transport after Mo doping and to greatly enhanced oxidation-reaction kinetics on the photoanode due to the formation of SO32- after SO2 absorption by the electrolyte. Because SO2 is utilized to improve the production of H2, the proposed PEC process may become a profitable desulfurization technique.

  16. Electrical test prediction using hybrid metrology and machine learning

    NASA Astrophysics Data System (ADS)

    Breton, Mary; Chao, Robin; Muthinti, Gangadhara Raja; de la Peña, Abraham A.; Simon, Jacques; Cepler, Aron J.; Sendelbach, Matthew; Gaudiello, John; Emans, Susan; Shifrin, Michael; Etzioni, Yoav; Urenski, Ronen; Lee, Wei Ti

    2017-03-01

Electrical test measurement in the back end of line (BEOL) is crucial for wafer and die sorting as well as for comparing intended process splits. Any in-line, nondestructive technique in the process flow that can accurately predict these measurements can significantly improve the mean time to detect (MTTD) defects and improve cycle times for yield and process learning. Measuring after BEOL metallization is commonly done for process control and learning, particularly with scatterometry (also called OCD, for Optical Critical Dimension), which can solve for multiple profile parameters such as metal line height or sidewall angle and does so within patterned regions. This gives scatterometry an advantage over in-line microscopy-based techniques, which provide top-down information, since such techniques can be insensitive to sidewall variations hidden under the metal fill of the trench. But when faced with correlation to electrical test measurements that are specific to the BEOL processing, both techniques face the additional challenge of sampling. Microscopy-based techniques are sampling-limited by their small probe size, while scatterometry is traditionally limited (for microprocessors) to scribe targets that mimic device ground rules but are not necessarily designed to be electrically testable. A solution to this sampling challenge lies in a fast reference-based machine learning capability that allows OCD measurement directly of the electrically testable structures, even when they are not OCD-compatible. By incorporating such direct OCD measurements, correlation to, and therefore prediction of, the resistance of BEOL electrical test structures is significantly improved. Improvements in prediction capability for multiple types of in-die electrically testable device structures are demonstrated. To further improve the quality of the prediction of the electrical resistance measurements, hybrid metrology using the OCD measurements together with X-ray metrology (XRF) is used.
Hybrid metrology is the practice of combining information from multiple sources to enable or improve the measurement of one or more critical parameters. Here, the XRF measurements are used to detect subtle changes in barrier-layer composition and thickness that can have second-order effects on the electrical resistance of the test structures. By accounting for such effects with the aid of the X-ray-based measurements, further improvement in the OCD correlation to electrical test measurements is achieved. By using both types of solution, fast reference-based machine learning on non-OCD-compatible test structures and hybrid metrology combining OCD with XRF technology, improvement in BEOL cycle-time learning could be accomplished through improved prediction capability.
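As a toy illustration of why adding an XRF feature can improve resistance prediction (this is not the authors' pipeline; all feature names and numbers are invented), a least-squares fit with and without a barrier-thickness feature can be compared on simulated test structures:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical in-line measurements for BEOL test structures.
height  = rng.normal(60.0, 3.0, n)    # OCD: metal line height (nm)
width   = rng.normal(30.0, 2.0, n)    # OCD: line CD (nm)
barrier = rng.normal(2.0, 0.3, n)     # XRF: barrier thickness (nm)

# "True" line resistance: dominated by the conducting cross-section,
# with a smaller second-order penalty from the resistive barrier.
resist = 1e4 / (height * width) + 0.4 * barrier + rng.normal(0, 0.05, n)

def fit_rmse(features, target):
    """Least-squares fit and residual RMSE for a list of 1-D features."""
    X = np.column_stack([np.ones(len(target))] + features)
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.sqrt(np.mean((X @ coef - target) ** 2))

ocd_only   = fit_rmse([1.0 / (height * width)], resist)
ocd_plus_x = fit_rmse([1.0 / (height * width), barrier], resist)
print(ocd_only, ocd_plus_x)   # hybrid (OCD + XRF) fit has lower error
```

The OCD-only model leaves the barrier's second-order contribution in the residual; adding the XRF feature captures it, mirroring the correlation improvement the abstract reports.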

  17. Cellulase producing microorganism ATCC 55702

    DOEpatents

    Dees, H. Craig

    1997-01-01

Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase-producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  18. Cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H. Craig

    1997-12-16

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  19. Cellulase producing microorganism ATCC 55702

    DOEpatents

    Dees, H.C.

    1997-12-30

Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase-producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  20. Technologies and Trends to Improve Table Olive Quality and Safety

    PubMed Central

    Campus, Marco; Değirmencioğlu, Nurcan; Comunian, Roberta

    2018-01-01

Table olives are the most widely consumed fermented food in the Mediterranean countries. Specific processing technologies are used to process olives, aimed at debittering the fruits and improving their sensory characteristics while ensuring safety of consumption. Processors demand novel techniques to improve industrial performance, while consumers' attention to natural and healthy foods has increased in recent years. From field to table, new techniques have been developed to decrease the microbial load of potential spoilage microorganisms, improve fermentation kinetics, and ensure safe consumption of the packed products. This review article depicts current technologies and recent advances in the processing technology of table olives. Attention is paid to pre-processing technologies, some of which are still under-researched, especially physical techniques such as ionizing radiation, ultrasound, and electrolyzed water solutions, which are also of interest for pesticide decontamination. The selection and use of starter cultures have been extensively reviewed, particularly the characterization of lactic acid bacteria and yeasts to accelerate and safely drive the fermentation process. The selection and use of probiotic strains to address the demand for functional foods is reported, along with salt-reduction strategies to address health concerns associated with table olive consumption. In this respect, probiotic-enriched table olives and strategies to reduce sodium intake are the main topics discussed. New processing technologies and post-packaging interventions to extend shelf life are illustrated, and the main findings in modified atmosphere packaging, high-pressure processing, and biopreservation applied to table olives are reported and discussed. PMID:29670593

  1. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal cases. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make the identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science; for example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and our method are evaluated and compared.
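
    As a minimal illustration of the kind of digital enhancement step discussed above, the sketch below applies a generic global histogram equalization to stretch ridge contrast. This is not the authors' AM-FM reaction-diffusion method; it assumes only an 8-bit grayscale image stored as a NumPy array.

    ```python
    import numpy as np

    def equalize(img):
        """Global histogram equalization for an 8-bit grayscale image."""
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = hist.cumsum()
        m = cdf[cdf > 0].min()                      # CDF at first occupied bin
        lut = np.clip(np.round(255.0 * (cdf - m) / (cdf[-1] - m)), 0, 255)
        return lut.astype(np.uint8)[img]            # remap every pixel
    ```

    Stretching the occupied grey levels over the full 0-255 range is a common first step before ridge-specific filtering.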

  2. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique

    PubMed Central

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Background: Military hospitals are responsible for preserving, restoring and improving the health of not only the armed forces, but also other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing, and Expert Choice 11.0 was used to analyze the collected data. Results: Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (weight = 0.238), SAU (weight = 0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospital performance by identifying areas in need of focused quality improvement and selecting strategies to improve service quality. PMID:25250364
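
    The AHP weighting step described above can be sketched as follows. The 3x3 pairwise comparison matrix is an illustrative placeholder, not the study's actual expert judgments; weights come from the principal eigenvector and consistency is checked with Saaty's consistency ratio.

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons among three criteria (illustrative only)
    A = np.array([
        [1.0, 2.0, 3.0],
        [1/2, 1.0, 2.0],
        [1/3, 1/2, 1.0],
    ])

    def ahp_weights(A, iters=100):
        """Principal-eigenvector weights via power iteration."""
        w = np.ones(A.shape[0]) / A.shape[0]
        for _ in range(iters):
            w = A @ w
            w /= w.sum()
        return w

    def consistency_ratio(A, w):
        """CR = CI / RI, with RI from Saaty's random-index table."""
        n = A.shape[0]
        lam = (A @ w / w).mean()            # estimate of lambda_max
        ci = (lam - n) / (n - 1)
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]
        return ci / ri

    w = ahp_weights(A)
    cr = consistency_ratio(A, w)            # CR < 0.1 means acceptable consistency
    ```

    In a full application such as the study's, one matrix is built per decision level (criteria, then hospitals under each criterion) and the weights are aggregated.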

  3. Comparing Parameter Estimation Techniques for an Electrical Power Transformer Oil Temperature Prediction Model

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    1999-01-01

    This paper examines various sources of error in MIT's improved top oil temperature rise over ambient temperature model and estimation process. The sources of error are the current parameter estimation technique, quantization noise, and post-processing of the transformer data. Results from this paper will show that an output error parameter estimation technique should be selected to replace the current least squares estimation technique. The output error technique obtained accurate predictions of transformer behavior, revealed the best error covariance, obtained consistent parameter estimates, and provided for valid and sensible parameters. This paper will also show that the output error technique should be used to minimize errors attributed to post-processing (decimation) of the transformer data. Models used in this paper are validated using data from a large transformer in service.
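
    A toy comparison of the two estimation approaches discussed above, using a hypothetical first-order discrete thermal model (not MIT's actual top-oil model): equation-error least squares regresses on the noisy measured output and is biased by measurement noise, whereas output error fits the simulated model response to the measurements.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    a_true, b_true, N = 0.9, 0.5, 500
    u = rng.standard_normal(N)                  # stand-in for transformer load input

    # Simulate the "true" first-order process x[k+1] = a*x[k] + b*u[k]
    x = np.zeros(N)
    for k in range(N - 1):
        x[k + 1] = a_true * x[k] + b_true * u[k]
    y = x + 0.5 * rng.standard_normal(N)        # noisy temperature measurements

    # Equation-error least squares: the regressor contains the noisy output itself
    Phi = np.column_stack([y[:-1], u[:-1]])
    a_ls, b_ls = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]

    # Output error: fit the *simulated* model response to the measurements
    def sim(params):
        a, b = params
        xs = np.zeros(N)
        for k in range(N - 1):
            xs[k + 1] = a * xs[k] + b * u[k]
        return xs

    def cost(params):
        return np.sum((y - sim(params)) ** 2)

    a_oe, b_oe = minimize(cost, x0=[0.5, 0.5], method="Nelder-Mead").x
    ```

    The measurement noise in the regressor attenuates the equation-error estimate of `a`, while the output-error estimate stays close to the true value, mirroring the paper's conclusion.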

  4. Improving Group Processes in Transdisciplinary Case Studies for Sustainability Learning

    ERIC Educational Resources Information Center

    Hansmann, Ralf; Crott, Helmut W.; Mieg, Harald A.; Scholz, Roland W.

    2009-01-01

    Purpose: Deficient group processes such as conformity pressure can lead to inadequate group decisions with negative social, economic, or environmental consequences. The study aims to investigate how a group technique (called INFO) improves students' handling of conformity pressure and their collective judgments in the context of a…

  5. Development of a radiation-hard CMOS process

    NASA Technical Reports Server (NTRS)

    Power, W. L.

    1983-01-01

    It is recommended that various techniques be investigated which appear to have the potential for improving the radiation hardness of CMOS devices for prolonged space flight missions. The three key recommended processing techniques are: (1) making the gate oxide thinner; it has been shown that radiation degradation is proportional to the cube of oxide thickness, so a relatively small reduction in thickness can greatly improve radiation resistance; (2) cleanliness and contamination control; and (3) investigating different oxide growth processes (low-temperature dry, TCE, and HCl). All three produce high-quality clean oxides, which are more radiation tolerant. Technique 2 addresses the reduction of metallic contamination. Technique 3 will produce a higher quality oxide by using slow growth rate conditions, and will minimize the effects of any residual sodium contamination through the introduction of hydrogen and chlorine into the oxide during growth.

  6. Post-processing of metal matrix composites by friction stir processing

    NASA Astrophysics Data System (ADS)

    Sharma, Vipin; Singla, Yogesh; Gupta, Yashpal; Raghuwanshi, Jitendra

    2018-05-01

    In metal matrix composites, non-uniform distribution of reinforcement particles adversely affects the mechanical properties. It is therefore of great interest to explore post-processing techniques that can eliminate heterogeneity in particle distribution. Friction stir processing is a relatively new technique used for post-processing of metal matrix composites to improve the homogeneity of particle distribution. In friction stir processing, the synergistic effect of stirring, extrusion, and forging results in grain refinement, reduction of reinforcement particle size, uniformity of particle distribution, reduced microstructural heterogeneity, and elimination of defects.

  7. CMMI(Registered) for Acquisition, Version 1.3. CMMI-ACQ, V1.3

    DTIC Science & Technology

    2010-11-01

    and Software Engineering – System Life Cycle Processes [ISO 2008b] ISO/IEC 27001:2005 Information technology – Security techniques – Information...International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology – Security Techniques...International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes

  8. A New Multi-Agent Approach to Adaptive E-Education

    NASA Astrophysics Data System (ADS)

    Chen, Jing; Cheng, Peng

    Improving customer satisfaction is important in e-Education. This paper describes a new approach to adaptive e-Education that takes into account the full spectrum of Web service techniques and activities. It presents a multi-agent architecture based on artificial psychology techniques, which makes the e-Education process both adaptable and dynamic, and hence up-to-date. Knowledge base techniques are used to support the e-Education process, and artificial psychology techniques to model user psychology, which makes the e-Education system more effective and satisfying.

  9. Dictionary-based image reconstruction for superresolution in integrated circuit imaging.

    PubMed

    Cilingiroglu, T Berkin; Uyar, Aydan; Tuysuzoglu, Ahmet; Karl, W Clem; Konrad, Janusz; Goldberg, Bennett B; Ünlü, M Selim

    2015-06-01

    Resolution improvement through signal processing techniques for integrated circuit imaging is becoming more crucial as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework which couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis. The effectiveness of the framework is demonstrated on experimental data.
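
    A minimal sketch of dictionary-based sparse reconstruction in the spirit of the framework above, using greedy orthogonal matching pursuit over a tiny hand-made overcomplete dictionary. The dictionary, signal, and sparsity level are illustrative; the paper's actual learned dictionary and physics-based forward model are not reproduced here.

    ```python
    import numpy as np

    def omp(D, y, k):
        """Orthogonal matching pursuit: greedily select k dictionary atoms."""
        residual, idx = y.astype(float).copy(), []
        coef = np.zeros(0)
        for _ in range(k):
            # Pick the atom most correlated with the current residual
            idx.append(int(np.argmax(np.abs(D.T @ residual))))
            # Re-fit coefficients of all selected atoms jointly
            coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
            residual = y - D[:, idx] @ coef
        x = np.zeros(D.shape[1])
        x[idx] = coef
        return x

    # Tiny overcomplete dictionary: 4 unit-norm atoms in R^3 (illustrative only)
    D = np.array([[1.0, 0.0, 0.0, 0.6],
                  [0.0, 1.0, 0.0, 0.8],
                  [0.0, 0.0, 1.0, 0.0]])
    y = np.array([2.0, 0.0, 1.0])          # = 2*atom0 + 1*atom2
    x_hat = omp(D, y, k=2)                 # recovers the 2-sparse code exactly
    ```

    Real superresolution pipelines use much larger learned dictionaries and couple this sparse-coding step with a model of the imaging system's point spread function.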

  10. Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method.

    PubMed

    Batres-Mendoza, Patricia; Ibarra-Manzano, Mario A; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Montoro-Sanjose, Carlos R; Romero-Troncoso, Rene J; Rostro-Gonzalez, Horacio

    2017-01-01

    We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (IM) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique that offered results from 33.31% to 40.82% without sampling window and from 33.44% to 41.07% with sampling window, respectively. We can thus conclude that iQSA is better suited to develop real-time applications.
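
    A simplified scalar-signal sketch of the windowed feature extraction described above (mean, variance, and contrast/homogeneity proxies per window). The quaternion algebra and the boosted decision-tree classifier of iQSA are omitted, and the sampling rate and window length below are assumptions for illustration.

    ```python
    import numpy as np

    def window_features(sig, fs, win_s):
        """Per-window mean, variance, and texture-style contrast/homogeneity."""
        n = int(fs * win_s)
        feats = []
        for start in range(0, len(sig) - n + 1, n):
            w = sig[start:start + n]
            d = np.diff(w)                      # successive-sample differences
            feats.append({
                "mean": w.mean(),
                "variance": w.var(),
                "contrast": np.mean(d ** 2),
                "homogeneity": np.mean(1.0 / (1.0 + d ** 2)),
            })
        return feats

    fs = 128                                    # assumed sampling rate (Hz)
    t = np.arange(0, 3.0, 1.0 / fs)
    sig = np.sin(2 * np.pi * 10 * t)            # stand-in for one EEG channel
    feats = window_features(sig, fs, win_s=0.5) # 0.5 s windows, as in the study
    ```

    Varying `win_s` from 0.5 s to 3 s reproduces the kind of window-length-versus-accuracy trade-off the authors explore.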

  11. Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method

    PubMed Central

    Batres-Mendoza, Patricia; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Montoro-Sanjose, Carlos R.

    2017-01-01

    We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (IM) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique that offered results from 33.31% to 40.82% without sampling window and from 33.44% to 41.07% with sampling window, respectively. We can thus conclude that iQSA is better suited to develop real-time applications. PMID:29348744

  12. Correlation processing for correction of phase distortions in subaperture imaging.

    PubMed

    Tavh, B; Karaman, M

    1999-01-01

    Ultrasonic subaperture imaging combines synthetic aperture and phased array approaches and permits low-cost systems with improved image quality. In subaperture processing, a large array is synthesized using echo signals collected from a number of receive subapertures by multiple firings of a phased transmit subaperture. Tissue inhomogeneities and displacements in subaperture imaging may cause significant phase distortions in received echo signals. Correlation processing on reference echo signals can be used to correct the phase distortions, but its accuracy and robustness are critically limited by the signal correlation. In this study, we explore correlation processing techniques for adaptive subaperture imaging with phase correction for motion and tissue inhomogeneities. The proposed techniques use new subaperture data acquisition schemes to produce reference signal sets with improved signal correlation. The experimental test results were obtained using raw radio frequency (RF) data acquired from two different phantoms with a 3.5 MHz, 128-element transducer array. The results show that phase distortions can be effectively compensated by the proposed techniques in real-time adaptive subaperture imaging.

  13. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high-current ion implant processing into volume CMOS manufacturing, photoresist stabilization is critical to achieving a stable ion implant process. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1-micrometer-thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable at temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process compared with the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across several combinations of current and energy.

  14. Assessment of hospital processes using a process mining technique: Outpatient process analysis at a tertiary hospital.

    PubMed

    Yoo, Sooyoung; Cho, Minsu; Kim, Eunhye; Kim, Seok; Sim, Yerim; Yoo, Donghyun; Hwang, Hee; Song, Minseok

    2016-04-01

    Many hospitals are increasing their efforts to improve processes because processes play an important role in enhancing work efficiency and reducing costs. However, to date, a quantitative tool has not been available to examine the before and after effects of processes and environmental changes, other than the use of indirect indicators, such as mortality rate and readmission rate. This study used process mining technology to analyze process changes based on changes in the hospital environment, such as the construction of a new building, and to measure the effects of environmental changes in terms of consultation wait time, time spent per task, and outpatient care processes. Using process mining technology, electronic health record (EHR) log data of outpatient care before and after constructing a new building were analyzed, and the effectiveness of the technology in terms of the process was evaluated. Using the process mining technique, we found that the total time spent in outpatient care did not increase significantly compared to that before the construction of a new building, considering that the number of outpatients increased, and the consultation wait time decreased. These results suggest that the operation of the outpatient clinic was effective after changes were implemented in the hospital environment. We further identified improvements in processes using the process mining technique, thereby demonstrating the usefulness of this technique for analyzing complex hospital processes at a low cost. This study confirmed the effectiveness of process mining technology at an actual hospital site. In future studies, the use of process mining technology will be expanded by applying this approach to a larger variety of process change situations. Copyright © 2016. Published by Elsevier Ireland Ltd.
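
    The consultation wait-time measurement described above can be illustrated on a toy event log. The case IDs, activities, and timestamps below are hypothetical; a real process mining study would use a dedicated library and full trace discovery over the EHR log.

    ```python
    from datetime import datetime

    # Hypothetical outpatient event log: (case_id, activity, timestamp)
    log = [
        ("p1", "registration", "2016-01-04 09:00"),
        ("p1", "consultation", "2016-01-04 09:25"),
        ("p2", "registration", "2016-01-04 09:10"),
        ("p2", "consultation", "2016-01-04 09:50"),
    ]

    def mean_wait(log, start="registration", end="consultation"):
        """Average minutes between two activities across all complete cases."""
        t = {}
        for case, act, ts in log:
            t.setdefault(case, {})[act] = datetime.fromisoformat(ts)
        waits = [(v[end] - v[start]).total_seconds() / 60
                 for v in t.values() if start in v and end in v]
        return sum(waits) / len(waits)
    ```

    Computing this statistic on logs from before and after the environmental change gives exactly the kind of before/after comparison the study performs.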

  15. Management strategies to effect change in intensive care units: lessons from the world of business. Part I. Targeting quality improvement initiatives.

    PubMed

    Gershengorn, Hayley B; Kocher, Robert; Factor, Phillip

    2014-02-01

    The business community has developed strategies to ensure the quality of the goods or services they produce and to improve the management of multidisciplinary work teams. With modification, many of these techniques can be imported into intensive care units (ICUs) to improve clinical operations and patient safety. In Part I of a three-part ATS Seminar series, we argue for adopting business management strategies in ICUs and set forth strategies for targeting selected quality improvement initiatives. These tools are relevant to health care today as focus is placed on limiting low-value care and measuring, reporting, and improving quality. In the ICU, the complexity of illness and the need to standardize processes make these tools even more appealing. Herein, we highlight four techniques to help prioritize initiatives. First, the "80/20 rule" mandates focus on the few (20%) interventions likely to drive the majority (80%) of improvement. Second, benchmarking--a process of comparison with peer units or institutions--is essential to identifying areas of strength and weakness. Third, root cause analyses, in which structured retrospective reviews of negative events are performed, can be used to identify and fix systems errors. Finally, failure mode and effects analysis--a process aimed at prospectively identifying potential sources of error--allows for systems fixes to be instituted in advance to prevent negative outcomes. These techniques originated in fields other than health care, yet adoption has and can help ICU managers prioritize issues for quality improvement.
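
    The 80/20 prioritization described above can be made concrete with a small Pareto computation: rank categories by frequency and keep the smallest set covering 80% of the total. The ICU incident categories and counts below are invented for illustration.

    ```python
    def pareto_cutoff(counts, share=0.8):
        """Smallest set of categories covering `share` of the total count."""
        items = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
        total = sum(counts.values())
        acc, chosen = 0, []
        for name, c in items:
            chosen.append(name)
            acc += c
            if acc / total >= share:
                break
        return chosen

    # Hypothetical incident counts from an ICU quality review
    counts = {"medication errors": 50, "line infections": 25,
              "falls": 10, "transfer delays": 8, "other": 7}
    focus = pareto_cutoff(counts)       # the "vital few" to target first
    ```

    The resulting short list is where quality improvement effort would be concentrated first, per the 80/20 rule.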

  16. Processing Techniques for Intelligibility Improvement to Speech with Co-Channel Interference.

    DTIC Science & Technology

    1983-09-01

    processing was found to be always less than in the original unprocessed co-channel signal; also, as the length of the comb filter increased, the... PROCESSING TECHNIQUES FOR INTELLIGIBILITY IMPROVEMENT TO SPEECH WITH CO-CHANNEL INTERFERENCE. (U) Signal Technology Inc., Goleta, CA; B. A. Hanson et al.; September 1983. R-83-225.

  17. The Taguchi Method Application to Improve the Quality of a Sustainable Process

    NASA Astrophysics Data System (ADS)

    Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.

    2018-06-01

    Taguchi’s method has long been used to improve the quality of the processes and products analyzed. This research addresses an unusual situation, namely the modeling of technical parameters in a process intended to be sustainable, improving process quality and ensuring quality through an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of the interaction between the principles of agricultural sustainability and the application of Taguchi’s method. The experimental method used in this practical study combines engineering techniques with experimental statistical modeling to achieve rapid improvement of quality costs, in effect seeking optimization of existing processes and their main technical parameters. The paper is a purely technical study that promotes a technical experiment using the Taguchi method, considered effective because it allows 70 to 90% of the desired optimization of the technical parameters to be achieved rapidly. The missing 10 to 30 percent can be obtained with one or two complementary experiments, limited to the 2 to 4 technical parameters considered most influential. Applying Taguchi’s method allowed the simultaneous study, in the same experiment, of the most important influence factors in different combinations and, at the same time, determination of each factor’s contribution.
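
    A minimal sketch of the Taguchi-style analysis behind such parameter optimization: runs follow an L4(2^3) orthogonal array, a larger-is-better signal-to-noise ratio is computed per run, and each factor's best level is the one with the higher mean S/N. The array assignment and yield data below are invented for illustration.

    ```python
    import math

    # L4(2^3) orthogonal array: 4 runs, 3 two-level factors (levels 1 and 2)
    L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
    yields = [[78, 80], [85, 84], [90, 91], [87, 88]]   # repeated measurements

    def sn_larger_is_better(ys):
        """Taguchi larger-is-better S/N = -10*log10(mean(1/y^2))."""
        return -10 * math.log10(sum(1 / y**2 for y in ys) / len(ys))

    sn = [sn_larger_is_better(ys) for ys in yields]

    def level_effect(factor, level):
        """Mean S/N over the runs where `factor` is at `level`."""
        vals = [sn[i] for i, run in enumerate(L4) if run[factor] == level]
        return sum(vals) / len(vals)

    # Recommended setting: for each factor, the level with the higher mean S/N
    best = [max((1, 2), key=lambda lv: level_effect(f, lv)) for f in range(3)]
    ```

    With this toy data every factor prefers level 2; the per-level S/N differences also indicate each factor's relative contribution, as the abstract describes.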

  18. The impact of the Alexander technique on improving posture and surgical ergonomics during minimally invasive surgery: pilot study.

    PubMed

    Reddy, Pramod P; Reddy, Trisha P; Roig-Francoli, Jennifer; Cone, Lois; Sivan, Bezalel; DeFoor, W Robert; Gaitonde, Krishnanath; Noh, Paul H

    2011-10-01

    One of the main ergonomic challenges during surgical procedures is surgeon posture. There have been reports of a high number of work-related injuries in laparoscopic surgeons. The Alexander technique is a process of psychophysical reeducation of the body to improve postural balance and coordination, permitting movement with minimal strain and maximum ease. We evaluated the efficacy of the Alexander technique in improving posture and surgical ergonomics during minimally invasive surgery. We performed a prospective cohort study in which subjects served as their own controls. Informed consent was obtained. Before Alexander technique instruction/intervention, subjects underwent assessment of postural coordination and basic laparoscopic skills. All subjects were educated about the Alexander technique and underwent post-instruction/intervention assessment of posture and laparoscopic skills. Subjective and objective data obtained before and after instruction/intervention were tabulated and analyzed for statistical significance. All 7 subjects completed the study. Subjects showed improved ergonomics and improved ability to complete FLS™, as well as subjective improvement in overall posture. The Alexander technique training program resulted in a significant improvement in posture. Improved surgical ergonomics, endurance and posture decrease surgical fatigue and the incidence of repetitive stress injuries in laparoscopic surgeons. Further studies of the influence of the Alexander technique on surgical posture, minimally invasive surgery ergonomics and open surgical techniques are warranted to explore and validate the benefits for surgeons. Copyright © 2011 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree

    Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.

  20. Issues related to processability during the manufacture of thermoplastic composites using on-line consolidation techniques

    NASA Astrophysics Data System (ADS)

    Ghasemi Nejhad, M. N.

    1993-04-01

    The on-line consolidation of thermoplastic composites is a relatively new technology that can be used to manufacture composite parts with complex geometries. The localized melting/solidification technique employed in this process can reduce the residual stresses and allow for improved dimensional stability and performance. An additional advantage of this technique is the elimination of the curing steps which are necessary in the processing of thermoset-matrix composites. This article presents the effects of processing parameters on processability in on-line consolidation of thermoplastic composites for tape-laying and filament-winding processes employing anisotropic thermal analyses. The results show that the heater size, preheating conditions, and tow thickness can significantly affect the processing window which, in turn, affects the production rate and the quality of the parts.

  1. Signal processing for ION mobility spectrometers

    NASA Technical Reports Server (NTRS)

    Taylor, S.; Hinton, M.; Turner, R.

    1995-01-01

    Signal processing techniques for systems based upon Ion Mobility Spectrometry will be discussed in the light of 10 years of experience in the design of real-time IMS. Among the topics to be covered are compensation techniques for variations in the number density of the gas - the use of an internal standard (a reference peak) or pressure and temperature sensors. Sources of noise and methods for noise reduction will be discussed together with resolution limitations and the ability of deconvolution techniques to improve resolving power. The use of neural networks (either by themselves or as a component part of a processing system) will be reviewed.
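
    The pressure/temperature compensation mentioned above is conventionally expressed through the reduced mobility K0, which normalizes the measured mobility K = L^2/(V*t_d) to standard temperature and pressure. A sketch, with illustrative drift-tube parameters (not tied to any particular instrument):

    ```python
    def reduced_mobility(drift_time_ms, L_cm, V_volts, T_kelvin, P_torr):
        """Reduced mobility K0 (cm^2 V^-1 s^-1), normalized to 273.15 K, 760 Torr."""
        K = L_cm**2 / (V_volts * drift_time_ms * 1e-3)   # measured mobility
        return K * (P_torr / 760.0) * (273.15 / T_kelvin)
    ```

    Because K0 is independent of ambient conditions, onboard pressure and temperature sensors (or a reference peak of known K0) let an IMS report stable peak positions as the gas number density drifts.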

  2. Evaluation of Teachers' Opinions Relating Improving Qualification in Teaching Process

    ERIC Educational Resources Information Center

    Dursun, Fevzi

    2017-01-01

    Improving quality and providing permanent learning in the teaching process undoubtedly depend on the time that the teacher spends and on the active and voluntary participation of students. This study is important for providing perspectives on new techniques and suggestions to teachers and related persons by determining the actions and thoughts of teachers…

  3. Improving the Process of Career Decision Making: An Action Research Approach

    ERIC Educational Resources Information Center

    Greenbank, Paul

    2011-01-01

    Purpose: This study adopts an action research approach with the aim of improving the process of career decision making among undergraduates in a business school at a "new" university in the UK. Design/methodology/approach: The study utilised unfreezing techniques, multiple case studies in conjunction with the principle of analogical…

  4. Evaluating and Improving the Mathematics Teaching-Learning Process through Metacognition

    ERIC Educational Resources Information Center

    Desoete, Annemie

    2007-01-01

    Introduction: Despite all the emphasis on metacognition, researchers currently use different techniques to assess metacognition. The purpose of this contribution is to help to clarify some of the paradigms on the evaluation of metacognition. In addition the paper reviews studies aiming to improve the learning process through metacognition. Method:…

  5. Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    NASA Astrophysics Data System (ADS)

    Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.

    2011-08-01

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
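
    The metamodel-plus-sequential-improvement idea can be caricatured in a few lines: fit a cheap surrogate to a handful of expensive simulations, locate its optimum, add that point to the design, and refit. The one-dimensional quadratic surrogate and toy response below are illustrative only; the actual method uses stochastic metamodels, noise variables, and robustness measures.

    ```python
    import numpy as np

    def f(x):
        """Stand-in for an expensive FE response (e.g., springback measure)."""
        return (x - 1.3) ** 2 + 0.5

    X = np.array([-2.0, 0.0, 2.0])          # initial design of experiments
    for _ in range(5):                       # sequential improvement loop
        y = f(X)
        c = np.polyfit(X, y, 2)              # quadratic metamodel fit
        x_new = -c[1] / (2 * c[0])           # metamodel minimizer as infill point
        X = np.append(X, x_new)              # evaluate and enrich the design

    x_best = X[np.argmin(f(X))]
    ```

    Refitting only around the region of interest, as the paper does, keeps the number of expensive FE evaluations low while sharpening the surrogate near the robust optimum.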

  6. Forecasting, Forecasting

    Treesearch

    Michael A. Fosberg

    1987-01-01

    Future improvements in the meteorological forecasts used in fire management will come from improvements in three areas: observational systems, forecast techniques, and postprocessing of forecasts and better integration of this information into the fire management process.

  7. Fox-7 for Insensitive Boosters

    DTIC Science & Technology

    2010-08-01

    cavitation, and therefore nucleation, to occur at each frequency. As well as producing ultrasound at different frequencies, the method of delivery of... processing techniques using ultrasound, designed to optimise FOX-7 crystal size and morphology to improve booster formulations, and results from these... FOX-7 booster formulations. Also included are particle processing techniques using ultrasound, designed to optimise FOX-7 crystal size and morphology

  8. Combinative Particle Size Reduction Technologies for the Production of Drug Nanocrystals

    PubMed Central

    Salazar, Jaime; Müller, Rainer H.; Möschwitzer, Jan P.

    2014-01-01

    Nanosizing is a suitable method to enhance the dissolution rate and therefore the bioavailability of poorly soluble drugs. The success of the particle size reduction processes depends on critical factors such as the employed technology, equipment, and drug physicochemical properties. High pressure homogenization and wet bead milling are standard comminution techniques that have been already employed to successfully formulate poorly soluble drugs and bring them to market. However, these techniques have limitations in their particle size reduction performance, such as long production times and the necessity of employing a micronized drug as the starting material. This review article discusses the development of combinative methods, such as the NANOEDGE, H 96, H 69, H 42, and CT technologies. These processes were developed to improve the particle size reduction effectiveness of the standard techniques. These novel technologies can combine bottom-up and/or top-down techniques in a two-step process. The combinative processes lead in general to improved particle size reduction effectiveness. Faster production of drug nanocrystals and smaller final mean particle sizes are among the main advantages. The combinative particle size reduction technologies are very useful formulation tools, and they will continue acquiring importance for the production of drug nanocrystals. PMID:26556191

  9. Method of producing a cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H. Craig

    1998-01-01

Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  10. Detergent composition comprising a cellulase containing cell-free fermentate produced from microorganism ATCC 55702 or mutant thereof

    DOEpatents

    Dees, H. Craig

    1998-01-01

    Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.

  11. Cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H.C.

    1997-12-16

    Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  12. Method of producing a cellulase-containing cell-free fermentate produced from microorganism ATCC 55702

    DOEpatents

    Dees, H.C.

    1998-05-26

Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  13. Improved Signal Processing Technique Leads to More Robust Self Diagnostic Accelerometer System

    NASA Technical Reports Server (NTRS)

    Tokars, Roger; Lekki, John; Jaros, Dave; Riggs, Terrence; Evans, Kenneth P.

    2010-01-01

    The self diagnostic accelerometer (SDA) is a sensor system designed to actively monitor the health of an accelerometer. In this case an accelerometer is considered healthy if it can be determined that it is operating correctly and its measurements may be relied upon. The SDA system accomplishes this by actively monitoring the accelerometer for a variety of failure conditions including accelerometer structural damage, an electrical open circuit, and most importantly accelerometer detachment. In recent testing of the SDA system in emulated engine operating conditions it has been found that a more robust signal processing technique was necessary. An improved accelerometer diagnostic technique and test results of the SDA system utilizing this technique are presented here. Furthermore, the real time, autonomous capability of the SDA system to concurrently compensate for effects from real operating conditions such as temperature changes and mechanical noise, while monitoring the condition of the accelerometer health and attachment, will be demonstrated.
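As a rough illustration of active sensor health monitoring of this kind, one can inject a known excitation and check for its response in the sensor's spectrum. The excitation frequency, amplitudes, threshold, and simulated signals below are all hypothetical, not the SDA system's actual parameters or diagnostic method.

```python
import numpy as np

fs = 10_000                       # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
pilot_hz = 500                    # hypothetical self-test excitation frequency

# Simulated accelerometer output: mechanical noise plus, when the sensor
# is attached and healthy, the response to the self-test excitation.
rng = np.random.default_rng(2)
noise = 0.5 * rng.standard_normal(t.size)
healthy = noise + 1.0 * np.sin(2 * np.pi * pilot_hz * t)
detached = noise                  # detachment: excitation response disappears

def pilot_amplitude(signal):
    """Amplitude of the pilot tone, read off a discrete Fourier transform."""
    spectrum = np.abs(np.fft.rfft(signal)) * 2 / signal.size
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    return spectrum[np.argmin(np.abs(freqs - pilot_hz))]

THRESHOLD = 0.5                   # illustrative pass/fail level
print(pilot_amplitude(healthy) > THRESHOLD,
      pilot_amplitude(detached) > THRESHOLD)
```

A practical system would additionally have to track this response as temperature and background vibration change, which is the compensation problem the abstract describes.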

  14. Roughness and uniformity improvements on self-aligned quadruple patterning technique for 10nm node and beyond by wafer stress engineering

    NASA Astrophysics Data System (ADS)

    Liu, Eric; Ko, Akiteru; O'Meara, David; Mohanty, Nihar; Franke, Elliott; Pillai, Karthik; Biolsi, Peter

    2017-05-01

Dimension shrinkage has been a major driving force in the development of integrated circuit processing over a number of decades. The Self-Aligned Quadruple Patterning (SAQP) technique is widely adopted for sub-10nm nodes in order to achieve the desired feature dimensions. This technique makes multiple pitch-halving from 193nm immersion lithography feasible in principle by using various pattern transfer steps. The central concept of this approach is to create a spacer-defined, self-aligned pattern from a single lithography print. By repeating the process steps, pitch division by two, four, or eight can theoretically be achieved. At these small dimensions, line roughness control becomes extremely important, since it may contribute a significant portion of process and device performance variation. In addition, the complexity of the SAQP processing flow makes roughness improvement indirect and ineffective, so a new approach is needed to improve roughness in the current SAQP technique. In this presentation, we demonstrate a novel method to improve line roughness performance in a 30nm pitch SAQP flow. We find that line roughness performance is strongly related to stress management: by selecting films with different stress levels for deposition onto the substrate, we can manipulate the roughness of line and space patterns. The impact on SAQP line roughness of the curvature change induced by the applied film stress is also studied; no significant correlation is found between wafer curvature and line roughness performance. We discuss in detail the physical performance of each processing step in terms of critical dimension (CD), critical dimension uniformity (CDU), line width roughness (LWR), and line edge roughness (LER). Finally, we summarize the process needed to reach full-wafer performance targets of 1.07nm LWR and 1.13nm LER on a 30nm pitch line and space pattern.
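The LWR/LER figures quoted above are conventionally reported as three standard deviations of the line-width or edge-position fluctuation along the line. A minimal sketch of that calculation on synthetic edge data (the numbers below are illustrative, not measurements from this work):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical edge positions (nm) sampled at n points along a line,
# e.g. extracted from a top-down SEM image.
n = 500
left_edge = rng.normal(0.0, 0.4, n)            # left-edge deviations
right_edge = 15.0 + rng.normal(0.0, 0.4, n)    # nominal 15 nm line width

# LER: 3*sigma of a single edge's deviation from its mean position.
ler = 3.0 * left_edge.std()

# LWR: 3*sigma of the local line width (right edge minus left edge).
width = right_edge - left_edge
lwr = 3.0 * width.std()

print(round(ler, 2), round(lwr, 2))
```

Note that with uncorrelated edges, LWR exceeds LER by roughly a factor of √2, since the two edges' variances add.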

  15. A new fabrication technique for back-to-back varactor diodes

    NASA Technical Reports Server (NTRS)

    Smith, R. Peter; Choudhury, Debabani; Martin, Suzanne; Frerking, Margaret A.; Liu, John K.; Grunthaner, Frank A.

    1992-01-01

    A new varactor diode process has been developed in which much of the processing is done from the back of an extremely thin semiconductor wafer laminated to a low-dielectric substrate. Back-to-back BNN diodes were fabricated with this technique; excellent DC and low-frequency capacitance measurements were obtained. Advantages of the new technique relative to other techniques include greatly reduced frontside wafer damage from exposure to process chemicals, improved capability to integrate devices (e.g. for antenna patterns, transmission lines, or wafer-scale grids), and higher line yield. BNN diodes fabricated with this technique exhibit approximately the expected capacitance-voltage characteristics while showing leakage currents under 10 mA at voltages three times that needed to deplete the varactor. This leakage is many orders of magnitude better than comparable Schottky diodes.

  16. Manufacture of conical springs with elastic medium technology improvement

    NASA Astrophysics Data System (ADS)

    Kurguzov, S. A.; Mikhailova, U. V.; Kalugina, O. B.

    2018-01-01

This article considers improving the manufacturing technology of conical springs by using an elastic medium in the forming space of the stamping tool, to improve the springs' performance characteristics and reduce their production costs. An estimation technique for the operational properties of a disk spring is developed by mathematically modeling the compression process during spring operation. A technique for optimizing the design parameters of a conical spring is also developed, which ensures a minimum stress value at the edge of the spring opening during operation.

  17. An improved infrared technique for sorting pecans

    NASA Astrophysics Data System (ADS)

    Graeve, Thorsten; Dereniak, Eustace L.; Lamonica, John A., Jr.

    1991-10-01

    This paper presents the results of a study of pecan spectral reflectances. It describes an experiment for measuring the contrast between several components of raw pecan product to be sorted. An analysis of the experimental data reveals high contrast ratios in the infrared spectrum, suggesting a potential improvement in sorting efficiency when separating pecan meat from shells. It is believed that this technique has the potential to dramatically improve the efficiency of current sorting machinery, and to reduce the cost of processing pecans for the consumer market.

  18. Multispectral image sharpening using a shift-invariant wavelet transform and adaptive processing of multiresolution edges

    USGS Publications Warehouse

    Lemeshewsky, G.P.; Rahman, Z.-U.; Schowengerdt, R.A.; Reichenbach, S.E.

    2002-01-01

    Enhanced false color images from mid-IR, near-IR (NIR), and visible bands of the Landsat thematic mapper (TM) are commonly used for visually interpreting land cover type. Described here is a technique for sharpening or fusion of NIR with higher resolution panchromatic (Pan) that uses a shift-invariant implementation of the discrete wavelet transform (SIDWT) and a reported pixel-based selection rule to combine coefficients. There can be contrast reversals (e.g., at soil-vegetation boundaries between NIR and visible band images) and consequently degraded sharpening and edge artifacts. To improve performance for these conditions, I used a local area-based correlation technique originally reported for comparing image-pyramid-derived edges for the adaptive processing of wavelet-derived edge data. Also, using the redundant data of the SIDWT improves edge data generation. There is additional improvement because sharpened subband imagery is used with the edge-correlation process. A reported technique for sharpening three-band spectral imagery used forward and inverse intensity, hue, and saturation transforms and wavelet-based sharpening of intensity. This technique had limitations with opposite contrast data, and in this study sharpening was applied to single-band multispectral-Pan image pairs. Sharpening used simulated 30-m NIR imagery produced by degrading the spatial resolution of a higher resolution reference. Performance, evaluated by comparison between sharpened and reference image, was improved when sharpened subband data were used with the edge correlation.
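A minimal sketch of the pixel-based coefficient selection rule with a shift-invariant (undecimated) transform may help fix ideas. This uses a simple à-trous smoothing kernel rather than the SIDWT and edge-correlation machinery of the paper, and the images are placeholders:

```python
import numpy as np

def atrous_decompose(img, levels=2):
    """Undecimated (shift-invariant) a-trous decomposition using a simple
    separable smoothing kernel with growing holes; returns the coarse
    approximation and one detail (edge) plane per scale."""
    details = []
    approx = img.astype(float)
    for level in range(levels):
        step = 2 ** level
        smoothed = (np.roll(approx, step, 0) + np.roll(approx, -step, 0)
                    + np.roll(approx, step, 1) + np.roll(approx, -step, 1)
                    + 4 * approx) / 8.0
        details.append(approx - smoothed)   # detail plane at this scale
        approx = smoothed
    return approx, details

def fuse(pan, nir, levels=2):
    """Pixel-based selection rule: at each pixel and scale, keep the detail
    coefficient with the larger magnitude; keep the low-resolution
    approximation of the multispectral (NIR) band."""
    _, d_pan = atrous_decompose(pan, levels)
    a_nir, d_nir = atrous_decompose(nir, levels)
    fused = a_nir
    for dp, dn in zip(d_pan, d_nir):
        fused = fused + np.where(np.abs(dp) > np.abs(dn), dp, dn)
    return fused

img = np.random.default_rng(3).random((8, 8))
sharpened = fuse(img, img)   # identical inputs reconstruct the original
```

The contrast-reversal problem the abstract describes arises exactly in this selection step: at a soil-vegetation boundary the Pan and NIR coefficients can have opposite signs, so a naive magnitude rule injects edges of the wrong polarity, which is what the local area-based correlation check is meant to suppress.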

  19. Influence of Processing Techniques on Microstructure and Mechanical Properties of a Biodegradable Mg-3Zn-2Ca Alloy

    PubMed Central

    Doležal, Pavel; Zapletal, Josef; Fintová, Stanislava; Trojanová, Zuzanka; Greger, Miroslav; Roupcová, Pavla; Podrábský, Tomáš

    2016-01-01

A new Mg-3Zn-2Ca magnesium alloy was prepared using different processing techniques: gravity casting as well as squeeze casting in the liquid and semisolid states. The materials were further thermally treated; thermal treatment of the gravity-cast alloy was additionally combined with equal channel angular pressing (ECAP). The alloy processed by squeeze casting in both the liquid and semisolid states exhibits improved plasticity, and ECAP processing positively influenced both the tensile and compressive characteristics of the alloy. The applied heat treatment influenced the distribution and chemical composition of the intermetallic phases present. The influence of the particular processing techniques, heat treatment, and intermetallic phase distribution is thoroughly discussed in relation to the mechanical behavior of the presented alloys. PMID:28774000

  20. Thick resist for MEMS processing

    NASA Astrophysics Data System (ADS)

    Brown, Joe; Hamel, Clifford

    2001-11-01

The need for technical innovation is always present in today's economy. Microfabrication methods have evolved in support of the demand for smaller and faster integrated circuits, with price-performance improvements always in the scope of the manufacturing design engineer. The dispersion of processing technology spans well beyond IC fabrication today, with batch fabrication and wafer-scale processing lending advantages to MEMS applications from biotechnology to consumer electronics, and from oil exploration to aerospace. There is now clear demand for innovative, enabling processing techniques that only a few years ago appeared too costly or unreliable. In high-volume applications, where yield and cost improvements are measured in fractions of a percent, it is imperative to have process technologies that produce consistent results. Only a few years ago, thick resist coatings were limited to thicknesses of less than 20 microns; factors such as uniformity, edge bead, and the need for multiple coatings made high-volume production impossible. New developments in photoresist formulation, combined with advanced coating equipment techniques that closely control process parameters, have enabled thick photoresist coatings of 70 microns with acceptable uniformity and edge bead in a single pass. Packaging of microelectronic and micromechanical devices is often a significant cost factor and a reliability issue for high-volume, low-cost production. Technologies such as flip-chip assembly provide cost and reliability improvements over wire bond techniques; their processing demands dimensional control and would present significant cost savings if compatible with mainstream technologies. Thick photoresist layers with good sidewall control would allow wafer-bumping technologies to penetrate the barriers to yield and production where technology cost is the overriding issue.
Single-pass processing is paramount to the manufacturability of packaging technology; uniformity and edge bead control define the success of process implementation. Today, advanced packaging solutions are created with thick photoresist coatings. The techniques and results will be presented.

  1. Process Mining-Based Method of Designing and Optimizing the Layouts of Emergency Departments in Hospitals.

    PubMed

    Rismanchian, Farhood; Lee, Young Hoon

    2017-07-01

    This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing the relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method improved the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
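The distance objective being minimized can be illustrated with a small sketch: mine transition frequencies from patient pathways and score a candidate layout by flow-weighted travel distance. All pathway data, unit names, and coordinates below are hypothetical, and the real method balances this objective against design preferences and relocation costs via goal programming.

```python
from collections import Counter

# Hypothetical mined pathways: each is the ordered sequence of clinical
# units one patient visited (extracted from event logs in practice).
pathways = [
    ["triage", "xray", "treatment", "discharge"],
    ["triage", "treatment", "discharge"],
    ["triage", "xray", "lab", "treatment", "discharge"],
]

# Transition frequencies between units, counted over adjacent visits.
flows = Counter((a, b) for path in pathways
                       for a, b in zip(path, path[1:]))

# Candidate layout: unit -> (x, y) grid position (illustrative values).
layout = {"triage": (0, 0), "xray": (1, 0), "lab": (2, 0),
          "treatment": (1, 1), "discharge": (0, 1)}

def total_travel(layout, flows):
    """Sum of rectilinear distances traveled, weighted by patient flow."""
    return sum(freq * (abs(layout[a][0] - layout[b][0])
                       + abs(layout[a][1] - layout[b][1]))
               for (a, b), freq in flows.items())

print(total_travel(layout, flows))  # → 11
```

An optimizer would search over assignments of units to positions to minimize this score, with separate flow counts (or weights) for critical and noncritical patients.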

  2. Kaizen: a process improvement model for the business of health care and perioperative nursing professionals.

    PubMed

    Tetteh, Hassan A

    2012-01-01

    Kaizen is a proven management technique that has a practical application for health care in the context of health care reform and the 2010 Institute of Medicine landmark report on the future of nursing. Compounded productivity is the unique benefit of kaizen, and its principles are change, efficiency, performance of key essential steps, and the elimination of waste through small and continuous process improvements. The kaizen model offers specific instruction for perioperative nurses to achieve process improvement in a five-step framework that includes teamwork, personal discipline, improved morale, quality circles, and suggestions for improvement. Published by Elsevier Inc.

  3. Opportunities to Create Active Learning Techniques in the Classroom

    ERIC Educational Resources Information Center

    Camacho, Danielle J.; Legare, Jill M.

    2015-01-01

    The purpose of this article is to contribute to the growing body of research that focuses on active learning techniques. Active learning techniques require students to consider a given set of information, analyze, process, and prepare to restate what has been learned--all strategies are confirmed to improve higher order thinking skills. Active…

  4. Usage of information safety requirements in improving tube bending process

    NASA Astrophysics Data System (ADS)

    Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.

    2018-05-01

This article is devoted to improving the analysis of a technological process by implementing information security requirements. The aim of this research is to analyze the increase in competitiveness of aircraft industry enterprises due to information technology implementation, using the example of the tube bending process. The article analyzes the kinds of tube bending and current technique. In addition, a potential risk analysis of the tube bending technological process is carried out in terms of information security.

  5. NDE of ceramics and ceramic composites

    NASA Technical Reports Server (NTRS)

    Vary, Alex; Klima, Stanley J.

    1991-01-01

Although nondestructive evaluation (NDE) techniques for ceramics are fairly well developed, in many cases they are difficult to apply for high-probability detection of the minute flaws that can cause failure in monolithic ceramics. Conventional NDE techniques are available for monolithic and fiber-reinforced ceramic matrix composites, but the more exact quantitative techniques that are needed are still being investigated and developed. Needs range from flaw detection below the 100 micron level in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in ceramic composites. NDE techniques that will ultimately be applicable to production and quality control of ceramic structures are still emerging from the lab. Needs differ depending on the processing stage, fabrication method, and nature of the finished product. NDE techniques are being developed in concert with materials processing research, where they can provide feedback to processing development and quality improvement. NDE techniques also serve as research tools for materials characterization and for understanding failure processes, e.g., during thermomechanical testing.

  6. Process research of non-CZ silicon material

    NASA Technical Reports Server (NTRS)

    1983-01-01

High-risk, high-payoff research areas associated with the Westinghouse process for producing photovoltaic modules using non-CZ sheet material were investigated. All work was performed using dendritic web silicon. The following tasks are discussed and associated technical results are given: (1) determining the technical feasibility of forming front and back junctions in non-CZ silicon using dopant techniques; (2) determining the feasibility of forming a liquid-applied diffusion mask to replace the more costly chemical-vapor-deposited SiO2 diffusion mask; (3) determining the feasibility of applying liquid anti-reflective solutions using meniscus coating equipment; (4) studying the production of uniform, high-efficiency solar cells using ion implantation junction formation techniques; and (5) quantifying cost improvements associated with process improvements.

  7. Proposed correlation of modern processing principles for Ayurvedic herbal drug manufacturing: A systematic review.

    PubMed

    Jain, Rahi; Venkatasubramanian, Padma

    2014-01-01

    Quality Ayurvedic herbal medicines are potential, low-cost solutions for addressing contemporary healthcare needs of both Indian and global community. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help develop new and use appropriate technology for scaling up production of the medicines, which is necessary to meet the growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though Ayurvedic industry has adopted technologies from food, chemical and pharmaceutical industries, there is no systematic study to correlate the traditional and modern processing methods. This study is an attempt to provide a possible correlation between the Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods by collecting information from English editions of classical Ayurveda texts on medicine preparation methods. Correlation between traditional and MPPs was done based on the techniques used in Ayurvedic drug processing. It was observed that in Ayurvedic medicine preparations there were two major types of processes, namely extraction, and separation. Extraction uses membrane rupturing and solute diffusion principles, while separation uses volatility, adsorption, and size-exclusion principles. The study provides systematic documentation of methods used in Ayurveda for herbal drug preparation along with its interpretation in terms of MPPs. This is the first step which can enable improving or replacing traditional techniques. New technologies or use of existing technologies can be used to improve the dosage forms and scaling up while maintaining the Ayurvedic principles similar to traditional techniques.

  8. When Kids Act Out: A Comparison of Embodied Methods to Improve Children's Memory for a Story

    ERIC Educational Resources Information Center

    Berenhaus, Molly; Oakhill, Jane; Rusted, Jennifer

    2015-01-01

    Over the last decade, embodied cognition, the idea that sensorimotor processes facilitate higher cognitive processes, has proven useful for improving children's memory for a story. In order to compare the benefits of two embodiment techniques, active experiencing (AE) and indexing, for children's memory for a story, we compared the immediate…

  9. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
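The first step of such an analysis is typically to derive a directly-follows relation from the event log; a minimal sketch on a toy log (the case data and activity names are invented for illustration):

```python
from collections import defaultdict

# Hypothetical event log: (case_id, activity) pairs ordered by timestamp,
# as might be exported from a hospital information system.
event_log = [
    (1, "admit"), (1, "triage"), (1, "lab"), (1, "discharge"),
    (2, "admit"), (2, "triage"), (2, "discharge"),
    (3, "admit"), (3, "lab"), (3, "triage"), (3, "discharge"),
]

def directly_follows(log):
    """Count how often activity b directly follows activity a within a
    case -- the starting point of many process discovery algorithms."""
    traces = defaultdict(list)
    for case_id, activity in log:           # group events by case
        traces[case_id].append(activity)
    counts = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):  # adjacent pairs in each trace
            counts[(a, b)] += 1
    return dict(counts)

print(directly_follows(event_log))
```

Discovery algorithms then turn these counts into a process model, and low-frequency transitions often flag deviations from the intended clinical flow.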

  10. How Digital Image Processing Became Really Easy

    NASA Astrophysics Data System (ADS)

    Cannon, Michael

    1988-02-01

In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic and the rapid increase in commercial companies marketing digital image processing software and hardware.

  11. Applying a Continuous Quality Improvement Model To Assess Institutional Effectiveness.

    ERIC Educational Resources Information Center

    Roberts, Keith

    This handbook outlines techniques and processes for improving institutional effectiveness and ensuring continuous quality improvement, based on strategic planning activities at Wisconsin's Milwaukee Area Technical College (MATC). First, institutional effectiveness is defined and 17 core indicators of effectiveness developed by the Wisconsin…

  12. Data fusion for delivering advanced traveler information services

    DOT National Transportation Integrated Search

    2003-05-01

Many transportation professionals have suggested that improved ATIS data fusion techniques and processing will improve the overall quality, timeliness, and usefulness of traveler information. The purpose of this study was fourfold. First, conduct a ...

  13. Alternate deposition and hydrogen doping technique for ZnO thin films

    NASA Astrophysics Data System (ADS)

    Myong, Seung Yeop; Lim, Koeng Su

    2006-08-01

We propose an alternate deposition and hydrogen doping (ADHD) technique for polycrystalline hydrogen-doped ZnO thin films, which is a sublayer-by-sublayer deposition based on metalorganic chemical vapor deposition and mercury-sensitized photodecomposition of hydrogen doping gas. Compared to conventional post-deposition hydrogen doping, the ADHD process provides superior electrical conductivity, stability, and surface roughness. Photoluminescence spectra measured at 10 K reveal that the ADHD technique improves ultraviolet and violet emissions by suppressing the green and yellow emissions. Therefore, the ADHD technique is shown to be a very promising aid to the manufacture of improved transparent conducting electrodes and light-emitting materials.

  14. Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-01-01

After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. Outcome measures were the quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of the risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.

  15. Matrix Synthesis and Characterization

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The role of NASA in the area of composite material synthesis; evaluation techniques; prediction analysis techniques; solvent-resistant tough composite matrix; resistance to paint strippers; acceptable processing temperature and pressure for thermoplastics; and the role of computer modeling and fiber interface improvement were discussed.

  16. Supercritical fluids as alternative, safe, food-processing media: an overview.

    PubMed

    Da Cruz Francisco, José; Szwajcer Dey, Estera

    2003-01-01

The continuous growth of the world population and its concentration in urban areas require food supplies that are continuous, sufficient, and of good quality. To resolve this problem, techniques have been developed for increasing food quantity and quality. These techniques are applied throughout the food chain, from production and conservation through distribution to the consumers (from "the field to the fork"). During the handling of food, chemicals are often deliberately added to achieve improved processing and better quality. This is one of the main reasons food undergoes different kinds of contamination. This overview focuses on the application of supercritical fluids as media for handling food materials during processing, with the perspective of reducing chemical contamination of food. Examples of developmental applications of this technique and of research work in progress are presented. Emphasis is given to extraction and biotransformation techniques.

  17. Use of IBA Techniques to Characterize High Velocity Thermal Spray Coatings

    NASA Astrophysics Data System (ADS)

    Trompetter, W.; Markwitz, A.; Hyland, M.

    Spray coatings are being used in an increasingly wide range of industries to improve the abrasive, erosive and sliding wear resistance of machine components. Over the past decade industries have moved to the application of supersonic high velocity thermal spray techniques. These coating techniques produce superior coating quality in comparison to other traditional techniques such as plasma spraying. To date, knowledge of the bonding processes and the structure of the particles within thermal spray coatings remains largely subjective. The aim of this research is to improve our understanding of these materials through the use of IBA techniques in conjunction with other materials analysis techniques. Samples were prepared by spraying a widely used commercial NiCr powder onto substrates using a HVAF (high velocity air fuel) thermal spraying technique. Detailed analysis of the composition and structure of the powder particles revealed two distinct types of particles: a majority of NiCr particles, with a significant minority composed of SiO2/CrO3. When the particles were investigated both as raw powder and in the sprayed coating, it was surprising to find that the composition of the coating material remained unchanged during the coating process despite the high velocity application.

  18. Application of Advanced Signal Processing Techniques to Angle of Arrival Estimation in ATC Navigation and Surveillance Systems

    DTIC Science & Technology

    1982-06-23

    The work reported in this document was... consider sophisticated signal processing techniques as an alternative method of improving system performance. Some work in this area has already taken place... demands on the frequency spectrum. As noted in Table 1-1, there has been considerable work on advanced signal processing in the MLS context

  19. Differential Deposition Technique for Figure Corrections in Grazing Incidence X-ray Optics

    NASA Technical Reports Server (NTRS)

    Kilaru, Kiranmayee; Ramsey, Brian D.; Gubarev, Mikhail

    2009-01-01

    A differential deposition technique is being developed to correct the low- and mid-spatial-frequency deviations in the axial figure profile of Wolter-type grazing incidence X-ray optics. These deviations arise from various factors in the fabrication process, and they degrade the performance of the optics by limiting the achievable angular resolution. In the differential deposition technique, material of varying thickness is selectively deposited along the length of the optic to minimize these deviations, thereby improving the overall figure. High-resolution focusing optics being developed at MSFC for small animal radionuclide imaging are being coated to test the differential deposition technique. The required spatial resolution for these optics is 100 μm. This base resolution is achievable with the regular electroform-nickel-replication fabrication technique used at MSFC. However, by improving the figure quality of the optics through differential deposition, we aim to improve the resolution significantly beyond this value.
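
    The correction principle can be sketched numerically: given a measured axial figure profile, deposit at each axial station just enough material to raise the profile to a common target height. This is a hypothetical toy, not the MSFC deposition model; the profile values and the flat-target choice are invented for illustration.

```python
# Toy sketch of differential deposition: fill the "valleys" of a measured
# axial figure profile up to its highest point. Illustrative numbers only.

def correction_profile(measured_nm):
    """Thickness (nm) to deposit at each axial station."""
    target = max(measured_nm)            # bring every station up to the peak
    return [target - h for h in measured_nm]

measured = [12.0, 10.5, 11.2, 9.8, 12.0]   # figure height, nm (illustrative)
deposit = correction_profile(measured)
corrected = [m + d for m, d in zip(measured, deposit)]  # flattened profile
```

    In practice the deposition head dwells longer over the low spots, so the dwell-time schedule is proportional to this thickness profile.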

  20. Detergent composition comprising a cellulase containing cell-free fermentate produced from microorganism ATCC 55702 or mutant thereof

    DOEpatents

    Dees, H.C.

    1998-07-14

    Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.

  1. Pseudo-shading technique in the two-dimensional domain: a post-processing algorithm for enhancing the Z-buffer of a three-dimensional binary image.

    PubMed

    Tan, A C; Richards, R

    1989-01-01

    Three-dimensional (3D) medical graphics is becoming popular in clinical use on tomographic scanners. Research work in 3D reconstructive display of computerized tomography (CT) and magnetic resonance imaging (MRI) scans on conventional computers has produced many so-called pseudo-3D images. The quality of these images depends on the rendering algorithm, the coarseness of the digitized object, the number of grey levels and the image screen resolution. CT and MRI data are fundamentally voxel based, and they produce images that are coarse because of the resolution of the data acquisition system. 3D images produced by the Z-buffer depth shading technique suffer loss of detail when complex objects with fine textural detail need to be displayed. Attempts have been made to improve the display of voxel objects, and existing techniques have shown the improvements possible with such post-processing algorithms. The improved rendering technique works on the Z-buffer image to generate a shaded image using a single light source in any direction. The effectiveness of the technique in generating a shaded image has been shown to be a useful means of presenting 3D information for clinical use.
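
    The general idea of shading a Z-buffer with a movable light source can be caricatured as Lambertian shading of depth-gradient normals. This is a generic illustration of the 2D post-processing concept, not the authors' specific algorithm; the depth map and light vector below are invented.

```python
import math

# Estimate a surface normal at each pixel from finite differences of the
# Z-buffer, then take a Lambertian dot product with a chosen light direction.

def shade(zbuf, light=(0.0, 0.0, 1.0)):
    h, w = len(zbuf), len(zbuf[0])
    lx, ly, lz = light
    norm = math.sqrt(lx*lx + ly*ly + lz*lz)
    lx, ly, lz = lx/norm, ly/norm, lz/norm
    out = [[0.0] * w for _ in range(h)]      # border pixels left dark
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            dzdx = (zbuf[y][x+1] - zbuf[y][x-1]) / 2.0
            dzdy = (zbuf[y+1][x] - zbuf[y-1][x]) / 2.0
            # normal of the surface z = f(x, y) is (-dz/dx, -dz/dy, 1), unit length
            n = math.sqrt(dzdx*dzdx + dzdy*dzdy + 1.0)
            out[y][x] = max(0.0, (-dzdx*lx - dzdy*ly + lz) / n)
    return out

flat = [[5.0] * 4 for _ in range(4)]   # a flat depth map facing the viewer
img = shade(flat)
```

    Because only the 2D depth buffer is needed, the light direction can be changed without re-traversing the voxel volume, which is the appeal of such post-processing.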

  2. 2D biological representations with reduced speckle obtained from two perpendicular ultrasonic arrays.

    PubMed

    Rodriguez-Hernandez, Miguel A; Gomez-Sacristan, Angel; Sempere-Payá, Víctor M

    2016-04-29

    Ultrasound diagnosis is a widely used medical tool. Among the various ultrasound techniques, ultrasonic imaging is particularly relevant. This paper presents an improvement to a two-dimensional (2D) ultrasonic system using measurements taken from perpendicular planes, where digital signal processing techniques are used to combine one-dimensional (1D) A-scans acquired by individual transducers in arrays located in perpendicular planes. The algorithm used to combine measurements is improved with a wavelet-transform-based denoising step included during the 2D representation generation process. The inclusion of this new denoising stage generates higher-quality 2D representations with a reduced level of speckle. The paper includes different 2D representations obtained from noisy A-scans and compares the improvements obtained by including the denoising stage.
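
    The kind of wavelet denoising step described here can be sketched, at toy scale, as a single-level Haar transform with soft thresholding of the detail coefficients. Real systems use deeper decompositions and tuned thresholds; the signal and threshold below are invented for illustration.

```python
# Single-level Haar wavelet denoising of a 1D A-scan (toy illustration).

def haar_denoise(signal, threshold):
    # forward single-level Haar transform (averages and differences of pairs)
    approx = [(signal[i] + signal[i+1]) / 2.0 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i+1]) / 2.0 for i in range(0, len(signal), 2)]

    # soft-threshold the detail (noise-dominated) coefficients
    def soft(d):
        return (abs(d) - threshold) * (1 if d > 0 else -1) if abs(d) > threshold else 0.0
    detail = [soft(d) for d in detail]

    # inverse transform
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

noisy = [1.0, 1.1, 1.0, 0.9, 5.0, 5.1, 5.0, 4.9]   # A-scan with small ripple
clean = haar_denoise(noisy, threshold=0.2)          # ripple suppressed, step kept
```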

  3. Coherent diffractive imaging of time-evolving samples with improved temporal resolution

    DOE PAGES

    Ulvestad, A.; Tripathi, A.; Hruszkewycz, S. O.; ...

    2016-05-19

    Bragg coherent x-ray diffractive imaging is a powerful technique for investigating dynamic nanoscale processes in nanoparticles immersed in reactive, realistic environments. Its temporal resolution is limited, however, by the oversampling requirements of three-dimensional phase retrieval. Here, we show that incorporating the entire measurement time series, which is typically a continuous physical process, into phase retrieval allows the oversampling requirement at each time step to be reduced, leading to a subsequent improvement in the temporal resolution by a factor of 2-20 times. The increased time resolution will allow imaging of faster dynamics and of radiation-dose-sensitive samples. Furthermore, this approach, which we call "chrono CDI," may find use in improving the time resolution in other imaging techniques.

  4. Lean-driven improvements slash wait times, drive up patient satisfaction scores.

    PubMed

    2012-07-01

    Administrators at LifePoint Hospitals, based in Brentwood, TN, used lean manufacturing techniques to slash wait times by as much as 30 minutes and achieve double-digit increases in patient satisfaction scores in the EDs at three hospitals. In each case, front-line workers took the lead on identifying opportunities for improvement and redesigning the patient-flow process. As a result of the new efficiencies, patient volume is up by about 25% at all three hospitals. At each hospital, the improvement process began with Kaizen, a lean process that involves bringing personnel together to flow-chart the current system, identify problem areas, and redesign the process. Improvement teams found big opportunities for improvement at the front end of the flow process. Key to the approach was having a plan up front to deal with non-compliance. To sustain improvements, administrators gather and disseminate key metrics on a daily basis.

  5. Improvement in recording and reading holograms

    NASA Technical Reports Server (NTRS)

    Hallock, J. N.

    1968-01-01

    Three-beam technique superimposes a number of patterns in the same plane of a hologram and then uniquely identifies each pattern by a suitable readout process. The developed readout process does not require any movement of parts.

  6. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
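
    As a toy illustration of the feature-selection step described above, one can rank process parameters by the absolute correlation of each with a part-quality metric. The parameters, their ranges, and the synthetic density model below are invented stand-ins, not actual SLM data or the paper's method.

```python
import math, random

# Rank process parameters by |Pearson correlation| with a quality output.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)
power = [random.uniform(150, 400) for _ in range(50)]   # laser power, W (synthetic)
speed = [random.uniform(0.5, 2.0) for _ in range(50)]   # scan speed, m/s (synthetic)
noise = [random.gauss(0, 0.01) for _ in range(50)]
# synthetic density: driven mostly by power, only weakly by speed
density = [0.002 * p - 0.01 * s + e for p, s, e in zip(power, speed, noise)]

params = {"power": power, "speed": speed}
ranking = sorted(params, key=lambda name: -abs(pearson(params[name], density)))
```

    A real study would use more robust feature-selection techniques, but the ranking idea — spend experiments and simulations on the parameters that matter — is the same.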

  7. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  8. A knowledge-based approach to improving optimization techniques in system planning

    NASA Technical Reports Server (NTRS)

    Momoh, J. A.; Zhang, Z. Z.

    1990-01-01

    A knowledge-based (KB) approach to improve mathematical programming techniques used in the system planning environment is presented. The KB system assists in selecting appropriate optimization algorithms, objective functions, constraints and parameters. The scheme is implemented by integrating symbolic computation of rules derived from operators' and planners' experience and is used for generalized optimization packages. The KB optimization software package is capable of improving the overall planning process, including correction of given violations. The method was demonstrated on a large-scale power system discussed in the paper.

  9. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  10. The Role of a Physical Analysis Laboratory in a 300 mm IC Development and Manufacturing Centre

    NASA Astrophysics Data System (ADS)

    Kwakman, L. F. Tz.; Bicais-Lepinay, N.; Courtas, S.; Delille, D.; Juhel, M.; Trouiller, C.; Wyon, C.; de la Bardonnie, M.; Lorut, F.; Ross, R.

    2005-09-01

    To remain competitive, IC manufacturers have to accelerate the development of the most advanced (CMOS) technology and to deliver high-yielding products with best cycle times and at competitive pricing. With the increase of technology complexity, the need for physical characterization support also increases; however, many of the existing techniques are no longer adequate to effectively support the 65-45 nm technology node developments. New and improved techniques are definitely needed to better characterize the often marginal processes, but these should not significantly impact fabrication costs or cycle time. Hence, characterization and metrology challenges in state-of-the-art IC manufacturing are both technical and economic in nature. TEM microscopy is needed for high-quality, high-volume analytical support, but several physical and practical hurdles have to be overcome. The success rate of FIB-SEM based failure analysis drops as defects often are too small to be detected and fault isolation becomes more difficult in the nano-scale device structures. To remain effective and efficient, SEM and OBIRCH techniques have to be improved or complemented with other more effective methods. Chemical analysis of novel materials and critical interfaces requires improvements in the field of e.g. SIMS and ToF-SIMS. Techniques that previously were only used sporadically, like EBSD and XRD, have become a 'must' to properly support backend process development. On the bright side, thanks to major technical advances, techniques that previously were practiced only at laboratory level can now be used effectively for at-line fab metrology: Voltage Contrast based defectivity control, XPS based gate dielectric metrology and XRD based control of copper metallization processes are practical examples. In this paper capabilities and shortcomings of several techniques and corresponding equipment are presented with practical illustrations of use in our Crolles facilities.

  11. Quality Assessment of College Admissions Processes.

    ERIC Educational Resources Information Center

    Fisher, Caroline; Weymann, Elizabeth; Todd, Amy

    2000-01-01

    This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…

  12. Multiscale Image Processing of Solar Image Data

    NASA Astrophysics Data System (ADS)

    Young, C.; Myers, D. C.

    2001-12-01

    It is often said that the blessing and curse of solar physics is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also increased the amount of highly complex data. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present several applications of multiscale techniques applied to solar image data. Specifically, we discuss uses of the wavelet, curvelet, and related transforms to define a multiresolution support for EIT, LASCO and TRACE images.

  13. Post Processing Methods used to Improve Surface Finish of Products which are Manufactured by Additive Manufacturing Technologies: A Review

    NASA Astrophysics Data System (ADS)

    Kumbhar, N. N.; Mulay, A. V.

    2016-08-01

    The Additive Manufacturing (AM) processes open the possibility of going directly from Computer-Aided Design (CAD) to a physical prototype. These prototypes are used as test models before the design is finalized, and sometimes as final products. Additive manufacturing has many advantages over traditional product development processes, such as allowing early customer involvement in product development and complex shape generation, while saving time as well as money. Additive manufacturing also poses some special challenges that are usually worth overcoming, such as poor surface quality, limited physical properties, and the need for specific raw materials. To improve the surface quality, several attempts have been made by controlling various process parameters of additive manufacturing and also by applying different post-processing techniques to components manufactured by additive manufacturing. The main objective of this work is to document an extensive literature review in the general area of post-processing techniques used in additive manufacturing.

  14. Improving the work function of the niobium surface of SRF cavities by plasma processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tyagi, P. V.; Doleans, M.; Hannah, B.

    2016-01-01

    An in situ plasma processing technique using chemically reactive oxygen plasma to remove hydrocarbons from superconducting radio frequency cavity surfaces at room temperature was developed at the Spallation Neutron Source at Oak Ridge National Laboratory. To better understand the interaction between the plasma and the niobium surface, surface studies on small samples were performed. In this article, we report the results of those surface studies. The results show that plasma processing removes hydrocarbons from the top surface and improves the surface work function by 0.5–1.0 eV. Improving the work function of the RF surface of cavities can help to improve their operational performance.

  15. Combining LCT tools for the optimization of an industrial process: material and energy flow analysis and best available techniques.

    PubMed

    Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares

    2011-09-15

    Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included into the scope of the LCT philosophy. That is the case of the material and energy flow analysis (MEFA), a tool derived from the industrial metabolism definition. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, the BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify the so-called improvable flows by MEFA, so that the appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacture plant for validation. Fourteen improvable flows have been identified and seven candidate BAT have been proposed aiming to reduce these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and selects the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness on improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. A novel pre-processing technique for improving image quality in digital breast tomosynthesis.

    PubMed

    Kim, Hyeongseok; Lee, Taewon; Hong, Joonpyo; Sabir, Sohail; Lee, Jung-Ryun; Choi, Young Wook; Kim, Hak Hee; Chae, Eun Young; Cho, Seungryong

    2017-02-01

    Nonlinear pre-reconstruction processing of the projection data in computed tomography (CT), where accurate recovery of the CT numbers is important for diagnosis, is usually discouraged, for such a processing would violate the physics of image formation in CT. However, one can devise a pre-processing step to enhance detectability of lesions in digital breast tomosynthesis (DBT), where accurate recovery of the CT numbers is fundamentally impossible due to the incompleteness of the scanned data. Since the detection of lesions such as micro-calcifications and mass in breasts is the purpose of using DBT, it is justified that a technique producing higher detectability of lesions is a virtue. A histogram modification technique was developed in the projection data domain. The histogram of the raw projection data was first divided into two parts: one for the breast projection data and the other for background. Background pixel values were set to a single value that represents the boundary between breast and background. After that, both histogram parts were shifted by an appropriate amount of offset and the histogram-modified projection data were log-transformed. A filtered-backprojection (FBP) algorithm was used for image reconstruction of DBT. To evaluate performance of the proposed method, we computed the detectability index for the reconstructed images from clinically acquired data. Typical breast border enhancement artifacts were greatly suppressed and the detectability of calcifications and masses was increased by use of the proposed method. Compared to a global threshold-based post-reconstruction processing technique, the proposed method produced images of higher contrast without invoking additional image artifacts. In this work, we report a novel pre-processing technique that improves detectability of lesions in DBT and has potential advantages over the global threshold-based post-reconstruction processing technique. The proposed method not only increased the lesion detectability but also reduced typical image artifacts pronounced in conventional FBP-based DBT. © 2016 American Association of Physicists in Medicine.
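
    The pre-processing steps described — clamp background pixels to the breast/background boundary value, shift by an offset, then log-transform — can be sketched on a 1D strip of projection data. The threshold and offset values here are invented for illustration and are not those of the paper.

```python
import math

# Toy sketch of the histogram-modification pre-processing for DBT projections.

def preprocess(projection, boundary, offset):
    out = []
    for v in projection:
        # raw x-ray counts: background (air) is bright, so clamp high values
        v = boundary if v >= boundary else v
        out.append(math.log(v + offset))   # offset keeps the log argument positive
    return out

raw = [500.0, 120.0, 80.0, 900.0]   # counts; >= 400 treated as background here
processed = preprocess(raw, boundary=400.0, offset=1.0)
```

    Flattening the background before the log transform is what suppresses the breast-border enhancement in the reconstruction, since the sharp breast/air step no longer dominates the filtered projections.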

  17. Multispectral image sharpening using wavelet transform techniques and spatial correlation of edges

    USGS Publications Warehouse

    Lemeshewsky, George P.; Schowengerdt, Robert A.

    2000-01-01

    Several reported image fusion or sharpening techniques are based on the discrete wavelet transform (DWT). The technique described here uses a pixel-based maximum selection rule to combine respective transform coefficients of lower spatial resolution near-infrared (NIR) and higher spatial resolution panchromatic (pan) imagery to produce a sharpened NIR image. Sharpening assumes a radiometric correlation between the spectral band images. However, there can be poor correlation, including edge contrast reversals (e.g., at soil-vegetation boundaries), between the fused images and, consequently, degraded performance. To improve sharpening, a local area-based correlation technique originally reported for edge comparison with image pyramid fusion is modified for application with the DWT process. Further improvements are obtained by using redundant, shift-invariant implementation of the DWT. Example images demonstrate the improvements in NIR image sharpening with higher resolution pan imagery.
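
    The pixel-based maximum selection rule can be shown at toy scale with a single-level 1D Haar transform: keep, at each position, the detail coefficient of larger magnitude, then reconstruct. Real fusion operates on full 2D DWTs of registered images; the signals below are invented.

```python
# Toy Haar illustration of max-magnitude coefficient selection for fusion.

def haar_fwd(s):
    a = [(s[i] + s[i+1]) / 2.0 for i in range(0, len(s), 2)]   # approximation
    d = [(s[i] - s[i+1]) / 2.0 for i in range(0, len(s), 2)]   # detail
    return a, d

def haar_inv(a, d):
    out = []
    for ai, di in zip(a, d):
        out.extend([ai + di, ai - di])
    return out

def fuse(nir, pan):
    a_n, d_n = haar_fwd(nir)
    _, d_p = haar_fwd(pan)
    # max-magnitude selection on detail coefficients; keep the NIR approximation
    d = [dn if abs(dn) >= abs(dp) else dp for dn, dp in zip(d_n, d_p)]
    return haar_inv(a_n, d)

nir = [10.0, 10.0, 20.0, 20.0]   # lower-resolution band: smooth, no edge detail
pan = [10.0, 12.0, 22.0, 18.0]   # higher-resolution pan: sharper edges
sharpened = fuse(nir, pan)       # NIR radiometry with pan edge detail injected
```

    The local-correlation refinement in the article would additionally gate this selection where the two bands are poorly correlated (e.g., contrast reversals), rather than always taking the larger coefficient.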

  18. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
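
    The four-step strategy — flowchart the process, estimate resource use, value the resources, and calculate direct costs — reduces to a simple computation once the steps are tabulated. The step names, times, and rates below are invented for illustration, not figures from the study.

```python
# Minimal sketch of process-based costing for a care planning process.

care_planning = [
    # (process step from the flowchart, minutes per resident, staff cost per minute in $)
    ("assessment",      45, 0.75),
    ("care conference", 30, 1.10),
    ("write care plan", 25, 0.75),
    ("review & update", 15, 0.75),
]

def direct_cost(steps):
    """Direct cost = sum over steps of (resource use x resource value)."""
    return sum(minutes * rate for _, minutes, rate in steps)

cost = direct_cost(care_planning)   # total direct cost per resident
```

    Because each row maps to one flowchart step, a manager can re-cost a modified process simply by editing the affected rows.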

  19. Development of SiC/SiC composites by PIP in combination with RS

    NASA Astrophysics Data System (ADS)

    Kotani, Masaki; Kohyama, Akira; Katoh, Yutai

    2001-02-01

    In order to improve the mechanical performance of SiC/SiC composites, improvements and modifications of the polymer impregnation and pyrolysis (PIP) and reaction sintering (RS) processes were investigated. The fibrous prepregs were prepared by a polymeric intra-bundle densification technique using Tyranno-SA™ fiber. For the inter-bundle matrix, four kinds of process options utilizing polymer pyrolysis and reaction sintering were studied. The process conditions were systematically optimized through fabricating monoliths. Then, SiC/SiC composites were fabricated using the optimized inter-bundle matrix slurries in each process for the first inspection of process requirements.

  20. Application of lean manufacturing techniques in the Emergency Department.

    PubMed

    Dickson, Eric W; Singh, Sabi; Cheung, Dickson S; Wyatt, Christopher C; Nugent, Andrew S

    2009-08-01

    "Lean" is a set of principles and techniques that drive organizations to continually add value to the product they deliver by enhancing process steps that are necessary, relevant, and valuable while eliminating those that fail to add value. Lean has been used in manufacturing for decades and has been associated with enhanced product quality and overall corporate success. To evaluate whether the adoption of Lean principles by an Emergency Department (ED) improves the value of emergency care delivered. Beginning in December 2005, we implemented a variety of Lean techniques in an effort to enhance patient and staff satisfaction. The implementation followed a six-step process of Lean education, ED observation, patient flow analysis, process redesign, new process testing, and full implementation. Process redesign focused on generating improvement ideas from frontline workers across all departmental units. Value-based and operational outcome measures, including patient satisfaction, expense per patient, ED length of stay (LOS), and patient volume were compared for calendar year 2005 (pre-Lean) and periodically after 2006 (post-Lean). Patient visits increased by 9.23% in 2006. Despite this increase, LOS decreased slightly and patient satisfaction increased significantly without raising the inflation adjusted cost per patient. Lean improved the value of the care we delivered to our patients. Generating and instituting ideas from our frontline providers have been the key to the success of our Lean program. Although Lean represents a fundamental change in the way we think of delivering care, the specific process changes we employed tended to be simple, small procedure modifications specific to our unique people, process, and place. We, therefore, believe that institutions or departments aspiring to adopt Lean should focus on the core principles of Lean rather than on emulating specific process changes made at other institutions.

  1. Fast optically sectioned fluorescence HiLo endomicroscopy.

    PubMed

    Ford, Tim N; Lim, Daryl; Mertz, Jerome

    2012-02-01

    We describe a nonscanning, fiber bundle endomicroscope that performs optically sectioned fluorescence imaging with fast frame rates and real-time processing. Our sectioning technique is based on HiLo imaging, wherein two widefield images are acquired under uniform and structured illumination and numerically processed to reject out-of-focus background. This work is an improvement upon an earlier demonstration of widefield optical sectioning through a flexible fiber bundle. The improved device features lateral and axial resolutions of 2.6 and 17 μm, respectively, a net frame rate of 9.5 Hz obtained by real-time image processing with a graphics processing unit (GPU) and significantly reduced motion artifacts obtained by the use of a double-shutter camera. We demonstrate the performance of our system with optically sectioned images and videos of a fluorescently labeled chorioallantoic membrane (CAM) in the developing G. gallus embryo. HiLo endomicroscopy is a candidate technique for low-cost, high-speed clinical optical biopsies.
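
    The HiLo combination — in-focus low spatial frequencies taken from the (demodulated) structured-illumination image, high spatial frequencies from the uniform image — can be caricatured in 1D with moving-average filters standing in for the real 2D low-/high-pass steps. This is a schematic sketch, not the device's actual pipeline; the signals and filter size are invented.

```python
# Schematic 1D HiLo: lo from the sectioned (structured) image, hi from the
# uniform image, summed to form the optically sectioned result.

def lowpass(s, k=3):
    half = k // 2
    return [sum(s[max(0, i - half):i + half + 1]) / len(s[max(0, i - half):i + half + 1])
            for i in range(len(s))]

def hilo(uniform, structured_demod):
    lo = lowpass(structured_demod)                            # sectioned low frequencies
    hi = [u - l for u, l in zip(uniform, lowpass(uniform))]   # high frequencies
    return [l + h for l, h in zip(lo, hi)]

uniform = [4.0, 5.0, 9.0, 5.0, 4.0]      # widefield line: in-focus peak on background
structured = [1.0, 1.0, 5.0, 1.0, 1.0]   # demodulated: out-of-focus background rejected
sectioned = hilo(uniform, structured)
```

    In the real instrument the demodulation step extracts the in-focus content from the structured image, and both filters are 2D; GPU processing is what makes the 9.5 Hz net frame rate possible.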

  2. Fast optically sectioned fluorescence HiLo endomicroscopy

    NASA Astrophysics Data System (ADS)

    Ford, Tim N.; Lim, Daryl; Mertz, Jerome

    2012-02-01

    We describe a nonscanning, fiber bundle endomicroscope that performs optically sectioned fluorescence imaging with fast frame rates and real-time processing. Our sectioning technique is based on HiLo imaging, wherein two widefield images are acquired under uniform and structured illumination and numerically processed to reject out-of-focus background. This work is an improvement upon an earlier demonstration of widefield optical sectioning through a flexible fiber bundle. The improved device features lateral and axial resolutions of 2.6 and 17 μm, respectively, a net frame rate of 9.5 Hz obtained by real-time image processing with a graphics processing unit (GPU) and significantly reduced motion artifacts obtained by the use of a double-shutter camera. We demonstrate the performance of our system with optically sectioned images and videos of a fluorescently labeled chorioallantoic membrane (CAM) in the developing G. gallus embryo. HiLo endomicroscopy is a candidate technique for low-cost, high-speed clinical optical biopsies.

  3. Development efforts to improve curved-channel microchannel plates

    NASA Technical Reports Server (NTRS)

    Corbett, M. B.; Feller, W. B.; Laprade, B. N.; Cochran, R.; Bybee, R.; Danks, A.; Joseph, C.

    1993-01-01

    Curved-channel microchannel plate (C-plate) improvements resulting from an ongoing NASA STIS microchannel plate (MCP) development program are described. Performance limitations of previous C-plates led to a development program in support of the STIS MAMA UV photon counter, a second generation instrument on the Hubble Space Telescope. C-plate gain, quantum detection efficiency, dark noise, and imaging distortion, which are influenced by channel curvature non-uniformities, have all been improved through use of a new centrifuge fabrication technique. This technique will be described, along with efforts to improve older, more conventional shearing methods. Process optimization methods used to attain targeted C-plate performance goals will be briefly characterized. Newly developed diagnostic measurement techniques to study image distortion, gain uniformity, input bias angle, channel curvature, and ion feedback, will be described. Performance characteristics and initial test results of the improved C-plates will be reported. Future work and applications will also be discussed.

  4. Applying Toyota production system techniques for medication delivery: improving hospital safety and efficiency.

    PubMed

    Newell, Terry L; Steinmetz-Malato, Laura L; Van Dyke, Deborah L

    2011-01-01

    The inpatient medication delivery system used at a large regional acute care hospital in the Midwest had become antiquated and inefficient. The existing 24-hr medication cart-fill exchange process with delivery to the patients' bedside did not always provide ordered medications to the nursing units when they were needed. In 2007 the principles of the Toyota Production System (TPS) were applied to the system. Project objectives were to improve medication safety and reduce the time needed for nurses to retrieve patient medications. A multidisciplinary team was formed that included representatives from nursing, pharmacy, informatics, quality, and various operational support departments. Team members were educated and trained in the tools and techniques of TPS, and then designed and implemented a new pull system benchmarking the TPS Ideal State model. The newly installed process, providing just-in-time medication availability, has measurably improved delivery processes as well as patient safety and satisfaction. Other positive outcomes have included improved nursing satisfaction, reduced nursing wait time for delivered medications, and improved efficiency in the pharmacy. After a successful pilot on two nursing units, the system is being extended to the rest of the hospital. © 2010 National Association for Healthcare Quality.

  5. Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.

    PubMed

    Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam

    2018-01-01

    During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.

  6. Using Innovative Technologies for Manufacturing and Evaluating Rocket Engine Hardware

    NASA Technical Reports Server (NTRS)

    Betts, Erin M.; Hardin, Andy

    2011-01-01

    Many of the manufacturing and evaluation techniques that are currently used for rocket engine component production are traditional methods that have been proven through years of experience and historical precedence. As we enter into a new space age where new launch vehicles are being designed and propulsion systems are being improved upon, it is sometimes necessary to adopt new and innovative techniques for manufacturing and evaluating hardware. With a heavy emphasis on cost reduction and improvements in manufacturing time, manufacturing techniques such as Direct Metal Laser Sintering (DMLS) and white light scanning are being adopted and evaluated for their use on J-2X, with hopes of employing both technologies on a wide variety of future projects. DMLS has the potential to significantly reduce the processing time and cost of engine hardware, while achieving desirable material properties by using a layered powdered metal manufacturing process in order to produce complex part geometries. The white light technique is a non-invasive method that can be used to inspect for geometric feature alignment. Both the DMLS manufacturing method and the white light scanning technique have proven to be viable options for manufacturing and evaluating rocket engine hardware, and further development and use of these techniques is recommended.

  7. Applying industrial process improvement techniques to increase efficiency in a surgical practice.

    PubMed

    Reznick, David; Niazov, Lora; Holizna, Eric; Siperstein, Allan

    2014-10-01

    The goal of this study was to examine how industrial process improvement techniques could help streamline the preoperative workup. Lean process improvement was used to streamline patient workup at an endocrine surgery service at a tertiary medical center utilizing multidisciplinary collaboration. The program consisted of several major changes in how patients are processed in the department. The goal was to shorten the wait time between initial call and consult visit and between consult and surgery. We enrolled 1,438 patients in the program. The wait time from the initial call until consult was reduced from 18.3 ± 0.7 to 15.4 ± 0.9 days. Wait time from consult until operation was reduced from 39.9 ± 1.5 to 33.9 ± 1.3 days for the overall practice and to 15.0 ± 4.8 days for low-risk patients. Patient cancellations were reduced from 27.9 ± 2.4% to 17.3 ± 2.5%. Overall patient flow increased from 30.9 ± 5.1 to 52.4 ± 5.8 consults per month (all P < .01). Utilizing process improvement methodology, surgery patients can benefit from an improved, streamlined process with significant reduction in wait time from call to initial consult and initial consult to surgery, with reduced cancellations. This generalized process has resulted in increased practice throughput and efficiency and is applicable to any surgery practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Patterned wafer geometry grouping for improved overlay control

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Woo, Jaeson; Park, Junbeom; Song, Changrock; Anis, Fatima; Vukkadala, Pradeep; Jeon, Sanghuck; Choi, DongSub; Huang, Kevin; Heo, Hoyoung; Smith, Mark D.; Robinson, John C.

    2017-03-01

    Process-induced overlay errors from outside the litho cell have become a significant contributor to the overlay error budget, including non-uniform wafer stress. Previous studies have shown the correlation between process-induced stress and overlay and the opportunity for improvement in process control, including the use of patterned wafer geometry (PWG) metrology to reduce stress-induced overlay signatures. A key challenge of volume semiconductor manufacturing is to improve not only the magnitude of these signatures but also the wafer-to-wafer variability. This work involves a novel technique of using PWG metrology to provide improved litho control by wafer-level grouping based on incoming process-induced overlay, relevant for both 3D NAND and DRAM. Examples shown in this study are from 19 nm DRAM manufacturing.

  9. Process tool monitoring and matching using interferometry technique

    NASA Astrophysics Data System (ADS)

    Anberg, Doug; Owen, David M.; Mileham, Jeffrey; Lee, Byoung-Ho; Bouche, Eric

    2016-03-01

    The semiconductor industry makes dramatic device technology changes over short time periods. As the semiconductor industry advances toward the 10 nm device node, more precise management and control of processing tools have become a significant manufacturing challenge. Some processes require multiple tool sets and some tools have multiple chambers for mass production. Tool and chamber matching has become a critical consideration for meeting today's manufacturing requirements. Additionally, process tools and chamber conditions have to be monitored to ensure uniform process performance across the tool and chamber fleet. There are many parameters for managing and monitoring tools and chambers. Particle defect monitoring is a well-known and established example where defect inspection tools can directly detect particles on the wafer surface. However, leading-edge processes are driving the need to also monitor invisible defects, e.g., stress or contamination, because some device failures cannot be directly correlated with traditional visualized defect maps or other known sources. Some failure maps show the same signatures as stress or contamination maps, which implies correlation to device performance or yield. In this paper we present process tool monitoring and matching using an interferometry technique. There are many types of interferometry techniques used for various process monitoring applications. We use a Coherent Gradient Sensing (CGS) interferometer, which is self-referencing and enables high-throughput measurements. Using this technique, we can quickly measure the topography of an entire wafer surface and obtain stress and displacement data from the topography measurement. For improved tool and chamber matching and reduced device failure, wafer stress measurements can be implemented as a regular tool or chamber monitoring test on either unpatterned or patterned wafers, providing a useful criterion for improved process stability.
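
    The step from measured wafer topography to a stress value typically goes through curvature; Stoney's equation is the standard relation for a thin film on a much thicker substrate. A minimal sketch, with material numbers that are plausible for a silicon wafer but illustrative rather than taken from this paper:

```python
def stoney_stress(biaxial_modulus, t_substrate, t_film, radius):
    """Film stress (Pa) from substrate curvature via Stoney's equation:
        sigma = M_s * t_s**2 / (6 * t_f * R)
    biaxial_modulus: substrate biaxial modulus E/(1 - nu), in Pa
    t_substrate, t_film: thicknesses in m
    radius: radius of curvature in m, obtained from the measured topography."""
    return biaxial_modulus * t_substrate**2 / (6.0 * t_film * radius)

# Illustrative numbers: 725 um Si wafer (biaxial modulus ~180.5 GPa for (100) Si),
# 1 um film, 100 m radius of curvature from the interferometric topography map.
sigma = stoney_stress(180.5e9, 725e-6, 1e-6, 100.0)  # roughly 1.6e8 Pa (~160 MPa)
```

    Tracking such a stress value per chamber over time, and comparing its distribution across chambers, is one way the monitoring and matching described above can be quantified.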

  10. Process simulation for advanced composites production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allendorf, M.D.; Ferko, S.M.; Griffiths, S.

    1997-04-01

    The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.

  11. Fabrication of Thermoelectric Devices Using Additive-Subtractive Manufacturing Techniques: Application to Waste-Heat Energy Harvesting

    NASA Astrophysics Data System (ADS)

    Tewolde, Mahder

    Thermoelectric generators (TEGs) are solid-state devices that convert heat directly into electricity. They are well suited for waste-heat energy harvesting applications as opposed to primary energy generation. Commercially available thermoelectric modules are flat, inflexible and have limited sizes available. State-of-art manufacturing of TEG devices relies on assembling prefabricated parts with soldering, epoxy bonding, and mechanical clamping. Furthermore, efforts to incorporate them onto curved surfaces such as exhaust pipes, pump housings, steam lines, mixing containers, reaction chambers, etc. require custom-built heat exchangers. This is costly and labor-intensive, in addition to presenting challenges in terms of space, thermal coupling, added weight and long-term reliability. Additive manufacturing technologies are beginning to address many of these issues by reducing part count in complex designs and the elimination of sub-assembly requirements. This work investigates the feasibility of utilizing such novel manufacturing routes for improving the manufacturing process of thermoelectric devices. Much of the research in thermoelectricity is primarily focused on improving thermoelectric material properties by developing novel materials or finding ways to improve existing ones. Secondary to material development is improving the manufacturing process of TEGs to provide significant cost benefits. To improve the device fabrication process, this work explores additive manufacturing technologies to provide an integrated and scalable approach for TE device manufacturing directly onto engineering component surfaces. Additive manufacturing techniques like thermal spray and ink-dispenser printing are developed with the aim of improving the manufacturing process of TEGs. Subtractive manufacturing techniques like laser micromachining are also studied in detail. This includes the laser processing parameters for cutting the thermal spray materials efficiently by optimizing cutting speed and power while maintaining surface quality and interface properties. Key parameters are obtained from these experiments and used to develop a process that can be used to fabricate a working TEG directly onto the waste-heat component surface. A TEG module has been fabricated for the first time entirely by using thermal spray technology and laser micromachining. The target applications include automotive exhaust systems and other high-volume industrial waste heat sources. The application of TEGs for thermoelectrically powered sensors for Small Modular Reactors (SMRs) is presented. In conclusion, more ways to improve the fabrication process of TEGs are suggested.

  12. Video-signal improvement using comb filtering techniques.

    NASA Technical Reports Server (NTRS)

    Arndt, G. D.; Stuber, F. M.; Panneton, R. J.

    1973-01-01

    Significant improvement in the signal-to-noise performance of television signals has been obtained through the application of comb filtering techniques. This improvement is achieved by removing the inherent redundancy in the television signal through linear prediction and by utilizing the unique noise-rejection characteristics of the receiver comb filter. Theoretical and experimental results describe the signal-to-noise ratio and picture-quality improvement obtained through the use of baseband comb filters and the implementation of a comb network as the loop filter in a phase-lock-loop demodulator. Attention is given to the fact that noise becomes correlated when processed by the receiver comb filter.
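
    The two roles of the comb filter described above can be illustrated with elementary one-dimensional combs: a subtractive comb acts as a one-period linear predictor that removes line-to-line redundancy, while an averaging comb at the receiver passes the periodic signal and averages down uncorrelated noise. This is a minimal sketch of the principle, not the paper's phase-lock-loop implementation.

```python
import numpy as np

def subtractive_comb(x, period):
    """y[n] = x[n] - x[n - period]: removes redundancy at period-sample spacing."""
    y = x.astype(float).copy()
    y[period:] -= x[:-period]
    return y

def averaging_comb(x, period):
    """y[n] = (x[n] + x[n - period]) / 2: passes periodic content, averages noise."""
    y = x.astype(float).copy()
    y[period:] = 0.5 * (x[period:] + x[:-period])
    return y

# A TV-like signal: strong line-to-line (period P) redundancy.
P = 8
line = np.array([0.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0, 0.0])
signal = np.tile(line, 10)

residual = subtractive_comb(signal, P)   # near-zero after the first period
restored = averaging_comb(signal, P)     # periodic content passes unchanged
```

    For zero-mean white noise, the averaging comb halves the noise variance while leaving the periodic signal intact, which is the source of the signal-to-noise improvement; the noise at its output, however, is no longer white — it becomes correlated at lags that are multiples of the comb delay, as the abstract notes.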

  13. Polarimetric radar and aircraft observations of saggy bright bands during MC3E

    DOE PAGES

    Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree; ...

    2016-03-19

    Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.

  14. Quantification of unsteady heat transfer and phase changing process inside small icing water droplets.

    PubMed

    Jin, Zheyan; Hu, Hui

    2009-05-01

    We report progress made in our recent effort to develop and implement a novel, lifetime-based molecular tagging thermometry (MTT) technique to quantify unsteady heat transfer and phase changing process inside small icing water droplets pertinent to wind turbine icing phenomena. The lifetime-based MTT technique was used to achieve temporally and spatially resolved temperature distribution measurements within small, convectively cooled water droplets to quantify unsteady heat transfer within the small water droplets in the course of convective cooling process. The transient behavior of phase changing process within small icing water droplets was also revealed clearly by using the MTT technique. Such measurements are highly desirable to elucidate underlying physics to improve our understanding about important microphysical phenomena pertinent to ice formation and accreting process as water droplets impinging onto wind turbine blades.
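
    In lifetime-based MTT, temperature is inferred from the phosphorescence lifetime of the tagged molecules, which can be recovered from two time-gated intensity images under a single-exponential decay assumption. The decay model and gate timings below are illustrative, not the paper's calibration.

```python
import math

def lifetime_from_gates(i1, i2, gate_separation):
    """Recover the decay lifetime tau from two gated intensities i1, i2
    acquired gate_separation apart, assuming i(t) ~ exp(-t / tau), so that
        i2 / i1 = exp(-gate_separation / tau)."""
    return gate_separation / math.log(i1 / i2)

# Synthetic check: a decay with tau = 2.0 us sampled at t = 1.0 us and 3.0 us.
tau_true = 2.0
i1 = math.exp(-1.0 / tau_true)
i2 = math.exp(-3.0 / tau_true)
tau_est = lifetime_from_gates(i1, i2, 3.0 - 1.0)   # recovers 2.0
```

    Because the lifetime is a ratio-based quantity, it is insensitive to dye concentration and illumination non-uniformity; a dye-specific calibration curve tau(T), not reproduced here, then maps each pixel's lifetime to temperature.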

  15. Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study

    NASA Astrophysics Data System (ADS)

    Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh

    2018-03-01

    A lack of problem-solving techniques and of cooperation between support groups are two obstacles frequently encountered on actual production lines. Inadequate detailed analysis and inappropriate problem-solving techniques can cause recurring issues that affect organizational performance. This study utilizes a well-structured Six Sigma DMAIC approach, in combination with other problem-solving tools, to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate, which contributes to scrap and rework performance. A detailed analysis is conducted in the analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreased substantially and the process capability index improved from 0.75 to 1.67. These results show that the Six Sigma approach used to tackle the quality problem is substantially effective.
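
    The capability improvement quoted above (0.75 to 1.67) refers to a process capability index. A minimal computation of Cp and Cpk from sample data follows; the measurements and spec limits are invented for illustration.

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp compares the spec width to the process spread (6 sigma);
    Cpk additionally penalizes a process that is off-center."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)          # sample standard deviation
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

# Illustrative stripping measurements against hypothetical spec limits [4, 16].
cp, cpk = process_capability([8.0, 9.0, 10.0, 11.0, 12.0], lsl=4.0, usl=16.0)
```

    A Cpk of 1.67 — the post-improvement value reported above — is a common "excellent capability" threshold, corresponding to the nearer spec limit sitting five standard deviations from the process mean.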

  16. In-situ plasma processing to increase the accelerating gradients of SRF cavities

    DOE PAGES

    Doleans, Marc; Afanador, Ralph; Barnhart, Debra L.; ...

    2015-12-31

    A new in-situ plasma processing technique is being developed at the Spallation Neutron Source (SNS) to improve the performance of the cavities in operation. The technique utilizes a low-density reactive oxygen plasma at room temperature to remove top-surface hydrocarbons. The plasma processing technique increases the work function of the cavity surface and reduces the overall amount of vacuum and electron activity during cavity operation; in particular, it increases the field emission onset, which enables cavity operation at higher accelerating gradients. Experimental evidence also suggests that the secondary electron yield (SEY) of the Nb surface decreases after plasma processing, which helps mitigate multipacting issues. This article discusses the main developments and results from the plasma processing R&D and presents experimental results for in-situ plasma processing of dressed cavities in the SNS horizontal test apparatus.

  17. Applying Nonverbal Techniques to Organizational Diagnosis.

    ERIC Educational Resources Information Center

    Tubbs, Stewart L.; Koske, W. Cary

    Ongoing research programs conducted at General Motors Institute are motivated by the practical objective of improving the company's organizational effectiveness. Computer technology is being used whenever possible; for example, a technique developed by Herman Chernoff was used to process data from a survey of employee attitudes into 18 different…

  18. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    NASA Astrophysics Data System (ADS)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement using hybridized smoothing filters, and we introduce a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.
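
    The search the authors describe — an optimizer choosing a sequence of simple filters that maximizes edge contrast — can be illustrated by exhaustively scoring every two-filter sequence from a small bank; a swarm algorithm such as PSO or ABC would sample this space instead of enumerating it. The filter bank and the sharpness score below are illustrative stand-ins, not the paper's.

```python
import itertools
import numpy as np

def conv3(img, kernel):
    """3x3 convolution via explicit shifts, edge-padded (keeps the demo numpy-only)."""
    pad = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

BANK = {
    "identity": np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float),
    "box":      np.full((3, 3), 1.0 / 9.0),
    "sharpen":  np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], float),
}

def edge_score(img):
    """Mean absolute gradient: a crude stand-in for an edge-contrast fitness."""
    gy, gx = np.gradient(img)
    return float(np.mean(np.abs(gy)) + np.mean(np.abs(gx)))

def best_sequence(img, length=2):
    """Enumerate filter sequences and return the one with the highest edge score."""
    best, best_s = None, -np.inf
    for seq in itertools.product(BANK, repeat=length):
        out = img
        for name in seq:
            out = conv3(out, BANK[name])
        s = edge_score(out)
        if s > best_s:
            best, best_s = seq, s
    return best, best_s
```

    With longer sequences and larger banks the space grows exponentially, which is exactly why a stochastic search such as ABC, PSO, or ACO becomes attractive.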

  19. The Power of Process Improvement

    ERIC Educational Resources Information Center

    Fairfield-Sonn, James W.; Morgan, Sandra; Sumukadas, Narendar

    2004-01-01

    Over the last several decades many systematic management approaches, such as Total Quality Management, aimed at improving organizational performance and employee satisfaction have captured organizations' attention. Given their origins in statistics, operations management, and engineering, many of the concepts and techniques are technical. When…

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ankireddy, Krishnamraju; Ghahremani, Amir H.; Martin, Blake

    Perovskite thin films are thermally annealed using a rapid intense pulsed light technique enabled by an alkyl halide that collectively improves device performance when processed in ambient conditions.

  1. Automation of energy demand forecasting

    NASA Astrophysics Data System (ADS)

    Siddique, Sanzad

    Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
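
    Automated model search of the kind described — scoring each candidate in a model space and keeping the best — can be sketched with polynomial trend models ranked by the Akaike information criterion (AIC). The candidate space and scoring rule here are illustrative stand-ins for the thesis's econometric and machine learning models.

```python
import numpy as np

def aic(n, rss, k, eps=1e-12):
    """AIC for a least-squares fit with k parameters (eps guards log(0))."""
    return n * np.log(rss / n + eps) + 2 * k

def search_trend_model(t, y, max_degree=4):
    """Fit polynomial trends of increasing degree; return the AIC-best degree."""
    scores = {}
    for d in range(max_degree + 1):
        coef = np.polyfit(t, y, d)
        rss = float(np.sum((np.polyval(coef, t) - y) ** 2))
        scores[d] = aic(len(y), rss, d + 1)
    return min(scores, key=scores.get)

# A quadratic demand curve: the search should select degree 2, since higher
# degrees fit no better and pay the 2k complexity penalty in the AIC.
t = np.arange(20.0)
y = 3.0 + 0.5 * t + 0.1 * t**2
best = search_trend_model(t, y)   # -> 2
```

    Real demand series would add seasonal terms, exogenous drivers, and nonlinear transformations to the candidate space; the mechanics of scoring and selecting remain the same.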

  2. New polyimide polymer has excellent processing characteristics with improved thermo-oxidative and hydrolytic stabilities

    NASA Technical Reports Server (NTRS)

    Jones, R. J.; Vaughan, R. W.; Kendrick, W. P.

    1972-01-01

    Polyimide P10P and its processing technique apply to most high temperature plastic products, devices, and castings. The prepolymer, when used as a varnish, impregnates fibers directly and can be processed into advanced composites. The material may also be used as a molding powder and adhesive.

  3. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  4. Advantages of multigrid methods for certifying the accuracy of PDE modeling

    NASA Technical Reports Server (NTRS)

    Forester, C. K.

    1981-01-01

    Numerical techniques for assessing and certifying the accuracy of the modeling of partial differential equations (PDE) to the user's specifications are analyzed. Examples of the certification process with conventional techniques are summarized for the three-dimensional steady-state full potential and the two-dimensional steady Navier-Stokes equations using fixed grid methods (FG). The advantages of the Full Approximation Storage (FAS) scheme of the multigrid (MG) technique of A. Brandt compared with the conventional certification process of modeling PDE are illustrated in one dimension with the transformed potential equation. Inferences are drawn as to how MG will improve the certification process of the numerical modeling of two- and three-dimensional PDE systems. Elements of the error assessment process that are common to FG and MG are analyzed.
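
    A minimal 1D two-grid correction cycle — the building block that FAS-style multigrid generalizes — shows the mechanics referenced above: smooth, restrict the residual, solve the coarse problem, prolong and correct, smooth again. The toy Poisson problem below is illustrative and far simpler than the potential-flow systems discussed in the abstract.

```python
import numpy as np

def two_grid_vcycle(u, f, sweeps=3, w=2.0 / 3.0):
    """One two-grid cycle for -u'' = f on [0,1] with zero Dirichlet BCs.
    u, f hold values at n interior points (n odd), spacing h = 1/(n+1)."""
    n = len(u); h = 1.0 / (n + 1)

    def smooth(u, f, count):
        # Weighted-Jacobi sweeps: damp the oscillatory error components.
        for _ in range(count):
            un = np.empty_like(u)
            un[1:-1] = 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
            un[0] = 0.5 * (u[1] + h * h * f[0])
            un[-1] = 0.5 * (u[-2] + h * h * f[-1])
            u = (1 - w) * u + w * un
        return u

    def residual(u, f):
        au = np.empty_like(u)
        au[1:-1] = (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
        au[0] = (2 * u[0] - u[1]) / h**2
        au[-1] = (2 * u[-1] - u[-2]) / h**2
        return f - au

    u = smooth(u, f, sweeps)                                   # pre-smoothing
    r = residual(u, f)
    rc = 0.25 * r[:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2]    # full weighting
    nc = len(rc); hc = 2 * h
    # Exact coarse solve of the small tridiagonal system (dense is fine here).
    A = (np.diag(2.0 * np.ones(nc)) - np.diag(np.ones(nc - 1), 1)
         - np.diag(np.ones(nc - 1), -1)) / hc**2
    ec = np.linalg.solve(A, rc)
    # Linear interpolation of the coarse correction back to the fine grid.
    e = np.zeros(n)
    e[1::2] = ec
    ez = np.concatenate(([0.0], ec, [0.0]))
    e[0::2] = 0.5 * (ez[:-1] + ez[1:])
    return smooth(u + e, f, sweeps)                            # post-smoothing

# One cycle on -u'' = pi^2 sin(pi x), starting from a zero guess.
n = 63
h = 1.0 / (n + 1)
x = h * np.arange(1, n + 1)
f = np.pi**2 * np.sin(np.pi * x)
u1 = two_grid_vcycle(np.zeros(n), f)
```

    A single cycle contracts the residual by an order of magnitude on this problem; recursing on the coarse solve instead of solving it directly yields the full V-cycle, and carrying the full approximation (not just the correction) to the coarse grid gives FAS.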

  5. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  6. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  7. The role of failure modes and effects analysis in showing the benefits of automation in the blood bank.

    PubMed

    Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew

    2013-05-01

    Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.
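
    The RPN arithmetic behind the comparison is simple: each failure mode is scored for severity, occurrence, and detectability (each typically on a 1-10 scale), and RPN = S × O × D; a process total is the sum over its failure modes. The failure modes and scores below are invented for illustration, not taken from the study.

```python
def rpn(severity, occurrence, detection):
    """Risk priority number for one failure mode (each score typically 1-10)."""
    return severity * occurrence * detection

# Hypothetical scores for a manual vs. automated blood-grouping step.
manual = {
    "tube mislabeled":     rpn(9, 4, 5),
    "result transcribed":  rpn(8, 5, 6),
    "reagent added twice": rpn(7, 3, 4),
}
automated = {
    "tube mislabeled":     rpn(9, 2, 2),
    "result transcribed":  rpn(8, 1, 2),
    "reagent added twice": rpn(7, 1, 2),
}

total_manual = sum(manual.values())        # 504
total_automated = sum(automated.values())  # 66
worst = max(manual, key=manual.get)        # highest-priority target for improvement
```

    Ranking failure modes by RPN is what lets FMEA show not just that automation lowers total risk, but where in the manual process attention should be prioritized, as the abstract describes.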

  8. Improving the quality of reconstructed X-ray CT images of polymer gel dosimeters: zero-scan coupled with adaptive mean filtering.

    PubMed

    Kakakhel, M B; Jirasek, A; Johnston, H; Kairn, T; Trapp, J V

    2017-03-01

    This study evaluated the feasibility of combining the 'zero-scan' (ZS) X-ray computed tomography (CT) based polymer gel dosimeter (PGD) readout with adaptive mean (AM) filtering for improving the signal-to-noise ratio (SNR), and compared these results with available average scan (AS) X-ray CT readout techniques. NIPAM PGD were manufactured, irradiated with 6 MV photons, CT imaged and processed in Matlab. The AM filter, run for two iterations with 3 × 3 and 5 × 5 pixel kernel sizes, was used in two scenarios: (a) the CT images were subjected to AM filtering (pre-processing) and these were further employed to generate AS and ZS gel images, and (b) the AS and ZS images were first reconstructed from the CT images and then AM filtering was carried out (post-processing). SNR was computed in an ROI of 30 × 30 pixels for the different pre- and post-processing cases. Results showed that the ZS technique combined with AM filtering resulted in improved SNR. Using the previously recommended 25 images for reconstruction, the ZS pre-processed protocol can give an increase of 44% and 80% in SNR for 3 × 3 and 5 × 5 kernel sizes, respectively. However, post-processing using both techniques and filter sizes introduced blur and a reduction in spatial resolution. Based on this work, the ZS method can be recommended in combination with pre-processing AM filtering, using an appropriate kernel size, to produce a large increase in the SNR of the reconstructed PGD images.
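
    A common form of adaptive mean filter (the Lee filter) replaces each pixel with a blend of itself and the local mean, weighted by how much the local variance exceeds an assumed noise variance — smoothing flat regions strongly while leaving edges nearly untouched. A numpy-only sketch, with kernel size and noise variance as illustrative parameters rather than the study's settings:

```python
import numpy as np

def box_mean(img, k):
    """Local k-by-k mean with reflected borders."""
    p = k // 2
    pad = np.pad(img, p, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def adaptive_mean(img, noise_var, k=3):
    """Lee-style filter: out = m + max(0, 1 - noise_var/local_var) * (img - m),
    i.e. full smoothing where the patch looks like pure noise, none at strong edges."""
    img = img.astype(float)
    m = box_mean(img, k)
    local_var = box_mean(img * img, k) - m * m
    gain = np.clip(1.0 - noise_var / np.maximum(local_var, 1e-12), 0.0, 1.0)
    return m + gain * (img - m)
```

    This edge-sensitivity is why pre-processing the individual CT scans with an AM filter can raise SNR without the blur that the post-processing protocol introduced.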

  9. Ultrasound melted polymer sleeve for improved screw anchorage in trabecular bone--A novel screw augmentation technique.

    PubMed

    Schmoelz, W; Mayr, R; Schlottig, F; Ivanovic, N; Hörmann, R; Goldhahn, J

    2016-03-01

    Screw anchorage in osteoporotic bone is still limited and makes treatment of osteoporotic fractures challenging for surgeons. Conventional screws fail in poor bone quality due to loosening at the screw-bone interface. A new technology should help to improve this interface. In a novel constant amelioration process technique, a polymer sleeve is melted by ultrasound in the predrilled screw hole prior to screw insertion. The purpose of this study was to investigate in vitro the effect of the constant amelioration process platform technology on primary screw anchorage. Fresh frozen femoral heads (n=6) and vertebrae (n=6) were used to measure the maximum screw insertion torque of reference and constant amelioration process augmented screws. Specimens were cut in the cranio-caudal direction, and the screws (reference and constant amelioration process) were implanted in predrilled holes in the trabecular structure on both sides of the cross section. This allowed pairwise comparison of insertion torque for constant amelioration process and reference screws (femoral heads n=18, vertebrae n=12). Prior to screw insertion, a micro-CT scan was made to ensure comparable bone quality at the screw placement location. The mean insertion torque for the constant amelioration process augmented screws in both the femoral heads (44.2 Ncm, SD 14.7) and the vertebral bodies (13.5 Ncm, SD 6.3) was significantly higher than for the reference screws in the femoral heads (31.7 Ncm, SD 9.6, p<0.001) and the vertebral bodies (7.1 Ncm, SD 4.5, p<0.001). The interconnection of the melted polymer sleeve with the surrounding trabecular bone in the constant amelioration process technique resulted in a higher screw insertion torque and can improve screw anchorage in osteoporotic trabecular bone. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Curbing variations in packaging process through Six Sigma way in a large-scale food-processing industry

    NASA Astrophysics Data System (ADS)

    Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar

    2015-03-01

    Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, little has been published regarding experience with Six Sigma in the food-processing industries. This paper is an effort to exemplify the application of a Six Sigma quality improvement drive in one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the chronic problems, variation in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit from the application of the Six Sigma methodology.
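    In the measure and analyze phases of DMAIC, pouch-weight variation would typically be summarized with process capability indices. A minimal sketch follows; the spec limits and weights are hypothetical, not taken from the paper.

    ```python
    import statistics

    def process_capability(weights, lsl, usl):
        """Cp and Cpk for a filling process against lower/upper spec limits.

        Cp  = spec width / (6 sigma): potential capability if centered.
        Cpk = distance from mean to nearest spec limit / (3 sigma).
        """
        mu = statistics.fmean(weights)
        sigma = statistics.stdev(weights)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    # Hypothetical pouch weights (g) against a 499-501 g specification.
    cp, cpk = process_capability(
        [500.2, 499.8, 500.1, 499.9, 500.0, 500.3, 499.7], lsl=499.0, usl=501.0
    )
    ```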

  11. Mapping Diffuse Seismicity Using Empirical Matched Field Processing Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Templeton, D C; Harris, D B

    The objective of this project is to detect and locate more microearthquakes using the empirical matched field processing (MFP) method than can be detected using conventional earthquake detection techniques alone. We propose that empirical MFP can complement existing catalogs and techniques. We test our method on continuous seismic data collected at the Salton Sea Geothermal Field during November 2009 and January 2010. In the Southern California Earthquake Data Center (SCEDC) earthquake catalog, 619 events were identified in our study area during this time frame, whereas our MFP technique identified 1094 events. We therefore believe that the empirical MFP method, combined with conventional methods, significantly improves network detection ability in an efficient manner.
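    Empirical MFP builds detectors from master-event observations across an array; a heavily simplified single-channel analogue is a normalized correlation detector slid along continuous data. The template, threshold, and synthetic trace below are illustrative only, not the project's actual processing.

    ```python
    import numpy as np

    def detect(trace, template, threshold=0.7):
        # Slide a master-event template along continuous data and flag sample
        # indices where the normalized correlation exceeds the threshold.
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        hits = []
        for i in range(len(trace) - n + 1):
            win = trace[i:i + n]
            s = win.std()
            if s == 0:
                continue
            cc = float(np.dot(t, (win - win.mean()) / s))  # in [-1, 1]
            if cc >= threshold:
                hits.append(i)
        return hits
    ```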

  12. Hybrid scatterometry measurement for BEOL process control

    NASA Astrophysics Data System (ADS)

    Timoney, Padraig; Vaid, Alok; Kang, Byeong Cheol; Liu, Haibo; Isbester, Paul; Cheng, Marjorie; Ng-Emans, Susan; Yellai, Naren; Sendelbach, Matt; Koret, Roy; Gedalia, Oram

    2017-03-01

    Scaling of interconnect design rules in advanced nodes has been accompanied by a shrinking metrology budget for BEOL process control. Traditional inline optical metrology measurements of BEOL processes rely on 1-dimensional (1D) film pads to characterize film thickness. Such pads are designed on the assumption that solid copper blocks from previous metallization layers prevent any light from penetrating through the copper, thus simplifying the effective film stack for the 1D optical model. However, the reduction of the copper thickness in each metallization layer, together with CMP dishing effects within the pad, has introduced undesired noise into the measurement. To resolve this challenge and to measure structures that are more representative of product, scatterometry has been proposed as an alternative measurement. Scatterometry is a diffraction-based optical measurement technique using Rigorous Coupled Wave Analysis (RCWA), in which light diffracted from a periodic structure is used to characterize the profile. Scatterometry measurements on 3D structures have been shown to demonstrate strong correlation to electrical resistance parameters for BEOL etch and CMP processes. However, there is significant modeling complexity in such 3D scatterometry models, in particular due to the complexity of front-end-of-line (FEOL) and middle-of-line (MOL) structures. The accompanying measurement noise associated with such structures can contribute significant measurement error. To address the measurement noise of the 3D structures and the impact of incoming process variation, a hybrid scatterometry technique is proposed that utilizes key information from the structure to significantly reduce the measurement uncertainty of the scatterometry measurement. Hybrid metrology combines measurements from two or more metrology techniques to enable or improve the measurement of a critical parameter.
In this work, the hybrid scatterometry technique is evaluated for 7nm and 14nm node BEOL measurements of interlayer dielectric (ILD) thickness, hard mask thickness and dielectric trench etch in complex 3D structures. The data obtained from the hybrid scatterometry technique demonstrate stable measurement precision; improved within-wafer and wafer-to-wafer range; robustness in cases where 3D scatterometry measurements incur undesired shifts; accuracy as compared to TEM; and correlation to process deposition time. Process capability indicator comparisons also demonstrate improvement over conventional scatterometry measurements. The results validate the suitability of the method for monitoring of production BEOL processes.
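    The simplest form of hybrid metrology, combining two independent measurements of the same parameter, is inverse-variance weighting: the fused uncertainty is never worse than that of the best individual technique. The paper's hybridization is more sophisticated; the sketch and numbers below are illustrative only.

    ```python
    def hybrid_estimate(values, variances):
        # Inverse-variance weighted fusion of the same parameter measured by
        # two or more techniques; returns (fused estimate, fused variance).
        weights = [1.0 / v for v in variances]
        wsum = sum(weights)
        est = sum(w * x for w, x in zip(weights, values)) / wsum
        return est, 1.0 / wsum

    # Hypothetical: scatterometry and a reference technique measure the same
    # ILD thickness (nm) with different uncertainties.
    est, var = hybrid_estimate([10.0, 10.4], [0.04, 0.16])
    ```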

  13. Use of Adaptive Digital Signal Processing to Improve Speech Communication for Normally Hearing and Hearing-Impaired Subjects.

    ERIC Educational Resources Information Center

    Harris, Richard W.; And Others

    1988-01-01

    A two-microphone adaptive digital noise cancellation technique improved word-recognition ability for 20 normal and 12 hearing-impaired adults by reducing multitalker speech babble and speech spectrum noise 18-22 dB. Word recognition improvements averaged 37-50 percent for normal and 27-40 percent for hearing-impaired subjects. Improvement was best…
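    Two-microphone adaptive noise cancellation is classically implemented with an LMS-adapted FIR filter: the reference microphone observes the noise alone, and the filter's prediction of that noise is subtracted from the primary channel. A minimal sketch follows; the step size and tap count are assumptions, not values from the study.

    ```python
    import numpy as np

    def lms_cancel(primary, reference, mu=0.01, taps=8):
        # LMS adaptive noise canceller: an FIR filter on the reference (noise)
        # channel predicts the noise in the primary channel; the prediction
        # error e is the cleaned signal estimate.
        w = np.zeros(taps)
        out = np.zeros(len(primary))
        for n in range(taps - 1, len(primary)):
            x = reference[n - taps + 1:n + 1][::-1]  # most recent sample first
            e = primary[n] - w @ x                   # error = cleaned estimate
            w += 2 * mu * e * x                      # LMS weight update
            out[n] = e
        return out
    ```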

  14. Overall equipment efficiency of Flexographic Printing process: A case study

    NASA Astrophysics Data System (ADS)

    Zahoor, S.; Shehzad, A.; Mufti, NA; Zahoor, Z.; Saeed, U.

    2017-12-01

    This paper reports the efficiency improvement of a flexographic printing machine by reducing breakdown time with the help of a total productive maintenance measure called overall equipment efficiency (OEE). The methodology comprises calculating the OEE of the machine before and after identifying the causes of the problems. A Pareto diagram is used to prioritize the main problem areas, and the 5-whys analysis approach is used to identify their root causes. The OEE of the process improved from 34% to 40.2% over a 30-day period. It is concluded that OEE and 5-whys analysis are useful techniques for improving the effectiveness of equipment and for continuous process improvement.
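    OEE is conventionally computed as the product of availability, performance, and quality rates derived from shift data. A sketch with hypothetical shift numbers (not from the case study):

    ```python
    def oee(planned_min, downtime_min, ideal_cycle_min, total_units, good_units):
        # Overall equipment efficiency = availability x performance x quality.
        run = planned_min - downtime_min
        availability = run / planned_min                      # uptime fraction
        performance = (ideal_cycle_min * total_units) / run   # speed fraction
        quality = good_units / total_units                    # yield fraction
        return availability * performance * quality

    # Hypothetical 480-minute shift: 96 min down, 1 min ideal cycle,
    # 320 units produced of which 288 were good.
    shift_oee = oee(480, 96, 1.0, 320, 288)
    ```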

  15. Development of fire-resistant, low smoke generating, thermally stable end items for commercial aircraft and spacecraft using a basic polyimide resin

    NASA Technical Reports Server (NTRS)

    Gagliani, J.; Lee, R.; Sorathia, U. A.; Wilcoxson, A. L.

    1980-01-01

    A terpolyimide precursor was developed which can be foamed by microwave methods and yields foams possessing the best seating properties. A continuous process, based on spray drying techniques, permits production of polyimide powder precursors in large quantities. The constrained rise foaming process permits fabrication of rigid foam panels with improved mechanical properties and almost unlimited density characteristics. Polyimide foam core rigid panels were produced by this technique with woven fiberglass fabric bonded to each side of the panel in a one step microwave process. The fire resistance of polyimide foams was significantly improved by the addition of ceramic fibers to the powder precursors. Foams produced from these compositions are flexible, possess good acoustical attenuation and meet the minimum burnthrough requirements when impinged by high flux flame sources.

  16. Improving Word Learning in Children Using an Errorless Technique

    ERIC Educational Resources Information Center

    Warmington, Meesha; Hitch, Graham J.; Gathercole, Susan E.

    2013-01-01

    The current experiment examined the relative advantage of an errorless learning technique over an errorful one in the acquisition of novel names for unfamiliar objects in typically developing children aged between 7 and 9 years. Errorless learning led to significantly better learning than did errorful learning. Processing speed and vocabulary…

  17. Medical School Admissions: The Insider's Guide.

    ERIC Educational Resources Information Center

    Zebala, John A.; Jones, Daniel B.

    A handbook on the medical school admissions process is presented, offering a first hand account of what works. Six chapters discuss the following topics and subtopics: (1) premedical preparation (planning undergraduate study and picking the right college); (2) power techniques for higher grades (techniques for grade point success, improving grades…

  18. ELECTRODIALYSIS AS A TECHNIQUE FOR EXTENDING ELECTROLESS NICKEL BATH LIFE-IMPROVING SELECTIVITY AND REDUCING LOSSES OF VALUABLE BATH COMPONENTS

    EPA Science Inventory

    Over the last decade electrodialysis has emerged as an effective technique for removing accumulated reactant counterions (sodium and sulfate) and reaction products (orthophosphite) that interfere with the electroless nickel plating process, thus extending bath life by up to 50 me...

  19. Comparative evaluation of ibuprofen/beta-cyclodextrin complexes obtained by supercritical carbon dioxide and other conventional methods.

    PubMed

    Hussein, Khaled; Türk, Michael; Wahl, Martin A

    2007-03-01

    The preparation of drug/cyclodextrin complexes is a suitable method to improve the dissolution of poorly soluble drugs. The efficacy of Controlled Particle Deposition (CPD), a newly developed method to prepare these complexes in a single-stage process using supercritical carbon dioxide, was therefore compared with other conventional methods. Ibuprofen/beta-cyclodextrin complexes were prepared with different techniques and characterized using FTIR-ATR spectroscopy, powder X-ray diffractometry (PXRD), differential scanning calorimetry (DSC) and scanning electron microscopy (SEM). In addition, the influence of the processing technique on the drug content (HPLC) and the dissolution behavior was studied. The CPD process resulted in a drug content of 2.8 ± 0.22 wt.% in the carrier. The material obtained by CPD showed an improved dissolution rate of ibuprofen at pH 5 compared with the pure drug and its physical mixture with beta-cyclodextrin. In addition, the CPD material displayed the highest dissolution (93.5 ± 2.89% after 75 min) compared with material obtained by co-precipitation (61.3 ± 0.52%) or freeze-drying (90.6 ± 2.54%). This study presents the CPD technique as a well-suited method to prepare a drug/beta-cyclodextrin complex with improved drug dissolution compared with the pure drug and materials obtained by other methods.

  20. Study of Profile Changes during Mechanical Polishing using Relocation Profilometry

    NASA Astrophysics Data System (ADS)

    Kumaran, S. Chidambara; Shunmugam, M. S.

    2017-10-01

    Mechanical polishing is a finishing process conventionally practiced to enhance surface quality; the finish is improved by the mechanical cutting action of abrasive particles on the work surface. Polishing is complex in nature, and research efforts have focused on understanding the polishing mechanism. Studying changes in a profile is a useful way of understanding the behavior of the polishing process, but such a study requires tracing the same profile at regular process intervals, which is a tedious job. An innovative relocation technique is followed in the present work to study profile changes during mechanical polishing of an austenitic stainless steel specimen. Using a special locating fixture, a micro-indentation mark and a cross-correlation technique, the same profile is traced at set process intervals. Comparison of different profile parameters shows the manner in which metal removal takes place in the polishing process. Mass removal during the process, estimated by the same relocation technique, is checked against that obtained by weight measurement. The proposed approach can be extended to other micro/nano finishing processes, and favorable process conditions can be identified.
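    Relocating the same trace by cross-correlation, as the abstract describes, amounts to finding the lag that maximizes the correlation between two profiles. A minimal 1-D sketch (the profile data are synthetic):

    ```python
    import numpy as np

    def relocate(profile_a, profile_b):
        # Estimate the lateral shift between two traces of the same profile by
        # full cross-correlation; the returned shift is how far profile_b must
        # be moved to the right to align with profile_a.
        a = profile_a - profile_a.mean()
        b = profile_b - profile_b.mean()
        cc = np.correlate(a, b, mode="full")
        return int(np.argmax(cc)) - (len(b) - 1)
    ```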

  1. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
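    Value-improvement for a discounted infinite-horizon Markov decision process can be sketched in a few lines. The two-state, two-action process below is hypothetical, not the mallard model from the paper.

    ```python
    import numpy as np

    # Hypothetical two-state, two-action MDP: P[a] holds transition
    # probabilities P(s' | s, a); R[a] holds expected immediate rewards.
    P = {
        0: np.array([[0.9, 0.1], [0.4, 0.6]]),
        1: np.array([[0.2, 0.8], [0.1, 0.9]]),
    }
    R = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 2.0])}
    gamma = 0.9  # discount factor

    # Value iteration: repeatedly apply the Bellman optimality operator.
    V = np.zeros(2)
    for _ in range(500):
        V = np.max([R[a] + gamma * P[a] @ V for a in (0, 1)], axis=0)

    # Greedy policy with respect to the converged value function.
    policy = np.argmax([R[a] + gamma * P[a] @ V for a in (0, 1)], axis=0)
    ```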

  2. Demonstration of emulator-based Bayesian calibration of safety analysis codes: Theory and formulation

    DOE PAGES

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-05-28

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
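    The FFGP model in the paper is more elaborate, but the core idea of a GP emulator, fitting a Gaussian process to a modest number of code runs and predicting cheaply elsewhere, can be sketched as follows. The kernel choice, length scale, and jitter are assumptions for illustration.

    ```python
    import numpy as np

    def rbf(a, b, length=1.0, var=1.0):
        """Squared-exponential kernel between two 1-D input arrays."""
        d = a[:, None] - b[None, :]
        return var * np.exp(-0.5 * (d / length) ** 2)

    class GPEmulator:
        # Minimal 1-D Gaussian-process emulator: train on (x, y) pairs from an
        # expensive code, then predict the posterior mean at new inputs.
        def __init__(self, x, y, noise=1e-6):
            self.x = x
            K = rbf(x, x) + noise * np.eye(len(x))  # jitter for stability
            L = np.linalg.cholesky(K)
            self.alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

        def predict(self, xs):
            return rbf(xs, self.x) @ self.alpha
    ```

    In an MCMC calibration loop, `predict` would replace each expensive code evaluation inside the likelihood, which is what makes the sampling feasible.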

  3. Applications notice. [application of space techniques to earth resources, environment management, and space processing

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The discipline programs of the Space and Terrestrial (S&T) Applications Program are described and examples of research areas of current interest are given. Application of space techniques to improve conditions on earth are summarized. Discipline programs discussed include: resource observations; environmental observations; communications; materials processing in space; and applications systems/information systems. Format information on submission of unsolicited proposals for research related to the S&T Applications Program are given.

  4. Impact of a poka-yoke device on job performance of individuals with cognitive impairments.

    PubMed

    Erlandson, R F; Noblett, M J; Phelps, J A

    1998-09-01

    Job performance and production related issues are important not only to successful vocational training and ultimate job placement for individuals with cognitive disabilities, but also for their ability to have expanded vocational options. This study hypothesized that the application of Kaizen philosophy, and poka-yoke techniques in particular, could create job opportunities and improve productivity of individuals with cognitive disabilities. Poka-yoke or error-proofing techniques are part of the collection of Kaizen techniques. Kaizen refers to continuous improvement in performance, cost/effectiveness, and quality. Kaizen strives to empower the worker, increase worker satisfaction, facilitate a sense of accomplishment, and thereby create pride-of-work. These techniques typically reduce the physical and cognitive demands of a task and thereby render the task more accessible. The job was a fuel clamp assembly. A redesigned assembly fixture was the poka-yoke intervention. Consistent with poka-yoke principles, the intervention improved the productivity of everyone attempting the assembly. In particular, the workers in this study showed an 80% increase in productivity and an average percent error drop from 52% to about 1% after the process redesign. Furthermore, the workers showed improved morale, self-esteem, and pride-of-work. Prior to the process redesign, only the higher functioning workers could successfully perform the assembly. After the redesign a greater number of workers could successfully perform the assembly. These results not only validated the study hypothesis, but demonstrated that the success facilitated by applying Kaizen techniques had similar results with individuals with cognitive disabilities as with nondisabled workers.

  5. Communication and cooperation in underwater acoustic networks

    NASA Astrophysics Data System (ADS)

    Yerramalli, Srinivas

    In this thesis, we present a study of several problems related to underwater point-to-point communications and network formation. We explore techniques to improve the achievable data rate on a point-to-point link using better physical layer techniques, and then study sensor cooperation, which improves the throughput and reliability of an underwater network. Robust point-to-point communication has become increasingly critical in several military and civilian applications of underwater networks. We present several physical layer signaling and detection techniques tailored to the underwater channel model to improve the reliability of data detection. First, we consider a simplified underwater channel model in which the time-scale distortion on each path is assumed to be the same (a single-scale channel model, in contrast to the more general multi-scale model). A novel technique called Partial FFT Demodulation, which exploits the nature of OFDM signaling and the time-scale distortion, is derived. This technique is observed to have some unique interference suppression properties and to perform better than traditional equalizers in several scenarios of interest. Next, we consider the multi-scale model for the underwater channel and assume that single-scale processing is performed at the receiver. We then derive optimized front-end pre-processing techniques to reduce the interference caused by single-scale processing of signals transmitted on a multi-scale channel. We further propose an improved channel estimation technique using dictionary optimization methods for compressive sensing and show that significant performance gains can be obtained with it. In the next part of this thesis, we consider the problem of cooperation among rational sensor nodes whose objective is to improve their individual data rates.
We first consider transmitter cooperation in a multiple access channel and investigate the stability of the grand coalition of transmitters using tools from cooperative game theory, showing that the grand coalition is stable in both the asymptotic regimes of high and low SNR. Toward studying receiver cooperation in a broadcast channel, we propose a game-theoretic model of the broadcast channel, derive a game-theoretic duality between the multiple access and broadcast channels, and show how the equilibria of the broadcast channel are related to those of the multiple access channel and vice versa.

  6. Blob-enhanced reconstruction technique

    NASA Astrophysics Data System (ADS)

    Castrillo, Giusy; Cafiero, Gioacchino; Discetti, Stefano; Astarita, Tommaso

    2016-09-01

    A method to enhance the quality of the tomographic reconstruction and, consequently, the 3D velocity measurement accuracy, is presented. The technique is based on integrating information on the objects to be reconstructed within the algebraic reconstruction process. A first guess intensity distribution is produced with a standard algebraic method, then the distribution is rebuilt as a sum of Gaussian blobs, based on location, intensity and size of agglomerates of light intensity surrounding local maxima. The blobs substitution regularizes the particle shape allowing a reduction of the particles discretization errors and of their elongation in the depth direction. The performances of the blob-enhanced reconstruction technique (BERT) are assessed with a 3D synthetic experiment. The results have been compared with those obtained by applying the standard camera simultaneous multiplicative reconstruction technique (CSMART) to the same volume. Several blob-enhanced reconstruction processes, both substituting the blobs at the end of the CSMART algorithm and during the iterations (i.e. using the blob-enhanced reconstruction as predictor for the following iterations), have been tested. The results confirm the enhancement in the velocity measurements accuracy, demonstrating a reduction of the bias error due to the ghost particles. The improvement is more remarkable at the largest tested seeding densities. Additionally, using the blobs distributions as a predictor enables further improvement of the convergence of the reconstruction algorithm, with the improvement being more considerable when substituting the blobs more than once during the process. The BERT process is also applied to multi resolution (MR) CSMART reconstructions, permitting simultaneously to achieve remarkable improvements in the flow field measurements and to benefit from the reduction in computational time due to the MR approach. 
Finally, BERT is also tested on experimental data, yielding an increase in the signal-to-noise ratio of the reconstructed flow field and a higher correlation factor in the velocity measurements with respect to the volume in which the particles are not replaced by blobs.
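    The blob substitution step can be sketched as locating local intensity maxima and rebuilding the field as a sum of Gaussian blobs. The fixed 2-D blob width and threshold below are simplifications of the method described above, which sizes blobs from the agglomerates of intensity around each maximum.

    ```python
    import numpy as np

    def blob_rebuild(E, sigma=1.0, thresh=0.1):
        # Rebuild a reconstructed intensity field as a sum of Gaussian blobs
        # placed at local maxima (simplified: fixed blob size, 2-D field).
        ny, nx = E.shape
        out = np.zeros_like(E, dtype=float)
        yy, xx = np.mgrid[0:ny, 0:nx]
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                v = E[i, j]
                if v < thresh:
                    continue
                if v == E[i - 1:i + 2, j - 1:j + 2].max():  # local maximum
                    out += v * np.exp(
                        -((yy - i) ** 2 + (xx - j) ** 2) / (2 * sigma ** 2)
                    )
        return out
    ```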

  7. OAO battery data analysis

    NASA Technical Reports Server (NTRS)

    Gaston, S.; Wertheim, M.; Orourke, J. A.

    1973-01-01

    Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.

  8. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  10. Pulsed-neutron imaging by a high-speed camera and center-of-gravity processing

    NASA Astrophysics Data System (ADS)

    Mochiki, K.; Uragaki, T.; Koide, J.; Kushima, Y.; Kawarabayashi, J.; Taketani, A.; Otake, Y.; Matsumoto, Y.; Su, Y.; Hiroi, K.; Shinohara, T.; Kai, T.

    2018-01-01

    Pulsed-neutron imaging is an attractive technique in the research field of energy-resolved neutron radiography; RANS (RIKEN) and RADEN (J-PARC/JAEA) are, respectively, small and large accelerator-driven pulsed-neutron facilities for such imaging. To overcome the insufficient spatial resolution of counting-type imaging detectors such as the μNID, nGEM and pixelated detectors, camera detectors combined with a neutron color image intensifier were investigated. At RANS, a center-of-gravity technique was applied to spot images obtained by a CCD camera, and the technique was confirmed to be effective for improving spatial resolution. At RADEN, a high-frame-rate CMOS camera was used, a super-resolution technique was applied, and the spatial resolution was improved further.
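    Center-of-gravity processing assigns each detected light spot a sub-pixel position from its intensity-weighted centroid. A minimal sketch for one spot image:

    ```python
    import numpy as np

    def centroid(spot):
        # Sub-pixel spot position as the intensity-weighted center of gravity
        # of a small image patch containing one light spot.
        total = spot.sum()
        ys, xs = np.indices(spot.shape)
        return (ys * spot).sum() / total, (xs * spot).sum() / total
    ```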

  11. Rapid Prototyping of Continuous Fiber Reinforced Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, R.; Green, C.; Phillips, T.; Cipriani, R.; Yarlagadda, S.; Gillespie, J.; Effinger, M.; Cooper, K. C.; Gordon, Gail (Technical Monitor)

    2002-01-01

    For ceramics to be used as structural components in high temperature applications, their fracture toughness is improved by embedding continuous ceramic fibers. Ceramic matrix composite (CMC) materials allow increasing the overall operating temperature, raising the temperature safety margins, avoiding the need for cooling, and improving the damping capacity, while reducing the weight at the same time. They also need to be reliable and available in large quantities as well. In this paper, an innovative rapid prototyping technique to fabricate continuous fiber reinforced ceramic matrix composites is described. The process is simple, robust and will be widely applicable to a number of high temperature material systems. This technique was originally developed at the University of Delaware Center for Composite Materials (UD-CCM) for rapid fabrication of polymer matrix composites by a technique called automated tow placement or ATP. The results of mechanical properties and microstructural characterization are presented, together with examples of complex shapes and parts. It is believed that the process will be able to create complex shaped parts at an order of magnitude lower cost than current CVI and PIP processes.

  12. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.
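    A Shewhart individuals chart of the kind used for wait times or LOS flags points beyond three estimated sigmas of the center line, with sigma estimated from the mean moving range. A sketch with fabricated data (the 1.128 bias constant is the standard d2 value for a moving range of two):

    ```python
    import statistics

    def individuals_chart_limits(samples):
        # Shewhart individuals chart: center line +/- 3 sigma, with sigma
        # estimated as mean moving range / 1.128 (d2 for subgroups of two).
        mrbar = statistics.fmean(abs(b - a) for a, b in zip(samples, samples[1:]))
        sigma = mrbar / 1.128
        mu = statistics.fmean(samples)
        return mu - 3 * sigma, mu, mu + 3 * sigma

    def out_of_control(samples):
        """Indices of points falling outside the control limits."""
        lcl, _, ucl = individuals_chart_limits(samples)
        return [i for i, x in enumerate(samples) if x < lcl or x > ucl]
    ```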

  13. Modeling and managing risk early in software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  14. Improved accuracy of markerless motion tracking on bone suppression images: preliminary study for image-guided radiation therapy (IGRT)

    NASA Astrophysics Data System (ADS)

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-05-01

    The bone suppression technique based on advanced image processing can suppress the conspicuity of bones on chest radiographs, creating images similar to the soft-tissue images obtained by the dual-energy subtraction technique. This study was performed to evaluate the usefulness of bone suppression image processing in image-guided radiation therapy. We demonstrated the improved accuracy of markerless motion tracking on bone suppression images. Chest fluoroscopic images of nine patients with lung nodules during respiration were obtained using a flat-panel detector system (120 kV, 0.1 mAs/pulse, 5 fps). Commercial bone suppression image processing software was applied to the fluoroscopic images to create corresponding bone suppression images. Regions of interest were manually located on lung nodules and automatic target tracking was conducted based on the template matching technique. To evaluate the accuracy of target tracking, the maximum tracking error in the resulting images was compared with that of conventional fluoroscopic images. The tracking errors were decreased by half in eight of nine cases. The average maximum tracking errors in bone suppression and conventional fluoroscopic images were 1.3 ± 1.0 and 3.3 ± 3.3 mm, respectively. The bone suppression technique was especially effective in the lower lung area where pulmonary vessels, bronchi, and ribs showed complex movements. The bone suppression technique improved tracking accuracy without special equipment or implantation of fiducial markers, and with only a small additional dose to the patient. Bone suppression fluoroscopy is thus a potential means of measuring respiratory displacement of the target. This paper was presented at RSNA 2013 and was carried out at Kanazawa University, JAPAN.
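
    The template matching step can be sketched as a normalized cross-correlation search, a minimal stand-in for what tracking software of this kind does; the synthetic frame, the "nodule" position, and all sizes below are invented.

```python
import numpy as np

def track_target(frame, template):
    """Locate `template` in `frame` by normalized cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(frame.shape[0] - th + 1):
        for j in range(frame.shape[1] - tw + 1):
            patch = frame[i:i + th, j:j + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum())
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score

# Synthetic frame with a bright "nodule" at row 12, column 20.
frame = np.random.default_rng(1).normal(0, 0.1, size=(40, 50))
frame[12:17, 20:25] += 1.0
template = frame[12:17, 20:25].copy()

pos, score = track_target(frame, template)
print(pos, round(score, 3))
```

    Bone suppression helps precisely because it removes high-contrast rib structure that would otherwise produce competing correlation peaks in searches like this one.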

  15. Improvement on Gabor order tracking and objective comparison with Vold Kalman filtering order tracking

    NASA Astrophysics Data System (ADS)

    Pan, Min-Chun; Liao, Shiu-Wei; Chiu, Chun-Chin

    2007-02-01

    The waveform-reconstruction schemes of order tracking (OT), such as the Gabor and the Vold-Kalman filtering (VKF) techniques, can extract specific order and/or spectral components in addition to characterizing the processed signal in the rpm-frequency domain. The study first improves the Gabor OT (GOT) technique to handle the order-crossing problem, and then objectively compares the features of the improved GOT scheme and the angular-displacement VKF OT technique. It is numerically observed that the improved method performs less accurately than the VKF_OT scheme at the crossing occurrences, but without the end effect in the reconstructed waveform. As OT is not an exact science, the decrease in computation time may well justify the reduced accuracy. The characterisation and discrimination of riding noise with crossing orders emitted by an electrical scooter is conducted as an example application.
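
    As background for readers unfamiliar with OT: the basic idea can be illustrated with classical computed order tracking (angular resampling followed by an FFT), which is not the Gabor or Vold-Kalman scheme of the paper but shows what "an order" is. All signal parameters below are invented.

```python
import numpy as np

fs = 2000.0
t = np.arange(0, 4.0, 1.0 / fs)

# Run-up: shaft speed rises linearly from 600 to 3000 rpm.
rpm = 600.0 + 600.0 * t
shaft_rev = np.cumsum(rpm / 60.0) / fs        # cumulative shaft revolutions
signal = np.sin(2 * np.pi * 2.0 * shaft_rev)  # a pure order-2 component

# Computed order tracking: resample the time signal at uniform increments of
# shaft angle, so each order maps to a fixed "frequency" in cycles/revolution.
samples_per_rev = 32
rev_uniform = np.arange(0.0, shaft_rev[-1], 1.0 / samples_per_rev)
resampled = np.interp(rev_uniform, shaft_rev, signal)

spectrum = np.abs(np.fft.rfft(resampled))
orders = np.fft.rfftfreq(len(resampled), d=1.0 / samples_per_rev)
dominant = orders[spectrum.argmax()]
print(f"dominant order = {dominant:.2f}")
```

    In the time-frequency domain this order-2 component sweeps upward with rpm; in the angle domain it sits at a fixed order, which is what makes crossing orders separable in principle.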

  16. On the improvement of wave and storm surge hindcasts by downscaled atmospheric forcing: application to historical storms

    NASA Astrophysics Data System (ADS)

    Bresson, Émilie; Arbogast, Philippe; Aouf, Lotfi; Paradis, Denis; Kortcheva, Anna; Bogatchev, Andrey; Galabov, Vasko; Dimitrova, Marieta; Morvan, Guillaume; Ohl, Patrick; Tsenova, Boryana; Rabier, Florence

    2018-04-01

    Winds, waves and storm surges can inflict severe damage in coastal areas. In order to improve preparedness for such events, a better understanding of storm-induced coastal flooding episodes is necessary. To this end, this paper highlights the use of atmospheric downscaling techniques in order to improve wave and storm surge hindcasts. The downscaling techniques used here are based on existing European Centre for Medium-Range Weather Forecasts reanalyses (ERA-20C, ERA-40 and ERA-Interim). The results show that the 10 km resolution data forcing provided by a downscaled atmospheric model gives a better wave and surge hindcast compared to using data directly from the reanalysis. Furthermore, the analysis of the most extreme mid-latitude cyclones indicates that a four-dimensional blending approach improves the whole process, as it assimilates more small-scale processes in the initial conditions. Our approach has been successfully applied to ERA-20C (the 20th century reanalysis).

  17. Processing, quality and safety of irradiation - and high pressure processed meat and seafood products

    USDA-ARS?s Scientific Manuscript database

    In the past two decades, worldwide demands for meat and seafood products have increased dramatically due to the improved economical condition in many countries. To meet the demand, the producers have increased the production of meat and seafood products as well as applied new processing techniques t...

  18. Electromagnetic modelling, inversion and data-processing techniques for GPR: ongoing activities in Working Group 3 of COST Action TU1208

    NASA Astrophysics Data System (ADS)

    Pajewski, Lara; Giannopoulos, Antonis; van der Kruk, Jan

    2015-04-01

    This work aims at presenting the ongoing research activities carried out in Working Group 3 (WG3) 'EM methods for near-field scattering problems by buried structures; data processing techniques' of the COST (European COoperation in Science and Technology) Action TU1208 'Civil Engineering Applications of Ground Penetrating Radar' (www.GPRadar.eu). The principal goal of the COST Action TU1208 is to exchange and increase scientific-technical knowledge and experience of GPR techniques in civil engineering, while promoting throughout Europe the effective use of this safe and non-destructive technique in the monitoring of infrastructures and structures. WG3 is structured in four Projects. Project 3.1 deals with 'Electromagnetic modelling for GPR applications.' Project 3.2 is concerned with 'Inversion and imaging techniques for GPR applications.' The topic of Project 3.3 is the 'Development of intrinsic models for describing near-field antenna effects, including antenna-medium coupling, for improved radar data processing using full-wave inversion.' Project 3.4 focuses on 'Advanced GPR data-processing algorithms.' Electromagnetic modelling tools being developed and improved include the Finite-Difference Time-Domain (FDTD) technique and the spectral-domain Cylindrical-Wave Approach (CWA). GprMax, a well-known and versatile freeware FDTD simulator, enables a realistic representation of the soil/material hosting the sought structures and of the GPR antennas; input/output tools are being developed to ease the definition of scenarios and the visualisation of numerical results. The CWA expresses the field scattered by subsurface two-dimensional targets with arbitrary cross-section as a sum of cylindrical waves. In this way, the multiple scattering of fields within the medium hosting the sought targets is taken into account. Recently, the method has been extended to deal with through-the-wall scenarios. 
One of the inversion techniques currently being improved is Full-Waveform Inversion (FWI) for on-ground, off-ground, and crosshole GPR configurations. In contrast to conventional inversion tools, which are often based on approximations and use only part of the available data, FWI uses the complete measured data and detailed modelling tools to obtain an improved estimate of the medium properties. During the first year of the Action, information was collected and shared about the state of the art of the available modelling, imaging, inversion, and data-processing methods. Advancements achieved by WG3 Members were presented during the TU1208 Second General Meeting (April 30 - May 2, 2014, Vienna, Austria) and the 15th International Conference on Ground Penetrating Radar (June 30 - July 4, 2014, Brussels, Belgium). Currently, a database of numerical and experimental GPR responses from natural and manmade structures is being designed. A geometrical and physical description of the scenarios, together with the available synthetic and experimental data, will be at the disposal of the scientific community. Researchers will thus have a further opportunity of testing and validating, against reliable data, their electromagnetic forward- and inverse-scattering techniques, imaging methods and data-processing algorithms. The motivation to start this database came out of TU1208 meetings and takes inspiration from successful past initiatives carried out in different areas, such as the Ipswich and Fresnel databases in the field of free-space electromagnetic scattering, and the Marmousi database in seismic science. Acknowledgement: The authors thank COST for funding the Action TU1208 'Civil Engineering Applications of Ground Penetrating Radar.'
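
    The FDTD technique mentioned above can be illustrated with a minimal free-space 1-D sketch on a staggered (Yee) grid in normalized units. This is only a toy: full GPR simulators such as gprMax add material models, realistic sources and antennas, and absorbing boundaries. The grid size and source parameters below are arbitrary.

```python
import numpy as np

# Minimal 1-D FDTD sketch: E and H live on a staggered grid and are updated
# alternately from each other's spatial differences (normalized units, at the
# Courant limit, so the pulse travels one cell per time step).
nz, nt = 400, 300
ez = np.zeros(nz)
hy = np.zeros(nz - 1)

for n in range(nt):
    hy += np.diff(ez)                            # update H from the curl of E
    ez[1:-1] += np.diff(hy)                      # update E from the curl of H
    ez[50] += np.exp(-((n - 30.0) / 10.0) ** 2)  # soft Gaussian source at cell 50

peak = int(np.argmax(np.abs(ez)))
print(f"pulse peak now at cell {peak}")
```

    After 300 steps the injected pulse has propagated well away from the source cell, which is the behavior a forward model must reproduce before any inversion can be attempted.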

  19. The study of crystals for space processing and the effect of zero-gravity

    NASA Technical Reports Server (NTRS)

    Lal, R. B.

    1977-01-01

    The mechanism of crystal growth from solution was studied, along with how it is affected by the space environment. It was investigated how space processing methods can be used to improve promising candidate materials for different devices.

  20. Neural networks for dimensionality reduction of fluorescence spectra and prediction of drinking water disinfection by-products.

    PubMed

    Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C

    2018-06-01

    The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity. Copyright © 2018 Elsevier Ltd. All rights reserved.
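
    As a sketch of the dimensionality-reduction step, the following reproduces the PCA baseline the paper compares against (not the autoencoder) on synthetic stand-in data: flattened "EEMs" built from two latent fluorophore-like spectra. All sizes and the two-component structure are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for fluorescence EEMs: 100 samples, each a flattened
# excitation-emission matrix generated from two latent "fluorophore" spectra.
n_samples, n_features, n_latent = 100, 300, 2
components_true = rng.normal(size=(n_latent, n_features))
scores_true = rng.normal(size=(n_samples, n_latent))
X = scores_true @ components_true + 0.01 * rng.normal(size=(n_samples, n_features))

# PCA by SVD of the centred data matrix: the k leading components give the
# low-dimensional representation fed to a downstream regression model.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
scores = U[:, :k] * s[:k]                 # reduced representation (n_samples x k)
X_hat = scores @ Vt[:k] + X.mean(axis=0)  # reconstruction from k components

explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"variance explained by {k} components: {explained:.3f}")
```

    An autoencoder replaces the linear projection above with a learned nonlinear encoder/decoder pair, which is what the paper credits with mitigating overfitting relative to regression on the full spectrum.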

  1. Control Improvement for Jump-Diffusion Processes with Applications to Finance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baeuerle, Nicole, E-mail: nicole.baeuerle@kit.edu; Rieder, Ulrich, E-mail: ulrich.rieder@uni-ulm.de

    2012-02-15

    We consider stochastic control problems with jump-diffusion processes and formulate an algorithm which produces, starting from a given admissible control π, a new control with a better value. If no improvement is possible, then π is optimal. Such an algorithm is well known for discrete-time Markov Decision Problems under the name of Howard's policy improvement algorithm. The idea can be traced back to Bellman. Here we show, with the help of martingale techniques, that such an algorithm can also be formulated for stochastic control problems with jump-diffusion processes. As an application we derive some interesting results in financial portfolio optimization.
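
    Howard's policy improvement, in the discrete-time setting the abstract refers to, can be sketched on a toy MDP: evaluate the current policy exactly, then act greedily with respect to its value function, and stop when no improvement is possible. The transition probabilities and rewards are invented; the paper's jump-diffusion setting is far more involved.

```python
import numpy as np

# Tiny MDP: states 0..2, actions 0..1; P[a, s, s'] transition probabilities,
# R[s, a] one-step rewards; action 1 drifts toward the rewarding state 2.
P = np.array([
    [[0.9, 0.1, 0.0], [0.1, 0.8, 0.1], [0.0, 0.1, 0.9]],   # action 0
    [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.0, 0.0, 1.0]],   # action 1
])
R = np.array([[0.0, 0.0], [0.0, 0.0], [1.0, 2.0]])
gamma = 0.9
n_states = 3

policy = np.zeros(n_states, dtype=int)   # start from an arbitrary admissible policy
while True:
    # Policy evaluation: solve (I - gamma * P_pi) v = r_pi exactly.
    P_pi = P[policy, np.arange(n_states)]
    r_pi = R[np.arange(n_states), policy]
    v = np.linalg.solve(np.eye(n_states) - gamma * P_pi, r_pi)
    # Policy improvement: act greedily with respect to v.
    q = R.T + gamma * P @ v              # q[a, s]
    improved = q.argmax(axis=0)
    if np.array_equal(improved, policy):
        break                            # no improvement possible: policy is optimal
    policy = improved

print("optimal policy:", policy, "values:", np.round(v, 2))
```

    The stopping criterion is exactly the optimality statement in the abstract: if greedy improvement returns the same control, that control is optimal.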

  2. Multi-frame image processing with panning cameras and moving subjects

    NASA Astrophysics Data System (ADS)

    Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric

    2014-06-01

    Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition to this, we evaluated algorithm efficacy with demonstrated benefits using field test video, which has been processed using our commercially available surveillance product. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.

  3. Improved compression molding technology for continuous fiber reinforced composite laminates. Part 2: AS-4/Polyimidesulfone prepreg system

    NASA Technical Reports Server (NTRS)

    Baucom, Robert M.; Hou, Tan-Hung; Kidder, Paul W.; Reddy, Rakasi M.

    1991-01-01

    AS-4/polyimidesulfone (PISO2) composite prepreg was utilized for the improved compression molding technology investigation. This improved technique employs molding stops, which advantageously facilitate the escape of volatile by-products during the B-stage curing step and effectively minimize the neutralization of the consolidating pressure by intimate interply fiber-fiber contact within the laminate during the subsequent molding cycle. Without modifying the resin matrix properties, composite panels with both unidirectional and angled plies, with outstanding C-scans and mechanical properties, were successfully molded under moderate molding conditions (660 F and 500 psi) using this technique. The panels molded were up to 6.00 x 6.00 x 0.07 in. A consolidation theory was proposed for the understanding and advancement of the processing science. Processing parameters such as vacuum, pressure cycle design, and prepreg quality were explored.

  4. Does Nursing Facility Use of Habilitation Therapy Improve Performance on Quality Measures?

    PubMed

    Fitzler, Sandra; Raia, Paul; Buckley, Fredrick O; Wang, Mei

    2016-12-01

    The purpose of the project, a Centers for Medicare & Medicaid Services (CMS) Innovation study, was to evaluate the impact of using habilitation therapy techniques and a behavior team to manage dementia-related behaviors on 12 quality measures, including 10 Minimum Data Set (MDS) publicly reported measures and 2 nursing home process measures. A prospective design was used to assess changes in the measures. A total of 30 Massachusetts nursing homes participated in the project over a 12-month period. Project participation required the creation of an interdisciplinary behavior team, habilitation therapy training, a facility visit by the program coordinator, attendance at bimonthly support and sharing calls, and monthly collection of process measure data. Participating facilities showed improvement in 9 of the 12 reported measures. Findings indicate potential quality improvement in having nursing homes learn habilitation therapy techniques and know how to use the interdisciplinary team to manage problem behaviors. © The Author(s) 2016.

  5. TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook

    DTIC Science & Technology

    1989-08-01

    This document describes the techniques used to support and guide the Special Process Action Review Committees for accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management; Statistical processes; Management planning and control; Management training; Management information systems.

  6. Coater/developer based techniques to improve high-resolution EUV patterning defectivity

    NASA Astrophysics Data System (ADS)

    Hontake, Koichi; Huli, Lior; Lemley, Corey; Hetzer, Dave; Liu, Eric; Ko, Akiteru; Kawakami, Shinichiro; Shimoaoki, Takeshi; Hashimoto, Yusaku; Tanaka, Koichiro; Petrillo, Karen; Meli, Luciana; De Silva, Anuja; Xu, Yongan; Felix, Nelson; Johnson, Richard; Murray, Cody; Hubbard, Alex

    2017-10-01

    Extreme ultraviolet lithography (EUVL) technology is one of the leading candidates under consideration for enabling the next generation of devices, for 7nm node and beyond. As the focus shifts to driving down the 'effective' k1 factor and enabling the full scaling entitlement of EUV patterning, new techniques and methods must be developed to reduce the overall defectivity, mitigate pattern collapse, and eliminate film-related defects. In addition, CD uniformity and LWR/LER must be improved in terms of patterning performance. Tokyo Electron Limited (TEL™) and IBM Corporation are continuously developing manufacturing quality processes for EUV. In this paper, we review the ongoing progress in coater/developer based processes (coating, developing, baking) that are required to enable EUV patterning.

  7. Improving medium-range ensemble streamflow forecasts through statistical post-processing

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.
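
    One of the listed techniques, quantile mapping, can be sketched as follows: replace each raw forecast value by the observed value at the same climatological quantile. The synthetic flows, biases, and distributions are invented, and operational post-processors are considerably more careful about sampling, cross-validation, and tail extrapolation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hindcast: raw ensemble streamflow forecasts with a wet bias and
# an offset, plus matching observations (illustrative data only).
obs = rng.gamma(shape=2.0, scale=50.0, size=2000)
fcst = 1.3 * rng.gamma(shape=2.0, scale=50.0, size=2000) + 20.0

def quantile_map(x, fcst_clim, obs_clim):
    """Empirical quantile mapping: send each forecast value to the observed
    value at the same climatological quantile."""
    ranks = np.searchsorted(np.sort(fcst_clim), x) / len(fcst_clim)
    return np.quantile(obs_clim, np.clip(ranks, 0.0, 1.0))

corrected = quantile_map(fcst, fcst, obs)
print(f"obs mean {obs.mean():.1f}, raw {fcst.mean():.1f}, "
      f"corrected {corrected.mean():.1f}")
```

    Quantile mapping corrects the marginal distribution (bias and spread) but not the rank structure of the ensemble, which is why the study compares it against regression-based post-processors as well.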

  8. Improved hybridization of Fuzzy Analytic Hierarchy Process (FAHP) algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW)

    NASA Astrophysics Data System (ADS)

    Zaiwani, B. E.; Zarlis, M.; Efendi, S.

    2018-03-01

    This research improves on the hybridization of the Fuzzy Analytic Hierarchy Process (FAHP) algorithm with the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS), previously used to select the best bank chief inspector based on several qualitative and quantitative criteria with various priorities. To improve on that work, a hybridization of the FAHP algorithm with the Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) algorithm was adopted, applying FAHP to the weighting process and SAW to the ranking process to determine employee promotion at a government institution. The average Efficiency Rate (ER) improved to 85.24%, compared with 77.82% in the previous research. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.
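
    The SAW ranking step can be sketched as a normalized weighted sum over a decision matrix. The alternatives, criteria, and weights below are illustrative stand-ins; in the paper the weights would come from the FAHP step rather than being fixed by hand.

```python
import numpy as np

# Simple Additive Weighting (SAW) ranking sketch.
alternatives = ["A", "B", "C"]
# Decision matrix: rows = alternatives, columns = criteria.
X = np.array([
    [70.0, 80.0, 3.0],
    [90.0, 60.0, 5.0],
    [80.0, 75.0, 4.0],
])
weights = np.array([0.5, 0.3, 0.2])       # stand-in for FAHP-derived weights
benefit = np.array([True, True, False])   # third criterion is a cost

# Normalize: benefit criteria by x / max, cost criteria by min / x,
# so every entry lies in (0, 1] with 1 being best.
N = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
scores = N @ weights
ranking = [alternatives[i] for i in np.argsort(scores)[::-1]]
print(dict(zip(alternatives, np.round(scores, 3))), "ranking:", ranking)
```

    The appeal of SAW over FTOPSIS in this hybrid is its simplicity: once weights are fixed, ranking is a single matrix-vector product.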

  9. Propagation-based x-ray phase contrast imaging using an iterative phase diversity technique

    NASA Astrophysics Data System (ADS)

    Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.

    2018-03-01

    Through the use of a phase diversity technique, we demonstrate a near-field in-line x-ray phase contrast algorithm that provides improved object reconstruction when compared to our previous iterative methods for a homogeneous sample. Like our previous methods, the new technique uses the sample refractive index distribution during the reconstruction process. The technique complements existing monochromatic and polychromatic methods and is useful in situations where experimental phase contrast data is affected by noise.

  10. Sixth Annual Flight Mechanics/Estimation Theory Symposium

    NASA Technical Reports Server (NTRS)

    Lefferts, E. (Editor)

    1981-01-01

    Methods of orbital position estimation were reviewed. The problem of accuracy in orbital mechanics is discussed, and various techniques in current use are presented along with suggested improvements. Of special interest is the compensation for bias in satellite-borne instruments due to attitude instabilities. Image processing and correction techniques for geodetic measurements and mapping are reported.

  11. Statistical Techniques for Efficient Indexing and Retrieval of Document Images

    ERIC Educational Resources Information Center

    Bhardwaj, Anurag

    2010-01-01

    We have developed statistical techniques to improve the performance of document image search systems where the intermediate step of OCR based transcription is not used. Previous research in this area has largely focused on challenges pertaining to generation of small lexicons for processing handwritten documents and enhancement of poor quality…

  12. [Low-energy wideband electromagnetic radiation and manual therapy in the treatment of neurological manifestations of spinal osteochondrosis].

    PubMed

    Afoshin, S A; Gerasimenko, M Iu

    2006-01-01

    It is shown that the advanced technique of low-energy wideband electromagnetic radiation improves vascular tonicity and peripheral circulation while a modified technique of manual therapy facilitates movements in the affected part of the spine and reduces tonicity of the muscles involved in the pathological process.

  13. Improved associative recall of binary data in volume holographic memories

    NASA Astrophysics Data System (ADS)

    Betzos, George A.; Laisné, Alexandre; Mitkas, Pericles A.

    1999-11-01

    A new technique is presented that improves the results of associative recall in a volume holographic memory system. A background is added to the normal search argument to increase the amount of optical power that is used to reconstruct the reference beams in the crystal. This is combined with post-processing of the captured image of the reference beams. The use of both the background and post-processing greatly improves the results by allowing associative recall using small arguments. In addition, the number of false hits is reduced and misses are virtually eliminated.

  14. Interactive Management and Updating of Spatial Data Bases

    NASA Technical Reports Server (NTRS)

    French, P.; Taylor, M.

    1982-01-01

    The decision making process, whether for power plant siting, load forecasting or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by implementing techniques that permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer aided planning (CAP) programs and the selection of a predominant data structure can improve the decision making process is discussed.

  15. Computer image processing: Geologic applications

    NASA Technical Reports Server (NTRS)

    Abrams, M. J.

    1978-01-01

    Computer image processing of digital data was performed to support several geological studies. The specific goals were to: (1) relate the mineral content to the spectral reflectance of certain geologic materials, (2) determine the influence of environmental factors, such as atmosphere and vegetation, and (3) improve image processing techniques. For detection of spectral differences related to mineralogy, the technique of band ratioing was found to be the most useful. The influence of atmospheric scattering and methods to correct for it were also studied. Two techniques were used to correct for atmospheric effects: (1) dark object subtraction, and (2) normalization by use of ground spectral measurements. Of the two, the first proved the more successful at removing the effects of atmospheric scattering. A digital mosaic was produced from two side-lapping LANDSAT frames. The advantages were that the same enhancement algorithm could be applied to both frames and there is no seam where the two images are joined.
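
    Dark object subtraction followed by band ratioing can be sketched on a synthetic two-band scene. Everything here is invented for illustration: the true reflectances, the per-band additive "path radiance" offsets, and the fixed mineralogical band ratio of 2 that the correction should recover.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-band scene: true surface reflectance plus an additive
# atmospheric-scattering offset that differs per band.
reflect_b1 = rng.uniform(0.1, 0.5, size=(64, 64))
reflect_b2 = reflect_b1 * 2.0              # mineral signal: fixed band ratio of 2
band1 = reflect_b1 + 0.08                  # path-radiance offsets (invented)
band2 = reflect_b2 + 0.02

# Dark object subtraction: the darkest pixel is assumed to be near zero
# reflectance, so its value estimates the additive scattering in each band.
b1_corr = band1 - band1.min()
b2_corr = band2 - band2.min()

# Band ratioing after correction recovers the mineralogical ratio and
# suppresses multiplicative illumination differences.
ratio = b2_corr / (b1_corr + 1e-9)
print(f"median band ratio: {np.median(ratio):.2f}")
```

    Without the subtraction step the additive offsets would bias the ratio differently for bright and dark pixels, which is exactly the failure mode dark object subtraction addresses.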

  16. A diagnostic technique used to obtain cross range radiation centers from antenna patterns

    NASA Technical Reports Server (NTRS)

    Lee, T. H.; Burnside, W. D.

    1988-01-01

    A diagnostic technique to obtain cross range radiation centers based on antenna radiation patterns is presented. This method is similar to the synthetic aperture processing of scattered fields in radar applications. Coherent processing of the radiated fields is used to determine the various radiation centers associated with the far-zone pattern of an antenna for a given radiation direction. This technique can be used to identify an unexpected radiation center that creates an undesired effect in a pattern; on the other hand, it can improve a numerical simulation of the pattern by identifying other significant mechanisms. Cross range results for two 8' reflector antennas are presented to illustrate as well as validate the technique.

  17. Improving Our Odds: Success through Continuous Risk Management

    NASA Technical Reports Server (NTRS)

    Greenhalgh, Phillip O.

    2009-01-01

    Launching a rocket, running a business, driving to work and even day-to-day living all involve some degree of risk. Risk is ever present yet not always recognized, adequately assessed and appropriately mitigated. Identification, assessment and mitigation of risk are elements of the risk management component of the "continuous improvement" way of life that has become a hallmark of successful and progressive enterprises. While the application of risk management techniques to provide continuous improvement may be detailed and extensive, the philosophy, ideals and tools can be beneficially applied to all situations. Experiences with the use of risk identification, assessment and mitigation techniques for complex systems and processes are described. System safety efforts and tools used to examine potential risks of the Ares I First Stage of NASA's new Constellation Crew Launch Vehicle (CLV), presently being designed, are noted as examples. Recommendations from lessons learned are provided for the application of risk management during the development of new systems as well as for the improvement of existing systems. Lessons learned and suggestions given are also examined for applicability to simple systems, uncomplicated processes and routine personal daily tasks. This paper informs the reader of varied uses of risk management efforts and techniques to identify, assess and mitigate risk for improvement of products, success of business, protection of people and enhancement of personal life.

  18. Proton conducting ceramics in membrane separations

    DOEpatents

    Brinkman, Kyle S; Korinko, Paul S; Fox, Elise B; Chen, Frank

    2015-04-14

    Perovskite materials of the general formulas SrCeO3 and BaCeO3 are provided having improved conductivity, while maintaining the original ratio of chemical constituents, by altering the microstructure of the material. A process of making perovskite materials is also provided, in which wet chemical techniques are used to fabricate nanocrystalline ceramic materials that have improved grain size and allow lower-temperature densification than is obtainable with conventional solid-state reaction processing.

  19. Integrative techniques related to positive processes in psychotherapy.

    PubMed

    Cromer, Thomas D

    2013-09-01

    This review compiles and evaluates a number of therapist interventions that have been found to contribute significantly to positive psychotherapy processes (i.e., increased alliance, patient engagement/satisfaction, and symptomatic improvement). Four forms of intervention are presented: Affect-focused, Supportive, Exploratory, and Patient-Therapist Interaction. The intention of this review is to link specific interventions to applied practice so that integrative clinicians can potentially use these techniques to improve their clinical work. To this end, theory and empirical studies are included from a range of orientations: Emotionally Focused, Psychodynamic, Client-Centered, Cognitive-Behavioral, Interpersonal, Eclectic, and Motivational Interviewing. Each of the four sections includes the theoretical basis and proposed mechanism of change for the intervention, presents research supporting its positive impact on psychotherapy processes, and concludes with examples demonstrating its use in actual practice. Clinical implications and considerations regarding the use of these interventions are also presented. © 2013 APA, all rights reserved

  20. Soft Systems Methodology

    NASA Astrophysics Data System (ADS)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach, covering SSM's specific techniques, the learning-cycle process of the methodology, and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts and techniques, with the core tenets described through a wide range of settings.

  1. A hybrid, auto-adaptive and rule-based multi-agent approach using evolutionary algorithms for improved searching

    NASA Astrophysics Data System (ADS)

    Izquierdo, Joaquín; Montalvo, Idel; Campbell, Enrique; Pérez-García, Rafael

    2016-08-01

    Selecting the most appropriate heuristic for solving a specific problem is not easy, for many reasons. This article focuses on one of these reasons: traditionally, the solution search process has operated in a given manner regardless of the specific problem being solved, and the process has been the same regardless of the size, complexity and domain of the problem. To cope with this situation, search processes should mould the search into areas of the search space that are meaningful for the problem. This article builds on previous work in the development of a multi-agent paradigm using techniques derived from knowledge discovery (data-mining techniques) on databases of so-far visited solutions. The aim is to improve the search mechanisms, increase computational efficiency and use rules to enrich the formulation of optimization problems, while reducing the search space and catering to realistic problems.

  2. Induction of engineered residual stresses fields and enhancement of fatigue life of high reliability metallic components by laser shock processing

    NASA Astrophysics Data System (ADS)

    Ocaña, J. L.; Porro, J. A.; Díaz, M.; Ruiz de Lara, L.; Correa, C.; Gil-Santos, A.; Peral, D.

    2013-02-01

    Laser shock processing (LSP) is increasingly applied as an effective technology for improving the mechanical and surface properties of metallic materials in different types of components, as a means of enhancing their corrosion and fatigue life behaviour. As reported in previous contributions by the authors, a main effect resulting from the application of the LSP technique is the generation of a relatively deep compressive residual stress field in metallic alloy pieces, allowing improved mechanical behaviour, specifically longer life of the treated specimens against wear, crack growth and stress corrosion cracking. Additional results accomplished by the authors in the practical development of the LSP technique at an experimental level (aiming at its integral assessment from an interrelated theoretical and experimental point of view) are presented in this paper. Concretely, follow-on experimental results on the residual stress profiles and associated surface property modifications successfully reached in typical materials (especially Al and Ti alloys characteristic of high-reliability components in the aerospace, nuclear and biomedical sectors) under different LSP irradiation conditions are presented, along with a correlated analysis of the protective character of the residual stress profiles obtained under different irradiation strategies. Additional remarks on the advantage of the LSP technique over the traditional "shot peening" technique in terms of the depth of the induced compressive residual stress fields are also made throughout the paper.

  3. Fast optically sectioned fluorescence HiLo endomicroscopy

    PubMed Central

    Lim, Daryl; Mertz, Jerome

    2012-01-01

    We describe a nonscanning, fiber bundle endomicroscope that performs optically sectioned fluorescence imaging with fast frame rates and real-time processing. Our sectioning technique is based on HiLo imaging, wherein two widefield images are acquired under uniform and structured illumination and numerically processed to reject out-of-focus background. This work is an improvement upon an earlier demonstration of widefield optical sectioning through a flexible fiber bundle. The improved device features lateral and axial resolutions of 2.6 and 17 μm, respectively, a net frame rate of 9.5 Hz obtained by real-time image processing with a graphics processing unit (GPU), and significantly reduced motion artifacts obtained by the use of a double-shutter camera. We demonstrate the performance of our system with optically sectioned images and videos of a fluorescently labeled chorioallantoic membrane (CAM) in the developing G. gallus embryo. HiLo endomicroscopy is a candidate technique for low-cost, high-speed clinical optical biopsies. PMID:22463023
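    The HiLo fusion step, low spatial frequencies taken from the structured-illumination data and high frequencies from the uniform image, can be sketched in a few lines. The following is a greatly simplified numpy-only illustration: the published algorithm estimates the local contrast of the structured image and calibrates the mixing weight, whereas here a plain difference image and a fixed weight stand in, and the test images are invented.

```python
import numpy as np

def lowpass(img, sigma):
    """Gaussian low-pass filter applied in the Fourier domain."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    H = np.exp(-2.0 * (np.pi * sigma)**2 * (fx**2 + fy**2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

def hilo(uniform, structured, sigma=2.0, eta=1.0):
    """Simplified HiLo fusion: sectioned low frequencies from the
    difference image, high frequencies from the uniform image."""
    diff = np.abs(uniform - structured)       # in-focus content retains the pattern
    lo = lowpass(diff, sigma)                 # sectioned low-frequency content
    hi = uniform - lowpass(uniform, sigma)    # high-pass of the uniform image
    return eta * lo + hi

# Invented test images: a noisy uniform frame and a fringe-modulated copy
rng = np.random.default_rng(5)
uniform = rng.normal(1.0, 0.1, (64, 64))
pattern = 0.5 * (1.0 + np.cos(2.0 * np.pi * np.arange(64) / 8.0))[None, :]
structured = uniform * pattern
fused = hilo(uniform, structured)
```

In the real instrument this per-pixel arithmetic is what makes GPU processing at video rate straightforward: both passes are pointwise operations plus two FFT-based filters.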

  4. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    Condition Monitoring Systems (CMS) offer substantial potential economic benefits and enable prognostic maintenance for preventing wind turbine-generator failures. Vibration monitoring and analysis is a powerful tool in drive-train CMS, enabling the early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient for diagnosing machine faults under time-varying conditions. The current research trend in CMS for drive trains focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed extensively. In this paper, an attempt is made to review the recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.

  5. Improving post-stroke cognitive and behavioral abnormalities by using virtual reality: A case report on a novel use of nirvana.

    PubMed

    De Luca, Rosaria; Torrisi, Michele; Piccolo, Adriana; Bonfiglio, Giovanni; Tomasello, Provvidenza; Naro, Antonino; Calabrò, Rocco Salvatore

    2017-10-11

    Cognitive impairment, as well as mood and anxiety disorders, occurs frequently in patients following stroke. The aim of this study was to evaluate the effects of a combined rehabilitative treatment using conventional relaxation and respiratory techniques in a specific rehabilitative virtual environment (using Bts-Nirvana). A 58-year-old woman, affected by hemorrhagic stroke, underwent two different rehabilitation trainings: either standard relaxation techniques alone in a common clinical setting, or the same psychological approach in a semi-immersive virtual environment with augmented sensorial (audio-video) and motor feedback (sensory-motor interaction). We evaluated the patient's cognitive and psychological profile before and after the two trainings using a specific psychometric battery aimed at assessing cognitive status and attention processes and at estimating the presence of mood alterations and anxiety and the use of coping strategies. Only at the end of the combined approach did we observe a significant improvement in attention and memory functions, with nearly complete relief of anxiety symptoms and an improvement in coping strategies. Relaxation and respiratory techniques in a semi-immersive virtual reality environment, using Bts-Nirvana, may be a promising tool for improving attention processes, coping strategies, and anxiety in individuals with neurological disorders, including stroke.

  6. Comparison of Sequential and Variational Data Assimilation

    NASA Astrophysics Data System (ADS)

    Alvarado Montero, Rodolfo; Schwanenberg, Dirk; Weerts, Albrecht

    2017-04-01

    Data assimilation is a valuable tool to improve model state estimates by combining measured observations with model simulations. It has recently gained significant attention due to its potential for using remote sensing products to improve operational hydrological forecasts and for reanalysis purposes. This has been supported by the application of sequential techniques such as the Ensemble Kalman Filter, which require no additional features within the modeling process, i.e. they can use arbitrary black-box models. Alternatively, variational techniques rely on optimization algorithms to minimize a pre-defined objective function. This function describes the trade-off between the amount of noise introduced into the system and the mismatch between simulated and observed variables. While sequential techniques have been commonly applied to hydrological processes, variational techniques are seldom used. We believe this is mainly attributable to the required computation of first-order sensitivities by algorithmic differentiation techniques and related model enhancements, but also to a lack of comparison between the two techniques. We contribute to filling this gap and present the results from the assimilation of streamflow data in two basins located in Germany and Canada. The assimilation introduces noise to precipitation and temperature to produce better initial estimates for an HBV model. The results are computed for a hindcast period and assessed using lead-time performance metrics. The study concludes with a discussion of the main features of each technique and their advantages and disadvantages in hydrological applications.
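    Of the two families compared above, the sequential one is the easier to sketch: the Ensemble Kalman Filter analysis step needs only the ensemble of model states, which is the black-box property the abstract notes. Below is a minimal numpy illustration with an invented scalar toy state (the HBV application in the study is far richer); the perturbed-observations variant is shown.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_std, H):
    """One Ensemble Kalman Filter analysis step (perturbed observations).
    ensemble: (n_state, n_members); H: linear observation operator."""
    n_obs = H.shape[0]
    n_mem = ensemble.shape[1]
    rng = np.random.default_rng(42)
    X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HX = H @ ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)             # predicted-obs anomalies
    Pxy = X @ HXp.T / (n_mem - 1)                         # state/obs covariance
    Pyy = HXp @ HXp.T / (n_mem - 1) + obs_err_std**2 * np.eye(n_obs)
    K = Pxy @ np.linalg.inv(Pyy)                          # Kalman gain
    # Perturb the observation per member so the analysis spread is consistent
    obs_pert = obs[:, None] + rng.normal(0.0, obs_err_std, (n_obs, n_mem))
    return ensemble + K @ (obs_pert - HX)

# Toy problem: a scalar state observed directly
rng = np.random.default_rng(0)
truth = np.array([2.0])
H = np.eye(1)
prior = rng.normal(0.0, 1.0, (1, 100))        # prior ensemble centred at 0
obs = truth + rng.normal(0.0, 0.1, 1)         # one noisy observation
posterior = enkf_update(prior, obs, 0.1, H)
```

With the prior centred at 0 and a precise observation near the truth (2.0), the analysis ensemble shifts almost entirely to the observation and its spread collapses toward the observation error; note the model itself never appears in the update, only its ensemble.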

  7. Denoising time-domain induced polarisation data using wavelet techniques

    NASA Astrophysics Data System (ADS)

    Deo, Ravin N.; Cull, James P.

    2016-05-01

    Time-domain induced polarisation (TDIP) methods are routinely used for near-surface evaluations in quasi-urban environments harbouring networks of buried civil infrastructure. A conventional technique for improving signal to noise ratio in such environments is by using analogue or digital low-pass filtering followed by stacking and rectification. However, this induces large distortions in the processed data. In this study, we have conducted the first application of wavelet based denoising techniques for processing raw TDIP data. Our investigation included laboratory and field measurements to better understand the advantages and limitations of this technique. It was found that distortions arising from conventional filtering can be significantly avoided with the use of wavelet based denoising techniques. With recent advances in full-waveform acquisition and analysis, incorporation of wavelet denoising techniques can further enhance surveying capabilities. In this work, we present the rationale for utilising wavelet denoising methods and discuss some important implications, which can positively influence TDIP methods.
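    The core of wavelet denoising, transform, threshold the detail coefficients, invert, can be sketched in a few lines. The example below is a numpy-only illustration using a single-level Haar transform and soft thresholding on a synthetic decay curve standing in for a TDIP transient; the signal, noise level and threshold are all invented, and practical work would use a deeper decomposition with a data-driven threshold (e.g. via a library such as PyWavelets).

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet denoising (len(x) must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    # Soft-threshold the detail coefficients, where broadband noise lives
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    # Inverse Haar transform
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
clean = np.exp(-3.0 * t)                    # idealised IP decay curve
noisy = clean + rng.normal(0.0, 0.05, t.size)
denoised = haar_denoise(noisy, thresh=0.1)
```

Unlike the low-pass filtering the abstract criticises, thresholding acts only on small coefficients, so sharp features of the transient that produce large detail coefficients survive largely undistorted.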

  8. Postharvest processes of edible insects in Africa: A review of processing methods, and the implications for nutrition, safety and new products development.

    PubMed

    Mutungi, C; Irungu, F G; Nduko, J; Mutua, F; Affognon, H; Nakimbugwe, D; Ekesi, S; Fiaboe, K K M

    2017-08-30

    In many African cultures, insects are part of the diet of humans and domesticated animals. Compared to conventional food and feed sources, insects have been associated with a low ecological foot print because fewer natural resources are required for their production. To this end, the Food and Agriculture Organization of the United Nations recognized the role that edible insects can play in improving global food and nutrition security; processing technologies, as well as packaging and storage techniques that improve shelf-life were identified as being crucial. However, knowledge of these aspects in light of nutritional value, safety, and functionality is fragmentary and needs to be consolidated. This review attempts to contribute to this effort by evaluating the available evidence on postharvest processes for edible insects in Africa, with the aim of identifying areas that need research impetus. It further draws attention to potential postharvest technology options for overcoming hurdles associated with utilization of insects for food and feed. A greater research thrust is needed in processing and this can build on traditional knowledge. The focus should be to establish optimal techniques that improve presentation, quality and safety of products, and open possibilities to diversify use of edible insects for other benefits.

  9. An Evaluation of Understandability of Patient Journey Models in Mental Health

    PubMed Central

    2016-01-01

    Background There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. Objectives This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Method Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. Results The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. Conclusions The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers. PMID:27471006

  10. Microbial bioinformatics for food safety and production

    PubMed Central

    Alkema, Wynand; Boekhorst, Jos; Wels, Michiel

    2016-01-01

    In the production of fermented foods, microbes play an important role. Optimization of fermentation processes or starter culture production traditionally was a trial-and-error approach inspired by expert knowledge of the fermentation process. Current developments in high-throughput ‘omics’ technologies allow developing more rational approaches to improve fermentation processes both from the food functionality as well as from the food safety perspective. Here, the authors thematically review typical bioinformatics techniques and approaches to improve various aspects of the microbial production of fermented food products and food safety. PMID:26082168

  11. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Harry C.; Fang, Ho T.

    1991-01-01

    The results of a four-year program to improve the strength and reliability of injection-molded silicon nitride are summarized. Statistically designed processing experiments were performed to identify and optimize critical processing parameters and compositions. Process improvements were monitored by strength testing at room and elevated temperatures, and by microstructural characterization using optical and scanning electron microscopy and scanning transmission electron microscopy. Processing modifications resulted in a 20 percent strength improvement and a 72 percent Weibull slope improvement over the baseline material. Additional sintering-aid screening and optimization experiments succeeded in developing a new composition (GN-10) capable of 581.2 MPa at 1399 °C. A SiC whisker-toughened composite using this material as a matrix achieved a room temperature toughness of 6.9 MPa·m^0.5 by the chevron-notched bar technique. Exploratory experiments were conducted on injection molding of turbocharger rotors.

  12. Inductively and capacitively coupled plasmas at interface: A comparative study towards highly efficient amorphous-crystalline Si solar cells

    NASA Astrophysics Data System (ADS)

    Guo, Yingnan; Ong, Thiam Min Brian; Levchenko, I.; Xu, Shuyan

    2018-01-01

    A comparative study of the application of two quite different plasma-based techniques to the preparation of amorphous/crystalline silicon (a-Si:H/c-Si) interfaces for solar cells is presented. The interfaces were fabricated and processed by hydrogen plasma treatment using conventional plasma-enhanced chemical vapour deposition (PECVD) and inductively coupled plasma chemical vapour deposition (ICP-CVD). The influence of processing temperature, radio-frequency power, treatment duration and other parameters on interface properties and the degree of surface passivation was studied. It was found that passivation could be improved by post-deposition treatment using both ICP-CVD and PECVD, but PECVD treatment is more efficient at improving passivation quality: the minority carrier lifetime increased from 1.65 × 10^-4 s to 2.25 × 10^-4 s and 3.35 × 10^-4 s after hydrogen plasma treatment by ICP-CVD and PECVD, respectively. In addition to the improvement of carrier lifetimes at low temperatures, low RF powers and short processing times, both techniques are efficient in band gap adjustment at sophisticated interfaces.

  13. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
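    The kind of Markov reliability model such tools construct can be illustrated by hand for a two-component parallel system. In the sketch below the generator matrix Q is written out manually, which is exactly the tedious, error-prone step the paper proposes to automate from a design description; the failure rate is an assumed figure.

```python
import numpy as np

# Two-component parallel system, constant failure rate lam per component.
# States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
lam = 1e-3                                   # assumed failures per hour
Q = np.array([[-2 * lam, 2 * lam, 0.0],      # CTMC generator matrix
              [0.0,      -lam,    lam],
              [0.0,      0.0,     0.0]])

def reliability(t, steps=10000):
    """P(system not failed by t), integrating dp/dt = p @ Q with Euler steps."""
    p = np.array([1.0, 0.0, 0.0])            # start with both units up
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p[0] + p[1]

R = reliability(1000.0)   # close to the analytic 2*exp(-lam*t) - exp(-2*lam*t)
```

For this three-state chain the closed form is easy to check against; in realistic fault-occurrence models with dozens of states, generating Q automatically is the bottleneck the paper identifies.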

  14. Measuring bio-oil upgrade intermediates and corrosive species with polarity-matched analytical approaches

    DOE PAGES

    Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...

    2014-10-03

    Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.

  15. Methods for Improving Information from ’Undesigned’ Human Factors Experiments.

    DTIC Science & Technology

    Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction

  16. Work Integration of People with Disabilities in the Regular Labour Market: What Can We Do to Improve These Processes?

    ERIC Educational Resources Information Center

    Vila, Montserrat; Pallisera, Maria; Fullana, Judit

    2007-01-01

    Background: It is important to ensure that regular processes of labour market integration are available for all citizens. Method: Thematic content analysis techniques, using semi-structured group interviews, were used to identify the principal elements contributing to the processes of integrating people with disabilities into the regular labour…

  17. Demography of Principals' Work and School Improvement: Content Validity of Kentucky's Standards and Indicators for School Improvement (SISI)

    ERIC Educational Resources Information Center

    Lindle, Jane Clark; Stalion, Nancy; Young, Lu

    2005-01-01

    Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…

  18. Leveraging Code Comments to Improve Software Reliability

    ERIC Educational Resources Information Center

    Tan, Lin

    2009-01-01

    Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…

  19. Guiding and Modelling Quality Improvement in Higher Education Institutions

    ERIC Educational Resources Information Center

    Little, Daniel

    2015-01-01

    The article considers the process of creating quality improvement in higher education institutions from the point of view of current organisational theory and social-science modelling techniques. The author considers the higher education institution as a functioning complex of rules, norms and other organisational features and reviews the social…

  20. Continuous Quality Improvement: Making the Transition to Education.

    ERIC Educational Resources Information Center

    Hubbard, Dean L., Ed.

    This book is a collection of case studies by 27 educational and industrial leaders describing the implementation of specific Total Quality Management techniques which have demonstrated their value. Essays and their authors are as follows: "Process Improvements Using Team Environments" (Scot M. Faulkner); "Team Effectiveness" (Robert S. Winter);…

  1. Improvement of sub-20nm pattern quality with dose modulation technique for NIL template production

    NASA Astrophysics Data System (ADS)

    Yagawa, Keisuke; Ugajin, Kunihiro; Suenaga, Machiko; Kanamitsu, Shingo; Motokawa, Takeharu; Hagihara, Kazuki; Arisawa, Yukiyasu; Kobayashi, Sachiko; Saito, Masato; Ito, Masamitsu

    2016-04-01

    Nanoimprint lithography (NIL) is in the spotlight as a next-generation semiconductor manufacturing technique for integrated circuits at 22 nm and beyond. NIL is an unmagnified lithography technique that uses templates replicated from master templates, which are currently fabricated by electron-beam (EB) lithography [1]. In the near future, finer patterns of less than 15 nm will be required on master templates, and EB data volume will increase exponentially, so we are confronted with a difficult challenge: a higher-resolution EB mask writer and a high-performance fabrication process will be required. In our previous study, we investigated the potential of the photomask fabrication process for finer patterning and achieved a 15.5 nm line-and-space (L/S) pattern on a template by using a variable-shaped-beam (VSB) EB mask writer and a chemically amplified resist. However, we found that contrast loss caused by backscattering degrades finer-patterning performance. For semiconductor device manufacturing, we must fabricate complicated patterns that include high and low density simultaneously, not only consecutive L/S patterns, so it is quite important to develop a technique to form patterns of various sizes and coverages all at once. In this study, a small-feature pattern was experimentally formed on a master template with a dose modulation technique, which makes it possible to apply the appropriate exposure dose for each pattern size. As a result, we succeeded in improving finer-patterning performance in the bright-field area. These results show that the current EB lithography process has the potential to fabricate NIL templates.

  2. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 fluid bed granulation batches at industrial scale was monitored in-line using microwave resonance technology (MRT) to determine the moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was revealed that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS model that a quantitative relation between particle size and the MRT measurements can be defined, highlighting the potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in by process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
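    Multiway PCA of the kind applied here reduces to ordinary PCA once the three-way batch array (batches × time × variables) is unfolded batch-wise into a matrix. A minimal numpy sketch on simulated granulation-like data follows; all dimensions, the common trajectory and the noise levels are invented for illustration.

```python
import numpy as np

# Simulated batch data: 30 batches x 50 time points x 3 process variables
rng = np.random.default_rng(7)
n_batches, n_time, n_vars = 30, 50, 3
profile = np.sin(np.linspace(0.0, np.pi, n_time))     # shared trajectory
data = np.empty((n_batches, n_time, n_vars))
for b in range(n_batches):
    scale = 1.0 + 0.1 * rng.normal()                  # batch-to-batch variation
    for v in range(n_vars):
        data[b, :, v] = scale * profile + 0.05 * rng.normal(size=n_time)

# Batch-wise unfolding: each batch becomes one row of length n_time * n_vars
X = data.reshape(n_batches, n_time * n_vars)
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)    # autoscale each column

# Ordinary PCA on the unfolded matrix via SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                                        # one score row per batch
explained = s**2 / np.sum(s**2)                       # variance per component
```

Batches with unusual scores on the leading components are the ones a multivariate batch control chart would flag; in this toy data the first component picks up the common batch-to-batch scale variation.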

  3. Artificial intelligence and signal processing for infrastructure assessment

    NASA Astrophysics Data System (ADS)

    Assaleh, Khaled; Shanableh, Tamer; Yehia, Sherif

    2015-04-01

    Ground Penetrating Radar (GPR) is recognized as an effective nondestructive evaluation technique for improving the inspection process. However, data interpretation and the complexity of the results impose some limitations on the practicality of the technique, mainly because a trained, experienced person is needed to interpret the images obtained by the GPR system. In this paper, an algorithm to classify and assess the condition of infrastructure utilizing image processing and pattern recognition techniques is discussed. Features extracted from a dataset of images of defective and healthy slabs are used to train a computer-vision-based system, while another dataset is used to evaluate the proposed algorithm. Initial results show that the proposed algorithm is able to detect the existence of defects with about a 77% success rate.
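    A classification pipeline of the general shape described, hand-crafted image features feeding a simple classifier, can be sketched on synthetic data. Everything below (the patch generator, the three features, the nearest-centroid rule) is an invented stand-in for the paper's actual computer-vision system.

```python
import numpy as np

rng = np.random.default_rng(3)

def extract_features(image):
    """Toy feature vector from a GPR B-scan patch: mean, std, and energy
    of the strongest row (a reflector raises all three)."""
    return np.array([image.mean(), image.std(), (image**2).sum(axis=1).max()])

def make_patch(defective):
    """Healthy patches are low-amplitude noise; defective patches carry a
    bright localised reflection band (simulated, not real GPR data)."""
    img = rng.normal(0.0, 0.1, (16, 16))
    if defective:
        img[6:9, :] += 0.8                 # simulated strong reflector
    return img

# Train a nearest-centroid classifier on 40 labelled patches
train = [(extract_features(make_patch(d)), d) for d in [0, 1] * 20]
centroids = {c: np.mean([f for f, d in train if d == c], axis=0) for c in (0, 1)}

def classify(image):
    f = extract_features(image)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

# Evaluate on 50 fresh patches, mirroring the paper's train/test split idea
correct = sum(classify(make_patch(d)) == d for d in [0, 1] * 25)
accuracy = correct / 50
```

On this cleanly separable synthetic data the accuracy is near perfect; the 77% reported in the paper reflects the much harder real-world imagery.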

  4. Radar data smoothing filter study

    NASA Technical Reports Server (NTRS)

    White, J. V.

    1984-01-01

    The accuracy of the current Wallops Flight Facility (WFF) data smoothing techniques for a variety of radars and payloads is examined. Alternative data reduction techniques are given and recommendations are made for improving radar data processing at WFF. A data adaptive algorithm, based on Kalman filtering and smoothing techniques, is also developed for estimating payload trajectories above the atmosphere from noisy time varying radar data. This algorithm is tested and verified using radar tracking data from WFF.
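    A data-adaptive algorithm of the kind described, Kalman filtering followed by smoothing, can be sketched for a one-dimensional constant-velocity track. The code below is a numpy-only illustration with invented noise figures, not the WFF algorithm itself; it runs a forward Kalman filter and then a Rauch-Tung-Striebel backward pass.

```python
import numpy as np

def kalman_smooth(z, dt, q, r):
    """Forward Kalman filter + RTS backward smoother for a 1-D
    constant-velocity model; z are noisy position measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
    H = np.array([[1.0, 0.0]])                # observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])       # process noise covariance
    R = np.array([[r]])                       # measurement noise covariance
    n = len(z)
    xf, Pf, xp, Pp = [], [], [], []
    x, P = np.array([z[0], 0.0]), np.eye(2) * 10.0
    for k in range(n):                        # forward filter pass
        xpred, Ppred = F @ x, F @ P @ F.T + Q
        K = Ppred @ H.T @ np.linalg.inv(H @ Ppred @ H.T + R)
        x = xpred + (K @ (z[k] - H @ xpred)).ravel()
        P = (np.eye(2) - K @ H) @ Ppred
        xf.append(x); Pf.append(P); xp.append(xpred); Pp.append(Ppred)
    xs = [None] * n                           # backward (RTS) smoother pass
    xs[-1] = xf[-1]
    for k in range(n - 2, -1, -1):
        C = Pf[k] @ F.T @ np.linalg.inv(Pp[k + 1])
        xs[k] = xf[k] + C @ (xs[k + 1] - xp[k + 1])
    return np.array([s[0] for s in xs])

# Synthetic "radar" track: constant 5 m/s motion plus measurement noise
rng = np.random.default_rng(1)
t = np.arange(100) * 0.1
true_pos = 5.0 * t
meas = true_pos + rng.normal(0.0, 1.0, t.size)
smoothed = kalman_smooth(meas, dt=0.1, q=0.01, r=1.0)
```

Because the backward pass uses future as well as past measurements, the smoothed trajectory has lower error than either the raw data or the forward filter alone, which is the advantage of smoothing for post-flight trajectory reconstruction.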

  5. Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques

    DTIC Science & Technology

    2018-04-30

    Title: Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques. Subject: Monthly progress report. Abstract: The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code and inclusion of additional arctic sea ice data.

  6. Industrial benefits and future expectations in materials and processes resulting from space technology

    NASA Technical Reports Server (NTRS)

    Meyer, J. D.

    1977-01-01

    Space technology transfer is discussed as applied to the field of materials science. Advances made in processing include improved computer techniques, and structural analysis. Technology transfer is shown to have an important impact potential in the overall productivity of the United States.

  7. Latest medical applications of polypropylene.

    PubMed

    Van Lierde, Stijn

    2004-06-01

    PP volumes for use in medical applications increase every year for some obvious reasons. The polymer can be processed by practically all techniques, tolerates common sterilisation processes, and its transparency and other properties are continuously improved. Furthermore, it can replace many other materials, and is attractive from a cost and versatility perspective.

  8. Manufacturing Enhancement through Reduction of Cycle Time using Different Lean Techniques

    NASA Astrophysics Data System (ADS)

    Suganthini Rekha, R.; Periyasamy, P.; Nallusamy, S.

    2017-08-01

    In modern manufacturing systems, the most important parameters on a production line are work in process, TAKT time and line balancing. In this article, lean tools and techniques were implemented to reduce cycle time. The aim is to enhance the productivity of the water pump pipe line by identifying the bottleneck stations and non-value-added activities. From the initial time study the bottleneck processes were identified, along with the necessary expanding processes for each bottleneck. Subsequently, improvement actions were established and implemented using different lean tools such as value stream mapping, 5S and line balancing. The current-state value stream map was developed to describe the existing status and to identify various problem areas. 5S was used to implement the steps that reduce process cycle time and unnecessary movements of workers and material. The improvement activities were implemented with the required suggestions, and the future-state value stream map was developed. From the results it was concluded that the total cycle time was reduced by about 290.41 seconds and that output increased by about 760 units, meeting the increased customer demand.
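    The line parameters named above follow from simple arithmetic: TAKT time is available time divided by demand, the bottleneck station sets the line's cycle time, and balance efficiency compares total work content to the capacity of the stations. A sketch with assumed figures (none of the numbers are taken from the study):

```python
# All figures below are assumed for illustration, not taken from the study.
station_times = [45.0, 62.0, 38.0, 71.0, 55.0]   # seconds per station

available_time = 8 * 3600 - 45 * 60   # one 8 h shift minus planned breaks (s)
daily_demand = 400                    # units per day

takt_time = available_time / daily_demand      # pace demanded by the customer
cycle_time = max(station_times)                # bottleneck station paces the line
line_efficiency = sum(station_times) / (len(station_times) * cycle_time)
min_stations = -(-sum(station_times) // takt_time)   # ceiling division

# Here cycle_time exceeds takt_time, so the bottleneck station cannot meet
# demand and becomes the first target for rebalancing work between stations.
```

Line balancing then redistributes work elements so that every station's time falls at or below the TAKT time, which is exactly what the value stream mapping and time study in the article feed into.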

  9. High-impact strength acrylic denture base material processed by autoclave.

    PubMed

    Abdulwahhab, Salwan Sami

    2013-10-01

    To investigate the effect of two different cycles of autoclave processing on the transverse strength, impact strength, surface hardness and porosity of a high-impact strength acrylic denture base material. High Impact Acryl was the heat-cured acrylic denture base material included in the study. A total of 120 specimens were prepared and grouped into: control groups, in which the high-impact strength acrylic resin was processed by the conventional water-bath technique (74°C for 1.5 h, then boiling for 30 min), and experimental groups, in which the resin was processed by autoclave at 121°C and 210 kPa. The experimental groups were divided into fast groups (15 min) and slow groups (30 min). To study the effect of autoclave processing (Tuttnauer 2540EA), four tests were conducted: transverse strength (Instron universal testing machine), impact strength (Charpy tester), surface hardness (Shore D), and porosity. The results were analyzed with ANOVA and the LSD test. ANOVA showed highly significant differences between the processing techniques in the transverse, impact, hardness, and porosity tests. The LSD test showed a significant difference between the control and fast groups in the transverse and hardness tests, a non-significant difference in the impact test, and a highly significant difference in the porosity test; there were highly significant differences between the control and slow groups in all examined tests; finally, there was a non-significant difference between the fast and slow groups in the transverse and porosity tests and a highly significant difference in the impact and hardness tests. In the autoclave processing technique, the slow (long) curing cycle improved the tested physical and mechanical properties as compared with the fast (short) curing cycle. The autoclave processing technique improved the tested physical and mechanical properties of High Impact Acryl.
Copyright © 2013 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  10. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
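    As an illustration of the general approach (not the authors' Arena model), a single-server planning queue can be simulated in a few lines; the arrival and service rates below are invented:

```python
import random

# Toy discrete-event sketch of a single planning queue, in the spirit of (but
# much simpler than) the Arena model described above. One planner serves plan
# requests in FIFO order; rates are invented, not the historical inputs.

def mean_wait(n_jobs, arrival_rate, service_rate, seed=42):
    rng = random.Random(seed)
    t, free_at, waits = 0.0, 0.0, []
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)   # next plan request arrives
        start = max(t, free_at)              # queues if the planner is busy
        waits.append(start - t)
        free_at = start + rng.expovariate(service_rate)
    return sum(waits) / n_jobs

base = mean_wait(10000, arrival_rate=1.0, service_rate=1.25)
faster = mean_wait(10000, arrival_rate=1.0, service_rate=1.6)
# shortening service (e.g. trimming oncologist-related delays) cuts mean wait
```

    Scenario testing then amounts to rerunning the model with altered rates or routing and comparing the resulting waiting-time distributions.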

  11. Discrimination techniques employing both reflective and thermal multispectral signals. [for remote sensor technology

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Crane, R. B.; Richardson, W.

    1973-01-01

    Recent improvements in remote sensor technology carry implications for data processing. Multispectral line scanners now exist that can collect data simultaneously and in registration in multiple channels at both reflective and thermal (emissive) wavelengths. Progress in dealing with two resultant recognition processing problems is discussed: (1) More channels mean higher processing costs; to combat these costs, a new and faster procedure for selecting subsets of channels has been developed. (2) Differences between thermal and reflective characteristics influence recognition processing; to illustrate the magnitude of these differences, some explanatory calculations are presented. Also introduced is a different way to process multispectral scanner data, namely, radiation balance mapping and related procedures. Techniques and potentials are discussed and examples presented.

  12. In-line ATR-UV and Raman Spectroscopy for Monitoring API Dissolution Process During Liquid-Filled Soft-Gelatin Capsule Manufacturing.

    PubMed

    Wan, Boyong; Zordan, Christopher A; Lu, Xujin; McGeorge, Gary

    2016-10-01

    Complete dissolution of the active pharmaceutical ingredient (API) is critical in the manufacturing of liquid-filled soft-gelatin capsules (SGC). Attenuated total reflectance UV spectroscopy (ATR-UV) and Raman spectroscopy have been investigated for in-line monitoring of API dissolution during manufacturing of an SGC product. Calibration models have been developed with both techniques for in-line determination of API potency. Performance of both techniques was evaluated and compared. The ATR-UV methodology was found to be able to monitor the dissolution process and determine the endpoint, but was sensitive to temperature variations. The Raman technique was also capable of effectively monitoring the process and was more robust to the temperature variation and process perturbations by using an excipient peak for internal correction. Different data preprocessing methodologies were explored in an attempt to improve method performance.
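    The internal-correction idea, ratioing an API band to an excipient band so that global intensity drifts cancel, can be sketched as follows; the band positions and spectra are synthetic, not the product's actual Raman features:

```python
import numpy as np

# Hedged sketch of internal correction for in-line Raman monitoring: the API
# band area is ratioed to an excipient band area, so global intensity drifts
# (temperature, laser power) cancel. All spectra and band windows below are
# synthetic assumptions for illustration.

def band_area(shift, spectrum, lo, hi):
    mask = (shift >= lo) & (shift <= hi)
    return np.trapz(spectrum[mask], shift[mask])

shift = np.linspace(200, 1800, 801)                     # Raman shift (cm^-1)
api_band = np.exp(-((shift - 1000) / 8) ** 2)           # API band near 1000
excip_band = 0.5 * np.exp(-((shift - 1450) / 8) ** 2)   # excipient band

ratios = []
for drift in (0.8, 1.0, 1.2):       # simulated global intensity drift
    spec = drift * (api_band + excip_band)
    ratios.append(band_area(shift, spec, 950, 1050)
                  / band_area(shift, spec, 1400, 1500))
# the corrected ratio is invariant to the drift factor
```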

  13. A Review of Research on the Use of Weighted Vests with Children on the Autism Spectrum

    ERIC Educational Resources Information Center

    Morrison, Erin E.

    2007-01-01

    Occupational therapists working in the school system setting report using weighted vests as a technique to improve attention and sensory processing for students who have an autism spectrum disorder. Some critics, however, contend that this technique is used without evidence of effectiveness. This study examines the overall research available on…

  14. Air Force and Army Corps of Engineers Improperly Managed the Award of Contracts for the Blue Devil Block 2 Persistent Surveillance System

    DTIC Science & Technology

    2013-09-19

    environments. This can include the development of new and/or improved analytical and numerical models, rapid data-processing techniques, and new subsurface imaging techniques that include active and passive sensor modalities in a variety of rural and urban terrains. Of particular interest is the broadband

  15. Multi-filter spectrophotometry of quasar environments

    NASA Technical Reports Server (NTRS)

    Craven, Sally E.; Hickson, Paul; Yee, Howard K. C.

    1993-01-01

    A many-filter photometric technique for determining redshifts and morphological types, by fitting spectral templates to spectral energy distributions, has good potential for application in surveys. Despite success in studies performed on simulated data, the results have not been fully reliable when applied to real, low signal-to-noise data. We are investigating techniques to improve the fitting process.
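    The template-fitting step described above amounts to choosing the template and scale factor that minimize chi-square against the observed fluxes. A minimal sketch with synthetic four-filter data (template names and values are invented):

```python
import numpy as np

# Minimal sketch of photometric template fitting: pick the (template, scale)
# pair minimizing chi-square against the observed fluxes. The templates and
# observations below are synthetic; a real survey would use redshifted SED
# libraries resampled onto the filter set.

def best_template(obs_flux, obs_err, templates):
    best = None
    for name, t in templates.items():
        # closed-form optimal scale factor minimizing chi^2 for this template
        a = np.sum(obs_flux * t / obs_err**2) / np.sum(t**2 / obs_err**2)
        chi2 = np.sum(((obs_flux - a * t) / obs_err) ** 2)
        if best is None or chi2 < best[1]:
            best = (name, chi2)
    return best

templates = {"elliptical": np.array([1.0, 0.8, 0.5, 0.2]),
             "spiral":     np.array([0.6, 0.7, 0.8, 0.9])}
obs = 2.0 * templates["spiral"] + np.array([0.01, -0.02, 0.0, 0.01])
name, chi2 = best_template(obs, np.full(4, 0.05), templates)
print(name, round(chi2, 2))
```

    With low signal-to-noise data the chi-square surface flattens, which is one way to see why the fits become unreliable on real observations.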

  16. Mapping modern software process engineering techniques onto an HEP development environment

    NASA Astrophysics Data System (ADS)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means, in our context, to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual improvement of software quality and of the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We present the CMS approach to software process improvement in this process R&D, describe lessons learned and mistakes made, demonstrate the benefits gained, and report the current status of the software processes established in CMS off-line software.

  17. Improving lateral resolution and image quality of optical coherence tomography by the multi-frame superresolution technique for 3D tissue imaging.

    PubMed

    Shen, Kai; Lu, Hui; Baig, Sarfaraz; Wang, Michael R

    2017-11-01

    The multi-frame superresolution technique is introduced to significantly improve the lateral resolution and image quality of spectral domain optical coherence tomography (SD-OCT). Using several sets of low-resolution C-scan 3D images with lateral sub-spot-spacing shifts between sets, multi-frame superresolution processing at each depth layer reconstructs a higher-resolution, higher-quality lateral image; layer-by-layer processing yields an overall high-resolution 3D image. In theory, superresolution processing including deconvolution can address the diffraction limit, lateral scan density, and background noise problems together. In experiments, a roughly threefold improvement in lateral resolution, reaching 7.81 µm and 2.19 µm with sample-arm optics of 0.015 and 0.05 numerical aperture respectively, as well as a doubling of image quality, was confirmed by imaging a known resolution test target. Improved lateral resolution was also demonstrated on in vitro skin C-scan images. For in vivo 3D SD-OCT imaging of human skin, fingerprints, and retinal layers, we used a multi-modal volume registration method to estimate the lateral image shifts among different C-scans caused by minor unintended body motion. Further processing of these images generated high-lateral-resolution 3D images as well as high-quality B-scan images of these in vivo tissues.
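    The shift-and-add idea at the core of multi-frame superresolution can be illustrated in one dimension, assuming known sub-sample shifts and omitting registration and deconvolution; this is a toy sketch, not the authors' pipeline:

```python
import numpy as np

# Toy 1-D shift-and-add superresolution: several coarsely sampled copies of a
# signal, each offset by a known sub-sample shift, are interleaved onto a finer
# grid. The real OCT processing adds registration and deconvolution on top.

def shift_and_add(frames, shifts, factor):
    n = len(frames[0]) * factor
    hi = np.zeros(n)
    for frame, s in zip(frames, shifts):
        hi[s::factor] = frame      # place each frame at its sub-grid offset
    return hi

truth = np.sin(np.linspace(0, np.pi, 8))   # "high-resolution" signal
frames = [truth[0::2], truth[1::2]]        # two half-rate acquisitions
recon = shift_and_add(frames, shifts=[0, 1], factor=2)
assert np.allclose(recon, truth)           # all fine-grid samples recovered
```

    In practice the shifts are estimated (here, by the multi-modal volume registration the abstract mentions), and overlapping samples are averaged rather than overwritten.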

  18. Improved Process for Fabricating Carbon Nanotube Probes

    NASA Technical Reports Server (NTRS)

    Stevens, R.; Nguyen, C.; Cassell, A.; Delzeit, L.; Meyyappan, M.; Han, Jie

    2003-01-01

    An improved process has been developed for the efficient fabrication of carbon nanotube probes for use in atomic-force microscopes (AFMs) and nanomanipulators. Relative to prior nanotube tip production processes, this process offers advantages in alignment of the nanotube on the cantilever and stability of the nanotube's attachment. A procedure has also been developed at Ames that effectively sharpens the multiwalled nanotube, which improves the resolution of the multiwalled nanotube probes and, combined with the greater stability of multiwalled nanotube probes, increases the effective resolution of these probes, making them comparable in resolution to single-walled carbon nanotube probes. The robust attachment derived from this improved fabrication method and the natural strength and resiliency of the nanotube itself produces an AFM probe with an extremely long imaging lifetime. In a longevity test, a nanotube tip imaged a silicon nitride surface for 15 hours without measurable loss of resolution. In contrast, the resolution of conventional silicon probes noticeably begins to degrade within minutes. These carbon nanotube probes have many possible applications in the semiconductor industry, particularly as devices are approaching the nanometer scale and new atomic layer deposition techniques necessitate a higher resolution characterization technique. Previously at Ames, the use of nanotube probes has been demonstrated for imaging photoresist patterns with high aspect ratio. In addition, these tips have been used to analyze Mars simulant dust grains, extremophile protein crystals, and DNA structure.

  19. Influence of Freeze Concentration Technique on Aromatic and Phenolic Compounds, Color Attributes, and Sensory Properties of Cabernet Sauvignon Wine.

    PubMed

    Wu, Yan-Yan; Xing, Kai; Zhang, Xiao-Xu; Wang, Hui; Wang, Yong; Wang, Fang; Li, Jing-Ming

    2017-06-02

    Red wines produced in the Xinjiang region of China possess poor color density and lack fruity notes and elegance. The freeze concentration technique, a well-established concentration method for liquid food systems, was applied to the Cabernet Sauvignon (Vitis vinifera L.) wine-making process to investigate its effect on wine quality. Results showed that freeze concentration did not significantly alter the physicochemical properties of the wine, except for an increase in glycerol and alcohol content. The technique increased ester contents and decreased volatile acids. Higher alcohol contents also increased, but within an acceptable range. The treated wine consequently showed better fragrance characters in sensory evaluation. The non-anthocyanin composition was altered by the treatment, although the difference disappeared after the aging process, and sensory evaluation showed that the treated wine possessed better mouthfeel properties. Anthocyanin contents were enhanced, effectively stabilizing the fresh wine color attributes and improving the appearance of the treated wine. Taken together, the results indicate that freeze concentration treatment can be a good choice for improving wine quality.

  20. Computational Modeling and Neuroimaging Techniques for Targeting during Deep Brain Stimulation

    PubMed Central

    Sweet, Jennifer A.; Pace, Jonathan; Girgis, Fady; Miller, Jonathan P.

    2016-01-01

    Accurate surgical localization of the varied targets for deep brain stimulation (DBS) is a process undergoing constant evolution, with increasingly sophisticated techniques to allow for highly precise targeting. However, despite the fastidious placement of electrodes into specific structures within the brain, there is increasing evidence to suggest that the clinical effects of DBS are likely due to the activation of widespread neuronal networks directly and indirectly influenced by the stimulation of a given target. Selective activation of these complex and inter-connected pathways may further improve the outcomes of currently treated diseases by targeting specific fiber tracts responsible for a particular symptom in a patient-specific manner. Moreover, the delivery of such focused stimulation may aid in the discovery of new targets for electrical stimulation to treat additional neurological, psychiatric, and even cognitive disorders. As such, advancements in surgical targeting, computational modeling, engineering designs, and neuroimaging techniques play a critical role in this process. This article reviews the progress of these applications, discussing the importance of target localization for DBS, and the role of computational modeling and novel neuroimaging in improving our understanding of the pathophysiology of diseases, and thus paving the way for improved selective target localization using DBS. PMID:27445709

  1. Localisation of epileptic foci using novel imaging modalities

    PubMed Central

    De Ciantis, Alessio; Lemieux, Louis

    2013-01-01

    Purpose of review This review examines recent reports on the use of advanced techniques to map the regions and networks involved in focal epileptic seizure generation in humans. Recent findings A number of imaging techniques are capable of providing new localizing information on the ictal processes and the epileptogenic zone. Evaluating the clinical utility of these findings has mainly been performed through post-hoc comparison with the findings of invasive EEG and ictal single-photon emission computed tomography, using postsurgical seizure reduction as the main outcome measure. Added value has been demonstrated in MRI-negative cases. Improved understanding of the human ictogenic processes and the focus vs. network hypothesis is likely to result from the application of multimodal techniques that combine electrophysiological, semiological, and whole-brain coverage of brain activity changes. Summary On the basis of recent research in the field of neuroimaging, several novel imaging modalities have been improved and developed to provide information about the localization of epileptic foci. PMID:23823464

  2. Superplastic Forming 40 Years and Still Growing

    NASA Astrophysics Data System (ADS)

    Barnes, A. J.

    2007-08-01

    In late 1964 Backofen, Turner and Avery, at MIT, published a paper in which they described the “extraordinary formability” exhibited when fine-grain zinc-aluminum eutectoid (Zn-22Al) was subjected to bulge testing under appropriate conditions. They concluded their research findings with the insightful comment: “even more appealing is the thought of applying to superplastic metals forming techniques borrowed from polymer and glass processing.” Since then that thought has become a substantial reality, with thousands of tons of metallic sheet materials now being superplastically formed each year. This paper reviews the significant advances that have taken place over the past 40 years, including alloy developments, improved forming techniques and equipment, and an ever-increasing number of commercial applications. Current and likely future trends are discussed, including applications in the aerospace and automotive markets, faster-forming techniques to improve productivity, the increasing importance of computer modeling and simulation in tool design and process optimization, and new alloy developments including superplastic magnesium alloys.

  3. Dry-plasma-free chemical etch technique for variability reduction in multi-patterning (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kal, Subhadeep; Mohanty, Nihar; Farrell, Richard A.; Franke, Elliott; Raley, Angelique; Thibaut, Sophie; Pereira, Cheryl; Pillai, Karthik; Ko, Akiteru; Mosden, Aelan; Biolsi, Peter

    2017-04-01

    Scaling beyond the 7nm technology node demands control of variability down to a few angstroms in order to achieve reasonable yield. For example, to meet current scaling targets it is highly desirable to achieve sub-30nm-pitch line/space features at the back end of line (BEOL) or front end of line (FEOL), and uniform, precise contact/hole patterning at the middle of line (MOL). One of the quintessential requirements for such precise, and possibly self-aligned, patterning strategies is superior etch selectivity between the target films while other masks/films are exposed. The need for high etch selectivity is most evident in unit process development at MOL and BEOL, where the lower temperature budget limits film choices to lower-density films than are available at FEOL. Low etch selectivity with conventional plasma and wet chemical etch techniques causes significant gouging (unintended etching of the etch stop layer, as shown in Fig. 1), high line edge roughness (LER)/line width roughness (LWR), non-uniformity, etc. In certain circumstances this may add downstream process stochastics. Furthermore, conventional plasma etches may also have the added disadvantages of plasma VUV damage and corner rounding (Fig. 1). Finally, the above factors can compromise edge placement error (EPE) and/or yield. Therefore a process flow enabled by extremely high-selectivity etches, inherent to the film properties and/or etch chemistries, is a significant advantage. To improve etch selectivity at certain steps in a process flow, we implement alternate, highly selective, plasma-free techniques in conjunction with conventional plasma etches (Fig. 2). In this article, we present our plasma-free, chemical gas-phase etch technique, using chemistries that are highly selective towards a spectrum of films owing to the reaction mechanism (as shown in Fig. 1). 
Gas-phase etches also help eliminate plasma damage to the features during the etch process. We also demonstrate a test case showing how a combination of plasma-assisted and plasma-free etch techniques has the potential to improve the process performance of 193nm-immersion-based self-aligned quadruple patterning (SAQP) for BEOL-compliant films (an example is shown in Fig. 2). In addition, we present applications of gas etches for (1) profile improvement, (2) selective mandrel pull, and (3) critical dimension trim of mandrels, with an analysis of their advantages over conventional techniques in terms of LER and EPE.

  4. Business process study simulation for resource management in an emergency department.

    PubMed

    Poomkothammal, Velusamy

    2006-01-01

    Alexandra Hospital conducted a business process reengineering exercise for all its main processes in order to further improve their efficiency, with the ultimate aim of providing a higher level of service to patients. The goal of the Department of Emergency Medicine (DEM) is to manage an anticipated increase in patient volume without much increase in resources. As a start, the DEM studied its AS-IS process and designed and implemented a new TO-BE process. As part of this continuous improvement effort, staff from Nanyang Polytechnic (NYP) were assigned the task of applying engineering and analytical techniques to simulate the new process. The simulations were conducted to inform process management and resource planning.

  5. Research and application of spectral inversion technique in frequency domain to improve resolution of converted PS-wave

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; He, Zhen-Hua; Li, Ya-Lin; Li, Rui; He, Guamg-Ming; Li, Zhong

    2017-06-01

    Multi-wave exploration is an effective means of improving precision in the exploration and development of complex oil and gas reservoirs that are dense and have low permeability. However, converted-wave data are characterized by a low signal-to-noise ratio and low resolution, because conventional deconvolution technology is easily limited by the frequency band, leaving little scope for improving resolution. The spectral inversion technique can identify λ/8 thin layers, and its breakthrough of band-range limits has greatly improved seismic resolution. The difficulty with this technology is how to use a stable inversion algorithm to obtain a high-precision reflection coefficient, and then use this reflection coefficient to reconstruct broadband data for processing. In this paper, we focus on how to improve the vertical resolution of the converted PS-wave for multi-wave data processing. Building on previous research, we propose a least-squares inversion algorithm with a total variation constraint, in which the total variation serves as a priori information for solving the under-determined problem, thereby improving the accuracy and stability of the inversion. We fit a Gaussian to the amplitude spectrum to obtain broadband wavelet data, which we then process to obtain a higher-resolution converted wave. We successfully applied the proposed inversion technology to the processing of high-resolution data from the Penglai region, obtaining higher-resolution converted-wave data, which we verified in a theoretical test. Improving the resolution of converted PS-wave data will provide more accurate input for subsequent velocity inversion and the extraction of reservoir reflection information.
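    The total-variation-constrained least-squares idea can be sketched with a simple subgradient descent on a synthetic problem; the operator, reflectivity model, and parameters below are invented, and this is not the authors' solver:

```python
import numpy as np

# Sketch of total-variation-regularized least squares for reflectivity
# inversion: minimize ||W r - d||^2 + lam * sum_i |r[i+1] - r[i]| by plain
# (sub)gradient descent. W, d, and all parameters are synthetic assumptions;
# production solvers use faster algorithms (e.g. ADMM or primal-dual).

def tv_inversion(W, d, lam=0.02, step=1e-3, iters=20000):
    r = np.zeros(W.shape[1])
    for _ in range(iters):
        grad = 2 * W.T @ (W @ r - d)      # gradient of the data-misfit term
        s = np.sign(np.diff(r))           # subgradient pieces of the TV term
        tv_grad = np.concatenate(([-s[0]], s[:-1] - s[1:], [s[-1]]))
        r -= step * (grad + lam * tv_grad)
    return r

rng = np.random.default_rng(0)
W = rng.standard_normal((30, 20))         # stand-in for the wavelet operator
truth = np.zeros(20)
truth[5:10] = 1.0                         # blocky "thin layer" reflectivity
d = W @ truth                             # synthetic observed data
r = tv_inversion(W, d)
# r should recover the blocky reflectivity, stabilized by the TV penalty
```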

  6. Processing MALDI mass spectra to improve mass spectral direct tissue analysis

    NASA Astrophysics Data System (ADS)

    Norris, Jeremy L.; Cornett, Dale S.; Mobley, James A.; Andersson, Malin; Seeley, Erin H.; Chaurand, Pierre; Caprioli, Richard M.

    2007-02-01

    Profiling and imaging biological specimens using MALDI mass spectrometry has significant potential to contribute to our understanding and diagnosis of disease. The technique is efficient and high-throughput, providing a wealth of data about the biological state of the sample from a very simple and direct experiment. However, for these techniques to be put to use for clinical purposes, the approaches used to process and analyze the data must improve. This study examines some of the existing tools for baseline subtraction, normalization, alignment, and spectral noise removal in MALDI data, comparing the advantages of each. A preferred workflow is presented that can be easily implemented for data in ASCII format. The advantages of such an approach are discussed for both molecular profiling and imaging mass spectrometry.
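    A minimal version of such a preprocessing workflow for an ASCII spectrum, with rolling-minimum baseline subtraction, total-ion-current normalization, and moving-average smoothing, might look like this (the window sizes and the synthetic spectrum are illustrative, not the paper's settings):

```python
import numpy as np

# Hedged sketch of a MALDI preprocessing chain: baseline subtraction
# (rolling minimum), TIC normalization, and moving-average denoising.
# Window sizes and the synthetic spectrum are assumptions for illustration.

def rolling_min_baseline(y, w):
    return np.array([y[max(0, i - w): i + w + 1].min() for i in range(len(y))])

def preprocess(mz, intensity, baseline_w=50, smooth_w=5):
    y = intensity - rolling_min_baseline(intensity, baseline_w)  # baseline
    y = y / y.sum()                                              # TIC normalize
    kernel = np.ones(smooth_w) / smooth_w
    return mz, np.convolve(y, kernel, mode="same")               # denoise

mz = np.linspace(1000, 2000, 500)
raw = np.exp(-((mz - 1500) / 5) ** 2) + 0.2 + 0.001 * (mz - 1000) / 1000
mz, clean = preprocess(mz, raw)
```

    Peak alignment across spectra, the remaining step the abstract discusses, would follow, typically by warping the m/z axis against shared reference peaks.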

  7. Radar image processing for rock-type discrimination

    NASA Technical Reports Server (NTRS)

    Blom, R. G.; Daily, M.

    1982-01-01

    Image processing and enhancement techniques for improving the geologic utility of digital satellite radar images are reviewed. Preprocessing techniques include mean and variance correction on a line-by-line basis in range or azimuth to provide uniformly illuminated swaths, median-value filtering of four-look imagery to suppress speckle, and geometric rectification using a priori elevation data. Examples are presented of the application of these preprocessing methods to Seasat and Landsat data, and Seasat SAR imagery was coregistered with Landsat imagery to form composite scenes. A polynomial was developed to warp the radar picture to fit the Landsat image over a 90 x 90 km grid, combining Landsat color ratios with Seasat intensities. Subsequent linear discriminant analysis was employed to discriminate rock types in known areas. Adding Seasat data to the Landsat data improved rock identification by 7%.
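    The line-by-line radiometric correction and speckle filtering described above can be sketched on a synthetic image as follows (the implementation details are assumptions, not the paper's code):

```python
import numpy as np

# Sketch of two radar preprocessing steps: each azimuth line is rescaled to a
# common mean and variance so the swath is uniformly illuminated, then a 3x3
# median filter suppresses speckle. The image below is synthetic.

def equalize_lines(img):
    out = np.empty_like(img, dtype=float)
    target_mean, target_std = img.mean(), img.std()
    for i, line in enumerate(img):
        out[i] = (line - line.mean()) / (line.std() + 1e-12)
        out[i] = out[i] * target_std + target_mean
    return out

def median3x3(img):
    pad = np.pad(img, 1, mode="edge")
    shifted = [pad[r:r + img.shape[0], c:c + img.shape[1]]
               for r in range(3) for c in range(3)]
    return np.median(np.stack(shifted), axis=0)

rng = np.random.default_rng(1)
img = rng.gamma(4.0, 25.0, size=(64, 64))   # speckle-like intensities
img[10] *= 3.0                              # one over-illuminated line
flat = median3x3(equalize_lines(img))
```

    Geometric rectification against elevation data, the third step, would then resample `flat` onto the map grid before coregistration with Landsat.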

  8. TU-F-CAMPUS-J-04: Evaluation of Metal Artifact Reduction Technique for the Radiation Therapy Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeong, K; Kuo, H; Ritter, J

    Purpose: To evaluate the feasibility of using a metal artifact reduction technique to suppress metal artifacts, and its application to improving dose calculation in external radiation therapy planning. Methods: A CIRS electron density phantom was scanned with and without steel drill bits placed in some plug holes. Metal artifact reduction software using the Metal Deletion Technique (MDT) was used to remove metal artifacts from the scan containing metal. Hounsfield units of the electron density plugs from the artifact-free reference image and the MDT-processed images were compared. To test the improvement in dose calculation with MDT-processed images, a clinically approved head-and-neck plan with manual dental artifact correction was used. Patient images were exported and processed with MDT, and the plan was recalculated on the new MDT image without manual correction. Dose profiles near the metal artifacts were compared. Results: The MDT used in this study effectively reduced the metal artifacts caused by beam hardening and scatter. The windmill artifact around the metal drill was greatly improved, with a smooth rounded appearance. The difference in mean HU between the reference and MDT images was less than 10 HU in most of the density plugs. Dose differences between the original plan and the MDT images were minimal. Conclusion: Most metal artifact reduction methods were developed for diagnostic purposes, so Hounsfield unit accuracy had not previously been rigorously tested. In our test, MDT effectively eliminated metal artifacts with good HU reproducibility. However, it can introduce new mild artifacts, so MDT images should be checked against the original images.

  9. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
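    A minimal individuals (XmR) control chart of the kind SPC applies to clinical or service data can be computed as follows; the waiting-time values are invented for illustration:

```python
# Sketch of an individuals (XmR) control chart, a standard SPC tool: limits
# are the mean plus or minus 2.66 times the average moving range. The
# waiting-time data below are invented for illustration.

def xmr_limits(xs):
    moving_ranges = [abs(a - b) for a, b in zip(xs, xs[1:])]
    mean = sum(xs) / len(xs)
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

waits = [12, 15, 11, 14, 13, 16, 12, 35, 14, 13]   # minutes; 35 is a spike
lcl, center, ucl = xmr_limits(waits)
signals = [w for w in waits if w > ucl or w < lcl]
print(f"center={center:.1f}, UCL={ucl:.1f}, signals={signals}")
```

    Points outside the limits signal special-cause variation worth investigating, while points inside them are treated as common-cause noise that charting alone cannot diagnose.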

  10. Simultaneous F0-F1 modifications of Arabic for the improvement of natural-sounding

    NASA Astrophysics Data System (ADS)

    Ykhlef, F.; Bensebti, M.

    2013-03-01

    Pitch (F0) modification is one of the most important problems in the area of speech synthesis. Several techniques have been developed in the literature to achieve this goal. The main restrictions of these techniques concern the modification range and the quality, intelligibility and naturalness of the synthesised speech. Control of formants in a spoken language can significantly improve the naturalness of synthesised speech, and this improvement depends mainly on control of the first formant (F1). Inspired by this observation, this article proposes a new approach that modifies both F0 and F1 of Arabic voiced sounds in order to improve the naturalness of pitch-shifted speech. The developed strategy takes a parallel processing approach, in which the analysis segments are decomposed into sub-bands in the wavelet domain, modified in the desired sub-band using a resampling technique, and reconstructed without affecting the remaining sub-bands. Pitch marking and voicing detection are performed in the frequency decomposition step, based on comparison of the multi-level approximation and detail signals. The performance of the proposed technique is evaluated by listening tests and compared to the pitch synchronous overlap and add (PSOLA) technique at the third approximation level. Experimental results show that manipulating F0 in conjunction with F1 in the wavelet domain yields natural-sounding synthesised speech compared to the classical pitch modification technique. This improvement was especially evident for large pitch modifications.

  11. Interactive segmentation of tongue contours in ultrasound video sequences using quality maps

    NASA Astrophysics Data System (ADS)

    Ghrenassia, Sarah; Ménard, Lucie; Laporte, Catherine

    2014-03-01

    Ultrasound (US) imaging is an effective and non-invasive way of studying the tongue motions involved in normal and pathological speech, and the results of US studies are of interest for the development of new strategies in speech therapy. State-of-the-art tongue shape analysis based on US images depends on semi-automated tongue segmentation and tracking techniques. Recent work has mostly focused on improving the accuracy of the tracking techniques themselves. However, occasional errors remain inevitable, regardless of the technique used, and the tongue tracking process must therefore be supervised by a speech scientist who corrects these errors manually or semi-automatically. This paper proposes an interactive framework to facilitate this process. In this framework, the user is guided towards potentially problematic portions of the US image sequence by a segmentation quality map, based on the normalized energy of an active contour model and produced automatically during tracking. When a problematic segmentation is identified, corrections to the segmented contour can be made on one image and propagated both forward and backward in the problematic subsequence, thereby improving the user experience. The interactive tools were tested in combination with two different tracking algorithms. Preliminary results illustrate the potential of the proposed framework, suggesting that it generally reduces user interaction time, with little change in segmentation repeatability.

  12. Total quality management - It works for aerospace information services

    NASA Technical Reports Server (NTRS)

    Erwin, James; Eberline, Carl; Colquitt, Wanda

    1993-01-01

    Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvement techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle.

  13. Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul

    2003-01-01

    Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10^4 real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.
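    A conditional time-to-go predictor of the kind compared in the paper can be sketched as below; the bucketing scheme, field names, and bucket width are illustrative assumptions, not the paper's specification. The within-bucket standard deviation is the kind of irreducible spread the abstract's lower-bound result refers to.

```python
from collections import defaultdict

def fit(turns, bucket=5):
    """turns: historical (elapsed_min, time_to_go_min) pairs.
    Returns per-bucket (mean, std) of time-to-go, keyed by elapsed-time bucket."""
    groups = defaultdict(list)
    for elapsed, ttg in turns:
        groups[elapsed // bucket].append(ttg)
    model = {}
    for k, v in groups.items():
        mean = sum(v) / len(v)
        std = (sum((x - mean) ** 2 for x in v) / len(v)) ** 0.5
        model[k] = (mean, std)
    return model

def predict(model, elapsed, bucket=5):
    """Predicted time-to-go and its spread for a turn at a given elapsed time."""
    mean, std = model[elapsed // bucket]
    return mean, std  # std stays bounded away from zero on real data
```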

  14. Optimization of Visual Information Presentation for Visual Prosthesis.

    PubMed

    Guo, Fei; Yang, Yuan; Gao, Yong

    2018-01-01

    Visual prosthesis applying electrical stimulation to restore visual function for the blind has promising prospects. However, due to the low resolution, limited visual field, and low dynamic range of the visual perception, a huge loss of information occurs when presenting daily scenes. The ability of object recognition in real-life scenarios is severely restricted for prosthetic users. To overcome the limitations, optimizing the visual information in the simulated prosthetic vision has been the focus of research. This paper proposes two image processing strategies based on a salient object detection technique. The two processing strategies enable the prosthetic implants to focus on the object of interest and suppress the background clutter. Psychophysical experiments show that techniques such as foreground zooming with background clutter removal and foreground edge detection with background reduction have positive impacts on the task of object recognition in simulated prosthetic vision. By using edge detection and zooming techniques, the two processing strategies significantly improve the recognition accuracy of objects. We can conclude that the visual prosthesis using our proposed strategy can assist the blind to improve their ability to recognize objects. The results will provide effective solutions for the further development of visual prosthesis.
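    The foreground-edge step can be sketched with a plain Sobel operator; this is an illustrative stand-in and does not reproduce the paper's salient-object detector or its zooming strategy. The threshold value is an assumption.

```python
def sobel_edges(img, thresh=1.0):
    """Return a binary edge map from a 2D grayscale image (list of rows)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # horizontal and vertical Sobel responses
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            if (gx * gx + gy * gy) ** 0.5 > thresh:
                out[y][x] = 1
    return out
```

    On a low-resolution phosphene grid, keeping only such edge pixels removes background texture while preserving object outlines, which is the effect the two strategies aim for.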

  15. Optimization of Visual Information Presentation for Visual Prosthesis

    PubMed Central

    Gao, Yong

    2018-01-01

    Visual prosthesis applying electrical stimulation to restore visual function for the blind has promising prospects. However, due to the low resolution, limited visual field, and low dynamic range of the visual perception, a huge loss of information occurs when presenting daily scenes. The ability of object recognition in real-life scenarios is severely restricted for prosthetic users. To overcome the limitations, optimizing the visual information in the simulated prosthetic vision has been the focus of research. This paper proposes two image processing strategies based on a salient object detection technique. The two processing strategies enable the prosthetic implants to focus on the object of interest and suppress the background clutter. Psychophysical experiments show that techniques such as foreground zooming with background clutter removal and foreground edge detection with background reduction have positive impacts on the task of object recognition in simulated prosthetic vision. By using edge detection and zooming techniques, the two processing strategies significantly improve the recognition accuracy of objects. We can conclude that the visual prosthesis using our proposed strategy can assist the blind to improve their ability to recognize objects. The results will provide effective solutions for the further development of visual prosthesis. PMID:29731769

  16. Creating a standardized process to offer the standard of care: continuous process improvement methodology is associated with increased rates of sperm cryopreservation among adolescent and young adult males with cancer.

    PubMed

    Shnorhavorian, Margarett; Kroon, Leah; Jeffries, Howard; Johnson, Rebecca

    2012-11-01

    There is limited literature on strategies to overcome the barriers to sperm banking among adolescent and young adult (AYA) males with cancer. By standardizing our process for offering sperm banking to AYA males before cancer treatment, we aimed to improve rates of sperm banking at our institution. Continuous process improvement is a technique that has recently been applied to improve health care delivery. We used continuous process improvement methodologies to create a standard process for fertility preservation for AYA males with cancer at our institution. We compared rates of sperm banking before and after standardization. In the 12-month period after implementation of a standardized process, 90% of patients were offered sperm banking. We demonstrated an 8-fold increase in the proportion of AYA males banking sperm, and a 5-fold increase in the rate of sperm banking at our institution. Implementation of a standardized process for sperm banking for AYA males with cancer was associated with increased rates of sperm banking at our institution. This study supports the role of standardized health care in decreasing barriers to sperm banking.

  17. Critical fiber length technique for composite manufacturing processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivley, G.N.; Vandiver, T.L.; Dougherty, N.S.

    1996-12-31

    An improved injection technique for composite structures has been cooperatively developed by the U.S. Army Missile Command (MICOM) and Rockwell International (RI). This process simultaneously injects chopped fiberglass fibers and an epoxy resin matrix into a mold. Four injection techniques: (1) the "Little Willie" RTM system, (2) Pressure Vat system, (3) Pressure Vat system with vacuum assistance, and (4) Injection gun system, were investigated for use with a 304.8 mm x 304.8 mm x 5.08 mm (12 in x 12 in x 0.2 in) flat plaque mold. The driving factors in the process optimization included: fiber length, fiber weight, matrix viscosity, injection pressure, flow rate, and tool design. At fiber weights higher than 30 percent, the injection gun appears to have advantages over the other systems investigated. Results of an experimental investigation are reviewed in this paper. The investigation of injection techniques is the initial part of the research involved in a developing process, the "Critical Fiber Length Technique". This process will use the data collected in the injection experiments, along with mechanical properties derived from coupon test data, to be incorporated into a composite material design code. The "Critical Fiber Length Technique" is part of a Cooperative Research and Development Agreement (CRADA) established in 1994 between MICOM and RI.

  18. Near perfect optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goeke, R.; Farnsworth, A.V.; Neumann, C.C.

    1996-06-01

    This report discusses a novel fabrication process to produce nearly perfect optics. The process utilizes vacuum deposition techniques to optimally modify polished optical substrate surfaces. The surface figure, i.e. contour of a polished optical element, is improved by differentially filling in the low spots on the surface using flux from a physical vapor deposition source through an appropriate mask. The process is expected to enable the manufacture of diffraction-limited optical systems for the UV, extreme UV, and soft X-ray spectral regions, which would have great impact on photolithography and astronomy. This same technique may also reduce the fabrication cost of visible-region optics with aspheric surfaces.

  19. High efficiency solar cell processing

    NASA Technical Reports Server (NTRS)

    Ho, F.; Iles, P. A.

    1985-01-01

    At the time of writing, cells made by several groups are approaching 19% efficiency. General aspects of the processing required for such cells are discussed. Most processing used for high-efficiency cells is derived from space-cell or concentrator cell technology, and recent advances have been obtained from improved techniques rather than from better understanding of the limiting mechanisms. Theory and modeling are fairly well developed, and adequate to guide further asymptotic increases in performance of near-conventional cells. There are several competitive cell designs with promise of higher performance (>20%), but for these designs further improvements are required. The available cell processing technology to fabricate high-efficiency cells is examined.

  20. Friction stir weld tools having fine grain structure

    DOEpatents

    Grant, Glenn J.; Frye, John G.; Kim, Jin Yong; Lavender, Curt A.; Weil, Kenneth Scott

    2016-03-15

    Tools for friction stir welding can be made with fewer process steps, lower cost techniques, and/or lower cost ingredients than other state-of-the-art processes by utilizing improved compositions and processes of fabrication. Furthermore, the tools resulting from the improved compositions and processes of fabrication can exhibit better distribution and homogeneity of chemical constituents, greater strength, and/or increased durability. In one example, a friction stir weld tool includes tungsten and rhenium and is characterized by carbide and oxide dispersoids, by carbide particulates, and by grains that comprise a solid solution of the tungsten and rhenium. The grains do not exceed 10 micrometers in diameter.

  1. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  2. Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2017-03-01

    Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function that is developed based on a bivariate joint distribution between the observations and the simulations in the historical period. The transfer function is used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of the precipitation ensemble forecast. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the outputs from an existing procedure, i.e. the mixed-type meta-Gaussian distribution. Monthly precipitation from Climate Forecast System Reanalysis (CFS) and gridded observation from Parameter-Elevation Relationships on Independent Slopes Model (PRISM) have been employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized in order to evaluate the outputs from the proposed technique. Distribution of seasonal precipitation for the generated ensemble from the copula-based technique is compared to the observation and raw forecasts for three sub-basins located in the Western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, the COP-EPP demonstrates considerable improvement in the ensemble forecast under both deterministic and probabilistic verification, in particular in characterizing extreme events in wet seasons.
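    The copula idea can be illustrated with a minimal Gaussian-copula stand-in: rank-transform observations and forecasts to standard normal, estimate their correlation, and condition on a new raw forecast. This is not the authors' Bayesian COP-EPP procedure; the empirical-CDF transform, the conditional-median choice, and all names are illustrative assumptions.

```python
from statistics import NormalDist

N = NormalDist()

def to_normal(values, x):
    """Empirical-CDF rank transform of x against 'values', then probit."""
    n = len(values)
    u = sum(v <= x for v in values) / (n + 1)
    u = min(max(u, 1 / (n + 1)), n / (n + 1))  # keep inside (0, 1)
    return N.inv_cdf(u)

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def cop_postprocess(obs, fcst, new_fcst):
    """Conditional-median post-processed value given a raw forecast."""
    z_o = [to_normal(obs, o) for o in obs]
    z_f = [to_normal(fcst, f) for f in fcst]
    rho = pearson(z_o, z_f)
    z = rho * to_normal(fcst, new_fcst)  # conditional median in z-space
    u = N.cdf(z)                         # back to probability space
    s = sorted(obs)                      # map through observed quantiles
    return s[min(int(u * len(s)), len(s) - 1)]
```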

  3. Do High Dynamic Range treatments improve the results of Structure from Motion approaches in Geomorphology?

    NASA Astrophysics Data System (ADS)

    Gómez-Gutiérrez, Álvaro; Juan de Sanjosé-Blasco, José; Schnabel, Susanne; de Matías-Bejarano, Javier; Pulido-Fernández, Manuel; Berenguer-Sempere, Fernando

    2015-04-01

    In this work, the hypothesis of improving 3D models obtained with Structure from Motion (SfM) approaches using images pre-processed by High Dynamic Range (HDR) techniques is tested. Photographs of the Veleta Rock Glacier in Spain were captured with different exposure values (EV0, EV+1 and EV-1), two focal lengths (35 and 100 mm) and under different weather conditions for the years 2008, 2009, 2011, 2012 and 2014. HDR images were produced using the different EV steps within Fusion F.1 software. Point clouds were generated using commercial and freely available SfM software: Agisoft Photoscan and 123D Catch. Models obtained using pre-processed images and non-preprocessed images were compared in a 3D environment with a benchmark 3D model obtained by means of a Terrestrial Laser Scanner (TLS). A total of 40 point clouds were produced, georeferenced and compared. Results indicated that for Agisoft Photoscan software, differences in the accuracy between models obtained with pre-processed and non-preprocessed images were not significant from a statistical viewpoint. However, in the case of the freely available software 123D Catch, models obtained using images pre-processed by HDR techniques presented a higher point density and were more accurate. This tendency was observed along the 5 studied years and under different capture conditions. More work should be done in the near future to corroborate whether the results of similar software packages can be improved by HDR techniques (e.g. ARC3D, Bundler and PMVS2, CMP SfM, Photosynth and VisualSFM).
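    A per-pixel exposure merge of the bracketed EV-1/EV0/EV+1 images can be sketched as below. This is a minimal stand-in for the fusion performed in Fusion F.1, which is not public in detail; the Gaussian well-exposedness weight and its width are assumptions.

```python
import math

def fuse(exposures):
    """Merge equally sized grayscale images (pixel values in [0, 1]) by
    weighting each pixel by how close it is to mid-exposure (0.5)."""
    h, w = len(exposures[0]), len(exposures[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for img in exposures:
                p = img[y][x]
                # well-exposedness weight: favors mid-tone pixels
                wgt = math.exp(-((p - 0.5) ** 2) / (2 * 0.2 ** 2))
                num += wgt * p
                den += wgt
            out[y][x] = num / den
    return out
```

    The effect is that under- and over-exposed values contribute little, which is one plausible reason HDR pre-processing gives SfM matchers more usable texture in shadowed and bright regions.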

  4. Space processing of chalcogenide glass

    NASA Technical Reports Server (NTRS)

    Larsen, D. C.; Ali, M. I.

    1977-01-01

    The manner in which the weightless, containerless nature of in-space processing can be successfully utilized to improve the quality of infrared-transmitting chalcogenide glasses is determined. The technique of space processing chalcogenide glass was developed, and the process and equipment necessary to do so were defined. Earthbound processing experiments were conducted with As2S3 and Ge28Sb12Se60 glasses. Incorporated into these experiments is the use of an acoustic levitation device.

  5. How to improve patient satisfaction when patients are already satisfied: a continuous process-improvement approach.

    PubMed

    Friesner, Dan; Neufelder, Donna; Raisor, Janet; Bozman, Carl S

    2009-01-01

    The authors present a methodology that measures improvement in customer satisfaction scores when those scores are already high and the production process is slow and thus does not generate a large amount of useful data in any given time period. The authors used these techniques with data from a midsized rehabilitation institute affiliated with a regional, nonprofit medical center. Thus, this article functions as a case study, the findings of which may be applicable to a large number of other healthcare providers that share both the mission and challenges faced by this facility. The methodology focused on 2 factors: use of the unique characteristics of panel data to overcome the paucity of observations and a dynamic benchmarking approach to track process variability over time. By focusing on these factors, the authors identify some additional areas for process improvement despite the institute's past operational success.

  6. Additive Manufacturing Techniques for the Reconstruction of 3D Fetal Faces.

    PubMed

    Speranza, Domenico; Citro, Daniela; Padula, Francesco; Motyl, Barbara; Marcolin, Federica; Calì, Michele; Martorelli, Massimo

    2017-01-01

    This paper deals with additive manufacturing techniques for the creation of 3D fetal face models starting from routine 3D ultrasound data. In particular, two distinct themes are addressed. First, a method for processing and building 3D models based on the use of medical image processing techniques is proposed. Second, the preliminary results of a questionnaire distributed to future parents consider the use of these reconstructions both from an emotional and an affective point of view. In particular, the study focuses on the enhancement of the perception of maternity or paternity and the improvement in the relationship between parents and physicians in case of fetal malformations, in particular facial or cleft lip diseases.

  7. Optimal Sensor Management and Signal Processing for New EMI Systems

    DTIC Science & Technology

    2010-09-01

    adaptive techniques that would improve the speed of data collection and increase the mobility of a TEMTADS system. Although an active learning technique...data, SIG has simulated the active selection based on the data already collected at Camp SLO. In this setup, the active learning approach was constrained...to work only on a 5x5 grid (corresponding to twenty five transmitters and co-located receivers). The first technique assumes that active learning will

  8. Determination of the smoke-plume heights and their dynamics with ground-based scanning LIDAR

    Treesearch

    V. Kovalev; A. Petkov; C. Wold; S. Urbanski; W. M. Hao

    2015-01-01

    Lidar-data processing techniques are analyzed that allow determining smoke-plume heights and their dynamics and can be helpful for improving smoke dispersion and air quality models. The data processing algorithms considered in the paper are based on the analysis of two alternative characteristics related to the smoke dispersion process: the regularized...

  9. Percutaneous Repair Technique for Acute Achilles Tendon Rupture with Assistance of Kirschner Wire.

    PubMed

    He, Ze-yang; Chai, Ming-xiang; Liu, Yue-ju; Zhang, Xiao-ran; Zhang, Tao; Song, Lian-xin; Ren, Zhi-xin; Wu, Xi-rui

    2015-11-01

    The aim of this study is to introduce a self-designed, minimally invasive technique for repairing an acute Achilles tendon rupture percutaneously. Compared with the traditional open repair, the new technique provides obvious advantages of minimized operation-related lesions, fewer wound complications, and a higher healing rate. However, a percutaneous technique without direct vision may be criticized for its insufficient anastomosis of the Achilles tendon and may also lead to lengthening of the Achilles tendon and a reduction in the strength of the gastrocnemius. To address the potential problems, we have improved our technique using a percutaneous Kirschner wire leverage process before suturing, which can effectively recover the length of the Achilles tendon and ensure the broken ends are in tight contact. With this improvement in technique, we have great confidence that it will become the treatment of choice for acute Achilles tendon ruptures. © 2015 Chinese Orthopaedic Association and Wiley Publishing Asia Pty Ltd.

  10. Design and optimization of the micro-engine turbine rotor manufacturing using the rapid prototyping technology

    NASA Astrophysics Data System (ADS)

    Vdovin, R. A.; Smelov, V. G.

    2017-02-01

    This work describes the experience in manufacturing the turbine rotor for the micro-engine. It demonstrates the design principles for the complex investment casting process combining the use of the ProCast software and the rapid prototyping techniques. At the virtual modelling stage, in addition to optimized process parameters, the casting structure was improved to obtain the defect-free section. The real production stage allowed demonstrating the performance and fitness of rapid prototyping techniques for the manufacture of geometrically-complex engine-building parts.

  11. Smith predictor with sliding mode control for processes with large dead times

    NASA Astrophysics Data System (ADS)

    Mehta, Utkal; Kaya, İbrahim

    2017-11-01

    The paper discusses the Smith Predictor scheme with Sliding Mode Controller (SP-SMC) for processes with large dead times. This technique gives improved load-disturbance rejection with optimum input control signal variations. A power-rate reaching law is incorporated in the discontinuous part of the sliding mode control such that the overall performance improves meaningfully. The proposed scheme obtains parameter values by satisfying a new performance index based on a bi-objective constraint. In a simulation study, the efficiency of the method is evaluated for robustness and transient performance against reported techniques.
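    The power-rate reaching law itself can be sketched in isolation: the sliding variable obeys s' = -k·|s|^α·sign(s) with 0 < α < 1, which reaches s = 0 in finite time with a step size that shrinks near the surface, reducing chattering. The gains below are illustrative, not the paper's tuning, and no plant or Smith predictor is modeled here.

```python
def reach(s0, k=2.0, alpha=0.5, dt=0.01, steps=200):
    """Euler-integrate s' = -k * |s|**alpha * sign(s); returns the trajectory."""
    s, traj = s0, [s0]
    for _ in range(steps):
        sgn = (s > 0) - (s < 0)          # sign(s) without importing math
        s = s - k * abs(s) ** alpha * sgn * dt
        traj.append(s)
    return traj
```

    With s0 = 1, k = 2, α = 0.5, the continuous-time law reaches zero at t = s0^(1-α)/(k(1-α)) = 1 s, i.e. around step 100 here, after which the discrete trajectory only chatters at a very small amplitude.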

  12. Manufacturing engineering: Principles for optimization

    NASA Astrophysics Data System (ADS)

    Koenig, Daniel T.

    Various subjects in the area of manufacturing engineering are addressed. The topics considered include: manufacturing engineering organization concepts and management techniques, factory capacity and loading techniques, capital equipment programs, machine tool and equipment selection and implementation, producibility engineering, methods, planning and work management, and process control engineering in job shops. Also discussed are: maintenance engineering, numerical control of machine tools, fundamentals of computer-aided design/computer-aided manufacture, computer-aided process planning and data collection, group technology basis for plant layout, environmental control and safety, and the Integrated Productivity Improvement Program.

  13. Basic forest cover mapping using digitized remote sensor data and automated data processing techniques

    NASA Technical Reports Server (NTRS)

    Coggeshall, M. E.; Hoffer, R. M.

    1973-01-01

    Remote sensing equipment and automatic data processing techniques were employed as aids in instituting improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.

  14. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yonggang

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  15. Approximate techniques of structural reanalysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lowder, H. E.

    1974-01-01

    A study is made of two approximate techniques for structural reanalysis. These include Taylor series expansions for response variables in terms of design variables and the reduced-basis method. In addition, modifications to these techniques are proposed to overcome some of their major drawbacks. The modifications include a rational approach to the selection of the reduced-basis vectors and the use of Taylor series approximation in an iterative process. For the reduced basis, a normalized set of vectors is chosen which consists of the original analyzed design and the first-order sensitivity analysis vectors. The use of the Taylor series approximation as a first (initial) estimate in an iterative process can lead to significant improvements in accuracy, even with one iteration cycle. Therefore, the range of applicability of the reanalysis technique can be extended. Numerical examples are presented which demonstrate the gain in accuracy obtained by using the proposed modification techniques, for a wide range of variations in the design variables.
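    The Taylor-plus-iteration idea can be sketched on a scalar stand-in for the matrix problem K(d)u = f: start from the first-order Taylor estimate of the modified response and refine it with a few fixed-point iterations preconditioned by the original stiffness. The scalar reduction and iteration count are illustrative assumptions, not the paper's formulation.

```python
def reanalyze(k0, f, dk, iters=3):
    """Approximate u = f / (k0 + dk), i.e. the response of the modified
    design, starting from a first-order Taylor expansion about k0."""
    u0 = f / k0                          # response of the original design
    u = u0 - (f / k0 ** 2) * dk          # first-order Taylor estimate
    for _ in range(iters):               # Richardson refinement with K0
        u = u + (f - (k0 + dk) * u) / k0
    return u
```

    Each iteration shrinks the error by a factor |dk/k0|, so even one cycle after the Taylor estimate gives a clear accuracy gain for moderate design changes, mirroring the abstract's observation.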

  16. A Survey of Stemming Algorithms in Information Retrieval

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angélica; Imbert, Ricardo; Ramírez, Jaime

    2014-01-01

    Background: During the last fifty years, improved information retrieval techniques have become necessary because of the huge amount of information people have available, which continues to increase rapidly due to the use of new technologies and the Internet. Stemming is one of the processes that can improve information retrieval in terms of…
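    Stemming can be illustrated with a toy suffix-stripping stemmer in the spirit of Porter's algorithm; the rule list below is a tiny illustrative subset with a minimum-stem-length guard, not Porter's actual rules or any algorithm surveyed in the paper.

```python
# Longest-match suffix rules; a real stemmer has ordered rule phases.
SUFFIXES = ["ational", "ization", "fulness", "edly", "ing", "ed", "es", "ly", "s"]

def stem(word):
    """Strip the longest matching suffix, keeping a stem of at least 3 letters."""
    for suf in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suf) and len(word) - len(suf) >= 3:
            return word[: -len(suf)]
    return word
```

    Conflating "processing" and "processes" to the stem "process" is exactly what lets an index match documents across morphological variants of a query term.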

  17. Supervision that Improves Teaching: Strategies and Techniques. Second Edition

    ERIC Educational Resources Information Center

    Sullivan, Susan; Glanz, Jeffrey

    2004-01-01

    In this exciting, new edition of "Supervision That Improves Teaching," the authors have taken their reflective clinical supervision process to a new level, with the planning conference now the heart of the supervision cycle. Sullivan and Glanz have addressed the dilemmas of preserving meaningful supervision in an era of high-stakes…

  18. High dynamic range hyperspectral imaging for camouflage performance test and evaluation

    NASA Astrophysics Data System (ADS)

    Pearce, D.; Feenan, J.

    2016-10-01

    This paper demonstrates the use of high dynamic range processing applied to the specific technique of hyperspectral imaging with linescan spectrometers. The technique provides an improvement in signal to noise for reflectance estimation. This is demonstrated for field measurements of rural scenes collected with a ground-based linescan spectrometer. Once fully developed, the specific application is expected to improve the colour estimation approaches and consequently the test and evaluation accuracy of camouflage performance tests. Data are presented on both field and laboratory experiments that have been used to evaluate the improvements granted by the adoption of high dynamic range data acquisition in the field of hyperspectral imaging. High dynamic range imaging is well suited to the hyperspectral domain due to the large variation in solar irradiance across the visible and short wave infra-red (SWIR) spectrum coupled with the wavelength dependence of the nominal silicon detector response. Under field measurement conditions it is generally impractical to provide artificial illumination; consequently, an adaptation of the hyperspectral imaging and reflectance estimation process has been developed to accommodate the solar spectrum. This is shown to improve the signal to noise ratio for the reflectance estimation process of scene materials in the 400-500 nm and 700-900 nm regions.

  19. Process Engineering with the Evolutionary Spiral Process Model. Version 01.00.06

    DTIC Science & Technology

    1994-01-01

    program. The Process Definition and Modeling Guidebook (SPC-92041-CMC) provides methods for defining and documenting processes so they can be analyzed, modified... and Program Evaluation and Review Technique (PERT) support the activity of developing a project schedule. A variety of automated tools, such as... keep the organization from becoming disoriented during the improvement program (Curtis, Kellner, and Over 1992). Analyzing and documenting how

  20. Analysis of the United States Marine Corps Continuous Process Improvement Program Applied to the Contracting Process at Marine Corps Regional Contracting Office - Southwest

    DTIC Science & Technology

    2007-12-01

    3. Poka-yoke... 4. Systems for... Standard operating procedures • Visual displays for workflow and communication • Total productive maintenance • Poka-yoke techniques to prevent... process step or eliminating non-value-added steps, and reducing the seven common wastes, will decrease the total time of a process. 3. Poka-yoke

  1. Coupling Computer-Aided Process Simulation and ...

    EPA Pesticide Factsheets

    A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable

  2. A novel approach to enhance antibody sensitivity and specificity by peptide cross-linking

    PubMed Central

    Namiki, Takeshi; Valencia, Julio C.; Hall, Matthew D.; Hearing, Vincent J.

    2008-01-01

    Most current techniques employed to improve antigen-antibody signals in western blotting and in immunohistochemistry rely on sample processing prior to staining (e.g., microwaving) or on using a more robust reporter (e.g., a secondary antibody with biotin-streptavidin). We have developed and optimized a new approach intended to stabilize the complexes formed between antigens and their respective primary antibodies using cupric ions at high pH. This technique improves the affinity and lowers cross-reactivity with non-specific bands for ∼20% of the antibodies tested (5/25). Here we report that this method can enhance antigen-antibody specificity and can improve the utility of some poorly reactive primary antibodies. PMID:18801330

  3. Investigation on improved Gabor order tracking technique

    NASA Astrophysics Data System (ADS)

    Pan, Min-Chun; Chiu, Chun-Ching

    2004-07-01

    The study proposes an improved Gabor order tracking (GOT) technique to cope with crossing orders that cannot be effectively separated using the original GOT scheme. The improvement aids both the reconstruction and interpretation of two crossing orders, such as a transmission-element-related order component and a structural resonant component. In the paper, the influence of the dual function on the Gabor expansion coefficients, which can affect the precision of the tracked order component, is investigated. Additionally, the use of the GOT scheme in noisy conditions is demonstrated. To apply the improved GOT in real tasks, the separation and extraction of close-order components of vibration signals measured from a transmission-element test bench is illustrated using both the GOT and Vold-Kalman filtering (VKF) OT schemes. Finally, comprehensive comparisons between the improved GOT and VKF_OT schemes are made based on the processing results.

  4. The Interview and Personnel Selection: Is the Process Valid and Reliable?

    ERIC Educational Resources Information Center

    Niece, Richard

    1983-01-01

    Reviews recent literature concerning the job interview. Concludes that such interviews are generally ineffective and proposes that school administrators devise techniques for improving their interviewing systems. (FL)

  5. Ergonomics and design: its principles applied in the industry.

    PubMed

    Tavares, Ademario Santos; Silva, Francisco Nilson da

    2012-01-01

    Industrial Design encompasses both product development and the optimization of production processes. In this sense, Ergonomics plays a fundamental role, because its principles, methods and techniques can help operators carry out their tasks more successfully. A case study carried out in an industry shows that interaction among the Design, Production Engineering and Materials Engineering departments may improve aspects concerning security, comfort, efficiency and performance. In this process, Ergonomics proved to be of essential importance to strategic decision making for the improvement of the production section.

  6. Plagiarism Detection for Indonesian Language using Winnowing with Parallel Processing

    NASA Astrophysics Data System (ADS)

    Arifin, Y.; Isa, S. M.; Wulandhari, L. A.; Abdurachman, E.

    2018-03-01

    Plagiarism takes many forms: not only copy-and-paste, but also changes such as turning passive voice into active voice, or paraphrasing without appropriate acknowledgment. It occurs in every language, including Indonesian. Much previous research has addressed plagiarism detection in Indonesian using different methods, but several aspects still leave room for improvement. This research proposes a solution that improves the plagiarism detection technique so that it can detect more than the simple copy-and-paste form. The proposed solution uses Winnowing with additional steps in the pre-processing stage: stemming for Indonesian, and fingerprint generation in parallel, which saves processing time while still producing the plagiarism result for the suspected document.
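    The winnowing step described above can be sketched briefly. The following is a minimal, illustrative Python version: character k-grams and Python's built-in hash stand in for the authors' stemmed tokens and rolling hash, and the parallel fingerprint generation is omitted.

```python
# Minimal winnowing sketch (illustrative only, not the authors' code).

def kgram_hashes(text, k=5):
    """Hash every overlapping k-gram of the text."""
    return [hash(text[i:i + k]) for i in range(len(text) - k + 1)]

def winnow(hashes, w=4):
    """Keep the minimum hash in each window of w consecutive hashes.
    The recorded (position, hash) pairs form the document fingerprint.
    (The classic algorithm prefers the rightmost minimum; the leftmost
    is taken here for brevity -- either yields a valid fingerprint.)"""
    fingerprint = set()
    for i in range(len(hashes) - w + 1):
        window = hashes[i:i + w]
        j = min(range(w), key=lambda n: window[n])
        fingerprint.add((i + j, window[j]))
    return fingerprint

def similarity(fp_a, fp_b):
    """Jaccard-style overlap of fingerprint hashes as a plagiarism score."""
    a = {h for _, h in fp_a}
    b = {h for _, h in fp_b}
    return len(a & b) / max(1, len(a | b))

doc1 = "the quick brown fox jumps over the lazy dog"
doc2 = "the quick brown fox leaps over a lazy dog"
score = similarity(winnow(kgram_hashes(doc1)), winnow(kgram_hashes(doc2)))
```

    A score of 1.0 indicates identical fingerprints; partially overlapping documents score between 0 and 1.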

  7. LEAN SIX SIGMA TECHNIQUES TO IMPROVE OPHTHALMOLOGY CLINIC EFFICIENCY.

    PubMed

    Ciulla, Thomas A; Tatikonda, Mohan V; ElMaraghi, Yehya A; Hussain, Rehan M; Hill, Amanda L; Clary, Julie M; Hattab, Eyas

    2017-07-18

    Ophthalmologists serve an increasing volume of a growing elderly population undergoing increasingly complex outpatient medical care, including extensive diagnostic testing and treatment. The resulting prolonged patient visit times ("patient flow times") limit quality, patient and employee satisfaction, and represent waste. Lean Six Sigma process improvement was used in a vitreoretinal practice to decrease patient flow time, demonstrating that this approach can yield significant improvement in health care. Process flow maps were created to determine the most common care pathways within the clinic. Three months of visit data were collected from the electronic medical record system, which tracks patient task times at each process step in the office. Care tasks and care pathways consuming the greatest time and variation were identified and modified. A follow-up analysis of 6 weeks' visits was conducted to assess improvement. Nearly all patients took one of five paths through the office. Patient flow was redesigned to reduce waiting room time by having staff members immediately start patients into one of those five paths; staffing was adjusted to address high-demand tasks, and scheduling was optimized around derived predictors of patient flow times. Follow-up analysis revealed a statistically significant 18% decline in mean patient flow time and a 4.6% decline in patient flow time standard deviation. Patient and employee satisfaction scores improved. Manufacturing industry techniques, such as Lean and Six Sigma, can be used to improve patient care, minimize waste, and enhance patient and staff satisfaction in outpatient clinics.

  8. Enhancing Teaching Effectiveness Using Experiential Techniques: Model Development and Empirical Evaluation.

    ERIC Educational Resources Information Center

    Wagner, Richard J.; And Others

    In U.S. colleges and universities, much attention has been focused on the need to improve teaching quality and to involve students in the learning process. At the same time, many faculty members are faced with growing class sizes and with time pressures due to research demands. One useful technique is to divide the class into small groups and…

  9. Improvement of transmission properties of visible pilot beam for polymer-coated silver hollow fibers with acrylic silicone resin as buffer layer for sturdy structure

    NASA Astrophysics Data System (ADS)

    Iwai, Katsumasa; Takaku, Hiroyuki; Miyagi, Mitsunobu; Shi, Yi-Wei; Zhu, Xiao-Song; Matsuura, Yuji

    2017-02-01

    Flexible hollow fibers with a 530-μm bore size were developed for infrared laser delivery. Sturdy hollow fibers were fabricated by liquid-phase coating techniques, using a silica glass capillary as the substrate. Acrylic silicone resin is used as a buffer layer; the buffer layer is first coated on the inner surface of the capillary to protect the glass tube from chemical damage during the subsequent silver plating process. A silver layer was then inner-plated using the conventional silver mirror-plating technique. To improve adhesion of the catalyst to the buffer layer, a surface conditioner was introduced into the silver mirror-plating method. We discuss the improved transmission properties of sturdy polymer-coated silver hollow fibers for Er:YAG laser and red pilot beam delivery.

  10. Building a new predictor for multiple linear regression technique-based corrective maintenance turnaround time.

    PubMed

    Cruz, Antonio M; Barr, Cameron; Puñales-Pozo, Elsa

    2008-01-01

    This research's main goals were to build a predictor for a turnaround time (TAT) indicator for estimating its values, and to use a numerical clustering technique for finding possible causes of undesirable TAT values. The following stages were used: domain understanding, data characterisation and sample reduction, and insight characterisation. Multiple linear regression (to build the TAT predictor) and clustering techniques were used to improve corrective maintenance task efficiency in a clinical engineering department (CED). Multiple linear regression was used for building a predictive TAT value model. The variables contributing to this model were clinical engineering department response time (CE(rt), 0.415 positive coefficient), stock service response time (Stock(rt), 0.734 positive coefficient), priority level (0.21 positive coefficient) and service time (0.06 positive coefficient). The regression process showed heavy reliance on Stock(rt), CE(rt) and priority, in that order. Clustering techniques revealed the main causes of high TAT values. This examination has provided a means for analysing current technical service quality and effectiveness. In doing so, it has demonstrated a process for identifying areas and methods of improvement and a model against which to analyse these methods' effectiveness.
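    The regression step can be illustrated with synthetic data. The sketch below fits an ordinary-least-squares model of the same shape as the abstract's predictor; the variable roles (CE(rt), Stock(rt), priority, service time) and coefficient values are taken from the abstract, but the data are simulated, not the CED records.

```python
import numpy as np

# Synthetic illustration of fitting a multiple linear regression TAT predictor.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0, 10, size=(n, 4))              # CE_rt, Stock_rt, priority, service
true_coef = np.array([0.415, 0.734, 0.21, 0.06])  # coefficients from the abstract
y = X @ true_coef + rng.normal(0, 0.1, n)        # synthetic TAT values

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, coefs = beta[0], beta[1:]

predicted_tat = A @ beta  # fitted TAT values, e.g. as input for clustering
```

    On real data the fitted `coefs` would be inspected for sign and magnitude, as the abstract does when ranking Stock(rt), CE(rt) and priority.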

  11. Toward dynamic magnetic resonance imaging of the vocal tract during speech production.

    PubMed

    Ventura, Sandra M Rua; Freitas, Diamantino Rui S; Tavares, João Manuel R S

    2011-07-01

    The most recent and significant magnetic resonance imaging (MRI) improvements allow for the visualization of the vocal tract during speech production, which has been revealed to be a powerful tool in dynamic speech research. However, a synchronization technique with enhanced temporal resolution is still required. The study design was transversal in nature. Throughout this work, a technique for the dynamic study of the vocal tract with MRI by using the heart's signal to synchronize and trigger the imaging-acquisition process is presented and described. The technique in question is then used in the measurement of four speech articulatory parameters to assess three different syllables (articulatory gestures) of European Portuguese Language. The acquired MR images are automatically reconstructed so as to result in a variable sequence of images (slices) of different vocal tract shapes in articulatory positions associated with Portuguese speech sounds. The knowledge obtained as a result of the proposed technique represents a direct contribution to the improvement of speech synthesis algorithms, thereby allowing for novel perceptions in coarticulation studies, in addition to providing further efficient clinical guidelines in the pursuit of more proficient speech rehabilitation processes. Copyright © 2011 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  12. Improved representation of situational awareness within a dismounted small combat unit constructive simulation

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Colony, Mike

    2011-06-01

    Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to representation of situation awareness without inclusion of the many uncertainties associated with real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: the Object-Oriented Bayesian Network methodology and the Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.

  13. Improved preconditioned conjugate gradient algorithm and application in 3D inversion of gravity-gradiometry data

    NASA Astrophysics Data System (ADS)

    Wang, Tai-Han; Huang, Da-Nian; Ma, Guo-Qing; Meng, Zhao-Hai; Li, Ye

    2017-06-01

    With the continuous development of full tensor gradiometer (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is becoming increasingly used in oil and gas exploration. In the fast processing and interpretation of large-scale high-precision data, the use of the graphics processing unit (GPU) and preconditioning methods is very important in the data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique and the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a parallel implementation based on the GPU is proposed. The improved method is then applied in the inversion of noise-contaminated synthetic data to prove its adaptability in the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm based on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times that of a serial program using a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method in fast inversion of 3D FTG data.
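    A dense, serial sketch of an SSOR-preconditioned conjugate gradient loop is shown below. This is illustrative only: the paper's implementation combines SSOR with incomplete Cholesky and runs in parallel on a GPU, and real FTG inversions use large sparse operators rather than dense solves.

```python
import numpy as np

def ssor_solve(A, r, omega=1.0):
    """Apply the SSOR preconditioner, i.e. return z = M^{-1} r with
    M^{-1} = w (2 - w) (D/w + U)^{-1} (D/w) (D/w + L)^{-1}.
    With omega = 1 this is symmetric Gauss-Seidel preconditioning."""
    D = np.diag(np.diag(A))
    L = np.tril(A, -1)
    U = np.triu(A, 1)
    y = np.linalg.solve(D / omega + L, r)                # forward sweep
    z = np.linalg.solve(D / omega + U, (D / omega) @ y)  # backward sweep
    return omega * (2.0 - omega) * z

def pcg(A, b, tol=1e-10, max_iter=500):
    """Preconditioned conjugate gradient for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = ssor_solve(A, r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = ssor_solve(A, r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

    The preconditioner solve is the extra per-iteration cost the paper amortizes on the GPU.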

  14. Intelligent form removal with character stroke preservation

    NASA Astrophysics Data System (ADS)

    Garris, Michael D.

    1996-03-01

    A new technique for intelligent form removal has been developed along with a new method for evaluating its impact on optical character recognition (OCR). All the dominant lines in the image are automatically detected using the Hough line transform and intelligently erased while simultaneously preserving overlapping character strokes by computing line width statistics and keying off of certain visual cues. This new method of form removal operates on loosely defined zones with no image deskewing. Any field in which the writer is provided a horizontal line to enter a response can be processed by this method. Several examples of processed fields are provided, including a comparison of results between the new method and a commercially available forms removal package. Even if this new form removal method did not improve character recognition accuracy, it is still a significant improvement to the technology because the requirement of a priori knowledge of the form's geometric details has been greatly reduced. This relaxes the recognition system's dependence on rigid form design, printing, and reproduction by automatically detecting and removing some of the physical structures (lines) on the form. Using the National Institute of Standards and Technology (NIST) public domain form-based handprint recognition system, the technique was tested on a large number of fields containing randomly ordered handprinted lowercase alphabets, as these letters (especially those with descenders) frequently touch and extend through the line along which they are written. Preserving character strokes improves overall lowercase recognition performance by 3%, which is a net improvement, but a single performance number like this doesn't communicate how the recognition process was really influenced. Trade-offs are expected with the introduction of any new technique into a complex recognition system.
To understand both the improvements and the trade-offs, a new analysis was designed to compare the statistical distributions of individual confusion pairs between two systems. As OCR technology continues to improve, sophisticated analyses like this are necessary to reduce the errors remaining in complex recognition problems.
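    The line-detection-and-erasure idea can be reduced to a toy example. The sketch below handles only the degenerate case of a perfectly horizontal line on a synthetic binary field, with a crude neighboring-row cue standing in for the paper's line-width statistics and visual cues.

```python
import numpy as np

# Toy sketch: detect a dominant horizontal form line and erase it while
# preserving overlapping character strokes (illustrative only).

def hough_horizontal(img):
    """Return the row with the most 'on' pixels -- the degenerate Hough
    case (theta = 90 degrees), sufficient for a horizontal form line."""
    return int(np.argmax(img.sum(axis=1)))

def erase_line(img, row, line_width=1):
    """Blank the detected line, but keep pixels whose column also has ink
    just above or below it (a crude 'overlapping stroke' cue)."""
    out = img.copy()
    above = img[max(0, row - line_width - 1)]
    below = img[min(img.shape[0] - 1, row + line_width + 1)]
    keep = (above | below).astype(bool)
    out[row, ~keep] = 0
    return out

# 20 x 40 synthetic field: a form line at row 12 plus a descender crossing it
img = np.zeros((20, 40), dtype=np.uint8)
img[12, :] = 1         # the form line
img[5:18, 10] = 1      # a vertical stroke through the line
row = hough_horizontal(img)
cleaned = erase_line(img, row)
```

    In `cleaned`, the line pixels are gone except where the stroke crosses, which is the behavior the paper's stroke-preservation step is after.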

  15. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  16. Improved inference in Bayesian segmentation using Monte Carlo sampling: application to hippocampal subfield volumetry.

    PubMed

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2013-10-01

    Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer's disease classification task. As an additional benefit, the technique also allows one to compute informative "error bars" on the volume estimates of individual structures. Copyright © 2013 Elsevier B.V. All rights reserved.
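    The marginalization idea generalizes beyond segmentation and can be shown on a one-parameter toy model: instead of fixing the parameter at a point estimate, Metropolis samples from its posterior are averaged, which also yields the kind of "error bar" mentioned above. Everything below is synthetic and far simpler than the hippocampal subfield model.

```python
import numpy as np

# Toy Metropolis sampler marginalizing over a single model parameter theta.
rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=50)   # synthetic observations

def log_post(theta):
    """Log posterior of a Gaussian mean with a flat prior and unit variance."""
    return -0.5 * np.sum((data - theta) ** 2)

samples, theta = [], 0.0
for _ in range(5000):
    prop = theta + rng.normal(0, 0.5)   # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                    # accept
    samples.append(theta)

burned = np.array(samples[1000:])       # discard burn-in
# Marginalized estimate and its uncertainty ("error bar")
post_mean, post_sd = burned.mean(), burned.std()
```

    In the segmentation setting, each MCMC sample of the model parameters would drive one segmentation, and the volume estimates would be averaged across samples in the same way.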

  17. Improved Inference in Bayesian Segmentation Using Monte Carlo Sampling: Application to Hippocampal Subfield Volumetry

    PubMed Central

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Leemput, Koen Van

    2013-01-01

    Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer’s disease classification task. As an additional benefit, the technique also allows one to compute informative “error bars” on the volume estimates of individual structures. PMID:23773521

  18. Improved Rhenium Thrust Chambers

    NASA Technical Reports Server (NTRS)

    O'Dell, John Scott

    2015-01-01

    Radiation-cooled bipropellant thrust chambers are being considered for ascent/descent engines and reaction control systems on various NASA missions and spacecraft, such as the Mars Sample Return and Orion Multi-Purpose Crew Vehicle (MPCV). Currently, iridium (Ir)-lined rhenium (Re) combustion chambers are the state of the art for in-space engines. NASA's Advanced Materials Bipropellant Rocket (AMBR) engine, a 150-lbf Ir-Re chamber produced by Plasma Processes and Aerojet Rocketdyne, recently set a hydrazine specific impulse record of 333.5 seconds. To withstand the high loads during terrestrial launch, Re chambers with improved mechanical properties are needed. Recent electrochemical forming (EL-Form™) results have shown considerable promise for improving Re's mechanical properties by producing a multilayered deposit composed of a tailored microstructure (i.e., Engineered Re). The Engineered Re processing techniques were optimized, and detailed characterization and mechanical properties tests were performed. The most promising techniques were selected and used to produce an Engineered Re AMBR-sized combustion chamber for testing at Aerojet Rocketdyne.

  19. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    PubMed

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated. Process formalism (or formal process definition), a technique derived from software engineering, was employed to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.

  20. Improving quality of care in substance abuse treatment using five key process improvement principles

    PubMed Central

    Hoffman, Kim A.; Green, Carla A.; Ford, James H.; Wisdom, Jennifer P.; Gustafson, David H.; McCarty, Dennis

    2012-01-01

    Process and quality improvement techniques have been successfully applied in health care arenas, but efforts to institute these strategies in alcohol and drug treatment are underdeveloped. The Network for the Improvement of Addiction Treatment (NIATx) teaches participating substance abuse treatment agencies to use process improvement strategies to increase client access to, and retention in, treatment. NIATx recommends five principles to promote organizational change: 1) Understand and involve the customer; 2) Fix key problems; 3) Pick a powerful change leader; 4) Get ideas from outside the organization; and 5) Use rapid-cycle testing. Using case studies, supplemented with cross-agency analyses of interview data, this paper profiles participating NIATx treatment agencies that illustrate application of each principle. Results suggest that the most successful organizations integrate and apply most, if not all, of the five principles as they develop and test change strategies. PMID:22282129

  1. Vehicle registration compliance in Wisconsin : [summary].

    DOT National Transportation Integrated Search

    2015-03-01

    The Wisconsin Department of Transportation (WisDOT) conducted an investigation to improve its passenger vehicle registration processes, with the goals of modernizing techniques, reducing costs, enhancing security and maximizing compliance. WisDOT's D...

  2. Using Rapid Improvement Events for Disaster After-Action Reviews: Experience in a Hospital Information Technology Outage and Response.

    PubMed

    Little, Charles M; McStay, Christopher; Oeth, Justin; Koehler, April; Bookman, Kelly

    2018-02-01

    The use of after-action reviews (AARs) following major emergency events, such as a disaster, is common and mandated for hospitals and similar organizations. There is a recurrent challenge of identified problems not being resolved and repeated in subsequent events. A process improvement technique called a rapid improvement event (RIE) was used to conduct an AAR following a complete information technology (IT) outage at a large urban hospital. Using RIE methodology to conduct the AAR allowed for the rapid development and implementation of major process improvements to prepare for future IT downtime events. Thus, process improvement methodology, particularly the RIE, is suited for conducting AARs following disasters and holds promise for improving outcomes in emergency management. Little CM , McStay C , Oeth J , Koehler A , Bookman K . Using rapid improvement events for disaster after-action reviews: experience in a hospital information technology outage and response. Prehosp Disaster Med. 2018;33(1):98-100.

  3. New technique for real-time distortion-invariant multiobject recognition and classification

    NASA Astrophysics Data System (ADS)

    Hong, Rutong; Li, Xiaoshun; Hong, En; Wang, Zuyi; Wei, Hongan

    2001-04-01

    A real-time hybrid distortion-invariant OPR system was established to perform 3D multiobject distortion-invariant automatic pattern recognition. The wavelet transform technique was used for digital preprocessing of the input scene, to suppress the noisy background and enhance the recognized object. A three-layer backpropagation artificial neural network was used in correlation signal post-processing to perform multiobject distortion-invariant recognition and classification. The C-80 and NOA real-time processing ability and multithread programming technology were used to perform high-speed parallel multitask processing and speed up the post-processing of ROIs. The reference filter library was constructed for the distortion versions of the 3D object model images based on distortion parameter tolerance measures such as rotation, azimuth and scale. Real-time optical correlation recognition testing of this OPR system demonstrates that, using the preprocessing, post-processing, the nonlinear algorithm of optimum filtering, the RFL construction technique and multithread programming technology, a high probability of recognition and a high recognition rate were obtained for the real-time multiobject distortion-invariant OPR system. The recognition reliability and rate were improved greatly. These techniques are very useful for automatic target recognition.

  4. Alzheimer's Disease Assessment: A Review and Illustrations Focusing on Item Response Theory Techniques.

    PubMed

    Balsis, Steve; Choudhury, Tabina K; Geraci, Lisa; Benge, Jared F; Patrick, Christopher J

    2018-04-01

    Alzheimer's disease (AD) affects neurological, cognitive, and behavioral processes. Thus, to accurately assess this disease, researchers and clinicians need to combine and incorporate data across these domains. This presents not only distinct methodological and statistical challenges but also unique opportunities for the development and advancement of psychometric techniques. In this article, we describe relatively recent research using item response theory (IRT) that has been used to make progress in assessing the disease across its various symptomatic and pathological manifestations. We focus on applications of IRT to improve scoring, test development (including cross-validation and adaptation), and linking and calibration. We conclude by describing potential future multidimensional applications of IRT techniques that may improve the precision with which AD is measured.

  5. Energy resolution improvement of CdTe detectors by using the principal component analysis technique

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2018-02-01

    In this paper, we report on the application of the principal component analysis (PCA) technique for the improvement of the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of the charge-trapping effect, which is reflected in the shape of each detector pulse, thereby allowing a correction for charge trapping. The details of the method are described and the results obtained with a CdTe detector are shown. We have achieved an energy resolution of 1.8% (FWHM) at 662 keV with full detection efficiency from a 1-mm-thick CdTe detector, compared with 4.5% (FWHM) obtained using the standard pulse processing method.
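    The use of PCA on pulse shapes can be illustrated with synthetic pulses. Here a single shape parameter stands in for the charge-trapping effect, and the leading principal-component score recovers it; this is an assumption-laden toy, not the authors' CdTe processing chain.

```python
import numpy as np

# Sketch: PCA on detector pulse shapes to extract a shape parameter that
# tracks charge trapping; the score could then drive an amplitude correction.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 100)
rise = rng.uniform(0.02, 0.2, size=300)            # proxy for trapping depth
pulses = 1 - np.exp(-t[None, :] / rise[:, None])   # one synthetic pulse per row

X = pulses - pulses.mean(axis=0)                   # center the data
# PCA via SVD: rows of Vt are the principal components
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[0]                                 # projection on the 1st PC

# The leading PCA score should correlate strongly with the shape parameter
corr = np.corrcoef(scores, rise)[0, 1]
```

    In a real detector pipeline, the per-pulse score would be mapped (e.g. via calibration data) to an energy correction before building the spectrum.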

  6. Remote photoacoustic detection of liquid contamination of a surface.

    PubMed

    Perrett, Brian; Harris, Michael; Pearson, Guy N; Willetts, David V; Pitter, Mark C

    2003-08-20

    A method for the remote detection and identification of liquid chemicals at ranges of tens of meters is presented. The technique uses pulsed indirect photoacoustic spectroscopy in the 10-microm wavelength region. Enhanced sensitivity is brought about by three main system developments: (1) increased laser-pulse energy (150 microJ/pulse), leading to increased strength of the generated photoacoustic signal; (2) increased microphone sensitivity and improved directionality by the use of a 60-cm-diameter parabolic dish; and (3) signal processing that allows improved discrimination of the signal from noise levels through prior knowledge of the pulse shape and pulse-repetition frequency. The practical aspects of applying the technique in a field environment are briefly examined, and possible applications of this technique are discussed.
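    Development (3), exploiting prior knowledge of the pulse shape and pulse-repetition frequency, is essentially matched filtering plus coherent averaging. The following is a synthetic one-dimensional sketch; all waveforms and parameters are invented for illustration.

```python
import numpy as np

# Matched filtering with a known pulse shape, then coherent averaging
# across repetition periods using the known pulse-repetition frequency.
rng = np.random.default_rng(3)
n = 10_000                                # one second of samples at 10 kHz
template = np.hanning(50)                 # assumed known pulse shape
signal = np.zeros(n)
for start in range(500, n - 50, 1000):    # known repetition period: 1000 samples
    signal[start:start + 50] += template
noisy = signal + rng.normal(0, 1.0, n)    # pulses buried in noise

# Matched filter = correlation with the time-reversed template
mf = np.convolve(noisy, template[::-1], mode="same")

# Coherent averaging across the 10 repetition periods
frames = mf[:10 * 1000].reshape(10, 1000)
avg = frames.mean(axis=0)
peak = int(np.argmax(avg))                # pulse position within a period
```

    Averaging N periods suppresses uncorrelated noise by roughly the square root of N, which is the practical benefit of knowing the pulse-repetition frequency in advance.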

  7. The magic of image processing

    NASA Astrophysics Data System (ADS)

    Sulentic, Jack W.; Lorre, Jean J.

    1984-05-01

    Digital technology has been used to improve enhancement techniques in astronomical image processing. Continuous tone variations in photographs are assigned density number (DN) values, which are arranged in an array. DN locations are processed by computer and turned into pixels which form a reconstruction of the original scene on a television monitor. Digitized data can be manipulated to enhance contrast and filter out gross patterns of light and dark which obscure small-scale features. Separate black-and-white frames exposed at different wavelengths can be digitized and processed individually, then recombined to produce a final image in color. Several examples of the use of the technique are provided, including photographs of the spiral galaxy M33; four galaxies in Coma Berenices (NGC 4169, 4173, 4174, and 4175); and Stephan's Quintet.
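    The enhancement steps described (contrast manipulation and filtering out gross light/dark patterns) can be sketched on a synthetic DN array. This is a generic illustration, not the pipeline used for the images above.

```python
import numpy as np

# Synthetic DN array: faint small-scale features on a gross brightness gradient
rng = np.random.default_rng(4)
dn = rng.uniform(100, 120, size=(64, 64))      # small-scale structure
dn += np.linspace(0, 50, 64)[None, :]          # gross large-scale gradient

def contrast_stretch(img, lo=2, hi=98):
    """Map the [lo, hi] percentile range of DN values onto 0..255."""
    a, b = np.percentile(img, [lo, hi])
    return np.clip((img - a) / (b - a) * 255, 0, 255)

def highpass(img, k=9):
    """Subtract a k x k box-filtered local mean to suppress gross patterns."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    c = p.cumsum(0).cumsum(1)                  # summed-area table
    c = np.pad(c, ((1, 0), (1, 0)))
    box = (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)
    return img - box

enhanced = contrast_stretch(highpass(dn))
```

    The high-pass step removes the large-scale gradient so the subsequent contrast stretch is spent on the small-scale features rather than on the gross pattern.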

  8. Arc-Welding Spectroscopic Monitoring based on Feature Selection and Neural Networks.

    PubMed

    Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M

    2008-10-21

A new spectral processing technique designed for the on-line detection and classification of arc-welding defects is presented in this paper. A noninvasive fiber sensor embedded within a TIG torch collects the plasma radiation generated during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which is demonstrated to provide efficient weld-defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in previous works, showing an improvement in the performance of the monitoring system.

  9. High Tech Aids Low Vision: A Review of Image Processing for the Visually Impaired.

    PubMed

    Moshtael, Howard; Aslam, Tariq; Underwood, Ian; Dhillon, Baljean

    2015-08-01

    Recent advances in digital image processing provide promising methods for maximizing the residual vision of the visually impaired. This paper seeks to introduce this field to the readership and describe its current state as found in the literature. A systematic search revealed 37 studies that measure the value of image processing techniques for subjects with low vision. The techniques used are categorized according to their effect and the principal findings are summarized. The majority of participants preferred enhanced images over the original for a wide range of enhancement types. Adapting the contrast and spatial frequency content often improved performance at object recognition and reading speed, as did techniques that attenuate the image background and a technique that induced jitter. A lack of consistency in preference and performance measures was found, as well as a lack of independent studies. Nevertheless, the promising results should encourage further research in order to allow their widespread use in low-vision aids.

  10. Understanding the distributed cognitive processes of intensive care patient discharge.

    PubMed

    Lin, Frances; Chaboyer, Wendy; Wallis, Marianne

    2014-03-01

The aim was to better understand and identify vulnerabilities and risks in the ICU patient discharge process, providing evidence for service improvement. Previous studies have identified that 'after hours' discharge and 'premature' discharge from ICU are associated with increased mortality. However, some of these studies have largely been retrospective reviews of various administrative databases, while others have focused on specific aspects of the process, which may miss crucial components of the discharge process. This is an exploratory ethnographic study. Distributed cognition and activity theory were used as theoretical frameworks. Ethnographic data collection techniques, including informal interviews, direct observations and collection of existing documents, were used. A total of 56 one-to-one interviews were conducted with 46 participants; 28 discharges were observed; and numerous documents were collected during a five-month period. A triangulated technique was used in both data collection and data analysis to ensure research rigour. Under the guidance of the activity theory and distributed cognition frameworks, five themes emerged: hierarchical power and authority, competing priorities, ineffective communication, failing to enact the organisational processes and working collaboratively to optimise the discharge process. Issues with teamwork, cognitive processes and team members' interaction with cognitive artefacts influenced the discharge process. Strategies to improve shared situational awareness are needed to improve teamwork, patient flow and resource efficiency. Tools need to be evaluated regularly to ensure their continued usefulness. Health care professionals need to be aware of the impact of their competing priorities and ensure discharges occur in a timely manner. Activity theory and distributed cognition are useful theoretical frameworks to support healthcare organisational research. © 2013 John Wiley & Sons Ltd.

  11. Proposal of Heuristic Algorithm for Scheduling of Print Process in Auto Parts Supplier

    NASA Astrophysics Data System (ADS)

    Matsumoto, Shimpei; Okuhara, Koji; Ueno, Nobuyuki; Ishii, Hiroaki

We address the print process within the manufacturing operations of an auto parts supplier as a real-world problem. The purpose of this research is to apply our scheduling technique, developed in a university setting, to the actual print process in a mass customization environment. Rationalization of the print process depends on lot sizing. The manufacturing lead time of the print process is long, and in the present method production is planned according to workers' experience and intuition; the construction of an efficient production system is therefore an urgent problem. In this paper, in order to shorten the overall manufacturing lead time and reduce stock, we reexamine the usual heuristic lot-sizing rule and propose an improved method that can produce a more efficient schedule.
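The abstract does not specify the authors' heuristic, but the classical Silver–Meal rule illustrates the kind of lot-sizing heuristic being reexamined: keep extending the current lot over future periods while the average cost per period keeps falling (setup cost, holding cost, and demands below are invented for illustration):

```python
def silver_meal(demands, setup_cost, holding_cost):
    """Group period demands into lots using the Silver-Meal heuristic:
    extend the current lot while the average cost per period decreases."""
    lots = []
    i = 0
    while i < len(demands):
        best_avg = setup_cost          # cost per period if lot covers 1 period
        hold = 0.0
        t = 1
        while i + t < len(demands):
            # Carrying period i+t's demand for t periods adds holding cost.
            hold += holding_cost * t * demands[i + t]
            avg = (setup_cost + hold) / (t + 1)
            if avg >= best_avg:        # average cost stopped improving
                break
            best_avg = avg
            t += 1
        lots.append(sum(demands[i:i + t]))
        i += t
    return lots

# Illustrative data: four periods of demand, setup cost 50, holding cost 1.
print(silver_meal([10, 20, 30, 40], setup_cost=50, holding_cost=1))  # → [30, 70]
```

Here the first lot covers periods 1–2 (30 units) because adding period 3 would raise the average per-period cost, and the remainder forms a second lot of 70 units.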

  12. A New Hybrid BFOA-PSO Optimization Technique for Decoupling and Robust Control of Two-Coupled Distillation Column Process.

    PubMed

    Abdelkarim, Noha; Mohamed, Amr E; El-Garhy, Ahmed M; Dorrah, Hassen T

    2016-01-01

The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Typically, such a process is decoupled into several input/output pairings (loops), so that a single controller can be assigned to each loop. In this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for the decoupling and control schemes that ensures robust control behavior. To this end, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized to minimize the sum of the integral time-weighted squared errors (ITSEs) over all control loops. This optimization technique is a hybrid of the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures a low computational burden and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in time-domain behavior and robustness over the conventional PID controller.
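The objective minimized by the hybrid algorithm is the sum of per-loop ITSE values; for a sampled error signal the ITSE integral can be approximated directly with the trapezoidal rule (the exponentially decaying error below is illustrative, not one of the paper's loops):

```python
import numpy as np

def itse(t, e):
    """Integral of time-weighted squared error, ∫ t·e(t)² dt,
    approximated by the trapezoidal rule on sampled data."""
    f = t * e ** 2
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

# Illustrative loop error decaying as e(t) = exp(-t); the exact ITSE
# over [0, ∞) is ∫ t·exp(-2t) dt = 1/4.
t = np.linspace(0.0, 20.0, 20001)
total = itse(t, np.exp(-t))
```

An optimizer like the paper's BSO would evaluate a sum of such terms, one per control loop, for each candidate parameter set; the time weighting penalizes errors that persist late in the response.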

  13. A New Hybrid BFOA-PSO Optimization Technique for Decoupling and Robust Control of Two-Coupled Distillation Column Process

    PubMed Central

    Mohamed, Amr E.; Dorrah, Hassen T.

    2016-01-01

The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Typically, such a process is decoupled into several input/output pairings (loops), so that a single controller can be assigned to each loop. In this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for the decoupling and control schemes that ensures robust control behavior. To this end, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized to minimize the sum of the integral time-weighted squared errors (ITSEs) over all control loops. This optimization technique is a hybrid of the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures a low computational burden and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in time-domain behavior and robustness over the conventional PID controller. PMID:27807444

  14. Optimizing Vacuum Assisted Resin Transfer Moulding (VARTM) Processing Parameters to Improve Part Quality

    NASA Astrophysics Data System (ADS)

    Polowick, Christopher

    The Low Cost Composites (LCC) group at Carleton University is studying out-of-autoclave composite manufacturing processes such as Vacuum Assisted Resin Transfer Moulding (VARTM) and Closed Cavity Bag Moulding (CCBM). These processes are used to produce inexpensive and high performance components for the GeoSurv II, an Unmanned Aerial Vehicle (UAV) being developed at Carleton University. This research has focused on optimizing VARTM processing parameters to reduce the weight and improve the strength and surface finish of GeoSurv II composite components. A simulation was developed to model resin flow through in VARTM infusions and was used to simulate mould filling and resin emptying of the GeoSurv II inverted V-empennage and mission avionics hatch. The resin infusion schemes of these parts were designed to ensure full preform resin saturation, and minimize thickness variations. An experimental study of the effects of the presence of a corner on composite thickness, void content, and strength was conducted. It was found that inside corners result in local increases in thickness and void content due to poor preform compaction. A novel bagging technique was developed to improve corner compaction, and this technique was shown to reduce thickness variability and void content. The strength, void content, and thickness variation were found to be heavily dependent on corner radius, with corner radii greater than 6.4 mm displaying the greatest improvement in performance for the layups considered. The design of the empennage and hatch mould incorporated the results of this study to improve the quality of these components.

  15. Improved 3D seismic images of dynamic deformation in the Nankai Trough off Kumano

    NASA Astrophysics Data System (ADS)

    Shiraishi, K.; Moore, G. F.; Yamada, Y.; Kinoshita, M.; Sanada, Y.; Kimura, G.

    2016-12-01

In order to improve the seismic reflection image of dynamic deformation and seismogenic faults in the Nankai Trough, the 2006 Kumano 3D seismic dataset was reprocessed from the original field records, applying technologies developed in the decade since data acquisition and initial processing. The 3D seismic survey revealed the geometry of the megasplay fault system. However, regions of the accretionary prism beneath the area from the Kumano Basin to the outer ridge remained unclear because of sea-floor multiple reflections and noise caused by the Kuroshio Current. For the next stage of deep scientific drilling into the Nankai Trough seismogenic zone, it is essential to know exactly the shape and depth of the megasplay and the fine structures around the drilling site. Three important improvements were achieved in data processing before imaging. First, full deghosting and optimized zero-phasing techniques recovered broadband signals, especially at low frequencies, by compensating for ghost effects at both source and receiver and removing source bubbles. Second, multiple reflections were better attenuated by applying advanced techniques in combination, and the strong noise caused by the Kuroshio was carefully attenuated. Third, data regularization by means of optimized 4D trace interpolation was effective both in mitigating the non-uniform fold distribution and in improving data quality. Further imaging steps yielded clear improvements over previous results through PSTM with higher-order correction of VTI anisotropy and PSDM based on a velocity model built by reflection tomography with TTI anisotropy. The final reflection images reveal new geological features, such as clear steeply dipping faults around the "notch" and fine-scale faults related to the main thrusts in the frontal thrust zone. The improved images will contribute greatly to understanding the deformation process in the old accretionary prism and the seismogenic features related to the megasplay faults.

  16. Identifying critical issues in recreation planning and management: improving the management-research partnership

    Treesearch

    John H. Schomaker; David W. Lime

    1988-01-01

    The "nominal group" process is a proven technique to systematically arrive at a consensus about critical information needs in recreation planning and management. Using this process, 41 managers who attended a 1983 conference on river management identified 114 specific information needs grouped under 11 general questions. Clearly, some concerns of...

  17. A Multifactor Ecosystem Assessment of Wetlands Created Using a Novel Dredged Material Placement Technique in the Atchafalaya River, Louisiana: An Engineering With Nature Demonstration Project

    DTIC Science & Technology

    functions. The strategic placement of dredged materials in locations that mimic natural process promoted additional ecological benefits, especially...regarding wading bird and infaunal habitat, thus adhering to Engineering With Nature (EWN) processes. The multifactor approach improved the wetland

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hettiarachchi, Ganga M.; Donner, Erica; Doelsch, Emmanuel

To understand the biogeochemistry of nutrients and contaminants in environmental media, their speciation and behavior under different conditions and at multiple scales must be determined. Synchrotron radiation-based X-ray techniques allow scientists to elucidate the underlying mechanisms responsible for nutrient and contaminant mobility, bioavailability, and behavior. The continuous improvement of synchrotron light sources and X-ray beamlines around the world has led to a profound transformation in the field of environmental biogeochemistry and, subsequently, to significant scientific breakthroughs. Following this introductory paper, this special collection includes 10 papers that either present targeted reviews of recent advancements in spectroscopic methods that are applicable to environmental biogeochemistry or describe original research studies conducted on complex environmental samples that have been significantly enhanced by incorporating synchrotron radiation-based X-ray technique(s). We believe that the current focus on improving the speciation of ultra-dilute elements in environmental media through the ongoing optimization of synchrotron technologies (e.g., brighter light sources, improved monochromators, more efficient detectors) will help to significantly push back the frontiers of environmental biogeochemistry research. As many of the relevant techniques produce extremely large datasets, we also identify ongoing improvements in data processing and analysis (e.g., software improvements and harmonization of analytical methods) as a significant requirement for environmental biogeochemists to maximize the information that can be gained using these powerful tools.

  19. Application of Fuzzy TOPSIS for evaluating machining techniques using sustainability metrics

    NASA Astrophysics Data System (ADS)

    Digalwar, Abhijeet K.

    2018-04-01

Sustainable processes and techniques have received increased attention over the last few decades due to rising concerns over the environment, an improved focus on productivity, and increasing stringency of environmental and occupational health and safety norms. The present work analyzes the research on sustainable machining techniques and identifies the techniques and parameters on which the sustainability of a process is evaluated. Based on this analysis, these parameters are adopted as criteria to evaluate different sustainable machining techniques, such as Cryogenic Machining, Dry Machining, Minimum Quantity Lubrication (MQL) and High Pressure Jet Assisted Machining (HPJAM), using a fuzzy TOPSIS framework. In order to facilitate easy arithmetic, the linguistic variables represented by fuzzy numbers are transformed into crisp numbers based on graded mean representation. Cryogenic machining was found to be the best alternative sustainable technique under the fuzzy TOPSIS framework adopted. The paper provides a method to deal with multi-criteria decision-making problems in a complex and linguistic environment.
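The two numerical steps mentioned, defuzzifying triangular fuzzy scores via the graded mean representation, (a + 4b + c)/6 for a triangular number (a, b, c), and ranking alternatives by TOPSIS closeness to the ideal solution, can be sketched as follows. The criteria, weights, and fuzzy scores below are invented for illustration and are not the paper's data:

```python
import numpy as np

def graded_mean(tfn):
    """Graded mean representation of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + 4 * b + c) / 6.0

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.
    matrix: alternatives x criteria (crisp scores); benefit[j] marks
    whether criterion j is to be maximized."""
    m = matrix / np.linalg.norm(matrix, axis=0)        # vector-normalize
    v = m * weights                                     # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                      # closeness in [0, 1]

# Two criteria (tool life: benefit; energy use: cost) for three
# hypothetical machining techniques, scored as triangular fuzzy numbers.
scores = np.array([
    [graded_mean((7, 8, 9)), graded_mean((2, 3, 4))],   # cryogenic
    [graded_mean((4, 5, 6)), graded_mean((1, 2, 3))],   # dry
    [graded_mean((5, 6, 7)), graded_mean((4, 5, 6))],   # MQL
])
closeness = topsis(scores, weights=np.array([0.6, 0.4]),
                   benefit=np.array([True, False]))
best = int(np.argmax(closeness))
```

In this made-up example the cryogenic alternative ranks first, which happens to mirror the paper's conclusion; with real data the ranking follows entirely from the elicited fuzzy scores and weights.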

  20. Numerical Simulation of Non-Thermal Food Preservation

    NASA Astrophysics Data System (ADS)

Rauh, C.; Krauss, J.; Ertunc, Ö.; Delgado, A.

    2010-09-01

Food preservation is an important process step in food technology with regard to product safety and product quality. Novel preservation techniques are currently being developed that aim at improved sensory and nutritional value with safety comparable to conventional thermal preservation techniques. These novel non-thermal food preservation techniques are based, for example, on high pressures of up to one GPa or on pulsed electric fields. Literature studies show the high potential of high-pressure (HP) and pulsed electric field (PEF) processing, owing to their high retention of valuable food components such as vitamins and flavour and their selective inactivation of spoilage enzymes and microorganisms. For the design of preservation processes based on these non-thermal techniques, it is crucial to predict the effect of high pressure and pulsed electric fields on the food components and on the spoilage enzymes and microorganisms locally and time-dependently in the treated product. Homogeneous process conditions (especially of the temperature fields in HP and PEF processing and of the electric fields in PEF) are sought in order to avoid over-processing, with its associated quality loss, and to minimize safety risks due to under-processing. The present contribution presents numerical simulations of thermofluiddynamic phenomena inside high-pressure autoclaves and pulsed electric field treatment chambers; in PEF processing the electric fields are additionally considered. Implementing the kinetics of the (bio-)chemical reactions involved in the numerical simulations of the temperature, flow and electric fields enables the evaluation of process homogeneity and efficiency for different process parameters of the preservation techniques. Suggestions for achieving safe and high-quality products are drawn from the numerical results.
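At its simplest, assessing temperature-field homogeneity in a treatment chamber reduces to a conduction problem. A minimal 1D explicit finite-difference sketch follows; the geometry, water-like diffusivity, and boundary temperatures are illustrative and far simpler than a real autoclave model:

```python
import numpy as np

def heat_1d(T0, T_wall, alpha, dx, dt, steps):
    """Explicit FTCS scheme for 1D heat conduction with fixed wall
    temperatures; stable for r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable"
    T = T0.astype(float).copy()
    for _ in range(steps):
        T[0] = T[-1] = T_wall                         # Dirichlet walls
        T[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T

# Product initially at 20 °C between walls held at 40 °C;
# 50 mm domain, water-like thermal diffusivity 1.4e-7 m²/s.
T = heat_1d(np.full(51, 20.0), T_wall=40.0, alpha=1.4e-7,
            dx=1e-3, dt=3.0, steps=2000)
spread = T.max() - T.min()   # residual inhomogeneity after treatment
```

The residual spread between the wall and the slowest-heating center point is exactly the kind of inhomogeneity measure used to judge whether parts of the product are over- or under-processed.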

  1. Combining active learning and semi-supervised learning techniques to extract protein interaction sentences.

    PubMed

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2011-11-24

Protein-protein interaction (PPI) extraction has been a focal point of much biomedical research and of database curation tools. Both active learning (AL) and semi-supervised SVMs (SSL) have recently been applied to extract PPIs automatically. In this paper, we explore combining AL with SSL to improve performance on the PPI task. We propose a novel PPI extraction technique called PPISpotter that combines deterministic-annealing-based SSL with an AL technique to extract protein-protein interactions. In addition, we extract a comprehensive set of features from MEDLINE records using natural language processing (NLP) techniques, which further improves the SVM classifiers. In our feature selection technique, syntactic, semantic, and lexical properties of the text are incorporated, boosting system performance significantly. Through experiments with three different PPI corpora, we show that PPISpotter is superior in precision, recall, and F-measure to other techniques incorporated into semi-supervised SVMs, such as random sampling, clustering, and transductive SVMs. Our system is a novel, state-of-the-art technique for efficiently extracting protein-protein interaction pairs.
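The deterministic-annealing SSL component is beyond a short sketch, but the active-learning side, repeatedly querying the unlabeled example the current model is least certain about, can be illustrated with a margin-based loop. A nearest-centroid scorer stands in for the paper's SVM, and the 2D points stand in for sentence feature vectors; everything here is illustrative:

```python
import numpy as np

def centroid_scores(X, c_pos, c_neg):
    """Signed score: positive if a point is closer to the positive centroid."""
    return np.linalg.norm(X - c_neg, axis=1) - np.linalg.norm(X - c_pos, axis=1)

def active_loop(X, oracle, seed_pos, seed_neg, budget):
    """Margin-based active learning with a nearest-centroid model:
    repeatedly label the most uncertain point and update the centroids."""
    pos, neg = [X[seed_pos]], [X[seed_neg]]
    labeled = {seed_pos, seed_neg}
    for _ in range(budget):
        s = centroid_scores(X, np.mean(pos, axis=0), np.mean(neg, axis=0))
        s[list(labeled)] = np.inf           # never re-query labeled points
        i = int(np.argmin(np.abs(s)))       # smallest margin = most uncertain
        (pos if oracle[i] else neg).append(X[i])
        labeled.add(i)
    return np.mean(pos, axis=0), np.mean(neg, axis=0)

# Two well-separated "classes" plus one ambiguous point in between.
X = np.array([[0.0, 0.0], [0.2, 0.1], [-0.1, 0.3],
              [3.0, 3.0], [3.1, 2.8], [2.7, 3.2], [1.4, 1.5]])
oracle = np.array([False, False, False, True, True, True, True])
c_pos, c_neg = active_loop(X, oracle, seed_pos=3, seed_neg=0, budget=3)
preds = centroid_scores(X, c_pos, c_neg) > 0
```

Note how the first query lands on the ambiguous mid-point: that is the behavior that makes AL label-efficient, since the annotation budget is spent where the model is least sure.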

  2. Image processing and recognition for biological images

    PubMed Central

    Uchida, Seiichi

    2013-01-01

This paper reviews image processing and pattern recognition techniques that are useful for analyzing bioimages. Although the paper does not provide their technical details, it makes it possible to grasp their main tasks and the typical tools used to handle them. Image processing is a large research area concerned with improving the visibility of an input image and acquiring valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition is the technique of classifying an input image into one of several predefined classes, and it too is a large research area. This paper overviews its two main modules: the feature extraction module and the classification module. Throughout the paper, it is emphasized that bioimages are a very difficult target even for state-of-the-art image processing and pattern recognition techniques, due to noise, deformations, etc. This paper is intended as a tutorial guide to bridge biology and image processing researchers for further collaboration in tackling such a difficult target. PMID:23560739
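Of the image processing tasks listed, binarization is the easiest to make concrete. Otsu's method, a standard choice rather than one specific to this review, picks the gray-level threshold that maximizes the between-class variance of the resulting fore/background split (the synthetic bimodal "cell" image is illustrative):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the gray level that maximizes the
    between-class variance of the two resulting pixel classes."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability
    mu = np.cumsum(p * np.arange(256))      # class-0 mean times omega
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)        # empty classes score zero
    return int(np.argmax(sigma_b))

# Synthetic bimodal image: dark background, one bright "nucleus".
rng = np.random.default_rng(2)
img = rng.normal(60, 10, (128, 128))
img[32:96, 32:96] = rng.normal(180, 10, (64, 64))
img = np.clip(img, 0, 255).astype(np.uint8)
t = otsu_threshold(img)
mask = img > t
```

The threshold lands between the two intensity modes, so the mask recovers the bright region; as the review stresses, real bioimages are far less cleanly bimodal than this toy example.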

  3. Process characteristics of the combination of laser beam- and gas metal arc welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalla, G.; Neuenhahn, J.; Koerber, C.

    1994-12-31

In this presentation, experiences with the combination of laser beam and gas metal arc welding are discussed. Combining both techniques offers the possibility of using their specific advantages, which for the laser beam include the deep penetration effect and the concentrated heat input. Additionally, the gas metal arc welding (GMAW) process is characterized by several advantages, such as high thermal efficiency and good gap-bridging ability. Beyond these characteristics, the combination leads to additional advantages concerning process, technique, and quality; improvements in seam quality and properties are of special note. Adaptation of the GMAW parameters reduces the hardness of the weld seam at increasing welding speed. This is possible by adapting the efficiency of metal deposition and by a suitable choice of wire material composition. Another advantage is an improvement in surface topology: the surface of the weld seam and the connection to the base material are very smooth, which leads to advantages with regard to the fatigue strength of the seam.

  4. School Improvement Goal Setting: A Collaborative Model.

    ERIC Educational Resources Information Center

    Snyder, Karolyn J.; And Others

    1983-01-01

    Describes the successful use of the Delphi Dialog Technique (a goal-setting process) at East High School, Anchorage, Alaska, where it was used to obtain consensus among staff members about school-growth targets. (JW)

  5. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1983-01-01

    Tracking and ground-based navigation techniques are discussed in relation to DSN advanced systems. Network data processing and productivity are studied to improve management planning methods. Project activities for upgrading DSN facilities are presented.

  6. Cognitive task analysis of network analysts and managers for network situational awareness

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn

    2010-01-01

    The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.

  7. Development of high-efficiency solar cells on silicon web

    NASA Technical Reports Server (NTRS)

    Meier, D. L.

    1986-01-01

Achievement of higher-efficiency cells by identifying carrier loss mechanisms, designing cell structures, and developing processing techniques is described. Techniques such as deep-level transient spectroscopy (DLTS), laser-beam-induced current (LBIC), and transmission electron microscopy (TEM) indicated that dislocations in web material, rather than twin planes, were primarily responsible for limiting diffusion lengths in the web. Lifetimes and cell efficiencies were improved from 19 to 120 microns and from 8 to 10.3% (no AR), respectively, by implanting hydrogen at 1500 eV and a beam current density of 2.0 mA/sq cm. Processing improvements included the use of a double-layer AR coating (ZnS and MgF2) and the addition of an aluminum back-surface reflector. Cells of more than 16% efficiency were achieved.

  8. Drug loading into beta-cyclodextrin granules using a supercritical fluid process for improved drug dissolution.

    PubMed

    Hussein, Khaled; Türk, Michael; Wahl, Martin A

    2008-03-03

To improve the dissolution properties of drugs, a supercritical fluid (SCF) technique was used to load drugs into a solid carrier. In this study, granules based on beta-cyclodextrin (betaCD) were applied as a carrier for a poorly water-soluble model drug (ibuprofen), loaded using two different procedures: controlled particle deposition (CPD), an SCF process, and solution immersion (SI), a conventional method used for comparison. Using the CPD technique, 17.42+/-2.06wt.% (n=3) ibuprofen was loaded into the betaCD-granules, in contrast to only 3.8+/-0.15wt.% (n=3) in the SI product. The drug loading was also confirmed by the reduction of the BET surface area of the CPD product (1.134+/-0.07m(2)/g) compared with the unloaded granules (1.533+/-0.031m(2)/g); such a reduction was not seen in the SI product (1.407+/-0.048m(2)/g). The appearance of an endothermic melting peak at 77 degrees C and X-ray patterns representing ibuprofen in the drug-loaded granules can be attributed to the amount of ibuprofen loaded in its crystalline form. A significant increase in drug dissolution was achieved by either drug-loading procedure compared with unprocessed ibuprofen. In this study, the CPD technique, a supercritical fluid process avoiding the use of toxic or organic solvents, was successfully applied to load a drug into solid carriers, thereby improving the water solubility of the drug.

  9. Welding and joining techniques.

    PubMed

    Chipperfield, F A; Dunkerton, S B

    2001-05-01

There is a welding solution for most applications. As products must meet more stringent requirements or require more flexible processes to aid design or reduce cost, further improvements or totally new processes are likely to be developed. Quality control is also becoming more important in meeting regulations; monitoring and control of welding processes and the standardised testing of joints will meet some, if not all, of these requirements.

  10. Lithography process for patterning HgI2 photonic devices

    DOEpatents

    Mescher, Mark J.; James, Ralph B.; Hermon, Haim

    2004-11-23

A photolithographic process forms patterns on HgI2 surfaces and defines metal sublimation masks and electrodes to substantially improve device performance by increasing the realizable design space. Techniques for smoothing HgI2 surfaces and for producing trenches in HgI2 are provided. A sublimation process is described which produces etched-trench devices with enhanced electron-transport-only behavior.

  11. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center (MSFC) computer code package. This code package includes one- and two-dimensional discrete ordinates transport, point kernel, and single-scatter techniques, as well as cross-section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac 1108 computer system. (3) The MSFC master data libraries were updated.

  12. Porous silicon carbide (SIC) semiconductor device

    NASA Technical Reports Server (NTRS)

    Shor, Joseph S. (Inventor); Kurtz, Anthony D. (Inventor)

    1996-01-01

Porous silicon carbide is fabricated according to techniques which result in a significant portion of nanocrystallites within the material in the sub-10-nanometer regime. Techniques are described for passivating porous silicon carbide which result in the fabrication of optoelectronic devices exhibiting brighter blue luminescence and improved qualities. In certain of the techniques described, porous silicon carbide is used as a sacrificial layer for the patterning of silicon carbide; the porous silicon carbide is then removed from the bulk substrate by oxidation and other methods. The techniques employ a two-step process to pattern bulk silicon carbide, in which selected areas of the wafer are made porous and the porous layer is subsequently removed. The process to form porous silicon carbide exhibits dopant selectivity, and a two-step etching procedure is implemented for silicon carbide multilayers.

  13. Developing Process Maps as a Tool for a Surgical Infection Prevention Quality Improvement Initiative in Resource-Constrained Settings.

    PubMed

    Forrester, Jared A; Koritsanszky, Luca A; Amenu, Demisew; Haynes, Alex B; Berry, William R; Alemu, Seifu; Jiru, Fekadu; Weiser, Thomas G

    2018-06-01

Surgical infections cause substantial morbidity and mortality in low- and middle-income countries (LMICs). To improve adherence to critical perioperative infection prevention standards, we developed Clean Cut, a checklist-based quality improvement program promoting compliance with best practices. We hypothesized that process mapping infection prevention activities can help clinicians identify strategies for improving surgical safety. We introduced Clean Cut at a tertiary hospital in Ethiopia. Infection prevention standards included skin antisepsis, ensuring a sterile field, instrument decontamination/sterilization, prophylactic antibiotic administration, routine swab/gauze counting, and use of a surgical safety checklist. Processes were mapped by a visiting surgical fellow and local operating theater staff to facilitate the development of contextually relevant solutions; processes were then reassessed for improvements. Process mapping helped identify barriers to using alcohol-based hand solution due to skin irritation, inconsistent administration of prophylactic antibiotics due to variable delivery outside of the operating theater, inefficiencies in assuring sterility of surgical instruments through lack of confirmatory measures, and occurrences of retained surgical items through inappropriate guidelines, staffing, and training in routine gauze counting. Compliance with most processes improved significantly following organizational changes that aligned tasks with specific process goals. Enumerating the steps involved in surgical infection prevention using a process mapping technique helped identify opportunities for improving adherence and for plotting contextually relevant solutions, resulting in superior compliance with antiseptic standards. Simplifying these process maps into an adaptable tool could be a powerful strategy for improving safe surgery delivery in LMICs. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  14. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    PubMed

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several combinatorial techniques, developed and improved in our work, for optimizing the processing conditions and material properties of organic thin films. The combinatorial approach allows multi-variable dependencies to be investigated and is well suited to studying organic thin films intended for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate how combinations of composition and processing gradients can be applied to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is then carried out in very small areas arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow precise trends to be identified for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we conclusively verify the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning: the reported combinatorial techniques can be utilized and transferred to other multi-variable-dependent processes and used to investigate and optimize thin-film layers and devices for optical, electro-optical, and electronic applications.

  15. Improving lateral resolution and image quality of optical coherence tomography by the multi-frame superresolution technique for 3D tissue imaging

    PubMed Central

    Shen, Kai; Lu, Hui; Baig, Sarfaraz; Wang, Michael R.

    2017-01-01

    The multi-frame superresolution technique is introduced to significantly improve the lateral resolution and image quality of spectral domain optical coherence tomography (SD-OCT). Using several sets of low-resolution C-scan 3D images with lateral sub-spot-spacing shifts between sets, multi-frame superresolution processing of these sets at each depth layer reconstructs a lateral image of higher resolution and quality. Layer-by-layer processing yields an overall high-resolution, high-quality 3D image. In theory, the superresolution processing, including deconvolution, can address the diffraction limit, lateral scan density, and background noise problems together. In experiment, imaging a known resolution test target confirmed an improvement in lateral resolution by a factor of ~3, reaching 7.81 µm and 2.19 µm with sample-arm optics of 0.015 and 0.05 numerical aperture respectively, as well as a doubling of image quality. Improved lateral resolution on in vitro skin C-scan images has been demonstrated. For in vivo 3D SD-OCT imaging of human skin, fingerprint, and retina layers, we used a multi-modal volume registration method to effectively estimate the lateral image shifts among different C-scans caused by random minor unintended body motion. Further processing of these images generated high-lateral-resolution 3D images as well as high-quality B-scan images of these in vivo tissues. PMID:29188089
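The shift-and-add idea at the core of multi-frame superresolution can be sketched in a few lines. This is a toy illustration only, not the authors' reconstruction pipeline (which also involves deconvolution and multi-modal volume registration); the frame data, the known sub-pixel shifts, and the nearest-cell placement are all simplifying assumptions:

```python
import numpy as np

def shift_and_add(frames, shifts, factor):
    """Minimal shift-and-add superresolution: each low-resolution frame is
    placed onto a finer grid according to its known sub-pixel (dy, dx)
    shift, and overlapping samples are averaged."""
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    cnt = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, shifts):
        # Map every low-res sample to its nearest cell on the fine grid.
        yi = np.clip(np.round((np.arange(h) + dy) * factor).astype(int),
                     0, h * factor - 1)[:, None]
        xi = np.clip(np.round((np.arange(w) + dx) * factor).astype(int),
                     0, w * factor - 1)[None, :]
        acc[yi, xi] += frame
        cnt[yi, xi] += 1.0
    cnt[cnt == 0] = 1.0          # never-hit cells stay at zero
    return acc / cnt
```

With two 4x4 frames shifted by half a pixel, the 2x-upsampled output interleaves the samples from both frames on the 8x8 grid.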

  16. Hot Melt Extruded and Injection Moulded Dosage Forms: Recent Research and Patents.

    PubMed

    Major, Ian; McConville, Christopher

    2015-01-01

    Hot Melt Extrusion (HME) and Injection Moulding (IM) are becoming more prevalent in the drug delivery field due to their continuous nature and advantages over current pharmaceutical manufacturing techniques. HME is a process that uses at least one rotating screw to force a thermoplastic material (i.e. polymer, plasticizer, drug, etc.) along a heated barrel and through a die to produce a product of uniform shape, while IM is a forming process where molten polymer is forced at high pressure into a mould set at a lower temperature. HME offers a number of advantages over conventional pharmaceutical manufacturing techniques, such as increased solubility and bioavailability of poorly water-soluble drugs, a solvent-free and continuous process, improved content uniformity, and flexibility in manufacture. IM has been recognised as a rapid and versatile manufacturing technique with the advantage of being a continuous process that is easily scaled up by the use of larger equipment and moulds. However, despite these advantages and the significant number of publications and patents on HME and IM drug delivery devices, there are very few marketed formulations. These marketed products range from oral dosage forms, which improve bioavailability and reduce pill burden, to vaginal rings, which provide long-term controlled release and thus improve patient compliance. The patenting strategy for IM and HME appears focused on patenting the finished product rather than the manufacturing process, probably because the IM and HME processes themselves have already been patented. Interest in the use of HME and IM within the pharmaceutical industry is growing, with a steady increase in the number of HME patents being issued and with more than 10 products, ranging from oral dosage forms to implantable devices, currently on the market. Therefore, this review of HME and IM is important to the scientific community to further understand and advance these novel and exciting manufacturing techniques.

  17. Longitudinal Analysis Technique to Assist School Leaders in Making Critical Curriculum and Instruction Decisions for School Improvement

    ERIC Educational Resources Information Center

    Bigham, Gary D.; Riney, Mark R.

    2017-01-01

    To meet the constantly changing needs of schools and diverse learners, educators must frequently monitor student learning, revise curricula, and improve instruction. Consequently, it is critical that careful analyses of student performance data are ongoing components of curriculum decision-making processes. The primary purpose of this study is to…

  18. The Effect of Training in Visual Composition on Organization in Written Composition in Grade III.

    ERIC Educational Resources Information Center

    Tuttle, Frederick B., Jr.

    The purpose of this investigation was to explore the possibility that one technique for improvement of organization in written composition might be instruction in the organizational process of another medium, such as sequencing photographs meaningfully. Two methods of improving organization in written composition were compared. The first was a…

  19. Fully Solution-Processed Flexible Organic Thin Film Transistor Arrays with High Mobility and Exceptional Uniformity

    PubMed Central

    Fukuda, Kenjiro; Takeda, Yasunori; Mizukami, Makoto; Kumaki, Daisuke; Tokito, Shizuo

    2014-01-01

    Printing fully solution-processed organic electronic devices may potentially revolutionize production of flexible electronics for various applications. However, difficulties in forming thin, flat, uniform films through printing techniques have been responsible for poor device performance and low yields. Here, we report on fully solution-processed organic thin-film transistor (TFT) arrays with greatly improved performance and yields, achieved by layering solution-processable materials such as silver nanoparticle inks, organic semiconductors, and insulating polymers on thin plastic films. A treatment layer improves carrier injection between the source/drain electrodes and the semiconducting layer and dramatically reduces contact resistance. Furthermore, an organic semiconductor with large-crystal grains results in TFT devices with shorter channel lengths and higher field-effect mobilities. We obtained mobilities of over 1.2 cm2 V−1 s−1 in TFT devices with channel lengths shorter than 20 μm. By combining these fabrication techniques, we built highly uniform organic TFT arrays with average mobility levels as high as 0.80 cm2 V−1 s−1 and ideal threshold voltages of 0 V. These results represent major progress in the fabrication of fully solution-processed organic TFT device arrays. PMID:24492785

  20. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    NASA Astrophysics Data System (ADS)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent and mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to simulate flood flow over complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of the full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet/dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell is flooded, register its surrounding cells. The overhead of this additional process is kept small by checking only cells at the wet/dry interface, and computation time is reduced by skipping the non-flooded area. This algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented in a 2-D local inertial model for the Yodo River basin, Japan. Case studies of two flood events show that the simulation finishes two to ten times faster while producing the same results as the simulation without the ADU method.
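The registration rule described in the abstract can be sketched as follows. This is a toy stand-in: a simple diffusion-style spread replaces the 2-D local inertial solver, and the function names, `spread` rate, and wetting threshold are illustrative assumptions; only the domain-updating logic mirrors the method:

```python
import numpy as np

def neighbours(y, x, ny, nx):
    """Four-connected neighbours inside an ny-by-nx grid."""
    return {(y + dy, x + dx) for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= y + dy < ny and 0 <= x + dx < nx}

def adu_step(depth, active, spread=0.2, wet_eps=1e-6):
    """One explicit update restricted to the registered (active) cells.
    Newly flooded cells register themselves and their surroundings, so
    only the wet/dry interface is ever inspected."""
    ny, nx = depth.shape
    new = depth.copy()
    touched = set()
    for y, x in active:
        if depth[y, x] <= wet_eps:        # dry registered cell: nothing to do
            continue
        for yy, xx in neighbours(y, x, ny, nx):
            flux = spread * max(depth[y, x] - depth[yy, xx], 0.0) / 4.0
            new[y, x] -= flux
            new[yy, xx] += flux
            touched.add((yy, xx))
    # ADU rule: when a cell floods, register it and its surrounding cells.
    for y, x in touched:
        if new[y, x] > wet_eps and depth[y, x] <= wet_eps:
            active |= {(y, x)} | neighbours(y, x, ny, nx)
    return new, active
```

Water mass is conserved by construction, and the active set grows outward with the flood front rather than covering the whole grid.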

  1. Cockpit System Situational Awareness Modeling Tool

    NASA Technical Reports Server (NTRS)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  2. Readout circuit with novel background suppression for long wavelength infrared focal plane arrays

    NASA Astrophysics Data System (ADS)

    Xie, L.; Xia, X. J.; Zhou, Y. F.; Wen, Y.; Sun, W. F.; Shi, L. X.

    2011-02-01

    In this article, a novel pixel readout circuit using a switched-capacitor integrator mode background suppression technique is presented for long wavelength infrared focal plane arrays. This circuit can improve dynamic range and signal-to-noise ratio by suppressing the large background current during integration. Compared with other background suppression techniques, the new background suppression technique is less sensitive to the process mismatch and has no additional shot noise. The proposed circuit is theoretically analysed and simulated while taking into account the non-ideal characteristics. The result shows that the background suppression non-uniformity is ultra-low even for a large process mismatch. The background suppression non-uniformity of the proposed circuit can also remain very small with technology scaling.

  3. Additive Manufacturing Techniques for the Reconstruction of 3D Fetal Faces

    PubMed Central

    Citro, Daniela; Padula, Francesco; Motyl, Barbara; Marcolin, Federica; Calì, Michele

    2017-01-01

    This paper deals with additive manufacturing techniques for the creation of 3D fetal face models starting from routine 3D ultrasound data. In particular, two distinct themes are addressed. First, a method for processing and building 3D models based on medical image processing techniques is proposed. Second, preliminary results are presented from a questionnaire distributed to future parents concerning the use of these reconstructions from both an emotional and an affective point of view. In particular, the study focuses on the enhancement of the perception of maternity or paternity and the improvement of the relationship between parents and physicians in cases of fetal malformation, in particular facial diseases or cleft lip. PMID:29410600

  4. A Spatiotemporal Aggregation Query Method Using a Multi-Thread Parallel Technique Based on Regional Division

    NASA Astrophysics Data System (ADS)

    Liao, S.; Chen, L.; Li, J.; Xiong, W.; Wu, Q.

    2015-07-01

    Existing spatiotemporal databases support spatiotemporal aggregation queries over massive moving-object datasets. However, due to the large data volumes and single-threaded processing methods, query speed cannot meet application requirements. Moreover, query efficiency is more sensitive to spatial variation than to temporal variation. In this paper, we propose a spatiotemporal aggregation query method using a multi-thread parallel technique based on regional division and implement it on the server. Concretely, we divide the spatiotemporal domain into several spatiotemporal cubes, compute the spatiotemporal aggregation on all cubes using multi-thread parallel processing, and then integrate the query results. Tests and analysis on real datasets show that this method improves query speed significantly.
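The divide/parallelize/integrate scheme described above can be sketched with the Python standard library. The record format (x, y, t), cube sizes, count aggregate, and round-robin chunking are assumptions for illustration; the paper's server-side implementation is not shown:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def cube_key(record, cell=10.0, t_cell=60.0):
    """Map an (x, y, t) record to its spatiotemporal cube index."""
    x, y, t = record
    return (int(x // cell), int(y // cell), int(t // t_cell))

def aggregate(records, n_threads=4):
    """Divide the records among worker threads, aggregate per cube in
    each thread, then integrate (merge) the partial results."""
    chunks = [records[i::n_threads] for i in range(n_threads)]

    def count_chunk(chunk):
        partial = Counter()
        for rec in chunk:
            partial[cube_key(rec)] += 1
        return partial

    total = Counter()
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        for partial in pool.map(count_chunk, chunks):
            total.update(partial)
    return total
```

A heavier aggregate (sum, average, distinct count) would replace the `Counter` increment; the divide-and-merge structure stays the same.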

  5. A proposed framework on hybrid feature selection techniques for handling high dimensional educational data

    NASA Astrophysics Data System (ADS)

    Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd

    2017-10-01

    Huge amounts of data in educational datasets may cause problems in producing quality data. Recently, data mining approaches have been increasingly used by educational data mining researchers to analyze data patterns. However, many research studies have concentrated on selecting suitable learning algorithms rather than performing a feature selection process. As a result, these datasets suffer from computational complexity and require longer computation time for classification. The main objective of this research is to provide an overview of the feature selection techniques that have been used to identify the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in future studies.
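A hybrid filter-plus-wrapper pipeline of the kind the framework proposes can be sketched as below. The correlation-based filter and the nearest-centroid wrapper score are stand-ins chosen for brevity, not the specific techniques the paper surveys; all function names and parameters are illustrative:

```python
import numpy as np

def filter_rank(X, y, keep):
    """Filter stage: rank features by absolute Pearson correlation with the
    binary class label and keep the top `keep` candidates."""
    Xc = X - X.mean(0)
    yc = y - y.mean()
    denom = np.sqrt((Xc ** 2).sum(0) * (yc ** 2).sum()) + 1e-12
    corr = np.abs(Xc.T @ yc) / denom
    return list(np.argsort(corr)[::-1][:keep])

def centroid_accuracy(X, y, feats):
    """Toy wrapper score: nearest-class-centroid training accuracy."""
    Z = X[:, feats]
    c0, c1 = Z[y == 0].mean(0), Z[y == 1].mean(0)
    pred = np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)
    return (pred == (y == 1)).mean()

def hybrid_select(X, y, keep=5, target=2):
    """Wrapper stage: greedy forward selection over the filtered subset."""
    pool, chosen = filter_rank(X, y, keep), []
    while pool and len(chosen) < target:
        best = max(pool, key=lambda f: centroid_accuracy(X, y, chosen + [f]))
        chosen.append(best)
        pool.remove(best)
    return chosen
```

The filter cheaply discards uninformative features so the expensive wrapper search runs over a small pool, which is the computational argument the abstract makes.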

  6. System design and improvement of an emergency department using Simulation-Based Multi-Objective Optimization

    NASA Astrophysics Data System (ADS)

    Goienetxea Uriarte, A.; Ruiz Zúñiga, E.; Urenda Moris, M.; Ng, A. H. C.

    2015-05-01

    Discrete Event Simulation (DES) is nowadays widely used to support decision makers in system analysis and improvement. However, the use of simulation for improving stochastic logistic processes is not common among healthcare providers. Improving healthcare systems involves finding trade-off optimal solutions that take multiple variables and objectives into consideration. Complementing DES with Multi-Objective Optimization (together, Simulation-based Multi-Objective Optimization, SMO) creates a superior basis for finding these solutions and, in consequence, facilitates the decision-making process. This paper presents how SMO has been applied for system improvement analysis in a Swedish Emergency Department (ED). A significant number of input variables, constraints, and objectives were considered when defining the optimization problem. As a result of the project, the decision makers were provided with a range of optimal solutions which considerably reduce the length of stay and waiting times for ED patients. SMO has proved to be an appropriate technique to support healthcare system design and improvement processes. A key factor in the success of this project has been the involvement and engagement of the stakeholders during the whole process.
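The "range of optimal solutions" an SMO study returns is a Pareto front over the simulated objectives. The dominance filter at its core can be sketched as below; the objective tuples (e.g. length of stay, waiting time, both minimized) are hypothetical stand-ins for simulation outputs:

```python
def pareto_front(solutions):
    """Return the non-dominated solutions, all objectives minimized.
    Each solution is a tuple of objective values produced by one
    simulation run, e.g. (length_of_stay, waiting_time)."""
    front = []
    for s in solutions:
        dominated = any(
            all(o <= v for o, v in zip(other, s)) and other != s
            for other in solutions)
        if not dominated:
            front.append(s)
    return front
```

Decision makers then pick from the front according to their priorities, rather than being handed a single "optimal" configuration.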

  7. An electromagnetic induction method for underground target detection and characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartel, L.C.; Cress, D.H.

    1997-01-01

    An improved capability for subsurface structure detection is needed to support military and nonproliferation requirements for inspection and for surveillance of activities of threatening nations. As part of the DOE/NN-20 program to apply geophysical methods to detect and characterize underground facilities, Sandia National Laboratories (SNL) initiated an electromagnetic induction (EMI) project to evaluate low frequency electromagnetic (EM) techniques for subsurface structure detection. Low frequency, in this case, extended from kilohertz to hundreds of kilohertz. An EMI survey procedure had already been developed for borehole imaging of coal seams and had successfully been applied in a surface mode to detect a drug smuggling tunnel. The SNL project has focused on building upon the success of that procedure and applying it to surface and low altitude airborne platforms. Part of SNL's work has focused on improving that technology through improved hardware and data processing. The improved hardware development has been performed utilizing Laboratory Directed Research and Development (LDRD) funding. In addition, SNL's effort focused on: (1) improvements in modeling of the basic geophysics of the illuminating electromagnetic field and its coupling to the underground target (partially funded using LDRD funds) and (2) development of techniques for phase-based, multi-frequency, and spatial processing to support subsurface target detection and characterization. The products of this project are: (1) an evaluation of an improved EM gradiometer, (2) an improved gradiometer concept for possible future development, (3) an improved modeling capability, (4) a demonstration of an EM wave migration method for target recognition, and (5) a demonstration that the technology is capable of detecting targets at depths exceeding 25 meters.

  8. Improved Concrete Cutting and Excavation Capabilities for Crater Repair Phase 2

    DTIC Science & Technology

    2015-05-01

    production rate and ease of execution. The current ADR techniques, tactics, and procedures (TTPs) indicate cutting of pavement around a small crater...demonstrations and evaluations were used to create the techniques, tactics, and procedures (TTPs) manual describing the processes and requirements of...was more difficult when dowels were present. In general, the OUA demonstration validated that the new materials, equipment, and procedures were

  9. TOPICAL REVIEW: Human soft tissue analysis using x-ray or gamma-ray techniques

    NASA Astrophysics Data System (ADS)

    Theodorakou, C.; Farquharson, M. J.

    2008-06-01

    This topical review is intended to describe the x-ray techniques used for human soft tissue analysis. X-ray techniques have been applied to human soft tissue characterization and interesting results have been presented over the last few decades. The motivation behind such studies is to provide improved patient outcome by using the data obtained to better understand a disease process and improve diagnosis. An overview of theoretical background as well as a complete set of references is presented. For each study, a brief summary of the methodology and results is given. The x-ray techniques include x-ray diffraction, x-ray fluorescence, Compton scattering, Compton to coherent scattering ratio and attenuation measurements. The soft tissues that have been classified using x-rays or gamma rays include brain, breast, colon, fat, kidney, liver, lung, muscle, prostate, skin, thyroid and uterus.

  10. Resonant fiber optic gyro based on a sinusoidal wave modulation and square wave demodulation technique.

    PubMed

    Wang, Linglan; Yan, Yuchao; Ma, Huilian; Jin, Zhonghe

    2016-04-20

    New developments are made in the resonant fiber optic gyro (RFOG), which is an optical sensor for the measurement of rotation rate. The digital signal processing system based on the phase modulation technique is capable of detecting the weak frequency difference induced by the Sagnac effect and suppressing the reciprocal noise in the circuit, which determines the detection sensitivity of the RFOG. A new technique based on the sinusoidal wave modulation and square wave demodulation is implemented, and the demodulation curve of the system is simulated and measured. Compared with the past technique using sinusoidal modulation and demodulation, it increases the slope of the demodulation curve by a factor of 1.56, improves the spectrum efficiency of the modulated signal, and reduces the occupancy of the field-programmable gate array resource. On the basis of this new phase modulation technique, the loop is successfully locked and achieves a short-term bias stability of 1.08°/h, which is improved by a factor of 1.47.
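The square-wave demodulation step can be sketched as a digital lock-in: mix the received signal with a ±1 reference at the modulation frequency and low-pass by averaging. This shows only the bare mixing step under assumed sample rates, not the RFOG's FPGA loop or its slope-enhancement scheme:

```python
import numpy as np

def square_demodulate(signal, t, f_mod):
    """Multiply the signal by a +/-1 square-wave reference at f_mod and
    average.  An in-phase sinusoid of amplitude A demodulates to about
    2A/pi; a quadrature component averages to ~0."""
    ref = np.sign(np.sin(2.0 * np.pi * f_mod * t))
    return float(np.mean(signal * ref))
```

Because the square-wave reference is just a sign flip, this demodulator needs no multiplier hardware, which is consistent with the abstract's point about reduced FPGA resource occupancy.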

  11. Supervised detection of exoplanets in high-contrast imaging sequences

    NASA Astrophysics Data System (ADS)

    Gomez Gonzalez, C. A.; Absil, O.; Van Droogenbroeck, M.

    2018-06-01

    Context. Post-processing algorithms play a key role in pushing the detection limits of high-contrast imaging (HCI) instruments. State-of-the-art image processing approaches for HCI enable the production of science-ready images relying on unsupervised learning techniques, such as low-rank approximations, for generating a model point spread function (PSF) and subtracting the residual starlight and speckle noise. Aims: In order to maximize the detection rate of HCI instruments and survey campaigns, advanced algorithms with higher sensitivities to faint companions are needed, especially for the speckle-dominated innermost region of the images. Methods: We propose a reformulation of the exoplanet detection task (for ADI sequences) that builds on well-established machine learning techniques to take HCI post-processing from an unsupervised to a supervised learning context. In this new framework, we present algorithmic solutions using two different discriminative models: SODIRF (random forests) and SODINN (neural networks). We test these algorithms on real ADI datasets from VLT/NACO and VLT/SPHERE HCI instruments. We then assess their performances by injecting fake companions and using receiver operating characteristic analysis. This is done in comparison with state-of-the-art ADI algorithms, such as ADI principal component analysis (ADI-PCA). Results: This study shows the improved sensitivity versus specificity trade-off of the proposed supervised detection approach. At the diffraction limit, SODINN improves the true positive rate by a factor ranging from 2 to 10 (depending on the dataset and angular separation) with respect to ADI-PCA when working at the same false-positive level. Conclusions: The proposed supervised detection framework outperforms state-of-the-art techniques in the task of discriminating planet signal from speckles. 
In addition, it offers the possibility of re-processing existing HCI databases to maximize their scientific return and potentially improve the demographics of directly imaged exoplanets.
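The ROC-style assessment described above (injecting fake companions and comparing true-positive rates at a fixed false-positive level) can be sketched as follows. The score arrays are synthetic placeholders, not outputs of SODINN/SODIRF:

```python
import numpy as np

def tpr_at_fpr(scores_pos, scores_neg, fpr_target=0.01):
    """Set the detection threshold so that the false-positive rate on
    background (speckle) scores equals fpr_target, then return the
    true-positive rate on injected-companion scores."""
    thr = np.quantile(scores_neg, 1.0 - fpr_target)
    return float((np.asarray(scores_pos) > thr).mean())
```

Comparing two detectors at the same false-positive level, as done here against ADI-PCA, is what makes the reported factor-of-2-to-10 true-positive-rate gains meaningful.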

  12. New aspects in the study of the D particle in the FOCUS experiment at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinones Gonzalez, Jose A.; /Puerto Rico U., Mayaguez

    The purpose of this work is to improve the reconstruction techniques for decays of particles containing charm quarks, using information from the Target Silicon Detector of experiment E831 (FOCUS). The experiment ran from 1997 to 1998 at Fermi National Accelerator Laboratory, and its objective was to improve the understanding of particles that contain charm. Adding the Target Silicon Detector information to the reconstruction of the primary vertex reduces the position error. This reduction improves the mass signal and the knowledge of charm particle properties, and it opens the possibility that other analyses will use the techniques developed in this work.

  13. A quality improvement management model for renal care.

    PubMed

    Vlchek, D L; Day, L M

    1991-04-01

    The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.
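A basic Shewhart individuals chart, one of the statistical quality-improvement tools referred to here, can be sketched as follows. The moving-range sigma estimate uses the standard d2 = 1.128 constant for subgroups of two; the sample data in the usage example are hypothetical:

```python
import numpy as np

def individuals_chart(x):
    """Shewhart individuals control chart: centre line at the mean,
    control limits at +/- 3 sigma, with sigma estimated from the average
    moving range (MRbar / 1.128).  Returns the limits and the indices of
    out-of-control points."""
    x = np.asarray(x, float)
    mr_bar = np.abs(np.diff(x)).mean()     # average moving range
    sigma = mr_bar / 1.128                 # d2 constant for n = 2
    centre = x.mean()
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
    out = np.where((x > ucl) | (x < lcl))[0]
    return centre, lcl, ucl, out
```

Plotting successive measurements (e.g. treatment times or lab values) against these limits distinguishes common-cause variation from special causes worth investigating, which is the core of the Deming approach cited above.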

  14. Improving the medical records department processes by lean management.

    PubMed

    Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine

    2015-01-01

    Lean management is a process improvement technique used to identify wasteful actions and processes and to eliminate them. The benefits of Lean for healthcare organizations are, first, that the quality of outcomes improves in terms of mistakes and errors, and second, that the time taken by the whole process significantly improves. The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran by utilizing Lean management. This research was an applied, interventional study. The data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. The MRD staff were initially taught the concepts of Lean management and then formed into an MRD Lean team. The team identified and reviewed the current processes; subsequently, they identified wastes and values and proposed solutions. The findings showed that in the MRD units (Archive, Coding, Statistics, and Admission) 17 current processes, 28 wastes, and 11 values were identified. In addition, the team offered 27 suggestions for eliminating the wastes. The MRD is a critical department for the hospital information system; therefore, the continuous improvement of its services and processes, through scientific methods such as Lean management, is essential. This study represents one of the few attempts to eliminate wastes in the MRD.

  15. Initial Skill Acquisition of Handrim Wheelchair Propulsion: A New Perspective.

    PubMed

    Vegter, Riemer J K; de Groot, Sonja; Lamoth, Claudine J; Veeger, Dirkjan Hej; van der Woude, Lucas H V

    2014-01-01

    To gain insight into cyclic motor learning processes, hand rim wheelchair propulsion is a suitable cyclic task, to be learned during early rehabilitation and novel to almost every individual. To propel in an energy efficient manner, wheelchair users must learn to control bimanually applied forces onto the rims, preserving both speed and direction of locomotion. The purpose of this study was to evaluate mechanical efficiency and propulsion technique during the initial stage of motor learning. Therefore, 70 naive able-bodied men received 12-min uninstructed wheelchair practice, consisting of three 4-min blocks separated by 2 min rest. Practice was performed on a motor-driven treadmill at a fixed belt speed and constant power output relative to body mass. Energy consumption and the kinetics of propulsion technique were continuously measured. Participants significantly increased their mechanical efficiency and changed their propulsion technique from a high frequency mode with a lot of negative work to a longer-slower movement pattern with less power losses. Furthermore a multi-level model showed propulsion technique to relate to mechanical efficiency. Finally improvers and non-improvers were identified. The non-improving group was already more efficient and had a better propulsion technique in the first block of practice (i.e., the fourth minute). These findings link propulsion technique to mechanical efficiency, support the importance of a correct propulsion technique for wheelchair users and show motor learning differences.

  16. A combined Bodian-Nissl stain for improved network analysis in neuronal cell culture.

    PubMed

    Hightower, M; Gross, G W

    1985-11-01

    Bodian and Nissl procedures were combined to stain dissociated mouse spinal cord cells cultured on coverslips. The Bodian technique stains fine neuronal processes in great detail as well as an intracellular fibrillar network concentrated around the nucleus and in proximal neurites. The Nissl stain clearly delimits neuronal cytoplasm in somata and in large dendrites. A combination of these techniques allows the simultaneous depiction of neuronal perikarya and all afferent and efferent processes. Costaining with little background staining by either procedure suggests high specificity for neurons. This procedure could be exploited for routine network analysis of cultured neurons.

  17. Burst design and signal processing for the speed of sound measurement of fluids with the pulse-echo technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubberke, Frithjof H.; Baumhögger, Elmar; Vrabec, Jadran, E-mail: jadran.vrabec@upb.de

    2015-05-15

    The pulse-echo technique determines the propagation time of acoustic wave bursts in a fluid over a known propagation distance. It is limited by the signal quality of the received echoes of the acoustic wave bursts, which degrades with decreasing density of the fluid due to acoustic impedance and attenuation effects. Signal sampling is significantly improved in this work by burst design and signal processing such that a wider range of thermodynamic states can be investigated. Applying a Fourier transformation based digital filter to the acoustic wave signals increases their signal-to-noise ratio and enhances their time and amplitude resolutions, improving the overall measurement accuracy. In addition, burst design leads to technical advantages for determining the propagation time due to the associated conditioning of the echo. It is shown that the corresponding operating procedure enlarges the measuring range of the pulse-echo technique for supercritical argon and nitrogen at 300 K down to 5 MPa, where it was limited to around 20 MPa before.
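A Fourier-transform based digital filter of the kind mentioned can be sketched as a band-pass in the frequency domain: transform, zero everything outside the burst's band, and transform back. This is a generic illustration under assumed pass-band and signal parameters, not the authors' implementation:

```python
import numpy as np

def fft_bandpass(signal, fs, f_lo, f_hi):
    """Fourier-transform based digital filter: zero all spectral
    components outside [f_lo, f_hi] to raise the SNR of a burst whose
    energy lies inside that band."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))
```

Out-of-band noise is removed exactly, so the echo's arrival time can then be located on a cleaner waveform.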

  18. Enhancement of signal-to-noise ratio in Brillouin optical time domain analyzers by dual-probe detection

    NASA Astrophysics Data System (ADS)

    Iribas, Haritz; Loayssa, Alayn; Sauser, Florian; Llera, Miguel; Le Floch, Sébastien

    2017-04-01

    We demonstrate a simple technique to enhance the signal-to-noise ratio (SNR) in Brillouin optical time-domain analysis (BOTDA) sensors by the addition of gain and loss processes. The technique is based on shifting the pump pulse optical frequency in a double-sideband probe system, so that the gain and loss processes take place at different frequencies. In this manner, the loss and the gain do not cancel each other out, making it possible to exploit both measurements at the same time and yielding a 3 dB improvement in SNR. Furthermore, the technique requires no optical filtering, so a larger SNR improvement and a simpler setup are obtained. The method is experimentally demonstrated on a 101 km fiber spool, obtaining a measurement uncertainty of 2.6 MHz (2σ) at the worst-contrast position for 2 m spatial resolution. This leads, to the best of our knowledge, to the highest figure of merit in a BOTDA sensor without using coding or Raman amplification.

  19. Improving a Dental School's Clinic Operations Using Lean Process Improvement.

    PubMed

    Robinson, Fonda G; Cunningham, Larry L; Turner, Sharon P; Lindroth, John; Ray, Deborah; Khan, Talib; Yates, Audrey

    2016-10-01

    The term "lean production," also known as "Lean," describes a process of operations management pioneered at the Toyota Motor Company that contributed significantly to the success of the company. Although developed by Toyota, the Lean process has been implemented at many other organizations, including those in health care, and should be considered by dental schools in evaluating their clinical operations. Lean combines engineering principles with operations management and improvement tools to optimize business and operating processes. One of the core concepts is relentless elimination of waste (non-value-added components of a process). Another key concept is utilization of individuals closest to the actual work to analyze and improve the process. When the medical center of the University of Kentucky adopted the Lean process for improving clinical operations, members of the College of Dentistry trained in the process applied the techniques to improve inefficient operations at the Walk-In Dental Clinic. The purpose of this project was to reduce patients' average in-the-door-to-out-the-door time from over four hours to three hours within 90 days. Achievement of this goal was realized by streamlining patient flow and strategically relocating key phases of the process. This initiative resulted in patient benefits such as shortening average in-the-door-to-out-the-door time by over an hour, improving satisfaction by 21%, and reducing negative comments by 24%, as well as providing opportunity to implement the electronic health record, improving teamwork, and enhancing educational experiences for students. These benefits were achieved while maintaining high-quality patient care with zero adverse outcomes during and two years following the process improvement project.

  20. Graphics Processing Unit-Accelerated Nonrigid Registration of MR Images to CT Images During CT-Guided Percutaneous Liver Tumor Ablations.

    PubMed

    Tokuda, Junichi; Plishker, William; Torabi, Meysam; Olubiyi, Olutayo I; Zaki, George; Tatli, Servet; Silverman, Stuart G; Shekher, Raj; Hata, Nobuhiko

    2015-06-01

    Accuracy and speed are essential for the intraprocedural nonrigid magnetic resonance (MR) to computed tomography (CT) image registration in the assessment of tumor margins during CT-guided liver tumor ablations. Although both accuracy and speed can be improved by limiting the registration to a region of interest (ROI), manual contouring of the ROI prolongs the registration process substantially. To achieve accurate and fast registration without the use of an ROI, we combined a nonrigid registration technique on the basis of volume subdivision with hardware acceleration using a graphics processing unit (GPU). We compared the registration accuracy and processing time of GPU-accelerated volume subdivision-based nonrigid registration technique to the conventional nonrigid B-spline registration technique. Fourteen image data sets of preprocedural MR and intraprocedural CT images for percutaneous CT-guided liver tumor ablations were obtained. Each set of images was registered using the GPU-accelerated volume subdivision technique and the B-spline technique. Manual contouring of ROI was used only for the B-spline technique. Registration accuracies (Dice similarity coefficient [DSC] and 95% Hausdorff distance [HD]) and total processing time including contouring of ROIs and computation were compared using a paired Student t test. Accuracies of the GPU-accelerated registrations and B-spline registrations, respectively, were 88.3 ± 3.7% versus 89.3 ± 4.9% (P = .41) for DSC and 13.1 ± 5.2 versus 11.4 ± 6.3 mm (P = .15) for HD. Total processing time of the GPU-accelerated registration and B-spline registration techniques was 88 ± 14 versus 557 ± 116 seconds (P < .000000002), respectively; there was no significant difference in computation time despite the difference in the complexity of the algorithms (P = .71). The GPU-accelerated volume subdivision technique was as accurate as the B-spline technique and required significantly less processing time. 
The GPU-accelerated volume subdivision technique may enable the implementation of nonrigid registration into routine clinical practice. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
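    The two accuracy metrics reported above, the Dice similarity coefficient and the 95% Hausdorff distance, can be computed directly. The sketch below assumes binary segmentation masks and surface point sets as inputs; it is an illustrative implementation, not the authors' evaluation code:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two boolean masks:
    2 * |A ∩ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hd95(pts_a, pts_b):
    """95th-percentile symmetric Hausdorff distance between two point
    sets (N x D arrays of surface coordinates)."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    fwd = d.min(axis=1)     # each point in A to its nearest point in B
    bwd = d.min(axis=0)     # and vice versa
    return max(np.percentile(fwd, 95), np.percentile(bwd, 95))

# Two shifted 4x4 squares overlap in a 3x4 region.
a = np.zeros((8, 8), bool); a[2:6, 2:6] = True
b = np.zeros((8, 8), bool); b[3:7, 2:6] = True
print(dice(a, b))  # 0.75
```

    The 95th percentile (rather than the maximum) makes the Hausdorff distance robust to a few outlier surface points, which is why it is the common choice for registration accuracy.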

  1. Process-driven selection of information systems for healthcare

    NASA Astrophysics Data System (ADS)

    Mills, Stephen F.; Yeh, Raymond T.; Giroir, Brett P.; Tanik, Murat M.

    1995-05-01

    Integration of networking and data management technologies such as PACS, RIS and HIS into a healthcare enterprise in a clinically acceptable manner is a difficult problem. Data within such a facility are generally managed via a combination of manual hardcopy systems and proprietary, special-purpose data processing systems. Process modeling techniques have been successfully applied to engineering and manufacturing enterprises, but have not generally been applied to service-based enterprises such as healthcare facilities. The use of process modeling techniques can provide guidance for the placement, configuration and usage of PACS and other informatics technologies within the healthcare enterprise, and thus improve the quality of healthcare. Initial process modeling activities conducted within the Pediatric ICU at Children's Medical Center in Dallas, Texas are described. The ongoing development of a full enterprise-level model for the Pediatric ICU is also described.

  2. Arc-welding quality assurance by means of embedded fiber sensor and spectral processing combining feature selection and neural networks

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; García-Allende, P. B.; Cobo, A.; Conde, O.; López-Higuera, J. M.

    2007-07-01

    A new spectral processing technique designed for the on-line detection and classification of arc-welding defects is presented in this paper. A non-invasive fiber sensor embedded within a TIG torch collects the plasma radiation originated during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which is demonstrated to provide efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in a previous paper, showing an improvement in the performance of the monitoring system.
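    The two-stage scheme, band selection followed by classification, can be caricatured as follows. The Fisher-style band score and nearest-centroid rule below are deliberately simplified stand-ins for the paper's compression algorithm and neural network, and the spectra are synthetic:

```python
import numpy as np

def select_bands(spectra, labels, k):
    """Rank spectral bands by a simple Fisher-style separability score
    and keep the top k (stand-in for the compression/selection stage)."""
    g0, g1 = spectra[labels == 0], spectra[labels == 1]
    score = (g0.mean(0) - g1.mean(0))**2 / (g0.var(0) + g1.var(0) + 1e-12)
    return np.argsort(score)[-k:]

def nearest_centroid(train, labels, test):
    """Minimal classifier over the selected bands (the paper uses a
    neural network; this rule keeps the sketch self-contained)."""
    c0, c1 = train[labels == 0].mean(0), train[labels == 1].mean(0)
    d0 = np.linalg.norm(test - c0, axis=1)
    d1 = np.linalg.norm(test - c1, axis=1)
    return (d1 < d0).astype(int)

rng = np.random.default_rng(0)
# 200 synthetic "plasma spectra" over 64 bands; class 1 (defect) shifts band 20.
spectra = rng.normal(size=(200, 64))
labels = (rng.random(200) < 0.5).astype(int)
spectra[labels == 1, 20] += 3.0
bands = select_bands(spectra, labels, 3)
pred = nearest_centroid(spectra[:, bands], labels, spectra[:, bands])
print(bands, (pred == labels).mean())  # band 20 is selected; accuracy is high
```

    The design point is the same as in the paper: by feeding the classifier only a few informative bands, both the training cost and the per-spectrum inference time drop enough for on-line use.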

  3. Optimisation of shape kernel and threshold in image-processing motion analysers.

    PubMed

    Pedrocchi, A; Baroni, G; Sada, S; Marcon, E; Pedotti, A; Ferrigno, G

    2001-09-01

    The aim of the work is to optimise the image processing of a motion analyser. This is to improve accuracy, which is crucial for neurophysiological and rehabilitation applications. A new motion analyser, ELITE-S2, for installation on the International Space Station is described, with the focus on image processing. Important improvements are expected in the hardware of ELITE-S2 compared with ELITE and previous versions (ELITE-S and Kinelite). The core algorithm for marker recognition was based on the current ELITE version, using the cross-correlation technique. This technique was based on the matching of the expected marker shape, the so-called kernel, with image features. Optimisation of the kernel parameters was achieved using a genetic algorithm, taking into account noise rejection and accuracy. Optimisation was achieved by performing tests on six highly precise grids (with marker diameters ranging from 1.5 to 4 mm), representing all allowed marker image sizes, and on a noise image. The results of comparing the optimised kernels and the current ELITE version showed a great improvement in marker recognition accuracy, while noise rejection characteristics were preserved. An average increase in marker co-ordinate accuracy of +22% was achieved, corresponding to a mean accuracy of 0.11 pixel in comparison with 0.14 pixel, measured over all grids. An improvement of +37%, corresponding to an improvement from 0.22 pixel to 0.14 pixel, was observed over the grid with the biggest markers.
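    The kernel-matching step underlying the marker recognition described above is ordinary normalized cross-correlation. A naive sliding-window sketch follows, with a hypothetical disc-shaped kernel rather than the optimized ELITE-S2 kernels:

```python
import numpy as np

def ncc_match(image, kernel):
    """Normalized cross-correlation of a marker kernel over an image;
    the maximum of the returned map marks the best match."""
    kh, kw = kernel.shape
    k = kernel - kernel.mean()
    out = np.full((image.shape[0] - kh + 1, image.shape[1] - kw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i+kh, j:j+kw]
            p = patch - patch.mean()
            denom = np.sqrt((p**2).sum() * (k**2).sum())
            if denom > 0:
                out[i, j] = (p * k).sum() / denom
    return out

# A bright disc "marker" kernel and a test image containing one marker.
yy, xx = np.mgrid[-3:4, -3:4]
kernel = (xx**2 + yy**2 <= 9).astype(float)
img = np.zeros((32, 32))
img[10:17, 20:27] += kernel
score = ncc_match(img, kernel)
peak = np.unravel_index(score.argmax(), score.shape)
print(peak)  # top-left corner of the detected marker: (10, 20)
```

    Sub-pixel accuracy of the kind the paper reports (0.11 pixel) then comes from interpolating the correlation surface around the integer peak, a refinement omitted here.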

  4. Bone surface enhancement in ultrasound images using a new Doppler-based acquisition/processing method.

    PubMed

    Yang, Xu; Tang, Songyuan; Tasciotti, Ennio; Righetti, Raffaella

    2018-01-17

    Ultrasound (US) imaging has long been considered a potential aid in orthopedic surgeries. US technologies are safe, portable and do not use ionizing radiation. This would make them a desirable tool for real-time assessment of fractures and for monitoring fracture healing. However, image quality of US imaging methods in bone applications is limited by speckle, attenuation, shadow, multiple reflections and other imaging artifacts. While bone surfaces typically appear in US images as somewhat 'brighter' than soft tissue, they are often not easily distinguishable from the surrounding tissue. Therefore, US imaging methods aimed at segmenting bone surfaces need enhancement in image contrast prior to segmentation to improve the quality of the detected bone surface. In this paper, we present a novel acquisition/processing technique for bone surface enhancement in US images. Inspired by elastography and Doppler imaging methods, this technique takes advantage of the difference between the mechanical and acoustic properties of bones and those of soft tissues to make the bone surface more easily distinguishable in US images. The objective of this technique is to facilitate US-based bone segmentation methods and improve the accuracy of their outcomes. The newly proposed technique is tested in both in vitro and in vivo experiments. The results of these preliminary experiments suggest that the proposed technique has the potential to significantly enhance the detectability of bone surfaces in noisy ultrasound images.

  5. Multiscale Analysis of Solar Image Data

    NASA Astrophysics Data System (ADS)

    Young, C. A.; Myers, D. C.

    2001-12-01

    It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with a larger volume of more complex data than previous missions. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer; little quantitative and objective analysis is done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods may be well suited to solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by observers' eyes and brains. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
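    As a concrete example of the multiscale methods discussed, a single level of the 2-d Haar wavelet transform can be written in a few lines of NumPy; the solar data are replaced here by a synthetic frame with one bright feature:

```python
import numpy as np

def haar2d(img):
    """One level of a 2-d Haar wavelet decomposition: returns the
    approximation band plus horizontal/vertical/diagonal detail bands
    (image dimensions must be even)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0      # smooth approximation
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0      # horizontal detail
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0      # vertical detail
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0      # diagonal detail
    return ll, lh, hl, hh

img = np.zeros((64, 64))
img[20:27, 30:37] = 1.0                       # a bright compact feature
ll, lh, hl, hh = haar2d(img)
print(ll.shape, abs(lh).max())                # details peak at feature edges
```

    Applying `haar2d` recursively to `ll` yields the full multiresolution pyramid, so a feature can be characterized quantitatively by the scale at which its detail coefficients peak instead of by visual inspection of byte-scaled frames.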

  6. Bone surface enhancement in ultrasound images using a new Doppler-based acquisition/processing method

    NASA Astrophysics Data System (ADS)

    Yang, Xu; Tang, Songyuan; Tasciotti, Ennio; Righetti, Raffaella

    2018-01-01

    Ultrasound (US) imaging has long been considered as a potential aid in orthopedic surgeries. US technologies are safe, portable and do not use radiations. This would make them a desirable tool for real-time assessment of fractures and to monitor fracture healing. However, image quality of US imaging methods in bone applications is limited by speckle, attenuation, shadow, multiple reflections and other imaging artifacts. While bone surfaces typically appear in US images as somewhat ‘brighter’ than soft tissue, they are often not easily distinguishable from the surrounding tissue. Therefore, US imaging methods aimed at segmenting bone surfaces need enhancement in image contrast prior to segmentation to improve the quality of the detected bone surface. In this paper, we present a novel acquisition/processing technique for bone surface enhancement in US images. Inspired by elastography and Doppler imaging methods, this technique takes advantage of the difference between the mechanical and acoustic properties of bones and those of soft tissues to make the bone surface more easily distinguishable in US images. The objective of this technique is to facilitate US-based bone segmentation methods and improve the accuracy of their outcomes. The newly proposed technique is tested both in in vitro and in vivo experiments. The results of these preliminary experiments suggest that the use of the proposed technique has the potential to significantly enhance the detectability of bone surfaces in noisy ultrasound images.

  7. Dysphagia Screening: Contributions of Cervical Auscultation Signals and Modern Signal-Processing Techniques

    PubMed Central

    Dudik, Joshua M.; Coyle, James L.

    2015-01-01

    Cervical auscultation is the recording of sounds and vibrations caused by the human body from the throat during swallowing. While traditionally done by a trained clinician with a stethoscope, much work has been put towards developing more sensitive and clinically useful methods to characterize the data obtained with this technique. The eventual goal of the field is to improve the effectiveness of screening algorithms designed to predict the risk that swallowing disorders pose to individual patients’ health and safety. This paper provides an overview of these signal processing techniques and summarizes recent advances made with digital transducers in hopes of organizing the highly varied research on cervical auscultation. It investigates where on the body these transducers are placed in order to record a signal as well as the collection of analog and digital filtering techniques used to further improve the signal quality. It also presents the wide array of methods and features used to characterize these signals, ranging from simply counting the number of swallows that occur over a period of time to calculating various descriptive features in the time, frequency, and phase space domains. Finally, this paper presents the algorithms that have been used to classify this data into ‘normal’ and ‘abnormal’ categories. Both linear as well as non-linear techniques are presented in this regard. PMID:26213659

  8. Point-based warping with optimized weighting factors of displacement vectors

    NASA Astrophysics Data System (ADS)

    Pielot, Ranier; Scholz, Michael; Obermayer, Klaus; Gundelfinger, Eckart D.; Hess, Andreas

    2000-06-01

    The accurate comparison of inter-individual 3D image brain datasets requires non-affine transformation techniques (warping) to reduce geometric variations. Constrained by the biological prerequisites we use in this study a landmark-based warping method with weighted sums of displacement vectors, which is enhanced by an optimization process. Furthermore, we investigate fast automatic procedures for determining landmarks to improve the practicability of 3D warping. This combined approach was tested on 3D autoradiographs of Gerbil brains. The autoradiographs were obtained after injecting a non-metabolized radioactive glucose derivative into the Gerbil thereby visualizing neuronal activity in the brain. Afterwards the brain was processed with standard autoradiographical methods. The landmark-generator computes corresponding reference points simultaneously within a given number of datasets by Monte Carlo techniques. The warping function is a distance-weighted exponential function with a landmark-specific weighting factor. These weighting factors are optimized by a computational evolution strategy. The warping quality is quantified by several coefficients (correlation coefficient, overlap-index, and registration error). The described approach combines a highly suitable procedure to automatically detect landmarks in autoradiographical brain images and an enhanced point-based warping technique, optimizing the local weighting factors. This optimization process significantly improves the similarity between the warped and the target dataset.
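    The warping function described, a distance-weighted exponential with landmark-specific weighting factors, can be sketched as follows. The decay constant `sigma` and the toy landmarks are hypothetical, and the evolution strategy that optimizes `weights` is omitted:

```python
import numpy as np

def warp_point(p, landmarks, displacements, weights, sigma=10.0):
    """Displace point p by a normalized, distance-weighted sum of the
    landmark displacement vectors; `weights` are the per-landmark
    factors that the study optimizes with an evolution strategy."""
    d = np.linalg.norm(landmarks - p, axis=1)
    w = weights * np.exp(-d / sigma)          # distance-weighted exponential
    w = w / w.sum()
    return p + (w[:, None] * displacements).sum(axis=0)

# Two toy landmarks with unit displacements along x and y respectively.
landmarks = np.array([[0.0, 0.0], [10.0, 0.0]])
displacements = np.array([[1.0, 0.0], [0.0, 1.0]])
weights = np.ones(2)
print(warp_point(np.array([0.0, 0.0]), landmarks, displacements, weights))
```

    A point coinciding with the first landmark is pulled mostly along that landmark's displacement, with a smaller contribution from the distant one; tuning the per-landmark weights changes exactly this trade-off, which is what the optimization exploits.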

  9. Developing a Forensic Approach to Process Improvement: The Relationship between Curriculum and Impact in Frontline Operator Education

    ERIC Educational Resources Information Center

    Croom, Simon; Betts, Alan

    2011-01-01

    The authors present a comparative study of 2 in-company educational programs aimed at developing frontline operator capabilities in forensic methods. They discuss the relationship between the application of various forensic tools and conceptual techniques, the process (i.e., curriculum) for developing employee knowledge and capability, and the…

  10. Overcoming the Glassy-Eyed Nod: An Application of Process-Oriented Guided Inquiry Learning Techniques in Information Technology

    ERIC Educational Resources Information Center

    Myers, Trina; Monypenny, Richard; Trevathan, Jarrod

    2012-01-01

    Two significant problems faced by universities are to ensure sustainability and to produce quality graduates. Four aspects of these problems are to improve engagement, to foster interaction, develop required skills and to effectively gauge the level of attention and comprehension within lectures and large tutorials. Process-Oriented Guided Inquiry…

  11. Post-processing for improving hyperspectral anomaly detection accuracy

    NASA Astrophysics Data System (ADS)

    Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang

    2015-10-01

    Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. Firstly, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground truth information, and the results, based on receiver operating characteristic curves, illustrated that the proposed method reduced the false alarm rates of the RX-based detector.
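    A minimal version of this two-stage pipeline, global RX scoring followed by a morphological dilation of the thresholded anomaly map, can be sketched as below; the cube is synthetic and the threshold is arbitrary:

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly score (squared Mahalanobis distance to the
    scene mean) for an H x W x B hyperspectral cube."""
    h, w, b = cube.shape
    x = cube.reshape(-1, b)
    mu = x.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(x, rowvar=False) + 1e-6 * np.eye(b))
    d = x - mu
    return np.einsum('ij,jk,ik->i', d, cov_inv, d).reshape(h, w)

def dilate(mask):
    """3x3 binary dilation: the morphology step that recruits pixels
    around high-scoring anomalies."""
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

rng = np.random.default_rng(1)
cube = rng.normal(size=(20, 20, 5))
cube[8, 8, :] += 8.0                  # implant one anomalous pixel
mask = rx_scores(cube) > 40.0         # threshold chosen for this toy scene
print(mask.sum(), dilate(mask).sum())
```

    Real implementations typically use a local (sliding-window) background estimate rather than global statistics, but the score-then-morphology structure is the same.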

  12. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  13. Improving LiDAR Data Post-Processing Techniques for Archaeological Site Management and Analysis: A Case Study from Canaveral National Seashore Park

    NASA Astrophysics Data System (ADS)

    Griesbach, Christopher

    Methods used to process raw Light Detection and Ranging (LiDAR) data can sometimes obscure the digital signatures indicative of an archaeological site. This thesis explains the negative effects that certain LiDAR data processing procedures can have on the preservation of an archaeological site. This thesis also presents methods for effectively integrating LiDAR with other forms of mapping data in a Geographic Information Systems (GIS) environment in order to improve LiDAR archaeological signatures by examining several pre-Columbian Native American shell middens located in Canaveral National Seashore Park (CANA).

  14. Somatic Embryogenesis: Still a Relevant Technique in Citrus Improvement.

    PubMed

    Omar, Ahmad A; Dutt, Manjul; Gmitter, Frederick G; Grosser, Jude W

    2016-01-01

    The genus Citrus contains numerous fresh and processed fruit cultivars that are economically important worldwide. New cultivars are needed to battle industry threatening diseases and to create new marketing opportunities. Citrus improvement by conventional methods alone has many limitations that can be overcome by applications of emerging biotechnologies, generally requiring cell to plant regeneration. Many citrus genotypes are amenable to somatic embryogenesis, which became a key regeneration pathway in many experimental approaches to cultivar improvement. This chapter provides a brief history of plant somatic embryogenesis with focus on citrus, followed by a discussion of proven applications in biotechnology-facilitated citrus improvement techniques, such as somatic hybridization, somatic cybridization, genetic transformation, and the exploitation of somaclonal variation. Finally, two important new protocols that feature plant regeneration via somatic embryogenesis are provided: protoplast transformation and Agrobacterium-mediated transformation of embryogenic cell suspension cultures.

  15. Research on the Improved Image Dodging Algorithm Based on Mask Technique

    NASA Astrophysics Data System (ADS)

    Yao, F.; Hu, H.; Wan, Y.

    2012-08-01

    The remote sensing image dodging algorithm based on Mask technique is a good method for removing the uneven lightness within a single image. However, there are some problems with this algorithm, such as how to set an appropriate filter size, for which there is no good solution. In order to solve these problems, an improved algorithm is proposed. In this improved algorithm, the original image is divided into blocks, and then the image blocks with different definitions are smoothed using the low-pass filters with different cut-off frequencies to get the background image; for the image after subtraction, the regions with different lightness are processed using different linear transformation models. The improved algorithm can get a better dodging result than the original one, and can make the contrast of the whole image more consistent.
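    The core of the Mask dodging idea, low-pass filtering to estimate the background lightness and subtracting it, can be sketched as below. A single global re-centering stands in for the block-wise filters and per-region linear transformation models of the improved algorithm, and the image is synthetic:

```python
import numpy as np

def gaussian_lowpass(img, sigma):
    """Separable Gaussian low-pass filter: the background ('Mask') estimate."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    pad = np.pad(img, r, mode='reflect')
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, 'valid'), 0, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, 'valid'), 1, tmp)

def dodge(img, sigma=4.0):
    """Subtract the low-pass background, then re-center on the original
    mean (one global linear model instead of per-region models)."""
    return img - gaussian_lowpass(img, sigma) + img.mean()

# Toy image: uniform texture on top of a strong left-to-right lightness ramp.
rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 100.0, 64)[None, :].repeat(64, axis=0)
img = ramp + rng.normal(scale=1.0, size=(64, 64))
out = dodge(img)
# In the interior, the left-right lightness difference collapses after dodging.
print(img[:, 16:24].mean() - img[:, 40:48].mean(),
      out[:, 16:24].mean() - out[:, 40:48].mean())
```

    The filter size problem the abstract raises is visible even here: near the borders the reflected padding biases the background estimate, which is one motivation for the block-wise, definition-dependent cut-off frequencies of the improved algorithm.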

  16. LANDSAT information for state planning

    NASA Technical Reports Server (NTRS)

    Faust, N. L.; Spann, G. W.

    1977-01-01

    The transfer of remote sensing technology for the digital processing of LANDSAT data to state and local agencies in Georgia and other southeastern states is discussed. The project consists of a series of workshops, seminars, and demonstration efforts, and transfer of NASA-developed hardware concepts and computer software to state agencies. Throughout the multi-year effort, digital processing techniques have been emphasized, particularly classification algorithms. Software for LANDSAT data rectification and processing has been developed and/or transferred. A hardware system is available at EES (engineering experiment station) to allow user-interactive processing of LANDSAT data. Seminars and workshops emphasize the digital approach to LANDSAT data utilization and the system improvements scheduled for LANDSATs C and D. Results of the project indicate a substantially increased awareness of the utility of digital LANDSAT processing techniques among the agencies contacted throughout the southeast. In Georgia, several agencies have jointly funded a program to map the entire state using digitally processed LANDSAT data.

  17. Evaluation of segmentation algorithms for optical coherence tomography images of ovarian tissue

    NASA Astrophysics Data System (ADS)

    Sawyer, Travis W.; Rice, Photini F. S.; Sawyer, David M.; Koevary, Jennifer W.; Barton, Jennifer K.

    2018-02-01

    Ovarian cancer has the lowest survival rate among all gynecologic cancers due to predominantly late diagnosis. Early detection of ovarian cancer can increase 5-year survival rates from 40% up to 92%, yet no reliable early detection techniques exist. Optical coherence tomography (OCT) is an emerging technique that provides depth-resolved, high-resolution images of biological tissue in real time and demonstrates great potential for imaging of ovarian tissue. Mouse models are crucial to quantitatively assess the diagnostic potential of OCT for ovarian cancer imaging; however, due to small organ size, the ovaries must first be separated from the image background using the process of segmentation. Manual segmentation is time-intensive, as OCT yields three-dimensional data. Furthermore, speckle noise complicates OCT images, frustrating many processing techniques. While much work has investigated noise-reduction and automated segmentation for retinal OCT imaging, little has considered the application to the ovaries, which exhibit higher variance and inhomogeneity than the retina. To address these challenges, we evaluated a set of algorithms to segment OCT images of mouse ovaries. We examined five preprocessing techniques and six segmentation algorithms. While all pre-processing methods improve segmentation, Gaussian filtering is most effective, showing an improvement of 32% +/- 1.2%. Of the segmentation algorithms, active contours performs best, segmenting with an accuracy of 0.948 +/- 0.012 compared with manual segmentation (1.0 being identical). Nonetheless, further optimization could lead to maximizing the performance for segmenting OCT images of the ovaries.

  18. Evaluation of a biological wastewater treatment system combining an OSA process with ultrasound for sludge reduction.

    PubMed

    Romero-Pareja, P M; Aragon, C A; Quiroga, J M; Coello, M D

    2017-05-01

    Sludge production is an undesirable by-product of biological wastewater treatment. The oxic-settling-anaerobic (OSA) process constitutes one of the most promising techniques for reducing the sludge produced at the treatment plant without negative consequences for its overall performance. In the present study, the OSA process is applied in combination with ultrasound treatment, a lysis technique, in a lab-scale wastewater treatment plant to assess whether sludge reduction is enhanced as a result of mechanical treatment. Sludge reductions of 45.72% and 78.56% were obtained for the two regimes of combined treatment tested in this study during two respective stages: UO1 and UO2. During the UO1 stage, the general performance and nutrient removal improved, obtaining 47.28% TN removal versus 21.95% in the conventional stage. However, the performance of the system was seriously damaged during the UO2 stage. Increases in dehydrogenase and protease activities were observed during both stages. The advantages of the combined process are not necessarily economic, but operational, as US treatment acts as a contributing factor in the OSA process, inducing mechanisms that lead to sludge reduction and improving performance parameters. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Damage Evaluation Based on a Wave Energy Flow Map Using Multiple PZT Sensors

    PubMed Central

    Liu, Yaolu; Hu, Ning; Xu, Hong; Yuan, Weifeng; Yan, Cheng; Li, Yuan; Goda, Riu; Alamusi; Qiu, Jinhao; Ning, Huiming; Wu, Liangke

    2014-01-01

    A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage as well as to realize visualization of wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map when waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various damages in aluminum and carbon fiber reinforced plastic laminated plates were experimentally and numerically evaluated to validate this technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map. PMID:24463430

  20. Histological Stains: A Literature Review and Case Study

    PubMed Central

    Alturkistani, Hani A; Tashkandi, Faris M; Mohammedsaleh, Zuhair M

    2016-01-01

    The history of histology indicates that there have been significant changes in the techniques used for histological staining through chemical, molecular biology assays and immunological techniques, collectively referred to as histochemistry. Early histologists used readily available chemicals to prepare tissues for microscopic studies; these laboratory chemicals were potassium dichromate, alcohol and mercuric chloride, used to harden cellular tissues. Staining techniques used were carmine, silver nitrate, Giemsa, trichrome stains, Gram stain and hematoxylin, among others. The purpose of this research was to assess past and current literature reviews, as well as case studies, with the aim of informing ways in which histological stains have been improved in the modern age. Results from the literature review indicate that there has been an improvement in histopathology and histotechnology in the stains used. There has been a rising need for efficient, accurate and less complex staining procedures. Many stain procedures are still in use today, and many others have been replaced with new immunostaining, molecular, non-culture and other advanced staining techniques. Some staining methods have been abandoned because the chemicals required have been medically proven to be toxic. The case studies indicated that in modern histology a combination of different staining techniques is used to enhance the effectiveness of the staining process. Currently, histological stains are being modified and combined with other stains to improve their effectiveness. PMID:26493433

  2. Holographic femtosecond laser processing and its application to biological materials (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Hayasaki, Yoshio

    2017-02-01

    Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points is required to fabricate actual millimeter-scale structures, and therefore the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method offers advantages such as high throughput, high light-use efficiency, and variable, instantaneous, three-dimensional patterning. Furthermore, the use of an SLM makes it possible to correct unknown imperfections of the optical system and inhomogeneity in a sample through in-system optimization of the CGH. The CGH can also compensate adaptively for dynamic, unpredictable mechanical movements, air and liquid disturbances, and shape variation and deformation of the target sample, as well as provide adaptive wavefront control for environmental changes. It is therefore a powerful tool for processing biological cells and tissues, because they have free-form, variable, and deformable structures. In this paper, we present the principle and experimental setup of holographic femtosecond laser processing and effective approaches for processing biological samples. We demonstrate femtosecond laser processing of biological materials and its processing properties.

  3. Fallon, Nevada FORGE Seismic Reflection Profiles

    DOE Data Explorer

    Blankenship, Doug; Faulds, James; Queen, John; Fortuna, Mark

    2018-02-01

    Newly reprocessed Naval Air Station Fallon (1994) seismic lines: pre-stack depth migrations, with interpretations to support the Fallon FORGE (Phase 2B) 3D Geologic model. Data along seven profiles (>100 km of total profile length) through and adjacent to the Fallon site were re-processed. The most up-to-date, industry-tested seismic processing techniques were utilized to improve the signal strength and coherency in the sedimentary, volcanic, and Mesozoic crystalline basement sections, in conjunction with fault diffractions in order to improve the identification and definition of faults within the study area.

  4. Optimization of Surfactant Mixtures and Their Interfacial Behavior for Advanced Oil Recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somasundaran, Prof. P.

    2002-03-04

    The objective of this project was to develop a knowledge base that is helpful for the design of improved processes for mobilizing and producing oil left untapped using conventional techniques. The main goal was to develop and evaluate mixtures of new or modified surfactants for improved oil recovery. In this regard, interfacial properties of novel biodegradable n-alkyl pyrrolidones and sugar-based surfactants have been studied systematically. Emphasis was on designing cost-effective processes compatible with existing conditions and operations in addition to ensuring minimal reagent loss.

  5. Process-aware EHR BPM systems: two prototypes and a conceptual framework.

    PubMed

    Webster, Charles; Copenhaver, Mark

    2010-01-01

    Systematic methods to improve the effectiveness and efficiency of electronic health record-mediated processes will be key to EHRs playing an important role in the positive transformation of healthcare. Business process management (BPM) systematically optimizes process effectiveness, efficiency, and flexibility. Therefore BPM offers relevant ideas and technologies. We provide a conceptual model based on EHR productivity and negative feedback control that links EHR and BPM domains, describe two EHR BPM prototype modules, and close with the argument that typical EHRs must become more process-aware if they are to take full advantage of BPM ideas and technology. A prediction: Future extensible clinical groupware will coordinate delivery of EHR functionality to teams of users by combining modular components with executable process models whose usability (effectiveness, efficiency, and user satisfaction) will be systematically improved using business process management techniques.

  6. Some Improvements in H-PDLCs

    NASA Technical Reports Server (NTRS)

    Crawford, Gregory P.; Li, Liuliu

    2005-01-01

    Some improvements have been made in the formulation of holographically formed polymer-dispersed liquid crystals (H-PDLCs) and in the fabrication of devices made from these materials, with resulting improvements in performance. H-PDLCs are essentially volume Bragg gratings. Devices made from H-PDLCs function as electrically switchable reflective filters. Heretofore, it has been necessary to apply undesirably high drive voltages in order to switch H-PDLC devices. Many scientific papers on H-PDLCs and on the potential utility of H-PDLC devices for display and telecommunication applications have been published. However, until now, little has been published about improving quality control in synthesis of H-PDLCs and fabrication of H-PDLC devices to minimize (1) spatial nonuniformities within individual devices, (2) nonuniformities among nominally identical devices, and (3) variations in performance among nominally identical devices. The improvements reported here are results of a research effort directed partly toward solving these quality-control problems and partly toward reducing switching voltages. The quality-control improvements include incorporation of a number of process controls to create a relatively robust process, such that the H-PDLC devices fabricated in this process are more nearly uniform than were those fabricated in a prior laboratory-type process. The improved process includes ultrasonic mixing, ultrasonic cleaning, the use of a micro dispensing technique, and the use of a bubble press.

  7. Symmetric Phase Only Filtering for Improved DPIV Data Processing

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    2006-01-01

    The standard approach in Digital Particle Image Velocimetry (DPIV) data processing is to use Fast Fourier Transforms to obtain the cross-correlation of two single-exposure subregions, where the location of the cross-correlation peak represents the most probable particle displacement across the subregion. This standard DPIV processing technique is analogous to Matched Spatial Filtering, a technique commonly used in optical correlators to perform the cross-correlation operation. Phase-only filtering is a well-known variation of Matched Spatial Filtering which, when used to process DPIV image data, yields correlation peaks that are narrower and up to an order of magnitude larger than those obtained using traditional DPIV processing. In addition to possessing desirable correlation-plane features, phase-only filters also provide superior performance in the presence of DC noise in the correlation subregion. When DPIV image subregions contaminated with surface flare light or high background noise levels are processed using phase-only filters, the correlation peak pertaining only to the particle displacement is readily detected above any signal stemming from the DC objects. Tedious image masking or background image subtraction is not required. Both theoretical and experimental analyses of the signal-to-noise ratio performance of the filter functions are presented. In addition, a new Symmetric Phase Only Filtering (SPOF) technique, a variation on the traditional phase-only filtering technique, is described and demonstrated. The SPOF technique exceeds the performance of traditionally accepted phase-only filtering techniques and is easily implemented in standard DPIV FFT-based correlation processing with no significant computational performance penalty. An "Automatic" SPOF algorithm is presented which determines when the SPOF is able to provide better signal-to-noise results than traditional PIV processing. The SPOF-based optical correlation processing approach is presented as a new paradigm for more robust cross-correlation processing of low signal-to-noise ratio DPIV image data.
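
    The core of the approach can be sketched with NumPy (a hedged illustration: function names are mine, and the square-root magnitude normalization below is the generic symmetric phase-only form, not necessarily the paper's exact filter): the cross-spectrum of the two subregions is normalized by sqrt(|F1||F2|) before the inverse transform, which sharpens the correlation peak and suppresses DC background.

```python
import numpy as np

def cross_correlate(a, b, spof=True, eps=1e-12):
    """FFT cross-correlation of two subregions; the peak location gives the
    most probable particle displacement.  With spof=True the cross-spectrum
    is normalized by sqrt(|F1||F2|) (symmetric phase-only filtering)."""
    F1 = np.fft.fft2(a)
    F2 = np.fft.fft2(b)
    S = F1 * np.conj(F2)
    if spof:
        S /= np.sqrt(np.abs(F1) * np.abs(F2)) + eps
    return np.fft.fftshift(np.fft.ifft2(S).real)

# Shift a random particle-like image by (3, 5) pixels and recover the shift.
rng = np.random.default_rng(1)
img = rng.random((64, 64))
shifted = np.roll(img, (3, 5), axis=(0, 1))
corr = cross_correlate(shifted, img)
peak = np.unravel_index(np.argmax(corr), corr.shape)
shift = (peak[0] - 32, peak[1] - 32)
print(shift)  # -> (3, 5)
```

    With spof=False the same function reduces to the standard matched-filter correlation; the displaced peak is in the same place but broader and more sensitive to DC contamination.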

  8. Wear study of Al-SiC metal matrix composites processed through microwave energy

    NASA Astrophysics Data System (ADS)

    Honnaiah, C.; Srinath, M. S.; Prasad, S. L. Ajit

    2018-04-01

    Particulate reinforced metal matrix composites are finding wider acceptance in many industrial applications due to their isotropic properties and ease of manufacture. Uniform distribution of reinforcement particulates and good bonding between the matrix and reinforcement phases are essential for obtaining metal matrix composites with improved properties. The conventional powder metallurgy technique can successfully overcome the limitations of stir casting techniques, but it is time consuming and not cost effective. The use of microwave technology for processing particulate reinforced metal matrix composites through the powder metallurgy route is being increasingly explored in recent times because of its cost effectiveness and speed of processing. The present work is an attempt to process Al-SiC metal matrix composites using microwaves irradiated at 2.45 GHz frequency and 900 W power for 10 minutes. Further, dry sliding wear studies were conducted at different loads at a constant velocity of 2 m/s for various sliding distances using pin-on-disc equipment. Analysis of the results shows that the microwave-processed Al-SiC composite exhibits around 34% higher wear resistance than the aluminium alloy.

  9. Myocardial tagging by Cardiovascular Magnetic Resonance: evolution of techniques--pulse sequences, analysis algorithms, and applications

    PubMed Central

    2011-01-01

    Cardiovascular magnetic resonance (CMR) tagging has been established as an essential technique for measuring regional myocardial function. It allows quantification of local intramyocardial motion measures, e.g. strain and strain rate. The invention of CMR tagging came in the late 1980s, when the technique for the first time allowed transmural myocardial movement to be visualized without implanting physical markers. This new idea opened the door for a series of developments and improvements that continue up to the present time. Different tagging techniques are currently available that are more extensive, improved, and sophisticated than they were twenty years ago. Each of these techniques has different versions for improved resolution, signal-to-noise ratio (SNR), scan time, anatomical coverage, three-dimensional capability, and image quality. The tagging techniques covered in this article can be broadly divided into two main categories: 1) basic techniques, which include magnetization saturation, spatial modulation of magnetization (SPAMM), delay alternating with nutations for tailored excitation (DANTE), and complementary SPAMM (CSPAMM); and 2) advanced techniques, which include harmonic phase (HARP), displacement encoding with stimulated echoes (DENSE), and strain encoding (SENC). Although most of these techniques were developed by separate groups and evolved from different backgrounds, they are in fact closely related to each other, and they can be interpreted from more than one perspective. Some of these techniques even followed parallel paths of development, as illustrated in the article. As each technique has its own advantages, some efforts have been made to combine different techniques for improved image quality or composite information acquisition. In this review, different developments in pulse sequences and related image processing techniques are described along with the necessities that led to their invention, which makes this article easy to read and the covered techniques easy to follow. Major studies that applied CMR tagging for studying myocardial mechanics are also summarized. Finally, the current article includes a plethora of ideas and techniques with over 300 references that motivate the reader to think about the future of CMR tagging. PMID:21798021

  10. NDE Process Development Specification for SRB Composite Nose Cap

    NASA Technical Reports Server (NTRS)

    Suits, M.

    1999-01-01

    The Shuttle Upgrade program is a continuing improvement process to enable the Space Shuttle to be an effective space transportation vehicle for the next few decades. The Solid Rocket Booster (SRB), as a component of that system, is currently undergoing such an improvement. Advanced materials, such as composites, have given us a chance to improve performance and to reduce weight. The SRB Composite Nose Cap (CNC) program aims to replace the current aluminum nose cap, which is coated with a Thermal Protection System and poses a possible debris hazard, with a lighter, stronger, CNC. For the next 2 years, this program will evaluate the design, material selection, properties, and verification of the CNC. This particular process specification cites the methods and techniques for verifying the integrity of such a nose cap with nondestructive evaluation.

  11. Managing distribution changes in time series prediction

    NASA Astrophysics Data System (ADS)

    Matias, J. M.; Gonzalez-Manteiga, W.; Taboada, J.; Ordonez, C.

    2006-07-01

    When a problem is modeled statistically, a single distribution model is usually postulated that is assumed to be valid for the entire space. Nonetheless, this practice may be somewhat unrealistic in certain application areas, in which the conditions of the process that generates the data may change; as far as we are aware, however, no techniques have been developed to tackle this problem. This article proposes a technique for modeling and predicting this change in time series with a view to improving estimates and predictions. The technique is applied, among other models, to the recently proposed hypernormal distribution. When tested on real data from a range of stock market indices, the technique produces better results than when a single distribution model is assumed to be valid for the entire period of time studied. Moreover, when a global model is postulated, it is highly recommended to select the hypernormal distribution parameter in the same likelihood maximization process.

  12. Method of fabricating porous silicon carbide (SiC)

    NASA Technical Reports Server (NTRS)

    Shor, Joseph S. (Inventor); Kurtz, Anthony D. (Inventor)

    1995-01-01

    Porous silicon carbide is fabricated according to techniques which result in a significant portion of nanocrystallites within the material in a sub-10-nanometer regime. Techniques are described for passivating porous silicon carbide which result in the fabrication of optoelectronic devices that exhibit brighter blue luminescence and improved qualities. Based on certain of the techniques described, porous silicon carbide is used as a sacrificial layer for the patterning of silicon carbide; the porous silicon carbide is then removed from the bulk substrate by oxidation and other methods. The techniques described employ a two-step process to pattern bulk silicon carbide, in which selected areas of the wafer are first made porous and the porous layer is subsequently removed. The process to form porous silicon carbide exhibits dopant selectivity, and a two-step etching procedure is implemented for silicon carbide multilayers.

  13. Research and Development in Very Long Baseline Interferometry (VLBI)

    NASA Technical Reports Server (NTRS)

    Himwich, William E.

    2004-01-01

    Contents include the following: 1.Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.

  14. An improved technique for the use of zinc-rich coatings

    NASA Technical Reports Server (NTRS)

    Paton, W. J.

    1973-01-01

    Blistering and peeling of topcoats used over ethyl silicate, inorganic, zinc-rich protective coatings are virtually eliminated when the primer is allowed to cure outdoors for an extended period of time and is moistened during the process.

  15. Seasonal load restriction tool : Clarus regional demonstrations.

    DOT National Transportation Integrated Search

    2011-01-01

    State transportation agencies have wanted to improve the techniques that lead to the decisions to impose and subsequently lift restrictions on selected roads that are prone to road damage due to subsurface freezing/thawing processes. The Seas...

  16. A digital strategy for manometer dynamic enhancement. [for wind tunnel monitoring

    NASA Technical Reports Server (NTRS)

    Stoughton, J. W.

    1978-01-01

    The application of digital signal processing techniques to improve the non-linear dynamic characteristics of a sonar-type mercury manometer is described. The dynamic enhancement strategy quasi-linearizes the manometer characteristics and improves the effective bandwidth in the context of a wind-tunnel pressure regulation system. Model identification data and real-time hybrid simulation data demonstrate the feasibility of the approach.

  17. A Quality System for Education: Using Quality and Productivity Techniques To Save Our Schools.

    ERIC Educational Resources Information Center

    Spanbauer, Stanley J.; Hillman, Jo

    This book provides a case study of the implementation of a quality improvement model to improve educational services at Fox Valley Technical College (FVTC), in Appleton, Wisconsin. Chapter 1 describes the early stages of the implementation of the quality processes at FVTC. Chapter 2 discusses the role of the chief administrator as mentor and…

  18. Advancements in oxygen generation and humidity control by water vapor electrolysis

    NASA Technical Reports Server (NTRS)

    Heppner, D. B.; Sudar, M.; Lee, M. C.

    1988-01-01

    Regenerative processes for the revitalization of manned spacecraft atmospheres or other manned habitats are essential for the realization of long-term space missions. These processes include oxygen generation through water electrolysis. One promising technique of water electrolysis is the direct conversion of the water vapor contained in the cabin air to oxygen. This technique is the subject of the present program on water vapor electrolysis development. The objectives were to incorporate technology improvements developed under other similar electrochemical programs and add new ones; to design and fabricate a multi-cell electrochemical module and a testing facility; and to demonstrate the improvements through testing. Each aspect of the water vapor electrolysis cell was reviewed. The materials of construction and the sizing of each element were investigated analytically and sometimes experimentally. In addition, operational considerations such as temperature control in response to inlet conditions were investigated. Three specific quantitative goals were established.

  19. Speckle reduction in echocardiography by temporal compounding and anisotropic diffusion filtering

    NASA Astrophysics Data System (ADS)

    Giraldo-Guzmán, Jader; Porto-Solano, Oscar; Cadena-Bonfanti, Alberto; Contreras-Ortiz, Sonia H.

    2015-01-01

    Echocardiography is a medical imaging technique based on ultrasound signals that is used to evaluate heart anatomy and physiology. Echocardiographic images are affected by speckle, a type of multiplicative noise that obscures details of the structures and reduces overall image quality. This paper presents an approach to enhance echocardiography using two processing techniques: temporal compounding and anisotropic diffusion filtering. We used twenty echocardiographic videos that include one or three cardiac cycles to test the algorithms. Two images from each cycle were aligned in space and averaged to obtain the compound images. These images were then processed using anisotropic diffusion filters to further improve their quality. Resultant images were evaluated using quality metrics and visual assessment by two medical doctors. The average total improvement in signal-to-noise ratio was up to 100.29% for videos with three cycles, and up to 32.57% for videos with one cycle.
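
    The two-stage scheme described here can be illustrated with a minimal sketch (parameter values and function names are illustrative, not the authors' settings): aligned frames are averaged, and the compound image is then smoothed with a Perona-Malik anisotropic diffusion filter.

```python
import numpy as np

def temporal_compound(frames):
    """Average spatially aligned frames: N uncorrelated speckle
    realizations average down, improving SNR roughly as sqrt(N)."""
    return np.mean(frames, axis=0)

def anisotropic_diffusion(img, n_iter=10, kappa=0.2, gamma=0.15):
    """Perona-Malik diffusion: the conductance g = exp(-(d/kappa)^2) is
    near 1 for small intensity differences (noise is smoothed) and near 0
    across strong edges (structure boundaries are preserved)."""
    out = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        dN = np.roll(out, -1, axis=0) - out
        dS = np.roll(out, 1, axis=0) - out
        dE = np.roll(out, -1, axis=1) - out
        dW = np.roll(out, 1, axis=1) - out
        out = out + gamma * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return out

# Toy example: a bright/dark step corrupted by noise in four "frames".
rng = np.random.default_rng(2)
clean = np.zeros((32, 32))
clean[:, 16:] = 1.0
frames = clean + 0.05 * rng.standard_normal((4, 32, 32))
compound = temporal_compound(frames)
filtered = anisotropic_diffusion(compound)
```

    Compounding reduces the noise level before filtering, and the diffusion step then smooths the remaining noise in homogeneous regions while leaving the intensity step largely intact.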

  20. Solid-state reactions during mechanical alloying of ternary Fe-Al-X (X=Ni, Mn, Cu, Ti, Cr, B, Si) systems: A review

    NASA Astrophysics Data System (ADS)

    Hadef, Fatma

    2016-12-01

    The last decade has witnessed intensive research in the field of nanocrystalline materials due to their enhanced properties. Many processing techniques have been developed to synthesize these novel materials, among them mechanical alloying (MA), or high-energy ball milling. In fact, mechanical alloying is one of the most common operations in the processing of solids. It can be used to quickly and easily synthesize a variety of technologically useful materials which are very difficult to manufacture by other techniques. One advantage of MA over many other techniques is that it is a solid-state technique, so problems associated with melting and solidification are bypassed. Special attention is being paid to the synthesis of alloys through reactions occurring mainly in the solid state in many metallic ternary Fe-Al-X systems, chiefly in order to improve the structural and mechanical properties of Fe-Al. The results show that nanocrystallization is the common outcome in all systems during the MA process. The aim of this work is to illustrate the uniqueness of the MA process in inducing phase transformations in metallic Fe-Al-X (X=Ni, Mn, Cu, Ti, Cr, B, Si) systems.

  1. Effect of Controlled Ice Nucleation on Stability of Lactate Dehydrogenase During Freeze-Drying.

    PubMed

    Fang, Rui; Tanaka, Kazunari; Mudhivarthi, Vamsi; Bogner, Robin H; Pikal, Michael J

    2018-03-01

    Several controlled ice nucleation techniques have been developed to increase the efficiency of the freeze-drying process as well as to improve the quality of pharmaceutical products. Owing to the reduction in ice surface area, these techniques have the potential to reduce the degradation of proteins labile during freezing. The objective of this study was to evaluate the effect of ice nucleation temperature on the in-process stability of lactate dehydrogenase (LDH). LDH in potassium phosphate buffer was nucleated at -4°C, -8°C, and -12°C using ControLyo™ or allowed to nucleate spontaneously. Both the enzymatic activity and tetramer recovery after freeze-thawing linearly correlated with product ice nucleation temperature (n = 24). Controlled nucleation also significantly improved batch homogeneity as reflected by reduced inter-vial variation in activity and tetramer recovery. With the correlation established in the laboratory, the degradation of protein in manufacturing arising from ice nucleation temperature differences can be quantitatively predicted. The results show that controlled nucleation reduced the degradation of LDH during the freezing process, but this does not necessarily translate to vastly superior stability during the entire freeze-drying process. The capability of improving batch homogeneity provides potential advantages in scaling-up from lab to manufacturing scale. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
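
    The linear correlation reported here is what makes the quantitative prediction possible; a sketch of that step (with made-up numbers standing in for the study's n = 24 vials, not its actual data) is an ordinary least-squares fit of recovery against nucleation temperature, extrapolated to a new temperature:

```python
import numpy as np

# Hypothetical (nucleation temperature [deg C], % tetramer recovery) pairs,
# illustrating the kind of data behind the reported linear correlation.
temps = np.array([-4.0, -4.0, -8.0, -8.0, -12.0, -12.0])
recovery = np.array([92.0, 91.0, 85.5, 86.5, 79.0, 80.0])

# Least-squares line: recovery ~ slope * temp + intercept.
slope, intercept = np.polyfit(temps, recovery, 1)

def predicted_recovery(temp_c):
    """Expected recovery at a given ice nucleation temperature."""
    return slope * temp_c + intercept

print(round(predicted_recovery(-10.0), 1))  # -> 82.7
```

    With a correlation established at laboratory scale, the same fit can be evaluated at the (typically colder) spontaneous nucleation temperatures seen in manufacturing.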

  2. A novel process for introducing a new intraoperative program: a multidisciplinary paradigm for mitigating hazards and improving patient safety.

    PubMed

    Rodriguez-Paz, Jose M; Mark, Lynette J; Herzer, Kurt R; Michelson, James D; Grogan, Kelly L; Herman, Joseph; Hunt, David; Wardlow, Linda; Armour, Elwood P; Pronovost, Peter J

    2009-01-01

    Since the Institute of Medicine's report, To Err is Human, was published, numerous interventions have been designed and implemented to correct the defects that lead to medical errors and adverse events; however, most efforts were largely reactive. Safety, communication, team performance, and efficiency are areas of care that attract a great deal of attention, especially regarding the introduction of new technologies, techniques, and procedures. We describe a multidisciplinary process that was implemented at our hospital to identify and mitigate hazards before the introduction of a new technique: high-dose-rate intraoperative radiation therapy (HDR-IORT). A multidisciplinary team of surgeons, anesthesiologists, radiation oncologists, physicists, nurses, hospital risk managers, and equipment specialists used a structured process that included in situ clinical simulation to uncover concerns among care providers and to prospectively identify and mitigate defects for patients who would undergo surgery using the HDR-IORT technique. We identified and corrected 20 defects in the simulated patient care process before application to actual patients. Subsequently, eight patients underwent surgery using the HDR-IORT technique with no recurrence of simulation-identified or unanticipated defects. Multiple benefits were derived from the use of this systematic process to introduce the HDR-IORT technique; namely, the safety and efficiency of care for this select patient population was optimized, and harmful or adverse events were mitigated before the inclusion of actual patients. Further work is needed, but the process outlined in this paper can be universally applied to the introduction of any new technologies, treatments, or procedures.

  3. Improved functionality of graphene and carbon nanotube hybrid foam architecture by UV-ozone treatment

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Ruiz, Isaac; Lee, Ilkeun; Zaera, Francisco; Ozkan, Mihrimah; Ozkan, Cengiz S.

    2015-04-01

    Optimization of the electrode/electrolyte double-layer interface is a key factor for improving electrode performance of aqueous electrolyte based supercapacitors (SCs). Here, we report the improved functionality of carbon materials via a non-invasive, high-throughput, and inexpensive UV generated ozone (UV-ozone) treatment. This process allows precise tuning of the graphene and carbon nanotube hybrid foam (GM) transitionally from ultrahydrophobic to hydrophilic within 60 s. The continuous tuning of surface energy can be controlled by simply varying the UV-ozone exposure time, while the ozone-oxidized carbon nanostructure maintains its integrity. Symmetric SCs based on the UV-ozone treated GM foam demonstrated enhanced rate performance. This technique can be readily applied to other CVD-grown carbonaceous materials by taking advantage of its ease of processing, low cost, scalability, and controllability.

  4. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure, and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and these were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, allowing a desired process state to be maintained and guaranteeing the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and it discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  5. Evidential Reasoning in Expert Systems for Image Analysis.

    DTIC Science & Technology

    1985-02-01

    techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis, particularly in the... (2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis, and (4) to identify... approaches to three important tasks for expert systems in the domain of image analysis. This segment concluded with an assessment of the strengths

  6. LSSA (Low-cost Silicon Solar Array) project

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Methods are explored for economically generating electrical power to meet future requirements. The Low-Cost Silicon Solar Array Project (LSSA) was established to reduce the price of solar arrays by improving manufacturing technology, adapting mass production techniques, and promoting user acceptance. The new manufacturing technology includes the consideration of new silicon refinement processes, silicon sheet growth techniques, encapsulants, and automated assembly production being developed under contract by industries and universities.

  7. Microgravity

    NASA Image and Video Library

    1994-02-03

    The objective of this facility is to investigate the potential of space-grown semiconductor materials by the vapor transport technique and to develop powdered metal and ceramic sintering techniques in microgravity. The materials processed or developed in the SEF have potential application for improving infrared detectors, nuclear particle detectors, photovoltaic cells, bearings, cutting tools, electrical brushes and catalysts for chemical production. Flown on STS-60. Commercial Center: Consortium for Materials Development in Space - University of Alabama Huntsville (UAH)

  8. Developing and Evaluating RGB Composite MODIS Imagery for Applications in National Weather Service Forecast Offices

    NASA Technical Reports Server (NTRS)

    Oswald, Hayden; Molthan, Andrew L.

    2011-01-01

    Satellite remote sensing has gained widespread use in the field of operational meteorology. Although raw satellite imagery is useful, several techniques exist which can convey multiple types of data in a more efficient way. One of these techniques is multispectral compositing. The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed two multispectral satellite imagery products which utilize data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Terra and Aqua satellites, based upon products currently generated and used by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT). The nighttime microphysics product allows users to identify clouds occurring at different altitudes, but emphasizes fog and low cloud detection. This product improves upon current spectral difference and single channel infrared techniques. Each of the current products has its own set of advantages for nocturnal fog detection, but each also has limiting drawbacks which can hamper the analysis process. The multispectral product combines each current product with a third channel difference. Since the final image is enhanced with color, it simplifies the fog identification process. Analysis has shown that the nighttime microphysics imagery product represents a substantial improvement to conventional fog detection techniques, as well as provides a preview of future satellite capabilities to forecasters.
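
    The compositing described above assigns channel differences to red, green and blue display components. The sketch below follows the general shape of the EUMETSAT-style nighttime microphysics recipe (red: 12.0-10.8 um difference, green: 10.8-3.9 um difference, blue: 10.8 um channel), but the scaling ranges here are illustrative assumptions, not the operational values.

```python
import numpy as np

def scale(component, lo, hi):
    """Linearly map a physical quantity into [0, 1], clipping outside the range."""
    return np.clip((component - lo) / (hi - lo), 0.0, 1.0)

def nighttime_rgb(bt039, bt108, bt120):
    """Combine three brightness-temperature arrays (K) into one false-color
    image so that fog/low cloud stands out by color. Ranges are illustrative."""
    r = scale(bt120 - bt108, -4.0, 2.0)   # split-window difference
    g = scale(bt108 - bt039, 0.0, 10.0)   # classic fog-detection difference
    b = scale(bt108, 243.0, 293.0)        # single-channel IR temperature
    return np.dstack([r, g, b])           # H x W x 3 array in [0, 1]
```

Each pixel's color then encodes all three pieces of information at once, which is the efficiency gain over inspecting three grayscale products separately.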

  9. High accuracy switched-current circuits using an improved dynamic mirror

    NASA Technical Reports Server (NTRS)

    Zweigle, G.; Fiez, T.

    1991-01-01

    The switched-current technique, a recently developed circuit approach to analog signal processing, has emerged as an alternative/complement to the well-established switched-capacitor circuit technique. High speed switched-current circuits offer potential cost and power savings over slower switched-capacitor circuits. Accuracy improvements are a primary concern at this stage in the development of the switched-current technique. Use of the dynamic current mirror has produced circuits that are insensitive to transistor matching errors. The dynamic current mirror has been limited by other sources of error including clock-feedthrough and voltage transient errors. In this paper we present an improved switched-current building block using the dynamic current mirror. Utilizing current feedback, the errors due to current imbalance in the dynamic current mirror are reduced. Simulations indicate that this feedback can reduce total harmonic distortion by as much as 9 dB. Additionally, we have developed a clock-feedthrough reduction scheme for which simulations reveal a potential 10 dB total harmonic distortion improvement. The clock-feedthrough reduction scheme also significantly reduces offset errors and allows for cancellation with a constant current source. Experimental results confirm the simulated improvements.

  10. TQM in a test environment

    NASA Technical Reports Server (NTRS)

    Chambers, Gary D.; King, Elizabeth A.; Oleson, Keith

    1992-01-01

    In response to the changing aerospace economic climate, Martin Marietta Astronautics Group (MMAG) has adopted a Total Quality Management (TQM) philosophy to maintain a competitive edge. TQM emphasizes continuous improvement of processes, motivation to improve from within, cross-functional involvement, people empowerment, customer satisfaction, and modern process control techniques. The four major initiatives of TQM are Product Excellence, Manufacturing Resource Planning (MRP II), People Empowerment, and Subcontract Management. The Defense Space and Communications (DS&C) Test Lab's definition and implementation of the MRP II and people empowerment initiatives within TQM are discussed. The application of MRP II to environmental test planning and operations processes required a new and innovative approach. In an 18 month span, the test labs implemented MRP II and people empowerment and achieved a Class 'A' operational status. This resulted in numerous benefits, both tangible and intangible, including significant cost savings and improved quality of life. A detailed description of the implementation process and its results is presented.

  11. TQM in a test environment

    NASA Astrophysics Data System (ADS)

    Chambers, Gary D.; King, Elizabeth A.; Oleson, Keith

    1992-11-01

    In response to the changing aerospace economic climate, Martin Marietta Astronautics Group (MMAG) has adopted a Total Quality Management (TQM) philosophy to maintain a competitive edge. TQM emphasizes continuous improvement of processes, motivation to improve from within, cross-functional involvement, people empowerment, customer satisfaction, and modern process control techniques. The four major initiatives of TQM are Product Excellence, Manufacturing Resource Planning (MRP II), People Empowerment, and Subcontract Management. The Defense Space and Communications (DS&C) Test Lab's definition and implementation of the MRP II and people empowerment initiatives within TQM are discussed. The application of MRP II to environmental test planning and operations processes required a new and innovative approach. In an 18 month span, the test labs implemented MRP II and people empowerment and achieved a Class 'A' operational status. This resulted in numerous benefits, both tangible and intangible, including significant cost savings and improved quality of life. A detailed description of the implementation process and its results is presented.

  12. Faster, better, cheaper: lean labs are the key to future survival.

    PubMed

    Bryant, Patsy M; Gulling, Richard D

    2006-03-28

    Process improvement techniques have been used in manufacturing for many years to rein in costs and improve quality. Health care is now grappling with similar challenges. The Department of Laboratory Services at Good Samaritan Hospital, a 560-bed facility in Dayton, OH, used the Lean process improvement method in a 12-week project to streamline its core laboratory processes. By analyzing the flow of samples through the system and identifying value-added and non-value-added steps, both in the laboratory and during the collection process, Good Samaritan's project team redesigned systems and reconfigured the core laboratory layout to trim collection-to-results time from 65 minutes to 40 minutes. As a result, virtually all morning results are available to physicians by 7 a.m., critical values are called to nursing units within 30 minutes, and core laboratory services are optimally staffed for maximum cost-effectiveness.

  13. Controlling for confounding variables in MS-omics protocol: why modularity matters.

    PubMed

    Smith, Rob; Ventura, Dan; Prince, John T

    2014-09-01

    As the field of bioinformatics research continues to grow, more and more novel techniques are proposed to meet new challenges and to improve upon solutions to long-standing problems. These include data processing techniques and wet lab protocol techniques. Although the literature is consistently thorough in experimental detail and variable-controlling rigor for wet lab protocol techniques, bioinformatics techniques tend to be described in less detail and controlled less rigorously. As the validation or rejection of hypotheses rests on the experiment's ability to isolate and measure a variable of interest, we urge the importance of reducing confounding variables in bioinformatics techniques during mass spectrometry experimentation. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  14. Demosaiced pixel super-resolution in digital holography for multiplexed computational color imaging on-a-chip (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wu, Yichen; Zhang, Yibo; Luo, Wei; Ozcan, Aydogan

    2017-03-01

    Digital holographic on-chip microscopy achieves large space-bandwidth-products (e.g., >1 billion) by making use of pixel super-resolution techniques. To synthesize a digital holographic color image, one can take three sets of holograms representing the red (R), green (G) and blue (B) parts of the spectrum and digitally combine them to synthesize a color image. The data acquisition efficiency of this sequential illumination process can be improved by 3-fold using wavelength-multiplexed R, G and B illumination that simultaneously illuminates the sample, and using a Bayer color image sensor with known or calibrated transmission spectra to digitally demultiplex these three wavelength channels. This demultiplexing step is conventionally used with interpolation-based Bayer demosaicing methods. However, because the pixels of different color channels on a Bayer image sensor chip are not at the same physical location, the conventional interpolation-based demosaicing process generates strong color artifacts, especially at rapidly oscillating hologram fringes, which become even more pronounced through digital wave propagation and phase retrieval processes. Here, we demonstrate that by merging the pixel super-resolution framework into the demultiplexing process, such color artifacts can be greatly suppressed. This novel technique, termed demosaiced pixel super-resolution (D-PSR) for digital holographic imaging, achieves very similar color imaging performance compared to conventional sequential R, G, B illumination, with a 3-fold improvement in image acquisition time and data efficiency. We successfully demonstrated the color imaging performance of this approach by imaging stained Pap smears. The D-PSR technique is broadly applicable to high-throughput, high-resolution digital holographic color microscopy techniques that can be used in resource-limited settings and point-of-care offices.
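
    The demultiplexing step above amounts to inverting a calibrated spectral-crosstalk system per pixel: each Bayer filter responds partly to all three illumination wavelengths. The sketch below shows only that linear-algebra core under simplifying assumptions; the 3x3 matrix values are hypothetical, and the real method additionally merges this with pixel super-resolution, which is not shown.

```python
import numpy as np

# Hypothetical calibrated transmission matrix: row i is the response of the
# Bayer R/G/B filter i to each of the R/G/B illumination wavelengths.
T = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.75, 0.15],
              [0.05, 0.20, 0.75]])

def demultiplex(bayer_rgb):
    """Recover per-wavelength hologram intensities from Bayer-filtered
    measurements by inverting the transmission matrix, for any batch
    shape ending in the 3 color channels."""
    return np.einsum('ij,...j->...i', np.linalg.inv(T), bayer_rgb)
```

The same inverse applies to every pixel, so a full frame is demultiplexed with one vectorized operation.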

  15. Removal of Lattice Imperfections that Impact the Optical Quality of Ti:Sapphire using Advanced Magnetorheological Finishing Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menapace, J A; Schaffers, K I; Bayramian, A J

    2008-02-26

    Advanced magnetorheological finishing (MRF) techniques have been applied to Ti:sapphire crystals to compensate for sub-millimeter lattice distortions that occur during the crystal growing process. Precise optical corrections are made by imprinting topographical structure onto the crystal surfaces to cancel out the effects of the lattice distortion in the transmitted wavefront. This novel technique significantly improves the optical quality for crystals of this type and sets the stage for increasing the availability of high-quality large-aperture sapphire and Ti:sapphire optics in critical applications.

  16. Improved structural integrity through advances in reliable residual stress measurement: the impact of ENGIN-X

    NASA Astrophysics Data System (ADS)

    Edwards, L.; Santisteban, J. R.

    The determination of accurate reliable residual stresses is critical to many fields of structural integrity. Neutron stress measurement is a non-destructive technique that uniquely provides insights into stress fields deep within engineering components and structures. As such, it has become an increasingly important tool within engineering, leading to improved manufacturing processes to reduce stress and distortion as well as to the definition of more precise lifing procedures. This paper describes the likely impact of the next generation of dedicated engineering stress diffractometers currently being constructed and the utility of the technique using examples of residual stresses both beneficial and detrimental to structural integrity.
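
    Neutron stress measurement of this kind rests on standard diffraction relations rather than anything unique to this paper: Bragg's law gives the lattice spacing, the shift relative to a stress-free reference gives the elastic strain, and Hooke's law converts strain to stress. A minimal sketch of those textbook relations (the 200 GPa modulus is an illustrative, steel-like value):

```python
import math

def bragg_spacing(wavelength, two_theta_deg):
    """Lattice spacing d from Bragg's law, lambda = 2 d sin(theta)."""
    return wavelength / (2.0 * math.sin(math.radians(two_theta_deg) / 2.0))

def lattice_strain(d, d0):
    """Elastic lattice strain from measured (d) and stress-free (d0) spacings."""
    return (d - d0) / d0

def uniaxial_stress(strain, e_gpa=200.0):
    """Uniaxial Hooke's-law estimate of stress in GPa; in practice the full
    triaxial form with Poisson's ratio is used."""
    return e_gpa * strain
```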

  17. OPC care-area feedforwarding to MPC

    NASA Astrophysics Data System (ADS)

    Dillon, Brian; Peng, Yi-Hsing; Hamaji, Masakazu; Tsunoda, Dai; Muramatsu, Tomoyuki; Ohara, Shuichiro; Zou, Yi; Arnoux, Vincent; Baron, Stanislas; Zhang, Xiaolong

    2016-10-01

    Demand for mask process correction (MPC) is growing for leading-edge process nodes. MPC was originally intended to correct CD linearity for narrow assist features difficult to resolve on a photomask without any correction, but it has been extended to main features as process nodes have shrunk. As past papers have observed, MPC improves photomask fidelity. Using advanced shape and dose corrections could give further improvements, especially at line-ends and corners. However, there is a dilemma in using such advanced corrections at full-mask level, because they increase data volume and run time. In addition, write time on variable shaped beam (VSB) writers also increases as the number of shots increases. An optical proximity correction (OPC) care-area defines circuit design locations that require high mask fidelity under mask writing process variations such as energy fluctuation. It is useful for MPC to switch its correction strategy and permit the use of advanced mask correction techniques in those local care-areas where they provide maximum wafer benefits. Applying mask correction techniques tailored to the localized post-OPC design can keep data volume, run time, and write time at the desired levels. ASML Brion and NCS have jointly developed a method to feedforward the care-area information from Tachyon LMC to NDE-MPC to provide real benefit for improving both mask writing and wafer printing quality. This paper explains the details of OPC care-area feedforwarding to MPC between ASML Brion and NCS, and shows the results. In addition, improvements in mask and wafer simulations are also shown. The results indicate that the worst process variation (PV) bands are reduced by up to 37% for a 10 nm technology node metal layer case.

  18. Overview of CMOS process and design options for image sensor dedicated to space applications

    NASA Astrophysics Data System (ADS)

    Martin-Gonthier, P.; Magnan, P.; Corbiere, F.

    2005-10-01

    With the growth of huge-volume markets (mobile phones, digital cameras, ...), CMOS technologies for image sensors have improved significantly. New process flows have appeared that optimize parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies evaluated with test vehicles composed of several sub-arrays designed with space applications as the target. These three technologies are standard, improved, and sensor-optimized CMOS processes in the 0.35μm generation. Measurements are focused on quantum efficiency, dark current, conversion gain and noise. Other measurements, such as Modulation Transfer Function (MTF) and crosstalk, are described in [1]. The results have been compared, and three categories of CMOS process for image sensors are identified. Radiation tolerance has also been studied for the improved CMOS process, with the aim of hardening the imager by design. Results at 4, 15, 25 and 50 krad demonstrate good ionizing-dose radiation tolerance when specific design techniques are applied.

  19. Enzymes in Fish and Seafood Processing

    PubMed Central

    Fernandes, Pedro

    2016-01-01

    Enzymes have been used for the production and processing of fish and seafood for several centuries in an empirical manner. In recent decades, a growing trend toward a rational and controlled application of enzymes for such goals has emerged. Underlying such pattern are, among others, the increasingly wider array of enzyme activities and enzyme sources, improved enzyme formulations, and enhanced requirements for cost-effective and environmentally friendly processes. The better use of enzyme action in fish- and seafood-related applications has had a significant impact on the fish-related industry. Thus, new products have surfaced, product quality has improved, more sustainable processes have been developed, and innovative and reliable analytical techniques have been implemented. Recent developments in these fields are presented and discussed, and prospective developments are suggested. PMID:27458583

  20. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview of the basic techniques used to process Landsat images with a digital computer, and the VICAR image processing software developed at JPL and available to users through the NASA sponsored COSMIC computer program distribution center is presented. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. Examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by classification of land use in Portland, Oregon. The VICAR image processing software system is described; it consists of a language translator that simplifies execution of image processing programs and provides a general-purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs.
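
    Two of the subjective techniques named above, linear contrast enhancement and band ratioing, are simple enough to sketch directly. This is a generic illustration in NumPy, not VICAR code; the percentile cutoffs are a common but arbitrary choice.

```python
import numpy as np

def contrast_stretch(band, lo_pct=2, hi_pct=98):
    """Linear contrast enhancement: stretch the lo..hi percentile range of a
    band to the full 0-255 display range, clipping the tails."""
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    scaled = np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)

def band_ratio(b1, b2, eps=1e-6):
    """Pixel-wise ratio of two co-registered bands; ratios suppress
    illumination differences and emphasize spectral contrast."""
    return b1.astype(float) / (b2.astype(float) + eps)
```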

  1. Iterative co-creation for improved hand hygiene and aseptic techniques in the operating room: experiences from the safe hands study.

    PubMed

    Erichsen Andersson, Annette; Frödin, Maria; Dellenborg, Lisen; Wallin, Lars; Hök, Jesper; Gillespie, Brigid M; Wikström, Ewa

    2018-01-04

    Hand hygiene and aseptic techniques are essential preventive measures in combating hospital-acquired infections. However, implementation of these strategies in the operating room remains suboptimal. There is a paucity of intervention studies providing detailed information on effective methods for change. This study aimed to evaluate the process of implementing a theory-driven knowledge translation program for improved use of hand hygiene and aseptic techniques in the operating room. The study was set in an operating department of a university hospital. The intervention was underpinned by theories on organizational learning, culture and person centeredness. Qualitative process data were collected via participant observations and analyzed using a thematic approach. Doubts that hand-hygiene practices are effective in preventing hospital-acquired infections, strong boundaries and distrust between professional groups, and a lack of psychological safety were identified as barriers to change. Facilitated interprofessional dialogue and learning in "safe spaces" worked as mechanisms for motivation and engagement. Allowing for the free expression of different opinions and doubts, and viewing resistance as a natural part of any change, was effective in engaging all professional categories in co-creation of clinically relevant solutions to improve hand hygiene. Enabling nurses and physicians to think and talk differently about hospital-acquired infections and hand hygiene requires a shift from the concept of one-way directed compliance towards change and learning as the result of a participatory and meaning-making process. The present study is a part of the Safe Hands project, and is registered with ClinicalTrials.gov (ID: NCT02983136). Date of registration 2016/11/28, retrospectively registered.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rangaraj, D; Chan, K; Boddu, S

    Lean thinking has revolutionized the manufacturing industry. Toyota has pioneered and leveraged this aspect of Lean thinking. Applying Lean thinking and Lean Six Sigma techniques to healthcare, and in particular to Radiation Oncology, has its merits and challenges. Improving quality, safety and patient satisfaction with available resources, or reducing cost in terms of time, staff and resources, is a demand of today's healthcare. Radiation oncology treatment involves many processes and steps; identifying and removing the non-value-added steps in a process can significantly improve efficiency. Real projects undertaken in a radiation oncology department, in which Lean thinking cut the procedure time for MRI-guided brachytherapy by 40%, will be narrated. Simple Lean tools and techniques such as Gemba walks, visual control, daily huddles, standard work, value stream mapping, error-proofing, etc. can be applied with existing resources; how these improved operations over a Radiation Oncology department's two-year experience will be discussed. Lean thinking focuses on identifying and solving the root cause of a problem by asking "Why" and not "Who", and this requires a no-blame culture change. The role of leadership in building a Lean culture, empowering employees, and training and developing Lean thinkers will be presented. Why Lean initiatives fail and how to implement Lean successfully in your clinic will be discussed. Learning Objectives: Concepts of Lean management or Lean thinking. Lean tools and techniques applied in Radiation Oncology. Implement a no-blame culture and focus on systems and processes. Leadership role in implementing a Lean culture. Challenges for Lean thinking in healthcare.

  3. Post-acquisition data mining techniques for LC-MS/MS-acquired data in drug metabolite identification.

    PubMed

    Dhurjad, Pooja Sukhdev; Marothu, Vamsi Krishna; Rathod, Rajeshwari

    2017-08-01

    Metabolite identification is a crucial part of the drug discovery process. LC-MS/MS-based metabolite identification has gained widespread use, but the data acquired by the LC-MS/MS instrument is complex, and thus the interpretation of data becomes troublesome. Fortunately, advancements in data mining techniques have simplified the process of data interpretation with improved mass accuracy and provide a potentially selective, sensitive, accurate and comprehensive way for metabolite identification. In this review, we have discussed the targeted (extracted ion chromatogram, mass defect filter, product ion filter, neutral loss filter and isotope pattern filter) and untargeted (control sample comparison, background subtraction and metabolomic approaches) post-acquisition data mining techniques which facilitate drug metabolite identification. We have also discussed the importance of an integrated data mining strategy.
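
    One of the targeted filters named above, the mass defect filter, reduces to a simple numeric criterion: metabolites tend to retain a mass defect (the decimal part of the exact mass) close to that of the parent drug. The sketch below shows that criterion in isolation; the m/z values and the ±40 mDa window are illustrative assumptions, and real implementations also restrict the nominal-mass range.

```python
import math

def mass_defect(mz):
    """Decimal part of an m/z value (exact mass minus nominal mass)."""
    return mz - math.floor(mz)

def mass_defect_filter(peaks, parent_mz, tolerance_mda=40.0):
    """Keep ions whose mass defect lies within +/- tolerance (in mDa) of the
    parent drug's mass defect, discarding ions unlikely to be drug-related."""
    target = mass_defect(parent_mz)
    return [mz for mz in peaks
            if abs(mass_defect(mz) - target) * 1000.0 <= tolerance_mda]
```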

  4. Ultrafast chirped optical waveform recording using referenced heterodyning and a time microscope

    DOEpatents

    Bennett, Corey Vincent

    2010-06-15

    A new technique for capturing both the amplitude and phase of an optical waveform is presented. This technique can capture signals with many THz of bandwidth in a single shot (e.g., temporal resolution of about 44 fs), or be operated repetitively at a high rate. That is, each temporal window (or frame) is captured single shot, in real time, but the process may be run repeatedly or single-shot. This invention expands upon previous work in temporal imaging by adding heterodyning, which can be self-referenced for improved precision and stability, to convert frequency chirp (the second derivative of phase with respect to time) into a time-varying intensity modulation. By also including a variety of possible demultiplexing techniques, this process is scalable to recording continuous signals.

  5. Adaptive design of an X-ray magnetic circular dichroism spectroscopy experiment with Gaussian process modelling

    NASA Astrophysics Data System (ADS)

    Ueno, Tetsuro; Hino, Hideitsu; Hashimoto, Ai; Takeichi, Yasuo; Sawada, Masahiro; Ono, Kanta

    2018-01-01

    Spectroscopy is a widely used experimental technique, and enhancing its efficiency can have a strong impact on materials research. We propose an adaptive design for spectroscopy experiments that uses a machine learning technique to improve efficiency. We examined X-ray magnetic circular dichroism (XMCD) spectroscopy for the applicability of a machine learning technique to spectroscopy. An XMCD spectrum was predicted by Gaussian process modelling with learning of an experimental spectrum using a limited number of observed data points. Adaptive sampling of data points with maximum variance of the predicted spectrum successfully reduced the total data points for the evaluation of magnetic moments while providing the required accuracy. The present method reduces the time and cost for XMCD spectroscopy and has potential applicability to various spectroscopies.
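
    The adaptive design described above (fit a Gaussian process to the points observed so far, then measure next where the predictive variance is largest) can be sketched compactly. This is a generic toy implementation in plain NumPy under simplifying assumptions: a fixed squared-exponential kernel with an illustrative length scale, a 1-D grid, and a stand-in function in place of a real spectrum; the paper's actual modelling choices are not reproduced.

```python
import numpy as np

def rbf(a, b, length=0.5):
    """Squared-exponential kernel between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_grid, noise=1e-4):
    """Posterior mean and standard deviation of a zero-mean GP on x_grid."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_grid, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = rbf(x_grid, x_grid) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

def adaptive_sample(f, x_grid, n_init=3, n_total=10):
    """Iteratively measure f at the grid point of maximum predictive variance,
    mimicking adaptive selection of spectroscopy data points."""
    xs = list(np.linspace(x_grid.min(), x_grid.max(), n_init))
    ys = [f(v) for v in xs]
    for _ in range(n_total - n_init):
        _, std = gp_posterior(np.array(xs), np.array(ys), x_grid)
        nxt = x_grid[np.argmax(std)]
        xs.append(nxt)
        ys.append(f(nxt))
    return np.array(xs), np.array(ys)
```

The stopping rule here is a fixed budget; in the paper the loop instead stops once the quantities of interest reach the required accuracy.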

  6. Ultrafast chirped optical waveform recorder using referenced heterodyning and a time microscope

    DOEpatents

    Bennett, Corey Vincent [Livermore, CA

    2011-11-22

    A new technique for capturing both the amplitude and phase of an optical waveform is presented. This technique can capture signals with many THz of bandwidth in a single shot (e.g., temporal resolution of about 44 fs), or be operated repetitively at a high rate. That is, each temporal window (or frame) is captured single shot, in real time, but the process may be run repeatedly or single-shot. This invention expands upon previous work in temporal imaging by adding heterodyning, which can be self-referenced for improved precision and stability, to convert frequency chirp (the second derivative of phase with respect to time) into a time-varying intensity modulation. By also including a variety of possible demultiplexing techniques, this process is scalable to recording continuous signals.

  7. An investigation into underwater wet welding using the flux cored arc welding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brydon, A.M.; Nixon, J.H.

    1995-12-31

    For the last two years, Cranfield has been carrying out a program of process investigations into wet underwater welding (Graham and Nixon 1993, Nixon and Webb 1994), and has demonstrated that it is possible to markedly improve the stability and consistency of the process by using control techniques developed for hyperbaric welding. In the project reported below, an initial evaluation of wet flux cored arc welding was undertaken. Although there continues to be considerable resistance to the use of wet welding on structures in the North Sea, continued pressure to reduce repair and maintenance costs is causing the industry to re-examine techniques previously discounted, such as wet welding (Anon 1993).

  8. Nano-Electrochemistry and Nano-Electrografting with an Original Combined AFM-SECM

    PubMed Central

    Ghorbal, Achraf; Grisotto, Federico; Charlier, Julienne; Palacin, Serge; Goyer, Cédric; Demaille, Christophe; Ben Brahim, Ammar

    2013-01-01

    This study demonstrates the advantages of combining atomic force microscopy and scanning electrochemical microscopy. The combined technique can perform nano-electrochemical measurements on agarose surfaces and nano-electrografting of non-conducting polymers onto conducting surfaces. This work was achieved by manufacturing an original Atomic Force Microscopy-Scanning ElectroChemical Microscopy (AFM-SECM) electrode. The capabilities of the AFM-SECM electrode were tested with the nano-electrografting of vinylic monomers initiated by aryl diazonium salts. The nano-electrochemical and technical processes were thoroughly described, so as to allow the experiments to be reproduced. A plausible explanation of the chemical and electrochemical mechanisms leading to the nano-grafting process was reported. This combined technique represents the first step towards improved nano-processes for nano-electrografting. PMID:28348337

  9. Improving Signal Detection using Allan and Theo Variances

    NASA Astrophysics Data System (ADS)

    Hardy, Andrew; Broering, Mark; Korsch, Wolfgang

    2017-09-01

    Precision measurements often deal with small signals buried within electronic noise. Extracting these signals can be enhanced through digital signal processing, and improving these techniques yields better signal-to-noise ratios. Studies presently performed at the University of Kentucky are utilizing the electro-optic Kerr effect to understand cell-charging effects within ultra-cold neutron storage cells. This work is relevant for the neutron electric dipole moment (nEDM) experiment at Oak Ridge National Laboratory. These investigations, and future investigations in general, will benefit from the improved analysis techniques illustrated here. This project will showcase various methods for determining the optimum duration over which data should be gathered. Typically, extending the measuring time of an experimental run reduces the averaged noise. However, experiments also encounter drift due to fluctuations, which mitigates the benefits of extended data gathering. By comparing FFT averaging techniques along with Allan and Theo variance measurements, quantifiable differences in signal detection will be presented. This research is supported by DOE Grants: DE-FG02-99ER411001, DE-AC05-00OR22725.
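
    The Allan variance mentioned above quantifies exactly this trade-off: for pure white noise it keeps falling as the averaging window grows, but drift makes it turn back up, revealing the optimum averaging duration. A minimal non-overlapping implementation (the Theo variance, an estimator with better confidence at long averaging times, is not sketched here):

```python
import numpy as np

def allan_deviation(y, m):
    """Non-overlapping Allan deviation of equally spaced samples y for an
    averaging window of m samples: sqrt(0.5 * <(ybar_{k+1} - ybar_k)^2>),
    where ybar_k are consecutive m-sample averages."""
    n = len(y) // m
    means = y[:n * m].reshape(n, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))
```

Plotting this against m for a recorded signal shows the characteristic dip: the window at the minimum is the longest averaging time that still helps.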

  10. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  11. X-Ray Microanalysis and Electron Energy Loss Spectrometry in the Analytical Electron Microscope: Review and Future Directions

    NASA Technical Reports Server (NTRS)

    Goldstein, J. I.; Williams, D. B.

    1992-01-01

    This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered, with present limitations of spatial resolution in the 2 to 3 nm range and of MDL in the 0.1 to 0.2 wt.% range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement of X-ray spatial resolution to the 1 to 2 nm range and MDL as low as 0.01 wt.%. With these improvements, the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean-room techniques for thin-specimen preparation; quantification at the 1% accuracy and precision level, with light-element quantification at better than the 10% accuracy and precision level; the incorporation of a compact wavelength-dispersive spectrometer to improve X-ray spectral resolution, light-element analysis and MDL; and instrument improvements including source stability, on-line probe current measurement, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Microanalysis with core-loss edges is considered, along with a discussion of limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single-atom detection is already possible. Plasmon loss analysis is discussed, as well as fine-structure analysis. New techniques for energy-loss imaging are also summarized. Future directions in the EELS technique will be the development of new spectrometers and improvements in thin-specimen preparation. The microanalysis technique needs to be simplified and software developed so that EELS approaches the relative simplicity of the X-ray technique. Finally, major improvements in EELS imaging can be expected as data storage and processing improve.
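
    The ratio method with k(sub AB) factors mentioned above (the Cliff-Lorimer relation) can be sketched for a binary thin specimen: the weight-fraction ratio C_A/C_B equals k_AB times the measured intensity ratio I_A/I_B, with C_A + C_B = 1. The intensities and k-factor below are made-up illustrative values, not measured data.

```python
def cliff_lorimer_binary(i_a, i_b, k_ab):
    """Weight fractions of a binary thin specimen from characteristic
    X-ray intensities via the Cliff-Lorimer ratio method:
        C_A / C_B = k_AB * (I_A / I_B),  with  C_A + C_B = 1.
    """
    ratio = k_ab * i_a / i_b       # C_A / C_B
    c_a = ratio / (1.0 + ratio)    # normalize so fractions sum to 1
    return c_a, 1.0 - c_a

# Illustrative (hypothetical) counts and k-factor for an Fe-Ni film:
c_fe, c_ni = cliff_lorimer_binary(i_a=12000, i_b=8000, k_ab=1.2)
```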

  12. Effect of Temperature and Deformation Rate on the Tensile Mechanical Properties of Polyimide Films

    NASA Technical Reports Server (NTRS)

    Moghazy, Samir F.; McNair, Kevin C.

    1996-01-01

    In order to study the structure-property relationships of differently processed oriented polyimide films, the mechanical properties will be identified using an Instron 4505 tensile tester, and structural information, such as the 3-dimensional birefringence molecular symmetry axis and the 3-dimensional refractive indices, will be determined using waveguide coupling techniques. The monoaxial drawing techniques utilized in this research are very useful for improving the tensile mechanical properties of aromatic polyimide films. To obtain high-modulus/high-strength polyimide films, two techniques have been employed: cold drawing, in which polyimide films are drawn at room temperature at different crosshead speeds, and hot drawing, in which polyimide films are drawn at different temperatures and crosshead speeds. In the hot drawing process, the polyimide films are drawn in an environmental chamber at different temperatures up to the glass transition temperature (Tg). All of the mechanical and optical property parameters will be identified for each sample processed by both cold and hot drawing techniques.
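
    One property extracted from such tensile tests, the Young's modulus, is commonly taken as the slope of the initial linear region of the stress-strain curve. A minimal sketch follows; the curve, the 3 GPa modulus, and the 1% linear limit are synthetic assumptions, not data from this study.

```python
import numpy as np

def tensile_modulus(strain, stress, linear_limit=0.01):
    """Young's modulus as the least-squares slope of the initial
    linear region of a stress-strain curve (strain <= linear_limit)."""
    mask = strain <= linear_limit
    slope, _intercept = np.polyfit(strain[mask], stress[mask], 1)
    return slope

# Synthetic curve: linear at 3 GPa, with softening beyond 1% strain.
strain = np.linspace(0.0, 0.05, 200)
stress = 3.0e9 * strain - 5.0e9 * np.clip(strain - 0.01, 0.0, None) ** 2
E = tensile_modulus(strain, stress)   # recovers the 3 GPa modulus
```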

  13. [QUIPS: quality improvement in postoperative pain management].

    PubMed

    Meissner, Winfried

    2011-01-01

    Despite the availability of high-quality guidelines and advanced pain management techniques acute postoperative pain management is still far from being satisfactory. The QUIPS (Quality Improvement in Postoperative Pain Management) project aims to improve treatment quality by means of standardised data acquisition, analysis of quality and process indicators, and feedback and benchmarking. During a pilot phase funded by the German Ministry of Health (BMG), a total of 12,389 data sets were collected from six participating hospitals. Outcome improved in four of the six hospitals. Process indicators, such as routine pain documentation, were only poorly correlated with outcomes. To date, more than 130 German hospitals use QUIPS as a routine quality management tool. An EC-funded parallel project disseminates the concept internationally. QUIPS demonstrates that patient-reported outcomes in postoperative pain management can be benchmarked in routine clinical practice. Quality improvement initiatives should use outcome instead of structural and process parameters. The concept is transferable to other fields of medicine. Copyright © 2011. Published by Elsevier GmbH.

  14. Improving sensor data analysis through diverse data source integration

    NASA Astrophysics Data System (ADS)

    Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry

    2009-05-01

    Daily sensor data volumes are increasing from gigabytes to multiple terabytes, but the manpower and resources needed to analyze the growing amount of data are not increasing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed. Analysts are largely left to analyze the individual data sources manually, which is both time consuming and mentally exhausting, and expanding data collections only exacerbate the problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously. Improved techniques are needed to reduce an analyst's decision response time and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system, built to give analysts the ability to pose integrated queries over diverse live and historical data sources and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, its implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.

  15. Integrating MRP (materiel requirements planning) into modern business.

    PubMed

    Lunn, T

    1994-05-01

    Time is the commodity of the '90s. Therefore, we all must learn how to use our manufacturing systems to shorten lead time and increase customer satisfaction. The objective of this article is to discuss practical ways people integrate the techniques of materiel requirements planning (MRP) systems with just-in-time (JIT) execution systems to increase customer satisfaction. Included are examples of new ways people use MRP systems to exemplify the process of continuous improvement--multiple items on work orders, consolidated routings, flexing capacity, and other new developments. Ways that successful companies use MRP II for planning and JIT for execution are discussed. There are many examples of how to apply theory to real life situations and a discussion of techniques that work to keep companies in the mode of continuous improvement. Also included is a look at hands-on, practical methods people use to achieve lead time reduction and simplify bills of material. Total quality management concepts can be applied to the MRP process itself. This in turn helps people improve schedule adherence, which leads to customer satisfaction.

  16. [Detection of lung nodules. New opportunities in chest radiography].

    PubMed

    Pötter-Lang, S; Schalekamp, S; Schaefer-Prokop, C; Uffmann, M

    2014-05-01

    Chest radiography is still the most commonly performed X-ray examination because it is readily available, requires low radiation doses and is relatively inexpensive. However, as previously published, many initially undetected lung nodules are retrospectively visible in chest radiographs. Great improvements in detector technology, with increasing dose efficiency and improved contrast resolution, provide better image quality and lower dose requirements. The dual-energy acquisition technique and advanced image processing methods (e.g. digital bone subtraction and temporal subtraction) reduce the anatomical background noise by suppressing overlapping structures in chest radiography. Computer-aided detection (CAD) schemes increase the awareness of radiologists for suspicious areas. The advanced image processing methods show clear improvements in the detection of pulmonary nodules in chest radiography and strengthen the role of this method in comparison to 3D acquisition techniques, such as computed tomography (CT). Many of these methods will probably be integrated into standard clinical practice in the near future. Digital software solutions offer advantages because they can be easily incorporated into radiology departments and are often more affordable than hardware solutions.
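
    The principle behind dual-energy bone subtraction can be sketched as a weighted log subtraction of the two exposures: choosing the weight so the bone contribution cancels leaves a soft-tissue image. This is a toy illustration of the idea, not any vendor's algorithm; the attenuation coefficients and thicknesses below are made up.

```python
import numpy as np

def dual_energy_subtract(high_kv, low_kv, w):
    """Weighted log subtraction of high- and low-kVp intensity images.
    With w chosen (in practice, by calibration) to cancel the bone signal,
    the result is a soft-tissue-only image."""
    return np.log(high_kv) - w * np.log(low_kv)

# Toy 2-pixel phantom: one pixel with bone, one without, same soft tissue.
mu_bone = {"high": 0.3, "low": 0.6}     # hypothetical attenuation, 1/cm
mu_soft = {"high": 0.2, "low": 0.25}
t_bone = np.array([0.0, 2.0])           # cm of bone per pixel
t_soft = np.array([5.0, 5.0])           # cm of soft tissue per pixel

high = np.exp(-(mu_bone["high"] * t_bone + mu_soft["high"] * t_soft))
low = np.exp(-(mu_bone["low"] * t_bone + mu_soft["low"] * t_soft))
w = mu_bone["high"] / mu_bone["low"]    # cancels the bone term exactly
soft_only = dual_energy_subtract(high, low, w)
# Both pixels now carry the same (soft-tissue-only) value: bone is removed.
```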

  17. Image processing and recognition for biological images.

    PubMed

    Uchida, Seiichi

    2013-05-01

    This paper reviews image processing and pattern recognition techniques that are useful for analyzing bioimages. Although the paper does not provide their technical details, it makes it possible to grasp their main tasks and the typical tools for handling those tasks. Image processing is a large research area aimed at improving the visibility of an input image and acquiring valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition, the technique of classifying an input image into one of a set of predefined classes, is also a large research area. This paper overviews its two main modules: the feature extraction module and the classification module. Throughout the paper, it is emphasized that the bioimage is a very difficult target even for state-of-the-art image processing and pattern recognition techniques, owing to noise, deformation, etc. This paper is expected to serve as a tutorial guide bridging biology and image processing researchers for further collaboration on this difficult target. © 2013 The Author Development, Growth & Differentiation © 2013 Japanese Society of Developmental Biologists.
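
    Binarization, one of the image-processing tasks listed above, is classically done with Otsu's method: choose the gray level that maximizes the between-class variance of the two resulting classes. A self-contained sketch on a synthetic bimodal image (not taken from the paper):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: the threshold maximizing between-class variance
    sigma_b^2(t) = (mu_T * omega(t) - mu(t))^2 / (omega(t) * (1 - omega(t)))."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                   # class-0 probability up to level t
    mu = np.cumsum(p * np.arange(256))     # class-0 cumulative mean
    mu_t = mu[-1]                          # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)     # empty classes contribute nothing
    return int(np.argmax(sigma_b2))

# Synthetic bimodal "bioimage": dark background (~50), bright object (~200).
rng = np.random.default_rng(1)
img = np.full((64, 64), 50, dtype=np.uint8)
img[16:48, 16:48] = 200
img = np.clip(img + rng.normal(0, 5, img.shape), 0, 255).astype(np.uint8)

t = otsu_threshold(img)
binary = img > t   # foreground mask separating object from background
```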

  18. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    NASA Astrophysics Data System (ADS)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    The electrocardiogram (ECG) is a weak, nonlinear and non-stationary signal that reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise, such as high- and low-frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal is a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart disease. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the ensemble EMD (EEMD) technique. EMD is a promising, though not yet perfect, method for processing nonlinear and non-stationary signals such as the ECG, and combining EMD with other algorithms is a good way to improve noise-cancellation performance. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are outlined.
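
    The EMD idea can be sketched as follows: repeatedly subtract the mean of the upper and lower extrema envelopes (sifting) to peel off intrinsic mode functions (IMFs); high-frequency noise concentrates in the first IMF and baseline wander in the residue, so denoising amounts to dropping those components. This is a deliberately simplified sketch on a synthetic signal, not a production EMD implementation (real EMD uses stricter sifting stop criteria and careful boundary handling):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, t, n_sifts=8):
    """Extract one IMF by repeatedly subtracting the mean envelope."""
    h = x.copy()
    for _ in range(n_sifts):
        maxi = argrelextrema(h, np.greater)[0]
        mini = argrelextrema(h, np.less)[0]
        if len(maxi) < 4 or len(mini) < 4:
            break  # too few extrema to build spline envelopes
        upper = CubicSpline(t[maxi], h[maxi])(t)
        lower = CubicSpline(t[mini], h[mini])(t)
        h = h - 0.5 * (upper + lower)
    return h

def emd(x, t, max_imfs=5):
    """Decompose x into IMFs plus a residue; x == sum(imfs) + residue."""
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        if len(argrelextrema(residue, np.greater)[0]) < 4:
            break  # residue is (nearly) monotonic: stop
        imf = sift(residue, t)
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue

# Synthetic "ECG-like" setting: a slow wave plus high-frequency noise.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 2000)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.2 * rng.normal(size=t.size)

imfs, residue = emd(noisy, t)
# Simplest EMD denoising step: drop the first (noise-dominated) IMF.
denoised = sum(imfs[1:]) + residue if len(imfs) > 1 else residue
```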

  19. A New Essential Functions Installed DWH in Hospital Information System: Process Mining Techniques and Natural Language Processing.

    PubMed

    Honda, Masayuki; Matsumoto, Takehiro

    2017-01-01

    Several kinds of event log data produced in daily clinical activities have yet to be used for the secure and efficient improvement of hospital activities. Data Warehouse (DWH) systems in Hospital Information Systems, used for the analysis of structured data such as diseases, lab tests and medications, have already produced useful results. This article focuses on two kinds of essential functions: process mining using log data, and the analysis of unstructured data via Natural Language Processing.
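
    Process mining on such event logs typically starts from the directly-follows relation: counting, per case, how often activity B immediately follows activity A. A minimal sketch with a hypothetical (made-up) clinical event log:

```python
from collections import Counter

def directly_follows(event_log):
    """Count directly-follows pairs across all traces: the basic relation
    most process-discovery algorithms are built on."""
    dfg = Counter()
    for trace in event_log:                 # one trace = one case/visit
        for a, b in zip(trace, trace[1:]):  # consecutive activity pairs
            dfg[(a, b)] += 1
    return dfg

# Hypothetical log: each trace is the ordered activities of one patient visit.
log = [
    ["register", "lab_test", "diagnosis", "discharge"],
    ["register", "diagnosis", "lab_test", "diagnosis", "discharge"],
    ["register", "lab_test", "diagnosis", "discharge"],
]
dfg = directly_follows(log)
# dfg now maps activity pairs to frequencies, e.g. how often a lab test
# is immediately followed by a diagnosis, revealing the dominant flow.
```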

  20. Clinical operations management in radiology.

    PubMed

    Ondategui-Parra, Silvia; Gill, Ileana E; Bhagwat, Jui G; Intrieri, Lisa A; Gogate, Adheet; Zou, Kelly H; Nathanson, Eric; Seltzer, Steven E; Ros, Pablo R

    2004-09-01

    Providing radiology services is a complex and technically demanding enterprise in which the application of operations management (OM) tools can play a substantial role in process management and improvement. This paper considers the benefits of an OM process in a radiology setting. Available techniques and concepts of OM are addressed, along with gains and benefits that can be derived from these processes. A reference framework for the radiology processes is described, distinguishing two phases in the initial assessment of a unit: the diagnostic phase and the redesign phase.
