Sample records for processing techniques applied

  1. Applying Parallel Processing Techniques to Tether Dynamics Simulation

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    1996-01-01

    The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.

  2. A systematic mapping study of process mining

    NASA Astrophysics Data System (ADS)

    Maita, Ana Rocío Cárdenas; Martins, Lucas Corrêa; López Paz, Carlos Ramón; Rafferty, Laura; Hung, Patrick C. K.; Peres, Sarajane Marques; Fantinato, Marcelo

    2018-05-01

    This study systematically assesses the process mining scenario from 2005 to 2014. The analysis of 705 papers evidenced 'discovery' (71%) as the main type of process mining addressed and 'categorical prediction' (25%) as the main mining task solved. The most applied traditional techniques are the 'graph structure-based' ones (38%). Concerning computational intelligence and machine learning techniques specifically, we concluded that little attention has been given to them; the most applied are 'evolutionary computation' (9%) and 'decision tree' (6%). Process mining challenges, such as balancing robustness, simplicity, accuracy and generalization, could benefit from a wider use of such techniques.

  3. Flash X-ray with image enhancement applied to combustion events

    NASA Astrophysics Data System (ADS)

    White, K. J.; McCoy, D. G.

    1983-10-01

    Flow visualization of interior ballistic processes by use of X-rays has placed more stringent requirements on flash X-ray techniques. The problem of improving the radiographic contrast of propellants in X-ray transparent chambers was studied by devising techniques for evaluating, measuring and reducing the effects of scattering from both the test object and structures in the test area. X-ray film and processing are reviewed, and techniques for evaluating and calibrating them are outlined. Finally, after the X-ray techniques were optimized, the application of image enhancement processing to improve image quality is described. This technique was applied to X-ray studies of the combustion of very high burning rate (VHBR) propellants and stick propellant charges.

  4. Low cost MATLAB-based pulse oximeter for deployment in research and development applications.

    PubMed

    Shokouhian, M; Morling, R C S; Kale, I

    2013-01-01

    Problems such as motion artifact and the effects of ambient light have forced developers to design different signal processing techniques and algorithms to increase the reliability and accuracy of the conventional pulse oximeter. To evaluate the robustness of these techniques, they are applied either to recorded data or implemented on chip and applied to real-time data. Recorded data is the most common basis for evaluation; however, it is not as reliable as real-time measurement. On the other hand, hardware implementation can be both expensive and time consuming. This paper presents a low-cost MATLAB-based pulse oximeter that can be used for rapid evaluation of newly developed signal processing techniques and algorithms. The flexibility to apply different signal processing techniques, the provision of both processed and unprocessed data, and low implementation cost are the important features of this design, which make it ideal for research and development purposes as well as commercial, hospital and healthcare applications.
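
    As an illustration of the kind of algorithm such a test bench would host, the sketch below computes SpO2 from red/infrared photoplethysmogram segments via the classic ratio-of-ratios relation. This is a minimal Python sketch, not the paper's MATLAB tool; the calibration constants (110, 25) and the synthetic test signal are illustrative assumptions.

      import numpy as np

      def spo2_ratio_of_ratios(red, ir):
          """Classic ratio-of-ratios SpO2 estimate; constants are illustrative."""
          def ac_dc(sig):
              dc = np.mean(sig)
              ac = np.std(sig - dc)          # crude AC amplitude estimate
              return ac, dc
          ac_r, dc_r = ac_dc(red)
          ac_i, dc_i = ac_dc(ir)
          R = (ac_r / dc_r) / (ac_i / dc_i)
          return 110.0 - 25.0 * R            # assumed empirical linear calibration

      # synthetic 5 s segment: a 1.2 Hz pulse riding on a DC level
      t = np.arange(0, 5, 0.01)
      red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)
      ir = 1.0 + 0.03 * np.sin(2 * np.pi * 1.2 * t)
      print(f"SpO2 ~ {spo2_ratio_of_ratios(red, ir):.1f}%")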

  5. Towards Real Time Diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Mcjunkin; Dennis C. Kunerth; Corrie Nichol

    2013-07-01

    Methods are currently being developed towards more robust real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of the proposed real-time techniques on weld samples are presented, along with concepts for applying the techniques concurrently to the weld process. Considerations for the eventual code acceptance of the methods and system are also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  6. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.

    2014-02-18

    Methods are currently being developed towards more robust real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of the proposed real-time techniques on weld samples are presented, along with concepts for applying the techniques concurrently to the weld process. Considerations for the eventual code acceptance of the methods and system are also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  7. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    NASA Astrophysics Data System (ADS)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.; Todorov, E.; Levesque, S.

    2014-02-01

    Methods are currently being developed towards more robust real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of the proposed real-time techniques on weld samples are presented, along with concepts for applying the techniques concurrently to the weld process. Considerations for the eventual code acceptance of the methods and system are also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  8. Using Decision Trees for Estimating Mode Choice of Trips in Buca-Izmir

    NASA Astrophysics Data System (ADS)

    Oral, L. O.; Tecim, V.

    2013-05-01

    Decision makers develop transportation plans and models to provide sustainable transport systems in urban areas. Mode choice is one of the stages in transportation modelling. Data mining techniques can discover the factors affecting mode choice, and these techniques can be applied within a knowledge-process approach. In this study a data mining process model is applied to determine the factors affecting mode choice using decision tree techniques, considering individual trip behaviours from household survey data collected within the Izmir Transportation Master Plan. From this perspective, the transport mode choice problem is solved for a case in the district of Buca, Izmir, Turkey with the CRISP-DM knowledge process model.
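
    To make the mode-choice step concrete, here is a minimal sketch of decision-tree classification in the spirit of the study, using scikit-learn on invented household-survey-style features; the data-generating rule is hypothetical, not drawn from the Izmir survey. A depth-limited tree like this yields printable rules, which matches the interpretability motive for using decision trees in mode-choice work.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(0)
      n = 1000
      # hypothetical survey features: trip distance (km), car ownership (0/1),
      # age (years), income band (1-5)
      X = np.column_stack([
          rng.exponential(5.0, n),
          rng.integers(0, 2, n),
          rng.integers(18, 75, n),
          rng.integers(1, 6, n),
      ])
      # toy rule standing in for observed behaviour: car owners on longer
      # trips drive; very short trips are walked; the rest use transit
      y = np.where((X[:, 1] == 1) & (X[:, 0] > 3), "car",
                   np.where(X[:, 0] < 1.5, "walk", "transit"))

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
      print("held-out accuracy:", tree.score(X_te, y_te))
      print(export_text(tree, feature_names=[
          "distance_km", "has_car", "age", "income_band"]))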

  9. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation process is often performed at different stages of system design, beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding those limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
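
    A worked sketch of the weighting-factor category may help. The classic failure-rate-weighted (ARINC-style) allocation for a series system of independent components assigns each subsystem R_i = R_sys^(w_i), with w_i proportional to its failure rate, so the product of the R_i recovers the system target. The failure rates below are invented for illustration.

      import numpy as np

      def allocate_reliability(r_system, failure_rates):
          """ARINC-style weighting-factor allocation for a series system:
          w_i = lambda_i / sum(lambda), R_i = r_system ** w_i, so that
          prod(R_i) = r_system ** sum(w_i) = r_system."""
          lam = np.asarray(failure_rates, dtype=float)
          w = lam / lam.sum()
          return r_system ** w

      r_sub = allocate_reliability(0.95, [2e-6, 5e-6, 3e-6])
      print(r_sub)              # per-subsystem targets
      print(np.prod(r_sub))     # ~0.95, the system requirement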

  10. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
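
    The abstract gives no model details; as a generic illustration of applying a convolutional network to multi-band image patches, a minimal Keras sketch follows. The patch size (64x64), band count (6), and four-class output are assumptions, not values from the talk.

      import tensorflow as tf

      # minimal CNN for multi-band patches; shapes and class count are placeholders
      model = tf.keras.Sequential([
          tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 6)),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(64, 3, activation="relu"),
          tf.keras.layers.GlobalAveragePooling2D(),
          tf.keras.layers.Dense(4, activation="softmax"),  # e.g. land-cover classes
      ])
      model.compile(optimizer="adam",
                    loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      model.summary()
      # model.fit(patches, labels, ...) once patches exported from Earth Engine are loaded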

  11. Conceptual design optimization study

    NASA Technical Reports Server (NTRS)

    Hollowell, S. J.; Beeman, E. R., II; Hiyama, R. M.

    1990-01-01

    The feasibility of applying multilevel functional decomposition and optimization techniques to conceptual design of advanced fighter aircraft was investigated. Applying the functional decomposition techniques to the conceptual design phase appears to be feasible. The initial implementation of the modified design process will optimize wing design variables. A hybrid approach, combining functional decomposition techniques for generation of aerodynamic and mass properties linear sensitivity derivatives with existing techniques for sizing mission performance and optimization, is proposed.

  12. A review of recent developments in parametric based acoustic emission techniques applied to concrete structures

    NASA Astrophysics Data System (ADS)

    Vidya Sagar, R.; Raghu Prasad, B. K.

    2012-03-01

    This article presents a review of recent developments in parametric-based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including the various methods and models developed for AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric-based AE techniques for concrete structures developed over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published, and considerable attention has been given to those publications. Some recent studies, such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams, are also discussed. The formation of the fracture process zone and the AE energy released during the fracture process in concrete beam specimens are summarised. A large body of experimental data on the AE characteristics of concrete has accumulated over the last three decades. This review of parametric-based AE techniques applied to concrete structures may help researchers and engineers better understand the failure mechanisms of concrete and develop more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.
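
    As a concrete instance of the b-value analysis mentioned above, the sketch below fits the Gutenberg-Richter-style relation log10 N(>=m) = a - b*m to AE hit amplitudes, taking magnitude as peak amplitude in dB divided by 20 (one common AE convention); the synthetic amplitude data are illustrative.

      import numpy as np

      def ae_b_value(amplitudes_db):
          """Estimate the AE b-value: negative slope of log10 N(>=m) versus
          magnitude m, with m = peak amplitude (dB) / 20."""
          m = np.sort(np.asarray(amplitudes_db) / 20.0)
          n_cum = np.arange(len(m), 0, -1)      # events with magnitude >= m
          slope, _ = np.polyfit(m, np.log10(n_cum), 1)
          return -slope

      rng = np.random.default_rng(1)
      amps = rng.exponential(10.0, 2000) + 40.0  # synthetic hits above a 40 dB threshold
      print(f"b-value ~ {ae_b_value(amps):.2f}")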

  13. Parallel plan execution with self-processing networks

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.

    1989-01-01

    A critical issue for space operations is how to develop and apply advanced automation techniques to reduce the cost and complexity of working in space. In this context, it is important to examine how recent advances in self-processing networks can be applied for planning and scheduling tasks. For this reason, the feasibility of applying self-processing network models to a variety of planning and control problems relevant to spacecraft activities is being explored. Goals are to demonstrate that self-processing methods are applicable to these problems, and that MIRRORS/II, a general purpose software environment for implementing self-processing models, is sufficiently robust to support development of a wide range of application prototypes. Using MIRRORS/II and marker passing modelling techniques, a model of the execution of a Spaceworld plan was implemented. This is a simplified model of the Voyager spacecraft which photographed Jupiter, Saturn, and their satellites. It is shown that plan execution, a task usually solved using traditional artificial intelligence (AI) techniques, can be accomplished using a self-processing network. The fact that self-processing networks were applied to other space-related tasks, in addition to the one discussed here, demonstrates the general applicability of this approach to planning and control problems relevant to spacecraft activities. It is also demonstrated that MIRRORS/II is a powerful environment for the development and evaluation of self-processing systems.

  14. Unfolding and unfoldability of digital pulses in the z-domain

    NASA Astrophysics Data System (ADS)

    Regadío, Alberto; Sánchez-Prieto, Sebastián

    2018-04-01

    The unfolding (or deconvolution) technique is used in the development of digital pulse processing systems applied to particle detection. This technique is applied to digital signals obtained by digitization of analog signals that represent the combined response of the particle detectors and the associated signal conditioning electronics. This work describes a technique to determine whether a signal is unfoldable. For unfoldable signals, the characteristics of the unfolding system (unfolder) are presented. Finally, examples of the method applied to a real experimental setup are discussed.

  15. Photoacoustic technique applied to the study of skin and leather

    NASA Astrophysics Data System (ADS)

    Vargas, M.; Varela, J.; Hernández, L.; González, A.

    1998-08-01

    In this paper the photoacoustic technique is used in bull skin for the determination of thermal and optical properties as a function of the tanning process steps. Our results show that the photoacoustic technique is sensitive to the study of physical changes in this kind of material due to the tanning process.

  16. Photo-reconnaissance applications of computer processing of images.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1971-01-01

    An image processing technique is developed for the enhancement and calibration of imaging experiments. The technique is shown to be useful not only for the original application but also when applied to images from a wide variety of sources.

  17. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Credille, Jennifer; Owens, Elizabeth

    This capstone offers the introduction of Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements in industrial settings. However, this paper demonstrates that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System, an innovative concept designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, which ensures the process is thoroughly dissected and can be achieved for any process in any work environment.

  18. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, allowing to maintain a desired process state and guaranteeing the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  19. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
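
    As a toy instance of the detection idea, the sketch below treats per-interval report counts as a noisy sensor signal, smooths them with a moving average, and flags z-score excursions as candidate events. This is a generic detector written for illustration, not the Apollo toolkit's algorithm.

      import numpy as np

      def detect_events(counts, window=12, z_thresh=4.0):
          """Flag intervals whose residual from a moving-average baseline
          exceeds a z-score threshold; a basic burst detector."""
          counts = np.asarray(counts, dtype=float)
          kernel = np.ones(window) / window
          baseline = np.convolve(counts, kernel, mode="same")
          resid = counts - baseline
          z = (resid - resid.mean()) / resid.std()
          return np.flatnonzero(z > z_thresh)

      rng = np.random.default_rng(2)
      signal = rng.poisson(20, 500).astype(float)
      signal[300:305] += 120      # injected burst standing in for a real-world event
      print("event intervals:", detect_events(signal))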

  20. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  1. Process Mining-Based Method of Designing and Optimizing the Layouts of Emergency Departments in Hospitals.

    PubMed

    Rismanchian, Farhood; Lee, Young Hoon

    2017-07-01

    This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing the relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method improved the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
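
    To make the goal-programming step concrete, a minimal sketch follows using scipy.optimize.linprog with deviation variables for two invented goals (a travel-distance ceiling and a design-preference target); the coefficients are placeholders, not data from the ED case study.

      import numpy as np
      from scipy.optimize import linprog

      # Variables: [x1, x2, d1m, d1p, d2m, d2p], where x1, x2 are decision
      # variables (say, placement extents of two units) and d*m/d*p are the
      # under/over-achievement deviations for two invented goals:
      #   travel distance:   3*x1 + 2*x2 + d1m - d1p = 12  (penalize overshoot d1p)
      #   preference score:  4*x1 + 5*x2 + d2m - d2p = 20  (penalize undershoot d2m)
      c = np.array([0, 0, 0, 1.0, 1.0, 0])   # minimize d1p + d2m
      A_eq = np.array([[3, 2, 1, -1, 0, 0],
                       [4, 5, 0, 0, 1, -1]], dtype=float)
      b_eq = np.array([12.0, 20.0])
      res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
      x1, x2, *devs = res.x
      print(f"x1={x1:.2f}, x2={x2:.2f}, deviations={np.round(devs, 2)}")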

  2. State-of-the-art of optics in China reviewed

    NASA Astrophysics Data System (ADS)

    Wang, Daheng; Wo, Xinneng

    1985-06-01

    The state of the art of optics and applied optics in China is reviewed. Developments in lasers, infrared and opto-electronic techniques, optical metrology, high-speed photography, holography and information processing, nonlinear optics, optical fiber communications and optical techniques are described. Directions for the further development of optics and applied optics in China are proposed.

  3. A NOVEL TECHNIQUE APPLYING SPECTRAL ESTIMATION TO JOHNSON NOISE THERMOMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ezell, N Dianne Bull; Britton Jr, Charles L; Roberts, Michael

    Johnson noise thermometry (JNT) is one of many important measurements used to monitor the safety levels and stability in a nuclear reactor. However, this measurement is very dependent on the electromagnetic environment. Properly removing unwanted electromagnetic interference (EMI) is critical for accurate, drift-free temperature measurements. The two techniques developed by Oak Ridge National Laboratory (ORNL) to remove transient and periodic EMI are briefly discussed in this document. Spectral estimation is a key component in the signal processing algorithm utilized for EMI removal and temperature calculation. Applying these techniques requires the simple addition of the electronics and signal processing to existing resistive thermometers.
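
    A minimal sketch of the spectral-estimation step may help. By the Johnson-Nyquist relation, the one-sided voltage PSD across a resistor is S_v = 4*k_B*T*R, so an averaged Welch estimate of the PSD over a clean band yields T = S_v / (4*k_B*R). The ORNL EMI-removal stages are omitted; the band limits, resistor value, and synthetic data are illustrative.

      import numpy as np
      from scipy.signal import welch

      k_B = 1.380649e-23        # Boltzmann constant, J/K

      def jnt_temperature(v, fs, R, band=(1e3, 1e5)):
          """Estimate temperature from a Johnson-noise voltage record via
          Welch's PSD estimate averaged over a (assumed EMI-free) band."""
          f, psd = welch(v, fs=fs, nperseg=4096)
          mask = (f >= band[0]) & (f <= band[1])
          return psd[mask].mean() / (4 * k_B * R)

      # synthetic check: white noise with the PSD of a 1 kOhm resistor at 300 K
      fs, R, T = 1e6, 1e3, 300.0
      sigma = np.sqrt(4 * k_B * T * R * fs / 2)   # variance = PSD * (fs / 2)
      v = np.random.default_rng(3).normal(0.0, sigma, 2_000_000)
      print(f"estimated T ~ {jnt_temperature(v, fs, R):.1f} K")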

  4. A Discriminative Approach to EEG Seizure Detection

    PubMed Central

    Johnson, Ashley N.; Sow, Daby; Biem, Alain

    2011-01-01

    Seizures are abnormal sudden discharges in the brain with signatures represented in electroencephalograms (EEG). The efficacy of applying speech processing techniques to discriminate between seizure and non-seizure states in EEGs is reported. The approach accounts for the challenges of unbalanced datasets (seizure and non-seizure), while also showing a system capable of real-time seizure detection. The Minimum Classification Error (MCE) algorithm, a discriminative learning algorithm in wide use in speech processing, is applied and compared with conventional classification techniques that have already been applied to the discrimination between seizure and non-seizure states in the literature. The system is evaluated on multi-channel EEG recordings of 22 pediatric patients. Experimental results show that the application of speech processing techniques and MCE compares favorably with conventional classification techniques in terms of classification performance, while requiring less computational overhead. The results strongly suggest the possibility of deploying the designed system at the bedside. PMID:22195192

  5. Applied in situ product recovery in ABE fermentation

    PubMed Central

    Lalander, Carl‐Axel; Lee, Jonathan G. M.; Davies, E. Timothy; Harvey, Adam P.

    2017-01-01

    The production of biobutanol is hindered by the product's toxicity to the bacteria, which limits the productivity of the process. In situ product recovery of butanol can improve the productivity by removing the source of inhibition. This paper reviews in situ product recovery techniques applied to the acetone butanol ethanol fermentation in a stirred tank reactor. Methods of in situ recovery include gas stripping, vacuum fermentation, pervaporation, liquid–liquid extraction, perstraction, and adsorption, all of which have been investigated for the acetone, butanol, and ethanol fermentation. All techniques have shown an improvement in substrate utilization, yield, productivity or both. Different fermentation modes favored different techniques. For batch processing gas stripping and pervaporation were most favorable, but in fed‐batch fermentations gas stripping and adsorption were most promising. During continuous processing perstraction appeared to offer the best improvement. The use of hybrid techniques can increase the final product concentration beyond that of single‐stage techniques. Therefore, the selection of an in situ product recovery technique would require comparable information on the energy demand and economics of the process. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:563–579, 2017 PMID:28188696

  6. An In-Process Surface Roughness Recognition System in End Milling Operations

    ERIC Educational Resources Information Center

    Yang, Lieh-Dai; Chen, Joseph C.

    2004-01-01

    To develop an in-process quality control system, a sensor technique and a decision-making algorithm need to be applied during machining operations. Several sensor techniques have been used in the in-process prediction of quality characteristics in machining operations. For example, an accelerometer sensor can be used to monitor the vibration of…

  7. Strategies for Fermentation Medium Optimization: An In-Depth Review

    PubMed Central

    Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.

    2017-01-01

    Optimization of the production medium is required to maximize metabolite yield. This can be achieved using a wide range of techniques, from the classical 'one-factor-at-a-time' approach to modern statistical and mathematical techniques such as artificial neural networks (ANN) and genetic algorithms (GA). Every technique comes with its own advantages and disadvantages, and despite drawbacks some techniques are applied to obtain the best results; using various optimization techniques in combination can also provide desirable results. In this article an attempt has been made to review the media optimization techniques currently applied during fermentation processes for metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been made, and a logical basis for the design of fermentation media is given. Overall, this review provides a rationale for selecting a suitable optimization technique for media design in the fermentation process of metabolite production. PMID:28111566
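
    As a concrete instance of the classical end of that spectrum, the sketch below implements a one-factor-at-a-time search over an invented two-component medium and response function. Note that OFAT can miss factor interactions, which is exactly why the statistical and intelligent methods reviewed here are often preferred.

      import numpy as np

      def ofat_optimize(yield_fn, factors, baseline):
          """One-factor-at-a-time search: vary each medium component over its
          levels while holding the others fixed, keeping the best level found."""
          best = dict(baseline)
          for name, levels in factors.items():
              scores = [yield_fn({**best, name: lv}) for lv in levels]
              best[name] = levels[int(np.argmax(scores))]
          return best, yield_fn(best)

      # hypothetical response surface for a two-component medium (g/L)
      def titer(m):
          return -(m["glucose"] - 25) ** 2 - 2 * (m["yeast_extract"] - 6) ** 2

      factors = {"glucose": [10, 20, 25, 30], "yeast_extract": [2, 4, 6, 8]}
      print(ofat_optimize(titer, factors, {"glucose": 10, "yeast_extract": 2}))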

  8. Synthesized af-PFCl and GG-g-P(AN)/TEOS hydrogel composite used in hybridized technique applied for AMD treatment

    NASA Astrophysics Data System (ADS)

    Fosso-Kankeu, Elvis

    2018-06-01

    In the present study af-PFCl, GL-g-P(AN) hydrogel and GL-g-P(AN)/TEOS hydrogel composite were synthesized. The hydrogels were characterized using Fourier transform infrared (FTIR) spectroscopy and scanning electron microscopy (SEM). The coagulant af-PFCl and the hydrogels were applied consecutively, in flocculation and adsorption processes respectively, for the treatment of acid mine drainage (AMD). It was observed that the grafting process increased the amount of binding groups on the hydrogels. The hybridization of the techniques assisted in the removal of anions, while the cations were mostly removed by the adsorption process. The adsorbents' behaviour was well described by the pseudo-second-order model. The adsorption capacities of the GL-g-P(AN)/TEOS hydrogel composite for the removal of Al, As and Zn were 3.89, 0.66 and 0.394 mg/g respectively, while the adsorption capacities of GL-g-P(AN) for the removal of Al and Mg were 3.47 and 9.66 mg/g respectively. The techniques applied in this study have shown good potential for the removal of specific pollutants from AMD; it is, however, important that the techniques be hybridized appropriately so as to remove all the pollutants and restore acceptable water quality.

  9. Applying Mixed Methods Techniques in Strategic Planning

    ERIC Educational Resources Information Center

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  10. Fuzzy neural network methodology applied to medical diagnosis

    NASA Technical Reports Server (NTRS)

    Gorzalczany, Marian B.; Deutsch-Mcleish, Mary

    1992-01-01

    This paper presents a technique for building expert systems that combines the fuzzy-set approach with artificial neural network structures. This technique can effectively deal with two types of medical knowledge: a nonfuzzy one and a fuzzy one which usually contributes to the process of medical diagnosis. Nonfuzzy numerical data is obtained from medical tests. Fuzzy linguistic rules describing the diagnosis process are provided by a human expert. The proposed method has been successfully applied in veterinary medicine as a support system in the diagnosis of canine liver diseases.
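
    A minimal sketch of the fuzzy side of such a system: a triangular membership function and one invented diagnostic rule evaluated with the min operator as the fuzzy AND. The lab variables, breakpoints, and rule are hypothetical, not taken from the veterinary application.

      import numpy as np

      def tri_mf(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      # hypothetical rule: IF ALT is 'high' AND bilirubin is 'elevated'
      # THEN support for liver disease
      alt, bilirubin = 180.0, 2.4                 # example (invented) test results
      mu_alt_high = tri_mf(alt, 100, 250, 400)
      mu_bili_elev = tri_mf(bilirubin, 1.2, 3.0, 5.0)
      support = min(mu_alt_high, mu_bili_elev)    # min as the fuzzy AND
      print(f"rule support: {support:.2f}")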

  11. Coupling Computer-Aided Process Simulation and ...

    EPA Pesticide Factsheets

    A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable

  12. Speckle noise reduction in quantitative optical metrology techniques by application of the discrete wavelet transformation

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Effective suppression of speckle noise content in interferometric data images can help in improving accuracy and resolution of the results obtained with interferometric optical metrology techniques. In this paper, novel speckle noise reduction algorithms based on the discrete wavelet transformation are presented. The algorithms proceed by: (a) estimating the noise level contained in the interferograms of interest, (b) selecting wavelet families, (c) applying the wavelet transformation using the selected families, (d) wavelet thresholding, and (e) applying the inverse wavelet transformation, producing denoised interferograms. The algorithms are applied to the different stages of the processing procedures utilized for generation of quantitative speckle correlation interferometry data of fiber-optic based opto-electronic holography (FOBOEH) techniques, allowing identification of optimal processing conditions. It is shown that wavelet algorithms are effective for speckle noise reduction while preserving image features otherwise faded with other algorithms.
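
    The sketch below shows the general shape of such a pipeline in Python with PyWavelets: estimate the noise level from the finest diagonal subband, soft-threshold the detail coefficients with a universal threshold, and invert the transform. The wavelet family, decomposition level, and synthetic speckled image are illustrative choices, not the paper's exact algorithm.

      import numpy as np
      import pywt

      def wavelet_denoise(img, wavelet="db4", level=3):
          """Soft-threshold the 2-D DWT detail coefficients using a universal
          threshold from a median-based noise estimate, then reconstruct."""
          coeffs = pywt.wavedec2(img, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise estimate
          thresh = sigma * np.sqrt(2 * np.log(img.size))
          denoised = [coeffs[0]] + [
              tuple(pywt.threshold(band, thresh, mode="soft") for band in detail)
              for detail in coeffs[1:]
          ]
          return pywt.waverec2(denoised, wavelet)[:img.shape[0], :img.shape[1]]

      rng = np.random.default_rng(4)
      clean = np.tile(np.linspace(0, 1, 128), (128, 1))       # synthetic test ramp
      noisy = clean * (1 + 0.2 * rng.standard_normal(clean.shape))  # multiplicative speckle
      print("residual RMS:", np.sqrt(np.mean((wavelet_denoise(noisy) - clean) ** 2)))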

  13. Application of off-line image processing for optimization in chest computed radiography using a low cost system.

    PubMed

    Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato

    2015-03-08

     The objective of this study was to improve the visibility of anatomical details by applying off-line postimage processing in chest computed radiography (CR). Four spatial domain-based external image processing techniques were developed by using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were implemented to sample images and their visual appearances confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with other 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and ranges of the average scores for three radiologists were characterized for each of the developed technique and imaging system. The Mann-Whitney U-test was used to test the difference of details visibility between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity values adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but should be implemented in consultations with the radiologists.
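
    For illustration, here is a minimal Python sketch of two of the technique families named in the study, percentile-based intensity-value adjustment followed by a small spatial linear (mean) filter; the parameters and test image are invented, and the study's actual MATLAB implementations are not reproduced.

      import numpy as np
      from scipy import ndimage

      def enhance_chest_cr(img, low_pct=2, high_pct=98, kernel=3):
          """(1) intensity-value adjustment via percentile windowing,
          (2) spatial linear filtering via a small mean kernel."""
          lo, hi = np.percentile(img, [low_pct, high_pct])
          stretched = np.clip((img - lo) / (hi - lo), 0, 1)      # contrast stretch
          return ndimage.uniform_filter(stretched, size=kernel)  # linear smoothing

      rng = np.random.default_rng(5)
      img = rng.normal(0.5, 0.1, (256, 256))   # stand-in for a CR image
      out = enhance_chest_cr(img)
      print(out.min(), out.max())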

  14. Application of off‐line image processing for optimization in chest computed radiography using a low cost system

    PubMed Central

    Msaki, Peter; Padovani, Renato

    2015-01-01

    The objective of this study was to improve the visibility of anatomical details by applying off‐line postimage processing in chest computed radiography (CR). Four spatial domain‐based external image processing techniques were developed by using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were implemented to sample images and their visual appearances confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with other 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and ranges of the average scores for three radiologists were characterized for each of the developed technique and imaging system. The Mann‐Whitney U‐test was used to test the difference of details visibility between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005≤p≤0.02) with combinations of intensity values adjustment and/or spatial linear filtering techniques for images acquired using 60≤kVp≤70. However, there was no improvement for images acquired using 102≤kVp≤107 (0.127≤p≤0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but should be implemented in consultations with the radiologists. PACS number: 87.59.−e, 87.59.−B, 87.59.−bd PMID:26103165

  15. Applied in situ product recovery in ABE fermentation.

    PubMed

    Outram, Victoria; Lalander, Carl-Axel; Lee, Jonathan G M; Davies, E Timothy; Harvey, Adam P

    2017-05-01

    The production of biobutanol is hindered by the product's toxicity to the bacteria, which limits the productivity of the process. In situ product recovery of butanol can improve the productivity by removing the source of inhibition. This paper reviews in situ product recovery techniques applied to the acetone butanol ethanol fermentation in a stirred tank reactor. Methods of in situ recovery include gas stripping, vacuum fermentation, pervaporation, liquid-liquid extraction, perstraction, and adsorption, all of which have been investigated for the acetone, butanol, and ethanol fermentation. All techniques have shown an improvement in substrate utilization, yield, productivity or both. Different fermentation modes favored different techniques. For batch processing gas stripping and pervaporation were most favorable, but in fed-batch fermentations gas stripping and adsorption were most promising. During continuous processing perstraction appeared to offer the best improvement. The use of hybrid techniques can increase the final product concentration beyond that of single-stage techniques. Therefore, the selection of an in situ product recovery technique would require comparable information on the energy demand and economics of the process. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:563-579, 2017. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.

  16. How High Is the Tramping Track? Mathematising and Applying in a Calculus Model-Eliciting Activity

    ERIC Educational Resources Information Center

    Yoon, Caroline; Dreyfus, Tommy; Thomas, Michael O. J.

    2010-01-01

    Two complementary processes involved in mathematical modelling are mathematising a realistic situation and applying a mathematical technique to a given realistic situation. We present and analyse work from two undergraduate students and two secondary school teachers who engaged in both processes during a mathematical modelling task that required…

  17. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases of the development life cycle; (2) reviews are emphasized in both system and software development (Fig. 1.3), and for some reviews (e.g., SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; and (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the quality assurance (QA) techniques at the system level and the software level.

  18. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement.

    PubMed

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence they have inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 is for very poor and 5 is for the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. This technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. GHE techniques can be used on low-contrast bone scan images. In some cases, a histogram equalization technique in combination with some other postprocessing technique is useful.
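
    The GHE transform itself is standard: map each grey level through the normalized cumulative histogram so intensities spread over the full range. A minimal NumPy sketch follows, with a synthetic count-limited image standing in for a bone scan.

      import numpy as np

      def global_histogram_equalization(img, levels=256):
          """Standard GHE: remap grey levels through the normalized CDF."""
          hist, bin_edges = np.histogram(img.ravel(), bins=levels,
                                         range=(img.min(), img.max()))
          cdf = hist.cumsum().astype(float)
          cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize to [0, 1]
          indices = np.digitize(img.ravel(), bin_edges[1:-1])
          return cdf[indices].reshape(img.shape) * (levels - 1)

      rng = np.random.default_rng(6)
      low_contrast = rng.poisson(8, (128, 128)).astype(float)  # count-limited image
      print("std before:", low_contrast.std(),
            "std after:", global_histogram_equalization(low_contrast).std())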

  19. Computer assisted analysis of auroral images obtained from high altitude polar satellites

    NASA Technical Reports Server (NTRS)

    Samadani, Ramin; Flynn, Michael

    1993-01-01

    Automatic techniques that allow the extraction of physically significant parameters from auroral images were developed. This allows the processing of a much larger number of images than is currently possible with manual techniques. Our techniques were applied to diverse auroral image datasets. These results were made available to geophysicists at NASA and at universities in the form of a software system that performs the analysis. After some feedback from users, an upgraded system was transferred to NASA and to two universities. The feasibility of user-trained search and retrieval of large amounts of data using our automatically derived parameter indices was demonstrated. Techniques based on classification and regression trees (CART) were developed and applied to broaden the types of images to which the automated search and retrieval may be applied. Our techniques were tested with DE-1 auroral images.

  20. Voyager image processing at the Image Processing Laboratory

    NASA Astrophysics Data System (ADS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-09-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  1. Voyager image processing at the Image Processing Laboratory

    NASA Technical Reports Server (NTRS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-01-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  2. Diagnostic Radiology--The Impact of New Technology.

    ERIC Educational Resources Information Center

    Harrison, R. M.

    1989-01-01

    Discussed are technological advances applying computer techniques for image acquisition and processing, including digital radiography, computed tomography, and nuclear magnetic resonance imaging. Several diagrams and pictures showing the use of each technique are presented. (YP)

  3. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for the investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses confirming that the observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with data produced using surrogate methods. While such methods exist for continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying the fine-scale dynamics of the observed signal while maintaining its macro structural characteristics. Comparison of entropy estimates indicated that observed signals had greater regularity than surrogates and were the result of not only stochastic but also deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.
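
    The paper's own surrogate algorithm and critical values are not reproduced here, but the sketch below illustrates the stated goal under one plausible reading: preserve macro structure via a low-order polynomial trend and destroy fine-scale dynamics by shuffling the residuals of a joint-angle-like series. All parameters are illustrative assumptions.

      import numpy as np

      def trend_preserving_surrogate(x, poly_order=6, rng=None):
          """Keep the macro structure (low-order polynomial trend) and destroy
          fine-scale dynamics (shuffle the residuals); illustrative only."""
          rng = np.random.default_rng(rng)
          t = np.arange(len(x))
          trend = np.polyval(np.polyfit(t, x, poly_order), t)
          resid = x - trend
          return trend + rng.permutation(resid)

      # joint-angle-like series: smooth movement plus structured fluctuation
      t = np.linspace(0, 1, 200)
      angle = 40 * np.sin(np.pi * t) + 2 * np.sin(24 * np.pi * t)
      surr = trend_preserving_surrogate(angle, rng=7)
      print("macro similarity:", np.corrcoef(angle, surr)[0, 1])            # stays high
      print("fine-scale similarity:", np.corrcoef(np.diff(angle), np.diff(surr))[0, 1])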

  4. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    An approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. Applying model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite model from a complex system.

  5. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal cases. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make the identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms", so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and by our method are evaluated and compared.

  6. Process-driven selection of information systems for healthcare

    NASA Astrophysics Data System (ADS)

    Mills, Stephen F.; Yeh, Raymond T.; Giroir, Brett P.; Tanik, Murat M.

    1995-05-01

    Integration of networking and data management technologies such as PACS, RIS and HIS into a healthcare enterprise in a clinically acceptable manner is a difficult problem. Data within such a facility are generally managed via a combination of manual hardcopy systems and proprietary, special-purpose data processing systems. Process modeling techniques have been successfully applied to engineering and manufacturing enterprises, but have not generally been applied to service-based enterprises such as healthcare facilities. The use of process modeling techniques can provide guidance for the placement, configuration and usage of PACS and other informatics technologies within the healthcare enterprise, and thus improve the quality of healthcare. Initial process modeling activities conducted within the Pediatric ICU at Children's Medical Center in Dallas, Texas are described. The ongoing development of a full enterprise- level model for the Pediatric ICU is also described.

  7. Time-Lapse Motion Picture Technique Applied to the Study of Geological Processes.

    PubMed

    Miller, R D; Crandell, D R

    1959-09-25

    Light-weight, battery-operated timers were built and coupled to 16-mm motion-picture cameras having apertures controlled by photoelectric cells. The cameras were placed adjacent to Emmons Glacier on Mount Rainier. The film obtained confirms the view that exterior time-lapse photography can be applied to the study of slow-acting geologic processes.

  8. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.

  9. Analyses of requirements for computer control and data processing experiment subsystems. Volume 1: ATM experiment S-056 image data processing system techniques development

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The solar imaging X-ray telescope experiment (designated the S-056 experiment) is described. It will photograph the sun in the far ultraviolet or soft X-ray region. Because of the imaging characteristics of this telescope and the necessity of using special techniques for capturing images on film at these wave lengths, methods were developed for computer processing of the photographs. The problems of image restoration were addressed to develop and test digital computer techniques for applying a deconvolution process to restore overall S-056 image quality. Additional techniques for reducing or eliminating the effects of noise and nonlinearity in S-056 photographs were developed.

  10. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement

    PubMed Central

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have found only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. Materials and Methods: A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera and then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 denotes very poor and 5 the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. Results: The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference between input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed to meet the requirements of nuclear medicine physicians. Conclusion: GHE techniques can be used on low-contrast bone scan images. In some cases, the histogram equalization technique is useful in combination with some other postprocessing technique. PMID:29142344
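
    As a concrete reference for the GHE step, the sketch below implements textbook global histogram equalization on an 8-bit image with NumPy; the array sizes and intensity band are illustrative, not taken from the study.

```python
import numpy as np

def global_histogram_equalization(img):
    """Remap an 8-bit grayscale image through its cumulative histogram so
    output levels are proportional to the fraction of pixels at or below
    each input level (the classic GHE transfer function)."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    lut = (cdf - cdf_min) / (img.size - cdf_min) * 255.0
    lut = np.clip(np.round(lut), 0, 255).astype(np.uint8)
    return lut[img]

# Synthetic low-contrast "scan": counts squeezed into a narrow band of levels
rng = np.random.default_rng(0)
low = rng.integers(90, 120, size=(64, 64), dtype=np.uint8)
eq = global_histogram_equalization(low)
print(low.min(), low.max(), "->", eq.min(), eq.max())  # contrast spread to 0..255
```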

  11. Process research of non-CZ silicon material

    NASA Technical Reports Server (NTRS)

    1983-01-01

    High risk, high payoff research areas associated with the Westinghouse process for producing photovoltaic modules using non-CZ sheet material were investigated. All work was performed using dendritic web silicon. The following tasks are discussed and the associated technical results are given: (1) determining the technical feasibility of forming front and back junctions in non-CZ silicon using dopant techniques; (2) determining the feasibility of forming a liquid-applied diffusion mask to replace the more costly chemical vapor deposited SiO2 diffusion mask; (3) determining the feasibility of applying liquid anti-reflective solutions using meniscus coating equipment; (4) studying the production of uniform, high efficiency solar cells using ion implantation junction formation techniques; and (5) quantifying cost improvements associated with process improvements.

  12. Applied photo interpretation for airbrush cartography

    NASA Technical Reports Server (NTRS)

    Inge, J. L.; Bridges, P. M.

    1976-01-01

    New techniques of cartographic portrayal have been developed for the compilation of maps of lunar and planetary surfaces. Conventional photo interpretation methods utilizing size, shape, shadow, tone, pattern, and texture are applied to computer processed satellite television images. The variety of the image data allows the illustrator to interpret image details by inter-comparison and intra-comparison of photographs. Comparative judgements are affected by illumination, resolution, variations in surface coloration, and transmission or processing artifacts. The validity of the interpretation process is tested by making a representational drawing by an airbrush portrayal technique. Production controls insure the consistency of a map series. Photo interpretive cartographic portrayal skills are used to prepare two kinds of map series and are adaptable to map products of different kinds and purposes.

  13. The Elicitation Interview Technique: Capturing People's Experiences of Data Representations.

    PubMed

    Hogan, Trevor; Hinrichs, Uta; Hornecker, Eva

    2016-12-01

    Information visualization has become a popular tool to facilitate sense-making, discovery and communication in a large range of professional and casual contexts. However, evaluating visualizations is still a challenge. In particular, we lack techniques to help understand how visualizations are experienced by people. In this paper we discuss the potential of the Elicitation Interview technique to be applied in the context of visualization. The Elicitation Interview is a method for gathering detailed and precise accounts of human experience. We argue that it can be applied to help understand how people experience and interpret visualizations as part of exploration and data analysis processes. We describe the key characteristics of this interview technique and present a study we conducted to exemplify how it can be applied to evaluate data representations. Our study illustrates the types of insights this technique can bring to the fore, for example, evidence for deep interpretation of visual representations and the formation of interpretations and stories beyond the represented data. We discuss general visualization evaluation scenarios where the Elicitation Interview technique may be beneficial and specify what needs to be considered when applying this technique in a visualization context specifically.

  14. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing images of neural cells to analyze their growth process in a culture environment. We applied several image processing techniques for: (1) environmental noise reduction; (2) neural cell segmentation; (3) neural cell classification based on the growth conditions of their dendrites; and (4) extraction and measurement of neuron features (e.g., cell body area, number of dendrites, and axon length). Due to the large amount of noise in the images, we used feed-forward artificial neural networks to detect edges more precisely.

  15. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1 micrometer thick PFI-38A i-line photoresist film prior to ion implant processing. Post stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across several combinations of current and energy.

  16. Segmentation Techniques for Expanding a Library Instruction Market: Evaluating and Brainstorming.

    ERIC Educational Resources Information Center

    Warren, Rebecca; Hayes, Sherman; Gunter, Donna

    2001-01-01

    Describes a two-part segmentation technique applied to an instruction program for an academic library during a strategic planning process. Discusses a brainstorming technique used to create a list of existing and potential audiences, and then describes a follow-up review session that evaluated the past years' efforts. (Author/LRW)

  17. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
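
    The slant-range correction mentioned above reduces, under a flat-seafloor assumption, to simple geometry; the sketch below illustrates that correction with hypothetical sample ranges and tow-fish altitude (the paper's actual processing chain is not reproduced).

```python
import numpy as np

def slant_to_ground_range(slant_range_m, altitude_m):
    """Flat-seafloor slant-range correction: ground = sqrt(slant^2 - alt^2).
    Samples nearer than the altitude are the water column (no seafloor return)."""
    sr = np.asarray(slant_range_m, dtype=float)
    return np.sqrt(np.maximum(sr ** 2 - altitude_m ** 2, 0.0))

# Hypothetical across-track sample ranges for a tow-fish 20 m off the bottom
slant = np.linspace(0.0, 150.0, 7)
print(slant_to_ground_range(slant, altitude_m=20.0))
```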

  18. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.

  19. Development Of Polarimetric Decomposition Techniques For Indian Forest Resource Assessment Using Radar Imaging Satellite (Risat-1) Images

    NASA Astrophysics Data System (ADS)

    Sridhar, J.

    2015-12-01

    The focus of this work is to examine polarimetric decomposition techniques, primarily Pauli decomposition and Sphere Di-Plane Helix (SDH) decomposition, for forest resource assessment. The data processing methods adopted are pre-processing (geometric correction and radiometric calibration), speckle reduction, image decomposition and image classification. Initially, to classify forest regions, unsupervised classification was applied to determine different unknown classes; the K-means clustering method was observed to give better results than the ISODATA method. Using the algorithm developed for Radar Tools, the code for the decomposition and classification techniques was written in Interactive Data Language (IDL) and applied to a RISAT-1 image of the Mysore-Mandya region of Karnataka, India. This region was chosen for studying forest vegetation and consists of agricultural lands, water and hilly regions. Polarimetric SAR data possess a high potential for classification of the earth's surface. After applying the decomposition techniques, classification was done by selecting regions of interest, and post-classification the overall accuracy was observed to be higher in the SDH-decomposed image, as SDH decomposition operates on individual pixels on a coherent basis and utilises the complete intrinsic coherent nature of polarimetric SAR data, making it particularly suited for analysis of high-resolution SAR data. The Pauli decomposition represents all the polarimetric information in a single SAR image, but interpretation of the resulting image is difficult. The SDH decomposition technique seems to produce better results and interpretation than the Pauli decomposition, though more quantification and further analysis are being done in this area of research. The comparison of polarimetric decomposition techniques and evolutionary classification techniques will be the scope of this work.
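
    For reference, the Pauli decomposition named above has a standard closed form on the scattering-matrix channels; the sketch below applies that textbook formula in Python to synthetic complex arrays standing in for a small RISAT-1 chip (the study's IDL code is not reproduced).

```python
import numpy as np

def pauli_rgb(s_hh, s_hv, s_vv):
    """Pauli decomposition of a reciprocal scattering matrix:
    |HH+VV| (odd-bounce), |HH-VV| (even-bounce), 2|HV| (volume),
    conventionally displayed as an RGB composite."""
    alpha = (s_hh + s_vv) / np.sqrt(2)   # surface / odd-bounce
    beta = (s_hh - s_vv) / np.sqrt(2)    # double-bounce
    gamma = np.sqrt(2) * s_hv            # cross-pol / volume
    return np.abs(alpha), np.abs(beta), np.abs(gamma)

# Hypothetical complex channels standing in for a 4x4-pixel scene
rng = np.random.default_rng(0)
shape = (4, 4)
s_hh = rng.normal(size=shape) + 1j * rng.normal(size=shape)
s_hv = rng.normal(size=shape) + 1j * rng.normal(size=shape)
s_vv = rng.normal(size=shape) + 1j * rng.normal(size=shape)
r, g, b = pauli_rgb(s_hh, s_hv, s_vv)
print(r.shape, g.mean(), b.max())
```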

  20. SHSG processing for three-wavelength HOEs recording in silver halide materials

    NASA Astrophysics Data System (ADS)

    Kim, Jong Man; Choi, Yoon S.; Bjelkhagen, Hans I.; Phillips, Nicholas J.

    2002-06-01

    The recording and processing technique for color HOEs in ultrafine-grain panchromatic silver halide emulsions is presented. It is possible to obtain high diffraction efficiency employing the silver halide sensitized gelatin (SHSG) process. SHSG holograms are similar to holograms recorded in dichromated gelatin (DCG); the drawback of DCG is its low sensitivity and limited spectral response, whereas panchromatic silver halide materials from Slavich can be processed in such a way that the final holograms have properties like a DCG hologram. The processing method, or microvoid technique, has been optimized for three-laser-wavelength recordings in Slavich PFG-03C emulsion. For example, applying this new processing technique, high-efficiency white holographic reflectors can be manufactured. The technique is also suitable for producing efficient color display holograms. In particular, masters for mass production of color holograms or color HOEs can be made by contact-copying into photopolymer materials, because the reconstruction wavelengths are identical to the recording wavelengths.

  1. Applying Ancestry and Sex Computation as a Quality Control Tool in Targeted Next-Generation Sequencing.

    PubMed

    Mathias, Patrick C; Turner, Emily H; Scroggins, Sheena M; Salipante, Stephen J; Hoffman, Noah G; Pritchard, Colin C; Shirts, Brian H

    2016-03-01

    To apply techniques for ancestry and sex computation from next-generation sequencing (NGS) data as an approach to confirm sample identity and detect sample processing errors. We combined a principal component analysis method with k-nearest neighbors classification to compute the ancestry of patients undergoing NGS testing. By combining this calculation with X chromosome copy number data, we determined the sex and ancestry of patients for comparison with self-report. We also modeled the sensitivity of this technique in detecting sample processing errors. We applied this technique to 859 patient samples with reliable self-report data. Our k-nearest neighbors ancestry screen had an accuracy of 98.7% for patients reporting a single ancestry. Visual inspection of principal component plots was consistent with self-report in 99.6% of single-ancestry and mixed-ancestry patients. Our model demonstrates that approximately two-thirds of potential sample swaps could be detected in our patient population using this technique. Patient ancestry can be estimated from NGS data incidentally sequenced in targeted panels, enabling an inexpensive quality control method when coupled with patient self-report. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
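
    A minimal sketch of the PCA-plus-k-nearest-neighbors idea described above, using scikit-learn on synthetic features; the feature construction, sample counts and ancestry labels are placeholders, not the study's actual pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
# Hypothetical stand-ins for variant-derived features of reference samples
X_ref = np.vstack([rng.normal(loc=c, size=(50, 20)) for c in (-1.0, 0.0, 1.0)])
y_ref = np.repeat(["ancestry_A", "ancestry_B", "ancestry_C"], 50)

pca = PCA(n_components=2).fit(X_ref)            # project to principal components
knn = KNeighborsClassifier(n_neighbors=5).fit(pca.transform(X_ref), y_ref)

X_new = rng.normal(loc=1.0, size=(3, 20))       # "patients" awaiting labels
print(knn.predict(pca.transform(X_new)))        # mismatch vs self-report flags a possible swap
```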

  2. Flood Detection/Monitoring Using Adjustable Histogram Equalization Technique

    PubMed Central

    Riaz, Muhammad Mohsin; Ghafoor, Abdul

    2014-01-01

    Flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (overenhancement, artifacts, and an unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and postimages and applies different processing steps for generating a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality compared to the existing state-of-the-art technique. PMID:24558332
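
    The abstract's pre/post differencing idea can be illustrated in a few lines; the sketch below uses a plain contrast stretch and a fixed threshold rather than the paper's adjustable histogram equalization, and all sizes and values are synthetic.

```python
import numpy as np

def flood_map(pre, post, threshold=30.0):
    """Toy pre/post change map: stretch both frames to a common 0-255 range,
    difference them, and flag large brightness changes. (The published method
    adjusts histogram equalization; only generic differencing is shown here.)"""
    def stretch(img):
        img = img.astype(float)
        return 255.0 * (img - img.min()) / (img.max() - img.min() + 1e-9)
    return np.abs(stretch(post) - stretch(pre)) > threshold

rng = np.random.default_rng(3)
pre = rng.integers(0, 256, size=(64, 64))
post = pre.copy()
post[20:40, 20:40] = 255            # synthetic inundated patch in the post-image
print(flood_map(pre, post).sum(), "pixels flagged as changed")
```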

  3. Extreme Learning Machine and Particle Swarm Optimization in optimizing CNC turning operation

    NASA Astrophysics Data System (ADS)

    Janahiraman, Tiagrajah V.; Ahmad, Nooraziah; Hani Nordin, Farah

    2018-04-01

    The CNC machine is controlled by manipulating cutting parameters that directly influence process performance. Many optimization methods have been applied to obtain the optimal cutting parameters for a desired performance function. Nonetheless, industry still uses traditional techniques to obtain those values; lack of knowledge of optimization techniques is the main reason this issue persists. Therefore, the simple yet easy-to-implement Optimal Cutting Parameters Selection System is introduced to help manufacturers easily understand and determine the best optimal parameters for their turning operations. This new system consists of two stages: modelling and optimization. In modelling of input-output and in-process parameters, a hybrid of the Extreme Learning Machine and Particle Swarm Optimization is applied. This modelling technique tends to converge faster than other artificial intelligence techniques and gives accurate results. For the optimization stage, Particle Swarm Optimization is again used to get the optimal cutting parameters based on the performance function preferred by the manufacturer. Overall, the system can reduce the gap between academia and industry by introducing a simple yet easy-to-implement optimization technique that gives accurate results while remaining fast.
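
    To make the optimization stage concrete, here is a minimal particle swarm optimizer over box-bounded cutting parameters; the cost function, bounds and PSO constants are illustrative assumptions, not the paper's tuned setup.

```python
import numpy as np

def pso(objective, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over box bounds (an assumed,
    generic configuration, not the paper's exact settings)."""
    rng = np.random.default_rng(42)
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Hypothetical surrogate cost over (cutting speed m/min, feed mm/rev):
# a smooth bowl standing in for an ELM-predicted roughness surface.
cost = lambda p: (p[0] - 180.0) ** 2 / 1e4 + 100.0 * (p[1] - 0.15) ** 2
best, fbest = pso(cost, lo=np.array([50.0, 0.05]), hi=np.array([300.0, 0.50]))
print("optimal parameters:", best, "cost:", fbest)
```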

  4. Image processing developments and applications for water quality monitoring and trophic state determination

    NASA Technical Reports Server (NTRS)

    Blackwell, R. J.

    1982-01-01

    Remote sensing data analysis for water quality monitoring is evaluated. Data analysis and image processing techniques are applied to LANDSAT remote sensing data to produce an effective operational tool for lake water quality surveying and monitoring. Digital image processing and analysis techniques were designed, developed, tested, and applied to LANDSAT multispectral scanner (MSS) data and conventional surface-acquired data. Utilization of these techniques facilitates the surveying and monitoring of large numbers of lakes in an operational manner. Supervised multispectral classification, when used in conjunction with surface-acquired water quality indicators, is used to characterize water body trophic status. Unsupervised multispectral classification, when interpreted by lake scientists familiar with a specific water body, yields classifications of equal validity with supervised methods and in a more cost-effective manner. Image data base technology is used to great advantage in characterizing other effects contributing to water quality, including drainage basin configuration, terrain slope, soil, precipitation and land cover characteristics.

  5. An Approach to the Evaluation of Hypermedia.

    ERIC Educational Resources Information Center

    Knussen, Christina; And Others

    1991-01-01

    Discusses methods that may be applied to the evaluation of hypermedia, based on six models described by Lawton. Techniques described include observation, self-report measures, interviews, automated measures, psychometric tests, checklists and criterion-based techniques, process models, Experimentally Measuring Usability (EMU), and a naturalistic…

  6. Influence of Processing Techniques on Microstructure and Mechanical Properties of a Biodegradable Mg-3Zn-2Ca Alloy

    PubMed Central

    Doležal, Pavel; Zapletal, Josef; Fintová, Stanislava; Trojanová, Zuzanka; Greger, Miroslav; Roupcová, Pavla; Podrábský, Tomáš

    2016-01-01

    New Mg-3Zn-2Ca magnesium alloy was prepared using different processing techniques: gravity casting as well as squeeze casting in the liquid and semisolid states. Materials were further thermally treated; thermal treatment of the gravity cast alloy was additionally combined with equal channel angular pressing (ECAP). Alloys processed by squeeze casting in the liquid as well as the semisolid state exhibit improved plasticity; the ECAP processing positively influenced both the tensile and compressive characteristics of the alloy. The applied heat treatment influenced the distribution and chemical composition of the intermetallic phases present. The influence of the particular processing techniques, heat treatment, and intermetallic phase distribution is thoroughly discussed in relation to the mechanical behavior of the presented alloys. PMID:28774000

  7. Nature of the optical information recorded in speckles

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.

    1998-09-01

    The process of encoding displacement information in electronic holographic interferometry is reviewed. Procedures to extend the applicability of this technique to large deformations are given. The proposed techniques are applied, and results from these experiments are compared with results obtained by other means. The similarity between the two sets of results illustrates the validity of the new techniques.

  8. A Method of Surrogate Model Construction which Leverages Lower-Fidelity Information using Space Mapping Techniques

    DTIC Science & Technology

    2014-03-27

    fidelity. This pairing is accomplished through the use of a space mapping technique, which is a process where the design space of a lower fidelity model...is aligned with a higher fidelity model. The intent of applying space mapping techniques to the field of surrogate construction is to leverage the

  9. Autoclave heat treatment for prealloyed powder products

    NASA Technical Reports Server (NTRS)

    Freche, J. C.; Ashbrook, R. L.

    1973-01-01

    Technique could be applied directly to loose powders as part of hot pressing process of forming them to any required shapes. This would eliminate initial extrusion step commonly applied to prealloyed powders, substantially reduce cost of forming operation, and result in optimum properties.

  10. Processing experiments on non-Czochralski silicon sheet

    NASA Technical Reports Server (NTRS)

    Pryor, R. A.; Grenon, L. A.; Sakiotis, N. G.; Pastirik, E. M.; Sparks, T. O.; Legge, R. N.

    1981-01-01

    A program is described which supports and promotes the development of processing techniques which may be successfully and cost-effectively applied to low-cost sheets for solar cell fabrication. Results are reported in the areas of process technology, cell design, cell metallization, and production cost simulation.

  11. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by means of a trained backpropagation NN to be applied to residual generation. Process supervision is then applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system is used to implement the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system, which could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA when applied to a heat exchanger.
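
    A toy illustration of residual-based FDI with a neural model: a network is fitted to nominal plant behavior and a fault is declared when the measurement deviates from the NN prediction. The static map, threshold and "heat exchanger" below are invented stand-ins; the paper uses dynamic backpropagation networks, not this static regression.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
# Toy "heat exchanger": outlet temperature as a static map of two inputs
# (a stand-in; the paper maps dynamics with dynamic backpropagation NNs)
U = rng.uniform(0.0, 1.0, size=(500, 2))
y = 20.0 + 30.0 * U[:, 0] - 10.0 * U[:, 1] + rng.normal(0.0, 0.2, 500)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                     random_state=0).fit(U, y)

def residual_alarm(u, y_measured, threshold=2.0):
    """Residual generation: compare the measurement against the NN-mapped
    nominal behavior; a large residual flags a sensor or plant fault."""
    residual = y_measured - model.predict(u.reshape(1, -1))[0]
    return abs(residual) > threshold, residual

u_test = np.array([0.5, 0.5])
nominal = 20.0 + 30.0 * 0.5 - 10.0 * 0.5
print(residual_alarm(u_test, nominal))         # healthy sensor: no alarm
print(residual_alarm(u_test, nominal + 5.0))   # biased sensor: alarm raised
```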

  12. Discovery of Information Diffusion Process in Social Networks

    NASA Astrophysics Data System (ADS)

    Kim, Kwanho; Jung, Jae-Yoon; Park, Jonghun

    Information diffusion analysis in social networks is of significance since it enables us to deeply understand dynamic social interactions among users. In this paper, we introduce approaches to discovering the information diffusion process in social networks based on process mining. Process mining techniques are applied from three perspectives: social network analysis, process discovery and community recognition. We then present experimental results using real-life social network data. The proposed techniques are expected to serve as new analytical tools in online social networks, such as blogs and wikis, for company marketers, politicians, news reporters and online writers.

  13. Interactive shape metamorphosis

    NASA Technical Reports Server (NTRS)

    Chen, David T.; State, Andrei; Banks, David

    1994-01-01

    A technique for controlled metamorphosis between surfaces in 3-space is described. Well-understood techniques to produce shape metamorphosis between models in a 2D parametric space are applied. The user selects morphable features interactively, and the morphing process executes in real time on a high-performance graphics multicomputer.

  14. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e., time series data. Such data lend themselves to a common set of statistical techniques and models designed to determine the trends and variability (e.g., seasonality) of these repeated observations. Often, the same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g., interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
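
    A small sketch of the kind of pre-processing such a toolbox automates (gap interpolation and aggregation before trend estimation), using pandas on a synthetic daily series; the variable names and the web application's internals are assumptions, not taken from the source.

```python
import numpy as np
import pandas as pd

# Hypothetical daily stage record with a gap, standing in for a hydrologic series
idx = pd.date_range("2015-01-01", periods=365, freq="D")
stage = pd.Series(5.0 + np.sin(np.arange(365) * 2.0 * np.pi / 365.0), index=idx)
stage.iloc[40:45] = np.nan                     # a short sensor outage

stage = stage.interpolate(method="time")       # fill the gap before analysis
monthly = stage.resample("MS").mean()          # aggregate to monthly means
slope = np.polyfit(np.arange(len(monthly)), monthly.to_numpy(), deg=1)[0]
print(f"linear trend: {slope:+.4f} units/month over {len(monthly)} months")
```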

  15. The implementation of portfolio assessment by the educators on the mathematics learning process in senior high school

    NASA Astrophysics Data System (ADS)

    Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar

    2018-05-01

    Portfolio assessment can show the development of learners' abilities over a period through their work, so that the monitored learning progress of each learner can be seen. The purpose of this research is to describe the implementation of portfolio assessment in the mathematics learning process, with senior high school mathematics teachers of class X as the subjects, because of the importance of applying this assessment for the progress of learners' outcomes. This research is of the descriptive qualitative type. Data were collected by observation, interview and documentation, and then validated using the triangulation of these three techniques. Data analysis was done by data reduction, data presentation and drawing conclusions. The results showed that the steps taken by teachers in applying portfolio assessment focused on learning outcomes; student learning outcomes include homework and daily tests. Based on the results of the research, it can be concluded that the implemented portfolio assessment takes the form of scored learning results. Teachers have not yet implemented other portfolio assessment techniques, such as student work.

  16. Bioremediation techniques applied to aqueous media contaminated with mercury.

    PubMed

    Velásquez-Riaño, Möritz; Benavides-Otaya, Holman D

    2016-12-01

    In recent years, the environmental and human health impacts of mercury contamination have driven the search for alternative, eco-efficient techniques different from the traditional physicochemical methods for treating this metal. One of these alternative processes is bioremediation. A comprehensive analysis of the different variables that can affect this process is presented. It focuses on determining the effectiveness of different techniques of bioremediation, with a specific consideration of three variables: the removal percentage, time needed for bioremediation and initial concentration of mercury to be treated in an aqueous medium.

  17. A novel data processing technique for image reconstruction of penumbral imaging

    NASA Astrophysics Data System (ADS)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    The CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with traditional processing techniques for penumbral coded-pinhole images, such as Wiener, Lucy-Richardson and blind deconvolution, this approach is brand new. In this method, the coded aperture processing was, for the first time, made independent of the point spread function of the imaging diagnostic system. In this way, the technical obstacle in traditional coded-pinhole image processing caused by the uncertainty of the point spread function was overcome. Based on the theoretical study, simulations of penumbral imaging and image reconstruction were carried out and provided fairly good results. In the visible light experiment, a point source of light was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering, and the penumbral image was recorded with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was used for image reconstruction and provided a fairly good result.
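
    As an indicative example of CT-style reconstruction (not the authors' penumbral algorithm), the sketch below exercises scikit-image's radon/iradon pair on a phantom; with real penumbral data, measured projections would replace the simulated forward projection.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Phantom stands in for the imaged object; real penumbral data would
# replace the forward projection below with measured projections.
image = rescale(shepp_logan_phantom(), 0.25)          # small for speed
theta = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = radon(image, theta=theta)                  # simulated projections
reconstruction = iradon(sinogram, theta=theta)        # filtered back-projection
print("RMS reconstruction error:", np.sqrt(np.mean((reconstruction - image) ** 2)))
```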

  18. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    PubMed Central

    Hernandez, Wilmar

    2007-01-01

    In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.

  19. NDE of ceramics and ceramic composites

    NASA Technical Reports Server (NTRS)

    Vary, Alex; Klima, Stanley J.

    1991-01-01

    Although nondestructive evaluation (NDE) techniques for ceramics are fairly well developed, in many cases they are difficult to apply for high-probability detection of the minute flaws that can cause failure in monolithic ceramics. Conventional NDE techniques are available for monolithic and fiber-reinforced ceramic matrix composites, but the more exact quantitative techniques that are needed are still being investigated and developed. Needs range from flaw detection below the 100-micron level in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in ceramic composites. NDE techniques that will ultimately be applicable to production and quality control of ceramic structures are still emerging from the lab. Needs differ depending on the processing stage, fabrication method, and nature of the finished product. NDE techniques are being developed in concert with materials processing research, where they can provide feedback to processing development and quality improvement. NDE techniques also serve as research tools for materials characterization and for understanding failure processes, e.g., during thermomechanical testing.

  20. Supercritical fluids as alternative, safe, food-processing media: an overview.

    PubMed

    Da Cruz Francisco, José; Szwajcer Dey, Estera

    2003-01-01

    The continuous growth of the world population and its concentration in urban areas require food supplies that are continuous, sufficient and of good quality. To resolve this problem, techniques have been developed for increasing food quantity and quality. The techniques are applied throughout the food chain, from production and conservation to distribution to the consumers (from "the field to the fork"). During the handling of food, chemicals are often deliberately added to achieve improved processing and better quality; this is one of the main reasons food undergoes different kinds of contamination. This overview focuses on the application of supercritical fluids as media for handling food materials during processing, with the perspective of reducing chemical contamination of food. Examples of developmental applications of this technique and of research work in progress are presented. Emphasis is given to extraction and biotransformation techniques.

  1. Evaluation of vibrated fluidized bed techniques in coating hemosorbents.

    PubMed

    Morley, D B

    1991-06-01

    A coating technique employing a vibrated fluidized bed was used to apply an ultrathin (2 microns) cellulose nitrate coating to synthetic bead activated charcoal. In vitro characteristics of the resulting coated sorbent, including permeability to model small and middle molecules, and mechanical integrity, were evaluated to determine the suitability of the process in coating granular sorbents used in hemoperfusion. Initial tests suggest the VFB-applied CN coating is both highly uniform and tightly adherent and warrants further investigation as a hemosorbent coating.

  2. Overview of Characterization Techniques for High Speed Crystal Growth

    NASA Technical Reports Server (NTRS)

    Ravi, K. V.

    1984-01-01

    Features of characterization requirements for crystals, devices and completed products are discussed. Key parameters of interest in semiconductor processing are presented. Characterization as it applies to process control, diagnostics and research needs is discussed with appropriate examples.

  3. Applying Nonverbal Techniques to Organizational Diagnosis.

    ERIC Educational Resources Information Center

    Tubbs, Stewart L.; Koske, W. Cary

    Ongoing research programs conducted at General Motors Institute are motivated by the practical objective of improving the company's organizational effectiveness. Computer technology is being used whenever possible; for example, a technique developed by Herman Chernoff was used to process data from a survey of employee attitudes into 18 different…

  4. Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models

    DTIC Science & Technology

    2008-08-01

    [Figure residue: campaign-level model inputs and outputs; aggregation, metamodeling, and complexity (spatial, temporal, etc.).] Techniques that reduce the variance of simulation outputs are called variance reduction techniques (VRT) [Law, 2006]. The implementation of some type of VRT can prove to be a very valuable tool

  5. Remote sensing and GIS for mapping groundwater recharge and discharge areas in salinity prone catchments, southeastern Australia

    NASA Astrophysics Data System (ADS)

    Tweed, Sarah O.; Leblanc, Marc; Webb, John A.; Lubczynski, Maciek W.

    2007-02-01

    Identifying groundwater recharge and discharge areas across catchments is critical for implementing effective strategies for salinity mitigation, surface-water and groundwater resource management, and ecosystem protection. In this study, a synergistic approach has been developed, which applies a combination of remote sensing and geographic information system (GIS) techniques to map groundwater recharge and discharge areas. This approach is applied to an unconfined basalt aquifer, in a salinity and drought prone region of southeastern Australia. The basalt aquifer covers ~11,500 km2 in an agriculturally intensive region. A review of local hydrogeological processes allowed a series of surface and subsurface indicators of groundwater recharge and discharge areas to be established. Various remote sensing and GIS techniques were then used to map these surface indicators including: terrain analysis, monitoring of vegetation activity, and mapping of infiltration capacity. All regions where groundwater is not discharging to the surface were considered potential recharge areas. This approach, applied systematically across a catchment, provides a framework for mapping recharge and discharge areas. A key component in assigning surface and subsurface indicators is the relevance to the dominant recharge and discharge processes occurring and the use of appropriate remote sensing and GIS techniques with the capacity to identify these processes.

  6. Stereo Image Ranging For An Autonomous Robot Vision System

    NASA Astrophysics Data System (ADS)

    Holten, James R.; Rogers, Steven K.; Kabrisky, Matthew; Cross, Steven

    1985-12-01

    The principles of stereo vision for three-dimensional data acquisition are well known and can be applied to the problem of an autonomous robot vehicle. Corresponding points in the two images are located, and the location of each point in three-dimensional space can then be calculated from the offset of the points and knowledge of the camera positions and geometry. This research investigates the application of artificial intelligence knowledge representation techniques as a means to apply heuristics to relieve the computational intensity of the low-level image processing tasks. Specifically, a new technique for image feature extraction is presented. This technique, the Queen Victoria Algorithm, uses formal language productions to process the image and characterize its features. The characterized features are then used for stereo image feature registration to obtain the required ranging information. The results can be used by an autonomous robot vision system for environmental modeling and path finding.
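
    The underlying ranging relation is the classic parallel-camera formula Z = f·B/d; a minimal sketch with hypothetical camera parameters follows (the Queen Victoria feature-extraction algorithm itself is not reproduced here).

```python
def depth_from_disparity(focal_px, baseline_m, x_left_px, x_right_px):
    """Classic parallel-camera stereo ranging: Z = f * B / d, where d is the
    horizontal offset (disparity) of the same point in the two images."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must shift between the two views")
    return focal_px * baseline_m / disparity

# Hypothetical rig: 800 px focal length, cameras 0.3 m apart
print(depth_from_disparity(800, 0.3, x_left_px=420, x_right_px=400), "m")  # 12.0 m
```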

  7. Applied photo interpretation for airbrush cartography

    NASA Technical Reports Server (NTRS)

    Inge, J. L.; Bridges, P. M.

    1976-01-01

    Lunar and planetary exploration has required the development of new techniques of cartographic portrayal. Conventional photo-interpretive methods employing size, shape, shadow, tone, pattern, and texture are applied to computer-processed satellite television images. Comparative judgements are affected by illumination, resolution, variations in surface coloration, and transmission or processing artifacts. The portrayal of tonal densities in a relief illustration is performed using a unique airbrush technique derived from hill-shading of contour maps. The control of tone and line quality is essential because the mid-gray to dark tone densities must be finalized prior to the addition of highlights to the drawing. This is done with an electric eraser until the drawing is completed. The drawing density is controlled with a reflectance-reading densitometer to meet certain density guidelines. The versatility of planetary photo-interpretive methods for airbrushed map portrayals is demonstrated by the application of these techniques to the synthesis of nonrelief data.

  8. Elucidating rhizosphere processes by mass spectrometry - A review.

    PubMed

    Rugova, Ariana; Puschenreiter, Markus; Koellensperger, Gunda; Hann, Stephan

    2017-03-01

    The presented review discusses state-of-the-art mass spectrometric methods which have been developed and applied for the investigation of chemical processes in the soil-root interface, the so-called rhizosphere. The physical and chemical characteristics of rhizosphere soil are to a great extent influenced by a complex mixture of compounds released from plant roots, i.e. root exudates, which have a high impact on nutrient and trace element dynamics in the soil-root interface as well as on microbial activities and soil physico-chemical characteristics. Chemical characterization as well as accurate quantification of the compounds present in the rhizosphere is a major prerequisite for a better understanding of rhizosphere processes and requires the development and application of advanced sampling procedures in combination with highly selective and sensitive analytical techniques. During the last years, targeted and non-targeted mass spectrometry-based methods have emerged, and their combination with specific separation methods for various elements and compounds over a wide polarity range has been successfully applied in several studies. With this review we critically discuss the work that has been conducted within the last decade in the context of rhizosphere research and elemental or molecular mass spectrometry, emphasizing different separation techniques such as GC, LC and CE. Moreover, selected applications such as metal detoxification or nutrient acquisition are discussed with regard to the mass spectrometric techniques applied in studies of root exudates in plant-bacteria interactions. Additionally, isotope probing, a more recent mass spectrometry-based application, is highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Solid Phase Extraction (SPE) for Biodiesel Processing and Analysis

    DTIC Science & Technology

    2017-12-13

    Several methods can be applied to the development of separation techniques that may replace the necessary water wash steps in biodiesel refinement. Unfortunately, the most common methods are poorly suited or face high costs when applied to diesel purification. Distillation is

  10. New polyimide polymer has excellent processing characteristics with improved thermo-oxidative and hydrolytic stabilities

    NASA Technical Reports Server (NTRS)

    Jones, R. J.; Vaughan, R. W.; Kendrick, W. P.

    1972-01-01

    Polyimide P10P and its processing technique apply to most high temperature plastic products, devices and castings. The prepolymer, when used as a varnish, impregnates fibers directly and can be processed into advanced composites. The material may also be used as a molding powder and adhesive.

  11. Application of Statistical Quality Control Techniques to Detonator Fabrication: Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J. Frank

    1971-05-20

    A feasibility study was performed on the use of process control techniques which might reduce the need for duplicate inspection by production inspection and quality control inspection. Two active detonator fabrication programs were selected for the study. Inspection areas accounting for the greatest percentage of total inspection costs were selected by applying "Pareto's Principle of Maldistribution." Data from these areas were then gathered and analyzed by a process capability study.

  12. Pulsed-neutron imaging by a high-speed camera and center-of-gravity processing

    NASA Astrophysics Data System (ADS)

    Mochiki, K.; Uragaki, T.; Koide, J.; Kushima, Y.; Kawarabayashi, J.; Taketani, A.; Otake, Y.; Matsumoto, Y.; Su, Y.; Hiroi, K.; Shinohara, T.; Kai, T.

    2018-01-01

    Pulsed-neutron imaging is an attractive technique in the research field of energy-resolved neutron radiography; RANS (RIKEN) and RADEN (J-PARC/JAEA) are small and large accelerator-driven pulsed-neutron facilities for such imaging, respectively. To overcome the insufficient spatial resolution of counting-type imaging detectors like the μNID, nGEM and pixelated detectors, camera detectors combined with a neutron color image intensifier were investigated. At RANS, the center-of-gravity technique was applied to spot images obtained by a CCD camera, and the technique was confirmed to be effective for improving spatial resolution. At RADEN, a high-frame-rate CMOS camera was used, a super-resolution technique was applied, and the spatial resolution was further improved.
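
    The center-of-gravity step is just an intensity-weighted centroid over each detected spot; the sketch below shows that computation on a synthetic 5×5 spot (detector geometry and real spot data are not modeled).

```python
import numpy as np

def center_of_gravity(spot):
    """Sub-pixel localization: the intensity-weighted mean position of a
    detected spot, which is what lets the CoG step beat the pixel pitch."""
    yy, xx = np.mgrid[0:spot.shape[0], 0:spot.shape[1]]
    total = spot.sum()
    return (yy * spot).sum() / total, (xx * spot).sum() / total

# A synthetic 5x5 scintillation spot whose true center sits just off-grid
spot = np.array([[0, 1, 2, 1, 0],
                 [1, 4, 8, 4, 1],
                 [2, 8, 16, 9, 2],
                 [1, 4, 9, 5, 1],
                 [0, 1, 2, 1, 0]], dtype=float)
print(center_of_gravity(spot))   # slightly below/right of pixel (2, 2)
```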

  13. Identification of Optimum Magnetic Behavior of Nanocrystalline Co2FeAl-Type Heusler Alloy Powders Using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Srivastava, Y.; Srivastava, S.; Boriwal, L.

    2016-09-01

    Mechanical alloying is a novel solid-state process that has received considerable attention due to its many advantages over other conventional processes. In the present work, Co2FeAl Heusler alloy powder was prepared successfully from a premix of basic powders of cobalt (Co), iron (Fe) and aluminum (Al) in the stoichiometry 60Co-26Fe-14Al (weight %) by a novel mechano-chemical route. Magnetic properties of the mechanically alloyed powders were characterized by vibrating sample magnetometer (VSM). A two-factor, five-level design matrix was applied to the experimental process, and the experimental results were used for response surface methodology. The interaction between the input process parameters and the response was established with the help of regression analysis, and the analysis of variance technique was further applied to check the adequacy of the developed model and the significance of the process parameters. A test case study was performed with parameters that were not selected for the main experimentation but lay within the same ranges. With response surface methodology, the process parameters must be optimized to obtain improved magnetic properties; the optimum process parameters were identified using numerical and graphical optimization techniques.

  14. Hybrid Signal Processing Technique to Improve the Defect Estimation in Ultrasonic Non-Destructive Testing of Composite Structures

    PubMed Central

    Raisutis, Renaldas; Samaitis, Vykintas

    2017-01-01

    This work proposes a novel hybrid signal processing technique to extract information on disbond-type defects from a single B-scan in the process of non-destructive testing (NDT) of glass fiber reinforced plastic (GFRP) material using ultrasonic guided waves (GW). The selected GFRP sample was a segment of a wind turbine blade with an aerodynamic shape. Two disbond-type defects having diameters of 15 mm and 25 mm were artificially constructed on its trailing edge. The experiment was performed using the low-frequency ultrasonic system developed at the Ultrasound Institute of Kaunas University of Technology, and only one side of the sample was accessed. A special configuration of the transmitting and receiving transducers, fixed on a movable panel with a separation distance of 50 mm, was proposed for recording the ultrasonic guided wave signals at each one-millimeter step along the scanning distance of up to 500 mm. Finally, the hybrid signal processing technique, comprising the valuable features of the three most promising signal processing techniques (cross-correlation, wavelet transform, and Hilbert-Huang transform), was applied to the received signals for the extraction of defect information from a single B-scan image. The wavelet transform and cross-correlation techniques were combined in order to extract the approximate size and location of the defects and measurements of time delays. Thereafter, the Hilbert-Huang transform was applied to the wavelet-transformed signal to compare the variation of instantaneous frequencies and instantaneous amplitudes of the defect-free and defective signals. PMID:29232845
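
    Of the three combined methods, the cross-correlation stage is the easiest to isolate; the sketch below estimates the arrival-time difference between a reference burst and a delayed copy via the correlation peak. The sampling rate, burst shape and delays are synthetic placeholders, not the study's measurements.

```python
import numpy as np

def time_delay(reference, delayed, fs):
    """Arrival-time difference from the cross-correlation peak, the same
    kind of measurement the hybrid pipeline uses for defect location."""
    corr = np.correlate(delayed, reference, mode="full")
    lag = corr.argmax() - (len(reference) - 1)
    return lag / fs

fs = 1_000_000                                       # 1 MHz sampling (hypothetical)
t = np.arange(400) / fs
burst = np.sin(2 * np.pi * 50_000 * t) * np.hanning(400)
sig = np.zeros(2000); sig[300:700] += burst          # reference arrival
echo = np.zeros(2000); echo[450:850] += 0.6 * burst  # delayed, attenuated copy
print(time_delay(sig, echo, fs), "s")                # ~1.5e-4 s (150 samples)
```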

  15. Building livable communities with transit : planning, developing and implementing community-sensitive transit

    DOT National Transportation Integrated Search

    1999-09-01

    This booklet presents some of the successes of the community-sensitive transportation facility development process. Although a comprehensive process is described here, not every project involves the full range of steps. By applying the techniques out...

  16. A Nursing Process Methodology.

    ERIC Educational Resources Information Center

    Ryan-Wenger, Nancy M.

    1990-01-01

    A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)

  17. Some recent developments in headspace gas chromatography

    Treesearch

    J.Y. Zhu; X.-S. Chai

    2005-01-01

    In this study, recent developments in headspace gas chromatography (HSGC) are briefly reviewed. Several novel HSGC techniques developed recently are presented in detail. These techniques were developed using the unique characteristics of the headspace sampling process implemented in commercial HSGC systems and therefore can be easily applied in laboratory and...

  18. Adaptive Educational Software by Applying Reinforcement Learning

    ERIC Educational Resources Information Center

    Bennane, Abdellah

    2013-01-01

    The introduction of intelligence into teaching software is the object of this paper. In the software elaboration process, learning techniques are used in order to adapt the teaching software to the characteristics of the student. Generally, artificial intelligence techniques like reinforcement learning and Bayesian networks are used in order to adapt…

  19. Modeling User Behavior in Computer Learning Tasks.

    ERIC Educational Resources Information Center

    Mantei, Marilyn M.

    Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar, which is built by applying grammar induction heuristics on a sequential…

  20. Numerical simulation of the control of the three-dimensional transition process in boundary layers

    NASA Technical Reports Server (NTRS)

    Kral, L. D.; Fasel, H. F.

    1990-01-01

    Surface heating techniques to control the three-dimensional laminar-turbulent transition process are numerically investigated for a water boundary layer. The Navier-Stokes and energy equations are solved using a fully implicit finite difference/spectral method. The spatially evolving boundary layer is simulated. Results of both passive and active methods of control are shown for small amplitude two-dimensional and three-dimensional disturbance waves. Control is also applied to the early stages of the secondary instability process using passive or active control techniques.

  1. Adhesive performance of a multi-mode adhesive system: 1-year in vitro study.

    PubMed

    Marchesi, Giulio; Frassetto, Andrea; Mazzoni, Annalisa; Apolonio, Fabianni; Diolosà, Marina; Cadenaro, Milena; Di Lenarda, Roberto; Pashley, David H; Tay, Franklin; Breschi, Lorenzo

    2014-05-01

    The aim of this study was to investigate the adhesive stability over time of a multi-mode one-step adhesive applied using different bonding techniques on human coronal dentine. The hypotheses tested were that microtensile bond strength (μTBS), interfacial nanoleakage expression and matrix metalloproteinases (MMPs) activation are not affected by the adhesive application mode (following the use of self-etch technique or with the etch-and-rinse technique on dry or wet dentine) or by ageing for 24h, 6 months and 1year in artificial saliva. Human molars were cut to expose middle/deep dentine and assigned to one of the following bonding systems (N=15): (1) Scotchbond Universal (3M ESPE) self-etch mode, (2) Scotchbond Universal etch-and-rinse technique on wet dentine, (3) Scotchbond Universal etch-and-rinse technique on dry dentine, and (4) Prime&Bond NT (Dentsply De Trey) etch-and-rinse technique on wet dentine (control). Specimens were processed for μTBS test in accordance with the non-trimming technique and stressed to failure after 24h, 6 months or 1 year. Additional specimens were processed and examined to assay interfacial nanoleakage and MMP expression. At baseline, no differences between groups were found. After 1 year of storage, Scotchbond Universal applied in the self-etch mode and Prime&Bond NT showed higher μTBS compared to the other groups. The lowest nanoleakage expression was found for Scotchbond Universal applied in the self-etch mode, both at baseline and after storage. MMPs activation was found after application of each tested adhesive. The results of this study support the use of the self-etch approach for bonding the tested multi-mode adhesive system to dentine due to improved stability over time. Improved bonding effectiveness of the tested universal adhesive system on dentine may be obtained if the adhesive is applied with the self-etch approach. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Scheduling multirobot operations in manufacturing by truncated Petri nets

    NASA Astrophysics Data System (ADS)

    Chen, Qin; Luh, J. Y.

    1995-08-01

    Scheduling of operational sequences in manufacturing processes is one of the important problems in automation. Methods of applying Petri nets to model and analyze the problem with constraints on precedence relations, multiple resource allocation, etc., are available in the literature. Searching for an optimum schedule can be implemented by combining the branch-and-bound technique with the execution of the timed Petri net. The process usually produces a large Petri net which is practically unmanageable. This disadvantage, however, can be handled by a truncation technique which divides the original large Petri net into several smaller subnets. The complexity involved in analyzing each subnet individually is greatly reduced. However, when the locally optimum schedules of the resulting subnets are combined, they may not yield an overall optimum schedule for the original Petri net. To circumvent this problem, algorithms are developed based on the concepts of Petri net execution and a modified branch-and-bound process. The developed technique is applied to a multi-robot task scheduling problem in a manufacturing work cell.

  3. A Program in Semiconductor Processing.

    ERIC Educational Resources Information Center

    McConica, Carol M.

    1984-01-01

    A graduate program at Colorado State University which focuses on integrated circuit processing is described. The program utilizes courses from several departments while allowing students to apply chemical engineering techniques to an integrated circuit fabrication research topic. Information on employment of chemical engineers by electronics…

  4. Frequency-wavenumber processing for infrasound distributed arrays.

    PubMed

    Costley, R Daniel; Frazier, W Garth; Dillion, Kevin; Picucci, Jennifer R; Williams, Jay E; McKenna, Mihan H

    2013-10-01

    The work described herein discusses the application of a frequency-wavenumber signal processing technique to signals from rectangular infrasound arrays for detection and estimation of the direction of travel of infrasound. Arrays of 100 sensors were arranged in square configurations with sensor spacing of 2 m. Wind noise data were collected at one site. Synthetic infrasound signals were superposed on top of the wind noise to determine the accuracy and sensitivity of the technique with respect to signal-to-noise ratio. The technique was then applied to an impulsive event recorded at a different site. Preliminary results demonstrated the feasibility of this approach.
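
    A reduced illustration of frequency-wavenumber processing on a line (rather than square) array: a 2-D FFT over space and time concentrates a plane wave at its frequency and wavenumber, from which apparent speed (and, with a 2-D array, direction) follows. The array geometry and signal parameters below are hypothetical, not the paper's 100-sensor configuration data.

```python
import numpy as np

# Hypothetical line of 100 sensors at 2 m spacing, 50 Hz sampling
n_sens, dx, fs, n_t = 100, 2.0, 50.0, 512
c_true, f0 = 340.0, 5.0                    # propagation speed and tone frequency
t = np.arange(n_t) / fs
x = np.arange(n_sens) * dx
# Plane wave crossing the array: each sensor sees the tone spatially delayed
data = np.sin(2.0 * np.pi * f0 * (t[None, :] - x[:, None] / c_true))

# Frequency-wavenumber spectrum: 2-D FFT over (space, time)
fk = np.abs(np.fft.fftshift(np.fft.fft2(data)))
k_axis = np.fft.fftshift(np.fft.fftfreq(n_sens, d=dx))      # cycles per metre
f_axis = np.fft.fftshift(np.fft.fftfreq(n_t, d=1.0 / fs))   # hertz
i_k, i_f = np.unravel_index(fk.argmax(), fk.shape)
print("apparent speed ~", abs(f_axis[i_f] / k_axis[i_k]), "m/s")  # near 340
```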

  5. Controlling basins of attraction in a neural network-based telemetry monitor

    NASA Technical Reports Server (NTRS)

    Bell, Benjamin; Eilbert, James L.

    1988-01-01

    The size of the basins of attraction around fixed points in recurrent neural nets (NNs) can be modified by a training process. Controlling these attractive regions by presenting training data with various amounts of noise added to the prototype signal vectors is discussed. Applying this technique to signal processing yields a classification system whose sensitivity can be controlled. The new technique is applied to the classification of temporal sequences in telemetry data.
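
    A toy sketch of the noise-training idea, assuming a Hopfield-style network trained with a perceptron-type rule so that noisy copies of prototype vectors map back to their prototypes; the training noise level then acts as a knob on basin size. Network size, learning rule, and noise levels are illustrative assumptions, and how strongly the recall gap shows depends on them.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 64, 8                                  # neurons, stored prototypes
    protos = rng.choice([-1.0, 1.0], size=(P, N))

    def noisy(v, p):
        return np.where(rng.random(v.shape) < p, -v, v)   # flip bits w.p. p

    def train(train_noise, epochs=50, samples=20):
        """Perceptron-style rule: each noisy probe must map, in one update
        step, onto its prototype; larger training noise demands (and tends
        to produce) wider basins of attraction."""
        W = np.zeros((N, N))
        for _ in range(epochs):
            for proto in protos:
                for _ in range(samples):
                    x = noisy(proto, train_noise)
                    W += np.outer(proto - np.sign(W @ x), x) / N
        np.fill_diagonal(W, 0.0)
        return W

    def recall_rate(W, probe_noise, trials=200):
        ok = 0
        for _ in range(trials):
            proto = protos[rng.integers(P)]
            x = noisy(proto, probe_noise)
            for _ in range(20):                   # relax toward a fixed point
                x_new = np.where(W @ x >= 0, 1.0, -1.0)
                if np.array_equal(x_new, x):
                    break
                x = x_new
            ok += np.array_equal(x, proto)
        return ok / trials

    for train_noise in (0.0, 0.15, 0.3):
        W = train(train_noise)
        print(f"train noise {train_noise:.2f}: recall from 30%-corrupted "
              f"probes = {recall_rate(W, 0.3):.2f}")
    ```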

  6. Processing, quality and safety of irradiation - and high pressure processed meat and seafood products

    USDA-ARS?s Scientific Manuscript database

    In the past two decades, worldwide demands for meat and seafood products have increased dramatically due to the improved economical condition in many countries. To meet the demand, the producers have increased the production of meat and seafood products as well as applied new processing techniques t...

  7. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents.

    PubMed

    Teichgräber, Ulf K; de Bucourt, Maximilian

    2012-01-01

    OBJECTIVES: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current state VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. The current state VSM demonstrated that, of the 13 processes for the procurement of stents, only 2 were value-adding. Of the NVA processes, 5 were unnecessary NVA activities that could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast-driven push system. The future state VSM applies a pull inventory control system that triggers the movement of a unit after withdrawal by using a consignment stock. VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System, and greatly assists in successfully implementing a Lean system. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Image processing analysis on the air-water slug two-phase flow in a horizontal pipe

    NASA Astrophysics Data System (ADS)

    Dinaryanto, Okto; Widyatama, Arif; Majid, Akmal Irfan; Deendarlianto, Indarto

    2016-06-01

    Slug flow is an intermittent flow regime that is avoided in industrial applications because of its irregularity and high pressure fluctuation. These characteristics cause problems such as internal corrosion and damage to pipeline constructions. To understand slug characteristics, measurement techniques such as wire-mesh sensors, CECM, and high speed cameras can be applied. The present study aimed to determine slug characteristics using image processing techniques. Experiments were carried out in a 26 mm i.d., 9 m long horizontal acrylic pipe. The air-water flow was recorded 5 m downstream of the air-water mixer using a high speed video camera. Each image sequence was processed in MATLAB. The algorithm applies image complement, background subtraction, and image filtering to produce binary images; special treatment is also applied to reduce the disturbing effect of dispersed bubbles around the gas slug. The binary images are then used to trace bubble contours and calculate slug parameters such as gas slug length, gas slug velocity, and slug frequency. As a result, the effects of superficial gas velocity and superficial liquid velocity on the fundamental parameters can be understood. Comparison with previous experimental results shows that image processing is a useful and promising technique for explaining slug characteristics.
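
    The record describes a MATLAB pipeline; a rough Python/OpenCV analogue of the described steps (complement, background subtraction, filtering, binarisation, contour extraction) is sketched below. File names, kernel sizes, and the pixel-to-mm calibration are placeholder assumptions.

    ```python
    import cv2
    import numpy as np

    # Grayscale frame and a bubble-free background frame (placeholder paths).
    frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
    background = cv2.imread("background.png", cv2.IMREAD_GRAYSCALE)

    diff = cv2.absdiff(cv2.bitwise_not(frame),          # image complement
                       cv2.bitwise_not(background))     # background subtraction
    smooth = cv2.medianBlur(diff, 5)                    # image filtering
    _, binary = cv2.threshold(smooth, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # "Special treatment": morphological opening removes specks left by
    # dispersed bubbles around the gas slug.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    clean = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

    contours, _ = cv2.findContours(clean, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        slug = max(contours, key=cv2.contourArea)       # largest blob = gas slug
        x, y, w, h = cv2.boundingRect(slug)
        mm_per_px = 26.0 / h            # assume the slug spans the 26 mm bore
        print(f"gas slug length ~ {w * mm_per_px:.1f} mm")
    # Tracking x across frames gives slug velocity; counting slugs per unit
    # time gives slug frequency.
    ```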

  9. Metabolomic approach for discrimination of processed ginseng genus (Panax ginseng and Panax quinquefolius) using UPLC-QTOF MS

    PubMed Central

    Park, Hee-Won; In, Gyo; Kim, Jeong-Han; Cho, Byung-Goo; Han, Gyeong-Ho; Chang, Il-Moo

    2013-01-01

    Discriminating between two herbal medicines (Panax ginseng and Panax quinquefolius) with similar chemical and physical properties but different therapeutic effects is a serious and difficult problem. Differentiating between the two processed ginseng genera is even more difficult because their appearance is very similar. An ultraperformance liquid chromatography-quadrupole time-of-flight mass spectrometry (UPLC-QTOF MS)-based metabolomic technique was applied for the metabolite profiling of 40 samples of processed P. ginseng and processed P. quinquefolius. Currently known biomarkers such as ginsenoside Rf and F11 have been used for analysis with a UPLC-photodiode array detector; however, this method was not able to fully discriminate between the two processed ginseng genera. Thus, an optimized UPLC-QTOF-based metabolic profiling method was adapted for the analysis and evaluation of the two processed ginseng genera. As a result, all known biomarkers were identified by the proposed metabolomics, and additional potential biomarkers were extracted from the huge amounts of global analysis data. It is therefore expected that such metabolomics techniques will be widely applied in the ginseng research field. PMID:24558312

  10. Rapid Structured Volume Grid Smoothing and Adaption Technique

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2006-01-01

    A rapid, structured volume grid smoothing and adaption technique, based on signal processing methods, was developed and applied to the Shuttle Orbiter at hypervelocity flight conditions in support of the Columbia Accident Investigation. Because of the fast pace of the investigation, computational aerothermodynamicists, applying hypersonic viscous flow solving computational fluid dynamic (CFD) codes, refined and enhanced a grid for an undamaged baseline vehicle to assess a variety of damage scenarios. Of the many methods available to modify a structured grid, most are time-consuming and require significant user interaction. By casting the grid data into different coordinate systems, specifically two computational coordinates with arclength as the third coordinate, signal processing methods are used for filtering the data [Taubin, CG v/29 1995]. Using a reverse transformation, the processed data are used to smooth the Cartesian coordinates of the structured grids. By coupling the signal processing method with existing grid operations within the Volume Grid Manipulator tool, problems related to grid smoothing are solved efficiently and with minimal user interaction. Examples of these smoothing operations are illustrated for reductions in grid stretching and volume grid adaptation. In each of these examples, other techniques existed at the time of the Columbia accident, but the incorporation of signal processing techniques reduced the time to perform the corrections by nearly 60%. This reduction in time to perform the corrections therefore enabled the assessment of approximately twice the number of damage scenarios than previously possible during the allocated investigation time.
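
    The cited filter is Taubin's signal-processing smoother; a minimal sketch on a single grid line (a polyline standing in for one coordinate curve of the volume grid) is shown below. Applying it to a full structured grid, as in the record, would loop this over each grid line in each computational direction; the lambda/mu values and pass count are conventional choices, not the tool's settings.

    ```python
    import numpy as np

    def umbrella_laplacian(pts):
        """Discrete Laplacian of an open polyline: average of the two
        neighbours minus the point itself; endpoints stay fixed."""
        lap = np.zeros_like(pts)
        lap[1:-1] = 0.5 * (pts[:-2] + pts[2:]) - pts[1:-1]
        return lap

    def taubin_smooth(pts, lam=0.5, mu=-0.53, passes=50):
        """Taubin's lambda|mu smoothing: a shrink step followed by an
        inflate step, which filters high-frequency noise without the
        shrinkage of plain Laplacian smoothing (Taubin, 1995)."""
        for _ in range(passes):
            pts = pts + lam * umbrella_laplacian(pts)
            pts = pts + mu * umbrella_laplacian(pts)
        return pts

    # A grid line cast into (arclength, coordinate) space, with jitter
    # standing in for grid stretching irregularities.
    s = np.linspace(0.0, 1.0, 101)
    line = np.column_stack([s, np.sin(2 * np.pi * s)])
    noisy = line + 0.02 * np.random.default_rng(0).standard_normal(line.shape)
    smoothed = taubin_smooth(noisy)
    print("rms error before:", np.sqrt(np.mean((noisy - line) ** 2)))
    print("rms error after: ", np.sqrt(np.mean((smoothed - line) ** 2)))
    ```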

  11. Rapid Structured Volume Grid Smoothing and Adaption Technique

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2004-01-01

    A rapid, structured volume grid smoothing and adaption technique, based on signal processing methods, was developed and applied to the Shuttle Orbiter at hypervelocity flight conditions in support of the Columbia Accident Investigation. Because of the fast pace of the investigation, computational aerothermodynamicists, applying hypersonic viscous flow solving computational fluid dynamic (CFD) codes, refined and enhanced a grid for an undamaged baseline vehicle to assess a variety of damage scenarios. Of the many methods available to modify a structured grid, most are time-consuming and require significant user interaction. By casting the grid data into different coordinate systems, specifically two computational coordinates with arclength as the third coordinate, signal processing methods are used for filtering the data [Taubin, CG v/29 1995]. Using a reverse transformation, the processed data are used to smooth the Cartesian coordinates of the structured grids. By coupling the signal processing method with existing grid operations within the Volume Grid Manipulator tool, problems related to grid smoothing are solved efficiently and with minimal user interaction. Examples of these smoothing operations are illustrated for reduction in grid stretching and volume grid adaptation. In each of these examples, other techniques existed at the time of the Columbia accident, but the incorporation of signal processing techniques reduced the time to perform the corrections by nearly 60%. This reduction in time to perform the corrections therefore enabled the assessment of approximately twice the number of damage scenarios than previously possible during the allocated investigation time.

  12. Membrane processing technology in the food industry: food processing, wastewater treatment, and effects on physical, microbiological, organoleptic, and nutritional properties of foods.

    PubMed

    Kotsanopoulos, Konstantinos V; Arvanitoyannis, Ioannis S

    2015-01-01

    Membrane processing technology (MPT) is increasingly used nowadays in a wide range of applications (demineralization, desalination, stabilization, separation, deacidification, reduction of microbial load, purification, etc.) in the food industry. The most frequently applied techniques are electrodialysis (ED), reverse osmosis (RO), nanofiltration (NF), ultrafiltration (UF), and microfiltration (MF). Several membrane characteristics, such as pore size and flow properties, together with the applied hydraulic pressure, mainly determine a membrane's potential uses. In this review paper the basic membrane techniques, their potential applications to a large number of fields and products in the food industry, the main advantages and disadvantages of these methods, fouling phenomena, and their effects on the organoleptic, qualitative, and nutritional value of foods are synoptically described. Some representative examples of traditional and modern membrane applications, in both tabular and figural form, are also provided.

  13. Hall versus conventional stainless steel crown techniques: in vitro investigation of marginal fit and microleakage using three different luting agents.

    PubMed

    Erdemci, Zeynep Yalçınkaya; Cehreli, S Burçak; Tirali, R Ebru

    2014-01-01

    This study's purpose was to investigate microleakage and marginal discrepancies in stainless steel crowns (SSCs) placed using conventional and Hall techniques and cemented with three different luting agents. Seventy-eight human primary maxillary second molars were randomly assigned to two groups (N=39), and SSCs were applied either with the Hall or conventional technique. These two groups were further subgrouped according to the material used for crown cementation (N=13 per group). Two specimens in each group were processed for scanning electron microscopy investigation. The extent of microleakage and marginal fit was quantified in millimeters on digitally photographed sections using image analysis software. The data were compared with a two-way independent and a two-way mixed analysis of variance (P=.05). The scores in the Hall group were significantly worse than those in the conventional technique group (P<.05). In both groups, resin cement displayed the lowest extent of microleakage, followed by glass ionomer and polycarboxylate cements (P<.05). Stainless steel crowns applied using the Hall technique displayed higher microleakage scores than those applied using the conventional technique, regardless of the cementation material. When the interaction of the material and technique was assessed, resin cement presented as the best choice for minimizing microleakage in both techniques.

  14. A novel technique to solve nonlinear higher-index Hessenberg differential-algebraic equations by Adomian decomposition method.

    PubMed

    Benhammouda, Brahim

    2016-01-01

    Since 1980, the Adomian decomposition method (ADM) has been extensively used as a simple, powerful tool that applies directly to solve different kinds of nonlinear equations, including functional, differential, integro-differential and algebraic equations. However, for differential-algebraic equations (DAEs) the ADM has been applied in only four earlier works. There, the DAEs are first pre-processed by transformations like index reductions before the ADM is applied. The drawback of such transformations is that they can involve complex algorithms, can be computationally expensive and may lead to non-physical solutions. The purpose of this paper is to propose a novel technique that applies the ADM directly to solve a class of nonlinear higher-index Hessenberg DAE systems efficiently. The main advantages of this technique are that, first, it avoids complex transformations like index reductions and leads to a simple general algorithm; second, it reduces the computational work by solving only linear algebraic systems with a constant coefficient matrix at each iteration, except for the first iteration, where the algebraic system is nonlinear (if the DAE is nonlinear with respect to the algebraic variable). To demonstrate the effectiveness of the proposed technique, we apply it to a nonlinear index-three Hessenberg DAE system with nonlinear algebraic constraints. The technique is straightforward and can be programmed in Maple or Mathematica to simulate real application problems.
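
    To make the ADM recursion concrete, here is a minimal sympy sketch on a scalar nonlinear ODE (not the paper's Hessenberg DAE algorithm): the nonlinearity is expanded in Adomian polynomials and each series component is obtained by integrating the previous polynomial. For y' = y^2, y(0) = 1, the partial sums reproduce the Taylor series of the exact solution 1/(1 - t).

    ```python
    import sympy as sp

    t, lam = sp.symbols("t lambda")
    n_terms = 6

    def adomian_polynomials(nonlinearity, components):
        """A_n = (1/n!) d^n/dlam^n N(sum_k y_k lam^k) evaluated at lam = 0."""
        expansion = sum(c * lam**k for k, c in enumerate(components))
        expr = nonlinearity(expansion)
        return [sp.diff(expr, lam, n).subs(lam, 0) / sp.factorial(n)
                for n in range(len(components))]

    y = [sp.Integer(1)]                           # y_0 = initial condition
    for n in range(n_terms - 1):
        A = adomian_polynomials(lambda u: u**2, y)
        y.append(sp.integrate(A[n], (t, 0, t)))   # y_{n+1} = integral of A_n

    print(sp.expand(sum(y)))   # 1 + t + t**2 + t**3 + t**4 + t**5
    ```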

  15. Usefulness of Simultaneous EEG-NIRS Recording in Language Studies

    ERIC Educational Resources Information Center

    Wallois, F.; Mahmoudzadeh, M.; Patil, A.; Grebe, R.

    2012-01-01

    One of the most challenging tasks in neuroscience language studies is investigating the brain's ability to integrate and process information. This task can only be successfully addressed by applying various assessment techniques integrated into a multimodal approach. Each of these techniques has its advantages and disadvantages, but helps to…

  16. Performance Evaluation of Various STL File Mesh Refining Algorithms Applied for FDM-RP Process

    NASA Astrophysics Data System (ADS)

    Ledalla, Siva Rama Krishna; Tirupathi, Balaji; Sriram, Venkatesh

    2018-06-01

    Layered manufacturing machines use the stereolithography (STL) file to build parts. When a curved surface is converted from a computer aided design (CAD) file to STL, geometric distortion and chordal error result. Parts manufactured from such a file might not satisfy geometric dimensioning and tolerance requirements because of the approximated geometry. Current algorithms built into CAD packages have export options to globally reduce this distortion, which leads to an increase in file size and pre-processing time. In this work, different mesh subdivision algorithms are applied to the STL file of a part with complex geometric features using MeshLab software. The algorithms considered are the modified butterfly subdivision technique, the Loop subdivision technique, and the general triangular midpoint subdivision technique. A comparative study is made with respect to volume and build time using the above techniques. The triangular midpoint subdivision algorithm is found to be the most suitable for the geometry under consideration. The wheel cap part is then manufactured on a Stratasys MOJO FDM machine, and its surface roughness is measured on a Talysurf surface roughness tester.

  17. Genetic programming based ensemble system for microarray data classification.

    PubMed

    Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To

    2015-01-01

    Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
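
    The sketch below shows only the combination operators named above (Min, Max, Average) applied to a fixed ensemble of decision trees with random feature selection and resampling; the GP evolution of the ensemble, which is the heart of GPES, is not reproduced. Data set, tree depth, and ensemble size are illustrative.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Toy stand-in for microarray data: many features, few samples.
    X, y = make_classification(n_samples=120, n_features=500,
                               n_informative=20, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.33, random_state=0)

    rng = np.random.default_rng(0)
    members = []
    for _ in range(15):
        feats = rng.choice(X.shape[1], size=40, replace=False)   # feature selection
        rows = rng.choice(len(Xtr), size=len(Xtr), replace=True) # resampling
        clf = DecisionTreeClassifier(max_depth=3, random_state=0)
        clf.fit(Xtr[rows][:, feats], ytr[rows])
        members.append((feats, clf))

    probs = np.stack([clf.predict_proba(Xte[:, feats])[:, 1]
                      for feats, clf in members])     # (members, samples)

    for name, op in (("Min", np.min), ("Max", np.max), ("Average", np.mean)):
        pred = (op(probs, axis=0) >= 0.5).astype(int)
        print(f"{name:7s} combiner accuracy: {(pred == yte).mean():.2f}")
    ```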

  18. Genetic Programming Based Ensemble System for Microarray Data Classification

    PubMed Central

    Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To

    2015-01-01

    Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved. PMID:25810748

  19. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The various physical processes that occur in the gas turbine combustor and the development of analytical models that accurately describe these processes are discussed. Aspects covered include fuel sprays; fluid mixing; combustion dynamics; radiation and chemistry and numeric techniques which can be applied to highly turbulent, recirculating, reacting flow fields.

  20. Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling

    NASA Astrophysics Data System (ADS)

    Vidovič, Luka; Milanič, Matija; Majaron, Boris

    2014-02-01

    Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of bruise healing. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying healing processes is possible. Objective fitting enables an objective comparison between simulated and experimental PPTR signals; in this manner, we avoid reconstruction of laser-induced depth profiles and the inherent loss of information in that process. This approach enables us to determine the value of hemoglobin mass diffusivity, which remains controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.

  1. Planning and executing motions for multibody systems in free-fall. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan M.

    1991-01-01

    The purpose of this research is to develop an end-to-end system that can be applied to a multibody system in free-fall to analyze its possible motions, save those motions in a database, and design a controller that can execute those motions. A goal is for the process to be highly automated and involve little human intervention. Ideally, the output of the system would be data and algorithms that could be put in ROM to control the multibody system in free-fall. The research applies to more than just robots in space. It applies to any multibody system in free-fall. Mathematical techniques from nonlinear control theory were used to study the nature of the system dynamics and its possible motions. Optimization techniques were applied to plan motions. Image compression techniques were proposed to compress the precomputed motion data for storage. A linearized controller was derived to control the system while it executes preplanned trajectories.

  2. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process so-called "biological big data", now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article brings a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  3. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process so-called "biological big data", now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article brings a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.

  4. High Resolution Imaging of the Sun with CORONAS-1

    NASA Technical Reports Server (NTRS)

    Karovska, Margarita

    1998-01-01

    We applied several image restoration and enhancement techniques to CORONAS-I images. We characterized the point spread function (PSF) using the unique capability of the blind iterative deconvolution (BID) technique, which recovers the real PSF at a given location and time of observation when limited a priori information is available on its characteristics. We also applied image enhancement techniques to extract the small scale structure embedded in bright large scale structures on the disk and on the limb. The results demonstrate the capability of image post-processing to substantially increase the yield from space observations by improving the resolution and reducing noise in the images.
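
    A skeletal version of blind iterative deconvolution in the Ayers-Dainty spirit: alternate Wiener-like updates of object and PSF in the Fourier domain, with non-negativity and PSF normalisation enforced in the image domain. This is a naive sketch, not the processing actually applied to CORONAS-I; practical use needs a better PSF initialisation and regularisation to avoid trivial solutions.

    ```python
    import numpy as np

    def blind_deconvolve(blurred, iters=200, eps=1e-3):
        """Alternate Wiener-like updates of object and PSF in the Fourier
        domain, enforcing non-negativity (and unit-sum PSF) in the image
        domain after each half-step."""
        G = np.fft.fft2(blurred)
        obj = np.clip(blurred.copy(), 0, None)          # start from the data
        psf = np.full(blurred.shape, 1.0 / blurred.size)
        for _ in range(iters):
            P = np.fft.fft2(psf)
            obj = np.real(np.fft.ifft2(G * np.conj(P) / (np.abs(P)**2 + eps)))
            obj = np.clip(obj, 0, None)                 # image-domain constraint
            O = np.fft.fft2(obj)
            psf = np.real(np.fft.ifft2(G * np.conj(O) / (np.abs(O)**2 + eps)))
            psf = np.clip(psf, 0, None)
            psf /= psf.sum() + 1e-12                    # keep PSF normalised
        return obj, psf

    # Synthetic test: a compact bright feature blurred by a (corner-centred,
    # circular-convolution) Gaussian PSF.
    truth = np.zeros((64, 64)); truth[20:24, 30:34] = 1.0
    fx, fy = np.meshgrid(np.fft.fftfreq(64), np.fft.fftfreq(64))
    psf0 = np.exp(-(fx**2 + fy**2) / (2 * 0.03**2)); psf0 /= psf0.sum()
    blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(psf0)))
    restored, psf_est = blind_deconvolve(blurred)
    ```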

  5. Likelihood-based inference for discretely observed birth-death-shift processes, with applications to evolution of mobile genetic elements.

    PubMed

    Xu, Jason; Guttorp, Peter; Kato-Maeda, Midori; Minin, Vladimir N

    2015-12-01

    Continuous-time birth-death-shift (BDS) processes are frequently used in stochastic modeling, with many applications in ecology and epidemiology. In particular, such processes can model the evolutionary dynamics of transposable elements, which are important genetic markers in molecular epidemiology. Estimation of the effects of individual covariates on the birth, death, and shift rates of the process can be accomplished by analyzing patient data, but inferring these rates in a discretely and unevenly observed setting presents computational challenges. We propose a multi-type branching process approximation to BDS processes and develop a corresponding expectation maximization algorithm, where we use spectral techniques to reduce the calculation of expected sufficient statistics to low-dimensional integration. These techniques yield an efficient and robust optimization routine for inferring the rates of the BDS process, and apply broadly to multi-type branching processes whose rates can depend on many covariates. After rigorously testing our methodology in simulation studies, we apply our method to study the intrapatient time evolution of the IS6110 transposable element, a genetic marker frequently used in estimating epidemiological clusters of Mycobacterium tuberculosis infections. © 2015, The International Biometric Society.

  6. Assessing the Utility of the Nominal Group Technique as a Consensus-Building Tool in Extension-Led Avian Influenza Response Planning

    ERIC Educational Resources Information Center

    Kline, Terence R.

    2013-01-01

    The intent of the project described was to apply the Nominal Group Technique (NGT) to achieve a consensus on Avian Influenza (AI) planning in Northeastern Ohio. The Nominal Group Technique is a process first developed by Delbecq, Van de Ven, and Gustafson (1975) to allow all participants to have an equal say in an open forum setting. A very diverse…

  7. The angler specialization concept applied: New York’s Salmon River anglers

    Treesearch

    Chad P. Dawson; Tommy L. Brown; Nancy Connelly

    1992-01-01

    The concept of angler specialization was applied to a study of Salmon River anglers to test this concept when using a variety of angling techniques and two species groups within the same environmental setting. A revision of the concept is suggested to account for angler expectancy and cognitive processes.

  8. Interference detection and correction applied to incoherent-scatter radar power spectrum measurement

    NASA Technical Reports Server (NTRS)

    Ying, W. P.; Mathews, J. D.; Rastogi, P. K.

    1986-01-01

    A median filter based interference detection and correction technique is evaluated and the method applied to the Arecibo incoherent scatter radar D-region ionospheric power spectrum is discussed. The method can be extended to other kinds of data when the statistics involved in the process are still valid.

  9. Analyzing Structure and Function of Vascularization in Engineered Bone Tissue by Video-Rate Intravital Microscopy and 3D Image Processing.

    PubMed

    Pang, Yonggang; Tsigkou, Olga; Spencer, Joel A; Lin, Charles P; Neville, Craig; Grottkau, Brian

    2015-10-01

    Vascularization is a key challenge in tissue engineering. Three-dimensional structure and microcirculation are two fundamental parameters for evaluating vascularization. Microscopic techniques with cellular level resolution, fast continuous observation, and robust 3D postimage processing are essential for evaluation, but have not been applied previously because of technical difficulties. In this study, we report novel video-rate confocal microscopy and 3D postimage processing techniques to accomplish this goal. In an immune-deficient mouse model, vascularized bone tissue was successfully engineered using human bone marrow mesenchymal stem cells (hMSCs) and human umbilical vein endothelial cells (HUVECs) in a poly (D,L-lactide-co-glycolide) (PLGA) scaffold. Video-rate (30 FPS) intravital confocal microscopy was applied in vitro and in vivo to visualize the vascular structure in the engineered bone and the microcirculation of the blood cells. Postimage processing was applied to perform 3D image reconstruction, by analyzing microvascular networks and calculating blood cell viscosity. The 3D volume reconstructed images show that the hMSCs served as pericytes stabilizing the microvascular network formed by HUVECs. Using orthogonal imaging reconstruction and transparency adjustment, both the vessel structure and blood cells within the vessel lumen were visualized. Network length, network intersections, and intersection densities were successfully computed using our custom-developed software. Viscosity analysis of the blood cells provided functional evaluation of the microcirculation. These results show that by 8 weeks, the blood vessels in peripheral areas function quite similarly to the host vessels. However, the viscosity drops about fourfold where it is only 0.8 mm away from the host. In summary, we developed novel techniques combining intravital microscopy and 3D image processing to analyze the vascularization in engineered bone. These techniques have broad applicability for evaluating vascularization in other engineered tissues as well.

  10. Floating-point scaling technique for sources separation automatic gain control

    NASA Astrophysics Data System (ADS)

    Fermas, A.; Belouchrani, A.; Ait-Mohamed, O.

    2012-07-01

    Based on the floating-point representation, and taking advantage of the scaling factor indeterminacy in blind source separation (BSS) processing, we propose a scaling technique applied to the separation matrix to avoid saturation or weakness in the recovered source signals. This technique performs automatic gain control in an on-line BSS environment. We demonstrate its effectiveness in the implementation of a division-free BSS algorithm with two inputs and two outputs. The proposed technique is computationally cheap and efficient for a hardware implementation compared to Euclidean normalisation.
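
    The abstract does not give the exact scaling rule; one plausible, hardware-friendly realization, sketched below, is to rescale each row of the separation matrix by a power of two, which only manipulates the floating-point exponent and therefore needs no true division.

    ```python
    import math
    import numpy as np

    def agc_scale_rows(W):
        """Rescale each row of the separation matrix by a power of two so
        that its largest coefficient lies in [0.5, 1): the scaling is exact
        in floating point and cheap in hardware, and it keeps the BSS
        output amplitudes in a usable range."""
        W = np.asarray(W, dtype=float).copy()
        for i, row in enumerate(W):
            peak = np.max(np.abs(row))
            if peak > 0:
                _, exp = math.frexp(peak)      # peak = m * 2**exp, m in [0.5, 1)
                W[i] = np.ldexp(row, -exp)     # divide the row by 2**exp exactly
        return W

    W = np.array([[1200.0, -40.0], [0.001, 0.0004]])
    print(agc_scale_rows(W))   # each row's max magnitude now lies in [0.5, 1)
    ```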

  11. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of applications of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.

  12. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.

    1981-01-01

    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.

  13. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043

  14. Industrial benefits and future expectations in materials and processes resulting from space technology

    NASA Technical Reports Server (NTRS)

    Meyer, J. D.

    1977-01-01

    Space technology transfer is discussed as applied to the field of materials science. Advances made in processing include improved computer techniques, and structural analysis. Technology transfer is shown to have an important impact potential in the overall productivity of the United States.

  15. Leak Detection by Acoustic Emission Monitoring. Phase 1. Feasibility Study

    DTIC Science & Technology

    1994-05-26

    various signal processing and noise discrimination techniques during the Data Processing task. C. TEST DESCRIPTION 1. Laboratory Tests Following normal...success in applying these methods to discriminating between the AE bursts generated by two close AE sources in a section of an aircraft structure

  16. Dorm Renovations: To Increase Enrollment or Maintain Status Quo?

    ERIC Educational Resources Information Center

    Stauff, William

    2001-01-01

    Explores South Carolina's Erskine College's planning and decision making process to renovate all seven of its dormitories over three summers. Discusses how the organization built a strong relationship with the contractor, successfully utilized outside architectural and consulting services, applied facilities management techniques in the process,…

  17. Process Mining Online Assessment Data

    ERIC Educational Resources Information Center

    Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul

    2009-01-01

    Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of the underlying educational processes, for…

  18. The use of applied software for the professional training of students studying humanities

    NASA Astrophysics Data System (ADS)

    Sadchikova, A. S.; Rodin, M. M.

    2017-01-01

    Research practice is an integral part of the training of humanities students. Accordingly, the training of students studying humanities should incorporate modern information technologies. This paper examines the most popular applied software products used for data processing in the social sciences. For testing purposes we selected the most commonly preferred professional packages: MS Excel, IBM SPSS Statistics, STATISTICA, and STADIA. The article also reports testing results for the specialized software Prikladnoy Sotsiolog, which supports the preparatory stage of research. The software packages were tested over one term in groups of humanities students.

  19. Application of advanced multidisciplinary analysis and optimization methods to vehicle design synthesis

    NASA Technical Reports Server (NTRS)

    Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.

  20. 41 CFR 102-118.35 - What definitions apply to this part?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... published formats and codes as authorized by the applicable Federal Information Processing Standards... techniques for carrying out transportation transactions using electronic transmissions of the information...

  1. Comparing Pattern Recognition Feature Sets for Sorting Triples in the FIRST Database

    NASA Astrophysics Data System (ADS)

    Proctor, D. D.

    2006-07-01

    Pattern recognition techniques have been used with increasing success to cope with the tremendous amounts of data generated by automated surveys. Usually this process involves the construction of training sets, typical examples of data with known classifications. Given a feature set, along with the training set, statistical methods can be employed to generate a classifier, which is then applied to process the remaining data. Feature set selection, however, is still an issue. This paper presents techniques developed to accommodate data for which a substantive portion of the training set cannot be classified unambiguously, a typical case for low-resolution data. Significance tests on the sort-ordered, sample-size-normalized vote distribution of an ensemble of decision trees are introduced as a method of evaluating the relative quality of feature sets. The technique is applied to comparing feature sets for sorting a particular radio galaxy morphology, bent-doubles, from the Faint Images of the Radio Sky at Twenty Centimeters (FIRST) database. Alternative functional forms for feature sets are also examined. Associated standard deviations provide the means to evaluate the effects of the number of folds, the number of classifiers per fold, and the sample size on the resulting classifications. The technique may also be applied to situations in which accurate classifications are available but the feature set is clearly inadequate, and one nonetheless wishes to make the best of the available information.

  2. State of the art in on-line techniques coupled to flow injection analysis FIA/on-line- a critical review

    PubMed Central

    Puchades, R.; Maquieira, A.; Atienza, J.; Herrero, M. A.

    1990-01-01

    Flow injection analysis (FIA) has emerged as an increasingly used laboratory tool in chemical analysis. Employment of the technique for on-line sample treatment and on-line measurement in chemical process control is a growing trend. This article reviews the recent applications of FIA. Most papers refer to on-line sample treatment. Although FIA is very well suited to continuous on-line process monitoring, few examples have been found in this area; most of them have been applied to water treatment or fermentation processes. PMID:18925271

  3. The development of a method of producing etch resistant wax patterns on solar cells

    NASA Technical Reports Server (NTRS)

    Pastirik, E.

    1980-01-01

    A potentially attractive technique for wax masking of solar cells prior to etching processes was studied. This technique made use of a reusable wax composition which was applied to the solar cell in patterned form by means of a letterpress printing method. After standard wet etching was performed, wax removal by means of hot water was investigated. Application of the letterpress wax printing process to silicon met with a number of difficulties. The most serious shortcoming of the process was its inability to produce consistently well-defined printed patterns on the hard silicon cell surface.

  4. Contemporary ultrasonic signal processing approaches for nondestructive evaluation of multilayered structures

    NASA Astrophysics Data System (ADS)

    Zhang, Guang-Ming; Harvey, David M.

    2012-03-01

    Various signal processing techniques have been used for the enhancement of defect detection and defect characterisation. Cross-correlation, filtering, autoregressive analysis, deconvolution, neural networks, wavelet transforms and sparse signal representations have all been applied in attempts to analyse ultrasonic signals. In ultrasonic nondestructive evaluation (NDE) applications, a large number of materials have multilayered structures. NDE of multilayered structures leads to specific problems, such as limited penetration, echo overlap, high attenuation and low signal-to-noise ratio. The signals recorded from a multilayered structure are a special class of signals composed of a limited number of echoes. Such signals can be assumed to have a sparse representation in a proper signal dictionary. Recently, a number of digital signal processing techniques have been developed that exploit this sparsity constraint. This paper presents a review of research to date, showing the up-to-date developments in signal processing techniques for ultrasonic NDE. A few typical ultrasonic signal processing techniques used for NDE of multilayered structures are elaborated, and the practical applications and limitations of different signal processing methods in ultrasonic NDE of multilayered structures are analysed.
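
    A small sketch of the sparse-representation idea for overlapping echoes: matching pursuit over a dictionary of Gaussian-windowed tones greedily peels echoes off a synthetic A-scan. Atom shape, echo positions, and noise level are illustrative assumptions.

    ```python
    import numpy as np

    def gabor_atom(n, center, width, freq):
        """Unit-norm Gaussian-windowed tone, a crude model of an echo."""
        t = np.arange(n)
        g = (np.exp(-0.5 * ((t - center) / width) ** 2)
             * np.cos(2 * np.pi * freq * (t - center)))
        return g / np.linalg.norm(g)

    # Dictionary: the same echo shape at all (coarsely sampled) shifts.
    n = 512
    centers = np.arange(0, n, 4)
    atoms = np.array([gabor_atom(n, c, 12.0, 0.12) for c in centers])

    # Synthetic A-scan: two overlapping echoes from thin-layer interfaces.
    rng = np.random.default_rng(0)
    signal = (1.0 * gabor_atom(n, 200, 12.0, 0.12)
              + 0.6 * gabor_atom(n, 230, 12.0, 0.12))
    signal += 0.05 * rng.standard_normal(n)

    # Matching pursuit: repeatedly subtract the best-correlated atom.
    residual = signal.copy()
    for _ in range(4):
        corr = atoms @ residual
        k = int(np.argmax(np.abs(corr)))
        print(f"echo at sample {centers[k]:4d}, amplitude {corr[k]:+.2f}")
        residual -= corr[k] * atoms[k]
    ```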

  5. Multi-filter spectrophotometry of quasar environments

    NASA Technical Reports Server (NTRS)

    Craven, Sally E.; Hickson, Paul; Yee, Howard K. C.

    1993-01-01

    A many-filter photometric technique for determining redshifts and morphological types, by fitting spectral templates to spectral energy distributions, has good potential for application in surveys. Despite success in studies performed on simulated data, the results have not been fully reliable when applied to real, low signal-to-noise data. We are investigating techniques to improve the fitting process.
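
    A minimal sketch of the template-fitting idea: observed fluxes in a few broad filters are compared against a redshifted template, with the amplitude fitted analytically and the redshift found by a chi-square grid search. The template, filters, and noise model are toy assumptions, not the authors' survey setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    wave = np.linspace(300.0, 1100.0, 400)               # nm, observed frame
    template = lambda w: np.exp(-0.5 * ((w - 400.0) / 120.0) ** 2) + 0.2
    filters = [(350, 450), (450, 550), (550, 700), (700, 900)]  # top-hats

    def filter_fluxes(z):
        sed = template(wave / (1.0 + z))                 # de-redshift the grid
        return np.array([sed[(wave >= lo) & (wave < hi)].mean()
                         for lo, hi in filters])

    obs = filter_fluxes(0.35) * 2.7                      # unknown amplitude
    obs += 0.01 * rng.standard_normal(obs.size)          # photometric noise

    zs = np.linspace(0.0, 1.0, 201)
    chi2 = []
    for z in zs:
        model = filter_fluxes(z)
        a = (model @ obs) / (model @ model)              # best-fit amplitude
        chi2.append(np.sum((obs - a * model) ** 2))
    print(f"estimated z = {zs[int(np.argmin(chi2))]:.3f} (true 0.35)")
    ```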

  6. Application of Guided Inquiry System Technique (GIST) to Controlled Ecological Life Support Systems (CELSS)

    NASA Technical Reports Server (NTRS)

    Aroeste, H.

    1982-01-01

    Guided Inquiry System Technique, a global approach to problem solving, was applied to the subject of Controlled Ecological Life Support Systems (CELSS). Nutrition, food processing, and the use of higher plants in a CELSS were considered by a panel of experts. Specific ideas and recommendations gleaned from discussions with panel members are presented.

  7. Recovering the fine structures in solar images

    NASA Technical Reports Server (NTRS)

    Karovska, Margarita; Habbal, S. R.; Golub, L.; Deluca, E.; Hudson, Hugh S.

    1994-01-01

    Several examples are presented of the capability of the blind iterative deconvolution (BID) technique to recover the real point spread function when limited a priori information is available about its characteristics. To demonstrate the potential of image post-processing for probing the fine scale structure and temporal variability of the solar atmosphere, the BID technique is applied to different samples of solar observations from space. The BID technique was originally proposed for correcting the effects of atmospheric turbulence on optical images. The processed images provide a detailed view of the spatial structure of the solar atmosphere at different heights, in regions with different large-scale magnetic field structures.

  8. Imaging techniques in digital forensic investigation: a study using neural networks

    NASA Astrophysics Data System (ADS)

    Williams, Godfried

    2006-09-01

    Imaging techniques have been applied to a number of applications, such as translation and classification problems in medicine and defence. This paper examines the application of imaging techniques in digital forensic investigation using neural networks. A review of applications of digital image processing is presented, while a pedagogical analysis of computer forensics is also highlighted. A data set describing selected images in different forms is used in the simulation and experimentation.

  9. Rangeland Riparian Systems

    Treesearch

    Wayne Elmore

    1989-01-01

    The management and recovery of degraded riparian systems is a major conservation issue. Presently there are many grazing management strategies being applied based on the name of the technique with little incorporation of basic stream processes. Managers must understand the exact workings of grazing strategies and the individual processes of each stream before...

  10. Separating Item and Order Information through Process Dissociation

    ERIC Educational Resources Information Center

    Nairne, James S.; Kelley, Matthew R.

    2004-01-01

    In the present paper, we develop and apply a technique, based on the logic of process dissociation, for obtaining numerical estimates of item and order information. Certain variables, such as phonological similarity, are widely believed to produce dissociative effects on item and order retention. However, such beliefs rest on the questionable…

  11. Analyzing Student Inquiry Data Using Process Discovery and Sequence Classification

    ERIC Educational Resources Information Center

    Emond, Bruno; Buffett, Scott

    2015-01-01

    This paper reports on results of applying process discovery mining and sequence classification mining techniques to a data set of semi-structured learning activities. The main research objective is to advance educational data mining to model and support self-regulated learning in heterogeneous environments of learning content, activities, and…

  12. Electrospinning of polyaniline/poly(lactic acid) ultrathin fibers: process and statistical modeling using a non-Gaussian approach

    USDA-ARS?s Scientific Manuscript database

    Cover: The electrospinning technique was employed to obtain conducting nanofibers based on polyaniline and poly(lactic acid). A statistical model was employed to describe how the process factors (solution concentration, applied voltage, and flow rate) govern the fiber dimensions. Nanofibers down to ...

  13. NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.

    ERIC Educational Resources Information Center

    Zhou, Lina; Zhang, Dongsong

    2003-01-01

    Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…

  14. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    ERIC Educational Resources Information Center

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  15. Strength enhancement process for prealloyed powder superalloys

    NASA Technical Reports Server (NTRS)

    Waters, W. J.; Freche, J. C.

    1977-01-01

    A technique involving superplastic processing and high pressure autoclaving was applied to a nickel base prealloyed powder alloy. Tensile strengths as high as 2865 MN/sq m at 480 C were obtained with as-superplastically deformed material. Appropriate treatments yielding materials with high temperature tensile and stress rupture strengths were also devised.

  16. Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process

    DTIC Science & Technology

    2012-10-01

    involves early use of systems engineering and technical analyses to supplement the existing operational analysis techniques currently used in...complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements... Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering

  17. Vapor Hydrogen Peroxide as Alternative to Dry Heat Microbial Reduction

    NASA Technical Reports Server (NTRS)

    Cash, Howard A.; Kern, Roger G.; Chung, Shirley Y.; Koukol, Robert C.; Barengoltz, Jack B.

    2006-01-01

    The Jet Propulsion Laboratory, in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide (VHP) sterilization process for continued development as a NASA-approved sterilization technique for spacecraft subsystems and systems. The goal is to include this technique, with an appropriate specification, in NPG8020.12C as a low-temperature technique complementary to the dry heat sterilization process. A series of experiments was conducted in vacuum to determine VHP process parameters that provided significant reductions in spore viability while allowing survival of sufficient spores for statistically significant enumeration. With this knowledge of D values, sensible margins can be applied in a planetary protection specification. The outcome of this study was an optimization of test sterilizer process conditions: VHP concentration, process duration, a process temperature range for which the worst case D value may be imposed, a process humidity range for which the worst case D value may be imposed, and robustness to selected spacecraft material substrates.

  18. 23 CFR 450.208 - Coordination of planning process activities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... planning carried out under this subpart with statewide trade and economic development planning activities... CFR part 500. (e) States may apply asset management principles and techniques in establishing planning...

  19. Spatiotemporal stochastic models for earth science and engineering applications

    NASA Astrophysics Data System (ADS)

    Luo, Xiaochun

    1998-12-01

    Spatiotemporal processes occur in many areas of the earth sciences and engineering. However, most of the available theoretical tools and techniques of space-time data processing have been designed to operate exclusively in time or in space, and the importance of spatiotemporal variability was not fully appreciated until recently. To address this problem, a systematic framework of spatiotemporal random field (S/TRF) models for geoscience/engineering applications is presented and developed in this thesis. Space-time continuity characterization is one of the most important aspects of S/TRF modelling: the space-time continuity is displayed with experimental spatiotemporal variograms, summarized in terms of space-time continuity hypotheses, and modelled using spatiotemporal variogram functions. Permissible spatiotemporal covariance/variogram models are addressed through permissibility criteria appropriate to spatiotemporal processes. The estimation of spatiotemporal processes is developed in terms of spatiotemporal kriging techniques, with particular emphasis on the singularity analysis of spatiotemporal kriging systems. The impacts of covariance functions, trend forms, and data configurations on the singularity of spatiotemporal kriging systems are discussed. In addition, the tensorial invariance of universal spatiotemporal kriging systems is investigated in terms of the space-time trend. The conditional simulation of spatiotemporal processes is developed through sequential group Gaussian simulation (SGGS), a series of sequential simulation algorithms associated with different group sizes. The simulation error is analyzed for different covariance models and simulation grids. A simulated annealing technique honoring experimental variograms is also proposed, providing a way of performing conditional simulation without the covariance model fitting that is a prerequisite for most simulation algorithms. The proposed techniques were first applied to modelling of the pressure system in a carbonate reservoir, and then to modelling of springwater contents in the Dyle watershed. The results of these case studies, as well as the theory, suggest that these techniques are realistic and feasible.
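
    A compact kriging sketch with a separable space-time exponential covariance, the kind of estimator the thesis develops in far more generality. This uses simple kriging with the mean approximated by the sample mean; the covariance parameters and data are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Sample points (x, y, t) with measured values; estimate the value at one
    # unsampled space-time location using a separable exponential covariance
    # C(h, u) = sill * exp(-h/rs) * exp(-u/rt).
    pts = rng.uniform(0, 10, size=(30, 3))               # columns: x, y, t
    vals = np.sin(pts[:, 0]) + 0.1 * pts[:, 2] + 0.05 * rng.standard_normal(30)
    target = np.array([5.0, 5.0, 5.0])

    sill, rs, rt = 1.0, 4.0, 6.0
    def cov(a, b):
        h = np.linalg.norm(a[..., :2] - b[..., :2], axis=-1)   # spatial lag
        u = np.abs(a[..., 2] - b[..., 2])                      # temporal lag
        return sill * np.exp(-h / rs) * np.exp(-u / rt)

    C = cov(pts[:, None, :], pts[None, :, :])            # data-data covariance
    C += 1e-6 * np.eye(len(pts))                         # nugget for stability
    c0 = cov(pts, target)                                # data-target covariance
    w = np.linalg.solve(C, c0)                           # kriging weights
    mean = vals.mean()
    estimate = mean + w @ (vals - mean)
    variance = sill - w @ c0
    print(f"kriging estimate {estimate:.3f}, variance {variance:.3f}")
    ```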

  20. Wind turbine siting: A summary of the state of the art

    NASA Technical Reports Server (NTRS)

    Hiester, T. R.

    1982-01-01

    The process of siting large wind turbines may be divided into two broad steps: site selection and site evaluation. Site selection is the process of locating windy sites where wind energy development shows promise of economic viability. Site evaluation is the process of determining in detail the economic potential of a given site. The state of the art in the first aspect of siting, site selection, is emphasized here. Several techniques for assessing the wind resource were explored or developed in the Federal Wind Energy Program. Local topography and meteorology determine which of the techniques should be used in locating potential sites. None of the techniques can do the job alone, none are foolproof, and all require considerable knowledge and experience to apply correctly. Efficient siting therefore requires a strategy founded on broad-based application of several techniques, without reliance on one narrow field of expertise.

  1. Computer image processing: Geologic applications

    NASA Technical Reports Server (NTRS)

    Abrams, M. J.

    1978-01-01

    Computer image processing of digital data was performed to support several geological studies. The specific goals were to: (1) relate the mineral content to the spectral reflectance of certain geologic materials, (2) determine the influence of environmental factors, such as atmosphere and vegetation, and (3) improve image processing techniques. For the detection of spectral differences related to mineralogy, the technique of band ratioing was found to be the most useful. The influence of atmospheric scattering and methods to correct for it were also studied. Two techniques were used to correct for atmospheric effects: (1) dark object subtraction and (2) normalization by use of ground spectral measurements. Of the two, the first proved the more successful at removing the effects of atmospheric scattering. A digital mosaic was produced from two side-lapping LANDSAT frames. The advantages were that the same enhancement algorithm could be applied to both frames and there is no seam where the two images are joined.
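
    The two corrections and the band ratioing step are simple enough to sketch directly; the haze values and band count below are synthetic stand-ins, not LANDSAT calibration figures.

    ```python
    import numpy as np

    # Synthetic radiance cube (rows, cols, bands) with additive path
    # radiance ("haze") that decreases with wavelength, as scattering does.
    rng = np.random.default_rng(0)
    reflectance = rng.uniform(0.05, 0.6, size=(100, 100, 4))
    haze = np.array([0.12, 0.08, 0.05, 0.02])
    dn = reflectance + haze                              # observed values

    # (1) Dark object subtraction: assume the darkest pixel in each band is
    # truly black, so its value estimates the additive atmospheric term.
    dark = dn.min(axis=(0, 1))
    corrected = dn - dark

    # (2) Band ratioing: ratios cancel multiplicative illumination effects
    # (topographic shading) and highlight spectral differences tied to
    # mineralogy.
    ratio = corrected[..., 3] / np.clip(corrected[..., 1], 1e-6, None)
    print("band 4/2 ratio range:", ratio.min(), ratio.max())
    ```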

  2. Wab-InSAR: a new wavelet based InSAR time series technique applied to volcanic and tectonic areas

    NASA Astrophysics Data System (ADS)

    Walter, T. R.; Shirzaei, M.; Nankali, H.; Roustaei, M.

    2009-12-01

    Modern geodetic techniques such as InSAR and GPS provide valuable observations of the deformation field. Because of a variety of environmental interferences (e.g., atmospheric delay, topographic distortion) and the incompleteness of the models (e.g., the assumption of a linear deformation model), those observations are usually tainted by various systematic and random errors. Therefore we develop and test new methods to identify and filter unwanted periodic or episodic artifacts to obtain accurate and precise deformation measurements. Here we present and implement a new wavelet based InSAR (Wab-InSAR) time series approach. Because wavelets are excellent tools for identifying hidden patterns and capturing transient signals, we utilize wavelet functions for reducing the effect of atmospheric delay and digital elevation model inaccuracies. Wab-InSAR is a model-free technique, reducing digital elevation model errors in individual interferograms using a 2D spatial Legendre polynomial wavelet filter. Atmospheric delays are reduced using a 3D spatio-temporal wavelet transform algorithm and a novel technique for pixel selection. We apply Wab-InSAR to several targets, including volcano deformation processes on Hawaii Island and mountain building processes in Iran. Both targets are chosen to investigate large and small amplitude signals, variable and complex topography, and atmospheric effects. In this presentation we explain the different steps of the technique, validate the results by comparison to other high resolution processing methods (GPS, PS-InSAR, SBAS), and discuss the geophysical results.

  3. Multi-frame image processing with panning cameras and moving subjects

    NASA Astrophysics Data System (ADS)

    Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric

    2014-06-01

    Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition to this, we evaluated algorithm efficacy with demonstrated benefits using field test video, which has been processed using our commercially available surveillance product. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.

  4. A study for high accuracy measurement of residual stress by deep hole drilling technique

    NASA Astrophysics Data System (ADS)

    Kitano, Houichi; Okano, Shigetaka; Mochizuki, Masahito

    2012-08-01

    The deep hole drilling (DHD) technique has received much attention in recent years as a method for measuring through-thickness residual stresses. However, some accuracy problems occur when residual stress evaluation is performed by the DHD technique. One reason is that the traditional DHD evaluation formula assumes the plane stress condition. The second is that the effects of the plastic deformation produced in the drilling process and the deformation produced in the trepanning process are ignored. In this study, a modified evaluation formula, applicable to the plane strain condition, is proposed. In addition, a new procedure is proposed which can account for the effects of the deformation produced in the DHD process, by investigating these effects in detail with finite element (FE) analysis. The evaluation results obtained by the new procedure are then compared with those obtained by the traditional DHD procedure using FE analysis. As a result, the new procedure evaluates the residual stress fields better than the traditional DHD procedure when the measured object is thick enough that the stress state can be assumed to be plane strain, as in the model used in this study.

  5. Freezing-Out Technique Applied to the Concentration of Biologically Active Materials

    PubMed Central

    Wilson, T. E.; Evans, D. J.; Theriot, Mary L.

    1964-01-01

    When applied to a dilute solution of folic acid and glucose, a freezing-out (with agitation) technique was shown to be an effective method of achieving a 20-fold reduction in volume with a loss of 10% of the active material being concentrated. Concentration of a stimulatory factor for Lactobacillus casei produced by Candida albicans in a complex medium was limited by the total solute concentration. Salts in the medium were concentrated to levels inhibitory for L. casei. The process is not selective and all solutes are concentrated. PMID:14131370

  6. EMI / EMC Design for Class D Payloads (Resource Prospector / NIRVSS)

    NASA Technical Reports Server (NTRS)

    Forgione, Josh; Benton, Joshua Eric; Thompson, Sarah; Colaprete, Anthony

    2015-01-01

    EMI/EMC techniques are applied to a Class D instrument (NIRVSS) to achieve low-noise performance and reduce the risk of EMI/EMC test failures and/or issues during system integration and test. The basic techniques are not terribly expensive or complex, but they do require close coordination between electrical and mechanical staff early in the design process. Low-cost methods to test subsystems on the bench without renting an EMI chamber are discussed. This method was applied to the NIRVSS instrument and achieved improvements of up to 59 dB in conducted emissions measurements between hardware revisions.

  7. The balance sheet technique. Volume I. The balance sheet analysis technique for preconstruction review of airports and highways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaBelle, S.J.; Smith, A.E.; Seymour, D.A.

    1977-02-01

    The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or looking only at projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact on desired regional emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to changes in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method for state or regional reviewing agencies.

  8. Yield enhancement with DFM

    NASA Astrophysics Data System (ADS)

    Paek, Seung Weon; Kang, Jae Hyun; Ha, Naya; Kim, Byung-Moo; Jang, Dae-Hyun; Jeon, Junsu; Kim, DaeWook; Chung, Kun Young; Yu, Sung-eun; Park, Joo Hyun; Bae, SangMin; Song, DongSup; Noh, WooYoung; Kim, YoungDuck; Song, HyunSeok; Choi, HungBok; Kim, Kee Sup; Choi, Kyu-Myung; Choi, Woonhyuk; Jeon, JoongWon; Lee, JinWoo; Kim, Ki-Su; Park, SeongHo; Chung, No-Young; Lee, KangDuck; Hong, YoungKi; Kim, BongSeok

    2012-03-01

    A set of design for manufacturing (DFM) techniques has been developed and applied to 45nm, 32nm and 28nm logic process technologies. A novel methodology combined a number of potentially conflicting DFM techniques into a comprehensive solution. These techniques work in three phases for design optimization and one phase for silicon diagnostics. In the DFM prevention phase, foundation IP such as standard cells, IO, and memory, and the P&R tech file are optimized. In the DFM solution phase, which happens during the ECO step, automatic fixing of process-weak patterns and advanced RC extraction are performed. In the DFM polishing phase, post-layout tuning is done to improve manufacturability. DFM analysis enables prioritization of random and systematic failures. The DFM technique presented in this paper has been silicon-proven with three successful tape-outs in Samsung 32nm processes; about 5% improvement in yield was achieved without any notable side effects. Visual inspection of silicon also confirmed the positive effect of the DFM techniques.

  9. Accurate Rapid Lifetime Determination on Time-Gated FLIM Microscopy with Optical Sectioning

    PubMed Central

    Silva, Susana F.; Domingues, José Paulo

    2018-01-01

    Time-gated fluorescence lifetime imaging microscopy (FLIM) is a powerful technique to assess the biochemistry of cells and tissues. When applied to living thick samples, it is hampered by the lack of optical sectioning and the need of acquiring many images for an accurate measurement of fluorescence lifetimes. Here, we report on the use of processing techniques to overcome these limitations, minimizing the acquisition time, while providing optical sectioning. We evaluated the application of the HiLo and the rapid lifetime determination (RLD) techniques for accurate measurement of fluorescence lifetimes with optical sectioning. HiLo provides optical sectioning by combining the high-frequency content from a standard image, obtained with uniform illumination, with the low-frequency content of a second image, acquired using structured illumination. Our results show that HiLo produces optical sectioning on thick samples without degrading the accuracy of the measured lifetimes. We also show that instrument response function (IRF) deconvolution can be applied with the RLD technique on HiLo images, improving greatly the accuracy of the measured lifetimes. These results open the possibility of using the RLD technique with pulsed diode laser sources to determine accurately fluorescence lifetimes in the subnanosecond range on thick multilayer samples, providing that offline processing is allowed. PMID:29599938

  10. Accurate Rapid Lifetime Determination on Time-Gated FLIM Microscopy with Optical Sectioning.

    PubMed

    Silva, Susana F; Domingues, José Paulo; Morgado, António Miguel

    2018-01-01

    Time-gated fluorescence lifetime imaging microscopy (FLIM) is a powerful technique to assess the biochemistry of cells and tissues. When applied to living thick samples, it is hampered by the lack of optical sectioning and the need of acquiring many images for an accurate measurement of fluorescence lifetimes. Here, we report on the use of processing techniques to overcome these limitations, minimizing the acquisition time, while providing optical sectioning. We evaluated the application of the HiLo and the rapid lifetime determination (RLD) techniques for accurate measurement of fluorescence lifetimes with optical sectioning. HiLo provides optical sectioning by combining the high-frequency content from a standard image, obtained with uniform illumination, with the low-frequency content of a second image, acquired using structured illumination. Our results show that HiLo produces optical sectioning on thick samples without degrading the accuracy of the measured lifetimes. We also show that instrument response function (IRF) deconvolution can be applied with the RLD technique on HiLo images, improving greatly the accuracy of the measured lifetimes. These results open the possibility of using the RLD technique with pulsed diode laser sources to determine accurately fluorescence lifetimes in the subnanosecond range on thick multilayer samples, providing that offline processing is allowed.
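
    In its simplest two-gate form, the rapid lifetime determination named in both records reduces to a closed-form expression per pixel; the sketch below is that textbook variant (two equal-width gates separated by dt), not the authors' exact pipeline, and the IRF deconvolution step is omitted.

      import numpy as np

      def rld_two_gate(g0, g1, dt):
          # For a mono-exponential decay, two equal-width gates G0 and G1
          # separated by dt give tau = dt / ln(G0 / G1).
          g0 = np.asarray(g0, dtype=float)
          g1 = np.asarray(g1, dtype=float)
          with np.errstate(divide="ignore", invalid="ignore"):
              return dt / np.log(g0 / g1)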

  11. The role of printing techniques for large-area dye sensitized solar cells

    NASA Astrophysics Data System (ADS)

    Mariani, Paolo; Vesce, Luigi; Di Carlo, Aldo

    2015-10-01

    The versatility of printing technologies and their intrinsic ability to outperform other techniques in large-area deposition gives scope to revolutionize the photovoltaic (PV) manufacturing field. Printing methods are commonly used in conventional silicon-based PVs to cover part of the production process. Screen printing techniques, for example, are applied to deposit electrical contacts on the silicon wafer. However, it is with the advent of third generation PVs that printing/coating techniques have been extensively used in almost all of the manufacturing processes. Among all the third generation PVs, dye sensitized solar cell (DSSC) technology has been developed up to commercialization levels. DSSCs and modules can be fabricated by adopting all of the main printing techniques on both rigid and flexible substrates. This allows an easy tuning of cell/module characteristics to the desired application. Transparency, colour, shape, layout and other DSSC’s features can be easily varied by changing the printing parameters and paste/ink formulations used in the printing process. This review focuses on large-area printing/coating technologies for the fabrication of DSSCs devices. The most used and promising techniques are presented underlining the process parameters and applications.

  12. False colors removal on the YCr-Cb color space

    NASA Astrophysics Data System (ADS)

    Tomaselli, Valeria; Guarnera, Mirko; Messina, Giuseppe

    2009-01-01

    Post-processing algorithms are usually placed in the pipeline of imaging devices to remove residual color artifacts introduced by the demosaicing step. Although demosaicing solutions aim to eliminate, limit or correct false colors and other impairments caused by non-ideal sampling, post-processing techniques are usually more powerful in achieving this purpose. This is mainly because the input of post-processing algorithms is a fully restored RGB color image. Moreover, post-processing can be applied more than once in order to meet some quality criteria. In this paper we propose an effective technique for reducing the color artifacts generated by conventional color interpolation algorithms in the YCrCb color space. This solution efficiently removes false colors and can be executed while performing the edge emphasis process.
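
    A sketch of the kind of chroma-only post-processing the paper describes: convert to YCbCr, smooth only the chrominance planes where demosaicing artifacts live, and leave the luma detail untouched. The BT.601 matrix and the median filter are illustrative choices, not the authors' exact operator.

      import numpy as np
      from scipy.ndimage import median_filter

      def rgb_to_ycbcr(rgb):
          # ITU-R BT.601 full-range conversion.
          m = np.array([[ 0.299,     0.587,     0.114    ],
                        [-0.168736, -0.331264,  0.5      ],
                        [ 0.5,      -0.418688, -0.081312 ]])
          ycc = rgb.astype(float) @ m.T
          ycc[..., 1:] += 128.0
          return ycc

      def suppress_false_colors(rgb, size=5):
          ycc = rgb_to_ycbcr(rgb)
          for c in (1, 2):                  # Cb and Cr only; Y is untouched
              ycc[..., c] = median_filter(ycc[..., c], size=size)
          return ycc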

  13. Modular multiapertures for light sensors

    NASA Technical Reports Server (NTRS)

    Rizzo, A. A.

    1977-01-01

    Process involves electroplating multiaperture masks as unit, eliminating alignment and assembly difficulties previously encountered. Technique may be applied to masks in automated and surveillance light systems, when precise, wide-angle field of view is needed.

  14. Space shuttle recommendations based on aircraft maintenance experience

    NASA Technical Reports Server (NTRS)

    Spears, J. M.; Fox, C. L.

    1972-01-01

    Space shuttle design recommendations based on aircraft maintenance experience are developed. The recommendations are specifically applied to the landing gear system, nondestructive inspection techniques, hydraulic system design, materials and processes, and program support.

  15. Lightweight Active Object Retrieval with Weak Classifiers.

    PubMed

    Czúni, László; Rashad, Metwally

    2018-03-07

    In the last few years, there has been a steadily growing interest in autonomous vehicles and robotic systems. While many of these agents are expected to have limited resources, these systems should be able to dynamically interact with other objects in their environment. We present an approach where lightweight sensory and processing techniques, requiring very limited memory and processing power, can be successfully applied to the task of object retrieval using sensors of different modalities. We use the Hough framework to fuse optical and orientation information of the different views of the objects. In the presented spatio-temporal perception technique, we apply active vision, where, based on the analysis of initial measurements, the direction of the next view is determined to increase the hit-rate of retrieval. The performance of the proposed methods is shown on three datasets loaded with heavy noise.

  16. Lightweight Active Object Retrieval with Weak Classifiers

    PubMed Central

    2018-01-01

    In the last few years, there has been a steadily growing interest in autonomous vehicles and robotic systems. While many of these agents are expected to have limited resources, these systems should be able to dynamically interact with other objects in their environment. We present an approach where lightweight sensory and processing techniques, requiring very limited memory and processing power, can be successfully applied to the task of object retrieval using sensors of different modalities. We use the Hough framework to fuse optical and orientation information of the different views of the objects. In the presented spatio-temporal perception technique, we apply active vision, where, based on the analysis of initial measurements, the direction of the next view is determined to increase the hit-rate of retrieval. The performance of the proposed methods is shown on three datasets loaded with heavy noise. PMID:29518902

  17. Time-frequency analysis of pediatric murmurs

    NASA Astrophysics Data System (ADS)

    Lombardo, Joseph S.; Blodgett, Lisa A.; Rosen, Ron S.; Najmi, Amir-Homayoon; Thompson, W. Reid

    1998-05-01

    Technology has provided many new tools to assist in the diagnosis of pathologic conditions of the heart. Echocardiography, ultrafast CT, and MRI are just a few. While these tools are a valuable resource, they are typically too expensive, large, and complex in operation for use in rural, homecare, and physician's office settings. Recent advances in computer performance, miniaturization, and acoustic signal processing have yielded new technologies that, when applied to heart sounds, can provide low-cost screening for pathologic conditions. The short duration and transient nature of these signals require processing techniques that provide high resolution in both time and frequency. Short-time Fourier transforms, Wigner distributions, and wavelet transforms have been applied to signals from hearts with various pathologic conditions. While no single technique provides the ideal solution, the combination of tools provides a good representation of the acoustic features of the pathologies selected.
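
    Of the transforms mentioned, the short-time Fourier transform is the simplest to sketch; the 40 ms window below is an assumed trade-off between time and frequency resolution for transient heart sounds, not a parameter taken from the study.

      import numpy as np
      from scipy.signal import stft

      def murmur_spectrogram(x, fs, win_ms=40.0):
          # A short window localizes transients while retaining enough
          # frequency resolution for low-frequency murmur energy.
          nper = max(16, int(fs * win_ms / 1000.0))
          f, t, Z = stft(x, fs=fs, nperseg=nper, noverlap=nper // 2)
          return f, t, np.abs(Z)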

  18. Analysis of the United States Marine Corps Continuous Process Improvement Program Applied to the Contracting Process at Marine Corps Regional Contracting Office - Southwest

    DTIC Science & Technology

    2007-12-01

    [Record excerpt garbled in extraction; the recoverable fragments list lean-manufacturing elements such as poka-yoke techniques, standard operating procedures, visual displays for workflow and communication, and total productive maintenance, and note that eliminating non-value-added process steps and reducing the seven common wastes will decrease the total time of a process.]

  19. Extension of electronic speckle correlation interferometry to large deformations

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Sciammarella, Federico M.

    1998-07-01

    The process of fringe formation under simultaneous illumination in two orthogonal directions is analyzed. Procedures to extend the applicability of this technique to large deformation and high density of fringes are introduced. The proposed techniques are applied to a number of technical problems. Good agreement is obtained when the experimental results are compared with results obtained by other methods.

  20. Incorporating Multiple-Choice Questions into an AACSB Assurance of Learning Process: A Course-Embedded Assessment Application to an Introductory Finance Course

    ERIC Educational Resources Information Center

    Santos, Michael R.; Hu, Aidong; Jordan, Douglas

    2014-01-01

    The authors offer a classification technique to make a quantitative skills rubric more operational, with the groupings of multiple-choice questions to match the student learning levels in knowledge, calculation, quantitative reasoning, and analysis. The authors applied this classification technique to the mid-term exams of an introductory finance…

  1. A Novel Technique Applying Spectral Estimation to Johnson Noise Thermometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ezell, N. Dianne Bull; Britton, Chuck; Ericson, Nance

    Johnson noise thermometry is one of many important measurement techniques used to monitor the safety levels and stability in a nuclear reactor. However, this measurement is very dependent on a minimal electromagnetic environment. Properly removing unwanted electromagnetic interference (EMI) is critical for accurate drift-free temperature measurements. The two techniques developed by Oak Ridge National Laboratory (ORNL) to remove transient and periodic EMI are briefly discussed here. Spectral estimation is a key component in the signal processing algorithm used for EMI removal and temperature calculation. The cross-power spectral density is a key component in the Johnson noise temperature computation. Applying either technique requires the simple addition of electronics and signal processing to existing resistive thermometers. With minimal installation changes, the system discussed here can be installed in existing nuclear power plants. The Johnson noise system developed was tested at three locations: ORNL, Sandia National Laboratory, and the Tennessee Valley Authority’s Kingston Fossil Plant. Each of these locations enabled improvement of the EMI removal algorithm. Finally, the conclusions drawn from the results at each of these locations are discussed, as well as possible future work.

  2. A Novel Technique Applying Spectral Estimation to Johnson Noise Thermometry

    DOE PAGES

    Ezell, N. Dianne Bull; Britton, Chuck; Ericson, Nance; ...

    2018-03-30

    Johnson noise thermometry is one of many important measurement techniques used to monitor the safety levels and stability in a nuclear reactor. However, this measurement is very dependent on a minimal electromagnetic environment. Properly removing unwanted electromagnetic interference (EMI) is critical for accurate drift-free temperature measurements. The two techniques developed by Oak Ridge National Laboratory (ORNL) to remove transient and periodic EMI are briefly discussed here. Spectral estimation is a key component in the signal processing algorithm used for EMI removal and temperature calculation. The cross-power spectral density is a key component in the Johnson noise temperature computation. Applying either technique requires the simple addition of electronics and signal processing to existing resistive thermometers. With minimal installation changes, the system discussed here can be installed in existing nuclear power plants. The Johnson noise system developed was tested at three locations: ORNL, Sandia National Laboratory, and the Tennessee Valley Authority’s Kingston Fossil Plant. Each of these locations enabled improvement of the EMI removal algorithm. Finally, the conclusions drawn from the results at each of these locations are discussed, as well as possible future work.
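
    A sketch of the cross-power spectral density step named in both records: two amplifier chains digitize the same resistive sensor, uncorrelated amplifier noise averages out of the cross-spectrum, and the Johnson relation S_v = 4 k_B T R converts the remaining level to temperature. Calibrated (unity) gains and prior EMI removal are assumptions of this sketch.

      import numpy as np
      from scipy.signal import csd

      K_B = 1.380649e-23   # Boltzmann constant, J/K

      def johnson_noise_temperature(v1, v2, fs, resistance, nperseg=4096):
          # Cross-power spectral density between the two measurement chains.
          f, pxy = csd(v1, v2, fs=fs, nperseg=nperseg)
          s_v = np.real(pxy).mean()       # flat-band average, V^2/Hz
          return s_v / (4.0 * K_B * resistance)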

  3. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  4. Novel technique for fabrication of multi-layered microcoils in microelectromechanical systems (MEMS) applications

    NASA Astrophysics Data System (ADS)

    Chang, Hung-Pin; Qian, Jiangyuan; Bachman, Mark; Congdon, Philip; Li, Guann-pyng

    2002-07-01

    A novel planarization technique, compressive molding planarization (CMP), is developed for the implementation of a multi-layered microcoil device. Applying CMP and other micromachining techniques, a multi-layered microcoil device has been designed and fabricated, and its use in magnetic microactuators for hard disk drive applications has been demonstrated, showing that it can produce milli-Newtons of magnetic force, suitable for driving a microactuator. The novel CMP technique is equally applicable to the fabrication of other MEMS devices, easing process integration for complicated structures.

  5. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  6. Wholefield displacement measurements using speckle image processing techniques for crash tests

    NASA Astrophysics Data System (ADS)

    Sriram, P.; Hanagud, S.; Ranson, W. F.

    The digital correlation scheme of Peters et al. (1983) was extended to measure out-of-plane deformations, using a white light projection speckle technique. A simple ray optic theory and the digital correlation scheme are outlined. The technique was applied successfully to measure out-of-plane displacements of initially flat rotorcraft structures (an acrylic circular plate and a steel cantilever beam), using a low cost video camera and a desktop computer. The technique can be extended to measurements of three-dimensional deformations and dynamic deformations.

  7. Asteroseismic inversions in the Kepler era: application to the Kepler Legacy sample

    NASA Astrophysics Data System (ADS)

    Buldgen, Gaël; Reese, Daniel; Dupret, Marc-Antoine

    2017-10-01

    In the past few years, the CoRoT and Kepler missions have carried out what is now called the space photometry revolution. This revolution is still ongoing thanks to K2 and will be continued by the Tess and Plato2.0 missions. However, the photometry revolution must also be followed by progress in stellar modelling, in order to lead to more precise and accurate determinations of fundamental stellar parameters such as masses, radii and ages. In this context, the long-standing problems related to mixing processes in stellar interiors are the main obstacle to further improvements of stellar modelling. In this contribution, we will apply structural asteroseismic inversion techniques to targets from the Kepler Legacy sample and analyse how these can help us constrain the fundamental parameters and mixing processes in these stars. Our approach is based on previous studies using the SOLA inversion technique [1] to determine integrated quantities such as the mean density [2], the acoustic radius, and core condition indicators [3], and has already been successfully applied to the 16Cyg binary system [4]. We will show how this technique can be applied to the Kepler Legacy sample and how new indicators can help us further constrain the chemical composition profiles of stars as well as provide stringent constraints on stellar ages.

  8. Defective Reduction in Frozen Pie Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Nooted, Oranuch; Tangjitsitcharoen, Somkiat

    2017-06-01

    Frozen pie production has many defects, resulting in high production costs. The failure mode and effect analysis (FMEA) technique has been applied to improve the frozen pie process. A Pareto chart is also used to determine the major defects of frozen pie. There are 3 main processes that cause the defects: the 1st freezing-to-glazing process, the forming process, and the folding process. The Risk Priority Number (RPN) obtained from FMEA is analyzed to reduce the defects. If the RPN of a cause exceeds 45, the process is considered for improvement and selected for corrective and preventive actions. The results showed that RPN values decreased after the correction. Therefore, the implementation of the FMEA technique can help improve the performance of the frozen pie process and reduce defects by approximately 51.9%.
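
    The RPN screening described above is simple arithmetic: RPN = severity x occurrence x detection, each factor rated on a 1-10 scale, with 45 as the action threshold quoted in the record. The failure modes and ratings below are invented for illustration only.

      def rpn(severity, occurrence, detection):
          # FMEA Risk Priority Number, each factor on a 1-10 scale.
          return severity * occurrence * detection

      THRESHOLD = 45   # action limit used in the study

      failure_modes = {   # hypothetical (S, O, D) ratings
          "glaze cracking, freezing-to-glazing": (6, 4, 3),
          "shape defect, forming":               (5, 3, 2),
          "seam split, folding":                 (7, 3, 3),
      }

      for mode, (s, o, d) in failure_modes.items():
          value = rpn(s, o, d)
          print(f"{mode:38s} RPN={value:3d} {'ACT' if value > THRESHOLD else 'ok'}")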

  9. SU-E-I-37: Low-Dose Real-Time Region-Of-Interest X-Ray Fluoroscopic Imaging with a GPU-Accelerated Spatially Different Bilateral Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chung, H; Lee, J; Pua, R

    2014-06-01

    Purpose: The purpose of our study is to reduce imaging radiation dose while maintaining image quality in the region of interest (ROI) in X-ray fluoroscopy. A low-dose real-time ROI fluoroscopic imaging technique which includes graphics-processing-unit- (GPU-) accelerated image processing for brightness compensation and noise filtering was developed in this study. Methods: In our ROI fluoroscopic imaging, a copper filter is placed in front of the X-ray tube. The filter contains a round aperture to reduce radiation dose outside of the aperture. To equalize the brightness difference between the inner and outer ROI regions, brightness compensation was performed by use of a simple weighting method that applies selectively to the inner ROI, the outer ROI, and the boundary zone. A bilateral filtering was applied to the images to reduce the relatively high noise in the outer ROI images. To speed up the calculation of our technique for real-time application, GPU acceleration was applied to the image processing algorithm. We performed a dosimetric measurement using an ion-chamber dosimeter to evaluate the amount of radiation dose reduction. The reduction of calculation time compared to a CPU-only computation was also measured, and an assessment of image quality in terms of image noise and spatial resolution was conducted. Results: More than 80% of the dose was reduced by use of the ROI filter. The reduction rate depended on the thickness of the filter and the size of the ROI aperture. The image noise outside the ROI was remarkably reduced by the bilateral filtering technique. The computation time for processing each frame image was reduced from 3.43 seconds with a single CPU to 9.85 milliseconds with GPU acceleration. Conclusion: The proposed technique for X-ray fluoroscopy can substantially reduce imaging radiation dose to the patient while maintaining image quality, particularly in the ROI region, in real time.
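
    A sketch of the selective brightness-compensation step described in the abstract: scale the attenuated outer-ROI region up to the inner-ROI level and feather the boundary zone. The gain estimate and Gaussian feathering are assumptions of this sketch; the bilateral filtering stage is omitted here.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def equalize_roi(img, roi_mask, blend_px=8.0):
          # Gain that lifts the filtered outer region to the inner-ROI level.
          gain = img[roi_mask].mean() / max(img[~roi_mask].mean(), 1e-6)
          outer = img * gain
          # Blur the hard mask into a smooth weight map for the boundary zone.
          w = gaussian_filter(roi_mask.astype(float), blend_px)
          return w * img + (1.0 - w) * outer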

  10. System Applies Polymer Powder To Filament Tow

    NASA Technical Reports Server (NTRS)

    Baucom, Robert M.; Snoha, John J.; Marchello, Joseph M.

    1993-01-01

    Polymer powder applied uniformly and in continuous manner. Powder-coating system applies dry polymer powder to continuous fiber tow. Unique filament-spreading technique, combined with precise control of tension on fibers in system, ensures uniform application of polymer powder to web of spread filaments. Fiber tows impregnated with dry polymer powders ("towpregs") produced for preform-weaving and composite-material-molding applications. System and process valuable to prepreg industry, for production of flexible filament-windable tows and high-temperature polymer prepregs.

  11. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), MR (moving…
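
    A minimal sketch of the X-bar and R charts mentioned above, using the standard Shewhart constants; the subgroup size of 5 is an assumption.

      import numpy as np

      A2, D3, D4 = 0.577, 0.0, 2.114   # Shewhart constants for subgroups of 5

      def xbar_r_limits(samples):
          # samples: (k, 5) array holding k rational subgroups.
          xbar = samples.mean(axis=1)
          r = samples.max(axis=1) - samples.min(axis=1)
          xbarbar, rbar = xbar.mean(), r.mean()
          return {"xbar": (xbarbar - A2 * rbar, xbarbar + A2 * rbar),
                  "range": (D3 * rbar, D4 * rbar)}

    Points falling outside these limits signal special-cause variation, which is the condition control charting is designed to detect.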

  12. Occupationally Related Science. Draft Curriculum 1986-87.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Div. of Occupational Education Programs.

    To prepare occupational students for employment, a basic understanding of scientific knowledge and the processes of science that have been applied in the development of tools, machines, instruments, and technological techniques or processes should be taught. When a second unit of science was included for all high school students in the New York…

  13. Computer vision applications for coronagraphic optical alignment and image processing.

    PubMed

    Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A

    2013-05-10

    Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.

  14. A Q-Ising model application for linear-time image segmentation

    NASA Astrophysics Data System (ADS)

    Bentrem, Frank W.

    2010-10-01

    A computational method is presented which efficiently segments digital grayscale images by directly applying the Q-state Ising (or Potts) model. Since the Potts model was first proposed in 1952, physicists have studied lattice models to gain deep insights into magnetism and other disordered systems. For some time, researchers have realized that digital images may be modeled in much the same way as these physical systems (i.e., as a square lattice of numerical values). A major drawback in using Potts model methods for image segmentation is that, with conventional methods, segmentation runs in exponential time. Advances have been made via certain approximations to reduce the segmentation process to power-law time. However, in many applications (such as for sonar imagery), real-time processing requires much greater efficiency. This article describes an energy minimization technique that applies four Potts (Q-Ising) models directly to the image and runs in linear time. The result is analogous to partitioning the system into regions of four classes of magnetism. This direct Potts segmentation technique is demonstrated on photographic, medical, and acoustic images.
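
    The sketch below writes out the Potts energy the article builds on, together with one iterated-conditional-modes (ICM) sweep as an illustrative linear-time relabelling pass; ICM stands in for the article's own minimizer, and the class means (centers) are assumed to be given, e.g., from a k-means step.

      import numpy as np

      def potts_energy(labels, img, centers, beta=1.0):
          # Data term pulls pixels toward their class means; the coupling
          # term charges beta per unlike-neighbour bond (4-connectivity).
          data = ((img - centers[labels]) ** 2).sum()
          bonds = (labels[1:, :] != labels[:-1, :]).sum() \
                + (labels[:, 1:] != labels[:, :-1]).sum()
          return data + beta * bonds

      def icm_sweep(labels, img, centers, beta=1.0):
          # One greedy pass: relabel each pixel to its locally cheapest
          # class; the cost of a sweep is linear in the number of pixels.
          q = len(centers)
          pad = np.pad(labels, 1, mode="edge")
          for i in range(img.shape[0]):
              for j in range(img.shape[1]):
                  nb = (pad[i, j+1], pad[i+2, j+1], pad[i+1, j], pad[i+1, j+2])
                  costs = [(img[i, j] - centers[c]) ** 2
                           + beta * sum(c != n for n in nb) for c in range(q)]
                  labels[i, j] = int(np.argmin(costs))
                  pad[i+1, j+1] = labels[i, j]
          return labels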

  15. Application of ultrasound to improve lees ageing processes in red wines.

    PubMed

    Del Fresno, Juan Manuel; Loira, Iris; Morata, Antonio; González, Carmen; Suárez-Lepe, Jose Antonio; Cuerda, Rafael

    2018-09-30

    Ageing on lees (AOL) is a technique that increases volatile compounds, promotes colour stability, improves mouthfeel and reduces astringency in red wines. The main drawback is that it is a slow process; several months are necessary to obtain perceptible effects in wines. Different authors have studied the application of new techniques to accelerate the AOL process. Ultrasound (US) has been used to improve different food industry processes; it could be of interest for accelerating yeast autolysis during AOL. This work evaluates the use of the US technique together with AOL and oak chips for this purpose, studying the effects on different oenological parameters of red wines. The results obtained indicate an increase in polysaccharide content when US is applied during wine AOL. In addition, the total polyphenol index (TPI) and volatile acidity were not affected. However, this treatment increases the dissolved oxygen, affecting the volatile compounds and total anthocyanins.

  16. Numerical simulation of coupled electrochemical and transport processes in battery systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, B.Y.; Gu, W.B.; Wang, C.Y.

    1997-12-31

    Advanced numerical modeling to simulate dynamic battery performance characteristics for several types of advanced batteries is being conducted using computational fluid dynamics (CFD) techniques. The CFD techniques provide efficient algorithms to solve a large set of highly nonlinear partial differential equations that represent the complex battery behavior governed by coupled electrochemical reactions and transport processes. The authors have recently successfully applied such techniques to model advanced lead-acid, Ni-Cd and Ni-MH cells. In this paper, the authors briefly discuss how the governing equations were numerically implemented, show some preliminary modeling results, and compare them with other modeling or experimental data reported in the literature. The authors describe the advantages and implications of using the CFD techniques and their capabilities in future battery applications.

  17. Application of Compressive Sensing to Gravitational Microlensing Data and Implications for Miniaturized Space Observatories

    NASA Technical Reports Server (NTRS)

    Korde-Patel, Asmita (Inventor); Barry, Richard K.; Mohsenin, Tinoosh

    2016-01-01

    Compressive Sensing is a technique for simultaneous acquisition and compression of data that is sparse or can be made sparse in some domain. It is currently under intense development and has been profitably employed for industrial and medical applications. We here describe the use of this technique for the processing of astronomical data. We outline the procedure as applied to exoplanet gravitational microlensing and analyze measurement results and uncertainty values. We describe implications for on-spacecraft data processing for space observatories. Our findings suggest that application of these techniques may yield significant, enabling benefits especially for power and volume-limited space applications such as miniaturized or micro-constellation satellites.
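
    As a generic illustration of compressive-sensing recovery (not the authors' microlensing pipeline), orthogonal matching pursuit reconstructs a k-sparse vector from m << n linear measurements y = A x:

      import numpy as np

      def omp(A, y, k):
          # Greedily pick the atom most correlated with the residual, then
          # re-fit all selected atoms by least squares.
          n = A.shape[1]
          residual, support, coef = y.astype(float).copy(), [], None
          for _ in range(k):
              j = int(np.argmax(np.abs(A.T @ residual)))
              if j not in support:
                  support.append(j)
              coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
              residual = y - A[:, support] @ coef
          x = np.zeros(n)
          x[support] = coef
          return x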

  18. From Data Acquisition to Data Fusion: A Comprehensive Review and a Roadmap for the Identification of Activities of Daily Living Using Mobile Devices

    PubMed Central

    Pires, Ivan Miguel; Garcia, Nuno M.; Pombo, Nuno; Flórez-Revuelta, Francisco

    2016-01-01

    This paper focuses on the research on the state of the art for sensor fusion techniques, applied to the sensors embedded in mobile devices, as a means to help identify the mobile device user’s daily activities. Sensor data fusion techniques are used to consolidate the data collected from several sensors, increasing the reliability of the algorithms for the identification of the different activities. However, mobile devices have several constraints, e.g., low memory, low battery life and low processing power, and some data fusion techniques are not suited to this scenario. The main purpose of this paper is to present an overview of the state of the art to identify examples of sensor data fusion techniques that can be applied to the sensors available in mobile devices aiming to identify activities of daily living (ADLs). PMID:26848664

  19. Nowcasting Cloud Fields for U.S. Air Force Special Operations

    DTIC Science & Technology

    2017-03-01

    [Record excerpt garbled in extraction; the recoverable fragments state that an application of Bayes' Rule offers many advantages over Kernel Density Estimation (KDE) and other commonly used statistical post-processing methods, and that a statistical post-processing technique using Bayesian estimation is applied to train the system from past cloud reflectance and probability-of-cloud data. Listed subject terms: nowcasting, low cloud forecasting, cloud reflectance, ISR, Bayesian estimation, statistical post-processing, machine learning.]

  20. Proceedings ICASS 2017

    NASA Astrophysics Data System (ADS)

    Fu, Qiang; Schaaf, Peter

    2018-07-01

    This special issue of the high-impact international peer-reviewed journal Applied Surface Science represents the proceedings of the 2nd International Conference on Applied Surface Science (ICASS), held 12-16 June 2017 in Dalian, China. The conference provided a forum for researchers in all areas of applied surface science to present their work. The main topics of the conference are in line with the most popular areas of research reported in Applied Surface Science. Thus, this issue includes current research on the role and use of surfaces in chemical and physical processes, related to catalysis, electrochemistry, surface engineering and functionalization, biointerfaces, semiconductors, 2D-layered materials, surface nanotechnology, energy, new/functional materials and nanotechnology. The various techniques and characterization methods are also discussed. Scientific research on the atomic and molecular level of material properties, investigated with specific surface analytical techniques and/or computational methods, is essential for any further progress in these fields.

  1. CCD filter and transform techniques for interference excision

    NASA Technical Reports Server (NTRS)

    Borsuk, G. M.; Dewitt, R. N.

    1976-01-01

    The theoretical and some experimental results of a study aimed at applying CCD filter and transform techniques to the problem of interference excision within communications channels were presented. Adaptive noise (interference) suppression was achieved by the modification of received signals such that they were orthogonal to the recently measured noise field. CCD techniques were examined to develop real-time noise excision processing. They were recursive filters, circulating filter banks, transversal filter banks, an optical implementation of the chirp Z transform, and a CCD analog FFT.
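
    The transform-domain excision idea behind the chirp-Z and FFT structures above can be sketched digitally: transform a signal block, null the bins whose power stands far above the background, and invert. The 12 dB threshold over the median is an assumption of this sketch.

      import numpy as np

      def excise_narrowband(x, threshold_db=12.0):
          X = np.fft.fft(x)
          p_db = 10.0 * np.log10(np.abs(X) ** 2 + 1e-12)
          # Null bins dominated by narrowband interference.
          X[p_db > np.median(p_db) + threshold_db] = 0.0
          return np.fft.ifft(X)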

  2. Pattern-recognition techniques applied to performance monitoring of the DSS 13 34-meter antenna control assembly

    NASA Technical Reports Server (NTRS)

    Mellstrom, J. A.; Smyth, P.

    1991-01-01

    The results of applying pattern recognition techniques to diagnose fault conditions in the pointing system of one of the Deep Space network's large antennas, the DSS 13 34-meter structure, are discussed. A previous article described an experiment whereby a neural network technique was used to identify fault classes by using data obtained from a simulation model of the Deep Space Network (DSN) 70-meter antenna system. Described here is the extension of these classification techniques to the analysis of real data from the field. The general architecture and philosophy of an autonomous monitoring paradigm is described and classification results are discussed and analyzed in this context. Key features of this approach include a probabilistic time-varying context model, the effective integration of signal processing and system identification techniques with pattern recognition algorithms, and the ability to calibrate the system given limited amounts of training data. Reported here are recognition accuracies in the 97 to 98 percent range for the particular fault classes included in the experiments.

  3. X-31 aerodynamic characteristics determined from flight data

    NASA Technical Reports Server (NTRS)

    Kokolios, Alex

    1993-01-01

    The lateral aerodynamic characteristics of the X-31 were determined at angles of attack ranging from 20 to 45 deg. Estimates of the lateral stability and control parameters were obtained by applying two parameter estimation techniques, linear regression, and the extended Kalman filter to flight test data. An attempt to apply maximum likelihood to extract parameters from the flight data was also made but failed for the reasons presented. An overview of the System Identification process is given. The overview includes a listing of the more important properties of all three estimation techniques that were applied to the data. A comparison is given of results obtained from flight test data and wind tunnel data for four important lateral parameters. Finally, future research to be conducted in this area is discussed.
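
    Of the estimation techniques listed, linear regression is the most compact to sketch: each lateral coefficient time history is fit as a linear function of the measured states and control deflections. The model structure below is a generic lateral model for illustration, not the X-31 formulation used in the study.

      import numpy as np

      def fit_lateral_derivatives(beta, p, r, da, dr, cl):
          # Regress the rolling-moment coefficient history cl on sideslip,
          # roll/yaw rates, and aileron/rudder deflections (plus a bias).
          X = np.column_stack([beta, p, r, da, dr, np.ones_like(beta)])
          coef, *_ = np.linalg.lstsq(X, cl, rcond=None)
          return coef   # [Cl_beta, Cl_p, Cl_r, Cl_da, Cl_dr, Cl_0]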

  4. The application of machine learning techniques in the clinical drug therapy.

    PubMed

    Meng, Huan-Yu; Jin, Wan-Lin; Yan, Cheng-Kai; Yang, Huan

    2018-05-25

    The development of a novel drug is an extremely complicated process that includes target identification, design and manufacture, and proper therapy with the novel drug, as well as drug dose selection, drug efficacy evaluation, and adverse drug reaction control. Because of limited resources, high costs, long duration, and a low hit-to-lead ratio, and with the development of pharmacogenetics and computer technology, machine learning techniques have assisted novel drug development and have gradually received more attention from researchers. According to current research, machine learning techniques are widely applied in the discovery of new drugs and novel drug targets, decisions surrounding proper therapy and drug dose, and the prediction of drug efficacy and adverse drug reactions. In this article, we discuss the history, workflow, and advantages and disadvantages of machine learning techniques in the processes mentioned above. Although the advantages of machine learning techniques are fairly obvious, their application is currently limited. With further research, the application of machine learning techniques in drug development could become much more widespread and could potentially be one of the major methods used in drug development.

  5. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    NASA Astrophysics Data System (ADS)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits, including accurate prediction of strain levels in a part, of the presence of wrinkles, and of pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform have applied, and continue to apply, to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  6. Osteochondral integration of multiply incised pure cartilage allograft: repair method of focal chondral defects in a porcine model.

    PubMed

    Bardos, Tamas; Farkas, Boglarka; Mezes, Beata; Vancsodi, Jozsef; Kvell, Krisztian; Czompoly, Tamas; Nemeth, Peter; Bellyei, Arpad; Illes, Tamas

    2009-11-01

    A focal cartilage lesion has limited capacity to heal, and the repair modalities used at present are still unable to provide a universal solution. Pure cartilage graft implantation appears to be a simple option, but it has not been applied widely as cartilage will not reattach easily to the subchondral bone. We used a multiple-incision technique (processed chondrograft) to increase cartilage graft surface. We hypothesized that pure cartilage graft with augmented osteochondral fusion capacity may be used for cartilage repair and we compared this method with other repair techniques. Controlled laboratory study. Full-thickness focal cartilage defects were created on the medial femoral condyle of 9-month-old pigs; defects were repaired using various methods including bone marrow stimulation, autologous chondrocyte implantation, and processed chondrograft. After the repair, at weeks 6 and 24, macroscopic and histologic evaluation was carried out. Compared with other methods, processed chondrograft was found to be similarly effective in cartilage repair. Defects without repair and defects treated with bone marrow stimulation appeared slightly irregular with fibrocartilage filling. Autologous chondrocyte implantation produced hyalinelike cartilage, although its cellular organization was distinguishable from the surrounding articular cartilage. Processed chondrograft demonstrated good osteochondral integration, and the resulting tissue appeared to be hyaline cartilage. The applied cartilage surface processing method allows acceptable osteochondral integration, and the repair tissue appears to have good macroscopic and histologic characteristics. If further studies confirm its efficacy, this technique could be considered for human application in the future.

  7. The Taguchi Method Application to Improve the Quality of a Sustainable Process

    NASA Astrophysics Data System (ADS)

    Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.

    2018-06-01

    Taguchi’s method has long been used to improve the quality of analyzed processes and products. This research addresses an unusual situation, namely the modeling of selected technical parameters in a process intended to be sustainable, improving process quality and ensuring quality by means of an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of the interaction between the principles of agricultural sustainability and the application of Taguchi’s method. The experimental method used in this practical study combines engineering techniques with experimental statistical modeling to achieve rapid improvement of quality costs, in effect seeking optimization of existing processes and their main technical parameters. The paper is a technical study that promotes a technical experiment using the Taguchi method, considered to be an effective method since it allows rapid achievement of 70 to 90% of the desired optimization of the technical parameters. The missing 10 to 30 percent can be obtained with one or two complementary experiments, limited to the 2 to 4 technical parameters considered to be the most influential. Applying Taguchi’s method, in engineering and beyond, allowed the simultaneous study in the same experiment of the influence factors considered most important, in different combinations, and at the same time the determination of each factor’s contribution.
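
    The Taguchi analysis referred to above converts each experimental run into a signal-to-noise ratio before estimating factor effects; the two standard S/N definitions are sketched below.

      import numpy as np

      def sn_larger_is_better(y):
          # S/N = -10 log10( mean(1 / y^2) )
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y ** 2))

      def sn_smaller_is_better(y):
          # S/N = -10 log10( mean(y^2) )
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(y ** 2))

    The main effect of a factor level is then the mean S/N over the orthogonal-array runs at that level, and the level with the highest mean S/N is retained.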

  8. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques.

    PubMed

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Palumbo, Davide; De Finis, Rosa; Galietti, Umberto

    2017-10-11

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages with respect to traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used for studying and optimizing the FSW process, applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated, and two thermal indexes, the maximum temperature and the heating rate of the material, correlated to the frictional power input, were investigated for different configurations of the process parameters (the tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs and destructive tensile tests) were used to support, in a quantitative way, the analysis of the quality of the welded joints. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength.

  9. Phase-measuring laser holographic interferometer for use in high speed flows

    NASA Astrophysics Data System (ADS)

    Yanta, William J.; Spring, W. Charles, III; Gross, Kimberly Uhrich; McArthur, J. Craig

    Phase-measurement techniques have been applied to a dual-plate laser holographic interferometer (LHI). This interferometer has been used to determine the flowfield densities in a variety of two-dimensional and axisymmetric flows. In particular, LHI has been applied in three different experiments: flowfield measurements inside a two-dimensional scramjet inlet, flow over a blunt cone, and flow over an indented nose shape. Comparisons of experimentally determined densities with computational results indicate that, when phase-measurement techniques are used in conjunction with state-of-the-art image-processing instrumentation, holographic interferometry can be a diagnostic tool with high resolution, high accuracy, and rapid data retrieval.
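
    The record does not state which phase-measurement algorithm was used; the common four-step variant, with interferograms shifted in phase by 90 degrees, recovers the wrapped phase in closed form. After unwrapping, the phase map is proportional to the path-integrated density change via the Gladstone-Dale relation, which is how flowfield densities are recovered.

      import numpy as np

      def four_step_phase(i0, i90, i180, i270):
          # I_k = A + B cos(phi + k*pi/2)  =>  phi = atan2(I270-I90, I0-I180)
          return np.arctan2(i270 - i90, i0 - i180)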

  10. Using decision-tree classifier systems to extract knowledge from databases

    NASA Technical Reports Server (NTRS)

    St.clair, D. C.; Sabharwal, C. L.; Hacke, Keith; Bond, W. E.

    1990-01-01

    One difficulty in applying artificial intelligence techniques to the solution of real world problems is that the development and maintenance of many AI systems, such as those used in diagnostics, require large amounts of human resources. At the same time, databases frequently exist which contain information about the process(es) of interest. Recently, efforts to reduce development and maintenance costs of AI systems have focused on using machine learning techniques to extract knowledge from existing databases. Research is described in the area of knowledge extraction using a class of machine learning techniques called decision-tree classifier systems. Results of this research suggest ways of performing knowledge extraction which may be applied in numerous situations. In addition, a measurement called the concept strength metric (CSM) is described which can be used to determine how well the resulting decision tree can differentiate between the concepts it has learned. The CSM can be used to determine whether or not additional knowledge needs to be extracted from the database. An experiment involving real world data is presented to illustrate the concepts described.

  11. Study of photon correlation techniques for processing of laser velocimeter signals

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1977-01-01

    The objective was to provide the theory and a system design for a new type of photon counting processor for low-level dual-scatter laser velocimeter (LV) signals, capable of both first order measurements of mean flow and turbulence intensity and second order time statistics: cross correlation, auto correlation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.
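
    A minimal simulation in the spirit of the Poisson-process modeling described above (an illustrative sketch, not the report's processor design): generate photon counts whose rate is modulated at a hypothetical Doppler frequency, then estimate the count autocorrelation:

        import numpy as np

        rng = np.random.default_rng(1)
        dt, n = 1e-6, 200_000                 # 1-microsecond counting bins
        t = np.arange(n) * dt
        f_doppler = 50e3                      # assumed Doppler frequency (Hz)
        rate = 5e4 * (1.0 + 0.8 * np.cos(2 * np.pi * f_doppler * t))
        counts = rng.poisson(rate * dt)       # low-level photon counts per bin

        def autocorr(x, max_lag):
            x = x - x.mean()
            return np.array([x[:x.size - k] @ x[k:] for k in range(max_lag)])

        acf = autocorr(counts, 100)           # oscillates at the Doppler period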

  12. An application of computer image-processing and filmy replica technique to the copper electroplating method of stress analysis

    NASA Astrophysics Data System (ADS)

    Sugiura, M.; Seika, M.

    1994-02-01

    In this study, a new technique to measure the density of slip-bands automatically is developed: a TV image of the slip-bands observed through a microscope is directly processed by an image-processing system using a personal computer, and an accurate value of the density of slip-bands is measured quickly. When measuring the local stresses in large machine parts with the copper plating foil, direct observation of slip-bands through an optical microscope is difficult. In this study, to approximate direct microscopic observation of slip-bands in the foil attached to a large-sized specimen, the replica method using a plastic film of acetyl cellulose is applied to replicate the slip-bands in the attached foil.

  13. Arc-Welding Spectroscopic Monitoring based on Feature Selection and Neural Networks.

    PubMed

    Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M

    2008-10-21

    A new spectral processing technique designed for application in the on-line detection and classification of arc-welding defects is presented in this paper. A noninvasive fiber sensor embedded within a TIG torch collects the plasma radiation originated during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which will be demonstrated to provide an efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in previous works, giving rise to an improvement in the performance of the monitoring system.
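
    A minimal sketch of the two-stage scheme (feature selection feeding a neural-network classifier) on synthetic spectra; the paper's specific compression and classification algorithms may differ:

        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(2)
        X = rng.normal(size=(300, 512))       # hypothetical plasma spectra
        y = rng.integers(0, 2, size=300)      # 0 = sound weld, 1 = defect
        X[y == 1, 100] += 2.0                 # a defect-sensitive emission band

        model = make_pipeline(SelectKBest(f_classif, k=10),   # band selection
                              MLPClassifier(hidden_layer_sizes=(20,),
                                            max_iter=1000, random_state=0))
        model.fit(X, y)
        print(model.score(X, y))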

  14. Evaluating Quality of Decision-Making Processes in Medicines' Development, Regulatory Review, and Health Technology Assessment: A Systematic Review of the Literature.

    PubMed

    Bujar, Magdalena; McAuslane, Neil; Walker, Stuart R; Salek, Sam

    2017-01-01

    Introduction: Although pharmaceutical companies, regulatory authorities, and health technology assessment (HTA) agencies have been increasingly using decision-making frameworks, it is not certain whether these enable better quality decision making. This could be addressed by formally evaluating the quality of the decision-making process within those organizations. The aim of this literature review was to identify current techniques (tools, questionnaires, surveys, and studies) for measuring the quality of the decision-making process across the three stakeholders. Methods: Using MEDLINE, Web of Knowledge, and other Internet-based search engines, a literature review was performed to systematically identify techniques for assessing the quality of decision making in medicines development, regulatory review, and HTA. A structured search was applied using key words and a secondary review was carried out. In addition, the measurement properties of each technique were assessed and compared. Ten Quality Decision-Making Practices (QDMPs) developed previously were then used as a framework for the evaluation of the techniques identified in the review. Due to the variation in the studies identified, meta-analysis was inappropriate. Results: This review identified 13 techniques, of which 7 were developed specifically to assess decision making in medicines development, regulatory review, or HTA; 2 examined corporate decision making; and 4 addressed general decision making. Regarding how closely each technique conformed to the 10 QDMPs, the 13 techniques assessed a median of 6 QDMPs, with a mode of 3 QDMPs. Only 2 techniques evaluated all 10 QDMPs, namely the Organizational IQ and the Quality of Decision Making Orientation Scheme (QoDoS), of which only one, the QoDoS, could be applied to assess the decision making of both individuals and organizations, and it possessed the generalizability to capture issues relevant to companies as well as regulatory authorities. Conclusion: This review confirmed a general paucity of research in this area, particularly regarding the development and systematic application of techniques for evaluating the quality of decision making, with no consensus around a gold standard. This review identified the QoDoS as the most promising available technique for assessing decision making in the lifecycle of medicines, and the next steps would be to further test its validity, sensitivity, and reliability.

  15. Trichotomous processes in early memory development, aging, and neurocognitive impairment: a unified theory.

    PubMed

    Brainerd, C J; Reyna, V F; Howe, M L

    2009-10-01

    One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.

  16. Evaluation of the Technical Adequacy of Three Methods for Identifying Specific Learning Disabilities Based on Cognitive Discrepancies

    ERIC Educational Resources Information Center

    Stuebing, Karla K.; Fletcher, Jack M.; Branum-Martin, Lee; Francis, David J.

    2012-01-01

    This study used simulation techniques to evaluate the technical adequacy of three methods for the identification of specific learning disabilities via patterns of strengths and weaknesses in cognitive processing. Latent and observed data were generated and the decision-making process of each method was applied to assess concordance in…

  17. The Effects of Jigsaw Technique Based on Cooperative Learning on Prospective Science Teachers' Science Process Skill

    ERIC Educational Resources Information Center

    Karacop, Ataman; Diken, Emine Hatun

    2017-01-01

    The purpose of this study is to investigate the effects of laboratory approach based on jigsaw method with cooperative learning and confirmatory laboratory approach on university students' cognitive process development in Science teaching laboratory applications, and to determine the opinions of the students on applied laboratory methods. The…

  18. The effects of solar incidence angle over digital processing of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    A technique to extract the topographic modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique extracts from the original LANDSAT data the component resulting from target reflectance. Considering that the contribution of topographic modulation to the pixel information varies with solar incidence angle, the results of this digital processing technique will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle on the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected to implement the digital processing at the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique was more appropriate for MSS data acquired under higher Sun elevation angles. Applying the topographic modulation technique to low Sun elevation angle data lessens rather than enhances topography.

  19. Proposal of Heuristic Algorithm for Scheduling of Print Process in Auto Parts Supplier

    NASA Astrophysics Data System (ADS)

    Matsumoto, Shimpei; Okuhara, Koji; Ueno, Nobuyuki; Ishii, Hiroaki

    We are interested in the print process within the manufacturing operations of an auto parts supplier as a real-world problem. The purpose of this research is to apply our scheduling technique, developed at the university, to the actual print process in a mass customization environment. Rationalization of the print process depends on the lot sizing. The manufacturing lead time of the print process is long, and in the present method production relies on workers' experience and intuition. The construction of an efficient production system is an urgent problem. Therefore, in this paper, in order to shorten the entire manufacturing lead time and to reduce stock, we re-examine the usual heuristic lot-sizing rule and propose an improved method that can plan a more efficient schedule.
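
    Since the abstract does not give the authors' rule, here is a classic lot-sizing heuristic (Silver-Meal) as a stand-in sketch: extend a lot over future periods while the average cost per period keeps falling:

        def silver_meal(demand, setup, hold):
            """Greedy lot sizing: setup cost per lot, linear holding cost."""
            lots, t, n = [], 0, len(demand)
            while t < n:
                cum, k = setup, 1                 # lot covers periods t..t+k-1
                avg = cum / k
                while t + k < n:
                    trial = cum + hold * k * demand[t + k]
                    if trial / (k + 1) >= avg:    # average cost rose: stop
                        break
                    cum, k = trial, k + 1
                    avg = cum / k
                lots.append(sum(demand[t:t + k]))
                t += k
            return lots

        print(silver_meal([20, 50, 10, 50, 50, 10], setup=100, hold=1))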

  20. Novel high power impulse magnetron sputtering enhanced by an auxiliary electrical field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chunwei; Tian, Xiubo (State Key Laboratory of Advanced Welding and Joining, Harbin Institute of Technology, Harbin 150001)

    2016-08-15

    The high power impulse magnetron sputtering (HIPIMS) technique is a novel highly ionized physical vapor deposition method with a high application potential. However, the electron utilization efficiency during sputtering is rather low and the metal particle ionization rate needs to be considerably improved to allow for a large-scale industrial application. Therefore, we enhanced the HIPIMS technique by simultaneously applying an electric field (EF-HIPIMS). The effect of the electric field on the discharge process was studied using a current sensor and an optical emission spectrometer. Furthermore, the spatial distribution of the electric potential and electric field during the EF-HIPIMS process was simulated using the ANSYS software. The results indicate that a higher electron utilization efficiency and a higher particle ionization rate could be achieved. The auxiliary anode obviously changed the distribution of the electric potential and the electric field in the discharge region, which increased the plasma density and enhanced the degree of ionization of the vanadium and argon gas. Vanadium films were deposited to further compare both techniques, and the morphology of the prepared films was investigated by scanning electron microscopy. The films showed a smaller crystal grain size and a denser growth structure when the electric field was applied during the discharge process.

  1. Mechanism of amino acid interaction with silicon nitride surface during chemical mechanical planarization

    NASA Astrophysics Data System (ADS)

    America, William George

    Chemical-Mechanical Planarization (CMP) has become an essential technology for making modern semiconductor devices. The technique was originally applied to overcome the depth-of-focus limitations of lithography tools during pattern development of metal and dielectric films. As the features of semiconductor devices became smaller, the lithographic process shifted to shorter exposure wavelengths and the usable depth of focus became smaller. The topography differences on the wafer's surface from all of the previous processing steps became greater than the exposure tools could properly project. CMP helped solve this problem by bringing the features of the wafer surface to the same plane. As semiconductor fabrication technology progressed further, CMP was applied to other areas of the process, including shallow trench isolation and metal line Damascene processing. In its simplest application, CMP preferentially polishes features projecting above the average surface. These projections experience more work and are polished faster. Given sufficient time, the surface becomes essentially flat on a micro-scale, the lithographic projection tool has a single plane onto which to focus, the pattern is properly and uniformly exposed, and subsequent reactive ion etching (RIE) steps can be executed. The technique was initially applied to later steps in the wafer processing scheme to render a new flat surface at each metal layer. Building on this success, CMP has been applied to a broad range of steps in wafer processing, particularly where surface topography warrants it and when RIE of dielectric or metallic films is not practical. CMP has seen its greatest application in semiconductor logic and memory devices and, most recently, in Damascene processing for copper lines and shallow trench isolation. Pattern-dependent CMP issues are explored in this thesis, primarily as they pertain to shallow trench isolation CMP coupled with a highly selective slurry chemistry.

  2. Applicability of different onboard routing and processing techniques to mobile satellite systems

    NASA Technical Reports Server (NTRS)

    Craig, A. D.; Marston, P. C.; Bakken, P. M.; Vernucci, A.; Benedicto, J.

    1993-01-01

    The paper summarizes a study contract recently undertaken for ESA. The study compared the effectiveness of several processing architectures applied to multiple beam, geostationary global and European regional missions. The paper discusses architectures based on transparent SS-FDMA analog, transparent DSP and regenerative processing. Quantitative comparisons are presented and general conclusions are given with respect to suitability of the architectures to different mission requirements.

  3. Endoscopic ultrasound-guided fine-needle aspiration with liquid-based cytologic preparation in the diagnosis of primary pancreatic lymphoma.

    PubMed

    Rossi, Esther Diana; Larghi, Alberto; Verna, Elizabeth C; Martini, Maurizio; Galasso, Domenico; Carnuccio, Antonella; Larocca, Luigi Maria; Costamagna, Guido; Fadda, Guido

    2010-11-01

    The diagnosis and subtyping of lymphoma on specimens collected by endoscopic ultrasound fine-needle aspiration (EUS-FNA) can be extremely difficult. When a cytopathologist is available for on-site evaluation, the diagnosis may be achieved by applying flow cytometric techniques. We describe our experience with immunocytochemistry (ICC) and molecular biology studies applied to EUS-FNA specimens processed with a liquid-based cytologic (LBC) preparation for the diagnosis of primary pancreatic lymphoma (PPL). Three patients with a pancreatic mass underwent EUS-FNA. The collected specimens were processed with the ThinPrep method for the cytologic diagnosis and any additional investigations. A morphologic picture consistent with PPL was found on the LBC specimens of the 3 patients. Subsequent ICC and molecular biology studies for immunoglobulin heavy chain gene rearrangement established the diagnosis of pancreatic large B-cell non-Hodgkin lymphoma in 2 patients and a non-Hodgkin lymphoma with plasmoblastic/immunoblastic differentiation in the remaining one. An LBC preparation can be used to diagnose and subtype PPL by applying ICC and molecular biology techniques to specimens collected with EUS-FNA. This method can be an additional processing option for EUS-FNA specimens in centers where on-site cytopathologist expertise is not available.

  4. Pre-processing ambient noise cross-correlations with equalizing the covariance matrix eigenspectrum

    NASA Astrophysics Data System (ADS)

    Seydoux, Léonard; de Rosny, Julien; Shapiro, Nikolai M.

    2017-09-01

    Passive imaging techniques from ambient seismic noise require a nearly isotropic distribution of the noise sources in order to ensure reliable traveltime measurements between seismic stations. However, real ambient seismic noise often only partially fulfils this condition. It is generated in preferential areas (in the deep ocean or near continental shores), and some highly coherent pulse-like signals, such as those generated by earthquakes, may be present in the data. Several pre-processing techniques have been developed in order to attenuate the directional and deterministic behaviour of this real ambient noise. Most of them are applied to individual seismograms before the cross-correlation computation. The most widely used techniques are spectral whitening and temporal smoothing of the individual seismic traces. We here propose an additional pre-processing step to be used together with the classical ones, based on the spatial analysis of the seismic wavefield. We compute the cross-spectra between all available station pairs in the spectral domain, leading to the data covariance matrix. We apply a one-bit normalization to the covariance matrix eigenspectrum before extracting the cross-correlations in the time domain. The efficiency of the method is shown with several numerical tests. We apply the method to the data collected by the USArray when the M8.8 Maule earthquake occurred on 2010 February 27. The method shows a clear improvement over the classical equalization in attenuating the highly energetic and coherent waves incoming from the earthquake, and allows reliable traveltime measurements even in the presence of the earthquake.
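
    For context, the classical per-trace pre-processing mentioned above can be sketched in a few lines (spectral whitening plus one-bit temporal normalization); the covariance eigenspectrum equalization itself acts jointly on all station pairs and is not reproduced here:

        import numpy as np

        def whiten(trace, eps=1e-10):
            """Flatten the amplitude spectrum, keeping only the phase."""
            spec = np.fft.rfft(trace)
            return np.fft.irfft(spec / (np.abs(spec) + eps), n=trace.size)

        def one_bit(trace):
            """Keep only the sign of each sample."""
            return np.sign(trace)

        rng = np.random.default_rng(3)
        trace = rng.normal(size=4096)       # stand-in noise record
        pre = one_bit(whiten(trace))        # fed to the cross-correlation stage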

  5. An Analog Macroscopic Technique for Studying Molecular Hydrodynamic Processes in Dense Gases and Liquids.

    PubMed

    Dahlberg, Jerry; Tkacik, Peter T; Mullany, Brigid; Fleischhauer, Eric; Shahinian, Hossein; Azimi, Farzad; Navare, Jayesh; Owen, Spencer; Bisel, Tucker; Martin, Tony; Sholar, Jodie; Keanini, Russell G

    2017-12-04

    An analog, macroscopic method for studying molecular-scale hydrodynamic processes in dense gases and liquids is described. The technique applies a standard fluid dynamic diagnostic, particle image velocimetry (PIV), to measure: i) velocities of individual particles (grains), extant on short, grain-collision time-scales, ii) velocities of systems of particles, on both short collision-time- and long, continuum-flow-time-scales, iii) collective hydrodynamic modes known to exist in dense molecular fluids, and iv) short- and long-time-scale velocity autocorrelation functions, central to understanding particle-scale dynamics in strongly interacting, dense fluid systems. The basic system is composed of an imaging system, light source, vibrational sensors, vibrational system with a known media, and PIV and analysis software. Required experimental measurements and an outline of the theoretical tools needed when using the analog technique to study molecular-scale hydrodynamic processes are highlighted. The proposed technique provides a relatively straightforward alternative to photonic and neutron beam scattering methods traditionally used in molecular hydrodynamic studies.

  6. Bidirectional light-scattering image processing method for high-concentration jet sprays

    NASA Astrophysics Data System (ADS)

    Shimizu, I.; Emori, Y.; Yang, W.-J.; Shimoda, M.; Suzuki, T.

    1985-01-01

    In order to study the distributions of droplet size and volume density in high-concentration jet sprays, a new technique is developed which combines the forward and backward light scattering method with an image processing method. A pulsed ruby laser is used as the light source. The Mie scattering theory is applied to the results obtained from image processing of the scattering photographs. The time history is obtained for the droplet size and volume density distributions, and the method is demonstrated on diesel fuel sprays under various injection conditions. The validity of the technique is verified by the good agreement between the injected fuel volume distributions obtained by the present method and by injection rate measurements.

  7. Trapped rubber processing for advanced composites

    NASA Technical Reports Server (NTRS)

    Marra, P. J.

    1976-01-01

    Trapped rubber processing is a molding technique for composites in which precast silicone rubber is placed within a closed cavity, where it thermally expands against the composite's surface supported by the vessel walls. The method has been applied by the Douglas Aircraft Company, under contract to NASA-Langley, to the design and fabrication of 10 DC-10 graphite/epoxy upper aft rudder assemblies. A three-bay development tool form mold die has been designed and manufactured, and tooling parameters have been established. Fabrication procedures include graphite layup, assembly of details in the tool, and a cure cycle. The technique has made possible the cocured fabrication of complex primary box structures that would otherwise be impracticable with standard composite material processes.

  8. Extractive Fermentation of Sugarcane Juice to Produce High Yield and Productivity of Bioethanol

    NASA Astrophysics Data System (ADS)

    Rofiqah, U.; Widjaja, T.; Altway, A.; Bramantyo, A.

    2017-04-01

    Ethanol production by batch fermentation is a simple and widely used process. However, batch fermentation produces ethanol with low yield and productivity due to the accumulation of ethanol, which poisons the microorganisms in the fermenter. The extractive fermentation technique is applied to solve the problem of microorganism inhibition by ethanol, and it can produce ethanol with high yield and productivity. In this process, the raffinate still contains much sugar because conversion in the fermentation step is incomplete. Thus, to enhance ethanol yield and productivity, a recycle system is applied by returning the raffinate from the extraction step to the fermentation step. This raffinate also contains ethanol, which would inhibit the performance of the microorganisms producing ethanol during fermentation. Therefore, this study aims to find the optimum solvent-to-broth ratio (S:B) and recycle-to-fresh-feed ratio (R:F) entering the fermenter to produce high yield and productivity. The research was carried out experimentally: sugarcane juice was fermented using a Zymomonas mobilis mutant, the fermentation broth was extracted using amyl alcohol, and the process was integrated with the recycle system by varying the recycle ratio. The highest yield and productivity are 22.3901% and 103.115 g/(L·h), respectively, obtained in a process that uses a recycle-to-fresh-feed ratio (R:F) of 50:50 and a solvent-to-broth ratio of 1.

  9. [Study of the appearance difference of lower complete denture between functional and anatomic impression techniques].

    PubMed

    Zhong, Qun; Wu, Xue-yin; Shen, Qing-yi; Shen, Qing-ping

    2012-04-01

    To compare the differences in the oblique external ridge, oblique internal ridge and alveolar process crest of lower complete denture bases made with functional and anatomic impression techniques, fifteen patients were treated with two kinds of complete dentures fabricated with the functional and anatomic impression techniques, respectively. A 3D laser scanner was used to scan the three-dimensional model of the denture base, and the differences in surface structure between the two techniques at the alveolar process crest and the external and internal oblique ridges were analyzed using a paired t test with the SPSS 12.0 software package. Between the two techniques, there were significant differences in the areas of the internal and external oblique ridges (P<0.01); there was no significant difference in the main support areas (P>0.05). The results explain why there is less tenderness when the functional impression technique is applied. The differences measured also indicate that sufficient buffering should be made in the external and internal oblique ridge areas in clinical work.

  10. A fast and fully automatic registration approach based on point features for multi-source remote-sensing images

    NASA Astrophysics Data System (ADS)

    Yu, Le; Zhang, Dengrong; Holden, Eun-Jung

    2008-07-01

    Automatic registration of multi-source remote-sensing images is a difficult task, as it must deal with the varying illuminations and resolutions of the images, different perspectives, and local deformations within the images. This paper proposes a fully automatic and fast non-rigid image registration technique that addresses those issues. The proposed technique performs a pre-registration process that coarsely aligns the input image to the reference image by automatically detecting matching points using the scale invariant feature transform (SIFT) method and an affine transformation model. Once the coarse registration is completed, it performs a fine-scale registration process based on a piecewise linear transformation technique using feature points detected by the Harris corner detector. The fine registration first finds, in succession, tie point pairs between the input and the reference image by detecting Harris corners and applying a cross-matching strategy based on a wavelet pyramid for fast searching. Tie point pairs with large errors are pruned by an error-checking step. The input image is then rectified using triangulated irregular networks (TINs) to deal with irregular local deformations caused by the fluctuation of the terrain. For each triangular facet of the TIN, affine transformations are estimated and applied for rectification. Experiments with Quickbird, SPOT5, SPOT4, and TM remote-sensing images of the Hangzhou area in China demonstrate the efficiency and the accuracy of the proposed technique for multi-source remote-sensing image registration.
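
    A hedged sketch of the coarse pre-registration stage only (SIFT matches plus a RANSAC-fitted affine model, using OpenCV); the fine TIN-based piecewise-linear stage is omitted, and the file names are placeholders:

        import cv2
        import numpy as np

        ref = cv2.imread("reference.tif", cv2.IMREAD_GRAYSCALE)
        inp = cv2.imread("input.tif", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        k1, d1 = sift.detectAndCompute(ref, None)
        k2, d2 = sift.detectAndCompute(inp, None)

        # ratio-test matching, then a robust affine estimate
        pairs = cv2.BFMatcher().knnMatch(d2, d1, k=2)
        good = [m for m, n in pairs if m.distance < 0.75 * n.distance]
        src = np.float32([k2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([k1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        A, _ = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)

        coarse = cv2.warpAffine(inp, A, (ref.shape[1], ref.shape[0]))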

  11. Decision science: a scientific approach to enhance public health budgeting.

    PubMed

    Honoré, Peggy A; Fos, Peter J; Smith, Torney; Riley, Michael; Kramarz, Kim

    2010-01-01

    The allocation of resources for public health programming is a complicated and daunting responsibility. Financial decision-making processes within public health agencies are especially difficult when not supported with techniques for prioritizing and ranking alternatives. This article presents a case study of a decision analysis software model that was applied to the process of identifying funding priorities for public health services in the Spokane Regional Health District. Results on the use of this decision support system provide insights into how decision science models, which have been used for decades in business and industry, can be successfully applied to public health budgeting as a means of strengthening agency financial management processes.

  12. Using deliberative techniques to engage the community in policy development.

    PubMed

    Gregory, Judy; Hartz-Karp, Janette; Watson, Rebecca

    2008-07-16

    This paper examines work in deliberative approaches to community engagement used in Western Australia by the Department of Planning and Infrastructure and other planning and infrastructure agencies between 2001 and 2005, and considers whether the techniques could be applied to the development of health policy in Australia. Deliberative processes were used in WA to address specific planning and infrastructure problems. Using deliberative techniques, community participants contributed to joint decision making and policy development. Outcomes from deliberative processes were seriously considered by the Minister and used to influence policy decisions. In many cases, the recommendations generated through deliberative processes were fully adopted by the Minister. The experiences in WA demonstrate that deliberative engagement processes can be successfully implemented by government and can be used to guide policy. The techniques can be adapted to suit the context and issues experienced by a portfolio, and the skills required to conduct deliberative processes can be fostered amongst the portfolio's staff. Health policy makers may be able to learn from the experiences in WA, and adopt approaches to community engagement that allow for informed deliberation and debate in the community about the future of Australia's health system.

  13. Real-time image processing for passive mmW imagery

    NASA Astrophysics Data System (ADS)

    Kozacik, Stephen; Paolini, Aaron; Bonnett, James; Harrity, Charles; Mackrides, Daniel; Dillon, Thomas E.; Martin, Richard D.; Schuetz, Christopher A.; Kelmelis, Eric; Prather, Dennis W.

    2015-05-01

    The transmission characteristics of millimeter waves (mmWs) make them suitable for many applications in defense and security, from airport preflight scanning to penetrating degraded visual environments such as brownout or heavy fog. While the cold sky provides sufficient illumination for these images to be taken passively in outdoor scenarios, this utility comes at a cost; the diffraction limit of the longer wavelengths involved leads to lower resolution imagery compared to the visible or IR regimes, and the low power levels inherent to passive imagery allow the data to be more easily degraded by noise. Recent techniques leveraging optical upconversion have shown significant promise, but are still subject to fundamental limits in resolution and signal-to-noise ratio. To address these issues we have applied techniques developed for visible and IR imagery to decrease noise and increase resolution in mmW imagery. We have developed these techniques into fieldable software, making use of GPU platforms for real-time operation of computationally complex image processing algorithms. We present data from a passive, 77 GHz, distributed aperture, video-rate imaging platform captured during field tests at full video rate. These videos demonstrate the increase in situational awareness that can be gained through applying computational techniques in real-time without needing changes in detection hardware.

  14. Handwritten document age classification based on handwriting styles

    NASA Astrophysics Data System (ADS)

    Ramaiah, Chetan; Kumar, Gaurav; Govindaraju, Venu

    2012-01-01

    Handwriting styles are constantly changing over time. We approach the novel problem of estimating the approximate age of historical handwritten documents from their handwriting styles. Such a system has many applications in handwritten document processing engines, where specialized processing techniques can be applied based on the estimated age of the document. We propose to learn a distribution over styles across centuries using topic models and to apply a classifier over the learned weights in order to estimate the approximate age of the documents. We present a comparison of different distance metrics, such as the Euclidean distance and the Hellinger distance, within this application.
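
    The Hellinger distance mentioned above, applied to two topic-weight vectors, is a one-liner (a sketch with made-up weights):

        import numpy as np

        def hellinger(p, q):
            """Hellinger distance between two discrete distributions."""
            p, q = np.asarray(p, float), np.asarray(q, float)
            return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

        # hypothetical style-topic weights for two documents
        print(hellinger([0.7, 0.2, 0.1], [0.3, 0.4, 0.3]))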

  15. Evaluating Payments for Environmental Services: Methodological Challenges

    PubMed Central

    2016-01-01

    Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It reviews and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850

  16. Experiments with recursive estimation in astronomical image processing

    NASA Technical Reports Server (NTRS)

    Busko, I.

    1992-01-01

    Recursive estimation concepts have been applied to image enhancement problems since the 1970s. However, very few applications in the particular area of astronomical image processing are known. These concepts were derived, for 2-dimensional images, from the well-known theory of Kalman filtering in one dimension. The historical reasons for applying these techniques to digital images are related to the images' scanned nature, in which the temporal output of a scanner device can be processed on-line by techniques borrowed directly from 1-dimensional recursive signal analysis. However, recursive estimation has particular properties that make it attractive even in modern days, when large computer memories make the full scanned image available to the processor at any given time. One particularly important aspect is the ability of recursive techniques to deal with non-stationary phenomena, that is, phenomena whose statistical properties vary with time (or position in a 2-D image). Many image processing methods make underlying stationarity assumptions, either for the stochastic field being imaged, for the imaging system properties, or both. They will underperform, or even fail, when applied to images that deviate significantly from stationarity. Recursive methods, on the contrary, make it feasible to perform adaptive processing, that is, to process the image with a processor whose properties are tuned to the image's local statistical properties. Recursive estimation can be used to build estimates of images degraded by such phenomena as noise and blur. We show examples of recursive adaptive processing of astronomical images, using several local statistical properties to drive the adaptive processor, such as average signal intensity, signal-to-noise ratio, and the autocorrelation function. Software was developed under IRAF, and as such will be made available to interested users.
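
    A minimal sketch of the underlying idea (a scalar Kalman-style recursive estimator run along one scan line; the IRAF software described above is far more elaborate, and q and r here are assumed variances):

        import numpy as np

        def recursive_denoise(row, q=1e-3, r=1e-1):
            """Scalar Kalman filter over one image row.
            q: assumed process variance, r: assumed noise variance."""
            x, p = float(row[0]), 1.0      # state estimate and its variance
            out = np.empty(row.size)
            for i, z in enumerate(row):
                p += q                     # predict
                k = p / (p + r)            # gain adapts to current uncertainty
                x += k * (z - x)           # update with the new pixel
                p *= 1.0 - k
                out[i] = x
            return out

        rng = np.random.default_rng(4)
        row = np.sin(np.linspace(0, 3, 500)) + rng.normal(scale=0.3, size=500)
        smooth = recursive_denoise(row)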

  17. Verification and extension of the MBL technique for photo resist pattern shape measurement

    NASA Astrophysics Data System (ADS)

    Isawa, Miki; Tanaka, Maki; Kazumi, Hideyuki; Shishido, Chie; Hamamatsu, Akira; Hasegawa, Norio; De Bisschop, Peter; Laidler, David; Leray, Philippe; Cheng, Shaunee

    2011-03-01

    In order to achieve pattern shape measurement with CD-SEM, the Model Based Library (MBL) technique is under development. In this study, several libraries, each consisting of a double-trapezoid model placed in an optimum layout, were used to measure various layout patterns. In order to verify the accuracy of the MBL photoresist pattern shape measurement, CD-AFM measurements were carried out as a reference metrology. The two sets of results were compared, confirming a linear correlation between them. After that, to expand the application field of the MBL technique, it was applied to end-of-line (EOL) shape measurement to show its capability. Finally, we confirmed the possibility that the MBL technique could be applied to more localized shape measurements such as hot-spot analysis.

  18. Analysis techniques for tracer studies of oxidation. M. S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Basu, S. N.

    1984-01-01

    Analysis techniques to obtain quantitative diffusion data from tracer concentration profiles were developed. Mass balance ideas were applied to determine the mechanism of oxide growth and to separate the fractions of inward and outward growth of oxide scales. The process of inward oxygen diffusion with exchange was theoretically modelled, and the effects of lattice diffusivity, grain boundary diffusivity and grain size on the tracer concentration profile were studied. The development of the tracer concentration profile in a growing oxide scale was simulated. The double oxidation technique was applied to a FeCrAl-Zr alloy using O-18 as a tracer, and SIMS was used to obtain the tracer concentration profile. The formation of lacey oxide on the alloy was discussed. Careful consideration was given to the quality of data required to obtain quantitative information.

  19. Ventilation in the patient with unilateral lung disease.

    PubMed

    Thomas, A R; Bryce, T L

    1998-10-01

    Severe ULD presents a challenge in ventilator management because of the marked asymmetry in the mechanics of the two lungs. The asymmetry may result from significant decreases or increases in the compliance of the involved lung. Traditional ventilator support may fail to produce adequate gas exchange in these situations and has the potential to cause further deterioration. Fortunately, conventional techniques can be safely and effectively applied in the majority of cases without having to resort to less familiar and potentially hazardous forms of support. In those circumstances when conventional ventilation is unsuccessful in restoring adequate gas exchange, lateral positioning and ILV have proved effective at improving and maintaining gas exchange. Controlled trials to guide clinical decision making are lacking. In patients who have processes associated with decreased compliance in the involved lung, lateral positioning may be a simple method of improving gas exchange but is associated with many practical limitations. ILV in these patients is frequently successful when differential PEEP is applied with the higher pressure to the involved lung. In patients in whom the pathology results in distribution of ventilation favoring the involved lung, particularly BPF, ILV can be used to supply adequate support while minimizing flow through the fistula and allowing it to close. The application of these techniques should be undertaken with an understanding of the pathophysiology of the underlying process; the reported experience with these techniques, including indications and successfully applied methods; and the potential problems encountered with their use. Fortunately, these modalities are infrequently required, but they provide a critical means of support when conventional techniques fail.

  20. Integrated structure/control design - Present methodology and future opportunities

    NASA Technical Reports Server (NTRS)

    Weisshaar, T. A.; Newsom, J. R.; Zeiler, T. A.; Gilbert, M. G.

    1986-01-01

    Attention is given to current methodology applied to the integration of the optimal design process for structures and controls. Multilevel linear decomposition techniques proved to be most effective in organizing the computational efforts necessary for ISCD (integrated structures and control design) tasks. With the development of large orbiting space structures and actively controlled, high performance aircraft, there will be more situations in which this concept can be applied.

  1. Introduction to Command, Control and Communications (C3) Through Comparative Case Analysis

    DTIC Science & Technology

    1990-03-01

    enhancing the process of learning from experience. Case study allows the student to apply concepts, theories, and techniques to an actual incident within... part of the thesis describes selected principles and concepts of C3 related to communication management, interoperability, command structure and... The solutions to the cases require applying the principles and concepts presented in the first part. The four cases are: (1) the Iran hostage rescue

  2. An Evaluation Model Applied to a Mathematics-Methods Program Involving Three Characteristics of Teaching Style and Their Relationship to Pupil Achievement. Teacher Education Forum; Volume 3, Number 4.

    ERIC Educational Resources Information Center

    Dodd, Carol Ann

    This study explores a technique for evaluating teacher education programs in terms of teaching competencies, as applied to the Indiana University Mathematics Methods Program (MMP). The evaluation procedures formulated for the study include a process-product design in combination with a modification of Popham's performance test paradigm and Gage's…

  3. Repair techniques for celion/LARC-160 graphite/polyimide composite structures

    NASA Technical Reports Server (NTRS)

    Jones, J. S.; Graves, S. R.

    1984-01-01

    The large stiffness-to-weight and strength-to-weight ratios of graphite composite, in combination with the 600 F structural capability of the polyimide matrix, can reduce the total structure/TPS weight of reusable space vehicles by 20-30 percent. With planned usage of GR/PI structural components, damage will inevitably occur, either in the form of intrinsic flaw growth or mechanical damage. Research and development programs were initiated to develop repair processes and techniques specific to Celion/LARC-160 GR/PI structure, with emphasis on highly loaded and lightly loaded compression-critical structures for factory-type repair. Repair processes include cocure and secondary bonding techniques applied under vacuum plus positive autoclave pressure. Viable repair designs and processes are discussed for flat laminates, honeycomb sandwich panels, and hat-stiffened skin-stringer panels. The repair methodology was verified through structural element compression tests at room temperature and at 315 C (600 F).

  4. Synthetic schlieren—application to the visualization and characterization of air convection

    NASA Astrophysics Data System (ADS)

    Taberlet, Nicolas; Plihon, Nicolas; Auzémery, Lucile; Sautel, Jérémy; Panel, Grégoire; Gibaud, Thomas

    2018-05-01

    Synthetic schlieren is a digital image processing optical method relying on variations of the optical index to visualize the flow of a transparent fluid. In this article, we present a step-by-step, easy-to-implement and affordable experimental realization of this technique. The method is applied to air convection caused by a warm surface. We show that the velocity of rising convection plumes can be linked to the temperature of the warm surface and propose a simple physical argument to explain this dependence. Moreover, using this method, one can reveal the tenuous convection plumes rising from one's hand, a phenomenon invisible to the naked eye. This spectacular result may help students to realize the power of careful data acquisition combined with astute image processing techniques (refer to the video abstract).
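
    A hedged sketch of the digital step (estimating the apparent displacement of the background pattern between a reference frame and a frame seen through the flow; dense optical flow is used here, while windowed cross-correlation is a common alternative, and the file names are placeholders):

        import cv2

        ref = cv2.imread("background_ref.png", cv2.IMREAD_GRAYSCALE)
        img = cv2.imread("background_flow.png", cv2.IMREAD_GRAYSCALE)

        # Farneback dense optical flow: one 2-D displacement per pixel
        flow = cv2.calcOpticalFlowFarneback(ref, img, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        # displacement magnitude maps the line-of-sight optical-index gradients
        mag = cv2.magnitude(flow[..., 0], flow[..., 1])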

  5. Online measurement of bead geometry in GMAW-based additive manufacturing using passive vision

    NASA Astrophysics Data System (ADS)

    Xiong, Jun; Zhang, Guangjun

    2013-11-01

    Additive manufacturing based on gas metal arc welding is an advanced technique for depositing fully dense components at low cost. Despite this fact, techniques to achieve accurate control and automation of the process have not yet been fully developed. Online measurement of the deposited bead geometry is a key problem for reliable control. In this work, a passive vision-sensing system comprising two cameras and composite filtering techniques was proposed for real-time detection of the bead height and width during deposition of thin walls. The nozzle-to-top-surface distance was monitored to eliminate accumulated height errors during the multi-layer deposition process. Various image processing algorithms were applied and discussed for extracting the feature parameters, and a calibration procedure was presented for the monitoring system. Validation experiments confirmed the effectiveness of the online measurement system for bead geometry in layered additive manufacturing.
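
    One simple way such a measurement could look in code (an illustrative sketch, not the authors' pipeline; the frame is assumed pre-filtered and the pixel-to-millimeter calibration is done separately):

        import cv2

        frame = cv2.imread("bead_frame.png", cv2.IMREAD_GRAYSCALE)
        _, mask = cv2.threshold(frame, 0, 255,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        bead = max(contours, key=cv2.contourArea)       # largest blob = bead
        x, y, w, h = cv2.boundingRect(bead)
        print("bead width (px):", w, "bead height (px):", h)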

  6. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and provides a detailed discussion of the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.

  7. Tracking Polymer Cure Via Embedded Optical Fibers

    NASA Technical Reports Server (NTRS)

    Dean, David L.; Davidson, T. Fred

    1993-01-01

    Fourier-transform infrared spectroscopy is applied in the interior of a specimen of material by bringing infrared light through the specimen in an optical fiber. The light interacts with the material via the evanescent-wave effect. Spectra obtained in this way at various times during the curing process are also combined with data from ultrasonic, thermographic, dielectric-impedance, and other measurement techniques to obtain a more complete characterization of the progress of curing.

  8. Image processing pipeline for segmentation and material classification based on multispectral high dynamic range polarimetric images.

    PubMed

    Martínez-Domingo, Miguel Ángel; Valero, Eva M; Hernández-Andrés, Javier; Tominaga, Shoji; Horiuchi, Takahiko; Hirai, Keita

    2017-11-27

    We propose a method for the capture of high dynamic range (HDR), multispectral (MS), polarimetric (Pol) images of indoor scenes using a liquid crystal tunable filter (LCTF). We have included an adaptive exposure estimation (AEE) method to fully automate the capturing process. We also propose a pre-processing method that can be applied for the registration of HDR images after they have been built as the result of combining different low dynamic range (LDR) images. This method is applied to ensure correct alignment of the different polarization HDR images for each spectral band. We have focused our efforts on two main applications: object segmentation and classification into metal and dielectric classes. We have simplified the segmentation using mean shift combined with cluster averaging and region merging techniques, and we compare the performance of our segmentation with that of the Ncut and Watershed methods. For the classification task, we propose to use information not only from the highlight regions but also from their surrounding area, extracted from the degree of linear polarization (DoLP) maps. We present experimental results which show that the proposed image processing pipeline outperforms previous techniques developed specifically for MSHDRPol image cubes.
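
    The DoLP map used for classification follows directly from the Stokes parameters; a sketch for one spectral band, with four polarizer-angle captures as assumed inputs:

        import numpy as np

        def dolp(i0, i45, i90, i135, eps=1e-9):
            s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
            s1 = i0 - i90                        # 0/90 linear component
            s2 = i45 - i135                      # 45/135 linear component
            return np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)

        # hypothetical small captures for one band
        rng = np.random.default_rng(5)
        i0, i45, i90, i135 = (rng.random((4, 4)) for _ in range(4))
        print(dolp(i0, i45, i90, i135))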

  9. Application of Six Sigma towards improving surgical outcomes.

    PubMed

    Shukla, P J; Barreto, S G; Nadkarni, M S

    2008-01-01

    Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving the key steps of a process. It is ripe for application in health care, since almost all health care processes require a near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a Sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.

  10. A simple technique to identify key recruitment issues in randomised controlled trials: Q-QAT - Quanti-Qualitative Appointment Timing.

    PubMed

    Paramasivan, Sangeetha; Strong, Sean; Wilson, Caroline; Campbell, Bruce; Blazeby, Jane M; Donovan, Jenny L

    2015-03-11

    Recruitment to pragmatic randomised controlled trials (RCTs) is acknowledged to be difficult, and few interventions have proved to be effective. Previous qualitative research has consistently revealed that recruiters provide imbalanced information about RCT treatments. However, qualitative research can be time-consuming to apply. Within a programme of research to optimise recruitment and informed consent in challenging RCTs, we developed a simple technique, Q-QAT (Quanti-Qualitative Appointment Timing), to systematically investigate and quantify the imbalance to help identify and address recruitment difficulties. The Q-QAT technique comprised: 1) quantification of time spent discussing the RCT and its treatments using transcripts of audio-recorded recruitment appointments, 2) targeted qualitative research to understand the obstacles to recruitment and 3) feedback to recruiters on opportunities for improvement. This was applied to two RCTs with different clinical contexts and recruitment processes. Comparisons were made across clinical centres, recruiters and specialties. In both RCTs, the Q-QAT technique first identified considerable variations in the time spent by recruiters discussing the RCT and its treatments. The patterns emerging from this initial quantification of recruitment appointments then enabled targeted qualitative research to understand the issues and make suggestions to improve recruitment. In RCT1, presentation of the treatments was balanced, but little time was devoted to describing the RCT. Qualitative research revealed patients would have considered participation, but lacked awareness of the RCT. In RCT2, the balance of treatment presentation varied by specialists and centres. Qualitative research revealed difficulties with equipoise and confidence among recruiters presenting the RCT. The quantitative and qualitative findings were well-received by recruiters and opportunities to improve information provision were discussed. A blind coding exercise across three researchers led to the development of guidelines that can be used to apply the Q-QAT technique to other difficult RCTs. The Q-QAT technique was easy to apply and rapidly identified obstacles to recruitment that could be understood through targeted qualitative research and addressed through feedback. The technique's combination of quantitative and qualitative findings enabled the presentation of a holistic picture of recruitment challenges and added credibility to the feedback process. Note: both RCTs in this manuscript asked to be anonymised, so no trial registration details are provided.

  11. Arc-welding quality assurance by means of embedded fiber sensor and spectral processing combining feature selection and neural networks

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; García-Allende, P. B.; Cobo, A.; Conde, O.; López-Higuera, J. M.

    2007-07-01

    A new spectral processing technique designed for its application in the on-line detection and classification of arc-welding defects is presented in this paper. A non-invasive fiber sensor embedded within a TIG torch collects the plasma radiation originated during the welding process. The spectral information is then processed by means of two consecutive stages. A compression algorithm is first applied to the data allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which will be demonstrated to provide an efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in a previous paper, giving rise to an improvement in the performance of the monitoring system.

  12. Data Mining Methods for Recommender Systems

    NASA Astrophysics Data System (ADS)

    Amatriain, Xavier; Jaimes*, Alejandro; Oliver, Nuria; Pujol, Josep M.

    In this chapter, we give an overview of the main Data Mining techniques used in the context of Recommender Systems. We first describe common preprocessing methods such as sampling or dimensionality reduction. Next, we review the most important classification techniques, including Bayesian Networks and Support Vector Machines. We describe the k-means clustering algorithm and discuss several alternatives. We also present association rules and related algorithms for an efficient training process. In addition to introducing these techniques, we survey their uses in Recommender Systems and present cases where they have been successfully applied.
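
    As one concrete instance of the techniques surveyed (a sketch with synthetic ratings, not taken from the chapter), k-means can group users by their rating vectors before recommending within each cluster:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(6)
        ratings = rng.integers(1, 6, size=(100, 20)).astype(float)  # users x items

        km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(ratings)
        # recommend each user the items its cluster rates highest on average
        centroids = km.cluster_centers_[km.labels_]
        top_items = np.argsort(-centroids, axis=1)[:, :3]
        print(top_items[0])        # top-3 item indices for user 0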

  13. Removal of Lattice Imperfections that Impact the Optical Quality of Ti:Sapphire using Advanced Magnetorheological Finishing Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menapace, J A; Schaffers, K I; Bayramian, A J

    2008-02-26

    Advanced magnetorheological finishing (MRF) techniques have been applied to Ti:sapphire crystals to compensate for sub-millimeter lattice distortions that occur during the crystal growing process. Precise optical corrections are made by imprinting topographical structure onto the crystal surfaces to cancel out the effects of the lattice distortion in the transmitted wavefront. This novel technique significantly improves the optical quality for crystals of this type and sets the stage for increasing the availability of high-quality large-aperture sapphire and Ti:sapphire optics in critical applications.

  14. Parallel pivoting combined with parallel reduction

    NASA Technical Reports Server (NTRS)

    Alaghband, Gita

    1987-01-01

    Parallel algorithms for the triangularization of large, sparse, and unsymmetric matrices are presented. The method combines parallel reduction with a new parallel pivoting technique, control over the generation of fill-in, and a check for numerical stability, all done in parallel with the work distributed over the active processes. The parallel technique uses the compatibility relation between pivots to identify parallel pivot candidates and uses the Markowitz number of pivots to minimize fill-in. This technique is not a preordering of the sparse matrix and is applied dynamically as the decomposition proceeds.
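
    The Markowitz criterion named above is easy to state in code. The sketch below is an illustrative serial version under an invented dense test matrix, not the paper's parallel algorithm; a real sparse code would also apply a stability threshold.

```python
# Illustrative serial sketch of the Markowitz criterion: choose the pivot
# minimizing (nonzeros_in_row - 1) * (nonzeros_in_col - 1) to limit fill-in.
import numpy as np

def markowitz_pivot(A, tol=1e-8):
    """Return (i, j) of the entry with the smallest Markowitz count."""
    nz = np.abs(A) > tol
    row_counts = nz.sum(axis=1)
    col_counts = nz.sum(axis=0)
    best, best_cost = None, np.inf
    for i, j in zip(*np.nonzero(nz)):
        cost = (row_counts[i] - 1) * (col_counts[j] - 1)
        if cost < best_cost:
            best, best_cost = (int(i), int(j)), cost
    return best

A = np.array([[4.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])
print(markowitz_pivot(A))  # (1, 1): a singleton row and column, so zero fill-in
```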

  15. Multi-carrier mobile TDMA system with active array antenna

    NASA Technical Reports Server (NTRS)

    Suzuki, Ryutaro; Matsumoto, Yasushi; Hamamoto, Naokazu

    1990-01-01

    A multi-carrier time division multiple access (TDMA) system is proposed for future mobile satellite communication systems, including multi-satellite systems. This TDMA system employs an active array antenna in which a digital beam-forming technique is adopted to control the antenna beam direction. The antenna beam forming is carried out at baseband using digital signal processing techniques. The time division duplex technique is applied to the TDM/TDMA burst format so that transmit and receive timings do not overlap.

  16. Slaughterhouse Wastewater Treatment by Combined Chemical Coagulation and Electrocoagulation Process

    PubMed Central

    Bazrafshan, Edris; Kord Mostafapour, Ferdos; Farzadkia, Mehdi; Ownagh, Kamal Aldin; Mahvi, Amir Hossein

    2012-01-01

    Slaughterhouse wastewater contains high concentrations of various organic matter (e.g., proteins, blood, fat and lard). In order to produce an effluent suitable for stream discharge, chemical coagulation and electrocoagulation techniques were explored at the laboratory pilot scale for removing organic compounds from slaughterhouse effluent. The purpose of this work was to investigate the feasibility of treating cattle-slaughterhouse wastewater by a combined chemical coagulation and electrocoagulation process to achieve the required standards. The influence of operating variables such as coagulant dose, electrical potential and reaction time on the removal efficiencies of major pollutants was determined. The rate of removal of pollutants increased linearly with increasing doses of PACl and applied voltage. COD and BOD5 removal of more than 99% was obtained by adding 100 mg/L PACl and applying a voltage of 40 V. The experiments demonstrated the effectiveness of chemical and electrochemical techniques for the treatment of slaughterhouse wastewaters. Consequently, combined processes are inferred to be superior to electrocoagulation alone for the removal of both organic and inorganic compounds from cattle-slaughterhouse wastewater. PMID:22768233

  17. Controlling the Adhesion of Superhydrophobic Surfaces Using Electrolyte Jet Machining Techniques

    PubMed Central

    Yang, Xiaolong; Liu, Xin; Lu, Yao; Zhou, Shining; Gao, Mingqian; Song, Jinlong; Xu, Wenji

    2016-01-01

    Patterns with controllable adhesion on superhydrophobic areas have various biomedical and chemical applications. The electrolyte jet machining technique (EJM), an electrochemical machining method, was first exploited to construct dimples with various profiles on a superhydrophobic Al alloy surface using different processing parameters. Sliding angles of water droplets on those dimples first increased and then stabilized at a certain value as the processing time or applied voltage of the EJM increased, indicating that surfaces with different adhesion forces could be obtained by regulating the processing parameters. The contact angle hysteresis and the adhesion force that restricts a droplet from sliding off were investigated through experiments. The results show that the adhesion force is well described by the classical Furmidge equation. On account of this controllable adhesion force, water droplets could either be firmly pinned to the surface, forming various patterns, or slide off at designed tilting angles at specified positions on a superhydrophobic surface. Such dimples on superhydrophobic surfaces can be applied in water harvesting, biochemical analysis and lab-on-chip devices. PMID:27046771
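
    For reference, the Furmidge relation invoked above balances gravity against contact-angle hysteresis at the onset of sliding. The form below is the standard textbook statement, with symbols chosen here rather than taken from this abstract.

```latex
% Standard statement of the Furmidge relation: at the critical sliding
% (tilt) angle alpha,
\[
  m g \sin\alpha = k\, w\, \gamma_{LV}\left(\cos\theta_R - \cos\theta_A\right)
\]
% where m is the droplet mass, w the width of the contact line, gamma_LV the
% liquid-vapor surface tension, theta_A and theta_R the advancing and receding
% contact angles, and k an empirical constant of order unity.
```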

  18. Minehunting sonar system research and development

    NASA Astrophysics Data System (ADS)

    Ferguson, Brian

    2002-05-01

    Sea mines have the potential to threaten the freedom of the seas by disrupting maritime trade and restricting the freedom of maneuver of navies. The acoustic detection, localization, and classification of sea mines involve a sequence of operations starting with the transmission of a sonar pulse and ending with an operator interpreting the information on a sonar display. A recent improvement to the process stems from the application of neural networks to the computer-aided detection of sea mines. The advent of ultrawideband sonar transducers together with pulse compression techniques offers a thousandfold increase in the bandwidth-time product of conventional minehunting sonar transmissions, enabling stealth mines to be detected at longer ranges. These wideband signals also enable mines to be imaged at safe standoff distances by applying tomographic image reconstruction techniques. The coupling of wideband transducer technology with synthetic aperture processing enhances the resolution of side scan sonars in both the cross-track and along-track directions. The principles on which conventional and advanced minehunting sonars are based are reviewed, and the results of applying novel sonar signal processing algorithms to high-frequency sonar data collected in Australian waters are presented.
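
    The pulse-compression principle mentioned above can be sketched briefly: correlating the received signal with a chirp replica (a matched filter) concentrates the echo energy, with processing gain set by the bandwidth-time product. All waveform parameters and the simulated echo below are invented, and SciPy is assumed.

```python
# Hedged sketch of pulse compression via a matched filter on a linear-FM chirp.
import numpy as np
from scipy.signal import chirp, correlate

fs = 200_000                                 # sample rate, Hz
T = 0.01                                     # 10 ms transmit pulse
t = np.arange(0, T, 1 / fs)
tx = chirp(t, f0=20_000, t1=T, f1=60_000)    # 40 kHz sweep: BT = 40e3 * 0.01 = 400

rng = np.random.default_rng(1)
rx = np.concatenate([np.zeros(500), tx, np.zeros(500)])
rx += 0.5 * rng.standard_normal(rx.size)     # echo buried in noise

compressed = correlate(rx, tx, mode="valid") # matched filter
print("echo delay (samples):", int(np.argmax(np.abs(compressed))))  # ~500
```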

  19. Visual enhancement of unmixed multispectral imagery using adaptive smoothing

    USGS Publications Warehouse

    Lemeshewsky, G.P.; Rahman, Z.-U.; Schowengerdt, R.A.; Reichenbach, S.E.

    2004-01-01

    Adaptive smoothing (AS) has been previously proposed as a method to smooth uniform regions of an image, retain contrast edges, and enhance edge boundaries. The method is an implementation of the anisotropic diffusion process, which results in a gray-scale image. This paper discusses modifications to the AS method for application to multi-band data, which result in a color-segmented image. The process was used to visually enhance the three most distinct abundance-fraction images produced by the Lagrange constraint neural network learning-based unmixing of Landsat 7 Enhanced Thematic Mapper Plus multispectral sensor data. A mutual information-based method was applied to select the three most distinct fraction images for subsequent visualization as a red, green, and blue composite. A reported image restoration technique (partial restoration) was applied to the multispectral data to reduce unmixing error, although evaluation of the performance of this technique was beyond the scope of this paper. The modified smoothing process resulted in a color-segmented image with homogeneous regions separated by sharpened, coregistered multiband edges. The segmented image showed improved class separation, which is important for subsequent operations involving data classification.
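
    A minimal single-band sketch of the underlying anisotropic diffusion process (in the spirit of Perona-Malik) is shown below, assuming NumPy; the synthetic image, parameters, and periodic border handling are illustrative choices, not the paper's implementation.

```python
# Anisotropic diffusion: smooth within regions while an edge-stopping
# conductance preserves contrast edges. Periodic borders via np.roll.
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=15.0, lam=0.2):
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # conductance: small across edges
    for _ in range(n_iter):
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(2)
base = np.zeros((32, 32))
base[:, 16:] = 100.0                          # two uniform regions, one edge
noisy = base + rng.normal(0, 5, (32, 32))
smoothed = anisotropic_diffusion(noisy)
print("left-half std before/after:",
      round(float(noisy[:, :16].std()), 2), round(float(smoothed[:, :16].std()), 2))
print("edge step preserved:", round(float((smoothed[:, 16] - smoothed[:, 15]).mean()), 1))
```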

  1. Parallel processing approach to transform-based image coding

    NASA Astrophysics Data System (ADS)

    Normile, James O.; Wright, Dan; Chu, Ken; Yeh, Chia L.

    1991-06-01

    This paper describes a flexible parallel processing architecture designed for use in real-time video processing. The system consists of floating-point DSP processors connected to each other via fast serial links; each processor has access to a globally shared memory. A multiple-bus architecture in combination with a dual-ported memory allows communication with a host control processor. The system has been applied to the prototyping of video compression and decompression algorithms. The decomposition of transform-based algorithms for decompression into a form suitable for parallel processing is described. A technique for automatic load balancing among the processors is developed and discussed, and results are presented with image statistics and data rates. Finally, techniques for accelerating the system throughput are analyzed and results from the application of one such modification described.
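
    To make the transform stage concrete, the sketch below applies an 8x8 two-dimensional DCT per image block, the natural unit of work such a parallel system would distribute across processors. It is illustrative only (NumPy/SciPy assumed, not the paper's architecture); quantization and entropy coding are omitted.

```python
# Block-based transform coding sketch: forward/inverse 8x8 DCT per block.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(3)
image = rng.integers(0, 256, size=(64, 64)).astype(float)

# split into an 8x8 grid of 8x8 blocks
blocks = image.reshape(8, 8, 8, 8).swapaxes(1, 2)
coeffs = dctn(blocks, axes=(2, 3), norm="ortho")      # forward DCT per block
recon = idctn(coeffs, axes=(2, 3), norm="ortho")      # inverse DCT per block

restored = recon.swapaxes(1, 2).reshape(64, 64)
print("max reconstruction error:", float(np.abs(restored - image).max()))  # ~1e-12
```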

  2. DESIGN OF A P3 EXCHANGE PROGRAM

    EPA Science Inventory

    Numerous pollution prevention techniques have proven themselves to be efficient, effective, and easy answers to environmental difficulties. However, most P2 programs are focused on high technology industries and processes. There is enormous potential to apply existing P2 kno...

  3. Assessing LiDAR elevation data for KDOT applications.

    DOT National Transportation Integrated Search

    2013-02-01

    LiDAR-based elevation surveys are a cost-effective means for mapping topography over large areas. LiDAR : surveys use an airplane-mounted or ground-based laser radar unit to scan terrain. Post-processing techniques are : applied to remove vegetation ...

  4. R&D 100, 2016: T-Quake – Quantum-Mechanical Transmitter/Receiver Microchip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tauke-Pedretti, Anna; Camacho, Ryan; Thayer, Gayle

    2016-11-07

    Applying advanced microfabrication techniques and innovative microdesign, the Sandia Enabled Communications and Authentication Network (SECANT) team has designed and produced photonic microchips capable of sending, receiving, and processing quantum signals for applications in cyber and physical security.

  5. A New Technique for Preparation of High-Grade Titanium Slag from Titanomagnetite Concentrate by Reduction-Melting-Magnetic Separation Processing

    NASA Astrophysics Data System (ADS)

    Lv, Chao; Yang, Kun; Wen, Shu-ming; Bai, Shao-jun; Feng, Qi-cheng

    2017-10-01

    This paper proposes a new technique for the preparation of high-grade titanium slag from Panzhihua vanadium titanomagnetite concentrate by reduction-melting-magnetic separation processing. Chemical analysis, x-ray diffraction, and scanning electron microscopy in conjunction with energy-dispersive spectroscopy were used to characterize the samples. Effective separation of iron and titanium slag could be realized by melting metallized pellets at 1550°C for 60 min with the addition of 1% CaO (basicity of 1.1) and 2% graphite powder. The small iron particles embedded in the slag could be removed by a fine-grinding and magnetic separation process. The grade of TiO2 in the obtained high-grade titanium slag reached 60.68% and the total recovery of TiO2 was 91.25%; this slag could be applied directly to producing titanium white by the sulfuric acid process. This technique provides an alternative method for using vanadium titanomagnetite concentrate from the Panzhihua area in China.

  6. An adaptive technique to maximize lossless image data compression of satellite images

    NASA Technical Reports Server (NTRS)

    Stewart, Robert J.; Lure, Y. M. Fleming; Liou, C. S. Joe

    1994-01-01

    Data compression will play an increasingly important role in the storage and transmission of image data within NASA science programs as the Earth Observing System comes into operation. It is important that the science data be preserved at the fidelity the instrument and the satellite communication systems were designed to produce. Lossless compression must therefore be applied, at least, to archive the processed instrument data. In this paper, we present an analysis of the performance of lossless compression techniques and develop an adaptive approach which applies image remapping, feature-based image segmentation to determine regions of similar entropy, and high-order arithmetic coding to obtain significant improvements over the use of conventional compression techniques alone. Image remapping is used to transform the original image into a lower-entropy state. Several techniques were tested on satellite images, including differential pulse code modulation, bi-linear interpolation, and block-based linear predictive coding. The results of these experiments are discussed, and trade-offs between computation requirements and entropy reductions are used to identify the optimum approach for a variety of satellite images. Further entropy reduction can be achieved by segmenting the image based on local entropy properties and then applying a coding technique which maximizes compression for the region. Experimental results are presented showing the effect of different coding techniques for regions of different entropy. A rule base is developed through which the technique giving the best compression is selected. The paper concludes that maximum compression can be achieved cost-effectively and at acceptable performance rates with a combination of techniques selected on the basis of image contextual information.
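
    The remapping idea is easy to demonstrate: coding differences between neighboring samples (DPCM-style) lowers the first-order entropy of smooth data. The sketch below uses an invented synthetic signal standing in for an image scanline and is illustrative only.

```python
# DPCM-style remapping lowers first-order entropy, easing lossless coding.
import numpy as np

def entropy(values):
    """First-order (zero-memory) entropy in bits/sample."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(4)
# smooth synthetic "scanline": random walk plus offset (stand-in for image data)
row = np.cumsum(rng.integers(-2, 3, size=10_000)) + 128

residuals = np.diff(row)           # remap: code differences, not raw samples
print(f"raw entropy:      {entropy(row):.2f} bits/sample")
print(f"residual entropy: {entropy(residuals):.2f} bits/sample")  # much lower
```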

  7. Prediction in Health Domain Using Bayesian Networks Optimization Based on Induction Learning Techniques

    NASA Astrophysics Data System (ADS)

    Felgaer, Pablo; Britos, Paola; García-Martínez, Ramón

    A Bayesian network is a directed acyclic graph in which each node represents a variable and each arc a probabilistic dependency; such networks provide a compact form for representing knowledge and flexible methods of reasoning. Obtaining a network from data is a learning process that is divided into two steps: structural learning and parametric learning. In this paper we define an automatic learning method that optimizes Bayesian networks applied to classification, using a hybrid learning method that combines the advantages of the induction techniques of decision trees (TDIDT-C4.5) with those of Bayesian networks. The resulting method is applied to prediction in the health domain.
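
    As a toy stand-in for this kind of hybrid (emphatically not the authors' method), the sketch below lets a decision tree select informative variables and then trains a naive Bayes classifier, the simplest Bayesian network, on the reduced set. scikit-learn, CART in place of C4.5, and the example dataset are all assumptions.

```python
# Hybrid sketch: tree-based feature selection feeding a (naive) Bayes classifier.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)   # a health-domain dataset
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(Xtr, ytr)
keep = np.argsort(tree.feature_importances_)[-5:]        # top 5 variables

nb = GaussianNB().fit(Xtr[:, keep], ytr)
print("accuracy on held-out data:", round(nb.score(Xte[:, keep], yte), 3))
```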

  8. A Computer Based Moire Technique To Measure Very Small Displacements

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

    The accuracy that can be achieved in the measurement of very small displacements by techniques such as moire, holography and speckle is limited by the noise inherent in the optical devices used. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced: an initial system before the load is applied and a final system when the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.
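
    A small numerical sketch of the idea: subtracting two digitized carrier-fringe patterns cancels the noise common to both exactly, leaving a carrier modulated by a low-frequency moire envelope that carries the displacement. The 1-D signal and all parameters are invented.

```python
# Digital moire sketch: common noise cancels; the moire envelope survives.
import numpy as np

x = np.linspace(0.0, 1.0, 4000)
f = 200.0                                  # carrier fringes per unit length
u = 0.002 * np.sin(2 * np.pi * x)          # displacement field to be measured

noise = np.random.default_rng(5).normal(0.0, 0.3, x.size)   # common optical noise
initial = np.cos(2 * np.pi * f * x) + noise                 # pattern before load
final = np.cos(2 * np.pi * f * (x + u)) + noise             # pattern after load

moire = initial - final        # common noise cancels exactly in the subtraction
# difference is a carrier modulated by the moire envelope:
#   moire(x) = 2 sin(2*pi*f*x + pi*f*u) * sin(pi*f*u(x))
envelope = np.convolve(np.abs(moire), np.ones(80) / 80, mode="same")
print("envelope tracks |sin(pi*f*u)|, correlation:",
      round(np.corrcoef(envelope, np.abs(np.sin(np.pi * f * u)))[0, 1], 3))
```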

  9. JIGSAW: Preference-directed, co-operative scheduling

    NASA Technical Reports Server (NTRS)

    Linden, Theodore A.; Gaw, David

    1992-01-01

    Techniques that enable humans and machines to cooperate in the solution of complex scheduling problems have evolved out of work on the daily allocation and scheduling of Tactical Air Force resources. A generalized, formal model of these applied techniques is being developed. It is called JIGSAW by analogy with the multi-agent, constructive process used when solving jigsaw puzzles. JIGSAW begins from this analogy and extends it by propagating local preferences into global statistics that dynamically influence the value and variable ordering decisions. The statistical projections also apply to abstract resources and time periods--allowing more opportunities to find a successful variable ordering by reserving abstract resources and deferring the choice of a specific resource or time period.

  10. Radiation Effects and Hardening Techniques for Spacecraft Microelectronics

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Maki, G. K.

    2002-01-01

    The natural radiation from the Van Allen belts, solar flares, and cosmic rays found outside the protection of the earth's atmosphere can produce deleterious effects in microelectronics used in space systems. Historically, civil space agencies and the commercial satellite industry have been able to utilize components produced in special radiation-hardened fabrication process foundries that were developed during the 1970s and 1980s under sponsorship of the Departments of Defense (DoD) and Energy (DoE). In the post-Cold War world, the DoD and DoE push to advance rad-hard processes has waned. Today the available rad-hard components lag two-plus technology node generations behind state-of-the-art commercial technologies. As a result, spacecraft designers face a large performance gap when trying to utilize available rad-hard components. Compounding the performance-gap problem, rad-hard components are becoming increasingly hard to obtain. Faced with the economic pitfalls of low demand versus the ever-increasing investment required for integrated circuit manufacturing equipment, most sources of rad-hard parts have simply exited this market in recent years, leaving only two domestic US suppliers of digital rad-hard components. This paper summarizes the radiation-induced mechanisms that can cause digital microelectronics to fail in space, techniques that can be applied to mitigate these failure mechanisms, and the ground-based testing used to validate radiation hardness/tolerance. The radiation hardening techniques can be broken down into two classes, Hardness By Process (HBP) and Hardness By Design (HBD). Fortunately, many HBD techniques can be applied to commercial fabrication processes, providing spacecraft designers with radiation-tolerant Application Specific Integrated Circuits (ASICs) that can bridge the performance gap between the special HBP foundries and commercial state-of-the-art performance.

  11. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    PubMed

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several developed and improved combinatorial techniques to optimize the processing conditions and material properties of organic thin films. The combinatorial approach allows investigation of multi-variable dependencies and is well suited to studying organic thin films for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow the identification of precise trends for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin-film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin-film layers and devices for optical, electro-optical, and electronic applications.

  12. Effect of fruit and vegetable processing on reduction of synthetic pyrethroid residues.

    PubMed

    Chauhan, Reena; Kumari, Beena; Rana, M K

    2014-01-01

    In this review, we emphasize that the advantages associated with applying pesticides to enhance agricultural productivity must be weighed against the possible health hazards arising from the appearance of toxic pesticide residues in food. First and foremost, pesticides should be handled and applied in compliance with good agricultural practices to minimize environmental or food commodity contamination. In developing countries, good agricultural practices are not fully abided by. When vegetables are produced in such countries, pesticides are applied or prospectively applied at each growth stage of the crop. Hence, contamination of vegetables and other food commodities occurs. It is well known that processing of food derived from pesticide-treated crop commodities can serve to reduce the residues that reach consumers. Food safety can therefore partially be enhanced by employing suitable food processing techniques and appropriate storage periods, even in developing countries. Even common and simple household processing techniques for certain foods acquire significance as means to reduce the intake of harmful pesticide food residues. Pesticide residue levels in post-harvest raw agricultural commodities (RAC) are affected by the storage, handling and processing steps they pass through while being prepared for human consumption. The review of cogent literature presented in this article demonstrates differences among the pyrethroid insecticide residues present on or in foods, depending on how the RAC from which they came were processed for consumption. Peeling vegetables or fruit reduced pyrethroid residues the most (60-100%), and juicing was nearly as effective in reducing residues (70-100%). The least reduction occurred for foodstuffs that were only washed with tap water (10-70%). Washing RACs with saline water and detergent was more effective (34-60%) in reducing residues than was simple washing under tap water. Freezing is also effective in reducing residue levels and achieved reductions between 24% and 94%. Cooking of food products eliminated 75-98% of the pesticide residues present, so it was also relatively effective. When foods were cooked in oils, however, reductions in pesticide residues were smaller (45%).

  13. Processing Ultra Wide Band Synthetic Aperture Radar Data with Motion Detectors

    NASA Technical Reports Server (NTRS)

    Madsen, Soren Norvang

    1996-01-01

    Several issues make the processing of ultra wide band (UWB) SAR data acquired from an airborne platform difficult. The character of UWB data invalidates many of the usual SAR batch processing techniques, leading to the application of wavenumber-domain type processors... This paper suggests and evaluates an algorithm which combines a wavenumber-domain processing algorithm with a motion compensation procedure that enables motion compensation to be applied as a function of target range and azimuth angle.

  14. SOME NEW PROCESSING TECHNIQUES FOR THE IMPERIAL VALLEY 1979 AFTERSHOCKS.

    USGS Publications Warehouse

    Brady, A. Gerald; ,

    1983-01-01

    This paper describes some of the features of the latest processing improvements that the U. S. Geological Survey (USGS) is currently applying to strong-motion accelerograms from the national network of permanent stations. At the same time it introduces the application of this processing to the set of Imperial Valley aftershocks recorded following the main shock of October 15, 1979. Earlier processing of the 22 main shock recordings provided corrected accelerations, velocity and displacement, response spectra, and Fourier spectra.

  15. Applying Lean to the F-15 Maintenance Process for the Royal Saudi Air Force

    DTIC Science & Technology

    2014-03-01

    Royal Saudi Air Force. The research focuses on improving the F-15 maintenance process in the Royal Saudi Air Force’s Maintenance Squadrons. The F-15...and on the aircraft age condition, the researcher concludes it is time to get rid of some obstacles and use new management techniques to resolve the...processes? Research Focus This research is focusing on the Royal Saudi Air Force F-15 maintenance process. Because of the time and

  16. X-ray Absorption Spectroscopy Characterization of Electrochemical Processes in Renewable Energy Storage and Conversion Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmand, Maryam

    2013-05-19

    The development of better energy conversion and storage devices, such as fuel cells and batteries, is crucial for reducing our global carbon footprint and improving the quality of the air we breathe. However, both of these technologies face important challenges. The development of lower-cost and better electrode materials, which are more durable and allow more control over the electrochemical reactions occurring at the electrode/electrolyte interface, is perhaps most important for meeting these challenges. Hence, full characterization of the electrochemical processes that occur at the electrodes is vital for intelligent design of more energy-efficient electrodes. X-ray absorption spectroscopy (XAS) is a short-range-order, element-specific technique that can be utilized to probe the processes occurring at operating electrode surfaces, as well as for studying the amorphous materials and nano-particles making up the electrodes. It has been increasingly used in recent years to study fuel cell catalysts through application of the Δμ XANES technique, in combination with the more traditional X-ray Absorption Near Edge Structure (XANES) and Extended X-ray Absorption Fine Structure (EXAFS) techniques. The Δμ XANES data analysis technique, previously developed and applied to heterogeneous catalysts and fuel cell electrocatalysts by the GWU group, was extended in this work to provide for the first time space-resolved adsorbate coverages on both electrodes of a direct methanol fuel cell. Even more importantly, the Δμ technique was applied for the first time to battery-relevant materials, where bulk properties such as the oxidation state and local geometry of a cathode were followed.

  17. Edge Detection Method Based on Neural Networks for COMS MI Images

    NASA Astrophysics Data System (ADS)

    Lee, Jin-Ho; Park, Eun-Bin; Woo, Sun-Hee

    2016-12-01

    Communication, Ocean And Meteorological Satellite (COMS) Meteorological Imager (MI) images are processed for radiometric and geometric correction from raw image data. When intermediate image data are matched and compared with reference landmark images in the geometric correction process, various techniques for edge detection can be applied. It is essential to have a precise, correctly edged image in this process, since its matching with the reference is directly related to the accuracy of the ground station output images. An edge detection method based on neural networks is applied in the ground processing of MI images to obtain sharp edges in the correct positions. The simulation results are analyzed and characterized by comparing them with the results of conventional methods, such as Sobel and Canny filters.

  18. Neural networks for dimensionality reduction of fluorescence spectra and prediction of drinking water disinfection by-products.

    PubMed

    Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C

    2018-06-01

    The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity.
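
    A minimal sketch of the autoencoder idea (not the paper's model): a network is trained to reproduce its input, and the bottleneck activations serve as the low-dimensional representation a downstream regressor would consume. scikit-learn and the synthetic stand-in for flattened excitation-emission matrices are assumptions.

```python
# Autoencoder-style dimensionality reduction via an MLP trained to reproduce X.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
latent_true = rng.normal(size=(500, 3))                 # 3 underlying "fluorophores"
mixing = rng.normal(size=(3, 200))
X = latent_true @ mixing + 0.05 * rng.normal(size=(500, 200))   # 200-dim spectra

ae = MLPRegressor(hidden_layer_sizes=(10,), activation="relu",
                  max_iter=2000, random_state=0).fit(X, X)      # reconstruct input

# encode: forward pass through the first (bottleneck) layer
codes = np.maximum(0, X @ ae.coefs_[0] + ae.intercepts_[0])
print("compressed shape:", codes.shape)                          # (500, 10)
print("reconstruction R^2:", round(ae.score(X, X), 3))
```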

  19. Analysis of cutting force signals by wavelet packet transform for surface roughness monitoring in CNC turning

    NASA Astrophysics Data System (ADS)

    García Plaza, E.; Núñez López, P. J.

    2018-01-01

    On-line monitoring of surface finish in machining processes has proven to be a substantial advancement over traditional post-process quality control techniques, reducing inspection times and costs and avoiding the manufacture of defective products. This study applied techniques for processing cutting force signals based on the wavelet packet transform (WPT) method to the monitoring of surface finish in computer numerical control (CNC) turning operations. The behaviour of 40 mother wavelets was analysed using three techniques: global packet analysis (G-WPT) and two packet-reduction criteria, maximum energy (E-WPT) and maximum entropy (SE-WPT). The optimum signal decomposition level (Lj) was determined to eliminate noise and obtain information correlated with surface finish. The results obtained with the G-WPT method provided an in-depth analysis of cutting force signals, and frequency ranges and signal characteristics were correlated with surface finish, yielding excellent accuracy and reliability in the predictive models. The radial and tangential cutting force components at low frequency provided most of the information for the monitoring of surface finish. The E-WPT and SE-WPT packet-reduction criteria substantially reduced signal processing time, but at the expense of discarding packets with relevant information, which degraded the results. The G-WPT method was observed to be an ideal procedure for processing cutting force signals applied to the real-time monitoring of surface finish, and was estimated to be highly accurate and reliable at a low analytical-computational cost.
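
    The sketch below shows a wavelet packet decomposition of a force-like signal using PyWavelets, reporting the energy per terminal packet, the kind of feature an energy criterion would rank. The study's G-WPT/E-WPT/SE-WPT selection logic is not reproduced; the signal and parameters are invented.

```python
# Wavelet packet decomposition of a synthetic cutting-force signal.
import numpy as np
import pywt

fs = 1000
t = np.arange(0, 1, 1 / fs)
# low-frequency component plus higher-frequency tool-passing ripple
signal = 50 * np.sin(2 * np.pi * 5 * t) + 5 * np.sin(2 * np.pi * 120 * t)

wp = pywt.WaveletPacket(data=signal, wavelet="db4", maxlevel=3)

# energy per terminal packet, ordered by frequency band
for node in wp.get_level(3, order="freq"):
    energy = float(np.sum(node.data ** 2))
    print(f"packet {node.path}: energy = {energy:.1f}")
```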

  20. Unsupervised Approaches for Post-Processing in Computationally Efficient Waveform-Similarity-Based Earthquake Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Yoon, C. E.; OReilly, O. J.; Beroza, G. C.

    2015-12-01

    Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
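
    A toy sketch of the graph view described above: candidate events are vertices of a sparse similarity graph, edges above a threshold are kept, and groups of similar waveforms fall out as connected components. SciPy is assumed and the similarity matrix is invented; real post-processing would add scoring and classification on top.

```python
# Thresholded similarity graph -> groups of similar candidate events.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(7)
n = 12
sim = rng.uniform(0, 0.4, size=(n, n))
sim[0:4, 0:4] = sim[5:9, 5:9] = 0.9      # two repeating-event families
sim = np.triu(sim, k=1)
sim += sim.T                              # symmetric, zero diagonal

adj = csr_matrix(sim > 0.6)               # threshold the similarity graph
n_groups, labels = connected_components(adj, directed=False)
print("groups found (singletons count as their own group):", n_groups)
print("event labels:", labels)
```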

  1. Analysis of randomly time varying systems by gaussian closure technique

    NASA Astrophysics Data System (ADS)

    Dash, P. K.; Iyengar, R. N.

    1982-07-01

    The Gaussian probability closure technique is applied to study the random response of multidegree of freedom stochastically time varying systems under non-Gaussian excitations. Under the assumption that the response, the coefficient and the excitation processes are jointly Gaussian, deterministic equations are derived for the first two response moments. It is further shown that this technique leads to the best Gaussian estimate in a minimum mean square error sense. An example problem is solved which demonstrates the capability of this technique for handling non-linearity, stochastic system parameters and amplitude limited responses in a unified manner. Numerical results obtained through the Gaussian closure technique compare well with the exact solutions.
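
    The closure step can be stated compactly; the relations below are the standard Gaussian moment identities such a closure relies on, not equations reproduced from the paper.

```latex
% Gaussian closure: assuming the response is Gaussian, all higher moments are
% expressed through the first two, e.g. for a zero-mean response component
\[
  \mathbb{E}[x^3] = 0, \qquad \mathbb{E}[x^4] = 3\,\bigl(\mathbb{E}[x^2]\bigr)^2,
\]
% so the infinite moment hierarchy generated by a nonlinear or stochastically
% time-varying system closes into a finite set of deterministic equations for
% the means and covariances.
```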

  2. Min-max hyperellipsoidal clustering for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A

    2006-08-01

    A novel hyperellipsoidal clustering technique is presented for an intrusion-detection system in network security. Hyperellipsoidal clusters toward maximum intracluster similarity and minimum intercluster similarity are generated from training data sets. The novelty of the technique lies in the fact that the parameters needed to construct higher order data models in general multivariate Gaussian functions are incrementally derived from the data sets using accretive processes. The technique is implemented in a feedforward neural network that uses a Gaussian radial basis function as the model generator. An evaluation based on the inclusiveness and exclusiveness of samples with respect to specific criteria is applied to accretively learn the output clusters of the neural network. One significant advantage of this is its ability to detect individual anomaly types that are hard to detect with other anomaly-detection schemes. Applying this technique, several feature subsets of the tcptrace network-connection records that give above 95% detection at false-positive rates below 5% were identified.

  3. Description and detection of burst events in turbulent flows

    NASA Astrophysics Data System (ADS)

    Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.

    2018-04-01

    A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.

  4. Image-Based 3d Reconstruction and Analysis for Orthodontia

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of teeth arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of teeth parameters and on designing the perfect teeth arch curve which the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets, which are put on the teeth, and a wire of given shape which is clamped by these brackets to produce the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and problems with applying the standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation aimed at overcoming these disadvantages is proposed. The proposed approach provides accurate measurements of the teeth parameters needed for adequate planning, designing the correct teeth positions and monitoring the treatment process. The developed technique applies photogrammetric means for teeth arch 3D model generation, bracket position determination and teeth shifting analysis.

  5. Chemical processing of lunar materials

    NASA Technical Reports Server (NTRS)

    Criswell, D. R.; Waldron, R. D.

    1979-01-01

    The paper highlights recent work on the general problem of processing lunar materials. The discussion covers lunar source materials, refined products, motivations for using lunar materials, and general considerations for a lunar or space processing plant. Attention is given to chemical processing through various techniques, including electrolysis of molten silicates, carbothermic/silicothermic reduction, the carbo-chlorination process, the NaOH basic-leach process, and the HF acid-leach process. Several options for chemical processing of lunar materials are well within the state of the art of applied chemistry and chemical engineering, so development can begin based on the extensive knowledge of lunar materials.

  6. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    NASA Astrophysics Data System (ADS)

    Chockalingam, Letchumanan

    2005-01-01

    Data for the Gunung Ledang region of Malaysia acquired through LANDSAT are used to map certain hydrogeological features. To map these features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated in terms of their validity in properly isolating features of hydrogeological interest. As these techniques exploit the spectral aspects of the images, they have several limitations in meeting the objectives. To address these limitations, a morphological transformation, which considers structural rather than spectral aspects of the image, is applied, providing comparisons between results derived from spectrally based and structurally based filtering techniques.

  7. Application of the Delphi technique in healthcare maintenance.

    PubMed

    Njuangang, Stanley; Liyanage, Champika; Akintoye, Akintola

    2017-10-09

    Purpose The purpose of this paper is to examine the research design, issues and considerations in the application of the Delphi technique to identify, refine and rate the critical success factors and performance measures in maintenance-associated infections. Design/methodology/approach An in-depth literature review, through the application of open and axial coding, was used to formulate the interview and research questions. These were used to conduct an exploratory case study of two healthcare maintenance managers, randomly selected from two National Health Service Foundation Trusts in England. The results of the exploratory case study provided the rationale for the application of the Delphi technique in this research. The different processes in the application of the Delphi technique in healthcare research are examined thoroughly. Findings This research demonstrates the need to apply and integrate different research methods to enhance the validity of the Delphi technique. The rationale for the application of the Delphi technique in this research is that some healthcare maintenance managers lack the knowledge about basic infection control (IC) principles needed to make hospitals safe for patient care. The result of the first round of the Delphi exercise is a useful contribution in its own right. It identified a number of salient issues and differences in the opinions of the Delphi participants, notably between healthcare maintenance managers and members of the infection control team. It also resulted in useful suggestions and comments to improve the quality and presentation of the second- and third-round Delphi instruments. Practical implications This research provides a research methodology that can be adopted by researchers investigating new and emerging issues in the healthcare sector. As this research demonstrates, the Delphi technique is relevant in soliciting expert knowledge and opinion to identify performance measures to control maintenance-associated infections in hospitals. The methodology provided here could be applied by other researchers elsewhere to probe, investigate and generate rich information about new and emerging healthcare research topics. Originality/value The authors demonstrate how different research methods can be integrated to enhance the validity of the Delphi technique. For example, the results of an exploratory case study provided the rationale for the application of the Delphi technique investigating the key performance measures in maintenance-associated infections. The different processes involved in the application of the Delphi technique are also carefully explored and discussed in depth.

  8. Managing distribution changes in time series prediction

    NASA Astrophysics Data System (ADS)

    Matias, J. M.; Gonzalez-Manteiga, W.; Taboada, J.; Ordonez, C.

    2006-07-01

    When a problem is modeled statistically, a single distribution model is usually postulated that is assumed to be valid for the entire space. Nonetheless, this practice may be somewhat unrealistic in certain application areas, in which the conditions of the process that generates the data may change; as far as we are aware, however, no techniques have been developed to tackle this problem. This article proposes a technique for modeling and predicting this change in time series with a view to improving estimates and predictions. The technique is applied, among other models, to the recently proposed hypernormal distribution. When tested on real data from a range of stock market indices, the technique produces better results than when a single distribution model is assumed to be valid for the entire period of time studied. Moreover, when a global model is postulated, it is highly recommended to select the hypernormal distribution parameter in the same likelihood-maximization process.

  9. The Living Cell as a Multi-agent Organisation: A Compositional Organisation Model of Intracellular Dynamics

    NASA Astrophysics Data System (ADS)

    Jonker, C. M.; Snoep, J. L.; Treur, J.; Westerhoff, H. V.; Wijngaards, W. C. A.

    Within the areas of Computational Organisation Theory and Artificial Intelligence, techniques have been developed to simulate and analyse dynamics within organisations in society. Usually these modelling techniques are applied to factories and to the internal organisation of their process flows, thus obtaining models of complex organisations at various levels of aggregation. The dynamics in living cells are often interpreted in terms of well-organised processes, a bacterium being considered a (micro)factory. This suggests that organisation modelling techniques may also benefit their analysis. Using the example of Escherichia coli it is shown how indeed agent-based organisational modelling techniques can be used to simulate and analyse E.coli's intracellular dynamics. Exploiting the abstraction levels entailed by this perspective, a concise model is obtained that is readily simulated and analysed at the various levels of aggregation, yet shows the cell's essential dynamic patterns.

  10. Spectroscopic vector analysis for fast pattern quality monitoring

    NASA Astrophysics Data System (ADS)

    Sohn, Younghoon; Ryu, Sungyoon; Lee, Chihoon; Yang, Yusin

    2018-03-01

    In the semiconductor industry, fast and effective measurement of pattern variation has been a key challenge for assuring mass-product quality. Pattern measurement techniques such as conventional CD-SEM or optical CD have been used extensively, but these techniques are increasingly limited in terms of measurement throughput and time spent in modeling. In this paper we propose a time-effective pattern monitoring method based on a direct spectrum-based approach. In this technique, a wavelength band sensitive to a specific pattern change is selected from the spectroscopic ellipsometry signal scattered by the pattern to be measured, and the amplitude and phase variation in that band are analyzed as a measurement index of the pattern change. This pattern-change measurement technique was applied to several process steps and its applicability verified. Owing to its fast and simple analysis, the method can be adapted to massive process-variation monitoring, maximizing measurement throughput.

  11. Extensible packet processing architecture

    DOEpatents

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitered data bus coupled to the plurality of processing engines.

  12. Burn-injured tissue detection for debridement surgery through the combination of non-invasive optical imaging techniques.

    PubMed

    Heredia-Juesas, Juan; Thatcher, Jeffrey E; Lu, Yang; Squiers, John J; King, Darlene; Fan, Wensheng; DiMaio, J Michael; Martinez-Lorenzo, Jose A

    2018-04-01

    The process of burn debridement is a challenging technique requiring significant skill to identify the regions that need excision and their appropriate excision depths. In order to assist surgeons, a machine learning tool is being developed to provide a quantitative assessment of burn-injured tissue. This paper presents three non-invasive optical imaging techniques capable of distinguishing four kinds of tissue (healthy skin, viable wound bed, shallow burn, and deep burn) during serial burn debridement in a porcine model. All combinations of these three techniques were studied through a k-fold cross-validation method. In terms of global performance, the combination of all three techniques significantly improves the classification accuracy with respect to any single technique, from 0.42 up to more than 0.76. Furthermore, a non-linear spatial filtering based on the mode of a small neighborhood was applied as a post-processing technique in order to improve the performance of the classification. Using this technique, the global accuracy reaches a value close to 0.78 and, for some particular tissues and combinations of techniques, the accuracy improves by 13%.
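
    The mode-based spatial filter is straightforward to sketch: each pixel's class label is replaced by the most frequent label in its neighborhood, removing isolated misclassifications. SciPy is assumed, and the label map and error pattern below are invented.

```python
# Mode filter post-processing for a classified label image.
import numpy as np
from scipy.ndimage import generic_filter

def neighborhood_mode(values):
    return np.bincount(values.astype(int)).argmax()

rng = np.random.default_rng(8)
truth = np.zeros((20, 20), dtype=int)
truth[:, 10:] = 1                                     # two tissue classes
noisy = np.where(rng.random(truth.shape) < 0.1,       # ~10% random label errors
                 rng.integers(0, 4, truth.shape), truth)

smoothed = generic_filter(noisy, neighborhood_mode, size=5)
print("label errors before:", int((noisy != truth).sum()),
      "after:", int((smoothed != truth).sum()))
```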

  13. Using Diffusion Bonding in Making Piezoelectric Actuators

    NASA Technical Reports Server (NTRS)

    Sager, Frank E.

    2003-01-01

    A technique for the fabrication of piezoelectric actuators that generate acceptably large forces and deflections at relatively low applied voltages involves the stacking and diffusion bonding of multiple thin piezoelectric layers coated with film electrodes. The present technique stands in contrast to an older technique in which the layers are bonded chemically, by use of urethane or epoxy agents. The older chemical-bonding technique entails several disadvantages, including the following: It is difficult to apply the bonding agents to the piezoelectric layers. It is difficult to position the layers accurately and without making mistakes. There is a problem of disposal of hazardous urethane and epoxy wastes. The urethane and epoxy agents are nonpiezoelectric materials. As such, they contribute to the thickness of a piezoelectric laminate without contributing to its performance; conversely, for a given total thickness, the performance of the laminate is below that of a unitary piezoelectric plate of the same thickness. The figure depicts some aspects of the fabrication of a laminated piezoelectric actuator by the present diffusion- bonding technique. First, stock sheets of the piezoelectric material are inspected and tested. Next, the hole pattern shown in the figure is punched into the sheets. Alternatively, if the piezoelectric material is not a polymer, then the holes are punched in thermoplastic films. Then both faces of each punched piezoelectric sheet or thermoplastic film are coated with a silver-ink electrode material by use of a silkscreen printer. The electrode and hole patterns are designed for minimal complexity and minimal waste of material. After a final electrical test, all the coated piezoelectric layers (or piezoelectric layers and coated thermoplastic films) are stacked in an alignment jig, which, in turn, is placed in a curved press for the diffusion-bonding process. In this process, the stack is pressed and heated at a specified curing temperature and pressure for a specified curing time. The pressure, temperature, and time depend on the piezoelectric material selected. At the end of the diffusion-bonding process, the resulting laminated piezoelectric actuator is tested to verify the adequacy of the mechanical output as a function of an applied DC voltage.

  14. Capillary electrophoresis of inorganic anions.

    PubMed

    Kaniansky, D; Masár, M; Marák, J; Bodor, R

    1999-02-26

    This review deals with the separation mechanisms applied to the separation of inorganic anions by capillary electrophoresis (CE) techniques. It covers various CE techniques that are suitable for the separation and/or determination of inorganic anions in various matrices, including capillary zone electrophoresis, micellar electrokinetic chromatography, electrochromatography and capillary isotachophoresis. Detection and sample preparation techniques used in CE separations are also reviewed. An extensive part of this review deals with applications of CE techniques in various fields (environmental, food and plant materials, biological and biomedical, technical materials and industrial processes). Attention is paid to speciations of anions of arsenic, selenium, chromium, phosphorus, sulfur and halogen elements by CE.

  15. Analysis of Size Correlations for Microdroplets Produced by Ultrasonic Atomization

    PubMed Central

    Barba, Anna Angela; d'Amore, Matteo

    2013-01-01

    Microencapsulation techniques are widely applied in pharmaceutical production to control drug release over time and in physiological environments. Ultrasonic-assisted atomization is a new technique for producing microencapsulated systems by a mechanical approach. Interest in this technique stems from its advantages over more conventional techniques: low mechanical stress on materials, reduced energy requirements, and smaller apparatus size. In this paper, the groundwork of atomization is introduced, the role of the relevant parameters in the ultrasonic atomization mechanism is discussed, and correlations to predict droplet size from process parameters and material properties are presented and tested. PMID:24501580
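
    The kind of correlation the paper tests can be illustrated with Lang's classical equation, which predicts the median droplet diameter from the capillary wavelength. This is a well-known literature correlation, not necessarily the one the authors adopt, and the values below (water at 40 kHz) are illustrative.

```python
# Lang's correlation for ultrasonic atomization droplet size.
import numpy as np

def lang_droplet_diameter(surface_tension, density, frequency):
    """Median droplet diameter (m): d = 0.34 * (8*pi*sigma / (rho * f^2))**(1/3)."""
    return 0.34 * (8 * np.pi * surface_tension / (density * frequency ** 2)) ** (1 / 3)

# water atomized at 40 kHz
d = lang_droplet_diameter(surface_tension=0.072, density=1000.0, frequency=40e3)
print(f"predicted median droplet diameter: {d * 1e6:.1f} um")  # roughly 35 um
```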

  16. Coal liquefaction process streams characterization and evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, G.; Davis, A.; Burke, F.P.

    1991-12-01

    This study demonstrated the use of the gold tube carbonization technique and reflectance microscopy analysis for the examination of process-derived materials from direct coal liquefaction. The carbonization technique, which was applied to coal liquefaction distillation resids, yields information on the amounts of gas plus distillate, pyridine-soluble resid, and pyridine-insoluble material formed when a coal liquid sample is heated to 450°C for one hour at 5000 psi in an inert atmosphere. The pyridine-insolubles then are examined by reflectance microscopy to determine the type, amount, and optical texture of isotropic and anisotropic carbon formed upon carbonization. Further development of these analytical methods as process development tools may be justified on the basis of these results.

  17. Electrochemical Processes Enhanced by Acoustic Liquid Manipulation

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.

    2004-01-01

    Acoustic liquid manipulation is a family of techniques that employ the nonlinear acoustic effects of acoustic radiation pressure and acoustic streaming to manipulate the behavior of liquids. Researchers at the NASA Glenn Research Center are exploring new methods of manipulating liquids for a variety of space applications, and we have found that acoustic techniques may also be used in the normal Earth gravity environment to enhance the performance of existing fluid processes. Working in concert with the NASA Commercial Technology Office, the Great Lakes Industrial Technology Center, and Alchemitron Corporation (Elgin, IL), researchers at Glenn have applied nonlinear acoustic principles to industrial applications. Collaborating with Alchemitron Corporation, we have adapted the devices to create acoustic streaming in a conventional electroplating process.

  18. Urban land use monitoring from computer-implemented processing of airborne multispectral data

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.

    1976-01-01

    Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August, 1972. Computer analysis of these spectral data indicate that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses from analysis of type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.

  19. Development of data processing, interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.

    1987-01-01

    The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.

  20. Systems Theoretic Process Analysis Applied to an Offshore Supply Vessel Dynamic Positioning System

    DTIC Science & Technology

    2016-06-01

    The analysis identified additional safety issues that were either not identified or inadequately mitigated through the use of Fault Tree Analysis and Failure Modes and Effects Analysis. The report also reviews hazard analysis techniques, with sections covering Fault Tree Analysis and a Fault Tree Analysis comparison.

  1. Assessing LiDAR elevation data for KDOT applications : [technical summary].

    DOT National Transportation Integrated Search

    2013-02-01

    LiDAR-based elevation surveys are a cost-effective means for mapping topography over large areas. LiDAR surveys use an airplane-mounted or ground-based laser radar unit to scan terrain. Post-processing techniques are applied to remove v...

  2. R&D 100, 2016: T-Quake – Quantum-Mechanical Transmitter/Receiver Microchip

    ScienceCinema

    Tauke-Pedretti, Anna; Camacho, Ryan; Thayer, Gayle

    2018-06-13

    Applying advanced microfabrication techniques and innovative microdesign, the Sandia Enabled Communications and Authentication Network (SECANT) team has designed and produced photonic microchips capable of sending, receiving, and processing quantum signals for applications in cyber and physical security.

  3. 23 CFR 450.306 - Scope of the metropolitan transportation planning process.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...: (1) Support the economic vitality of the metropolitan area, especially by enabling global...; (7) Promote efficient system management and operation; and (8) Emphasize the preservation of the... transportation operators may apply asset management principles and techniques in establishing planning goals...

  4. 23 CFR 450.306 - Scope of the metropolitan transportation planning process.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...: (1) Support the economic vitality of the metropolitan area, especially by enabling global...; (7) Promote efficient system management and operation; and (8) Emphasize the preservation of the... transportation operators may apply asset management principles and techniques in establishing planning goals...

  5. 23 CFR 450.306 - Scope of the metropolitan transportation planning process.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...: (1) Support the economic vitality of the metropolitan area, especially by enabling global...; (7) Promote efficient system management and operation; and (8) Emphasize the preservation of the... transportation operators may apply asset management principles and techniques in establishing planning goals...

  6. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques

    PubMed Central

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Galietti, Umberto

    2017-01-01

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages with respect to traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used for studying and optimizing the FSW process applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated, and two thermal indexes correlated with the frictional power input, the maximum temperature and the heating rate of the material, were evaluated for different configurations of the process parameters (tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs, and destructive tensile tests) were used to quantitatively support the analysis of the quality of the welded joints. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength. PMID:29019948

  7. Estimation of urban runoff and water quality using remote sensing and artificial intelligence.

    PubMed

    Ha, S R; Park, S Y; Park, D H

    2003-01-01

    Water quality and quantity of runoff are strongly dependent on land use and land cover (LULC). In this study, we developed an improved parameter estimation procedure for an environmental model using remote sensing (RS) and artificial intelligence (AI) techniques. Landsat TM multi-band (7 bands) and Korea Multi-Purpose Satellite (KOMPSAT) panchromatic data were selected for input data processing. We employed two kinds of artificial intelligence techniques, an RBF-NN (radial-basis-function neural network) and an ANN (artificial neural network), to classify the LULC of the study area. A bootstrap resampling method, a statistical technique, was employed to generate the confidence intervals and distribution of the unit load. SWMM was used to simulate the urban runoff and water quality and was applied to the study watershed. Urban flow and non-point contamination were simulated using rainfall-runoff and measured water quality data. The estimated total runoff, peak time, and pollutant generation varied considerably according to the classification accuracy and the percentile unit load applied. The proposed procedure can efficiently be applied to water quality and runoff simulation in a rapidly changing urban area.
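    As a concrete illustration of the bootstrap step described above, the following minimal numpy sketch derives a percentile confidence interval for a mean unit load; the unit-load values and interval level are hypothetical, not taken from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical pollutant unit loads (kg/ha/yr) measured for one LULC class.
    unit_loads = np.array([12.1, 9.8, 14.3, 11.0, 10.5, 13.2, 8.9, 12.7])

    # Bootstrap: resample with replacement and collect the statistic of interest.
    n_boot = 10_000
    boot_means = np.array([
        rng.choice(unit_loads, size=unit_loads.size, replace=True).mean()
        for _ in range(n_boot)
    ])

    # 95% percentile confidence interval for the mean unit load.
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print(f"mean = {unit_loads.mean():.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
    ```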

  8. Applied learning-based color tone mapping for face recognition in video surveillance system

    NASA Astrophysics Data System (ADS)

    Yew, Chuu Tian; Suandi, Shahrel Azmin

    2012-04-01

    In this paper, we present an applied learning-based color tone mapping technique for video surveillance systems. This technique can be applied to both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appearing in the surveillance images, and to remap the color or intensity of the input image so that its color or intensity statistics match those of the training dataset. It is well known that differences in commercial surveillance camera models and in the signal processing chipsets used by different manufacturers cause the color and intensity of the images to differ from one another, creating additional challenges for face recognition in video surveillance systems. Using multi-class Support Vector Machines as the classifier on a publicly available video surveillance camera database, namely the SCface database, this approach is validated and compared to the results of using a holistic approach on grayscale images. The results show that this technique is suitable for improving the color or intensity quality of video surveillance images for face recognition.
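    The abstract does not spell out the remapping itself, but the simplest instance of matching intensity statistics to a training set is a mean/variance remap, sketched below under that assumption (the function name and clipping range are illustrative, not the authors' method).

    ```python
    import numpy as np

    def match_statistics(img, ref_mean, ref_std):
        """Remap intensities so the image's mean/std match the training statistics."""
        img = img.astype(np.float64)
        out = (img - img.mean()) / (img.std() + 1e-12) * ref_std + ref_mean
        return np.clip(out, 0, 255).astype(np.uint8)

    # ref_mean and ref_std would be learned per channel from the photorealistic
    # training dataset; apply the remap to each channel of a color frame.
    ```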

  9. Novel casting processes for single-crystal turbine blades of superalloys

    NASA Astrophysics Data System (ADS)

    Ma, Dexin

    2018-03-01

    This paper presents a brief review of the current casting techniques for single-crystal (SC) blades, as well as an analysis of the solidification process in complex turbine blades. A series of novel casting methods based on the Bridgman process are presented to illustrate developments in the production of SC blades from superalloys. The grain continuator and the heat conductor techniques were developed to remove geometry-related grain defects. In these techniques, the heat barrier that hinders lateral SC growth from the blade airfoil into the extremities of the platform is minimized. The parallel heating and cooling system was developed to achieve symmetric thermal conditions for SC solidification in blade clusters, thus considerably decreasing the negative shadow effect and its related defects in the current Bridgman process. The dipping and heaving technique, in which thin-shell molds are utilized, was developed to enable the establishment of a high temperature gradient for SC growth and the freckle-free solidification of superalloy castings. Moreover, by applying the targeted cooling and heating technique, a novel concept for the three-dimensional and precise control of SC growth, a proper thermal arrangement may be dynamically established for the microscopic control of SC growth in the critical areas of large industrial gas turbine blades.

  10. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    PubMed

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  11. X-ray micro-beam techniques and phase contrast tomography applied to biomaterials

    NASA Astrophysics Data System (ADS)

    Fratini, Michela; Campi, Gaetano; Bukreeva, Inna; Pelliccia, Daniele; Burghammer, Manfred; Tromba, Giuliana; Cancedda, Ranieri; Mastrogiacomo, Maddalena; Cedola, Alessia

    2015-12-01

    A deeper comprehension of the biomineralization (BM) process is at the basis of tissue engineering and regenerative medicine developments. Several in-vivo and in-vitro studies were dedicated to this purpose via the application of 2D and 3D diagnostic techniques. Here, we develop a new methodology, based on different complementary experimental techniques (X-ray phase contrast tomography, micro-X-ray diffraction and micro-X-ray fluorescence scanning technique) coupled to new analytical tools. A qualitative and quantitative structural investigation, from the atomic to the micrometric length scale, is obtained for engineered bone tissues. The high spatial resolution achieved by X-ray scanning techniques allows us to monitor the bone formation at the first-formed mineral deposit at the organic-mineral interface within a porous scaffold. This work aims at providing a full comprehension of the morphology and functionality of the biomineralization process, which is of key importance for developing new drugs for preventing and healing bone diseases and for the development of bio-inspired materials.

  12. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    NASA Astrophysics Data System (ADS)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The problems of the existing methods for determining which construction processes and activities can be combined and technologically interlinked are considered under the modern conditions of constructing various facilities. The necessity of identifying common parameters that characterize the nature of the interaction of all technology-related construction and installation processes and activities is shown. The technologies of construction and installation processes for buildings and structures were studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research was a quantitative evaluation of the interaction of construction and installation processes and activities, expressed as the minimum technologically necessary volume of the preceding process that allows one to plan and organize the execution of the subsequent technologically interconnected process. This quantitative evaluation is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key for its wide use in forming organizational decisions. The article also describes the practical significance of the developed technique.

  13. Fission gas bubble identification using MATLAB's image processing toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collette, R.

    Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person-to-person or sample-to-sample. This study presents several MATLAB based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods. - Highlights: •Automated image processing can aid in the fuel qualification process. •Routines are developed to characterize fission gas bubbles in irradiated U–Mo fuel. •Frequency domain filtration effectively eliminates FIB curtaining artifacts. •Adaptive thresholding proved to be the most accurate segmentation method. •The techniques established are ready to be applied to large scale data extraction testing.
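    A rough Python analogue of the bilateral-filter-plus-Sauvola stage of such a pipeline is sketched below (the frequency-domain curtain-removal step is omitted, the file name and filter settings are hypothetical, and scikit-image stands in for MATLAB's toolbox):

    ```python
    import numpy as np
    from skimage import io, img_as_float
    from skimage.restoration import denoise_bilateral
    from skimage.filters import threshold_sauvola
    from skimage.measure import label, regionprops

    # Load a grayscale micrograph (path is hypothetical).
    img = img_as_float(io.imread("umo_micrograph.png", as_gray=True))

    # Edge-preserving noise removal before segmentation.
    smooth = denoise_bilateral(img, sigma_color=0.05, sigma_spatial=2)

    # Sauvola adaptive threshold; voids assumed darker than the fuel matrix.
    voids = smooth < threshold_sauvola(smooth, window_size=25)

    # Morphological statistics: void count, mean size, porosity.
    regions = regionprops(label(voids))
    areas = np.array([r.area for r in regions])
    print(f"voids: {len(regions)}, mean area: {areas.mean():.1f} px, "
          f"porosity: {voids.mean():.3%}")
    ```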

  14. An Investigation of Network Enterprise Risk Management Techniques to Support Military Net-Centric Operations

    DTIC Science & Technology

    2009-09-01

    This information supports the decision-making process as it is applied to the management of risk. Operational risk is the threat... reasonability. However, to make a software system fault tolerant, the system needs to recognize and fix a system state condition. To detect a fault, a fault... The report's sections also cover risk tracking and the decision-making process.

  15. Applying activity-based costing to healthcare settings.

    PubMed

    Canby, J B

    1995-02-01

    Activity-based costing (ABC) focuses on processes that drive cost. By tracing healthcare activities back to events that generate cost, a more accurate measurement of financial performance is possible. This article uses ABC principles and techniques to determine costs associated with the x-ray process in a midsized outpatient clinic. The article also provides several tips for initiating an ABC cost system for an entire healthcare organization.
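    A toy calculation makes the core ABC mechanic concrete: pool costs are traced to activities and divided by cost-driver volumes. The activities and figures below are invented for illustration, not taken from the article.

    ```python
    # Hypothetical activity pools and cost drivers for an outpatient x-ray process.
    activities = {
        # activity: (annual pool cost in $, annual driver volume, driver name)
        "patient scheduling": (30_000, 12_000, "appointment"),
        "film processing":    (45_000,  9_000, "film"),
        "radiologist review": (90_000,  9_000, "film"),
    }

    for name, (cost, volume, driver) in activities.items():
        print(f"{name}: ${cost / volume:.2f} per {driver}")

    # One visit consuming one appointment and one film then carries
    # 2.50 + 5.00 + 10.00 = $17.50 of activity-traced cost.
    ```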

  16. Response Manual for Combating Spills of Floating Hazardous CHRIS chemicals

    DTIC Science & Technology

    1989-01-01

    CHRIS chemical floatability codes: PARAFORMALDEHYDE (PFA, not floating); PARALDEHYDE (PDH, not floating); PARATHION (PTO, not floating); PENTABORANE (PTB, not floating); PENTACHLOROETHANE... The manual's removal sections cover weir skimmers and chemical removal techniques, including sorption, under conditions such as high winds and rain. Sorption is commonly applied in water treatment processes. Being a surface process...

  17. Two pass method and radiation interchange processing when applied to thermal-structural analysis of large space truss structures

    NASA Technical Reports Server (NTRS)

    Warren, Andrew H.; Arelt, Joseph E.; Lalicata, Anthony L.; Rogers, Karen M.

    1993-01-01

    A method of efficient and automated thermal-structural processing of very large space structures is presented. The method interfaces the finite element and finite difference techniques. It also results in a pronounced reduction of the quantity of computations, computer resources and manpower required for the task, while assuring the desired accuracy of the results.

  18. Dynamics and Stability of Acoustic Wavefronts in the Ocean

    DTIC Science & Technology

    2014-09-30

    processes on underwater acoustic fields. The 3-D HWT algorithm was also applied to investigate long-range propagation of infrasound in the atmosphere... oceanographic processes on underwater sound propagation and also has been demonstrated to be an efficient and robust technique for modeling infrasound... algorithm by modeling propagation of infrasound generated by the Eyjafjallajökull volcano in southern Iceland. Eruptions of this volcano were recorded by

  19. Using wavelet denoising and mathematical morphology in the segmentation technique applied to blood cells images.

    PubMed

    Boix, Macarena; Cantó, Begoña

    2013-04-01

    Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with this method. To that end, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate noise and prepare the image for suitable segmentation. In wavelet denoising we determine the best wavelet as the one that yields a segmentation with the largest area in the cell. We study different wavelet families and conclude that the wavelet db1 is the best, and it can serve for later work on blood pathologies. The proposed method generates good results when applied to several images. Finally, the proposed algorithm, implemented in the MATLAB environment, is verified on selected blood cell images.
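    A sketch of the same two-stage idea, wavelet soft-thresholding with db1 followed by global thresholding and morphological clean-up, using PyWavelets and scikit-image rather than MATLAB; the universal threshold, the assumption that cells are darker than plasma, and the file name are illustrative choices, not the paper's exact recipe.

    ```python
    import numpy as np
    import pywt
    from skimage import io, img_as_float
    from skimage.filters import threshold_otsu
    from skimage.morphology import binary_opening, remove_small_objects, disk

    img = img_as_float(io.imread("blood_smear.png", as_gray=True))  # hypothetical path

    # Wavelet denoising with db1, as favoured in the study above.
    coeffs = pywt.wavedec2(img, "db1", level=2)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745      # noise estimate
    thr = sigma * np.sqrt(2 * np.log(img.size))             # universal threshold
    coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in level)
        for level in coeffs[1:]
    ]
    den = pywt.waverec2(coeffs, "db1")

    # Global threshold plus mathematical morphology to segment the cells.
    cells = den < threshold_otsu(den)            # cells assumed darker than plasma
    cells = binary_opening(cells, disk(3))
    cells = remove_small_objects(cells, min_size=64)
    ```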

  20. Mathematical Model and Artificial Intelligent Techniques Applied to a Milk Industry through DSM

    NASA Astrophysics Data System (ADS)

    Babu, P. Ravi; Divya, V. P. Sree

    2011-08-01

    The resources for electrical energy are depleting and hence the gap between supply and demand is continuously increasing. Under such circumstances, the option left is the optimal utilization of available energy resources. The main objective of this chapter is to discuss peak load management and how to overcome the problems associated with it in processing industries, such as the milk industry, with the help of DSM techniques. The chapter presents a generalized mathematical model for minimizing the total operating cost of the industry subject to constraints. The work presented in this chapter also covers the results of applying Neural Network, Fuzzy Logic, and Demand Side Management (DSM) techniques to a medium-scale milk industry consumer in India to achieve an improvement in load factor, a reduction in Maximum Demand (MD), and savings in the consumer's energy bill.

  1. Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José

    This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition techniques (ASR) for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25 % correct classification rate, which is very promising and underpins the interest in this line of inquiry.
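    In the GMM approach standard in speaker recognition, one mixture is trained per class and an utterance is scored by a log-likelihood ratio. A minimal scikit-learn sketch, with random arrays standing in for the real apnoea/healthy acoustic features (the feature dimension and mixture size are illustrative):

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # X_healthy, X_apnoea: (n_frames, n_features) acoustic features (e.g. MFCCs);
    # random placeholders stand in for the carefully designed speech database.
    rng = np.random.default_rng(1)
    X_healthy = rng.normal(0.0, 1.0, (500, 13))
    X_apnoea = rng.normal(0.5, 1.2, (500, 13))

    gmm_h = GaussianMixture(n_components=8, covariance_type="diag").fit(X_healthy)
    gmm_a = GaussianMixture(n_components=8, covariance_type="diag").fit(X_apnoea)

    def classify(utterance):
        """Average per-frame log-likelihood ratio decides the class."""
        llr = gmm_a.score(utterance) - gmm_h.score(utterance)  # score() = mean LL
        return "apnoea" if llr > 0 else "healthy"

    print(classify(rng.normal(0.5, 1.2, (200, 13))))
    ```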

  2. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  3. The requirements and feasibility of business planning in the office of space and terrestrial applications

    NASA Technical Reports Server (NTRS)

    Greenberg, J. S.; Miller, B. P.

    1979-01-01

    The feasibility of applying strategic business planning techniques which are developed and used in the private sector to the planning of certain projects within the NASA Office of Space and Terrestrial Applications was assessed. The methods of strategic business planning that are currently in use in the private sector are examined. The typical contents of a private sector strategic business plan and the techniques commonly used to develop the contents of the plan are described, along with modifications needed to apply these concepts to public sector projects. The current long-range planning process in the Office of Space and Terrestrial Applications is reviewed and program initiatives that might be candidates for the use of strategic business planning techniques are identified. In order to more fully illustrate the information requirements of a strategic business plan for a NASA program, a sample business plan is prepared for a hypothetical Operational Earth Resources Satellite program.

  4. Autofocusing and Polar Body Detection in Automated Cell Manipulation.

    PubMed

    Wang, Zenan; Feng, Chen; Ang, Wei Tech; Tan, Steven Yih Min; Latt, Win Tun

    2017-05-01

    Autofocusing and feature detection are two essential processes for performing automated biological cell manipulation tasks. In this paper, we have introduced a technique capable of focusing on a holding pipette and a mammalian cell under a bright-field microscope automatically, and a technique that can detect and track the presence and orientation of the polar body of an oocyte that is rotated at the tip of a micropipette. Both algorithms were evaluated by using mouse oocytes. Experimental results show that both algorithms achieve very high success rates: 100% and 96%. As robust and accurate image processing methods, they can be widely applied to perform various automated biological cell manipulations.
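    The paper's exact focus criterion is not given, but a common autofocus recipe is to sweep the focal axis and maximize a sharpness measure such as the variance of the Laplacian; a sketch under that assumption (capture_at is a hypothetical camera/stage callback):

    ```python
    import numpy as np
    from scipy.ndimage import laplace

    def focus_measure(img):
        """Variance of the Laplacian: large when edges are in sharp focus."""
        return laplace(img.astype(np.float64)).var()

    def autofocus(capture_at, z_positions):
        """Sweep the focal axis and return the sharpest position.

        capture_at(z) is a hypothetical camera/stage callback returning a frame.
        """
        scores = [focus_measure(capture_at(z)) for z in z_positions]
        return z_positions[int(np.argmax(scores))]
    ```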

  5. Compressed sensing system considerations for ECG and EMG wireless biosensors.

    PubMed

    Dixon, Anna M R; Allstot, Emily G; Gangopadhyay, Daibashish; Allstot, David J

    2012-04-01

    Compressed sensing (CS) is an emerging signal processing paradigm that enables sub-Nyquist processing of sparse signals such as electrocardiogram (ECG) and electromyogram (EMG) biosignals. Consequently, it can be applied to biosignal acquisition systems to reduce the data rate to realize ultra-low-power performance. CS is compared to conventional and adaptive sampling techniques and several system-level design considerations are presented for CS acquisition systems including sparsity and compression limits, thresholding techniques, encoder bit-precision requirements, and signal recovery algorithms. Simulation studies show that compression factors greater than 16X are achievable for ECG and EMG signals with signal-to-quantization noise ratios greater than 60 dB.
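    A minimal end-to-end CS sketch: a synthetic k-sparse signal stands in for a sparse biosignal, a random Gaussian matrix takes 4x sub-Nyquist measurements, and orthogonal matching pursuit recovers the signal (one of several recovery algorithms in the design space the paper considers; all parameters are illustrative).

    ```python
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(0)
    n, m, k = 256, 64, 8            # ambient dimension, measurements, sparsity

    x = np.zeros(n)                 # k-sparse stand-in for a sparse biosignal
    support = rng.choice(n, k, replace=False)
    x[support] = rng.normal(0, 1, k)

    Phi = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random Gaussian sensing matrix
    y = Phi @ x                                   # 4x sub-Nyquist measurements

    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
    x_hat = omp.fit(Phi, y).coef_
    print("relative recovery error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
    ```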

  6. Performance assessment in algebra learning process

    NASA Astrophysics Data System (ADS)

    Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar

    2017-12-01

    The purpose of this research is to describe the implementation of performance assessment in the algebra learning process. The subject of this research is a class X mathematics educator at SMAN 1 Ngawi. This is a descriptive qualitative study. Data were collected through observation, interviews, and documentation. Data analysis was carried out by data reduction, data presentation, and drawing conclusions. The results indicate that the steps taken by the educator in applying performance assessment are: 1) preparing individual and group worksheets, 2) preparing assessment rubrics for the individual and group worksheets, and 3) applying the performance assessment rubric to learners' results on individual or group tasks.

  7. Investigations in adaptive processing of multispectral data

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Horwitz, H. M.

    1973-01-01

    Adaptive data processing procedures are applied to the problem of classifying objects in a scene scanned by a multispectral sensor. These procedures show a performance improvement over standard nonadaptive techniques. Some sources of error in classification are identified, and those correctable by adaptive processing are discussed. Experiments in the adaptation of signature means by decision-directed methods are described. Some of these methods assume correlation between the trajectories of different signature means; for others this assumption is not made.

  8. Method of measuring metal coating adhesion

    DOEpatents

    Roper, J.R.

    A method for measuring metal coating adhesion to a substrate material comprising the steps of preparing a test coupon of substrate material having the metal coating applied to one surface thereof, applying a second metal coating of gold or silver to opposite surfaces of the test coupon by hot hollow cathode process, applying a coating to one end of each of two pulling rod members, joining the coated ends of the pulling rod members to said opposite coated surfaces of the test coupon by a solid state bonding technique and finally applying instrumented static tensile loading to the pulling rod members until fracture of the metal coating adhesion to the substrate material occurs.

  9. Method of measuring metal coating adhesion

    DOEpatents

    Roper, John R.

    1985-01-01

    A method for measuring metal coating adhesion to a substrate material comprising the steps of preparing a test coupon of substrate material having the metal coating applied to one surface thereof, applying a second metal coating of gold or silver to opposite surfaces of the test coupon by hot hollow cathode process, applying a coating to one end of each of two pulling rod members, joining the coated ends of the pulling rod members to said opposite coated surfaces of the test coupon by a solid state bonding technique and finally applying instrumented static tensile loading to the pulling rod members until fracture of the metal coating adhesion to the substrate material occurs.

  10. A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.

    2011-01-01

    An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.

  11. Introducing Interactive Teaching Styles into Astronomy Lectures

    NASA Astrophysics Data System (ADS)

    Deming, G. L.

    1997-12-01

    The majority of undergraduate students who take an astronomy class are non-science majors attempting to satisfy a science requirement. Often in these "scientific literacy" courses, facts are memorized for the exam and forgotten shortly afterwards. Scientific literacy courses should advance student skills toward processing information and applying higher order thinking rather than simple recall and memorization of facts. Thinking about material as it is presented, applying new knowledge to solve problems, and thinking critically about topics are objectives that many astronomy instructors hope their students are achieving. A course in astronomy is more likely to achieve such goals if students routinely participate in their learning. Interactive techniques can be quite effective even in large classes. Examples of activities are presented that involve using cooperative learning techniques, writing individual and group "minute papers," identifying and correcting misconceptions, including the whole class in a demonstration, and applying knowledge to new situations.

  12. Technical Note: Detection of gas bubble leakage via correlation of water column multibeam images

    NASA Astrophysics Data System (ADS)

    Schneider von Deimling, J.; Papenberg, C.

    2011-07-01

    Hydroacoustic detection of natural gas release from the seafloor has been conducted in the past using singlebeam echosounders. In contrast, modern multibeam swath mapping systems allow much wider coverage, higher resolution, and 3-D spatial correlation. However, up to the present, the extremely high data rate has hampered water column backscatter investigations, and more sophisticated visualization and processing techniques for water column backscatter analysis are still under development. We present here such water column backscattering data gathered with a 50 kHz prototype multibeam system, shown as video frames grabbed over 75 s and as a "re-sorted" singlebeam presentation. Individual gas bubbles rising from the 24 m deep seafloor clearly emerge in the acoustic images, and rise velocities can be determined. A sophisticated processing scheme is introduced to identify those rising gas bubbles in the hydroacoustic data. It applies a cross-correlation technique, similar to that used in Particle Imaging Velocimetry (PIV), to the acoustic backscatter images. Spatio-temporal drift patterns of the bubbles are assessed and match measured and theoretical rise patterns very well. The application of this processing scheme to our field data gives impressive results with respect to unambiguous bubble detection and remote bubble rise velocimetry. The method can identify and exclude the main driver of misinterpretations, i.e., fish-mediated echoes. Even though image-based cross-correlation techniques are well known in the field of fluid mechanics for high-resolution, non-intrusive flow field analysis, this technique has never been applied in the proposed sense as an acoustic bubble detector.
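    The PIV-style correlation step can be prototyped with phase correlation between consecutive water-column frames; the sketch below (function and variable names are ours, not the authors') turns the dominant shift of an interrogation window into a rise velocity.

    ```python
    import numpy as np
    from skimage.registration import phase_cross_correlation

    def bubble_rise_velocity(frame_a, frame_b, dt, px_size):
        """Estimate the dominant backscatter displacement between two frames.

        frame_a, frame_b: consecutive water-column images (interrogation window);
        dt: time between frames in s; px_size: vertical pixel size in m.
        """
        shift, _, _ = phase_cross_correlation(frame_a, frame_b, upsample_factor=10)
        dy, dx = shift                 # row shift (depth) and column shift
        return -dy * px_size / dt      # rising bubbles move to smaller row indices
    ```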

  13. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.

  14. Optimal cure cycle design of a resin-fiber composite laminate

    NASA Technical Reports Server (NTRS)

    Hou, Jean W.; Hou, Tan H.; Sheen, Jeen S.

    1987-01-01

    Fiber-reinforced composites are used in many applications. The composite parts and structures are often manufactured by curing the prepreg or unmolded material. The magnitudes and durations of the cure temperature and cure pressure applied during the cure process have significant consequences for the performance of the finished product. The goal of this study is to exploit the potential of applying optimization techniques to cure cycle design. The press molding process of a polyester is used as an example. Various optimization formulations for the cure cycle design are investigated. Recommendations are given for further research in computerizing the cure cycle design.

  15. Electro-pumped whispering gallery mode ZnO microlaser array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, G. Y.; State Key Laboratory of Bioelectronics, School of Electronic Science and Engineering, Southeast University, Nanjing 210096; Li, J. T.

    2015-01-12

    By employing the vapor-phase transport method, ZnO microrods are fabricated and directly assembled on a p-GaN substrate to form a heterostructural microlaser array, which avoids the relatively complicated etching process used in previous work. Under an applied forward bias, whispering gallery mode ZnO ultraviolet lasing is obtained from the as-fabricated heterostructural microlaser array. The device's electroluminescence originates from three distinct electron-hole recombination processes at the heterojunction interface, and whispering gallery mode ultraviolet lasing is obtained when the applied voltage exceeds the lasing threshold. This work may represent a significant step toward a facile fabrication technique for future micro/nanolasers.

  16. Evaluating Quality of Decision-Making Processes in Medicines' Development, Regulatory Review, and Health Technology Assessment: A Systematic Review of the Literature

    PubMed Central

    Bujar, Magdalena; McAuslane, Neil; Walker, Stuart R.; Salek, Sam

    2017-01-01

    Introduction: Although pharmaceutical companies, regulatory authorities, and health technology assessment (HTA) agencies have been increasingly using decision-making frameworks, it is not certain whether these enable better quality decision making. This could be addressed by formally evaluating the quality of decision-making process within those organizations. The aim of this literature review was to identify current techniques (tools, questionnaires, surveys, and studies) for measuring the quality of the decision-making process across the three stakeholders. Methods: Using MEDLINE, Web of Knowledge, and other Internet-based search engines, a literature review was performed to systematically identify techniques for assessing quality of decision making in medicines development, regulatory review, and HTA. A structured search was applied using key words and a secondary review was carried out. In addition, the measurement properties of each technique were assessed and compared. Ten Quality Decision-Making Practices (QDMPs) developed previously were then used as a framework for the evaluation of techniques identified in the review. Due to the variation in studies identified, meta-analysis was inappropriate. Results: This review identified 13 techniques, where 7 were developed specifically to assess decision making in medicines' development, regulatory review, or HTA; 2 examined corporate decision making, and 4 general decision making. Regarding how closely each technique conformed to the 10 QDMPs, the 13 techniques assessed a median of 6 QDMPs, with a mode of 3 QDMPs. Only 2 techniques evaluated all 10 QDMPs, namely the Organizational IQ and the Quality of Decision Making Orientation Scheme (QoDoS), of which only one technique, QoDoS could be applied to assess decision making of both individuals and organizations, and it possessed generalizability to capture issues relevant to companies as well as regulatory authorities. Conclusion: This review confirmed a general paucity of research in this area, particularly regarding the development and systematic application of techniques for evaluating quality decision making, with no consensus around a gold standard. This review has identified QoDoS as the most promising available technique for assessing decision making in the lifecycle of medicines and the next steps would be to further test its validity, sensitivity, and reliability. PMID:28443022

  17. Potential application of quantitative microbiological risk assessment techniques to an aseptic-UHT process in the food industry.

    PubMed

    Pujol, Laure; Albert, Isabelle; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2013-04-01

    Aseptic ultra-high-temperature (UHT)-type processed food products (e.g., milk or soup) are ready to eat products which are consumed extensively globally due to a combination of their comparative high quality and long shelf life, with no cold chain or other preservation requirements. Due to the inherent microbial vulnerability of aseptic-UHT product formulations, the safety and stability-related performance objectives (POs) required at the end of the manufacturing process are the most demanding found in the food industry. The key determinants to achieving sterility, and which also differentiates aseptic-UHT from in-pack sterilised products, are the challenges associated with the processes of aseptic filling and sealing. This is a complex process that has traditionally been run using deterministic or empirical process settings. Quantifying the risk of microbial contamination and recontamination along the aseptic-UHT process, using the scientifically based process quantitative microbial risk assessment (QMRA), offers the possibility to improve on the currently tolerable sterility failure rate (i.e., 1 defect per 10,000 units). In addition, benefits of applying QMRA are (i) to implement process settings in a transparent and scientific manner; (ii) to develop a uniform common structure whatever the production line, leading to a harmonisation of these process settings, and; (iii) to bring elements of a cost-benefit analysis of the management measures. The objective of this article is to explore how QMRA techniques and risk management metrics may be applied to aseptic-UHT-type processed food products. In particular, the aseptic-UHT process should benefit from a number of novel mathematical and statistical concepts that have been developed in the field of QMRA. Probabilistic techniques such as Monte Carlo simulation, Bayesian inference and sensitivity analysis, should help in assessing the compliance with safety and stability-related POs set at the end of the manufacturing process. The understanding of aseptic-UHT process contamination will be extended beyond the current "as-low-as-reasonably-achievable" targets to a risk-based framework, through which current sterility performance and future process designs can be optimised. Copyright © 2013 Elsevier B.V. All rights reserved.
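    To make the probabilistic machinery concrete, here is a minimal Monte Carlo sketch of a sterility performance-objective check; the contamination and lethality distributions are purely illustrative, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000                                # simulated packs

    # Hypothetical distributions: initial spore load per pack (log10 CFU)
    # and the log10 reduction delivered by the UHT step.
    log_n0 = rng.normal(2.0, 0.5, n)             # ~100 CFU per pack on average
    log_red = rng.normal(9.0, 0.7, n)            # ~9-log process lethality

    log_survivors = log_n0 - log_red
    # A pack is non-sterile if at least one organism survives; with expected
    # survivors s = 10**log_survivors, P(>=1) = 1 - exp(-s) (Poisson).
    p_fail = 1.0 - np.exp(-np.power(10.0, log_survivors))
    print(f"estimated failure rate: {p_fail.mean():.2e} per pack")
    ```

    Comparing such an estimate against the tolerable 1-in-10,000 failure rate is the kind of PO compliance check the probabilistic framework supports.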

  18. PETPVC: a toolbox for performing partial volume correction techniques in positron emission tomography

    NASA Astrophysics Data System (ADS)

    Thomas, Benjamin A.; Cuplov, Vesna; Bousse, Alexandre; Mendes, Adriana; Thielemans, Kris; Hutton, Brian F.; Erlandsson, Kjell

    2016-11-01

    Positron emission tomography (PET) images are degraded by a phenomenon known as the partial volume effect (PVE). Approaches have been developed to reduce PVEs, typically through the utilisation of structural information provided by other imaging modalities such as MRI or CT. These methods, known as partial volume correction (PVC) techniques, reduce PVEs by compensating for the effects of the scanner resolution, thereby improving the quantitative accuracy. The PETPVC toolbox described in this paper comprises a suite of methods, both classic and more recent approaches, for the purposes of applying PVC to PET data. Eight core PVC techniques are available. These core methods can be combined to create a total of 22 different PVC techniques. Simulated brain PET data are used to demonstrate the utility of toolbox in idealised conditions, the effects of applying PVC with mismatched point-spread function (PSF) estimates and the potential of novel hybrid PVC methods to improve the quantification of lesions. All anatomy-based PVC techniques achieve complete recovery of the PET signal in cortical grey matter (GM) when performed in idealised conditions. Applying deconvolution-based approaches results in incomplete recovery due to premature termination of the iterative process. PVC techniques are sensitive to PSF mismatch, causing a bias of up to 16.7% in GM recovery when over-estimating the PSF by 3 mm. The recovery of both GM and a simulated lesion was improved by combining two PVC techniques together. The PETPVC toolbox has been written in C++, supports Windows, Mac and Linux operating systems, is open-source and publicly available.
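    As an illustration of the deconvolution-based PVC family mentioned above (not the PETPVC implementation itself), a Richardson-Lucy loop with a Gaussian scanner PSF can be written in a few lines; the fixed, small iteration count mirrors the premature-termination behaviour noted in the abstract.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def richardson_lucy_pvc(pet, fwhm_px, n_iter=10):
        """Iterative deconvolution PVC with a Gaussian scanner PSF (a sketch of
        the deconvolution-based family of methods, not the PETPVC code)."""
        sigma = fwhm_px / 2.355
        blur = lambda x: gaussian_filter(x, sigma)   # symmetric PSF, so H^T = H
        est = pet.astype(float).copy()
        for _ in range(n_iter):
            est *= blur(pet / np.maximum(blur(est), 1e-12))
        return est
    ```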

  19. New signal processing technique for density profile reconstruction using reflectometry.

    PubMed

    Clairet, F; Ricaud, B; Briolle, F; Heuraux, S; Bottereau, C

    2011-08-01

    Reflectometry profile measurement requires an accurate determination of the plasma reflected signal. Along with a good resolution and a high signal to noise ratio of the phase measurement, adequate data analysis is required. A new data processing based on time-frequency tomographic representation is used. It provides a clearer separation between multiple components and improves isolation of the relevant signals. In this paper, this data processing technique is applied to two sets of signals coming from two different reflectometer devices used on the Tore Supra tokamak. For the standard density profile reflectometry, it improves the initialization process and its reliability, providing a more accurate profile determination in the far scrape-off layer with density measurements as low as 10(16) m(-1). For a second reflectometer, which provides measurements in front of a lower hybrid launcher, this method improves the separation of the relevant plasma signal from multi-reflection processes due to the proximity of the plasma.

  20. Assessment of hospital processes using a process mining technique: Outpatient process analysis at a tertiary hospital.

    PubMed

    Yoo, Sooyoung; Cho, Minsu; Kim, Eunhye; Kim, Seok; Sim, Yerim; Yoo, Donghyun; Hwang, Hee; Song, Minseok

    2016-04-01

    Many hospitals are increasing their efforts to improve processes because processes play an important role in enhancing work efficiency and reducing costs. However, to date, a quantitative tool has not been available to examine the before and after effects of processes and environmental changes, other than the use of indirect indicators, such as mortality rate and readmission rate. This study used process mining technology to analyze process changes based on changes in the hospital environment, such as the construction of a new building, and to measure the effects of environmental changes in terms of consultation wait time, time spent per task, and outpatient care processes. Using process mining technology, electronic health record (EHR) log data of outpatient care before and after constructing a new building were analyzed, and the effectiveness of the technology in terms of the process was evaluated. Using the process mining technique, we found that the total time spent in outpatient care did not increase significantly compared to that before the construction of a new building, considering that the number of outpatients increased, and the consultation wait time decreased. These results suggest that the operation of the outpatient clinic was effective after changes were implemented in the hospital environment. We further identified improvements in processes using the process mining technique, thereby demonstrating the usefulness of this technique for analyzing complex hospital processes at a low cost. This study confirmed the effectiveness of process mining technology at an actual hospital site. In future studies, the use of process mining technology will be expanded by applying this approach to a larger variety of process change situations. Copyright © 2016. Published by Elsevier Ireland Ltd.

  1. On-line coupling of supercritical fluid extraction and chromatographic techniques.

    PubMed

    Sánchez-Camargo, Andrea Del Pilar; Parada-Alfonso, Fabián; Ibáñez, Elena; Cifuentes, Alejandro

    2017-01-01

    This review summarizes and discusses recent advances and applications of on-line supercritical fluid extraction coupled to liquid chromatography, gas chromatography, and supercritical fluid chromatography. Supercritical fluids, due to their exceptional physical properties, provide unique opportunities not only during the extraction step but also in the separation process. Although supercritical fluid extraction is especially suitable for the recovery of non-polar organic compounds, the technique can also be successfully applied to the extraction of polar analytes with the aid of modifiers. The supercritical fluid extraction process can be performed following "off-line" or "on-line" approaches, whose main features are contrasted herein. Besides, the parameters affecting the supercritical fluid extraction process are explained, and a "decision tree" is presented for the first time in this review as a guide for method development. The general principles (instrumental and methodological) of the different on-line couplings of supercritical fluid extraction with chromatographic techniques are described, and the advantages and shortcomings of supercritical fluid extraction as a hyphenated technique are discussed. An update of the most recent applications (from 2005 onwards) of these couplings is also presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    PubMed

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells become dead, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.

  3. Collaborative care: Using six thinking hats for decision making.

    PubMed

    Cioffi, Jane Marie

    2017-12-01

    To apply the six thinking hats technique to decision making in collaborative care. In collaborative partnerships, effective communication needs to occur in patient, family, and health care professional meetings. The effectiveness of these meetings depends on the engagement of participants and the quality of the meeting process. The use of the six thinking hats technique to engage all participants in effective dialogue is proposed. Discussion paper. The electronic databases CINAHL, PubMed, and ScienceDirect were searched for the years 1990 to 2017. Using the six thinking hats technique in patient and family meetings, nurses can guide a process of dialogue that focuses decision making to build equal care partnerships inclusive of all participants. Nurses will need to develop the skills for using the six thinking hats technique and provide support to all participants during the meeting process. Collaborative decision making can be augmented by the six thinking hats technique to provide patients, families, and health professionals with opportunities to make informed decisions about care that consider key issues for all involved. Nurses, who are most often advocates for patients and their families, are in a unique position to lead this initiative in meetings as they network with all health professionals. © 2017 John Wiley & Sons Australia, Ltd.

  4. Indirect three-dimensional printing of synthetic polymer scaffold based on thermal molding process.

    PubMed

    Park, Jeong Hun; Jung, Jin Woo; Kang, Hyun-Wook; Cho, Dong-Woo

    2014-06-01

    One of the major issues in tissue engineering has been the development of three-dimensional (3D) scaffolds, which serve as a structural template for cell growth and extracellular matrix formation. In scaffold-based tissue engineering, 3D printing (3DP) technology has been successfully applied for the fabrication of complex 3D scaffolds by using both direct and indirect techniques. In principle, direct 3DP techniques rely on the straightforward utilization of the final scaffold materials during the actual scaffold fabrication process. In contrast, indirect 3DP techniques use a negative mold based on a scaffold design, to which the desired biomaterial is cast and then sacrificed to obtain the final scaffold. Such indirect 3DP techniques generally impose a solvent-based process for scaffold fabrication, resulting in a considerable increase in the fabrication time and poor mechanical properties. In addition, the internal architecture of the resulting scaffold is affected by the properties of the biomaterial solution. In this study, we propose an advanced indirect 3DP technique using projection-based micro-stereolithography and an injection molding system (IMS) in order to address these challenges. The scaffold was fabricated by a thermal molding process using IMS to overcome the limitation of the solvent-based molding process in indirect 3DP techniques. The results indicate that the thermal molding process using an IMS has achieved a substantial reduction in scaffold fabrication time and has also provided the scaffold with higher mechanical modulus and strength. In addition, cell adhesion and proliferation studies have indicated no significant difference in cell activity between the scaffolds prepared by solvent-based and thermal molding processes.

  5. Creative Stories: A Storytelling Game Fostering Creativity

    ERIC Educational Resources Information Center

    Koukourikos, Antonis; Karampiperis, Pythagoras; Panagopoulos, George

    2014-01-01

    The process of identifying techniques for fostering creativity, and applying these theoretical constructs in real-world educational activities, is, by nature, multifaceted and not straightforward, pertaining to several fields such as cognitive theory and psychology. Furthermore, the quantification of the impact of different activities on…

  6. Applying a Continuous Quality Improvement Model To Assess Institutional Effectiveness.

    ERIC Educational Resources Information Center

    Roberts, Keith

    This handbook outlines techniques and processes for improving institutional effectiveness and ensuring continuous quality improvement, based on strategic planning activities at Wisconsin's Milwaukee Area Technical College (MATC). First, institutional effectiveness is defined and 17 core indicators of effectiveness developed by the Wisconsin…

  7. Speech Recognition for A Digital Video Library.

    ERIC Educational Resources Information Center

    Witbrock, Michael J.; Hauptmann, Alexander G.

    1998-01-01

    Production of the meta-data supporting the Informedia Digital Video Library interface is automated using techniques derived from artificial intelligence research. Speech recognition and natural-language processing, information retrieval, and image analysis are applied to produce an interface that helps users locate information and navigate more…

  8. Simulated Batch Production of Penicillin

    ERIC Educational Resources Information Center

    Whitaker, A.; Walker, J. D.

    1973-01-01

    Describes a program in applied biology in which the simulation of the production of penicillin in a batch fermentor is used as a teaching technique to give students experience before handling a genuine industrial fermentation process. Details are given for the calculation of minimum production cost. (JR)

  9. Phase-locked-loop interferometry applied to aspheric testing with a computer-stored compensator.

    PubMed

    Servin, M; Malacara, D; Rodriguez-Vera, R

    1994-05-01

    A recently developed technique for continuous-phase determination of interferograms with a digital phase-locked loop (PLL) is applied to the null testing of aspheres. Although this PLL demodulating scheme is also a synchronous or direct interferometric technique, a separate unwrapping process is not explicitly required. The unwrapping and the phase-detection processes are achieved simultaneously within the PLL. The proposed method uses a computer-generated holographic compensator. The holographic compensator does not need to be printed out by any means; it is calculated and used from the computer. This computer-stored compensator is used as the reference signal to phase demodulate a sample interferogram obtained from the asphere being tested. Consequently, the demodulated phase contains information about the wave-front departures from the ideal computer-stored aspheric interferogram. Wave-front differences of ~1 λ are handled easily by the proposed PLL scheme. The maximum recorded frequencies in the template's interferogram and in the sampled interferogram are assumed to be below the Nyquist frequency.
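
    As a rough illustration of the core idea only (not the authors' implementation), the sketch below demodulates a one-dimensional fringe signal with a first-order digital PLL; the test signal, nominal frequency f0 and loop gain are invented for the example, and a practical loop would low-pass filter the detector output:

      import numpy as np

      def pll_demodulate(fringe, f0, gain=0.05):
          # Track the instantaneous phase of a 1-D fringe signal with a
          # first-order digital PLL. Because the loop integrates its own
          # phase estimate, the output is already unwrapped.
          phase = np.empty(len(fringe))
          theta = 0.0
          for n, s in enumerate(fringe):
              err = -s * np.sin(theta)               # phase detector
              theta += 2 * np.pi * f0 + gain * err   # loop update
              phase[n] = theta
          return phase

      x = np.arange(2048)
      true_phase = 2 * np.pi * 0.05 * x + 2.0 * np.sin(2 * np.pi * x / 2048)
      est = pll_demodulate(np.cos(true_phase), f0=0.05)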

  10. Technical support for creating an artificial intelligence system for feature extraction and experimental design

    NASA Technical Reports Server (NTRS)

    Glick, B. J.

    1985-01-01

    Techniques for classifying objects into groups or classes go under many different names including, most commonly, cluster analysis. Mathematically, the general problem is to find a best mapping of objects into an index set consisting of class identifiers. When an a priori grouping of objects exists, the process of deriving the classification rules from samples of classified objects is known as discrimination. When such rules are applied to objects of unknown class, the process is denoted classification. The specific problem addressed involves the group classification of a set of objects that are each associated with a series of measurements (ratio, interval, ordinal, or nominal levels of measurement). Each measurement produces one variable in a multidimensional variable space. Cluster analysis techniques are reviewed and methods for including geographic location, distance measures, and spatial pattern (distribution) as parameters in clustering are examined. For the case of patterning, measures of spatial autocorrelation are discussed in terms of the kind of data (nominal, ordinal, or interval scaled) to which they may be applied.
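
    As a concrete (hypothetical) illustration of the discrimination/classification distinction described above, a nearest-centroid sketch: rules are derived from labeled samples, then applied to objects of unknown class. Geographic location could be included simply by appending coordinates as extra variables.

      import numpy as np

      def discriminate(samples, labels):
          # Discrimination: derive rules (class centroids) from classified samples.
          return {c: samples[labels == c].mean(axis=0) for c in np.unique(labels)}

      def classify(objects, centroids):
          # Classification: apply the derived rules to objects of unknown class.
          names = list(centroids)
          dists = np.stack([np.linalg.norm(objects - centroids[c], axis=1) for c in names])
          return [names[i] for i in dists.argmin(axis=0)]

      rng = np.random.default_rng(0)
      train = np.vstack([rng.normal(0, 1, (20, 3)), rng.normal(4, 1, (20, 3))])
      rules = discriminate(train, np.repeat([0, 1], 20))
      print(classify(rng.normal(4, 1, (5, 3)), rules))   # -> mostly class 1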

  11. Efficient generation of discontinuity-preserving adaptive triangulations from range images.

    PubMed

    Garcia, Miguel Angel; Sappa, Angel Domingo

    2004-10-01

    This paper presents an efficient technique for generating adaptive triangular meshes from range images. The algorithm consists of two stages. First, a user-defined number of points is adaptively sampled from the given range image. Those points are chosen by taking into account the surface shapes represented in the range image in such a way that points tend to group in areas of high curvature and to disperse in low-variation regions. This selection process is done through a noniterative, inherently parallel algorithm in order to gain efficiency. Once the image has been subsampled, the second stage applies a two and one half-dimensional Delaunay triangulation to obtain an initial triangular mesh. To favor the preservation of surface and orientation discontinuities (jump and crease edges) present in the original range image, the aforementioned triangular mesh is iteratively modified by applying an efficient edge flipping technique. Results with real range images show accurate triangular approximations of the given range images with low processing times.
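
    A compressed sketch of the two-stage idea under stated assumptions (gradient magnitude as the curvature proxy, and the iterative edge-flipping stage omitted); the function and parameter names are invented:

      import numpy as np
      from scipy.ndimage import generic_gradient_magnitude, sobel
      from scipy.spatial import Delaunay

      def adaptive_mesh(range_image, n_points=2000, seed=1):
          # Stage 1: sample points with probability proportional to local
          # variation, so they cluster in high-curvature areas and
          # disperse in low-variation regions.
          g = generic_gradient_magnitude(range_image.astype(float), sobel)
          p = (g + 1e-3).ravel()
          idx = np.random.default_rng(seed).choice(g.size, n_points,
                                                   replace=False, p=p / p.sum())
          rows, cols = np.unravel_index(idx, g.shape)
          # Stage 2: an initial 2.5-D mesh -- triangulate in the image plane
          # and attach the range values as heights.
          tri = Delaunay(np.column_stack([cols, rows]))
          return tri, range_image[rows, cols]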

  12. Cider fermentation process monitoring by Vis-NIR sensor system and chemometrics.

    PubMed

    Villar, Alberto; Vadillo, Julen; Santos, Jose I; Gorritxategi, Eneko; Mabe, Jon; Arnaiz, Aitor; Fernández, Luis A

    2017-04-15

    Optimization of a multivariate calibration process has been undertaken for a Visible-Near Infrared (400-1100 nm) sensor system, applied in the monitoring of the fermentation process of the cider produced in the Basque Country (Spain). The main parameters that were monitored included alcoholic proof, l-lactic acid content, glucose+fructose and acetic acid content. The multivariate calibration was carried out using a combination of different variable selection techniques, and the most suitable pre-processing strategies were selected based on the spectral characteristics obtained by the sensor system. The variable selection techniques studied in this work include the Martens uncertainty test, interval Partial Least Squares regression (iPLS) and a Genetic Algorithm (GA). This procedure arises from the need to improve the calibration models' prediction ability for cider monitoring.
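
    A minimal sketch of interval PLS (iPLS) variable selection, one of the techniques named above, using scikit-learn; the interval count, component limit and scoring are illustrative assumptions rather than the authors' settings:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_score

      def ipls_select(X, y, n_intervals=10, max_components=3):
          # Split the spectrum into contiguous wavelength intervals, fit a
          # PLS model per interval, and keep the best cross-validated one.
          bounds = np.linspace(0, X.shape[1], n_intervals + 1, dtype=int)
          scores = []
          for lo, hi in zip(bounds[:-1], bounds[1:]):
              pls = PLSRegression(n_components=min(max_components, hi - lo))
              scores.append(cross_val_score(pls, X[:, lo:hi], y, cv=5).mean())
          best = int(np.argmax(scores))
          return bounds[best], bounds[best + 1]   # selected wavelength-index window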

  13. Noise reduction techniques for Bayer-matrix images

    NASA Astrophysics Data System (ADS)

    Kalevo, Ossi; Rantanen, Henry

    2002-04-01

    In this paper, arrangements for applying Noise Reduction (NR) techniques to images captured by a single-sensor digital camera are studied. Usually, the NR filter processes full three-color-component image data. This requires that the raw Bayer-matrix image data available from the image sensor first be interpolated using a Color Filter Array Interpolation (CFAI) method. Another choice is to process the raw Bayer-matrix image data directly. The advantages and disadvantages of both processing orders, before (pre-) CFAI and after (post-) CFAI, are studied with linear, multi-stage median, multi-stage median hybrid and median-rational filters. The comparison is based on the quality of the output image, the processing power requirements and the amount of memory needed. A solution that improves the preservation of details when NR filtering is performed before the CFAI is also proposed.
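
    To make the pre-CFAI ordering concrete, a hedged sketch that median-filters raw Bayer data one colour plane at a time (an RGGB layout is assumed), so samples of different colours are never mixed before interpolation:

      import numpy as np
      from scipy.ndimage import median_filter

      def denoise_bayer(raw, size=3):
          # Filter each of the four Bayer sub-lattices separately; this is
          # NR before CFAI, operating on one colour component at a time.
          out = raw.astype(float)
          for dy in (0, 1):
              for dx in (0, 1):
                  out[dy::2, dx::2] = median_filter(out[dy::2, dx::2], size=size)
          return out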

  14. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data Assimilation techniques are essential elements in state-of-the-art development of models and their optimization with data in the field of groundwater, surface water and soil systems. They are essential tools in calibration of complex modelling systems and improvement of model forecasts. OpenDA is a new and generic open source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan; 2007] and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood forecasting purposes, A.H. Weerts, G.Y.H. El Serafy, S. Hummel, J. Dhondia, and H. Gerritsen (2009), accepted by Geoscience & Computers.

  15. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    NASA Astrophysics Data System (ADS)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded due to the complicated wave dispersion phenomena, the existence of multiple wave modes, the high susceptibility to diverse interferences, the bulky sampled data and the difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed; this was actualized in a signal processing package (SPP). The ISPPR technique comprehensively functions as signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that the elastic wave propagation-based damage assessment could be dramatically streamlined by introduction of the ISPPR technique.
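
    A toy sketch of the two stages such a pipeline combines, under invented data and parameters: wavelet sub-band energies compress a raw wave signal into a concise feature vector, and a small neural network maps features to damage classes. This illustrates the general approach, not the authors' SPP package:

      import numpy as np
      import pywt
      from sklearn.neural_network import MLPClassifier

      def wavelet_features(signal, wavelet="db4", level=4):
          # Filtration + compression + characteristic extraction in one step:
          # keep only the energy of each wavelet sub-band.
          return np.array([np.sum(c ** 2) for c in pywt.wavedec(signal, wavelet, level=level)])

      rng = np.random.default_rng(2)
      X = np.array([wavelet_features(rng.normal(0, 1 + k, 256))
                    for k in (0, 1) for _ in range(30)])
      y = np.repeat([0, 1], 30)     # 0 = pristine, 1 = damaged (synthetic labels)
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)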

  16. Field Calibration of Wind Direction Sensor to the True North and Its Application to the Daegwanryung Wind Turbine Test Sites

    PubMed Central

    Lee, Jeong Wan

    2008-01-01

    This paper proposes a field calibration technique for aligning a wind direction sensor to true north. The technique uses synchronized measurements of images captured by a camera and the output voltage of the wind direction sensor. The true wind direction was evaluated from the captured pictures of the sensor using image processing techniques, in the least-squares sense, and the evaluated true value was then compared with the measured output voltage of the sensor. This technique solves the discordance problem of the wind direction sensor that arises when installing the meteorological mast. Some uncertainty analyses are presented for the proposed technique and the calibration accuracy is discussed. Finally, the technique was applied to the real meteorological mast at the Daegwanryung test site, and statistical analysis of the experimental data gave stable estimates of the misalignment and its uncertainty level. It is confirmed that the misalignment error relative to exact north can be expected to stay within the stated credibility level. PMID:27873957
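
    For the offset estimation itself, a minimal sketch: given paired directions from the image processing and from the sensor, the least-squares misalignment for angular data is the circular mean of their differences (the sample values below are invented):

      import numpy as np

      def misalignment_deg(theta_image_deg, theta_sensor_deg):
          # Circular mean of the angle differences: the least-squares
          # constant offset between image-derived and sensor directions.
          d = np.deg2rad(np.asarray(theta_image_deg) - np.asarray(theta_sensor_deg))
          return np.rad2deg(np.arctan2(np.sin(d).mean(), np.cos(d).mean()))

      print(misalignment_deg([10, 95, 182, 271], [7, 92, 179, 268]))   # ~3.0 deg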

  17. Semiconductor/dielectric interface engineering and characterization

    NASA Astrophysics Data System (ADS)

    Lucero, Antonio T.

    The focus of this dissertation is the application and characterization of several novel interface passivation techniques for III-V semiconductors, and the development of an in-situ electrical characterization capability. Two different interface passivation techniques were evaluated. The first is interface nitridation using a nitrogen radical plasma source. The nitrogen radical plasma generator is a unique system which is capable of producing a large flux of N-radicals free of energetic ions. This was applied to Si and the surface was studied using x-ray photoelectron spectroscopy (XPS). Ultra-thin nitride layers could be formed from 200-400 °C. Metal-oxide-semiconductor capacitors (MOSCAPs) were fabricated using this passivation technique. Interface nitridation was able to reduce leakage current and improve the equivalent oxide thickness of the devices. The second passivation technique studied is the atomic layer deposition (ALD) diethylzinc (DEZ)/water treatment of sulfur-treated InGaAs and GaSb. On InGaAs this passivation technique is able to chemically reduce higher oxidation states on the surface, and the process results in the deposition of a ZnS/ZnO interface passivation layer, as determined by XPS. Capacitance-voltage (C-V) measurements of MOSCAPs made on p-InGaAs reveal a large reduction in accumulation dispersion and a reduction in the density of interfacial traps. The same technique was applied to GaSb and the process was studied in an in-situ half-cycle XPS experiment. DEZ/H2O is able to remove all Sb-S from the surface, forming a stable ZnS passivation layer. This passivation layer is resistant to further reoxidation during dielectric deposition. The final part of this dissertation is the design and construction of an ultra-high vacuum cluster tool for in-situ electrical characterization. The system consists of three deposition chambers coupled to an electrical probe station. With this setup, devices can be processed and subsequently electrically characterized without exposing the sample to air. This is the first time that such a system has been reported. A special air-gap C-V probe will allow top-gated measurements to be made, allowing semiconductor-dielectric interfaces to be studied during device processing.

  18. Marine geodetic control for geoidal profile mapping across the Puerto Rican Trench

    NASA Technical Reports Server (NTRS)

    Fubara, D. M.; Mourad, A. G.

    1975-01-01

    A marine geodetic control was established for the northern end of the geoidal profile mapping experiment across the Puerto Rican Trench by determining the three-dimensional geodetic coordinates of the four ocean-bottom mounted acoustic transponders. The data reduction techniques employed and analytical processes involved are described. Before applying the analytical techniques to the field data, they were tested with simulated data and proven to be effective in theory as well as in practice.

  19. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  20. Protein folding optimization based on 3D off-lattice model via an improved artificial bee colony algorithm.

    PubMed

    Li, Bai; Lin, Mu; Liu, Qiao; Li, Ya; Zhou, Changjun

    2015-10-01

    Protein folding is a fundamental topic in molecular biology. Conventional experimental techniques for protein structure identification or protein folding recognition require strict laboratory requirements and heavy operating burdens, which have largely limited their applications. Alternatively, computer-aided techniques have been developed to optimize protein structures or to predict the protein folding process. In this paper, we utilize a 3D off-lattice model to describe the original protein folding scheme as a simplified energy-optimal numerical problem, where all types of amino acid residues are binarized into hydrophobic and hydrophilic ones. We apply a balance-evolution artificial bee colony (BE-ABC) algorithm as the minimization solver, whose distinguishing feature is the adaptive adjustment of search intensity to cater to the varying needs of the entire optimization process. In this work, we establish a benchmark case set with 13 real protein sequences from the Protein Data Bank database and evaluate the convergence performance of the BE-ABC algorithm through strict comparisons with several state-of-the-art ABC variants in short-term numerical experiments. In addition, the best-so-far protein structures we obtained are compared with those reported in the previous literature. This study also provides preliminary insights into how artificial intelligence techniques can be applied to reveal the dynamics of protein folding. Graphical Abstract: Protein folding optimization using a 3D off-lattice model and advanced optimization techniques.
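
    A sketch of the kind of energy function being minimized, for the common 3D AB off-lattice convention (a bending term plus a Lennard-Jones-like pair term whose depth depends on residue types); the coefficients below are the usual C(AA)=1, C(BB)=0.5, C(AB)=-0.5 choice with unit bond lengths assumed, though exact values vary across papers:

      import numpy as np

      def ab_energy(coords, seq):
          # seq is a string of "A" (hydrophobic) / "B" (hydrophilic) residues.
          coords = np.asarray(coords, dtype=float)
          # Bending energy over consecutive (unit-length) bond vectors.
          bonds = np.diff(coords, axis=0)
          e = np.sum(1.0 - np.sum(bonds[:-1] * bonds[1:], axis=1)) / 4.0
          # Species-dependent Lennard-Jones-like term over non-adjacent pairs.
          for i in range(len(seq) - 2):
              for j in range(i + 2, len(seq)):
                  r = np.linalg.norm(coords[i] - coords[j])
                  c = 1.0 if seq[i] == seq[j] == "A" else (0.5 if seq[i] == seq[j] else -0.5)
                  e += 4.0 * (r ** -12 - c * r ** -6)
          return e   # an optimizer such as BE-ABC would minimize this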

  1. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho

    2015-01-01

    Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding statistical independence of signal mixtures and has been successfully applied to myriad fields such as medical science, image processing, and numerous others. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when using a conventional ICA technique for vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on the conventional ICA is proposed to mitigate these problems. To extract more stable source signals in a valid order, the proposed method iteratively reorders the extracted mixing matrix and reconstructs the finally converged source signals, guided by the magnitudes of the correlation coefficients between the intermediately separated signals and signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to real problems of complex structures, an experiment has been carried out on a scaled submarine mockup. The results show that the proposed method could resolve the inherent problems of a conventional ICA technique.
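
    A condensed sketch of the reordering idea, using scikit-learn's FastICA in place of the authors' iterative scheme: the arbitrary order and sign of the separated signals are fixed by correlating each against reference measurements taken on or near the sources (the array shapes and names are assumptions):

      import numpy as np
      from sklearn.decomposition import FastICA

      def ica_reordered(mixtures, references):
          # mixtures: (channels, time); references: (sources, time).
          k = references.shape[0]
          sources = FastICA(n_components=k, random_state=0).fit_transform(mixtures.T).T
          corr = np.corrcoef(np.vstack([sources, references]))[:k, k:]
          order = np.abs(corr).argmax(axis=0)          # best-matching source per reference
          signs = np.sign(corr[order, np.arange(k)])   # fix the arbitrary polarity
          return sources[order] * signs[:, None]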

  2. Non-Intrusive Measurement Techniques Applied to the Hybrid Solid Fuel Degradation

    NASA Astrophysics Data System (ADS)

    Cauty, F.

    2004-10-01

    The knowledge of the solid fuel regression rate and the time evolution of the grain geometry is required for hybrid motor design and for control of its operating conditions. Two non-intrusive techniques (NDT), both based on wave propagation through the material, have been applied to hybrid propulsion: X-rays and ultrasound. X-ray techniques allow local thickness measurements (attenuated signal level) using small probes, or 2D images (Real-Time Radiography), with a trade-off between the size of the field of view and accuracy. Besides the safety hazards associated with high-intensity X-ray systems, the image analysis requires the use of quite complex post-processing techniques. The ultrasound technique is more widely used in energetic material applications, including hybrid fuels. Depending upon the transducer size and the associated equipment, the application domain is large, from tiny samples to the quad-port wagon-wheel grain of the 1.1 MN thrust HPDP motor. The effect of the physical quantities has to be taken into account in the wave propagation analysis. With respect to the various applications, there is no unique and perfect experimental method to measure the fuel regression rate. The best solution could be obtained by combining two techniques at the same time, each technique enhancing the quality of the global data.

  3. Review of surface steam sterilization for validation purposes.

    PubMed

    van Doornmalen, Joost; Kopinga, Klaas

    2008-03-01

    Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.

  4. Reconstructing the Sky Location of Gravitational-Wave Detected Compact Binary Systems: Methodology for Testing and Comparison

    NASA Technical Reports Server (NTRS)

    Sidery, T.; Aylott, B.; Christensen, N.; Farr, B.; Farr, W.; Feroz, F.; Gair, J.; Grover, K.; Graff, P.; Hanna, C.; et al.

    2014-01-01

    The problem of reconstructing the sky position of compact binary coalescences detected via gravitational waves is a central one for future observations with the ground-based network of gravitational-wave laser interferometers, such as Advanced LIGO and Advanced Virgo. Different techniques for sky localization have been independently developed. They can be divided in two broad categories: fully coherent Bayesian techniques, which are high latency and aimed at in-depth studies of all the parameters of a source, including sky position, and "triangulation-based" techniques, which exploit the data products from the search stage of the analysis to provide an almost real-time approximation of the posterior probability density function of the sky location of a detection candidate. These techniques have previously been applied to data collected during the last science runs of gravitational-wave detectors operating in the so-called initial configuration. Here, we develop and analyze methods for assessing the self consistency of parameter estimation methods and carrying out fair comparisons between different algorithms, addressing issues of efficiency and optimality. These methods are general, and can be applied to parameter estimation problems other than sky localization. We apply these methods to two existing sky localization techniques representing the two above-mentioned categories, using a set of simulated inspiral-only signals from compact binary systems with a total mass of 20 solar masses or less and nonspinning components. We compare the relative advantages and costs of the two techniques and show that sky location uncertainties are on average a factor of approximately 20 smaller for fully coherent techniques than for the specific variant of the triangulation-based technique used during the last science runs, at the expense of approximately 1000 times longer processing time.

  5. Reconstructing the sky location of gravitational-wave detected compact binary systems: Methodology for testing and comparison

    NASA Astrophysics Data System (ADS)

    Sidery, T.; Aylott, B.; Christensen, N.; Farr, B.; Farr, W.; Feroz, F.; Gair, J.; Grover, K.; Graff, P.; Hanna, C.; Kalogera, V.; Mandel, I.; O'Shaughnessy, R.; Pitkin, M.; Price, L.; Raymond, V.; Röver, C.; Singer, L.; van der Sluys, M.; Smith, R. J. E.; Vecchio, A.; Veitch, J.; Vitale, S.

    2014-04-01

    The problem of reconstructing the sky position of compact binary coalescences detected via gravitational waves is a central one for future observations with the ground-based network of gravitational-wave laser interferometers, such as Advanced LIGO and Advanced Virgo. Different techniques for sky localization have been independently developed. They can be divided in two broad categories: fully coherent Bayesian techniques, which are high latency and aimed at in-depth studies of all the parameters of a source, including sky position, and "triangulation-based" techniques, which exploit the data products from the search stage of the analysis to provide an almost real-time approximation of the posterior probability density function of the sky location of a detection candidate. These techniques have previously been applied to data collected during the last science runs of gravitational-wave detectors operating in the so-called initial configuration. Here, we develop and analyze methods for assessing the self consistency of parameter estimation methods and carrying out fair comparisons between different algorithms, addressing issues of efficiency and optimality. These methods are general, and can be applied to parameter estimation problems other than sky localization. We apply these methods to two existing sky localization techniques representing the two above-mentioned categories, using a set of simulated inspiral-only signals from compact binary systems with a total mass of ≤20M⊙ and nonspinning components. We compare the relative advantages and costs of the two techniques and show that sky location uncertainties are on average a factor ≈20 smaller for fully coherent techniques than for the specific variant of the triangulation-based technique used during the last science runs, at the expense of a factor ≈1000 longer processing time.

  6. Application of neural networks in the acousto-ultrasonic evaluation of metal-matrix composite specimens

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Tjia, Robert E.; Vary, Alex; Kautz, Harold E.

    1992-01-01

    Acousto-ultrasonics (AU) is a nondestructive evaluation (NDE) technique that was devised for the testing of various types of composite materials. A study has been done to determine how effectively the AU technique may be applied to metal-matrix composites (MMCs). The authors use the results and data obtained from that study and apply neural networks to them, particularly in the assessment of mechanical property variations of a specimen from AU measurements. It is assumed that there is no information concerning the important features of the AU signal which relate to the mechanical properties of the specimen. Minimally processed AU measurements are used while relying on the network's ability to extract the significant features of the signal.

  7. A Review of the Anaerobic Digestion of Fruit and Vegetable Waste.

    PubMed

    Ji, Chao; Kong, Chui-Xue; Mei, Zi-Li; Li, Jiang

    2017-11-01

    Fruit and vegetable waste is an ever-growing global problem. Anaerobic digestion techniques have been developed that facilitate turning such waste into possible sources of energy and fertilizer, simultaneously helping to reduce environmental pollution. However, various problems are encountered in applying these techniques. The purpose of this study is to review local and overseas studies that focus on the use of anaerobic digestion to dispose of fruit and vegetable waste, to discuss the acidification problems and solutions in applying anaerobic digestion to such waste, and to investigate reactor design (comparing single-phase with two-phase) and thermal pre-treatment for processing raw wastes. Furthermore, it analyses the dominant microorganisms involved at different stages of digestion and suggests a focus for future studies.

  8. The Potential of Sequential Extraction in the Characterisation and Management of Wastes from Steel Processing: A Prospective Review

    PubMed Central

    Rodgers, Kiri J.; Hursthouse, Andrew; Cuthbert, Simon

    2015-01-01

    As waste management regulations become more stringent, yet demand for resources continues to increase, there is a pressing need for innovative management techniques and more sophisticated supporting analysis techniques. Sequential extraction (SE) analysis, a technique previously applied to soils and sediments, offers the potential to gain a better understanding of the composition of solid wastes. SE attempts to classify potentially toxic elements (PTEs) by their associations with phases or fractions in waste, with the aim of improving resource use and reducing negative environmental impacts. In this review we explain how SE can be applied to steel wastes. These present challenges due to differences in sample characteristics compared with materials to which SE has been traditionally applied, specifically chemical composition, particle size and pH buffering capacity, which are critical when identifying a suitable SE method. We highlight the importance of delineating iron-rich phases, and find that the commonly applied BCR (the Community Bureau of Reference) extraction method is problematic due to difficulties with zinc speciation (a critical steel waste constituent), hence a substantially modified SEP is necessary to deal with particular characteristics of steel wastes. Successful development of SE for steel wastes could have wider implications, e.g., for the sustainable management of fly ash and mining wastes. PMID:26393631

  9. The Potential of Sequential Extraction in the Characterisation and Management of Wastes from Steel Processing: A Prospective Review.

    PubMed

    Rodgers, Kiri J; Hursthouse, Andrew; Cuthbert, Simon

    2015-09-18

    As waste management regulations become more stringent, yet demand for resources continues to increase, there is a pressing need for innovative management techniques and more sophisticated supporting analysis techniques. Sequential extraction (SE) analysis, a technique previously applied to soils and sediments, offers the potential to gain a better understanding of the composition of solid wastes. SE attempts to classify potentially toxic elements (PTEs) by their associations with phases or fractions in waste, with the aim of improving resource use and reducing negative environmental impacts. In this review we explain how SE can be applied to steel wastes. These present challenges due to differences in sample characteristics compared with materials to which SE has been traditionally applied, specifically chemical composition, particle size and pH buffering capacity, which are critical when identifying a suitable SE method. We highlight the importance of delineating iron-rich phases, and find that the commonly applied BCR (the Community Bureau of Reference) extraction method is problematic due to difficulties with zinc speciation (a critical steel waste constituent), hence a substantially modified SEP is necessary to deal with particular characteristics of steel wastes. Successful development of SE for steel wastes could have wider implications, e.g., for the sustainable management of fly ash and mining wastes.

  10. A novel process for introducing a new intraoperative program: a multidisciplinary paradigm for mitigating hazards and improving patient safety.

    PubMed

    Rodriguez-Paz, Jose M; Mark, Lynette J; Herzer, Kurt R; Michelson, James D; Grogan, Kelly L; Herman, Joseph; Hunt, David; Wardlow, Linda; Armour, Elwood P; Pronovost, Peter J

    2009-01-01

    Since the Institute of Medicine's report, To Err is Human, was published, numerous interventions have been designed and implemented to correct the defects that lead to medical errors and adverse events; however, most efforts were largely reactive. Safety, communication, team performance, and efficiency are areas of care that attract a great deal of attention, especially regarding the introduction of new technologies, techniques, and procedures. We describe a multidisciplinary process that was implemented at our hospital to identify and mitigate hazards before the introduction of a new technique: high-dose-rate intraoperative radiation therapy (HDR-IORT). A multidisciplinary team of surgeons, anesthesiologists, radiation oncologists, physicists, nurses, hospital risk managers, and equipment specialists used a structured process that included in situ clinical simulation to uncover concerns among care providers and to prospectively identify and mitigate defects for patients who would undergo surgery using the HDR-IORT technique. We identified and corrected 20 defects in the simulated patient care process before application to actual patients. Subsequently, eight patients underwent surgery using the HDR-IORT technique with no recurrence of simulation-identified or unanticipated defects. Multiple benefits were derived from the use of this systematic process to introduce the HDR-IORT technique; namely, the safety and efficiency of care for this select patient population was optimized, and this process mitigated harmful or adverse events before the inclusion of actual patients. Further work is needed, but the process outlined in this paper can be universally applied to the introduction of any new technologies, treatments, or procedures.

  11. Alloyed coatings for dispersion strengthened alloys

    NASA Technical Reports Server (NTRS)

    Wermuth, F. R.; Stetson, A. R.

    1971-01-01

    Processing techniques were developed for applying several diffusion barriers to TD-Ni and TD-NiCr. Barrier coated specimens of both substrates were clad with Ni-Cr-Al and Fe-Cr-Al alloys and diffusion annealed in argon. Measurement of the aluminum distribution after annealing showed that, of the readily applicable diffusion barriers, a slurry applied tungsten barrier most effectively inhibited the diffusion of aluminum from the Ni-Cr-Al clad into the TD-alloy substrates. No barrier effectively limited interdiffusion of the Fe-Cr-Al clad with the substrates. A duplex process was then developed for applying Ni-Cr-Al coating compositions to the tungsten barrier coated substrates. A Ni-(16 to 32)Cr-3Si modifier was applied by slurry spraying and firing in vacuum, and was then aluminized by a fusion slurry process. Cyclic oxidation tests at 2300 F resulted in early coating failure due to inadequate edge coverage and areas of coating porosity. EMP analysis showed that oxidation had consumed 70 to 80 percent of the aluminum in the coating in less than 50 hours.

  12. Fabrication of ф 160 mm convex hyperbolic mirror for remote sensing instrument

    NASA Astrophysics Data System (ADS)

    Kuo, Ching-Hsiang; Yu, Zong-Ru; Ho, Cheng-Fang; Hsu, Wei-Yao; Chen, Fong-Zhi

    2012-10-01

    In this study, efficient polishing processes with inspection procedures for a large convex hyperbolic mirror of a Cassegrain optical system are presented. The polishing process combines the techniques of conventional lapping and CNC polishing. We apply the conventional spherical lapping process to quickly remove the sub-surface damage (SSD) layer caused by the grinding process and to simultaneously get the accurate radius of the best-fit sphere (BFS) of the aspheric surface with fine surface texture. Thus the removed material for the aspherization process can be minimized and the polishing time for SSD removal can also be reduced substantially. The inspection procedure was carried out using a phase-shifting interferometer with a CGH and a stitching technique. To acquire the real surface form error of each sub-aperture, the wavefront errors of the reference flat and CGH flat due to the gravity effect of the vertical setup are calibrated in advance. Subsequently, we stitch 10 calibrated sub-aperture surface form errors to establish the whole irregularity of the mirror over its 160 mm diameter for correction polishing. The final result for the ф160 mm convex hyperbolic mirror is 0.15 μm PV and 17.9 nm RMS.

  13. The application of phase grating to CLM technology for the sub-65nm node optical lithography

    NASA Astrophysics Data System (ADS)

    Yoon, Gi-Sung; Kim, Sung-Hyuck; Park, Ji-Soong; Choi, Sun-Young; Jeon, Chan-Uk; Shin, In-Kyun; Choi, Sung-Woon; Han, Woo-Sung

    2005-06-01

    As a promising technology for sub-65nm node optical lithography, CLM (Chrome-Less Mask) technology, among the RETs (Resolution Enhancement Techniques) for low k1, has been researched worldwide in recent years. CLM has several advantages, such as a relatively simple manufacturing process and competitive performance compared to phase-edge PSMs. For low-k1 lithography, we have researched the CLM technique as a good solution, especially for the sub-65nm node. As a step toward developing sub-65nm node optical lithography, we applied CLM technology to 80nm-node lithography with the mesa and trench methods. From the analysis of CLM technology in 80nm lithography, we found that there is an optimal shutter size for best performance in the technique, that the increment of wafer ADI CD varies with pattern pitch, and that there is a limitation in patterning various shapes and sizes due to the OPC dead-zone - the OPC dead-zone in the CLM technique is the specific region of shutter size that does not make the wafer CD increase beyond a specific size. Small patterns are also easily broken while fabricating the CLM mask by the mesa method. Generally, the trench method has better optical performance than the mesa method. These issues have so far restricted the application of CLM technology to a small field. We approached these issues with a 3-D topographic simulation tool and found that they could be overcome by applying a phase grating in trench-type CLM. With the simulation data, we made test masks that had many kinds of patterns under many different conditions and analyzed their performance through AIMS fab 193 and exposure on wafer. Finally, we have developed a CLM technology that is free of the OPC dead-zone and of pattern breakage in the fabrication process. Therefore, we can apply the CLM technique to sub-65nm node optical lithography, including logic devices.

  14. Processing and analysis of commercial satellite image data of the nuclear accident near Chernobyl, U. S. S. R

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadowski, F.G.; Covington, S.J.

    1987-01-01

    Advanced digital processing techniques were applied to Landsat-5 Thematic Mapper (TM) data and SPOT high-resolution visible (HRV) panchromatic data to maximize the utility of images of a nuclear power plant emergency at Chernobyl in the Soviet Ukraine. The results of the data processing and analysis illustrate the spectral and spatial capabilities of the two sensor systems and provide information about the severity and duration of the events occurring at the power plant site.

  15. Digital processing of the Mariner 10 images of Venus and Mercury

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Lynn, D. J.; Mosher, J. A.; Elliot, D. A.

    1977-01-01

    An extensive effort was devoted to the digital processing of the Mariner 10 images of Venus and Mercury at the Image Processing Laboratory of the Jet Propulsion Laboratory. This effort was designed to optimize the display of the considerable quantity of information contained in the images. Several image restoration, enhancement, and transformation procedures were applied; examples of these techniques are included. A particular task was the construction of large mosaics which characterize the surface of Mercury and the atmospheric structure of Venus.

  16. Final Report for Contract N00014-89-J-1967 for the Time Period from 1 May 1989 to 31 December 1990 (Texas Univ. at Austin. Applied Research Labs.)

    DTIC Science & Technology

    1991-04-23

    in this section. In our investigation of higher order processing methods for remote acoustic sensing we sought to understand the principles of laser...magnitude less than those presently detected in laboratory measurements. An initial study of several potential higher order processing techniques was...incoherent. The use of higher order processing methods to provide some level of discrimination against noise thus appears tractable. Finally, the effects

  17. Interlibrary Lending with Computerized Union Catalogues.

    ERIC Educational Resources Information Center

    Lehmann, Klaus-Dieter

    Interlibrary loans in the Federal Republic of Germany are facilitated by applying techniques of data processing and computer output microfilm (COM) to the union catalogs of the national library system. The German library system consists of two national libraries, four central specialized libraries of technology, medicine, agriculture, and…

  18. Lessons from Child of Water.

    ERIC Educational Resources Information Center

    Silver, Steven M.

    Native American Vietnam War veterans offer important concepts and techniques concerning the reintegration process which may be generalized to psychiatry as a whole and which may be particularly valuable to the adjustment and treatment of other Vietnam War veterans. Native American psychological healing practices as applied to returning war…

  19. Action Research: Enhancing Classroom Practice and Fulfilling Educational Responsibilities

    ERIC Educational Resources Information Center

    Young, Mark R.; Rapp, Eve; Murphy, James W.

    2010-01-01

    Action Research is an applied scholarly paradigm resulting in action for continuous improvement in our teaching and learning techniques offering faculty immediate classroom payback and providing documentation of meeting our educational responsibilities as required by AACSB standards. This article reviews the iterative action research process of…

  20. Techniques for Programming Visual Demonstrations.

    ERIC Educational Resources Information Center

    Gropper, George L.

    Visual demonstrations may be used as part of programs to deliver both content objectives and process objectives. Research has shown that learning of concepts is easier, more accurate, and more broadly applied when it is accompanied by visual examples. The visual examples supporting content learning should emphasize both discrimination and…

  1. Essentials of Suggestopedia: A Primer for Practitioners.

    ERIC Educational Resources Information Center

    Caskey, Owen L.; Flake, Muriel H.

    Suggestology is the scientific study of the psychology of suggestion, and Suggestopedia is the application of relaxation and suggestion techniques to learning. The approach to learning processes (called Suggestopedic) developed by Dr. Georgi Lozanov (the Lozanov Method) utilizes mental and physical relaxation, deep breathing,…

  2. Applying Data Mining Principles to Library Data Collection.

    ERIC Educational Resources Information Center

    Guenther, Kim

    2000-01-01

    Explains how libraries can use data mining techniques for more effective data collection. Highlights include three phases: data selection and acquisition; data preparation and processing, including a discussion of the use of XML (extensible markup language); and data interpretation and integration, including database management systems. (LRW)

  3. Applying Constructivism to Improve Public Relations for Education

    ERIC Educational Resources Information Center

    Marek, Michael

    2005-01-01

    Educators are often hesitant to use techniques of public relations and marketing communication to attempt to alter undesirable understandings of the rationale and processes of education held by external constituencies. This paper shows that contemporary practice in public relations and marketing communication can be conceptualized as an…

  4. Support Materials for the Software Technical Review Process

    DTIC Science & Technology

    1988-04-01

    the Software Technical Review Process Software reviewing is a general term applied to techniques for the use of human intellectual power to detect...more systematic than random. It utilizes data supplied by students, rather than relying solely on the subjective opinions of the instructor. The...The experience of other users is now essential.) "• Are the resulting grades accurate? (Thus far, they appear to correlate with student grades on

  5. Research into language concepts for the mission control center

    NASA Technical Reports Server (NTRS)

    Dellenback, Steven W.; Barton, Timothy J.; Ratner, Jeremiah M.

    1990-01-01

    A final report is given on research into language concepts for the Mission Control Center (MCC). The Specification Driven Language research is described. The state of the image processing field and how image processing techniques could be applied toward automating the generation of the language known as COmputation Development Environment (CODE or Comp Builder) are discussed. Also described is the development of a flight certified compiler for Comps.

  6. Preliminary Evaluation of an Aviation Safety Thesaurus' Utility for Enhancing Automated Processing of Incident Reports

    NASA Technical Reports Server (NTRS)

    Barrientos, Francesca; Castle, Joseph; McIntosh, Dawn; Srivastava, Ashok

    2007-01-01

    This document presents a preliminary evaluation of the utility of the FAA Safety Analytics Thesaurus (SAT) in enhancing automated document processing applications under development at NASA Ames Research Center (ARC). Current development efforts at ARC are described, including overviews of the statistical machine learning techniques that have been investigated. An analysis of opportunities for applying thesaurus knowledge to improving algorithm performance is then presented.

  7. GNU Radio Sandia Utilities v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Jacob; Knee, Peter

    This software adds a data handling module to the GNU Radio (GR) software defined radio (SDR) framework as well as some general-purpose function blocks (filters, metadata control, etc.). This software is useful for processing bursty RF transmissions with GR, and serves as a base for applying SDR signal processing techniques to a whole burst of data at a time, as opposed to the streaming data on which GR has primarily been focused.

  8. Influence of material quality and process-induced defects on semiconductor device performance and yield

    NASA Technical Reports Server (NTRS)

    Porter, W. A.; Mckee, W. R.

    1974-01-01

    An overview of major causes of device yield degradation is presented. The relationships of device types to critical processes and typical defects are discussed, and the influence of the defect on device yield and performance is demonstrated. Various defect characterization techniques are described and applied. A correlation of device failure, defect type, and cause of defect is presented in tabular form with accompanying illustrations.

  9. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
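
    The core filtering step of such a multiobjective genetic algorithm can be sketched in a few lines: keep only the non-dominated designs, with every column treated as minimized (a profit-type objective would be negated first). This is an illustrative kernel, not the authors' parallel steady-state implementation:

      import numpy as np

      def pareto_front(F):
          # Row i is dominated if some row is <= in every objective and
          # strictly < in at least one; keep the rows that are not.
          F = np.asarray(F, dtype=float)
          return [i for i, fi in enumerate(F)
                  if not np.any(np.all(F <= fi, axis=1) & np.any(F < fi, axis=1))]

      # e.g. eight impact categories plus negated profit -> 9 columns per design
      print(pareto_front([[1, 2], [2, 1], [2, 2], [0.5, 3]]))   # -> [0, 1, 3]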

  10. The Complex Cepstrum - Revisited

    NASA Astrophysics Data System (ADS)

    Kemerait, R. C., Sr.

    2016-12-01

    Since this paper comes at the twilight of my career, it is appropriate to share my views on a subject very dear to my heart and to my long career. In 2004 "From Frequency to Quefrency: A History of the Cepstrum" was published in the IEEE Signal Processing magazine. There is no question that the authors, Alan V. Oppenheim and Ronald W. Schafer, were pioneers in this area of research, and this publication documents their involvement quite nicely. In parallel research also performed in the 1960s, Childers et al. renamed the original "Cepstrum" the "Power Cepstrum" to avoid confusion with the principal topic of their research, that being the "Complex Cepstrum." The term "Power Cepstrum" has become widely used in the literature since that time. The Childers team, including Dr. Kemerait, published a summary of their work, as of that date, in the IEEE Proceedings of October 1977, and titled the article "The Cepstrum: A Guide to Processing." In the subsequent 40 years, Dr. Kemerait has continued to research cepstral techniques applied to many diverse problems; however, his primary research has been on estimating the depth of underground and underwater events. He has also applied these techniques to biomedical data (EEG, EKG, and visual-evoked responses) as well as to hydroacoustic data, thereby determining the "bubble pulse frequency," the depth of the explosion, and the ocean depth at the explosion point. He has also used cepstral techniques in the processing of ground penetrating radar, speech, machine diagnostics, and, throughout these years, seismic data. This paper emphasizes his recent improvements in processing primarily seismic and infrasound data associated with nuclear treaty monitoring. The emphasis is mainly on the recent improvements and the automation of the Complex Cepstrum process.
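
    For reference, a minimal numpy sketch of the complex cepstrum: the inverse FFT of the complex log-spectrum, with the phase unwrapped and its linear (pure-delay) trend removed. An echo at lag d shows up as peaks at multiples of d quefrency; the test signal below is invented:

      import numpy as np

      def complex_cepstrum(x):
          X = np.fft.fft(x)
          mag = np.log(np.abs(X) + 1e-12)
          phase = np.unwrap(np.angle(X))
          k = np.arange(len(x))
          phase -= phase[-1] / k[-1] * k        # strip the linear-delay component
          return np.real(np.fft.ifft(mag + 1j * phase))

      rng = np.random.default_rng(3)
      x = np.zeros(256)
      x[:64] = rng.normal(size=64) * np.exp(-np.arange(64) / 8.0)  # decaying wavelet
      x = x + 0.5 * np.roll(x, 40)              # add an echo at a 40-sample lag
      ceps = complex_cepstrum(x)                # expect a peak near quefrency 40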

  11. Monitoring Pre-Stressed Composites Using Optical Fibre Sensors.

    PubMed

    Krishnamurthy, Sriram; Badcock, Rodney A; Machavaram, Venkata R; Fernando, Gerard F

    2016-05-28

    Residual stresses in fibre reinforced composites can give rise to a number of undesired effects such as loss of dimensional stability and premature fracture. Hence, there is significant merit in developing processing techniques to mitigate the development of residual stresses. However, tracking and quantifying the development of these fabrication-induced stresses in real-time using conventional non-destructive techniques is not straightforward. This article reports on the design and evaluation of a technique for manufacturing pre-stressed composite panels from unidirectional E-glass/epoxy prepregs. Here, the magnitude of the applied pre-stress was monitored using an integrated load-cell. The pre-stressing rig was based on a flat-bed design which enabled autoclave-based processing. A method was developed to end-tab the laminated prepregs prior to pre-stressing. The development of process-induced residual strain was monitored in-situ using embedded optical fibre sensors. Surface-mounted electrical resistance strain gauges were used to measure the strain when the composite was unloaded from the pre-stressing rig at room temperature. Four pre-stress levels were applied prior to processing the laminated preforms in an autoclave. The results showed that the application of a pre-stress of 108 MPa to a unidirectional [0]16 E-glass/913 epoxy preform reduced the residual strain in the composite from -600 µε (conventional processing without pre-stress) to approximately zero. A good correlation was observed between the data obtained from the surface-mounted electrical resistance strain gauge and the embedded optical fibre sensors. In addition to "neutralising" the residual stresses, superior axial orientation of the reinforcement can be obtained from pre-stressed composites. A subsequent publication will highlight the consequences of pre-stressing on fibre alignment and on the tensile, flexural, compressive and fatigue performance of unidirectional E-glass composites.

  12. Monitoring Pre-Stressed Composites Using Optical Fibre Sensors

    PubMed Central

    Krishnamurthy, Sriram; Badcock, Rodney A.; Machavaram, Venkata R.; Fernando, Gerard F.

    2016-01-01

    Residual stresses in fibre reinforced composites can give rise to a number of undesired effects such as loss of dimensional stability and premature fracture. Hence, there is significant merit in developing processing techniques to mitigate the development of residual stresses. However, tracking and quantifying the development of these fabrication-induced stresses in real-time using conventional non-destructive techniques is not straightforward. This article reports on the design and evaluation of a technique for manufacturing pre-stressed composite panels from unidirectional E-glass/epoxy prepregs. Here, the magnitude of the applied pre-stress was monitored using an integrated load-cell. The pre-stressing rig was based on a flat-bed design which enabled autoclave-based processing. A method was developed to end-tab the laminated prepregs prior to pre-stressing. The development of process-induced residual strain was monitored in-situ using embedded optical fibre sensors. Surface-mounted electrical resistance strain gauges were used to measure the strain when the composite was unloaded from the pre-stressing rig at room temperature. Four pre-stress levels were applied prior to processing the laminated preforms in an autoclave. The results showed that the application of a pre-stress of 108 MPa to a unidirectional [0]16 E-glass/913 epoxy preform reduced the residual strain in the composite from −600 µε (conventional processing without pre-stress) to approximately zero. A good correlation was observed between the data obtained from the surface-mounted electrical resistance strain gauge and the embedded optical fibre sensors. In addition to “neutralising” the residual stresses, superior axial orientation of the reinforcement can be obtained from pre-stressed composites. A subsequent publication will highlight the consequences of pre-stressing on fibre alignment and on the tensile, flexural, compressive and fatigue performance of unidirectional E-glass composites. PMID:27240378

  13. High Precision Metal Thin Film Liftoff Technique

    NASA Technical Reports Server (NTRS)

    Brown, Ari D. (Inventor); Patel, Amil A. (Inventor)

    2015-01-01

    A metal film liftoff process includes applying a polymer layer onto a silicon substrate, applying a germanium layer over the polymer layer to create a bilayer lift off mask, applying a patterned photoresist layer over the germanium layer, removing an exposed portion of the germanium layer, removing the photoresist layer and a portion of the polymer layer to expose a portion of the substrate and create an overhanging structure of the germanium layer, depositing a metal film over the exposed portion of the substrate and the germanium layer, and removing the polymer and germanium layers along with the overlaying metal film.

  14. Coal liquefaction process streams characterization and evaluation. Gold tube carbonization and reflectance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, G.; Davis, A.; Burke, F.P.

    1991-12-01

    This study demonstrated the use of the gold tube carbonization technique and reflectance microscopy analysis for the examination of process-derived materials from direct coal liquefaction. The carbonization technique, which was applied to coal liquefaction distillation resids, yields information on the amounts of gas plus distillate, pyridine-soluble resid, and pyridine-insoluble material formed when a coal liquid sample is heated to 450 °C for one hour at 5000 psi in an inert atmosphere. The pyridine-insolubles then are examined by reflectance microscopy to determine the type, amount, and optical texture of isotropic and anisotropic carbon formed upon carbonization. Further development of these analytical methods as process development tools may be justified on the basis of these results.

  15. High order volume-preserving algorithms for relativistic charged particles in general electromagnetic fields

    NASA Astrophysics Data System (ADS)

    He, Yang; Sun, Yajuan; Zhang, Ruili; Wang, Yulei; Liu, Jian; Qin, Hong

    2016-09-01

    We construct high order symmetric volume-preserving methods for the relativistic dynamics of a charged particle by the splitting technique with processing. By expanding the phase space to include the time t, we give a more general construction of volume-preserving methods that can be applied to systems with time-dependent electromagnetic fields. The newly derived methods provide numerical solutions with good accuracy and conservative properties over long simulation times. Furthermore, because of the use of an accuracy-enhancing processing technique, the explicit methods obtain high-order accuracy and are more efficient than the methods derived from standard compositions. The results are verified by numerical experiments. Linear stability analysis of the methods shows that the high order processed method allows a larger time step size in numerical integrations.
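
    To make the splitting idea concrete, here is a minimal sketch (not the authors' relativistic algorithm): in the non-relativistic limit, the Lorentz-force system dx/dt = v, dv/dt = (q/m)(E + v × B) splits into a position drift, an electric kick, and an exact rotation about B, each of which preserves phase-space volume, so their symmetric composition does as well.

```python
import numpy as np

def split_step(x, v, E, B, dt, q_m):
    """One Strang-splitting step for dx/dt = v, dv/dt = q_m*(E + v x B).
    Each sub-map (drift, electric kick, exact rotation about B) preserves
    phase-space volume, so the symmetric composition does as well."""
    x = x + 0.5 * dt * v                  # half drift
    v = v + 0.5 * dt * q_m * E            # half electric kick
    b = np.linalg.norm(B)
    if b > 0.0:                           # exact gyro-rotation over dt
        n = B / b
        theta = q_m * b * dt
        v_par = np.dot(v, n) * n
        v_perp = v - v_par
        v = v_par + np.cos(theta) * v_perp - np.sin(theta) * np.cross(n, v_perp)
    v = v + 0.5 * dt * q_m * E            # half electric kick
    x = x + 0.5 * dt * v                  # half drift
    return x, v

# usage: uniform B along z and no E -- the speed |v| is conserved exactly
x, v = np.zeros(3), np.array([1.0, 0.0, 0.1])
for _ in range(1000):
    x, v = split_step(x, v, np.zeros(3), np.array([0.0, 0.0, 1.0]), 0.01, 1.0)
```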

  16. Application of continuous substrate feeding to the ABE fermentation: Relief of product inhibition using extraction, perstraction, stripping, and pervaporation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qureshi, N.; Maddox, I.S.; Friedl, A.

    1992-09-01

    The technique of continuous substrate feeding has been applied to the batch fermentation process using freely suspended cells, for ABE (acetone-butanol-ethanol) production. To avoid the product inhibition which normally restricts ABE production to less than 20 g/L and sugar utilization to 60 g/L, a product removal technique has been integrated into the fermentation process. The techniques investigated were liquid-liquid extraction, perstraction, gas-stripping, and pervaporation. By using a substrate of whey permeate, the reactor productivity has been improved over that observed in a traditional batch fermentation, while equivalent lactose utilization and ABE production values of 180 g and 69 g, respectively, have been achieved in a 1-L culture volume. 17 refs., 14 figs., 5 tabs.

  17. Traveling Magnetic Field Applications for Materials Processing in Space

    NASA Technical Reports Server (NTRS)

    Grugel, R. N.; Mazuruk, K.; Curreri, Peter A. (Technical Monitor)

    2001-01-01

    Including the capability to induce a controlled fluid flow in the melt can significantly enrich research on solidification phenomena in a microgravity environment. The traveling magnetic field (TMF) is a promising technique to achieve this goal and is the aim of our ground-based project. In this presentation we will discuss new theoretical as well as experimental results recently obtained by our group. In particular, we experimentally demonstrated efficient mixing of metal alloys in long tubes subjected to TMF during processing. Application of this technique can provide an elegant solution to ensure melt homogenization prior to solidification in a microgravity environment where natural convection is generally absent. Results of our experimental work of applying the TMF technique to alloy melts will be presented. Possible applications of TMF on board the International Space Station will also be discussed.

  18. A Novel Catalyst Deposition Technique for the Growth of Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Delzeit, Lance; Cassell, A.; Stevens, R.; Nguyen, C.; Meyyappan, M.; DeVincenzi, Donald L. (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the development of a technique at NASA's Ames Research Center by which carbon nanotubes (NT) can be grown. The project had several goals which included: 1) scalability, 2) ability to control single wall nanotube (SWNT) and multiwall nanotube (MWNT) formation, 3) ability to control the density of nanotubes as they grow, 4) ability to apply standard masking techniques for NT patterning. Information regarding the growth technique includes its use of a catalyst deposition process. SWNTs of varying thicknesses can be grown by changing the catalyst composition. Demonstrations are given of various methods of masking including the use of transmission electron microscopic (TEM) grids.

  19. A strategy for selecting data mining techniques in metabolomics.

    PubMed

    Banimustafa, Ahmed Hmaidan; Hardy, Nigel W

    2012-01-01

    There is a general agreement that the development of metabolomics depends not only on advances in chemical analysis techniques but also on advances in computing and data analysis methods. Metabolomics data usually requires intensive pre-processing, analysis, and mining procedures. Selecting and applying such procedures requires attention to issues including justification, traceability, and reproducibility. We describe a strategy for selecting data mining techniques which takes into consideration the goals of data mining techniques on the one hand, and the goals of metabolomics investigations and the nature of the data on the other. The strategy aims to ensure the validity and soundness of results and promote the achievement of the investigation goals.

  20. Applying Toyota production system techniques for medication delivery: improving hospital safety and efficiency.

    PubMed

    Newell, Terry L; Steinmetz-Malato, Laura L; Van Dyke, Deborah L

    2011-01-01

    The inpatient medication delivery system used at a large regional acute care hospital in the Midwest had become antiquated and inefficient. The existing 24-hr medication cart-fill exchange process with delivery to the patients' bedside did not always provide ordered medications to the nursing units when they were needed. In 2007 the principles of the Toyota Production System (TPS) were applied to the system. Project objectives were to improve medication safety and reduce the time needed for nurses to retrieve patient medications. A multidisciplinary team was formed that included representatives from nursing, pharmacy, informatics, quality, and various operational support departments. Team members were educated and trained in the tools and techniques of TPS, and then designed and implemented a new pull system benchmarking the TPS Ideal State model. The newly installed process, providing just-in-time medication availability, has measurably improved delivery processes as well as patient safety and satisfaction. Other positive outcomes have included improved nursing satisfaction, reduced nursing wait time for delivered medications, and improved efficiency in the pharmacy. After a successful pilot on two nursing units, the system is being extended to the rest of the hospital. © 2010 National Association for Healthcare Quality.

  1. Rotation covariant image processing for biomedical applications.

    PubMed

    Skibbe, Henrik; Reisert, Marco

    2013-01-01

    With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general framework based on mathematical concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from medical and biological sciences.

  2. Sterilization by oxygen plasma

    NASA Astrophysics Data System (ADS)

    Moreira, Adir José; Mansano, Ronaldo Domingues; Andreoli Pinto, Terezinha de Jesus; Ruas, Ronaldo; Zambon, Luis da Silva; da Silva, Mônica Valero; Verdonck, Patrick Bernard

    2004-07-01

    The use of polymeric medical devices has stimulated the development of new sterilization methods. The traditional techniques rely on ethylene oxide, but there are many questions concerning the carcinogenic properties of the ethylene oxide residues adsorbed on the materials after processing. Another common technique is the gamma irradiation process, but it is costly, its safe operation requires an isolated site and it also affects the bulk properties of the polymers. The use of a gas plasma is an elegant alternative sterilization technique. The plasma promotes an efficient inactivation of the micro-organisms, minimises the damage to the materials and presents very little danger for personnel and the environment. Pure-oxygen plasmas of the reactive ion etching type were applied to inactivate a biological indicator, Bacillus stearothermophilus, to confirm the efficiency of this process. The sterilization processes took a short time: mortality was complete within a few minutes. In situ analysis of the micro-organisms' inactivation time was possible using emission spectrophotometry. The increase in the intensity of the 777.5 nm oxygen line shows the end of the oxidation of the biologic materials. The results were also observed and corroborated by scanning electron microscopy.

  3. Development of multiple source data processing for structural analysis at a regional scale. [digital remote sensing in geology

    NASA Technical Reports Server (NTRS)

    Carrere, Veronique

    1990-01-01

    Various image processing techniques developed for enhancement and extraction of linear features, of interest to the structural geologist, from digital remote sensing, geologic, and gravity data, are presented. These techniques include: (1) automatic detection of linear features and construction of rose diagrams from Landsat MSS data; (2) enhancement of principal structural directions using selective filters on Landsat MSS, Spacelab panchromatic, and HCMM NIR data; (3) directional filtering of Spacelab panchromatic data using Fast Fourier Transform; (4) detection of linear/elongated zones of high thermal gradient from thermal infrared data; and (5) extraction of strong gravimetric gradients from digitized Bouguer anomaly maps. Processing results can be compared to each other through the use of a geocoded database to evaluate the structural importance of each lineament according to its depth: superficial structures in the sedimentary cover, or deeper ones affecting the basement. These image processing techniques were successfully applied to achieve a better understanding of the transition between Provence and the Pyrenees structural blocks, in southeastern France, for an improved structural interpretation of the Mediterranean region.
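
    Item (3) above, directional filtering via the Fast Fourier Transform, can be sketched in a few lines. The wedge filter below is a generic NumPy illustration, not the processing chain actually used in the study:

```python
import numpy as np

def directional_filter(img, angle_deg, half_width_deg=10.0):
    """Keep only spatial frequencies within a wedge around the chosen
    orientation -- a simple Fourier-domain directional filter for
    enhancing linear features with a preferred direction."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    y, x = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    theta = np.degrees(np.arctan2(y, x))          # orientation of each frequency
    # fold the angular difference into (-90, 90] so the wedge is symmetric
    diff = np.abs((theta - angle_deg + 90.0) % 180.0 - 90.0)
    mask = (diff <= half_width_deg) | (x ** 2 + y ** 2 == 0)  # always keep DC
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

# usage on a synthetic image
img = np.random.rand(128, 128)
filtered = directional_filter(img, angle_deg=45.0)
```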

  4. Distillation Designs for the Lunar Surface

    NASA Technical Reports Server (NTRS)

    Boul, Peter J.; Lange, Kevin E.; Conger, Bruce; Anderson, Molly

    2010-01-01

    Gravity-based distillation methods may be applied to the purification of wastewater on the lunar base. These solutions to water processing are robust physical separation techniques, which may be more advantageous than many other techniques for their simplicity in design and operation. The two techniques can be used in conjunction with each other to obtain high purity water. The components and feed compositions for modeling waste water streams are presented in conjunction with the Aspen property system for traditional stage distillation. While the individual components for each of the waste streams will vary naturally within certain bounds, an analog model for waste water processing is suggested based on typical concentration ranges for these components. Target purity levels for recycled water are determined for each individual component based on NASA's required maximum contaminant levels for potable water. Optimum parameters such as reflux ratio, feed stage location, and processing rates are determined with respect to the power consumption of the process. Multistage distillation is evaluated for components in wastewater to determine the minimum number of stages necessary for each of 65 components in humidity condensate and urine wastewater mixed streams.

  5. New Ground Truth Capability from InSAR Time Series Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, S; Vincent, P; Yang, D

    2005-07-13

    We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter per year surface movements when sufficient data exists for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
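
    The inversion step mentioned above can be illustrated for a single pixel: each interferogram measures the sum of incremental displacements between its two acquisition dates, giving a linear system that is solved in the least-squares sense. This is an SBAS-style sketch with made-up pairs and values, not the authors' pipeline:

```python
import numpy as np

# Acquisition epochs 0..4 and interferogram pairs (i, j), i < j.
# Pair list and unwrapped displacements (cm) are illustrative only.
pairs = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4)]
d_obs = np.array([0.4, 0.9, 0.5, 1.1, 1.2, 0.6])   # displacement per pair

n_epochs = 5
# Unknowns: incremental displacement between consecutive acquisitions.
A = np.zeros((len(pairs), n_epochs - 1))
for k, (i, j) in enumerate(pairs):
    A[k, i:j] = 1.0   # pair (i, j) sums the increments i..j-1

inc, *_ = np.linalg.lstsq(A, d_obs, rcond=None)
cumulative = np.concatenate([[0.0], np.cumsum(inc)])  # time series, epoch 0 = 0
print(cumulative)
```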

  6. Vision-based obstacle recognition system for automated lawn mower robot development

    NASA Astrophysics Data System (ADS)

    Mohd Zin, Zalhan; Ibrahim, Ratnawati

    2011-06-01

    Digital image processing (DIP) techniques have been widely used in various types of applications recently. Classification and recognition of a specific object using a vision system require some challenging tasks in the field of image processing and artificial intelligence. The ability and efficiency of a vision system to capture and process images is very important for any intelligent system such as an autonomous robot. This paper gives attention to the development of a vision system that could contribute to the development of an automated vision-based lawn mower robot. The work involves the implementation of DIP techniques to detect and recognize three different types of obstacles that usually exist on a football field. Focus was given to the study of different types and sizes of obstacles, the development of the vision-based obstacle recognition system and the evaluation of the system's performance. Image processing techniques such as image filtering, segmentation, enhancement and edge detection have been applied in the system. The results have shown that the developed system is able to detect and recognize various types of obstacles on a football field with a recognition rate of more than 80%.
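
    A minimal version of such a filtering/edge-detection pipeline, sketched here with OpenCV (the thresholds and the area criterion are illustrative, not the paper's tuned values):

```python
import cv2

def detect_obstacles(frame_bgr, min_area=500):
    """Toy detection pipeline: smooth, edge-detect, extract large contours."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)        # filtering
    edges = cv2.Canny(blurred, 50, 150)                # edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # keep contours large enough to be obstacles, return bounding boxes
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```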

  7. Numerical Modeling of Inclusion Behavior in Liquid Metal Processing

    NASA Astrophysics Data System (ADS)

    Bellot, Jean-Pierre; Descotes, Vincent; Jardy, Alain

    2013-09-01

    Thermomechanical performance of metallic alloys is directly related to the metal cleanliness that has always been a challenge for metallurgists. During liquid metal processing, particles can grow or decrease in size either by mass transfer with the liquid phase or by agglomeration/fragmentation mechanisms. As a function of numerical density of inclusions and of the hydrodynamics of the reactor, different numerical modeling approaches are proposed; in the case of an isolated particle, the Lagrangian technique coupled with a dissolution model is applied, whereas in the opposite case of large inclusion phase concentration, the population balance equation must be solved. Three examples of numerical modeling studies achieved at Institut Jean Lamour are discussed. They illustrate the application of the Lagrangian technique (for isolated exogenous inclusion in titanium bath) and the Eulerian technique without or with the aggregation process: for precipitation and growing of inclusions at the solidification front of a Maraging steel, and for endogenous inclusions in the molten steel bath of a gas-stirred ladle, respectively.

  8. Near-infrared hyperspectral imaging for quality analysis of agricultural and food products

    NASA Astrophysics Data System (ADS)

    Singh, C. B.; Jayas, D. S.; Paliwal, J.; White, N. D. G.

    2010-04-01

    Agricultural and food processing industries are always looking to implement real-time quality monitoring techniques as a part of good manufacturing practices (GMPs) to ensure high-quality and safety of their products. Near-infrared (NIR) hyperspectral imaging is gaining popularity as a powerful non-destructive tool for quality analysis of several agricultural and food products. This technique has the ability to analyse spectral data in a spatially resolved manner (i.e., each pixel in the image has its own spectrum) by applying both conventional image processing and chemometric tools used in spectral analyses. Hyperspectral imaging technique has demonstrated potential in detecting defects and contaminants in meats, fruits, cereals, and processed food products. This paper discusses the methodology of hyperspectral imaging in terms of hardware, software, calibration, data acquisition and compression, and development of prediction and classification algorithms and it presents a thorough review of the current applications of hyperspectral imaging in the analyses of agricultural and food products.
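
    The pixel-wise treatment described above can be sketched as follows: unfold the hypercube so each pixel becomes one spectrum, apply a chemometric decomposition, and fold the result back into score images. PCA stands in here for whichever chemometric model is used; the data are synthetic:

```python
import numpy as np
from sklearn.decomposition import PCA

def score_image(cube, n_components=3):
    """Unfold a (rows, cols, bands) hypercube so each pixel is a spectrum,
    run PCA (a common chemometric step), and fold scores back to images."""
    rows, cols, bands = cube.shape
    spectra = cube.reshape(-1, bands)              # one spectrum per pixel
    scores = PCA(n_components=n_components).fit_transform(spectra)
    return scores.reshape(rows, cols, n_components)

# usage with synthetic data
cube = np.random.rand(64, 64, 120)
imgs = score_image(cube)    # imgs[..., 0] is the first score image
```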

  9. A FMEA clinical laboratory case study: how to make problems and improvements measurable.

    PubMed

    Capunzo, Mario; Cavallo, Pierpaolo; Boccia, Giovanni; Brunetti, Luigi; Pizzuti, Sante

    2004-01-01

    The authors have experimented with the application of the Failure Mode and Effect Analysis (FMEA) technique in a clinical laboratory. The FMEA technique allows: a) evaluating and measuring the hazards of a process malfunction, b) deciding where to execute improvement actions, and c) measuring the outcome of those actions. A small sample of analytes was studied: the causes of possible malfunctions of the analytical process were determined, and the risk probability index (RPI) was calculated, with a value between 1 and 1,000. Improvement actions were implemented only for cases with RPI > 400; these actions reduced RPI values by 25% to 70% with a cost increase of < 1%. The FMEA technique can be applied to the processes of a clinical laboratory, even one of small dimensions, and offers a high potential for improvement. Nevertheless, such activity needs thorough planning because it is complex, even if the laboratory already operates an ISO 9000 Quality Management System.
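
    A minimal sketch of the RPI bookkeeping described above, assuming the conventional FMEA convention that the index is the product of three 1-10 ratings (severity, occurrence, detectability); the failure modes and ratings are invented for illustration:

```python
def rpi(severity, occurrence, detectability):
    """Risk probability index as the product of three 1-10 ratings
    (the conventional FMEA scales), giving values from 1 to 1,000."""
    for r in (severity, occurrence, detectability):
        assert 1 <= r <= 10, "ratings must lie in 1..10"
    return severity * occurrence * detectability

# illustrative failure modes (made-up ratings)
failure_modes = {"mislabelled sample": (9, 8, 7), "reagent drift": (5, 6, 3)}
for name, ratings in failure_modes.items():
    score = rpi(*ratings)
    flag = "improvement action required" if score > 400 else "monitor"
    print(f"{name}: RPI={score} -> {flag}")
```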

  10. From theory to field experiments

    NASA Astrophysics Data System (ADS)

    de Vos, Bram

    2016-04-01

    Peter Raats' achievements in Haren (NL) 1986-1997 were based on a solid theoretical insight into hydrology and transport processes in soil. However, Peter was also the driving force behind many experimental studies and applied research. This will be illustrated by a broad range of examples: the dynamics of composting processes of organic material; modelling and monitoring nutrient leaching at field-scale; wind erosion; water and nutrient dynamics in horticultural production systems; oxygen diffusion in soils; and processes of water and nutrient uptake by plant roots. Peter's leadership led to many new approaches and the introduction of innovative measurement techniques in Dutch research, ranging from TDR to nutrient concentration measurements in closed fertigation systems. This presentation will give a brief overview of how Peter's theoretical and mathematical insights accelerated this applied research.

  11. Analysis of the stochastic excitability in the flow chemical reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bashkirtseva, Irina

    2015-11-30

    A dynamic model of the thermochemical process in a flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  12. Validation of Reference Genes in mRNA Expression Analysis Applied to the Study of Asthma.

    PubMed

    Segundo-Val, Ignacio San; Sanz-Lozano, Catalina S

    2016-01-01

    The quantitative Polymerase Chain Reaction is the most widely used technique for the study of gene expression. To correct for putative experimental errors of this technique, it is necessary to normalize the expression results of the gene of interest against those obtained for reference genes. Here, we describe an example of the process used to select reference genes. In this particular case, we select reference genes for expression studies in the peripheral blood mononuclear cells of asthmatic patients.

  13. Iterative repair for scheduling and rescheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Davis, Eugene; Deale, Michael

    1991-01-01

    An iterative repair search method called constraint-based simulated annealing is described. Simulated annealing is a hill-climbing search technique capable of escaping local minima. The utility of the constraint-based framework is shown by comparing search performance with and without the constraint framework on a suite of randomly generated problems. Results are also shown of applying the technique to the NASA Space Shuttle ground processing problem. These experiments show that the search method scales to complex, real-world problems and exhibits interesting anytime behavior.
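
    The core accept/reject loop of such an iterative-repair annealer can be sketched generically; this skeleton is illustrative, with the paper's constraint-based repair operators abstracted into a user-supplied repair function:

```python
import math
import random

def anneal(schedule, cost, repair, t0=10.0, cooling=0.995, steps=5000):
    """Generic iterative-repair annealing skeleton: 'repair' proposes a
    locally modified schedule; worse proposals are accepted with
    probability exp(-delta/T), letting the search escape local minima."""
    current, current_cost = schedule, cost(schedule)
    t = t0
    for _ in range(steps):
        candidate = repair(current)
        delta = cost(candidate) - current_cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, current_cost + delta
        t *= cooling    # geometric cooling schedule
    return current
```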

  14. PET/SPECT: Instrumentation, radiopharmaceuticals, neurology and physiological measurement. Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-31

    The following collection of papers was presented at the Department of Energy sponsored symposium "Frontiers in Nuclear Medicine - PET/SPECT 1987" held in Washington, D.C., September 27-28, 1987. The meeting and these manuscripts concentrate on the techniques of tomography, useful radiopharmaceuticals, and clinical neurologic and cardiac evaluation. The authors of these papers are for the most part those who either developed the techniques or who have extensively applied them to clinical practice. Individual reports are processed separately for the databases.

  15. Methods of determination of periods in the motion of asteroids

    NASA Astrophysics Data System (ADS)

    Bien, R.; Schubart, J.

    Numerical techniques for the analysis of fundamental periods in asteroidal motion are evaluated. The specific techniques evaluated were: the periodogram analysis procedure of Wundt (1980); Stumpff's (1937) system of algebraic transformations; and Labrouste's procedure. It is shown that the Labrouste procedure permitted sufficient isolation of single oscillations from the quasi-periodic process of asteroidal motion. The procedure was applied to the analysis of resonance in the motion of Trojan-type and Hilda-type asteroids, and some preliminary results are discussed.

  16. Analysis of the stochastic excitability in the flow chemical reactor

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina

    2015-11-01

    A dynamic model of the thermochemical process in a flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  17. Effect of Applied Current Density on Cavitation-Erosion Characteristics for Anodized Al Alloy.

    PubMed

    Lee, Seung-Jun; Kim, Seong-Jong

    2018-02-01

    Surface finishing is as important as the selection of material to achieve durability. Surface finishing is a process that provides a surface with the desired performance and features by applying external forces such as thermal energy or stress. This study investigated the optimum supply current density for preventing cavitation damage by applying an anodizing technique that artificially forms an oxide coating with excellent mechanical characteristics, such as hardness and wear resistance, on the surface. Hardness testing showed that greater hardness was associated with greater brittleness, resulting in deleterious characteristics. Consequently, an electrolyte concentration of 10 vol.%, a processing time of 40 min, an electrolyte temperature of 10 °C, and a current density of 20 mA/cm² were considered the optimum anodizing conditions for improving durability in seawater.

  18. Classification of high-resolution multi-swath hyperspectral data using Landsat 8 surface reflectance data as a calibration target and a novel histogram based unsupervised classification technique to determine natural classes from biophysically relevant fit parameters

    NASA Astrophysics Data System (ADS)

    McCann, C.; Repasky, K. S.; Morin, M.; Lawrence, R. L.; Powell, S. L.

    2016-12-01

    Compact, cost-effective, flight-based hyperspectral imaging systems can provide scientifically relevant data over large areas for a variety of applications such as ecosystem studies, precision agriculture, and land management. To fully realize this capability, unsupervised classification techniques based on radiometrically-calibrated data that cluster based on biophysical similarity rather than simply spectral similarity are needed. An automated technique to produce high-resolution, large-area, radiometrically-calibrated hyperspectral data sets based on the Landsat surface reflectance data product as a calibration target was developed and applied to three subsequent years of data covering approximately 1850 hectares. The radiometrically-calibrated data allows inter-comparison of the temporal series. Advantages of the radiometric calibration technique include the need for minimal site access, no ancillary instrumentation, and automated processing. Fitting the reflectance spectra of each pixel using a set of biophysically relevant basis functions reduces the data from 80 spectral bands to 9 parameters providing noise reduction and data compression. Examination of histograms of these parameters allows for determination of natural splitting into biophysical similar clusters. This method creates clusters that are similar in terms of biophysical parameters, not simply spectral proximity. Furthermore, this method can be applied to other data sets, such as urban scenes, by developing other physically meaningful basis functions. The ability to use hyperspectral imaging for a variety of important applications requires the development of data processing techniques that can be automated. The radiometric-calibration combined with the histogram based unsupervised classification technique presented here provide one potential avenue for managing big-data associated with hyperspectral imaging.
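
    The fitting-and-splitting idea can be sketched as follows; the Gaussian basis below is a placeholder for the paper's nine biophysically relevant functions, and the valley-splitting rule is a crude stand-in for the histogram examination described above:

```python
import numpy as np

wl = np.linspace(400, 900, 80)                     # 80 bands (nm)
# stand-in basis: Gaussian bumps across the spectrum (placeholders for
# the paper's biophysically motivated functions)
centers = np.linspace(450, 850, 9)
B = np.exp(-0.5 * ((wl[:, None] - centers[None, :]) / 40.0) ** 2)   # (80, 9)

def fit_parameters(spectra):
    """Least-squares coefficients of the basis for each pixel spectrum:
    reduces (n_pixels, 80) reflectances to (n_pixels, 9) parameters."""
    coeffs, *_ = np.linalg.lstsq(B, spectra.T, rcond=None)
    return coeffs.T

def split_threshold(param, bins=100):
    """Split one parameter at the deepest histogram valley between the
    two highest modes -- a crude stand-in for the paper's rule."""
    hist, edges = np.histogram(param, bins=bins)
    lo, hi = sorted(np.argsort(hist)[-2:])
    valley = lo + np.argmin(hist[lo:hi + 1])
    return 0.5 * (edges[valley] + edges[valley + 1])
```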

  19. Super Resolution and Interference Suppression Technique applied to SHARAD Radar Data

    NASA Astrophysics Data System (ADS)

    Raguso, M. C.; Mastrogiuseppe, M.; Seu, R.; Piazzo, L.

    2017-12-01

    We will present a super resolution and interference suppression technique applied to the data acquired by the SHAllow RADar (SHARAD) on board NASA's 2005 Mars Reconnaissance Orbiter (MRO) mission, currently operating around Mars [1]. The algorithms improve the range resolution roughly by a factor of 3 and the Signal to Noise Ratio (SNR) by several decibels. Range compression algorithms usually adopt conventional Fourier transform techniques, which are limited in resolution by the transmitted signal bandwidth, analogous to the Rayleigh criterion in optics. In this work, we investigate a super resolution method based on autoregressive models and linear prediction techniques [2]. Starting from the estimation of the linear prediction coefficients from the spectral data, the algorithm performs radar bandwidth extrapolation (BWE), thereby improving the range resolution of the pulse-compressed coherent radar data. Moreover, EMIs (ElectroMagnetic Interferences) are detected and the spectrum is interpolated in order to reconstruct an interference-free spectrum, thereby improving the SNR. The algorithm can be applied to the single complex look image after synthetic aperture processing (SAR). We apply the proposed algorithm to simulated as well as to real radar data. We will demonstrate the effective enhancement of vertical resolution with respect to the classical spectral estimator. We will show that the imaging of the subsurface layered structures observed in radargrams is improved, allowing additional insights for the scientific community in the interpretation of the SHARAD radar data, which will help to further our understanding of the formation and evolution of known geological features on Mars. References: [1] Seu et al. 2007, Science, 2007, 317, 1715-1718 [2] K.M. Cuomo, "A Bandwidth Extrapolation Technique for Improved Range Resolution of Coherent Radar Data", Project Report CJP-60, Revision 1, MIT Lincoln Laboratory (4 Dec. 1992).
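
    The linear-prediction step behind bandwidth extrapolation can be sketched as follows. This is a plain least-squares AR fit for illustration; Burg-type estimators such as Cuomo's [2] are the usual choice in practice:

```python
import numpy as np

def ar_extrapolate(spectrum, order=8, n_extra=64):
    """Bandwidth extrapolation by linear prediction: fit AR coefficients
    to the measured complex spectrum with least squares, then predict
    samples beyond the band edge."""
    x = np.asarray(spectrum, dtype=complex)
    n = len(x)
    # each row holds x[k-1..k-order], used to predict x[k]
    A = np.column_stack([x[order - j - 1:n - j - 1] for j in range(order)])
    a, *_ = np.linalg.lstsq(A, x[order:], rcond=None)
    out = list(x)
    for _ in range(n_extra):
        out.append(np.dot(a, out[-1:-order - 1:-1]))   # one-step prediction
    return np.asarray(out)
```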

  20. Recent progress in the applications of layer-by-layer assembly to the preparation of nanostructured ion-rejecting water purification membranes.

    PubMed

    Sanyal, Oishi; Lee, Ilsoon

    2014-03-01

    Reverse osmosis (RO) and nanofiltration (NF) are the two dominant membrane separation processes responsible for ion rejection. While RO is highly efficient in the removal of ions, it needs a high operating pressure and offers very low selectivity between ions. Nanofiltration, on the other hand, has a comparatively low operating pressure, and most commercial membranes offer selectivity in terms of ion rejection. However, in many nanofiltration operations the rejection of monovalent ions is not appreciable. Therefore, a high-flux, high-rejection membrane that can be applied to water purification systems is needed. One such alternative is the use of polyelectrolyte multilayer membranes, which are prepared by the deposition of alternately charged polyelectrolytes via the layer-by-layer (LbL) assembly method. LbL is one of the most common self-assembly techniques and finds application in various areas. It has a number of tunable parameters, such as deposition conditions and the number of bilayers deposited, which can be adjusted to suit the type of application. This technique can be applied to make a nanothin membrane skin which gives high rejection and at the same time allows a high water flux across it. Several research groups have applied this highly versatile technique to prepare membranes that can be employed for water purification. Some of these membranes have shown better performance than commercial nanofiltration and reverse osmosis membranes. These membranes have the potential to be applied to various aspects of water treatment such as water softening, desalination and recovery of certain ions. Besides the conventional LbL method, alternative methods have also been suggested that can make the technique faster, more efficient and thereby more commercially acceptable.

  1. Adaptive vibration control of structures under earthquakes

    NASA Astrophysics Data System (ADS)

    Lew, Jiann-Shiun; Juang, Jer-Nan; Loh, Chin-Hsiung

    2017-04-01

    This paper concerns adaptive control techniques for structural vibration suppression under earthquakes. Various control strategies have been developed to protect structures from natural hazards and improve the comfort of occupants in buildings. However, there has been little development of adaptive building control with the integration of real-time system identification and control design. Generalized predictive control, which combines the process of real-time system identification and the process of predictive control design, has received widespread acceptance and has been successfully applied to various test-beds. This paper presents a formulation of the predictive control scheme for adaptive vibration control of structures under earthquakes. Comprehensive simulations are performed to demonstrate and validate the proposed adaptive control technique for earthquake-induced vibration of a building.
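
    The real-time identification half of generalized predictive control is typically a recursive least-squares update of the plant model; the sketch below is a generic RLS step with a forgetting factor, not the paper's specific formulation:

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive-least-squares step for real-time system identification:
    theta -- parameter estimate, P -- covariance, phi -- regressor vector,
    y -- newly measured output, lam -- forgetting factor (< 1 tracks drift)."""
    Pphi = P @ phi
    k = Pphi / (lam + phi @ Pphi)           # gain vector
    theta = theta + k * (y - phi @ theta)   # prediction-error correction
    P = (P - np.outer(k, Pphi)) / lam       # covariance update with forgetting
    return theta, P
```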

  2. Numerical model estimating the capabilities and limitations of the fast Fourier transform technique in absolute interferometry

    NASA Astrophysics Data System (ADS)

    Talamonti, James J.; Kay, Richard B.; Krebs, Danny J.

    1996-05-01

    A numerical model was developed to emulate the capabilities of systems performing noncontact absolute distance measurements. The model incorporates known methods to minimize signal processing and digital sampling errors and evaluates the accuracy limitations imposed by spectral peak isolation using Hanning, Blackman, and Gaussian windows in the fast Fourier transform technique. We applied this model to the specific case of measuring the relative lengths of a compound Michelson interferometer. By processing computer-simulated data through our model, we project the ultimate precision for ideal data and for data containing AM-FM noise. The precision is shown to be limited by nonlinearities in the laser scan.
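
    The window-plus-FFT peak isolation being evaluated can be sketched as follows; the signal, the windows, and the parabolic sub-bin refinement are generic illustrations rather than the model's actual processing chain:

```python
import numpy as np

fs, n = 1000.0, 1024
t = np.arange(n) / fs
sig = np.cos(2 * np.pi * 123.4 * t)             # synthetic fringe signal

for name, w in [("hanning", np.hanning(n)), ("blackman", np.blackman(n))]:
    spec = np.abs(np.fft.rfft(sig * w, 8 * n))  # window + zero-pad
    k = int(np.argmax(spec))
    # parabolic interpolation refines the peak to sub-bin accuracy
    num = spec[k - 1] - spec[k + 1]
    den = spec[k - 1] - 2 * spec[k] + spec[k + 1]
    f_est = (k + 0.5 * num / den) * fs / (8 * n)
    print(f"{name}: {f_est:.3f} Hz")
```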

  3. Imaging Study of Multi-Crystalline Silicon Wafers Throughout the Manufacturing Process: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, S.; Yan, F.; Zaunbracher, K.

    2011-07-01

    Imaging techniques are applied to multi-crystalline silicon bricks, wafers at various process steps, and finished solar cells. Photoluminescence (PL) imaging is used to characterize defects and material quality on bricks and wafers. Defect regions within the wafers are influenced by brick position within an ingot and height within the brick. The defect areas in as-cut wafers are compared to imaging results from reverse-bias electroluminescence and dark lock-in thermography and cell parameters of near-neighbor finished cells. Defect areas are also characterized by defect band emissions. The defect areas measured by these techniques on as-cut wafers are shown to correlate to finished cell performance.

  4. NMR Hole-Burning Experiments on Superionic Conductor Glasses

    NASA Astrophysics Data System (ADS)

    Kawamura, J.; Kuwata, N.; Hattori, T.

    2004-04-01

    Inhomogeneity is an inherent feature of glass: density and concentration fluctuations frozen in at the glass transition temperature. This inhomogeneity plays a significant role in so-called superionic conductor glasses (SIG), since the mobile ions seek to move through energetically favorable paths. The localization of mobile ions in SIG near the 2nd glass transition is a remaining issue, where trapping, percolation and many-body interactions play their roles. In order to investigate the trapping process in SIG, the authors have applied the 109Ag NMR hole-burning technique to AgI-containing SIG glasses. By using this technique, the slowing down of the site-exchange rates between different sites was evaluated.

  5. An architecture for designing fuzzy logic controllers using neural networks

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1991-01-01

    Described here is an architecture for designing fuzzy controllers through a hierarchical process of control rule acquisition and by using special classes of neural network learning techniques. A new method for learning to refine a fuzzy logic controller is introduced. A reinforcement learning technique is used in conjunction with a multi-layer neural network model of a fuzzy controller. The model learns by updating its prediction of the plant's behavior and is related to Sutton's Temporal Difference (TD) method. The method proposed here has the advantage of using the control knowledge of an experienced operator and fine-tuning it through the process of learning. The approach is applied to a cart-pole balancing system.

  6. Career Development Theory and Its Application. Career Knowledge Series

    ERIC Educational Resources Information Center

    National Career Development Association, 2015

    2015-01-01

    Covers career development theory, models, and techniques and how to apply them; understand the steps in the career development process and why career choice and development theory is important as well as limitations. Presents the assumptions that underlie four different types of theories; trait and factor, learning, developmental, and transition…

  7. Recipe for an Infographic

    ERIC Educational Resources Information Center

    Abilock, Debbie; Williams, Connie

    2014-01-01

    In this article Debbie Abilock and Connie Williams apply the processes used by a food testing organization that strives to develop absolutely the best recipes for popular dishes, testing each recipe until they arrive at the combination of ingredients, technique, temperature, cooking time, and equipment that yields the best, most fool-proof recipe…

  8. Signal Decomposition of High Resolution Time Series River data to Separate Local and Regional Components of Conductivity

    EPA Science Inventory

    Signal processing techniques were applied to high-resolution time series data obtained from conductivity loggers placed upstream and downstream of a wastewater treatment facility along a river. Data was collected over 14-60 days, and several seasons. The power spectral densit...

  9. Spectral mixture modeling: Further analysis of rock and soil types at the Viking Lander sites

    NASA Technical Reports Server (NTRS)

    Adams, John B.; Smith, Milton O.

    1987-01-01

    A new image processing technique was applied to Viking Lander multispectral images. Spectral endmembers were defined that included soil, rock and shade. Mixtures of these endmembers were found to account for nearly all the spectral variance in a Viking Lander image.
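
    Linear spectral mixture modeling of this kind reduces, per pixel, to a least-squares estimate of endmember fractions. The sketch below uses invented four-band endmember spectra and a soft sum-to-one constraint, not the study's calibrated endmembers:

```python
import numpy as np

def unmix(pixel, endmembers):
    """Least-squares endmember fractions with a soft sum-to-one constraint
    (appended as an extra, heavily weighted equation)."""
    E = np.asarray(endmembers, float).T            # (bands, n_endmembers)
    w = 100.0                                      # weight on sum-to-one row
    A = np.vstack([E, w * np.ones(E.shape[1])])
    b = np.concatenate([pixel, [w]])
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

# illustrative 4-band endmember spectra for soil, rock, shade
soil, rock, shade = [0.3, 0.4, 0.5, 0.6], [0.2, 0.2, 0.25, 0.3], [0.02] * 4
pixel = 0.5 * np.array(soil) + 0.3 * np.array(rock) + 0.2 * np.array(shade)
print(unmix(pixel, [soil, rock, shade]))    # ~ [0.5, 0.3, 0.2]
```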

  10. Comparative behavioral and protein study of salivary secretions in Homalodisca spp. sharpshooters (Hemiptera: Cicadellidae: Cicadellinae)

    USDA-ARS?s Scientific Manuscript database

    A novel brush-induced method to physically stimulate salivation was applied to the glassy-winged and smoke tree sharpshooters. This technique enabled the direct observation of salivary secretion processes, solidification of saliva and for collection of salivary secretions. For both species, brush...

  11. Simulation/Gaming in the EAP Writing Class: Benefits and Drawbacks.

    ERIC Educational Resources Information Center

    Salies, Tania Gastao

    2002-01-01

    Describes an integrated use of simulation/gaming in an English for Academic Purposes (EAP) class, analyzes benefits and drawbacks, and suggest how the technique could apply to other specific contexts. Explains how international students ran a simulation on gun control; discusses the debriefing process; and considers motivation, metacognitive…

  12. The Impedance Response of Semiconductors: An Electrochemical Engineering Perspective.

    ERIC Educational Resources Information Center

    Orazem, Mark E.

    1990-01-01

    Shows that the principles learned in the study of mass transport, thermodynamics, and kinetics associated with electrochemical systems can be applied to the transport and reaction processes taking place within a semiconductor. Describes impedance techniques and provides several graphs illustrating impedance data for diverse circuit systems. (YP)

  13. Noah Pflaum | NREL

    Science.gov Websites

    303-384-7527. Noah joined NREL in 2017 after having worked as a consulting building energy analyst. His work aims to smooth the integration of building energy modeling into the building design process. Noah applies a variety of analytical techniques to solve problems associated with building performance as they

  14. 23 CFR 450.208 - Coordination of planning process activities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and related multistate planning efforts; (3) Consider the concerns of Federal land management agencies.... (d) States may use any one or more of the management systems (in whole or in part) described in 23 CFR part 500. (e) States may apply asset management principles and techniques in establishing planning...

  15. Phonics Instruction for Disabled Learners: Applying Theory to Method. Technical Report # 7.

    ERIC Educational Resources Information Center

    Fayne, Harriet R.

    To design effective remedial phonics instruction, it is necessary to examine both learner characteristics and task requirements. The paper integrates research related to information processing and psycholinguistics to formulate questions which can be used to evaluate techniques and materials used with a learning disabled population. Information…

  16. Wire-mesh sensor, ultrasound and high-speed videometry applied for the characterization of horizontal gas-liquid slug flow

    NASA Astrophysics Data System (ADS)

    Ofuchi, C. Y.; Morales, R. E. M.; Arruda, L. V. R.; Neves, F., Jr.; Dorini, L.; do Amaral, C. E. F.; da Silva, M. J.

    2012-03-01

    Gas-liquid flows occur in a broad range of industrial applications, for instance in the chemical, petrochemical and nuclear industries. Correct understanding of flow behavior is crucial for safe and optimized operation of equipment and processes. Thus, measurement of gas-liquid flow plays an important role. Many techniques have been proposed and applied to analyze two-phase flows so far. In this experimental research, data from a wire-mesh sensor, an ultrasound technique and a high-speed camera are used to study two-phase slug flows in horizontal pipes. The experiments were performed in an experimental two-phase flow loop which comprises a horizontal acrylic pipe of 26 mm internal diameter and 9 m length. Water and air were used to produce the two-phase flow and their flow rates are separately controlled to produce different flow conditions. As a parameter of choice, the translational velocity of air bubbles was determined by each of the techniques and comparatively evaluated along with a mechanistic flow model. Results obtained show good agreement among all techniques. The visualization of flow obtained by the different techniques is also presented.
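
    For the wire-mesh sensor, a common way to obtain the translational velocity (the record does not detail each estimator) is to cross-correlate signals from two axially separated measurement planes; the transit-time lag over the known spacing gives the velocity:

```python
import numpy as np

def transit_velocity(sig_up, sig_down, fs, spacing_m):
    """Estimate translational velocity from two axially separated
    void-fraction signals: the lag maximizing their cross-correlation
    gives the transit time over the known sensor spacing."""
    a = sig_up - np.mean(sig_up)
    b = sig_down - np.mean(sig_down)
    xcorr = np.correlate(b, a, mode="full")
    lag = np.argmax(xcorr) - (len(a) - 1)    # samples; > 0 means a leads b
    return spacing_m * fs / lag

# synthetic check: downstream signal is the upstream one delayed 25 samples
fs, delay = 5000.0, 25
up = np.random.rand(4096)
down = np.concatenate([np.zeros(delay), up[:-delay]])
print(transit_velocity(up, down, fs, spacing_m=0.05))   # ~ 10 m/s
```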

  17. Post Processing Methods used to Improve Surface Finish of Products which are Manufactured by Additive Manufacturing Technologies: A Review

    NASA Astrophysics Data System (ADS)

    Kumbhar, N. N.; Mulay, A. V.

    2016-08-01

    The Additive Manufacturing (AM) processes open the possibility to go directly from Computer-Aided Design (CAD) to a physical prototype. These prototypes are used as test models before a design is finalized, and sometimes as final products. Additive Manufacturing has many advantages over the traditional processes used to develop a product, such as allowing early customer involvement in product development, complex shape generation, and saving time as well as money. Additive Manufacturing also poses some special challenges that are usually worth overcoming, such as poor surface quality, limited physical properties and the need for specific raw materials. To improve the surface quality, several attempts have been made by controlling various process parameters of Additive Manufacturing and also by applying different post-processing techniques to components manufactured by Additive Manufacturing. The main objective of this work is to document an extensive literature review in the general area of post-processing techniques which are used in Additive Manufacturing.

  18. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    NASA Technical Reports Server (NTRS)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    The Kepler mission is described in overview and the Kepler technique for discovering exoplanets is discussed. The design and implementation of the Kepler spacecraft are described, tracing the data path from photons entering the telescope aperture through raw observation data transmitted to the ground operations team. The technical challenges of operating a large aperture photometer with an unprecedented 95 million pixel detector are addressed, as well as the onboard technique for processing and reducing the large volume of data produced by the Kepler photometer. The technique and challenge of day-to-day mission operations that result in a very high percentage of time on target is discussed. This includes the day-to-day process for monitoring and managing the health of the spacecraft, the annual process for maintaining sun on the solar arrays while still keeping the telescope pointed at the fixed science target, the process for safely but rapidly returning to science operations after a spacecraft-initiated safing event, and the long-term anomaly resolution process. The ground data processing pipeline, from the point that science data is received on the ground to the presentation of preliminary planetary candidates and supporting data to the science team for further evaluation, is discussed. Ground management, control, exchange and storage of Kepler's large and growing data set are discussed, as well as the process and techniques for removing noise sources and applying calibrations to intermediate data products.

  19. Application of flow cytometry to wine microorganisms.

    PubMed

    Longin, Cédric; Petitgonnet, Clément; Guilloux-Benatier, Michèle; Rousseaux, Sandrine; Alexandre, Hervé

    2017-04-01

    Flow cytometry (FCM) is a powerful technique allowing detection and enumeration of microbial populations in food and during food processing. Thanks to the fluorescent dyes and specific probes used, FCM provides information about cell physiological state and allows enumeration of a microorganism in a mixed culture. Thus, this technique is increasingly used to quantify pathogens, spoilage microorganisms and microorganisms of interest. Over the past decade, FCM applications in the wine field have increased greatly, to determine the population and physiological state of microorganisms performing alcoholic and malolactic fermentations. Wine spoilage microorganisms have also been studied. In this review we briefly describe FCM principles. Next, an in-depth review concerning the enumeration of wine microorganisms by FCM is presented, including the fluorescent dyes used and techniques allowing species-specific enumeration of yeasts and bacteria. The last chapter is dedicated to fluorescent dyes which to date are used in fluorescence microscopy but are applicable to FCM. This chapter also describes other interesting "future" techniques which could be applied to study wine microorganisms. Thus, this review seeks to highlight the main advantages of flow cytometry applied to wine microbiology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Maintaining the Health of Software Monitors

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Rungta, Neha

    2013-01-01

    Software health management (SWHM) techniques complement the rigorous verification and validation processes that are applied to safety-critical systems prior to their deployment. These techniques are used to monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore, have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing, debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.

  1. Oximetry using multispectral imaging: theory and application

    NASA Astrophysics Data System (ADS)

    MacKenzie, Lewis E.; Harvey, Andrew R.

    2018-06-01

    Multispectral imaging (MSI) is a technique for measurement of blood oxygen saturation in vivo that can be applied using various imaging modalities to provide new insights into physiology and disease development. This tutorial aims to provide a thorough introduction to the theory and application of MSI oximetry for researchers new to the field, whilst also providing detailed information for more experienced researchers. The optical theory underlying two-wavelength oximetry, three-wavelength oximetry, pulse oximetry, and multispectral oximetry algorithms are described in detail. The varied challenges of applying MSI oximetry to in vivo applications are outlined and discussed, covering: the optical properties of blood and tissue, optical paths in blood vessels, tissue auto-fluorescence, oxygen diffusion, and common oximetry artefacts. Essential image processing techniques for MSI are discussed, in particular, image acquisition, image registration strategies, and blood vessel line profile fitting. Calibration and validation strategies for MSI are discussed, including comparison techniques, physiological interventions, and phantoms. The optical principles and unique imaging capabilities of various cutting-edge MSI oximetry techniques are discussed, including photoacoustic imaging, spectroscopic optical coherence tomography, and snapshot MSI.
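
    For the two-wavelength case, the Beer-Lambert model makes saturation an explicit function of the optical-density ratio; the unknown concentration-path term cancels. In the sketch below, the extinction coefficients are commonly quoted textbook values at 660/940 nm and should be verified before any real use:

```python
def so2_two_wavelength(od1, od2, eps_hb1, eps_hbo2_1, eps_hb2, eps_hbo2_2):
    """Saturation S from optical densities OD_i at two wavelengths under
    Beer-Lambert: OD_i ~ [eps_Hb_i*(1-S) + eps_HbO2_i*S]*c*l; the unknown
    path term c*l cancels in the ratio R = OD1/OD2."""
    r = od1 / od2
    return (r * eps_hb2 - eps_hb1) / ((eps_hbo2_1 - eps_hb1) - r * (eps_hbo2_2 - eps_hb2))

# optical densities chosen to be consistent with S = 0.97 for these
# commonly quoted molar extinction values at 660 nm / 940 nm
S = so2_two_wavelength(od1=407.2, od2=1218.4,
                       eps_hb1=3227.0, eps_hbo2_1=320.0,
                       eps_hb2=693.0, eps_hbo2_2=1214.0)
print(S)    # ~0.97
```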

  2. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
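
    The residual-monitoring core of such an architecture can be sketched in a few lines; the threshold rule and the numbers below are illustrative, and the paper's detection logic is more elaborate:

```python
import numpy as np

def detect_anomalies(y_sensed, y_model, sigma_nominal, k=3.0):
    """Flag samples whose residual (sensed minus model-predicted output)
    exceeds k nominal standard deviations."""
    residuals = np.asarray(y_sensed) - np.asarray(y_model)
    return np.abs(residuals) > k * sigma_nominal

# sigma_nominal would come from fault-free engine test data
flags = detect_anomalies([500.2, 512.8, 560.0], [500.0, 512.0, 515.0],
                         sigma_nominal=2.0)
print(flags)    # third sample exceeds the 3-sigma band
```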

  3. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  4. Numerical simulation of the SAGD process coupled with geomechanical behavior

    NASA Astrophysics Data System (ADS)

    Li, Pingke

    Canada has vast oil sand resources. While a large portion of this resource can be recovered by surface mining techniques, a majority is located at depths requiring the application of in situ recovery technologies. Although a number of in situ recovery technologies exist, the steam assisted gravity drainage (SAGD) process has emerged as one of the most promising technologies to develop the in situ oil sands resources. During SAGD operations, saturated steam is continuously injected into the oil sands reservoir, which induces pore pressure and stress variations. As a result, reservoir parameters and processes may also vary, particularly when tensile and shear failure occur. This geomechanical effect is pronounced for oil sands material because oil sands have an interlocked in situ fabric. Conventional reservoir simulation generally does not take this coupled mechanism into consideration. Therefore, this research aims to improve the reservoir simulation techniques for the SAGD process applied in the development of oil sands and heavy oil reservoirs. The analyses of the decoupled reservoir geomechanical simulation results show that the geomechanical behavior in SAGD has an obvious impact on reservoir parameters, such as absolute permeability. The issues with coupled reservoir geomechanical simulations of the SAGD process have been clarified and the permeability variations due to geomechanical behaviors in the SAGD process investigated. A methodology for sequentially coupled reservoir geomechanical simulation was developed based on the reservoir simulator, EXOTHERM, and the geomechanical simulator, FLAC. In addition, a representative geomechanical model of oil sands material was summarized in this research. Finally, this reservoir geomechanical simulation methodology was verified with the UTF Phase A SAGD project and applied in a SAGD operation with gas-over-bitumen geometry. Based on this methodology, the geomechanical effect on SAGD production performance can be quantified. This research program involves the analyses of laboratory testing results obtained from the literature. However, no laboratory testing was conducted in the process of this research.

  5. A method for the analysis of nonlinearities in aircraft dynamic response to atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1976-01-01

    An analytical method is developed which combines the equivalent linearization technique for the analysis of the response of nonlinear dynamic systems with the amplitude modulated random process (Press model) for atmospheric turbulence. The method is initially applied to a bilinear spring system. The analysis of the response shows good agreement with exact results obtained by the Fokker-Planck equation. The method is then applied to an example of control-surface displacement limiting in an aircraft with a pitch-hold autopilot.
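
    A minimal sketch of the equivalent linearization step for a bilinear spring follows: for a zero-mean Gaussian response x, the equivalent stiffness minimizing the mean-square force error is k_eq = E[x f(x)] / E[x^2]. The stiffnesses, knee, and sigma below are invented, and in a full analysis sigma itself depends on k_eq, so the two are iterated to a fixed point.

    ```python
    import numpy as np

    k1, k2, xy = 1.0, 0.3, 1.0   # initial stiffness, post-yield stiffness, knee

    def f(x):
        """Bilinear restoring force."""
        return np.where(np.abs(x) <= xy, k1 * x,
                        np.sign(x) * (k1 * xy + k2 * (np.abs(x) - xy)))

    def k_equivalent(sigma, n=200_000, seed=0):
        """Monte Carlo estimate of k_eq = E[x f(x)] / E[x^2] for Gaussian x."""
        x = np.random.default_rng(seed).normal(0.0, sigma, n)
        return np.mean(x * f(x)) / np.mean(x * x)

    print(k_equivalent(sigma=2.0))   # k_eq lies between k2 and k1
    ```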

  6. Robust watermark technique using masking and Hermite transform.

    PubMed

    Coronel, Sandra L Gomez; Ramírez, Boris Escalante; Mosqueda, Marco A Acevedo

    2016-01-01

    The following paper evaluates a watermark algorithm designed for digital images by using a perceptive mask and a normalization process, thus preventing human eye detection, as well as ensuring its robustness against common processing and geometric attacks. The Hermite transform is employed because it allows a perfect reconstruction of the image while incorporating human visual system properties; moreover, it is based on derivatives of Gaussian functions. The applied watermark represents information about the digital image's proprietor. The extraction process is blind, because it does not require the original image. The following techniques were utilized in the evaluation of the algorithm: peak signal-to-noise ratio, the structural similarity index average, the normalized cross-correlation, and bit error rate. Several watermark extraction tests were performed against geometric and common processing attacks. They allowed us to identify how many bits of the watermark can be modified while still permitting adequate extraction.
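
    Two of the evaluation metrics named above, PSNR and bit error rate, can be computed as in the sketch below (array shapes and the 8-bit range are assumptions; SSIM and normalized cross-correlation are omitted).

    ```python
    import numpy as np

    def psnr(original, watermarked, peak=255.0):
        """Peak signal-to-noise ratio in dB between two images."""
        mse = np.mean((original.astype(float) - watermarked.astype(float)) ** 2)
        return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

    def bit_error_rate(bits_in, bits_out):
        """Fraction of watermark bits flipped by the attack."""
        return np.mean(np.asarray(bits_in) != np.asarray(bits_out))

    img = np.random.randint(0, 256, (64, 64))
    noisy = np.clip(img + np.random.normal(0, 2, img.shape), 0, 255)
    print(psnr(img, noisy), bit_error_rate([1, 0, 1, 1], [1, 1, 1, 1]))
    ```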

  7. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high performance computing techniques, including general purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods required to work with them must grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.

  8. Applying machine learning classification techniques to automate sky object cataloguing

    NASA Astrophysics Data System (ADS)

    Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav

    1993-08-01

    We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. This development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is consistency of classification. The classification rules which are the product of the inductive learning techniques will form an objective, examinable basis for classifying sky objects. A final, not to be underestimated benefit is that astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems based on automatically catalogued data.
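
    The classification stage can be sketched with scikit-learn's CART decision trees standing in for GID3 and O-B Tree; the features and the labeling rule below are invented for illustration.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n = 1000
    X = rng.normal(size=(n, 3))          # e.g., area, ellipticity, peak brightness
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # 0 = star, 1 = galaxy (toy rule)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)   # examinable rules
    print("held-out accuracy:", tree.score(X_te, y_te))
    ```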

  9. The effect of processing on the mechanical properties of self-reinforced composites

    NASA Astrophysics Data System (ADS)

    Hassani, Farzaneh; Martin, Peter J.; Falzon, Brian G.

    2018-05-01

    Hot-compaction is one of the most common manufacturing methods for creating recyclable all-thermoplastic composites. The current work investigates the compaction of highly oriented self-reinforced fabrics with three processing methods to study the effect of pressure and temperature on the tensile mechanical properties of the consolidated laminates. Hot-press, calender roller, and vacuum bag techniques were adopted to consolidate bi-component polypropylene woven fabrics over a range of pressures and compaction temperatures. Hot-pressed samples exhibited the highest quality of compaction. The modulus of the hot-pressed samples initially increased with compaction temperature due to improved interlayer bonding and decreased after a maximum at 150°C because of partial melting of the reinforcement phase. The calender roller technique was found to have a smaller processing temperature window, as the pressure is applied only for a short time and the fabrics start to shrink as the processing temperature increases. Constraining the fabrics throughout the process is therefore paramount. The vacuum bag results showed this technique to be the least efficient method because of the low compaction pressure. Microscopic images and void content measurements of the consolidated samples further validate the results from tensile testing.

  10. Signal processing techniques for damage detection with piezoelectric wafer active sensors and embedded ultrasonic structural radar

    NASA Astrophysics Data System (ADS)

    Yu, Lingyu; Bao, Jingjing; Giurgiutiu, Victor

    2004-07-01

    The embedded ultrasonic structural radar (EUSR) algorithm is developed for using a piezoelectric wafer active sensor (PWAS) array to detect defects within a large area of a thin-plate specimen. Signal processing techniques are used to extract the time of flight of the wave packets, and thereby to determine the location of the defects with the EUSR algorithm. In our research, the transient tone-burst wave propagation signals are generated and collected by the embedded PWAS. Then, with signal processing, the frequency contents of the signals and the time of flight of individual frequencies are determined. This paper starts with an introduction of the embedded ultrasonic structural radar algorithm. Then we describe the signal processing methods used to extract the time of flight of the wave packets. The signal processing methods used include wavelet denoising, cross correlation, and the Hilbert transform. Though the hardware can provide an averaging function to eliminate noise arising in the signal collection process, wavelet denoising is included to ensure better signal quality for applications in real, severe environments. For better recognition of the time of flight, the cross correlation method is used. The Hilbert transform is applied to the signals after cross correlation in order to extract their envelope. Signal processing and EUSR are both implemented in a user-friendly graphical interface program developed in LabVIEW. We conclude with a description of our vision for applying EUSR signal analysis to structural health monitoring and embedded nondestructive evaluation. To this end, we envisage an automatic damage detection application utilizing embedded PWAS, EUSR, and advanced signal processing.
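
    A minimal sketch of the time-of-flight chain described above (cross-correlate the received signal with the excitation burst, then locate the peak of the Hilbert envelope) on synthetic data; the sample rate, burst shape, and noise level are assumptions.

    ```python
    import numpy as np
    from scipy.signal import correlate, hilbert

    fs = 1e6                                   # sample rate [Hz]
    t = np.arange(0, 500e-6, 1 / fs)          # 500 samples
    burst = np.sin(2 * np.pi * 100e3 * t[:50]) * np.hanning(50)   # tone burst

    delay = 200                                # true delay in samples
    rx = np.zeros_like(t)
    rx[delay:delay + 50] += 0.5 * burst        # delayed, attenuated echo
    rx += 0.05 * np.random.default_rng(1).normal(size=t.size)

    xc = correlate(rx, burst, mode="full")[len(burst) - 1:]   # causal lags only
    envelope = np.abs(hilbert(xc))             # Hilbert envelope of correlation
    tof = np.argmax(envelope) / fs
    print(f"estimated TOF: {tof * 1e6:.0f} us (true {delay / fs * 1e6:.0f} us)")
    ```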

  11. Argon-oxygen atmospheric pressure plasma treatment on carbon fiber reinforced polymer for improved bonding

    NASA Astrophysics Data System (ADS)

    Chartosias, Marios

    Acceptance of Carbon Fiber Reinforced Polymer (CFRP) structures requires a robust surface preparation method with improved process controls capable of ensuring high bond quality. Surface preparation in a production clean room environment prior to applying adhesive for bonding would minimize the risk of contamination and reduce cost. Plasma treatment is a robust surface preparation process capable of being applied in a production clean room environment, with process parameters that are easily controlled and documented. Repeatable and consistent processing is enabled through the development of a process parameter window utilizing techniques such as Design of Experiments (DOE) tailored to specific adhesive and substrate bonding applications. Insight from the respective plasma treatment Original Equipment Manufacturers (OEMs), together with screening tests, was used to distinguish critical process factors from non-factors and to set the associated factor levels prior to execution of the DOE. Results from mode I Double Cantilever Beam (DCB) testing per the ASTM D 5528 [1] standard and DOE statistical analysis software are used to produce a regression model and determine appropriate optimum settings for each factor.

  12. Application of fuzzy AHP method to IOCG prospectivity mapping: A case study in Taherabad prospecting area, eastern Iran

    NASA Astrophysics Data System (ADS)

    Najafi, Ali; Karimpour, Mohammad Hassan; Ghaderi, Majid

    2014-12-01

    Using the fuzzy analytic hierarchy process (AHP) technique, we propose a method for mineral prospectivity mapping (MPM), which is commonly used in the exploration of mineral deposits. Fuzzy AHP is a popular technique that has been applied to multi-criteria decision-making (MCDM) problems. In this paper we used fuzzy AHP and a geospatial information system (GIS) to generate a prospectivity model for Iron Oxide Copper-Gold (IOCG) mineralization on the basis of its conceptual model and geo-evidence layers derived from geological, geochemical, and geophysical data in the Taherabad area, eastern Iran. Fuzzy AHP was used to determine the weights belonging to each criterion. The knowledge of three geoscientists experienced in the exploration of IOCG-type mineralization was applied to assign weights to evidence layers in the fuzzy AHP MPM approach. After assigning normalized weights to all evidential layers, a fuzzy operator was applied to integrate the weighted evidence layers. Finally, to evaluate the ability of the applied approach to delineate reliable target areas, the locations of known mineral deposits in the study area were used. The results demonstrate acceptable outcomes for IOCG exploration.
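
    The crisp-AHP weighting step can be sketched as below: derive criterion weights from the principal eigenvector of a pairwise comparison matrix and check consistency. The 3x3 matrix is invented, and the paper's fuzzy variant would replace these crisp judgments with fuzzy numbers.

    ```python
    import numpy as np

    # Pairwise comparisons of three criteria (invented example):
    # geological vs. geochemical vs. geophysical evidence layers.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                          # normalized criterion weights
    print("weights:", np.round(w, 3))

    # Consistency ratio check (random index RI = 0.58 for a 3x3 matrix).
    CI = (eigvals.real[k] - 3) / 2
    print("consistency ratio:", round(CI / 0.58, 3))
    ```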

  13. Automatic Tracking Of Remote Sensing Precipitation Data Using Genetic Algorithm Image Registration Based Automatic Morphing: September 1999 Storm Floyd Case Study

    NASA Astrophysics Data System (ADS)

    Chiu, L.; Vongsaard, J.; El-Ghazawi, T.; Weinman, J.; Yang, R.; Kafatos, M.

    Due to the poor temporal sampling by satellites, data gaps exist in satellite-derived time series of precipitation. This poses a challenge for assimilating rainfall data into forecast models. To yield a continuous time series, the classic image processing technique of digital image morphing has been used. However, digital morphing has been applied manually, which is time consuming. In order to avoid human intervention in the process, an automatic procedure for image morphing is needed for real-time operations. For this purpose, the Genetic Algorithm Based Image Registration Automatic Morphing (GRAM) model was developed and tested in this paper. Specifically, an automatic morphing technique was integrated with a Genetic Algorithm and the Feature Based Image Metamorphosis technique to fill in data gaps between satellite coverage. The technique was tested using NOWRAD data, which are generated from the network of NEXRAD radars. Time series of NOWRAD data from storm Floyd, which occurred over the US eastern region on September 16, 1999, at 00:00, 01:00, 02:00, 03:00, and 04:00 am were used. The GRAM technique was applied to data collected at 00:00 and 04:00 am. These images were also manually morphed. Images at 01:00, 02:00, and 03:00 am were interpolated using GRAM and manual morphing and compared with the original NOWRAD rain rates. The results show that the GRAM technique outperforms manual morphing. The correlation coefficients between the images generated using manual morphing are 0.905, 0.900, and 0.905 for the images at 01:00, 02:00, and 03:00 am, while the corresponding correlation coefficients based on the GRAM technique are 0.946, 0.911, and 0.913, respectively. Index terms: Remote Sensing, Image Registration, Hydrology, Genetic Algorithm, Morphing, NEXRAD
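
    For contrast with feature-based morphing, a plain linear cross-dissolve between the bracketing images is the naive way to fill the temporal gap; GRAM additionally registers and warps features before blending, which this sketch omits.

    ```python
    import numpy as np

    def cross_dissolve(img0, img4, hour):
        """Linear blend between the 00:00 and 04:00 images (no warping)."""
        alpha = hour / 4.0                      # 0 at 00:00, 1 at 04:00
        return (1 - alpha) * img0 + alpha * img4

    img_00 = np.random.rand(64, 64)             # stand-ins for NOWRAD rain fields
    img_04 = np.random.rand(64, 64)
    img_02 = cross_dissolve(img_00, img_04, 2)  # interpolated 02:00 image
    ```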

  14. New technique for ensemble dressing combining Multimodel SuperEnsemble and precipitation PDF

    NASA Astrophysics Data System (ADS)

    Cane, D.; Milelli, M.

    2009-09-01

    The Multimodel SuperEnsemble technique (Krishnamurti et al., Science 285, 1548-1550, 1999) is a postprocessing method for the estimation of weather forecast parameters that reduces direct model output errors. It differs from other ensemble analysis techniques by using an adequate weighting of the input forecast models to obtain a combined estimation of meteorological parameters. Weights are calculated by least-squares minimization of the difference between the model and the observed field during a so-called training period. Although it can be applied successfully to continuous parameters like temperature, humidity, wind speed, and mean sea level pressure (Cane and Milelli, Meteorologische Zeitschrift, 15, 2, 2006), the Multimodel SuperEnsemble also gives good results when applied to precipitation, a parameter quite difficult to handle with standard post-processing methods. Here we present our methodology for Multimodel precipitation forecasts applied to a wide spectrum of results over the very dense non-GTS weather station network of Piemonte. We will focus particularly on an accurate statistical method for bias correction and on ensemble dressing in agreement with the observed forecast-conditioned precipitation PDF. Acknowledgement: this work is supported by the Italian Civil Defence Department.
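
    The training step can be sketched as a least-squares fit of model-anomaly weights against observed anomalies, in the spirit of the Krishnamurti et al. formulation; the data sizes and synthetic forecasts below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_days, n_models = 120, 4
    truth = rng.normal(15.0, 5.0, n_days)                    # observed parameter
    forecasts = truth[:, None] + rng.normal(0, 2.0, (n_days, n_models))

    anom_f = forecasts - forecasts.mean(axis=0)              # model anomalies
    anom_o = truth - truth.mean()                            # observed anomalies
    w, *_ = np.linalg.lstsq(anom_f, anom_o, rcond=None)      # training weights

    # SuperEnsemble = observed climatology + weighted sum of model anomalies.
    superensemble = truth.mean() + anom_f @ w
    print("weights:", np.round(w, 2))
    ```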

  15. A novel fast and flexible technique of radical kinetic behaviour investigation based on pallet for plasma evaluation structure and numerical analysis

    NASA Astrophysics Data System (ADS)

    Malinowski, Arkadiusz; Takeuchi, Takuya; Chen, Shang; Suzuki, Toshiya; Ishikawa, Kenji; Sekine, Makoto; Hori, Masaru; Lukasiak, Lidia; Jakubowski, Andrzej

    2013-07-01

    This paper describes a new, fast, and case-independent technique for sticking coefficient (SC) estimation based on the pallet for plasma evaluation (PAPE) structure and numerical analysis. Our approach does not require a complicated structure, apparatus, or time-consuming measurements, but offers high reliability of data and high flexibility. Thermal analysis is also possible. This technique has been successfully applied to the estimation of the very low SC of hydrogen radicals on chemically amplified ArF 193 nm photoresist (the main goal of this study). The upper bound of our technique was determined by investigating the SC of fluorine radicals on polysilicon (at elevated temperature). Sources of estimation error and ways to reduce it are also discussed. The results of this study give insight into the process kinetics; they are not only helpful for better process understanding but may also serve as parameters in the development of a phenomenological model for predictive modelling of etching for ultimate CMOS topography simulation.

  16. Thermal Analysis of Brazing Seal and Sterilizing Technique to Break Contamination Chain for Mars Sample Return

    NASA Technical Reports Server (NTRS)

    Bao, Xiaoqi; Badescu, Mircea; Bar-Cohen, Yoseph

    2015-01-01

    The potential to return Martian samples to Earth for extensive analysis is of great interest to the planetary science community. It is important to make sure the mission would securely contain any microbes that may possibly exist on Mars, so that they would not be able to cause any adverse effects on Earth's environment. A brazing sealing and sterilizing technique has been proposed to break the Mars-to-Earth contamination chain. Thermal analysis of the brazing process was conducted for several conceptual designs that apply the technique. Controlling the temperature rise of the Martian samples is a challenge. The temperature profiles of the Martian samples being sealed in the container were predicted by finite element thermal models. The results show that the sealing and sterilization process can be controlled such that the samples' temperature is maintained below the potentially required level, and that the brazing technique is a feasible approach to break the contamination chain.

  17. An evaluation of machine processing techniques of ERTS-1 data for user applications. [urban land use and soil association mapping in Indiana

    NASA Technical Reports Server (NTRS)

    Landgrebe, D.

    1974-01-01

    A broad study is described to evaluate a set of machine analysis and processing techniques applied to ERTS-1 data. Based on the analysis results in urban land use analysis and soil association mapping, together with previously reported results in general earth surface feature identification and crop species classification, a profile of the general applicability of this procedure is beginning to emerge. Put in the hands of a user who knows well the information needed from the data and is familiar with the region to be analyzed, these methods appear capable of generating significantly useful information. When supported by preprocessing techniques such as geometric correction and temporal registration capabilities, final products readily usable by user agencies appear possible. In parallel with application, further research offers much potential for developing these techniques, both toward higher performance and in new situations not yet studied.

  18. Combining LCT tools for the optimization of an industrial process: material and energy flow analysis and best available techniques.

    PubMed

    Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares

    2011-09-15

    Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included in the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the industrial metabolism definition. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy: BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify so-called improvable flows by MEFA, so that the appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows were identified and seven candidate BAT were proposed to reduce these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and selects the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Combining geographic information system, multicriteria evaluation techniques and fuzzy logic in siting MSW landfills

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Tsihrintzis, Vassilios A.; Voudrias, Evangelos; Petalas, Christos; Stravodimos, George

    2007-01-01

    This study presents a methodology for siting municipal solid waste landfills, coupling geographic information systems (GIS), fuzzy logic, and multicriteria evaluation techniques. Both exclusionary and non-exclusionary criteria are used. Factors, i.e., non-exclusionary criteria, are divided into two distinct groups which do not have the same level of trade-off. The first group comprises factors related to the physical environment, which cannot be expressed in terms of monetary cost and, therefore, do not easily trade off. The second group includes those factors related to human activities, i.e., socioeconomic factors, which can be expressed as financial cost, thus showing a high level of trade-off. GIS are used for geographic data acquisition and processing. The analytical hierarchy process (AHP) is the multicriteria evaluation technique used, enhanced with fuzzy factor standardization. Besides assigning weights to factors through the AHP, control over the level of risk and trade-off in the siting process is achieved through a second set of weights, i.e., order weights, applied to factors in each factor group on a pixel-by-pixel basis, thus taking into account the local site characteristics. The method has been applied to Evros prefecture (NE Greece), an area of approximately 4,000 km². The siting methodology results in two intermediate suitability maps, one related to environmental and the other to socioeconomic criteria. Combination of the two intermediate maps results in the final composite suitability map for landfill siting.

  20. Drug loading into beta-cyclodextrin granules using a supercritical fluid process for improved drug dissolution.

    PubMed

    Hussein, Khaled; Türk, Michael; Wahl, Martin A

    2008-03-03

    To improve the dissolution properties of drugs, a supercritical fluid (SCF) technique was used to load these drugs into a solid carrier. In this study, granules based on beta-cyclodextrin (betaCD) were applied as a carrier for a poorly water-soluble drug and loaded with a model drug (ibuprofen) using two different procedures: controlled particle deposition (CPD), an SCF process, and solution immersion (SI) as a conventional method for comparison. Using the CPD technique, 17.42 ± 2.06 wt.% (n=3) ibuprofen was loaded into betaCD granules, in contrast to only 3.8 ± 0.15 wt.% (n=3) in the SI product. The drug loading was confirmed as well by the reduction of the BET surface area of the CPD product (1.134 ± 0.07 m²/g) compared to the unloaded granules (1.533 ± 0.031 m²/g). Such a reduction was not seen in the SI product (1.407 ± 0.048 m²/g). The appearance of an endothermic melting peak at 77 °C and X-ray patterns representing ibuprofen in the drug-loaded granules can be attributed to the amount of ibuprofen loaded in its crystalline form. A significant increase in drug dissolution was achieved by either drug-loading procedure compared to the unprocessed ibuprofen. In this study, the CPD technique, a supercritical fluid process avoiding the use of toxic or organic solvents, was successfully applied to load a drug into solid carriers, thereby improving the water solubility of the drug.

  1. A low-frequency near-field interferometric-TOA 3-D Lightning Mapping Array

    NASA Astrophysics Data System (ADS)

    Lyu, Fanchao; Cummer, Steven A.; Solanki, Rahulkumar; Weinert, Joel; McTague, Lindsay; Katko, Alex; Barrett, John; Zigoneanu, Lucian; Xie, Yangbo; Wang, Wenqi

    2014-11-01

    We report on the development of an easily deployable LF near-field interferometric-time of arrival (TOA) 3-D Lightning Mapping Array applied to imaging of entire lightning flashes. An interferometric cross-correlation technique is applied in our system to compute windowed two-sensor time differences with submicrosecond time resolution before TOA is used for source location. Compared to previously reported LF lightning location systems, our system captures many more LF sources. This is due mainly to the improved mapping of continuous lightning processes by using this type of hybrid interferometry/TOA processing method. We show with five station measurements that the array detects and maps different lightning processes, such as stepped and dart leaders, during both in-cloud and cloud-to-ground flashes. Lightning images mapped by our LF system are remarkably similar to those created by VHF mapping systems, which may suggest some special links between LF and VHF emission during lightning processes.
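
    The TOA location step can be sketched as a nonlinear least-squares solve for source position and emission time from station arrival times; the 2-D station layout and values below are invented, and the real array works in 3-D with cross-correlation-derived time differences.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    c = 3.0e8                                         # propagation speed [m/s]
    stations = np.array([[0, 0], [10e3, 0], [0, 10e3], [10e3, 10e3], [5e3, -5e3]])
    src, t0 = np.array([3e3, 7e3]), 1.0e-3            # true source, emission time
    t_arr = t0 + np.linalg.norm(stations - src, axis=1) / c   # measured arrivals

    def residual(p):
        """Predicted minus measured arrival times for candidate (x, y, t)."""
        x, y, t = p
        return t + np.linalg.norm(stations - [x, y], axis=1) / c - t_arr

    sol = least_squares(residual, x0=[1e3, 1e3, 0.0])
    print("estimated source:", sol.x[:2], "emission time:", sol.x[2])
    ```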

  2. Preparation of highly hydrophobic cotton fabrics by modification with bifunctional silsesquioxanes in the sol-gel process

    NASA Astrophysics Data System (ADS)

    Przybylak, Marcin; Maciejewski, Hieronim; Dutkiewicz, Agnieszka

    2016-11-01

    The surface modification of cotton fabrics was carried out using two types of bifunctional fluorinated silsesquioxanes with different ratios of functional groups. The modification was performed by either a one- or a two-step process. Two methods, sol-gel and dip coating, were used in different configurations. Heat treatment and a washing process were applied after modification. The wettability of the cotton fabric was evaluated by measuring water contact angles (WCA). Changes in the surface morphology were examined by scanning electron microscopy (SEM, SEM-LFD) and atomic force microscopy (AFM). Moreover, the modified fabrics were subjected to analysis of the elemental composition of the applied coatings using SEM-EDS techniques. Highly hydrophobic textiles were obtained in all cases studied, and one of the modifications resulted in imparting superhydrophobic properties. Most of the impregnated textiles remained hydrophobic even after multiple washing processes, which shows that the studied modification is durable.

  3. A new approach for remediation of As-contaminated soil: ball mill-based technique.

    PubMed

    Shin, Yeon-Jun; Park, Sang-Min; Yoo, Jong-Chan; Jeon, Chil-Sung; Lee, Seung-Woo; Baek, Kitae

    2016-02-01

    In this study, a physical ball mill process, instead of chemical extraction using toxic chemical agents, was applied to remove arsenic (As) from contaminated soil. A statistical analysis was carried out to establish the optimal conditions for ball mill processing. As a result of the statistical analysis, approximately 70% of As was removed from the soil under the following conditions: operating time of 5 min, media size of 1.0 cm, rotational velocity of 10 rpm, and soil loading of 5%. A significant amount of As remained in the ground fine soil after ball mill processing, while more than 90% of the soil retained its original properties for reuse or recycling. As a result, the ball mill process could remove the metals bound strongly to the soil surface by surface grinding, and could be applied as a pretreatment before chemical extraction to reduce the load.

  4. The Neuroscience of Consumer Choice

    PubMed Central

    Hsu, Ming; Yoon, Carolyn

    2015-01-01

    We review progress and challenges relating to scientific and applied goals of the nascent field of consumer neuroscience. Scientifically, substantial progress has been made in understanding the neurobiology of choice processes. Further advances, however, require researchers to begin clarifying the set of developmental and cognitive processes that shape and constrain choices. First, despite the centrality of preferences in theories of consumer choice, we still know little about where preferences come from and the underlying developmental processes. Second, the role of attention and memory processes in consumer choice remains poorly understood, despite importance ascribed to them in interpreting data from the field. The applied goal of consumer neuroscience concerns our ability to translate this understanding to augment prediction at the population level. Although the use of neuroscientific data for market-level predictions remains speculative, there is growing evidence of superiority in specific cases over existing market research techniques. PMID:26665152

  5. Effect of Cooling Rate on SCC Susceptibility of β-Processed Ti-6Al-4V Alloy in 0.6M NaCl Solution

    NASA Astrophysics Data System (ADS)

    Ahn, Soojin; Park, Jiho; Jeong, Daeho; Sung, Hyokyung; Kwon, Yongnam; Kim, Sangshik

    2018-03-01

    The effects of cooling rate on the stress corrosion cracking (SCC) susceptibility of β-processed Ti-6Al-4V (Ti64) alloy, including BA/S specimen with furnace cooling and BQ/S specimen with water quenching, were investigated in 0.6M NaCl solution under various applied potentials using a slow strain rate test technique. It was found that the SCC susceptibility of β-processed Ti64 alloy in aqueous NaCl solution decreased with fast cooling rate, which was particularly substantial under an anodic applied potential. The micrographic and fractographic analyses suggested that the enhancement with fast cooling rate was related to the random orientation of acicular α platelets in BQ/S specimen. Based on the experimental results, the effect of cooling rate on the SCC behavior of β-processed Ti64 alloy in aqueous NaCl solution was discussed.

  6. Constructed Pools-and-Riffles: Application and Assessment in Illinois.

    NASA Astrophysics Data System (ADS)

    Day, D. M.; Dodd, H. R.; Carney, D. A.; Holtrop, A. M.; Whiles, M. R.; White, B.; Roseboom, D.; Kinney, W.; Keefer, L. L.; Beardsley, J.

    2005-05-01

    The diversity of Illinois' streams provides a broad range of conditions, and thus a variety of restoration techniques may be required to adequately compensate for watershed alterations. Resource management agencies and research institutions in the state have collaborated on a variety of applied research initiatives to assess the efficacy of various stream protection and restoration techniques. Constructed pool-and-riffle structures have received significant attention because they tend to address watershed processes (i.e., channel evolution model) and may benefit biotic communities and processes along with physical habitat. Constructed pools-and-riffles have been applied primarily to address geomorphic instability, yet understanding biological responses can provide further rationale for their use and design specifications. In three stream systems around the state, fish were collected pre- and post-installation of structures, using primarily electrofishing techniques (e.g., electric seine and backpack). In general, within the first five years after installation, changes in fish communities have included a shift from high-abundance, small cyprinid-dominated assemblages to low-density Centrarchidae and Catostomidae assemblages. Changes in macroinvertebrates at selected sites included increases in filter feeders and sensitive taxa such as the Ephemeroptera, Plecoptera, and Trichoptera (EPT). Ongoing assessments will be critical for understanding long-term influences on stream ecosystem structure and function.

  7. Processing and analysis of commercial satellite image data of the nuclear accident near Chernobyl, U.S.S.R.

    USGS Publications Warehouse

    Sadowski, Franklin G.; Covington, Steven J.

    1987-01-01

    Advanced digital processing techniques were applied to Landsat-5 Thematic Mapper (TM) data and SPOT high-resolution visible (HRV) panchromatic data to maximize the utility of images of a nuclear powerplant emergency at Chernobyl in the Soviet Ukraine. The images demonstrate the unique interpretive capabilities provided by the numerous spectral bands of the Thematic Mapper and the high spatial resolution of the SPOT HRV sensor.

  8. Lunar Impact Flash Locations

    NASA Technical Reports Server (NTRS)

    Moser, D. E.; Suggs, R. M.; Kupferschmidt, L.; Feldman, J.

    2015-01-01

    A bright impact flash detected by the NASA Lunar Impact Monitoring Program in March 2013 brought into focus the importance of determining the impact flash location. A process for locating the impact flash, and presumably its associated crater, was developed using commercially available software tools. The process was successfully applied to the March 2013 impact flash and put into production on an additional 300 impact flashes. The goal today: provide a description of the geolocation technique developed.

  9. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally-intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building in safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  10. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence the selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.

  11. Evaluation of area strain response of dielectric elastomer actuator using image processing technique

    NASA Astrophysics Data System (ADS)

    Sahu, Raj K.; Sudarshan, Koyya; Patra, Karali; Bhaumik, Shovan

    2014-03-01

    A dielectric elastomer actuator (DEA) is a kind of soft actuator that can produce significantly large electric-field-induced actuation strain and may be a basic unit of artificial muscles and robotic elements. Understanding strain development in a pre-stretched sample at different regimes of electric field is essential for potential applications. In this paper, we report on ongoing work on the determination of area strain using a digital camera and an image processing technique. The setup, developed in-house, consists of a low-cost digital camera, data acquisition, and an image processing algorithm. Samples were prepared from biaxially stretched acrylic tape supported between two cardboard frames. Carbon grease was pasted on both sides of the sample to form compliant electrodes that follow the large electric-field-induced deformation. Images were grabbed before and after the application of high voltage. From the incremental image area, strain was calculated as a function of the voltage applied to a pre-stretched dielectric elastomer (DE) sample. Area strain was plotted against the applied voltage for different pre-stretched samples. Our study shows that the area strain exhibits a nonlinear relationship with applied voltage. For the same voltage, higher area strain was generated in samples with a higher pre-stretch value. Our characterization also matches well with previously published results obtained with a costly video extensometer. The study may be helpful to designers fabricating biaxially pre-stretched planar actuators from similar kinds of materials.
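
    The area-strain computation can be sketched as thresholding the electrode region in the before and after frames and comparing pixel counts; the synthetic masks below stand in for grabbed camera images.

    ```python
    import numpy as np

    def electrode_area(img, thresh=0.5):
        """Count pixels belonging to the (bright) electrode region."""
        return np.count_nonzero(img > thresh)

    def area_strain(img_before, img_after):
        """Fractional change in electrode area between two frames."""
        a0, a1 = electrode_area(img_before), electrode_area(img_after)
        return (a1 - a0) / a0

    before = np.zeros((200, 200)); before[60:140, 60:140] = 1.0   # 80x80 electrode
    after = np.zeros((200, 200)); after[55:145, 55:145] = 1.0     # expanded 90x90
    print(f"area strain: {area_strain(before, after):.3f}")       # ~0.266
    ```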

  12. Multivariable control of a rapid thermal processor using ultrasonic sensors

    NASA Astrophysics Data System (ADS)

    Dankoski, Paul C. P.

    The semiconductor manufacturing industry faces the need for tighter control of thermal budget and process variations as circuit feature sizes decrease. Strategies to meet this need include supervisory control, run-to-run control, and real-time feedback control. Typically, the level of control chosen depends upon the actuation and sensing available. Rapid Thermal Processing (RTP) is one step of the manufacturing cycle requiring precise temperature control and hence real-time feedback control. At the outset of this research, the primary ingredient lacking from in-situ RTP temperature control was a suitable sensor. This research looks at an alternative to the traditional approach of pyrometry, which is limited by the unknown and possibly time-varying wafer emissivity. The technique is based upon the temperature dependence of the propagation time of an acoustic wave in the wafer. The aim of this thesis is to evaluate the ultrasonic sensors as a potentially viable sensor for control in RTP. To do this, an experimental implementation was developed at the Center for Integrated Systems. Because of the difficulty in applying a known temperature standard in an RTP environment, calibration to absolute temperature is nontrivial. Given reference propagation delays, multivariable model-based feedback control is applied to the system. The modelling and implementation details are described. The control techniques have been applied to a number of research processes including rapid thermal annealing and rapid thermal crystallization of thin silicon films on quartz/glass substrates.

  13. A hybrid artificial neural network as a software sensor for optimal control of a wastewater treatment process.

    PubMed

    Choi, D J; Park, H

    2001-11-01

    For the control and automation of biological treatment processes, the lack of reliable on-line sensors to measure water quality parameters is one of the most important problems to overcome. Many parameters cannot be measured directly with on-line sensors. The accuracy of existing hardware sensors is also not sufficient, and maintenance problems such as electrode fouling often cause trouble. This paper deals with the development of software sensor techniques that estimate the target water quality parameter from other parameters using the correlation between water quality parameters. We focus our attention on the preprocessing of noisy data and the selection of the model best suited to the situation. Problems of existing approaches are also discussed. We propose a hybrid neural network as a software sensor inferring a wastewater quality parameter. Multivariate regression, artificial neural networks (ANN), and a hybrid technique that combines principal component analysis as a preprocessing stage are applied to data from industrial wastewater processes. The hybrid ANN technique shows an enhancement of prediction capability and reduces the overfitting problem of neural networks. The result shows that the hybrid ANN technique can be used to extract information from noisy data and to describe the nonlinearity of complex wastewater treatment processes.
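
    The hybrid idea (PCA preprocessing feeding a neural network that infers a hard-to-measure parameter) can be sketched with scikit-learn; the synthetic data, target, and network size below are assumptions, not the paper's plant data or architecture.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))                              # on-line measurements
    y = X[:, 0] ** 2 + X[:, 1] + 0.1 * rng.normal(size=500)   # target, e.g. COD

    soft_sensor = make_pipeline(
        StandardScaler(),
        PCA(n_components=4),                                   # denoise / decorrelate
        MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    soft_sensor.fit(X[:400], y[:400])
    print("held-out R^2:", round(soft_sensor.score(X[400:], y[400:]), 3))
    ```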

  14. Captured metagenomics: large-scale targeting of genes based on ‘sequence capture’ reveals functional diversity in soils

    PubMed Central

    Manoharan, Lokeshwaran; Kushwaha, Sandeep K.; Hedlund, Katarina; Ahrén, Dag

    2015-01-01

    Microbial enzyme diversity is a key to understand many ecosystem processes. Whole metagenome sequencing (WMG) obtains information on functional genes, but it is costly and inefficient due to large amount of sequencing that is required. In this study, we have applied a captured metagenomics technique for functional genes in soil microorganisms, as an alternative to WMG. Large-scale targeting of functional genes, coding for enzymes related to organic matter degradation, was applied to two agricultural soil communities through captured metagenomics. Captured metagenomics uses custom-designed, hybridization-based oligonucleotide probes that enrich functional genes of interest in metagenomic libraries where only probe-bound DNA fragments are sequenced. The captured metagenomes were highly enriched with targeted genes while maintaining their target diversity and their taxonomic distribution correlated well with the traditional ribosomal sequencing. The captured metagenomes were highly enriched with genes related to organic matter degradation; at least five times more than similar, publicly available soil WMG projects. This target enrichment technique also preserves the functional representation of the soils, thereby facilitating comparative metagenomics projects. Here, we present the first study that applies the captured metagenomics approach in large scale, and this novel method allows deep investigations of central ecosystem processes by studying functional gene abundances. PMID:26490729

  15. Interference Mitigation Effects on Synthetic Aperture Radar Coherent Data Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musgrove, Cameron

    For synthetic aperture radars, radio frequency interference from sources external to the radar system, and the techniques used to mitigate that interference, can degrade the quality of the image products. Usually the radar system designer will try to balance the amount of mitigation against an acceptable amount of interference to optimize the image quality. This dissertation examines the effect of interference mitigation upon coherent data products of fine-resolution, high-frequency synthetic aperture radars using stretch processing. Novel interference mitigation techniques are introduced that operate on single or multiple apertures of data and increase average coherence compared to existing techniques. New metrics are applied to evaluate multiple mitigation techniques for image quality and average coherence. The underlying mechanism by which interference mitigation techniques affect coherence is revealed.

  16. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  17. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  18. Customer satisfaction planning and industrial engineering move hospital towards in-house stockless program.

    PubMed

    Burton, R; Mauk, D

    1993-03-01

    By integrating customer satisfaction planning and industrial engineering techniques when examining internal costs and efficiencies, materiel managers are able to better realize what concepts will best meet their customers' needs. Defining your customer(s), applying industrial engineering techniques, completing work sampling studies, itemizing recommendations and benefits for each alternative, performing feasibility and cost-analysis matrices, and utilizing resources through productivity monitoring will get you on the right path toward selecting concepts to use. This article reviews the above procedures as they applied to one hospital's decision-making process to determine whether to incorporate a stockless inventory program. Through an analysis of customer demand, the hospital realized that stockless was the way to go, but not by outsourcing the function--the hospital incorporated an in-house stockless inventory program.

  19. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 4-to-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.

  20. An efficient interior-point algorithm with new non-monotone line search filter method for nonlinear constrained programming

    NASA Astrophysics Data System (ADS)

    Wang, Liwei; Liu, Xinggao; Zhang, Zeyin

    2017-02-01

    An efficient primal-dual interior-point algorithm using a new non-monotone line search filter method is presented for nonlinear constrained programming, which is widely applied in engineering optimization. The new non-monotone line search technique is introduced to lead to relaxed step acceptance conditions and improved convergence performance. It can also avoid the choice of the upper bound on the memory, which brings obvious disadvantages to traditional techniques. Under mild assumptions, the global convergence of the new non-monotone line search filter method is analysed, and fast local convergence is ensured by second order corrections. The proposed algorithm is applied to the classical alkylation process optimization problem and the results illustrate its effectiveness. Some comprehensive comparisons to existing methods are also presented.

  1. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  2. Comparison of various techniques for the extraction of umbelliferone and herniarin in Matricaria chamomilla processing fractions.

    PubMed

    Molnar, Maja; Mendešević, Nikolina; Šubarić, Drago; Banjari, Ines; Jokić, Stela

    2017-08-05

    Chamomile, a well-known medicinal plant, is a rich source of bioactive compounds, among which two coumarin derivatives, umbelliferone and herniarin, are often found in its extracts. Chamomile extracts have found different uses in the cosmetic industry, as has umbelliferone itself, which, due to its strong absorption of UV light, is usually added to sunscreens, while herniarin (7-methoxycoumarin) is also known for its biological activity. Therefore, chamomile extracts with a certain herniarin and umbelliferone content could be of interest for application in pharmaceutical and cosmetic products. The aim of this study was to compare the extracts of different chamomile fractions (unprocessed chamomile flowers first class, processed chamomile flowers first class, pulvis, and processing waste) and to identify the best material and method of extraction to obtain herniarin and umbelliferone. Various extraction techniques, such as Soxhlet extraction, hydrodistillation, maceration, and supercritical CO2 extraction, were used in this study. Umbelliferone and herniarin contents were determined by high performance liquid chromatography (HPLC). The highest yields of umbelliferone (11.80 mg/100 g) and herniarin (82.79 mg/100 g) were obtained from chamomile processing waste using the maceration technique with 50% aqueous ethanol solution, and this extract was also shown to possess antioxidant activity (61.5% DPPH scavenging activity). This study shows the potential for utilization of waste from chamomile processing by applying different extraction techniques.

  3. Comparative study of resist stabilization techniques for metal etch processing

    NASA Astrophysics Data System (ADS)

    Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.

    1999-06-01

    This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity is evaluated in all cases by using a timed metal etch and measuring the resist remaining relative to the total metal thickness etched. Etch selectivity is presented as a function of stabilization condition. Analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.

  4. Multiscale Image Processing of Solar Image Data

    NASA Astrophysics Data System (ADS)

    Young, C.; Myers, D. C.

    2001-12-01

    It is often said that the blessing and curse of solar physics is too much data. Solar missions such as Yohkoh, SOHO, and TRACE have shown us the Sun with amazing clarity but have also increased the amount of highly complex data. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for the analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade. Many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present several applications of multiscale techniques applied to solar image data. Specifically, we discuss uses of the wavelet, curvelet, and related transforms to define a multiresolution support for EIT, LASCO, and TRACE images.
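
    One way to build a multiresolution support, sketched below under stated assumptions: take a stationary 2-D wavelet transform and keep, at each scale, the coefficients significant against a robust noise estimate. PyWavelets and the 3-sigma threshold are illustrative choices, not the authors' exact pipeline.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    img = rng.normal(size=(128, 128))
    img[40:60, 40:60] += 4.0                      # a bright "feature"

    coeffs = pywt.swt2(img, "haar", level=3)      # stationary wavelet transform
    support = []
    for cA, (cH, cV, cD) in coeffs:
        sigma = np.median(np.abs(cD)) / 0.6745    # robust noise estimate per scale
        mask = (np.abs(cH) > 3 * sigma) | (np.abs(cV) > 3 * sigma) \
             | (np.abs(cD) > 3 * sigma)
        support.append(mask)                      # True where structure is significant
    print([m.mean().round(3) for m in support])   # fraction of significant pixels
    ```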

  5. Noise suppression in surface microseismic data by τ-p transform

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Batzle, Mike; Behura, Jyoti; Willis, Mark; Haines, Seth; Davidson, Michael

    2013-01-01

    Surface passive seismic methods are receiving increased attention for monitoring changes in reservoirs during the production of unconventional oil and gas. However, in passive seismic data the strong cultural and ambient noise (mainly surface waves) decreases the effectiveness of these techniques. Hence, suppression of surface waves is a critical step in surface microseismic monitoring. We apply a noise suppression technique, based on the τ-p transform, to a surface passive seismic dataset recorded over a Barnett Shale reservoir undergoing a hydraulic fracturing process. This technique not only improves the signal-to-noise ratios of added synthetic microseismic events, but it also preserves the event waveforms.
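
    A linear τ-p (slant-stack) forward transform can be sketched as shifting each trace by slowness times offset and summing; the nearest-sample shifts and synthetic data below are simplifications, and a full workflow would mute the surface-wave region in τ-p and transform back.

    ```python
    import numpy as np

    def tau_p(data, offsets, slownesses, dt):
        """Slant stack: out[p, tau] = sum over traces of d(x, tau + p * x)."""
        n_t = data.shape[1]
        out = np.zeros((len(slownesses), n_t))
        for i, p in enumerate(slownesses):
            for trace, x in zip(data, offsets):
                shift = int(round(p * x / dt))       # samples of linear moveout
                if 0 <= shift < n_t:
                    out[i, : n_t - shift] += trace[shift:]
        return out

    rng = np.random.default_rng(0)
    data = rng.normal(0, 0.1, (24, 500))              # 24 traces, 500 samples
    offsets = np.arange(24) * 25.0                    # receiver offsets [m]
    taup = tau_p(data, offsets, np.linspace(0, 2e-3, 50), dt=0.002)
    ```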

  6. Multivariable PID controller design tuning using bat algorithm for activated sludge process

    NASA Astrophysics Data System (ADS)

    Atikah Nor’Azlan, Nur; Asmiza Selamat, Nur; Mat Yahya, Nafrizuan

    2018-04-01

    This project concerns the design of a multivariable PID (MPID) controller for a multi-input multi-output activated sludge process, applying four MPID tuning methods: Davison, Penttinen-Koivo, Maciejowski and a proposed combined method. The aim of this study is to investigate the performance of a selected optimization technique, the Bat Algorithm (BA), in tuning the parameters of the MPID controller. All the MPID-BA tuning results are compared and analyzed, and the best MPID-BA is then chosen in order to determine which technique is better based on the system performance in terms of transient response.
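
    A minimal, illustrative bat algorithm sketch is given below. The activated sludge model is not specified in the record, so a discretized first-order plant and an integral-squared-error cost stand in for the real MIMO objective; all parameter values are assumptions.

      import numpy as np

      def bat_algorithm(obj, lb, ub, n_bats=20, n_iter=30, fmin=0.0, fmax=2.0,
                        alpha=0.9, gamma=0.9, seed=0):
          """Minimal bat algorithm (after Yang, 2010) minimizing obj over [lb, ub]."""
          rng = np.random.default_rng(seed)
          dim = len(lb)
          x = rng.uniform(lb, ub, (n_bats, dim))
          v = np.zeros((n_bats, dim))
          loud = np.ones(n_bats)                     # loudness A_i
          rate = np.zeros(n_bats)                    # pulse emission rate r_i
          fit = np.array([obj(xi) for xi in x])
          ib = fit.argmin(); best, best_fit = x[ib].copy(), fit[ib]
          for t in range(1, n_iter + 1):
              for i in range(n_bats):
                  f = fmin + (fmax - fmin) * rng.random()   # frequency
                  v[i] += (x[i] - best) * f
                  cand = np.clip(x[i] + v[i], lb, ub)
                  if rng.random() > rate[i]:                # local walk near the best bat
                      cand = np.clip(best + 0.01 * (ub - lb) * rng.standard_normal(dim),
                                     lb, ub)
                  fc = obj(cand)
                  if fc <= fit[i] and rng.random() < loud[i]:
                      x[i], fit[i] = cand, fc
                      loud[i] *= alpha
                      rate[i] = 1.0 - np.exp(-gamma * t)
                  if fc < best_fit:
                      best, best_fit = cand.copy(), fc
          return best, best_fit

      def ise(gains, K=1.0, tau=5.0, dt=0.01, T=20.0):
          """ISE of a unit step response: assumed first-order plant under PID control."""
          kp, ki, kd = gains
          y = integ = e_prev = 0.0; cost = 0.0
          for _ in range(int(T / dt)):
              e = 1.0 - y
              integ += e * dt
              u = kp * e + ki * integ + kd * (e - e_prev) / dt
              y += dt * (-y + K * u) / tau            # Euler step of tau*dy/dt = -y + K*u
              e_prev = e
              cost += e * e * dt
          return cost

      print(bat_algorithm(ise, np.array([0.0, 0.0, 0.0]), np.array([10.0, 10.0, 1.0])))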

  7. Determination of piezo-optic coefficients of crystals by means of four-point bending.

    PubMed

    Krupych, Oleg; Savaryn, Viktoriya; Krupych, Andriy; Klymiv, Ivan; Vlokh, Rostyslav

    2013-06-10

    A technique developed recently for determining piezo-optic coefficients (POCs) of isotropic optical media, which represents a combination of digital imaging laser interferometry and a classical four-point bending method, is generalized and applied to a single-crystalline anisotropic material. The peculiarities of the measuring procedures and data processing for the case of optically uniaxial crystals are described in detail. The capabilities of the technique are tested on the example of the canonical nonlinear optical crystal LiNbO3. The high precision achieved in the determination of the POCs for isotropic and anisotropic materials testifies that the technique is both versatile and reliable.
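
    For orientation, the isotropic textbook version of the measurement combines the four-point bending stress formula with the photoelastic relation Δn = -(1/2) n³ π σ; the anisotropic generalization in the record requires the full POC tensor. A sketch with hypothetical numbers follows (an ordinary refractive index near that of LiNbO3 is assumed):

      def bending_stress(F, L_outer, L_inner, b, h):
          """Max fiber stress in four-point bending: sigma = 3F(L - Li) / (2 b h^2),
          for total load F (N) and beam width b, height h (m)."""
          return 3.0 * F * (L_outer - L_inner) / (2.0 * b * h**2)

      def piezo_optic_coefficient(delta_n, n0, sigma):
          """POC pi (m^2/N) from induced index change delta_n = -0.5 * n0^3 * pi * sigma."""
          return -2.0 * delta_n / (n0**3 * sigma)

      # Hypothetical: 10 N load, 40/20 mm spans, 5 x 2 mm bar, n0 ~ 2.29 (LiNbO3-like)
      sigma = bending_stress(F=10.0, L_outer=40e-3, L_inner=20e-3, b=5e-3, h=2e-3)
      print(piezo_optic_coefficient(delta_n=-2e-6, n0=2.29, sigma=sigma))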

  8. Monitoring gypsy moth defoliation by applying change detection techniques to Landsat imagery

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Stauffer, M. L.

    1978-01-01

    The overall objective of a research effort at NASA's Goddard Space Flight Center is to develop and evaluate digital image processing techniques that will facilitate the assessment of the intensity and spatial distribution of forest insect damage in Northeastern U.S. forests using remotely sensed data from Landsats 1, 2 and C. Automated change detection techniques are presently being investigated as a method of isolating the areas of change in the forest canopy resulting from pest outbreaks. In order to follow the change detection approach, Landsat scene correction and overlay capabilities are utilized to provide multispectral/multitemporal image files of 'defoliation' and 'nondefoliation' forest stand conditions.
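
    A minimal sketch of the differencing flavor of automated change detection, assuming two co-registered, overlaid band images as described above; the z-score threshold rule and the synthetic arrays are illustrative assumptions, not the Goddard implementation.

      import numpy as np

      def change_mask(band_t1, band_t2, k=2.0):
          """Flag defoliation-like change where the band difference departs from the
          scene mean by more than k standard deviations (a drop in near-IR response)."""
          diff = band_t2.astype(float) - band_t1.astype(float)
          z = (diff - diff.mean()) / diff.std()
          return z < -k

      t1 = np.random.rand(100, 100)                        # stand-in co-registered bands
      t2 = t1 - 0.3 * (np.random.rand(100, 100) > 0.9)     # synthetic canopy loss
      print(change_mask(t1, t2).sum())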

  9. Development and flight test of an experimental maneuver autopilot for a highly maneuverable aircraft

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Jones, Frank P.; Roncoli, Ralph B.

    1986-01-01

    This report presents the development of an experimental flight test maneuver autopilot (FTMAP) for a highly maneuverable aircraft. The essence of this technique is the application of an autopilot to provide precise control during required flight test maneuvers. This newly developed flight test technique is being applied at the Dryden Flight Research Facility of NASA Ames Research Center. The FTMAP is designed to increase the quantity and quality of data obtained in test flight. The technique was developed and demonstrated on the highly maneuverable aircraft technology (HiMAT) vehicle. This report describes the HiMAT vehicle systems, maneuver requirements, FTMAP development process, and flight results.

  10. Spline function approximation techniques for image geometric distortion representation. [for registration of multitemporal remote sensor imagery

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.

    1975-01-01

    Least squares approximation techniques were developed for use in computer aided correction of spatial image distortions for registration of multitemporal remote sensor imagery. Polynomials were first used to define image distortion over the entire two dimensional image space. Spline functions were then investigated to determine if the combination of lower order polynomials could approximate a higher order distortion with less computational difficulty. Algorithms for generating approximating functions were developed and applied to the description of image distortion in aircraft multispectral scanner imagery. Other applications of the techniques were suggested for earth resources data processing areas other than geometric distortion representation.
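
    A least-squares fit of a low-order 2D polynomial distortion model, of the kind described above, can be sketched as follows; the control points here are synthetic (a small rotation plus shift), not scanner data.

      import numpy as np

      def fit_poly2d(src, dst, order=2):
          """Least-squares 2D polynomial warp: dst ~ sum_{i+j<=order} c_ij x^i y^j."""
          x, y = src[:, 0], src[:, 1]
          terms = [x**i * y**j for i in range(order + 1)
                               for j in range(order + 1 - i)]
          A = np.column_stack(terms)
          cx, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
          cy, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
          return cx, cy

      rng = np.random.default_rng(1)
      src = rng.uniform(0, 512, (50, 2))                 # synthetic control points
      th = 0.01
      R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
      dst = src @ R.T + [3.0, -2.0]                      # known distortion to recover
      cx, cy = fit_poly2d(src, dst)
      print(cx[:3])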

  11. EU-FP7-iMARS: Analysis of Mars Multi-Resolution Images Using Auto-Coregistration Data Mining and Crowd Source Techniques: Processed Results - a First Look

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Tao, Yu; Sidiropoulos, Panagiotis; Gwinner, Klaus; Willner, Konrad; Fanara, Lida; Waehlisch, Marita; van Gasselt, Stephan; Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Ivanov, Anton; Cantini, Federico; Wardlaw, Jessica; Morley, Jeremy; Sprinks, James; Giordano, Michele; Marsh, Stuart; Kim, Jungrack; Houghton, Robert; Bamford, Steven

    2016-06-01

    Understanding planetary atmosphere-surface exchange and extra-terrestrial surface formation processes within our Solar System is one of the fundamental goals of planetary science research. There has been a revolution in planetary surface observations over the last 15 years, especially in 3D imaging of surface shape. This has led to the ability to overlay image data and derived information from different epochs, back in time to the mid 1970s, to examine changes through time, such as the recent discovery of mass movement, the tracking of inter-year seasonal changes and the search for fresh craters. Within the EU FP-7 iMars project, we have developed a fully automated multi-resolution DTM processing chain, called the Coregistration ASP-Gotcha Optimised (CASP-GO) chain, based on the open source NASA Ames Stereo Pipeline (ASP) [Tao et al., this conference], which is being applied to the production of planetwide DTMs and ORIs (OrthoRectified Images) from CTX and HiRISE. Alongside the production of individual strip CTX & HiRISE DTMs & ORIs, DLR [Gwinner et al., 2015] have processed HRSC mosaics of ORIs and DTMs for complete areas in a consistent manner using photogrammetric bundle block adjustment techniques. A novel automated co-registration and orthorectification chain has been developed by [Sidiropoulos & Muller, this conference]; using the HRSC map products (both mosaics and orbital strips) as a map-base, it is being applied to many of the 400,000 level-1 EDR images taken by the four NASA orbital cameras, in particular the Viking Orbiter camera (VO), Mars Orbiter Camera (MOC), Context Camera (CTX) and High Resolution Imaging Science Experiment (HiRISE), going back to 1976. A webGIS has been developed [van Gasselt et al., this conference] for displaying this time sequence of imagery and will be demonstrated with an example from one of the HRSC quadrangle map-sheets. Automated quality control techniques [Sidiropoulos & Muller, 2015] are applied to screen for suitable images, and these are extended to detect temporal changes in surface features such as mass movements, streaks, spiders, impact craters, CO2 geysers and Swiss Cheese terrain. For result verification these data mining techniques are then being employed within a citizen science project within the Zooniverse family. Examples of data mining and its verification will be presented.

  12. Blob-enhanced reconstruction technique

    NASA Astrophysics Data System (ADS)

    Castrillo, Giusy; Cafiero, Gioacchino; Discetti, Stefano; Astarita, Tommaso

    2016-09-01

    A method to enhance the quality of the tomographic reconstruction, and consequently the 3D velocity measurement accuracy, is presented. The technique is based on integrating information on the objects to be reconstructed within the algebraic reconstruction process. A first-guess intensity distribution is produced with a standard algebraic method; the distribution is then rebuilt as a sum of Gaussian blobs, based on the location, intensity and size of agglomerates of light intensity surrounding local maxima. The blob substitution regularizes the particle shape, allowing a reduction of the particle discretization errors and of their elongation in the depth direction. The performance of the blob-enhanced reconstruction technique (BERT) is assessed with a 3D synthetic experiment. The results have been compared with those obtained by applying the standard camera simultaneous multiplicative reconstruction technique (CSMART) to the same volume. Several blob-enhanced reconstruction processes have been tested, both substituting the blobs at the end of the CSMART algorithm and during the iterations (i.e. using the blob-enhanced reconstruction as a predictor for the following iterations). The results confirm the enhancement in velocity measurement accuracy, demonstrating a reduction of the bias error due to ghost particles. The improvement is more remarkable at the largest tested seeding densities. Additionally, using the blob distributions as a predictor further improves the convergence of the reconstruction algorithm, the more so when the blobs are substituted more than once during the process. The BERT process is also applied to multi-resolution (MR) CSMART reconstructions, simultaneously achieving remarkable improvements in the flow field measurements and benefiting from the reduction in computational time due to the MR approach. Finally, BERT is also tested on experimental data, obtaining an increase of the signal-to-noise ratio in the reconstructed flow field and a higher correlation factor in the velocity measurements relative to the reconstruction in which the particles are not replaced by blobs.
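
    A much-simplified 2D sketch of the blob substitution step: local maxima of a first-guess intensity field are located and the field is rebuilt as a sum of Gaussian blobs. The fixed blob width and the toy field are assumptions; the actual BERT method also adapts blob size and intensity and operates on 3D volumes.

      import numpy as np
      from scipy.ndimage import maximum_filter, label, center_of_mass

      def blob_rebuild(field, sigma=1.0, thresh=0.1):
          """Rebuild an intensity field as a sum of Gaussian blobs centred on
          local maxima (2D stand-in for the 3D blob substitution)."""
          peaks = (field == maximum_filter(field, size=3)) & (field > thresh)
          lbl, n = label(peaks)
          centres = center_of_mass(field, lbl, range(1, n + 1))
          yy, xx = np.mgrid[0:field.shape[0], 0:field.shape[1]]
          out = np.zeros_like(field, dtype=float)
          for (cy, cx) in centres:
              amp = field[int(round(cy)), int(round(cx))]
              out += amp * np.exp(-((yy - cy)**2 + (xx - cx)**2) / (2 * sigma**2))
          return out

      f = np.zeros((64, 64)); f[20, 30] = 1.0; f[40, 10] = 0.8   # two toy "particles"
      print(blob_rebuild(f).max())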

  13. Assessment of Spatial and Temporal Variation of Surface Water Quality in Streams Affected by Coalbed Methane Development

    NASA Astrophysics Data System (ADS)

    Chitrakar, S.; Miller, S. N.; Liu, T.; Caffrey, P. A.

    2015-12-01

    Water quality data have been collected from three representative stream reaches in a coalbed methane (CBM) development area for over five years to improve the understanding of salt loading in the system. These streams are located within the Atlantic Rim development area of Muddy Creek in south-central Wyoming, where significant development of CBM wells is ongoing. The sampling reaches comprised Duck Pond Draw and Cow Creek, which receive co-produced water, and South Fork Creek and upstream Cow Creek, which do not. Water samples were assayed for sodium, calcium, magnesium, fluoride, chloride, nitrate, O-phosphate, sulfate, carbonate and bicarbonate, along with other water quality parameters such as pH, conductivity, and TDS. Based on these water quality parameters we have investigated the various hydrochemical and geochemical processes responsible for the high variability in water quality in the region. However, effective interpretation of complex databases to understand the aforementioned processes is challenging due to the system's complexity. In this work we applied multivariate statistical techniques, including cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA), to analyze the water quality data and identify similarities and differences among our locations. First, CA was applied to group the monitoring sites based on multivariate similarity. Second, PCA was applied to identify the prevalent parameters responsible for the variation of water quality in each group. Third, DA was used to identify the most important factors responsible for the variation of water quality during the low-flow and high-flow seasons. The purpose of this study is to improve the understanding of the factors and sources influencing the spatial and temporal variation of water quality. The ultimate goal of this research is to develop a coupled salt loading and GIS-based hydrological modelling tool able to simulate salt loadings under user-defined scenarios in regions undergoing CBM development. The findings from this study will therefore be used to formulate the predominant processes responsible for solute loading.
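
    The CA/PCA/DA sequence described above maps naturally onto standard library calls. A minimal sketch on a stand-in data matrix follows; the real assay data are not reproduced in the record, so random values and a random season label are used purely for illustration.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      X = rng.standard_normal((60, 11))        # rows: samples; columns: ions + field params
      season = rng.integers(0, 2, 60)          # 0 = low flow, 1 = high flow (assumed label)

      Xs = StandardScaler().fit_transform(X)

      # CA: group samples by multivariate similarity (Ward linkage, 3 clusters)
      groups = fcluster(linkage(Xs, method="ward"), t=3, criterion="maxclust")

      # PCA: dominant directions of variance across the parameters
      pca = PCA(n_components=3).fit(Xs)
      print(pca.explained_variance_ratio_)

      # DA: parameters that best separate low- and high-flow seasons
      lda = LinearDiscriminantAnalysis().fit(Xs, season)
      print(np.argsort(np.abs(lda.coef_[0]))[::-1][:3])   # top-3 discriminating columns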

  14. Finite Element Modeling of the Thermographic Inspection for Composite Materials

    NASA Technical Reports Server (NTRS)

    Bucinell, Ronald B.

    1996-01-01

    The performance of composite materials is dependent on the constituent materials selected, material structural geometry, and the fabrication process. Flaws can form in composite materials as a result of the fabrication process, handling in the manufacturing environment, and exposure in the service environment to anomalous activity. Often these flaws show no indication on the surface of the material while having the potential of substantially degrading the integrity of the composite structure. For this reason it is important to have available inspection techniques that can reliably detect sub-surface defects such as inter-ply disbonds, inter-ply cracks, porosity, and density changes caused by variations in fiber volume content. Many non-destructive evaluation techniques (NDE) are capable of detecting sub-surface flaws in composite materials. These include shearography, video image correlation, ultrasonic, acoustic emissions, and X-ray. The difficulty with most of these techniques is that they are time consuming and often difficult to apply to full scale structures. An NDE technique that appears to have the capability to quickly and easily detect flaws in composite structure is thermography. This technique uses heat to detect flaws. Heat is applied to the surface of a structure with the use of a heat lamp or heat gun. A thermographic camera is then pointed at the surface and records the surface temperature as the composite structure cools. Flaws in the material will cause the thermal-mechanical material response to change. Thus, the surface over an area where a flaw is present will cool differently than regions where flaws do not exist. This paper discusses the effort made to thermo-mechanically model the thermography process. First the material properties and physical parameters used in the model will be explained. This will be followed by a detailed discussion of the finite element model used. Finally, the result of the model will be summarized along with recommendations for future work.
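
    The physical intuition in the paragraph above, that a sub-surface flaw changes how the surface cools, can be illustrated with a crude 1D explicit finite-difference conduction model. The diffusivity values, flaw depth and insulated boundaries are assumptions for illustration only, not the paper's finite element model.

      import numpy as np

      def surface_cooling(alpha, dz=1e-4, dt=5e-3, steps=2000, T0=100.0):
          """Surface temperature history after a flash heats a thin surface layer.
          alpha: through-thickness diffusivity profile (m^2/s), one value per cell."""
          T = np.zeros(alpha.size); T[:3] = T0     # heat deposited near the surface
          r = alpha * dt / dz**2
          assert r.max() < 0.5, "explicit scheme stability limit"
          hist = []
          for _ in range(steps):
              T[1:-1] = T[1:-1] + r[1:-1] * (T[2:] - 2 * T[1:-1] + T[:-2])
              T[0] = T[1]                          # adiabatic front face (camera side)
              T[-1] = T[-2]                        # adiabatic back face
              hist.append(T[0])
          return np.array(hist)

      sound = np.full(100, 4e-7)                   # assumed composite diffusivity
      flawed = sound.copy(); flawed[10:15] = 4e-8  # disbond as a low-diffusivity layer
      # a flaw keeps the surface warmer: positive difference late in the cooldown
      print(surface_cooling(flawed)[-1] - surface_cooling(sound)[-1])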

  15. Thin-film-transistor array: an exploratory attempt for high throughput cell manipulation using electrowetting principle

    NASA Astrophysics Data System (ADS)

    Shaik, F. Azam; Cathcart, G.; Ihida, S.; Lereau-Bernier, M.; Leclerc, E.; Sakai, Y.; Toshiyoshi, H.; Tixier-Mita, A.

    2017-05-01

    In lab-on-a-chip (LoC) devices, microfluidic displacement of liquids is a key component. Electrowetting on dielectric (EWOD) is a technique to move fluids, with the advantage of not requiring channels, pumps or valves. Fluids are discretized into droplets on microelectrodes and moved by applying an electric field via the electrodes to manipulate the contact angle. Micro-objects, such as biological cells, can be transported inside these droplets. However, the design of conventional microelectrodes, made by standard micro-fabrication techniques, fixes the path of the droplets, limiting the reconfigurability of paths and thus the parallel processing of droplets. In that respect, thin film transistor (TFT) technology presents a great opportunity, as it allows infinitely reconfigurable paths with high parallelizability. We propose here to investigate the possibility of using TFT array devices for high-throughput cell manipulation using EWOD. A COMSOL-based 2D simulation coupled with a MATLAB algorithm was used to simulate the contact angle modulation, displacement and mixing of droplets. These simulations were confirmed by experimental results. The EWOD technique was applied to a droplet of culture medium containing HepG2 carcinoma cells and demonstrated no negative effects on the viability of the cells. This confirms the possibility of applying EWOD techniques to cellular applications, such as parallel cell analysis.
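
    The contact-angle manipulation underlying EWOD is commonly described by the Young-Lippmann equation, cos θ(V) = cos θ0 + ε0 εr V² / (2 d γ). A small sketch with assumed dielectric and liquid parameters (not taken from the record) shows the modulation with voltage:

      import numpy as np

      def contact_angle(V, theta0_deg=115.0, eps_r=3.0, d=1e-6, gamma=0.050):
          """Young-Lippmann contact angle (deg) at applied voltage V.
          Assumed: theta0 = 115 deg, eps_r = 3, 1 um dielectric, gamma = 50 mN/m."""
          eps0 = 8.854e-12
          c = np.cos(np.radians(theta0_deg)) + eps0 * eps_r * V**2 / (2 * d * gamma)
          return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))   # clip: saturation

      for V in (0, 20, 40, 60):
          print(V, round(contact_angle(V), 1))   # angle decreases as V rises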

  16. A Comparison of Inductive Sensors in the Characterization of Partial Discharges and Electrical Noise Using the Chromatic Technique.

    PubMed

    Ardila-Rey, Jorge Alfredo; Montaña, Johny; de Castro, Bruno Albuquerque; Schurch, Roger; Covolan Ulson, José Alfredo; Muhammad-Sukki, Firdaus; Bani, Nurul Aini

    2018-03-29

    Partial discharges (PDs) are one of the most important classes of ageing processes that occur within electrical insulation. PD detection is a standardized technique to qualify the state of the insulation in electric assets such as machines and power cables. Generally, the classical phase-resolved partial discharge (PRPD) patterns are used to perform the identification of the type of PD source when they are related to a specific degradation process and when the electrical noise level is low compared to the magnitudes of the PD signals. However, in practical applications such as measurements carried out in the field or in industrial environments, several PD sources and large noise signals are usually present simultaneously. In this study, three different inductive sensors have been used to evaluate and compare their performance in the detection and separation of multiple PD sources by applying the chromatic technique to each of the measured signals.

  17. Thermophysical Properties Measurements of Zr62Cu20Al10Ni8

    NASA Technical Reports Server (NTRS)

    Bradshaw, Richard C.; Waren, Mary; Rogers, Jan R.; Rathz, Thomas J.; Gangopadhyay, Anup K.; Kelton, Ken F.; Hyers, Robert W.

    2006-01-01

    Thermophysical property studies performed at high temperature can prove challenging because of reactivity problems brought on by the elevated temperatures. Contaminants from measuring devices and container walls can cause changes in properties. To prevent this, containerless processing techniques can be employed to isolate a sample during study. A common method used for this is levitation. Typical levitation methods used for containerless processing are aerodynamically, electromagnetically and electrostatically based. All levitation methods reduce heterogeneous nucleation sites, which in turn provides access to metastable undercooled phases. In particular, electrostatic levitation is appealing because sample motion and stirring are minimized, and by combining it with optically based non-contact measuring techniques, many thermophysical properties can be measured. Applying some of these techniques, the surface tension, viscosity and density of the glass-forming alloy Zr62Cu20Al10Ni8 have been measured and will be presented with a brief overview of the non-contact measuring methods used.
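
    As one example of an optically based non-contact measurement, surface tension can be extracted from the n = 2 oscillation frequency of a levitated drop via the Rayleigh relation ω² = 32πγ/(3m). The sample mass and frequency below are hypothetical, not values from the record.

      import numpy as np

      def surface_tension(mass_kg, f2_hz):
          """Rayleigh relation for the n = 2 mode of a free levitated drop:
          omega^2 = 32*pi*gamma / (3*m)  =>  gamma = 3*m*(2*pi*f)^2 / (32*pi)."""
          return 3.0 * mass_kg * (2.0 * np.pi * f2_hz) ** 2 / (32.0 * np.pi)

      # hypothetical 40 mg Zr-based sample oscillating at 160 Hz
      print(surface_tension(40e-6, 160.0))   # N/m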

  18. Image processing for x-ray inspection of pistachio nuts

    NASA Astrophysics Data System (ADS)

    Casasent, David P.

    2001-03-01

    A review is provided of image processing techniques that have been applied to the inspection of pistachio nuts using X-ray images. X-ray sensors provide non-destructive internal product detail not available from other sensors. The primary concern in this data is detecting the presence of worm infestations in nuts, since these have been linked to the presence of aflatoxin. We describe new techniques for segmentation, feature selection, selection of product categories (clusters), classifier design, etc. Specific novel results include: a new segmentation algorithm that produces images of isolated product items; preferable classifier operating points (the classifier with the best probability of correct recognition Pc is not necessarily the best choice); higher-order discrimination information present in standard features (thus, high-order features appear useful); and classifiers that use new cluster categories of samples to achieve improved performance. Results are presented for X-ray images of pistachio nuts; however, all techniques have use in other product inspection applications.
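
    A simplified stand-in for the segmentation stage, isolating product items by global thresholding and connected-component labelling; the record's segmentation algorithm is more elaborate, and the threshold rule and synthetic frame here are assumptions.

      import numpy as np
      from scipy import ndimage

      def isolate_items(xray, thresh=None, min_pixels=50):
          """Segment an X-ray frame into isolated product items: threshold,
          clean up with a morphological opening, then label connected components."""
          img = xray.astype(float)
          if thresh is None:
              thresh = img.mean() + img.std()     # crude global threshold (assumed)
          mask = ndimage.binary_opening(img > thresh, iterations=1)
          lbl, n = ndimage.label(mask)
          items = [np.argwhere(lbl == i) for i in range(1, n + 1)]
          return [it for it in items if len(it) >= min_pixels]

      frame = np.random.rand(256, 256)
      frame[50:90, 60:100] += 2.0                 # synthetic "nut" region
      print(len(isolate_items(frame)))            # -> 1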

  19. Solvent-free melting techniques for the preparation of lipid-based solid oral formulations.

    PubMed

    Becker, Karin; Salar-Behzadi, Sharareh; Zimmer, Andreas

    2015-05-01

    Lipid excipients are applied for numerous purposes such as taste masking, controlled release, improvement of swallowability and moisture protection. Several melting techniques have evolved in recent decades; common examples are melt coating, melt granulation and melt extrusion. The required equipment ranges from ordinary glass beakers at lab scale up to large machines such as fluid bed coaters, spray dryers or extruders, which allows for upscaling to pilot or production scale. Solvent-free melt processing provides a cost-effective, time-saving and eco-friendly method for the food and pharmaceutical industries. This review intends to give a critical overview of the published literature on experiences, formulations and challenges, and to show possibilities for future developments in this promising field. Moreover, it should serve as a guide for selecting the best excipients and manufacturing techniques for the development of a product with specific properties using solvent-free melt processing.

  20. Denoising and segmentation of retinal layers in optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Dash, Puspita; Sigappi, A. N.

    2018-04-01

    Optical Coherence Tomography (OCT) is an imaging technique used to localize the intra-retinal boundaries for the diagnosis of macular diseases. Due to speckle noise and low image contrast, accurate segmentation of individual retinal layers is difficult. For this reason, a method for retinal layer segmentation from OCT images is presented. This paper proposes a pre-processing filtering approach for denoising, together with a graph-based technique for segmenting the retinal layers in OCT images. These techniques are applied to the segmentation of retinal layers for normal subjects as well as patients with Diabetic Macular Edema (DME). An algorithm based on gradient information and shortest-path search is applied to optimize the edge selection. In this paper the four main layers of the retina are segmented, namely the internal limiting membrane (ILM), retinal pigment epithelium (RPE), inner nuclear layer (INL) and outer nuclear layer (ONL). The proposed method is applied to a database of OCT images of ten normal and twenty DME-affected patients, and the results are found to be promising.
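
    A minimal sketch of a gradient-cost, shortest-path boundary trace of the kind described, here implemented with dynamic programming over a synthetic B-scan rather than the paper's full graph-based pipeline; the one-row connectivity constraint is an assumption.

      import numpy as np

      def segment_layer(image):
          """Trace one retinal boundary as the minimum-cost left-to-right path
          through an inverted vertical-gradient image (dynamic programming)."""
          grad = np.abs(np.gradient(image.astype(float), axis=0))
          cost = grad.max() - grad                 # strong edges become cheap
          nr, nc = cost.shape
          acc = cost.copy()
          for c in range(1, nc):                   # accumulate column by column
              for r in range(nr):
                  lo, hi = max(0, r - 1), min(nr, r + 2)
                  acc[r, c] += acc[lo:hi, c - 1].min()
          path = [int(acc[:, -1].argmin())]        # backtrack the cheapest path
          for c in range(nc - 2, -1, -1):
              r = path[-1]
              lo, hi = max(0, r - 1), min(nr, r + 2)
              path.append(lo + int(acc[lo:hi, c].argmin()))
          return path[::-1]                        # boundary row index per column

      b = np.zeros((64, 128)); b[30:] = 1.0        # synthetic two-layer B-scan
      print(segment_layer(b)[:5])                  # rows near the step at row 30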
