Science.gov

Sample records for biomedical signal processing

  1. Biomedical signal and image processing.

    PubMed

    Cerutti, Sergio; Baselli, Giuseppe; Bianchi, Anna; Caiani, Enrico; Contini, Davide; Cubeddu, Rinaldo; Dercole, Fabio; Rienzo, Luca; Liberati, Diego; Mainardi, Luca; Ravazzani, Paolo; Rinaldi, Sergio; Signorini, Maria; Torricelli, Alessandro

    2011-01-01

    Generally, physiological modeling and biomedical signal processing constitute two important paradigms of biomedical engineering (BME): their fundamental concepts are taught starting from undergraduate studies and are more completely dealt with in the last years of graduate curricula, as well as in Ph.D. courses. Traditionally, these two cultural aspects were separated, with the first one more oriented to physiological issues and how to model them and the second one more dedicated to the development of processing tools or algorithms to enhance useful information from clinical data. A practical consequence was that those who did models did not do signal processing and vice versa. However, in recent years, the need for closer integration between signal processing and modeling of the relevant biological systems emerged very clearly [1], [2]. This is not only true for training purposes (i.e., to properly prepare the new professional members of BME) but also for the development of newly conceived research projects in which the integration between biomedical signal and image processing (BSIP) and modeling plays a crucial role. Just to give simple examples, topics such as brain–computer machine or interfaces, neuroengineering, nonlinear dynamical analysis of the cardiovascular (CV) system, integration of sensory-motor characteristics aimed at the building of advanced prostheses and rehabilitation tools, and wearable devices for vital sign monitoring, among others, do require an intelligent fusion of modeling and signal processing competences that are certainly peculiar to our discipline of BME.

  2. [A biomedical signal processing toolkit programmed by Java].

    PubMed

    Xie, Haiyuan

    2012-09-01

    A new biomedical signal processing toolkit, tailored to the characteristics of biomedical signals, has been developed. The toolkit is programmed in Java and is used for basic digital signal processing, random signal processing, and related tasks. All the methods in the toolkit have been tested, and the program is robust. The features of the toolkit are explained in detail; it is easy to use and practical.

  3. Software for biomedical engineering signal processing laboratory experiments.

    PubMed

    Tompkins, Willis J; Wilson, J

    2009-01-01

    In the early 1990s we developed a special computer program called UW DigiScope to provide a mechanism for anyone interested in biomedical digital signal processing to study the field without requiring any instrument other than a personal computer. There are many digital filtering and pattern recognition algorithms used in processing biomedical signals. In general, students have very limited opportunity to have hands-on access to the mechanisms of digital signal processing. In a typical course, the filters are designed non-interactively, which does not provide the student with significant understanding of the design constraints of such filters or their actual performance characteristics. UW DigiScope 3.0 is the first major update since version 2.0 was released in 1994. This paper provides details on how the new version, based on MATLAB, works with signals, including the filter design tool that is the programming interface between UW DigiScope and processing algorithms.

  4. SoC-based architecture for biomedical signal processing.

    PubMed

    Gutiérrez-Rivas, R; Hernández, A; García, J J; Marnane, W

    2015-01-01

    Over the last decades, many algorithms have been proposed for processing biomedical signals. Most of these algorithms have been focused on the elimination of noise and artifacts existing in these signals, so they can be used for automatic monitoring and/or diagnosis applications. With regard to remote monitoring, the use of portable devices often requires a reduced number of resources and low power consumption, making it necessary to reach a trade-off between the accuracy of algorithms and their computational complexity. This paper presents a SoC (System-on-Chip) architecture, based on an FPGA (Field-Programmable Gate Array) device, suitable for the implementation of biomedical signal processing. The proposal has been successfully validated by implementing an efficient QRS complex detector. The results show that, using a reduced amount of resources, values of sensitivity and positive predictive value above 99.49% are achieved, which makes the proposed approach suitable for telemedicine applications.
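
    The detector above is scored by sensitivity and positive predictive value. Below is a minimal sketch of how these two metrics are typically computed from detected and annotated beat positions; the 50 ms matching tolerance and 360 Hz sampling rate are illustrative assumptions, not values from the record.

```python
import numpy as np

def qrs_detection_metrics(detected, reference, tol_s=0.05, fs=360):
    """Sensitivity (Se) and positive predictive value (+P) for QRS detections.

    detected, reference: sample indices of detected and annotated R peaks.
    tol_s: matching tolerance in seconds (0.05 s is an illustrative choice).
    """
    tol = int(tol_s * fs)
    detected = np.asarray(detected)
    tp = sum(1 for r in reference
             if detected.size and np.min(np.abs(detected - r)) <= tol)
    fn = len(reference) - tp          # annotated beats that were missed
    fp = len(detected) - tp           # detections with no matching annotation
    return tp / (tp + fn), tp / (tp + fp)   # Se, +P

# toy usage with hypothetical annotation and detection indices
se, ppv = qrs_detection_metrics([100, 460, 830], [102, 458, 825, 1200])
print(f"Se = {se:.3f}, +P = {ppv:.3f}")
```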

  5. Demystifying biomedical signals: a student centred approach to learning signal processing.

    PubMed

    Simpson, D M; De Stefano, A; Allen, R; Lutman, M E

    2005-09-01

    The processing and analysis of physiological signals has become firmly established in clinical medicine and biomedical research. Many of the users of this technology however do not come from an engineering or science background, and traditional approaches in teaching signal processing are thus not appropriate for them. We have therefore developed a series of modular courses that are aimed specifically at an audience with a background in medicine, health-care or the life-sciences. In these courses, we focus on the concepts, principles and rationale of applying signal processing methods, rather than the mathematical foundations of the techniques. Thus, we aim to remove some of the perceived 'mystery' often surrounding this subject. The very practical approach, with hands-on experience using the MATLAB software, has been well received, with strong evidence that students have learnt to apply their knowledge. This paper describes the learning and teaching approach taken, and some of the experience acquired.

  6. BioSig: The Free and Open Source Software Library for Biomedical Signal Processing

    PubMed Central

    Vidaurre, Carmen; Sander, Tilmann H.; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals. PMID:21437227

  7. BioSig: the free and open source software library for biomedical signal processing.

    PubMed

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  8. Noise-assisted data processing with empirical mode decomposition in biomedical signals.

    PubMed

    Karagiannis, Alexandros; Constantinou, Philip

    2011-01-01

    In this paper, a methodology is described in order to investigate the performance of empirical mode decomposition (EMD) in biomedical signals, and especially in the case of the electrocardiogram (ECG). Synthetic ECG signals corrupted with white Gaussian noise are employed and time series of various lengths are processed with EMD in order to extract the intrinsic mode functions (IMFs). A statistical significance test is implemented for the identification of IMFs with high-level noise components and their exclusion from denoising procedures. Simulation campaign results reveal that a decrease of processing time is accomplished with the introduction of a preprocessing stage prior to the application of EMD in biomedical time series. Furthermore, the variation in the number of IMFs according to the type of the preprocessing stage is studied as a function of SNR and time-series length. The application of the methodology to MIT-BIH ECG records is also presented in order to verify the findings in real ECG signals.
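
    A minimal sketch of the partial-reconstruction step described above, assuming the PyEMD package (installed as `EMD-signal`) for the decomposition; the simple "drop the first IMFs" rule below stands in for the paper's statistical significance test and is an assumption, not the authors' procedure.

```python
import numpy as np
from PyEMD import EMD   # assumed dependency providing the EMD implementation

def emd_partial_reconstruction(x, n_noisy=2):
    """Denoise a 1-D signal by discarding the first IMFs, which tend to
    concentrate high-frequency noise; the remaining IMFs and the residue
    are summed to rebuild the signal."""
    imfs = EMD().emd(np.asarray(x, dtype=float))
    return imfs[n_noisy:].sum(axis=0)

# toy usage: synthetic low-frequency signal corrupted by white Gaussian noise
fs = 360
t = np.arange(0, 10, 1.0 / fs)
clean = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 8 * t)
denoised = emd_partial_reconstruction(clean + 0.2 * np.random.randn(t.size))
```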

  9. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    PubMed

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper.

  10. Digital signal processing by virtual instrumentation of a MEMS magnetic field sensor for biomedical applications.

    PubMed

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M; Manjarrez, Elías; Tapia, Jesús A; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A; Herrera-May, Agustín L

    2013-11-05

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with an operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a behavior similar to that of the rat electromyogram (EMG).

  11. Digital Signal Processing by Virtual Instrumentation of a MEMS Magnetic Field Sensor for Biomedical Applications

    PubMed Central

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M.; Manjarrez, Elías; Tapia, Jesús A.; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A.; Herrera-May, Agustín L.

    2013-01-01

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with an operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a behavior similar to that of the rat electromyogram (EMG). PMID:24196434
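
    The virtual instrument described in the two records above implements configurable IIR filters in software. A minimal SciPy sketch of designing and applying one such filter is shown below; the filter family, order, cutoff and sampling rate are illustrative assumptions rather than the prototype's actual settings.

```python
import numpy as np
from scipy import signal

fs = 1000.0   # assumed sampling rate in Hz
# 4th-order Butterworth low-pass IIR filter with a 30 Hz cutoff (illustrative values)
sos = signal.butter(4, 30.0, btype="lowpass", fs=fs, output="sos")

t = np.arange(0, 2, 1 / fs)
raw = np.sin(2 * np.pi * 2 * t) + 0.1 * np.sin(2 * np.pi * 60 * t)  # slow signal + 60 Hz interference
filtered = signal.sosfiltfilt(sos, raw)   # zero-phase filtering for offline analysis
```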

  12. Bioinspired Polarization Imaging Sensors: From Circuits and Optics to Signal Processing Algorithms and Biomedical Applications

    PubMed Central

    York, Timothy; Powell, Samuel B.; Gao, Shengkui; Kahan, Lindsey; Charanya, Tauseef; Saha, Debajit; Roberts, Nicholas W.; Cronin, Thomas W.; Marshall, Justin; Achilefu, Samuel; Lake, Spencer P.; Raman, Baranidharan; Gruev, Viktor

    2015-01-01

    In this paper, we present recent work on bioinspired polarization imaging sensors and their applications in biomedicine. In particular, we focus on three different aspects of these sensors. First, we describe the electro–optical challenges in realizing a bioinspired polarization imager, and in particular, we provide a detailed description of a recent low-power complementary metal–oxide–semiconductor (CMOS) polarization imager. Second, we focus on signal processing algorithms tailored for this new class of bioinspired polarization imaging sensors, such as calibration and interpolation. Third, the emergence of these sensors has enabled rapid progress in characterizing polarization signals and environmental parameters in nature, as well as several biomedical areas, such as label-free optical neural recording, dynamic tissue strength analysis, and early diagnosis of flat cancerous lesions in a murine colorectal tumor model. We highlight results obtained from these three areas and discuss future applications for these sensors. PMID:26538682

  13. Biomedical image processing.

    PubMed

    Huang, H K

    1981-01-01

    Biomedical image processing is a very broad field; it covers biomedical signal gathering, image forming, picture processing, and image display, through to medical diagnosis based on features extracted from images. This article reviews this topic in both its fundamentals and applications. In its fundamentals, some basic image processing techniques including outlining, deblurring, noise cleaning, filtering, search, classical analysis and texture analysis are reviewed together with examples. State-of-the-art image processing systems are introduced and discussed in two categories: general-purpose image processing systems and image analyzers. In order for these systems to be effective for biomedical applications, special biomedical image processing languages have to be developed. The combination of both hardware and software leads to clinical imaging devices. Two different types of clinical imaging devices are discussed. One is radiological imaging, which includes radiography, thermography, ultrasound, nuclear medicine and CT. Among these, thermography is the most noninvasive but is limited in application due to the low energy of its source. X-ray CT is excellent for static anatomical images and is moving toward the measurement of dynamic function, whereas nuclear imaging is moving toward organ metabolism and ultrasound toward tissue physical characteristics. Heart imaging is one of the most interesting and challenging research topics in biomedical image processing; current methods, including the invasive technique of cineangiography and the noninvasive ultrasound, nuclear medicine, transmission, and emission CT methodologies, are reviewed. Two current federally funded research projects in heart imaging, the dynamic spatial reconstructor and the dynamic cardiac three-dimensional densitometer, should bring some fruitful results in the near future. Microscopic imaging techniques are very different from the radiological imaging techniques in the sense that

  14. In the spotlight: biomedical signal processing--a well established discipline or a paradigm to promising integrated visions?

    PubMed

    Cerutti, Sergio

    2009-01-01

    Biomedical signals carry fundamental information about the nature and the status of the living systems under study. Proper processing of these signals yields useful physiological and clinical information. A closer integration between signal processing and modeling of the relevant biological systems makes it possible to directly attribute pathophysiological meaning to the parameters obtained from the processing, and vice versa. Another issue of great interest in which BSP plays an important role is the Brain-Computer Interface (BCI) or Brain-Machine Interface (BMI), where fast and reliable signal processing approaches are fundamental for a practical implementation. The physiological mechanisms underlying heart rate variability findings are thought to be related to stochastic processes at the cellular level, to the influence of respiration on the heart rate, and to the interactions of the multiple feedback loops regulating the cardiovascular system. Another important area in which BSP plays a pivotal role is "computational genomics and proteomics." It is true that "traditional" biomedical engineering studies biomedical signals which are obtained at the level of the major physiological systems.

  15. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and, over a period of time, may cause serious damage to many of the body's systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation.

  16. Accelerating Biomedical Signal Processing Using GPU: A Case Study of Snore Sound Feature Extraction.

    PubMed

    Guo, Jian; Qian, Kun; Zhang, Gongxuan; Xu, Huijie; Schuller, Björn

    2017-09-25

    The advent of 'Big Data' and 'Deep Learning' offers both a great challenge and a huge opportunity for personalised health-care. In machine learning-based biomedical data analysis, feature extraction is a key step for 'feeding' the subsequent classifiers. With increasing amounts of biomedical data, extracting features from these 'big' data is an intensive and time-consuming task. In this case study, we employ a Graphics Processing Unit (GPU) via Python to extract features from a large corpus of snore sound data. Those features can subsequently be imported into many well-known deep learning training frameworks without any format processing. The snore sound data were collected from several hospitals (20 subjects, with 770-990 MB per subject - in total 17.20 GB). Experimental results show that our GPU-based processing significantly speeds up the feature extraction phase, by up to seven times, as compared to the previous CPU system.
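
    The record reports GPU-accelerated feature extraction via Python without naming its libraries; the sketch below uses CuPy (a NumPy-compatible GPU array library) as an assumed stand-in to show how frame-wise spectral features can be computed on a GPU. The feature set and frame layout are illustrative, not those of the snore-sound study.

```python
import cupy as cp   # assumed dependency; requires an NVIDIA GPU

def spectral_features(frames):
    """Frame-wise log-energy and spectral centroid computed on the GPU.

    frames: (n_frames, frame_len) array of audio frames.
    """
    x = cp.asarray(frames, dtype=cp.float32)          # host -> device transfer
    spec = cp.abs(cp.fft.rfft(x, axis=1))
    energy = cp.log(cp.sum(spec ** 2, axis=1) + 1e-12)
    freqs = cp.fft.rfftfreq(x.shape[1])
    centroid = cp.sum(spec * freqs, axis=1) / (cp.sum(spec, axis=1) + 1e-12)
    return cp.asnumpy(cp.stack([energy, centroid], axis=1))   # back to host memory
```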

  17. Independent component analysis for biomedical signals.

    PubMed

    James, Christopher J; Hesse, Christian W

    2005-02-01

    Independent component analysis (ICA) is increasing in popularity in the field of biomedical signal processing. It is generally used when it is required to separate measured multi-channel biomedical signals into their constituent underlying components. The use of ICA has been facilitated in part by the free availability of toolboxes that implement popular flavours of the techniques. Fundamentally ICA in biomedicine involves the extraction and separation of statistically independent sources underlying multiple measurements of biomedical signals. Technical advances in algorithmic developments implementing ICA are reviewed along with new directions in the field. These advances are specifically summarized with applications to biomedical signals in mind. The basic assumptions that are made when applying ICA are discussed, along with their implications when applied particularly to biomedical signals. ICA as a specific embodiment of blind source separation (BSS) is also discussed, and as a consequence the criterion used for establishing independence between sources is reviewed and this leads to the introduction of ICA/BSS techniques based on time, frequency and joint time-frequency decomposition of the data. Finally, advanced implementations of ICA are illustrated as applied to neurophysiologic signals in the form of electro-magnetic brain signals data.
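
    A minimal scikit-learn sketch of the blind source separation task described above: two synthetic sources are mixed into two observed channels and FastICA recovers them (up to scaling and ordering). The sources and mixing matrix are toy stand-ins for real multi-channel recordings.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 4000)
s1 = np.sin(2 * np.pi * 1.0 * t)                      # slow rhythmic source
s2 = np.sign(np.sin(2 * np.pi * 3.0 * t))             # square-wave-like source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.7, 1.2]])                # unknown mixing matrix
X = S @ A.T + 0.05 * rng.standard_normal(S.shape)     # observed channels (rows = samples)

ica = FastICA(n_components=2, random_state=0)
estimated_sources = ica.fit_transform(X)              # columns approximate s1 and s2
```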

  18. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    PubMed

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.

  19. Biomedical signals monitoring based in mobile computing.

    PubMed

    Serigioli, Nilton; Reina Munoz, Rodrigo; Rodriguez, Edgar Charry

    2010-01-01

    The main objective of this project consists in the development of a biomedical instrumentation prototype for acquisition, processing and transmission of biomedical signals. These biomedical signals are acquired and then processed with a microcontroller. After processing, all data are sent to a communication interface that can send this information to a personal computer or a cell phone. The prototype developed, which is a digital blood pressure meter, is intended to allow remote monitoring of patients living in areas with limited access to medical assistance or scarce clinical resources. We believe that this development could be helpful to improve people's quality of life, as well as to allow an improvement in the government attendance indices.

  20. Eventogram: A Visual Representation of Main Events in Biomedical Signals.

    PubMed

    Elgendi, Mohamed

    2016-09-22

    Biomedical signals carry valuable physiological information and many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper attempts to provide an event-related framework to overcome the drawbacks of the traditional visualization methods and to describe the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages and visualizes the main time-domain events in the processed biomedical signals. The traditional visualization methods were unable to find dominant events in the processed signals, while the eventogram was able to visualize dominant events in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting the main events in different biomedical signals with a sensitivity and positive predictivity >95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.

  1. Efficient sequential compression of multi-channel biomedical signals.

    PubMed

    Capurro, Ignacio; Lecumberry, Federico; Martin, Alvaro; Ramirez, Ignacio; Rovira, Eugenio; Seroussi, Gadiel

    2016-06-21

    This work proposes lossless and near-lossless compression algorithms for multi-channel biomedical signals. The algorithms are sequential and efficient, which makes them suitable for low-latency and low-power signal transmission applications. We make use of information theory and signal processing tools (such as universal coding, universal prediction, and fast online implementations of multivariate recursive least squares), combined with simple methods to exploit spatial as well as temporal redundancies typically present in biomedical signals. The algorithms are tested with publicly available electroencephalogram and electrocardiogram databases, surpassing in all cases the current state of the art in near-lossless and lossless compression ratios.
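
    A minimal sketch of the core idea behind such codecs: predict each sample from its recent past and encode only the low-entropy residual. Plain batch least squares is used here in place of the paper's online recursive least squares and universal coding, so this illustrates the principle rather than the authors' algorithm.

```python
import numpy as np

def prediction_residual(x, order=4):
    """Fit a linear predictor to a 1-D signal and return its residual.
    A lossless codec would entropy-code this residual together with the
    predictor state; only the prediction step is shown here."""
    x = np.asarray(x, dtype=float)
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs, y - X @ coeffs

# a smooth signal is highly predictable, so the residual variance is tiny
t = np.arange(0, 4, 1 / 250.0)
sig = np.sin(2 * np.pi * 1.3 * t) + 0.01 * np.random.randn(t.size)
_, res = prediction_residual(sig)
print(f"signal variance {np.var(sig):.4f}, residual variance {np.var(res):.6f}")
```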

  2. Open architecture software platform for biomedical signal analysis.

    PubMed

    Duque, Juliano J; Silva, Luiz E V; Murta, Luiz O

    2013-01-01

    Biomedical signals are very important reporters of the physiological status of the human body. Therefore, great attention is devoted to the study of analysis methods that help extract the greatest amount of relevant information from these signals. There are several free-of-charge software packages which can process biomedical data, but they usually have a closed architecture that does not allow users to add new functionality. This paper presents a proposal for a free, open-architecture software platform for biomedical signal analysis, named JBioS. Implemented in Java, the platform offers some basic functionalities to load and display signals, and allows the integration of new software components through plugins. JBioS facilitates validation of new analysis methods and provides an environment for multi-method analysis. Plugins can be developed for preprocessing, analyzing and simulating signals. Some applications have been developed using this platform, suggesting that, with these features, JBioS has potential applications in both research and clinical settings.

  3. Generalized precursor pattern discovery for biomedical signals.

    PubMed

    Lan, Mars; Ghasemzadeh, Hassan; Sarrafzadeh, Majid

    2012-01-01

    With the advent of low-cost, high-fidelity, and long lasting sensors in recent years, it has become possible to acquire biomedical signals cheaply and remotely over a prolonged period of time. Oftentimes different types of sensors are deployed in the hope of capturing precursor patterns that are highly correlated to a particular clinical episode, such as seizure, congestive heart failure etc. While there have been several studies that successfully identify patterns as reliable precursors for specific medical conditions, most of them require domain-specific knowledge and expertise. The developed algorithms are also unlikely to be applicable to other medical conditions. In this paper we present a generalized algorithm that discovers potential precursor patterns without prior knowledge or domain expertise. The algorithm makes use of wavelet transform and information theory to extract generic features, and it is also classifier agnostic. Based on experiment results using three distinct datasets collected from real-world patients, our algorithm has attained performance comparable to those obtained from previous studies that rely heavily on domain-expert knowledge. Furthermore, the algorithm also discovers non-trivial knowledge in the process.

  4. Telemedicine optoelectronic biomedical data processing system

    NASA Astrophysics Data System (ADS)

    Prosolovska, Vita V.

    2010-08-01

    The telemedicine optoelectronic biomedical data processing system was created to share medical information for health monitoring and for a timely and rapid response to crises. The system includes the following main blocks: a bioprocessor, an analog-to-digital converter for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix screen for the display of biomedical images. The rated temporal characteristics of the blocks are defined by the triggering optoelectronic couple in the analog-to-digital converters and by the imaging time of the matrix screen. The element base for the hardware implementation of the developed matrix screen is integrated optoelectronic couples produced by selective epitaxy.

  5. A low-cost biomedical signal transceiver based on a Bluetooth wireless system.

    PubMed

    Fazel-Rezai, Reza; Pauls, Mark; Slawinski, David

    2007-01-01

    Most current wireless biomedical signal transceivers use range-limited communication technologies. This work presents a low-cost biomedical signal transceiver that uses Bluetooth wireless technology. The design is implemented in a modular form to be adaptable to different types of biomedical signals. The signal front end obtains and processes incoming signals, which are then transmitted via a microcontroller and wireless module. Near real-time receive software in LabVIEW was developed to demonstrate the system capability. The completed transmitter prototype successfully transmits ECG signals and is able to send multiple signals simultaneously. The sampling rate of the transmitter is fast enough to send up to thirteen ECG signals simultaneously, with an error rate below 0.1% at transmission distances exceeding 65 meters. A low-cost wireless biomedical transceiver has many applications, such as real-time monitoring of patients with a known condition in non-clinical settings.

  6. Signal processing

    NASA Astrophysics Data System (ADS)

    Norman, David M.

    The application of signal processing technology to conventional weapons systems can lower operator workloads and enhance kill probabilities, while automating wide-area surveillance, target search and classification, target tracking, and aimpoint selection. Immediate opportunities exist for automatic target cueing in underwater and over-the-horizon targeting, as well as for airborne multiple-target fire control. By embedding the transmit/receive electronics into conformal aircraft sensor arrays, a 'smart' skin can be created. Electronically scanned phased arrays can be used to yield accurate azimuthal and elevation positions while nullifying EW threats. Attention is given to major development thrusts in algorithm design.

  7. Signal Processing

    DTIC Science & Technology

    1989-03-01

    [Report documentation page only; no abstract is available in this record. Performing organization: University of Minnesota, Minneapolis, MN 55455. Sponsoring organization: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211.]

  8. Bioinspired Polarization Imaging Sensors: From Circuits and Optics to Signal Processing Algorithms and Biomedical Applications: Analysis at the focal plane emulates nature's method in sensors to image and diagnose with polarized light.

    PubMed

    York, Timothy; Powell, Samuel B; Gao, Shengkui; Kahan, Lindsey; Charanya, Tauseef; Saha, Debajit; Roberts, Nicholas W; Cronin, Thomas W; Marshall, Justin; Achilefu, Samuel; Lake, Spencer P; Raman, Baranidharan; Gruev, Viktor

    2014-10-01

    In this paper, we present recent work on bioinspired polarization imaging sensors and their applications in biomedicine. In particular, we focus on three different aspects of these sensors. First, we describe the electro-optical challenges in realizing a bioinspired polarization imager, and in particular, we provide a detailed description of a recent low-power complementary metal-oxide-semiconductor (CMOS) polarization imager. Second, we focus on signal processing algorithms tailored for this new class of bioinspired polarization imaging sensors, such as calibration and interpolation. Third, the emergence of these sensors has enabled rapid progress in characterizing polarization signals and environmental parameters in nature, as well as several biomedical areas, such as label-free optical neural recording, dynamic tissue strength analysis, and early diagnosis of flat cancerous lesions in a murine colorectal tumor model. We highlight results obtained from these three areas and discuss future applications for these sensors.

  9. Visual parameter optimisation for biomedical image processing.

    PubMed

    Pretorius, A J; Zhou, Y; Ruddle, R A

    2015-01-01

    Biomedical image processing methods require users to optimise input parameters to ensure high-quality output. This presents two challenges. First, it is difficult to optimise multiple input parameters for multiple input images. Second, it is difficult to achieve an understanding of underlying algorithms, in particular, relationships between input and output. We present a visualisation method that transforms users' ability to understand algorithm behaviour by integrating input and output, and by supporting exploration of their relationships. We discuss its application to a colour deconvolution technique for stained histology images and show how it enabled a domain expert to identify suitable parameter values for the deconvolution of two types of images, and metrics to quantify deconvolution performance. It also enabled a breakthrough in understanding by invalidating an underlying assumption about the algorithm. The visualisation method presented here provides analysis capability for multiple inputs and outputs in biomedical image processing that is not supported by previous analysis software. The analysis supported by our method is not feasible with conventional trial-and-error approaches.

  10. Visual parameter optimisation for biomedical image processing

    PubMed Central

    2015-01-01

    Background Biomedical image processing methods require users to optimise input parameters to ensure high-quality output. This presents two challenges. First, it is difficult to optimise multiple input parameters for multiple input images. Second, it is difficult to achieve an understanding of underlying algorithms, in particular, relationships between input and output. Results We present a visualisation method that transforms users' ability to understand algorithm behaviour by integrating input and output, and by supporting exploration of their relationships. We discuss its application to a colour deconvolution technique for stained histology images and show how it enabled a domain expert to identify suitable parameter values for the deconvolution of two types of images, and metrics to quantify deconvolution performance. It also enabled a breakthrough in understanding by invalidating an underlying assumption about the algorithm. Conclusions The visualisation method presented here provides analysis capability for multiple inputs and outputs in biomedical image processing that is not supported by previous analysis software. The analysis supported by our method is not feasible with conventional trial-and-error approaches. PMID:26329538

  11. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.

    PubMed

    Elgendi, Mohamed

    2016-11-02

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.

  12. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    PubMed Central

    Elgendi, Mohamed

    2016-01-01

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages (“TERMA”) involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
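
    A minimal sketch of the two-moving-average idea described in the two records above: the squared signal is smoothed with an event-width window (W1) and a cycle-width window (W2), and samples where the short average exceeds the long one form candidate event blocks. The 0.12 s and 0.60 s windows are illustrative QRS-like defaults chosen to satisfy (8 × W1) ≥ W2 ≥ (2 × W1); the full six-block TERMA framework is not reproduced here.

```python
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def two_moving_average_events(x, fs, w1_s=0.12, w2_s=0.60):
    """Return (start, end) index pairs of blocks where the event-width moving
    average of the squared signal exceeds the cycle-width moving average."""
    e = np.asarray(x, dtype=float) ** 2                # simple energy transform
    ma_event = moving_average(e, max(1, int(w1_s * fs)))
    ma_cycle = moving_average(e, max(1, int(w2_s * fs)))
    blocks = ma_event > ma_cycle
    d = np.diff(np.r_[0, blocks.astype(int), 0])       # mark rising/falling edges
    return list(zip(np.flatnonzero(d == 1), np.flatnonzero(d == -1)))
```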

  13. [Application of the mixed programming with Labview and Matlab in biomedical signal analysis].

    PubMed

    Yu, Lu; Zhang, Yongde; Sha, Xianzheng

    2011-01-01

    This paper introduces the method of mixed programming with LabVIEW and MATLAB, and applies this method to a pulse wave pre-processing and feature detection system. The method has proved suitable, efficient and accurate, providing a new kind of approach for biomedical signal analysis.

  14. Application of the dual-tree complex wavelet transform in biomedical signal denoising.

    PubMed

    Wang, Fang; Ji, Zhong

    2014-01-01

    In biomedical signal processing, Gibbs oscillation and severe frequency aliasing may occur when using the traditional discrete wavelet transform (DWT). Herein, a new denoising algorithm based on the dual-tree complex wavelet transform (DTCWT) is presented. Electrocardiogram (ECG) signals and heart sound signals are denoised based on the DTCWT. The results prove that the DTCWT is efficient. The signal-to-noise ratio (SNR) and the mean square error (MSE) are used to compare the denoising effect. Results of the paired samples t-test show that the new method can remove noise more thoroughly and better retain the boundary and texture of the signal.
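
    The record's method rests on wavelet-domain thresholding; the sketch below shows that generic shrinkage step using an ordinary discrete wavelet transform from PyWavelets, since a dual-tree complex transform would need a dedicated library. It therefore illustrates the denoising scheme, not the paper's DTCWT itself, and the wavelet, decomposition level and universal threshold are assumptions.

```python
import numpy as np
import pywt

def wavelet_soft_denoise(x, wavelet="db4", level=4):
    """Generic wavelet-shrinkage denoising: decompose, soft-threshold the
    detail coefficients with a universal threshold, then reconstruct."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise level from finest band
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(x)]
```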

  15. Filter for biomedical imaging and image processing.

    PubMed

    Mondal, Partha P; Rajan, K; Ahmad, Imteyaz

    2006-07-01

    Image filtering techniques have numerous potential applications in biomedical imaging and image processing. The design of filters largely depends on the a priori knowledge about the type of noise corrupting the image. This makes the standard filters application specific. Widely used filters such as average, Gaussian, and Wiener reduce noisy artifacts by smoothing. However, this operation normally results in smoothing of the edges as well. On the other hand, sharpening filters enhance the high-frequency details, making the image nonsmooth. An integrated general approach to design a finite impulse response filter based on Hebbian learning is proposed for optimal image filtering. This algorithm exploits the interpixel correlation by updating the filter coefficients using Hebbian learning. The algorithm is made iterative for achieving efficient learning from the neighborhood pixels. This algorithm performs optimal smoothing of the noisy image by preserving high-frequency as well as low-frequency features. Evaluation results show that the proposed finite impulse response filter is robust under various noise distributions such as Gaussian noise, salt-and-pepper noise, and speckle noise. Furthermore, the proposed approach does not require any a priori knowledge about the type of noise. The number of unknown parameters is few, and most of these parameters are adaptively obtained from the processed image. The proposed filter is successfully applied for image reconstruction in a positron emission tomography imaging modality. The images reconstructed by the proposed algorithm are found to be superior in quality compared with those reconstructed by existing PET image reconstruction methodologies.

  16. Microcontroller-based wireless recorder for biomedical signals.

    PubMed

    Chien, C-N; Hsu, H-W; Jang, J-K; Rau, C-L; Jaw, F-S

    2005-01-01

    A portable multichannel system is described for the wireless recording of biomedical signals. Instead of using the conventional time-division analog-modulation method, the technique of digital multiplexing was applied to increase the number of signal channels to four. Detailed design considerations and the functional allocation of the system are discussed. The front-end unit was modularly designed to condition the input signal in an optimal manner. The microcontroller then handled the tasks of data conversion and wireless transmission, as well as providing simple preprocessing such as waveform averaging or rectification. The low-power nature of this microcontroller affords the benefit of battery operation and, hence, patient isolation. Finally, a single-chip receiver, which is compatible with the RF transmitter of the microcontroller, was used to implement a compact interface with the host computer. An application of this portable recorder to low-back pain studies is shown. The device can simultaneously record one ECG and two surface EMG channels wirelessly and is thus helpful in relieving patients' anxiety during clinical measurement. Such an approach, microcontroller-based wireless measurement, could be an important trend in biomedical instrumentation, and we hope that this paper will be useful to other colleagues.

  17. The creative process in biomedical visualization: strategies and management.

    PubMed

    Anderson, P A

    1990-01-01

    The phases of the creative process (identification, preparation, incubation, insight, and elaboration/verification) are related to strategies for management of biomedical illustration projects. The idea that creativity is a mystery--and is unpredictable and uncontrollable--is not accepted. This article presents a practical way to encourage creative thinking in the biomedical visualization studio.

  18. Acoustic Signal Processing

    NASA Astrophysics Data System (ADS)

    Hartmann, William M.; Candy, James V.

    Signal processing refers to the acquisition, storage, display, and generation of signals - also to the extraction of information from signals and the re-encoding of information. As such, signal processing in some form is an essential element in the practice of all aspects of acoustics. Signal processing algorithms enable acousticians to separate signals from noise, to perform automatic speech recognition, or to compress information for more efficient storage or transmission. Signal processing concepts are the building blocks used to construct models of speech and hearing. Now, in the 21st century, all signal processing is effectively digital signal processing. Widespread access to high-speed processing, massive memory, and inexpensive software make signal processing procedures of enormous sophistication and power available to anyone who wants to use them. Because advanced signal processing is now accessible to everybody, there is a need for primers that introduce basic mathematical concepts that underlie the digital algorithms. The present handbook chapter is intended to serve such a purpose.

  19. Application of higher order statistics/spectra in biomedical signals--a review.

    PubMed

    Chua, Kuang Chua; Chandran, Vinod; Acharya, U Rajendra; Lim, Choo Min

    2010-09-01

    For many decades correlation and power spectrum have been primary tools for digital signal processing applications in the biomedical area. The information contained in the power spectrum is essentially that of the autocorrelation sequence, which is sufficient for complete statistical descriptions of Gaussian signals of known means. However, there are practical situations where one needs to look beyond the autocorrelation of a signal to extract information regarding deviation from Gaussianity and the presence of phase relations. Higher order spectra, also known as polyspectra, are spectral representations of higher order statistics, i.e. moments and cumulants of third order and beyond. HOS (higher order statistics or higher order spectra) can detect deviations from linearity, stationarity or Gaussianity in the signal. Most biomedical signals are non-linear, non-stationary and non-Gaussian in nature, and therefore it can be more advantageous to analyze them with HOS than with second-order correlations and power spectra. In this paper we discuss the application of HOS to different bio-signals. HOS methods of analysis are explained using a typical heart rate variability (HRV) signal, and applications to other signals are reviewed.
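
    A minimal sketch of the simplest higher-order statistics mentioned above: third- and fourth-order moments (skewness and kurtosis), which quantify deviation from Gaussianity for a signal segment. Full polyspectra such as the bispectrum need considerably more machinery and are not shown.

```python
import numpy as np
from scipy import stats

def hos_features(x):
    """Higher-order-statistics features of a 1-D segment; for Gaussian data
    both skewness and (Fisher) kurtosis are close to zero."""
    x = np.asarray(x, dtype=float)
    return {"variance": np.var(x),
            "skewness": stats.skew(x),       # third order: asymmetry
            "kurtosis": stats.kurtosis(x)}   # fourth order: tailedness

print(hos_features(np.random.randn(10000)))             # near-Gaussian reference
print(hos_features(np.random.exponential(size=10000)))  # skewed, non-Gaussian segment
```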

  20. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals, and any similarities between them, in a given time period. Filtering techniques such as the Fourier transform method and IIR filters are used to process the signals and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate correlations and significant differences, a statistical analysis was carried out, showing correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.
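
    The correlations reported above are standard Pearson tests between pre-processed signal segments. A minimal sketch with synthetic stand-ins for the EOG and air-flow segments follows (the data below are random, not from the study).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
eog = rng.standard_normal(3000)                  # hypothetical pre-processed EOG segment
airflow = 0.4 * eog + rng.standard_normal(3000)  # hypothetical correlated air-flow segment

r, p = stats.pearsonr(eog, airflow)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")       # significance of the EOG/air-flow relationship
```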

  1. A Novel Short-term Event Extraction Algorithm for Biomedical Signals.

    PubMed

    Yazdani, Sasan; Fallet, Sibylle; Vesin, Jean-Marc

    2017-06-21

    In this paper we propose a fast, novel non-linear filtering method named Relative-Energy (Rel-En) for robust short-term event extraction from biomedical signals. We developed an algorithm that extracts short- and long-term energies in a signal and provides a coefficient vector with which the signal is multiplied, heightening events of interest. This algorithm is thoroughly assessed on benchmark datasets in three different biomedical applications, namely ECG QRS-complex detection, EEG K-complex detection, and imaging photoplethysmography (iPPG) peak detection. Rel-En successfully identified the events in these settings. Compared to the state of the art, better or comparable results were obtained on QRS-complex and K-complex detection. For iPPG peak detection, the proposed method was used as a preprocessing step to a fixed-threshold algorithm, which led to a significant improvement in overall results. While easily defined and computed, Rel-En robustly extracted short-term events of interest. The proposed algorithm can be implemented with two filters and its parameters can be selected easily and intuitively. Furthermore, the Rel-En algorithm can be used in other biomedical signal processing applications where short-term event extraction is needed.
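
    A loose sketch of the relative-energy idea: the ratio of short-term to long-term signal energy is used as a coefficient vector that multiplies the signal, heightening brief high-energy events. The window lengths are illustrative and the exact filters and parameters of Rel-En are not reproduced here.

```python
import numpy as np

def relative_energy(x, fs, short_s=0.10, long_s=1.0, eps=1e-12):
    """Multiply a signal by its short-term/long-term energy ratio, which
    emphasises brief events (QRS-, K-complex- or pulse-peak-like)."""
    x = np.asarray(x, dtype=float)
    e = x ** 2
    ma = lambda s, w: np.convolve(s, np.ones(w) / w, mode="same")
    short_e = ma(e, max(1, int(short_s * fs)))
    long_e = ma(e, max(1, int(long_s * fs)))
    return (short_e / (long_e + eps)) * x
```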

  2. Measurement of biomedical signals from helmet based system.

    PubMed

    Kim, Yun Seong; Choi, Jong Min; Lee, Haet Bit; Kim, Jung Soo; Baek, Hyun Jae; Ryu, Myung Suk; Son, Ryang Hee; Park, Kwang Suk

    2007-01-01

    In the military there are many dangerous and difficult working conditions. Soldiers sometimes have to confront the enemy and cannot sleep all day because of overnight training or operations, so the health of service members has been a main concern of commanders. Despite its importance, monitoring the health of service members without interfering with the daily schedule or combat situation is very difficult, because there are many artifact sources in the field, and monitoring devices demand additional work from subjects, who must handle them carefully. We intend to develop a system that can monitor soldiers' biomedical signals unobtrusively. In this paper we introduce a prototype of our helmet-based system, which successfully measured ECG, EOG and eye-blink signals.

  3. Optical signal processing

    NASA Technical Reports Server (NTRS)

    Casasent, D.

    1978-01-01

    The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.

  4. Modified Log-LMS adaptive filter with low signal distortion for biomedical applications.

    PubMed

    Jiao, Yuzhong; Cheung, Rex Y P; Mok, Mark P C

    2012-01-01

    Life signals from the human body, e.g. heartbeat or electrocardiography (ECG) signals, are usually weak and susceptible to external noise and interference. An adaptive filter is a good tool to reduce the influence of ambient noise and interference on life signals. The least mean squares (LMS) algorithm, one of the most popular adaptive algorithms for active noise cancellation (ANC) by adaptive filtering, has the advantage of easy implementation. In order to further decrease the complexity of the LMS-based adaptive filter, a Log-LMS algorithm was proposed, which quantizes signals with the log2 function so that multipliers can be replaced by simple shifts. However, both the LMS algorithm and the Log-LMS algorithm have the disadvantage of serious signal distortion in biomedical applications. In this paper, a modified Log-LMS algorithm is presented, which divides the convergence process into two stages and uses a different quantization method in each stage. Two biomedical application scenarios are used for analysis: 1) using a stethoscope in an emergency medical helicopter and 2) measuring ECG under power-line interference. The simulation results show that the modified algorithm can achieve fast convergence and low signal distortion when processing periodic life signals.
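
    A minimal sketch of LMS-based adaptive noise cancellation with an optional power-of-two quantization of the regressor, the step that lets hardware replace multiplications by bit shifts. The two-stage convergence scheme of the modified Log-LMS algorithm is not reproduced; the tap count and step size are illustrative assumptions.

```python
import numpy as np

def pow2_quantize(v):
    """Quantize values to the nearest power of two, preserving sign."""
    v = np.asarray(v, dtype=float)
    out = np.zeros_like(v)
    nz = v != 0
    out[nz] = np.sign(v[nz]) * 2.0 ** np.round(np.log2(np.abs(v[nz])))
    return out

def lms_anc(primary, reference, n_taps=16, mu=0.01, log_quant=False):
    """Estimate the interference in `primary` from the correlated `reference`
    and subtract it; log_quant=True applies a Log-LMS-style quantization to
    the weight update (illustrative only)."""
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = np.asarray(reference[n - n_taps:n][::-1], dtype=float)
        y = np.dot(w, x)                 # interference estimate
        e = primary[n] - y               # error = cleaned signal sample
        w += 2 * mu * e * (pow2_quantize(x) if log_quant else x)
        out[n] = e
    return out
```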

  5. Signal Processing, Analysis, & Display

    SciTech Connect

    Lager, Darrell; Azevado, Stephen

    1986-06-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulse train, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options, including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  6. Recent advances in natural language processing for biomedical applications.

    PubMed

    Collier, Nigel; Nazarenko, Adeline; Baud, Robert; Ruch, Patrick

    2006-06-01

    We survey a set of recent advances in natural language processing applied to biomedical applications, which were presented at an international workshop in Geneva, Switzerland, in 2004. While text mining applied to molecular biology and the biomedical literature can report several interesting achievements, we observe that studies applied to clinical content are still rare. In general, we argue that clinical corpora, including electronic patient records, must be made available to fill the gap between bioinformatics and medical informatics.

  7. Biomedical Simulation Models of Human Auditory Processes

    NASA Technical Reports Server (NTRS)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models are developed that explore noise propagation mechanisms associated with noise attenuation and the transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing protector to flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast related impulse noise on human auditory mechanisms and brain tissue.

  8. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Oppenheim, A. V.; Baggeroer, A. B.; Lim, J. S.; Musicus, B. R.; Mook, D. R.; Duckworth, G. L.; Bordley, T. E.; Curtis, S. R.; Deadrick, D. S.; Dove, W. P.

    1984-01-01

    Signal and image processing research projects are described. Topics include: (1) modeling underwater acoustic propagation; (2) image restoration; (3) signal reconstruction; (4) speech enhancement; (5) pitch detection; (6) spectral analysis; (7) speech synthesis; (8) speech enhancement; (9) autoregressive spectral estimation; (10) knowledge based array processing; (11) speech analysis; (12) estimating the degree of coronary stenosis with image processing; (13) automatic target detection; and (14) video conferencing.

  9. Semantic biomedical resource discovery: a Natural Language Processing framework.

    PubMed

    Sfakianaki, Pepi; Koumakis, Lefteris; Sfakianakis, Stelios; Iatraki, Galatia; Zacharioudakis, Giorgos; Graf, Norbert; Marias, Kostas; Tsiknakis, Manolis

    2015-09-30

    A plethora of publicly available biomedical resources currently exist and are constantly increasing at a fast rate. In parallel, specialized repositories are being developed, indexing numerous clinical and biomedical tools. The main drawback of such repositories is the difficulty in locating appropriate resources for a clinical or biomedical decision task, especially for non-Information Technology expert users. In parallel, although NLP research in the clinical domain has been active since the 1960s, progress in the development of NLP applications has been slow and lags behind progress in the general NLP domain. The aim of the present study is to investigate the use of semantics for the annotation of biomedical resources with domain-specific ontologies and to exploit Natural Language Processing methods in empowering non-Information Technology expert users to efficiently search for biomedical resources using natural language. A Natural Language Processing engine which can "translate" free text into targeted queries, automatically transforming a clinical research question into a request description that contains only terms of ontologies, has been implemented. The implementation is based on information extraction techniques for text in natural language, guided by integrated ontologies. Furthermore, knowledge from robust text mining methods has been incorporated to map descriptions into suitable domain ontologies in order to ensure that the biomedical resource descriptions are domain oriented and enhance the accuracy of services discovery. The framework is freely available as a web application at ( http://calchas.ics.forth.gr/ ). For our experiments, a range of clinical questions were established based on descriptions of clinical trials from the ClinicalTrials.gov registry as well as recommendations from clinicians. Domain experts manually identified the available tools in a tools repository which are suitable for addressing the clinical questions at hand, either

  10. Real time biomedical signal transmission of mixed ECG signal and patient information using visible light communication.

    PubMed

    Tan, Yee Yong; Jung, Sang-Joong; Chung, Wan-Young

    2013-01-01

    The utilization of radio-frequency (RF) communication technology in healthcare applications, especially in the transmission of health-related data such as biomedical signals and patient information, is often perturbed by electromagnetic interference (EMI). This not only significantly reduces the accuracy and reliability of the transmitted data, but could also compromise the safety of patients due to RF radiation. In this paper, we propose a method which utilizes visible light communication technology as a transmission platform to provide real-time monitoring of heart rate and patient information. A white LED beam is used as the illuminating source to simultaneously transmit the biomedical signal as well as the patient record. On-off keying (OOK) modulation is used to modulate all the data onto the visible light beam, and both types of data are transmitted in a single data packet. At the receiving end, a receiver circuit consisting of a high-speed PIN photodetector and a demodulation circuit is employed to demodulate the data from the visible light beam. The demodulated data are then serially transmitted to a personal computer where the biomedical signal, patient information and heart rate can be monitored in real time.
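
    The core of the scheme is simple on-off keying: the mixed data packet is serialized to bits and each bit switches the LED on or off for a fixed symbol duration. The Python sketch below illustrates that idea with an invented packet layout (header byte, patient-ID field, 8-bit ECG samples) and a threshold detector standing in for the PIN photodiode front end; it is an illustration of OOK framing under assumed parameters, not the authors' hardware design.

    import numpy as np

    SAMPLES_PER_BIT = 4          # assumed oversampling at the photodetector
    HEADER = 0xA5                # hypothetical start-of-packet marker

    def build_packet(patient_id: str, ecg_samples):
        """Pack a patient-ID string and 8-bit ECG samples into one byte packet."""
        pid = patient_id.encode("ascii")[:8].ljust(8)      # fixed 8-byte ID field
        ecg = bytes(int(np.clip(s, 0, 255)) for s in ecg_samples)
        return bytes([HEADER, len(ecg)]) + pid + ecg

    def ook_modulate(packet: bytes):
        """Map each bit to LED on (1.0) / off (0.0) levels, MSB first."""
        bits = np.unpackbits(np.frombuffer(packet, dtype=np.uint8))
        return np.repeat(bits.astype(float), SAMPLES_PER_BIT)

    def ook_demodulate(waveform, threshold=0.5):
        """Recover bytes by thresholding and averaging each bit period."""
        bits = waveform.reshape(-1, SAMPLES_PER_BIT).mean(axis=1) > threshold
        return np.packbits(bits.astype(np.uint8)).tobytes()

    # Round-trip example with a toy ECG snippet scaled to 8 bits.
    ecg = (127 + 100 * np.sin(np.linspace(0, 2 * np.pi, 32))).astype(int)
    tx = ook_modulate(build_packet("PATIENT1", ecg))
    rx = ook_demodulate(tx + 0.05 * np.random.randn(tx.size))   # mild channel noise
    assert rx[0] == HEADER and rx[2:10] == b"PATIENT1"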

  11. A novel application of the S-transform in removing powerline interference from biomedical signals.

    PubMed

    Huang, Chien-Chun; Liang, Sheng-Fu; Young, Ming-Shing; Shaw, Fu-Zen

    2009-01-01

    Powerline interference always disturbs recordings of biomedical signals. Numerous methods have been developed to reduce powerline interference. However, most of these techniques not only reduce the interference but also attenuate the 60 Hz power of the biomedical signals themselves. In the present study, we applied the S-transform, which provides an absolute phase of each frequency in a multi-resolution time-frequency analysis, to reduce 60 Hz interference. According to results from an electrocardiogram (ECG) to which simulated 60 Hz noise was added, the S-transform de-noising process restored a power spectrum identical to that of the original ECG, coincident with a significant reduction in the 60 Hz interference. Moreover, the S-transform de-noised the signal in an intensity-independent manner when reducing the 60 Hz interference. For both a real ECG signal from the MIT database and natural brain activity contaminated with 60 Hz interference, the S-transform also outperformed a notch filter in reducing noise while preserving the signal. Based on these data, a novel application of the S-transform for removing powerline interference is established.
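
    For readers unfamiliar with the transform, the sketch below is a minimal discrete Stockwell (S-) transform together with its inverse (summing each frequency voice over time recovers the Fourier spectrum). Zeroing the voices around 60 Hz before inversion gives only a crude illustration of the de-noising idea; the paper's intensity-independent, phase-aware procedure is more refined than this, and the signal parameters below are assumptions for the toy example.

    import numpy as np

    def stockwell(x):
        """Discrete S-transform of x (length N, N even).
        Row n of the output is the voice at frequency bin n (0..N/2)."""
        N = len(x)
        X = np.fft.fft(x)
        m = np.fft.fftfreq(N) * N                       # frequency offsets
        S = np.zeros((N // 2 + 1, N), dtype=complex)
        S[0, :] = x.mean()                              # zero-frequency voice
        for n in range(1, N // 2 + 1):
            gauss = np.exp(-2.0 * np.pi ** 2 * m ** 2 / n ** 2)
            S[n, :] = np.fft.ifft(np.roll(X, -n) * gauss)
        return S

    def istockwell(S, N):
        """Invert: the time-sum of each voice equals the spectrum X[n]."""
        Xh = S.sum(axis=1)
        X = np.zeros(N, dtype=complex)
        X[:N // 2 + 1] = Xh
        X[N // 2 + 1:] = np.conj(Xh[1:N // 2][::-1])    # Hermitian symmetry
        return np.real(np.fft.ifft(X))

    # Toy example: 5 Hz "signal" plus 60 Hz interference, sampled at 500 Hz.
    fs, N = 500, 2048
    t = np.arange(N) / fs
    x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
    S = stockwell(x)
    bins_hz = np.arange(S.shape[0]) * fs / N            # frequency of each voice
    S[np.abs(bins_hz - 60.0) < 1.0, :] = 0              # suppress 60 Hz voices
    x_clean = istockwell(S, N)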

  12. Heavy-tailed prediction error: a difficulty in predicting biomedical signals of 1/f noise type.

    PubMed

    Li, Ming; Zhao, Wei; Chen, Biao

    2012-01-01

    A fractal signal x(t) in biomedical engineering may be characterized by 1/f noise, that is, the power spectral density (PSD) diverges at f = 0. According to Taqqu's law, 1/f noise has the properties of long-range dependence and a heavy-tailed probability density function (PDF). The contribution of this paper is to show that the prediction error of a biomedical signal of 1/f noise type is long-range dependent (LRD). Thus, it is heavy-tailed and itself of 1/f noise type. Consequently, the variance of the prediction error is usually large or may not exist, making the prediction of biomedical signals of 1/f noise type difficult.
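
    A quick way to experiment with this claim is to synthesize 1/f noise by spectral shaping, fit an ordinary AR one-step predictor, and then inspect the prediction-error series (for example its autocorrelation decay). The snippet below is such a numerical experiment, not a derivation of the paper's result; the AR order p = 8 and the series length are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 1 << 14

    # 1/f noise: shape the spectrum of white noise so that PSD ~ 1/f.
    W = np.fft.rfft(rng.standard_normal(N))
    f = np.fft.rfftfreq(N)
    f[0] = f[1]                                   # avoid division by zero at DC
    x = np.fft.irfft(W / np.sqrt(f), n=N)
    x = (x - x.mean()) / x.std()

    # Least-squares AR(p) one-step predictor.
    p = 8
    lagmat = np.column_stack([x[p - k - 1:N - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(lagmat, x[p:], rcond=None)
    err = x[p:] - lagmat @ a                      # prediction-error series

    # Inspect the error: normalized autocorrelation at a few long lags.
    def acf(e, lag):
        e = e - e.mean()
        return float(np.dot(e[:-lag], e[lag:]) / np.dot(e, e))

    print("error variance:", err.var())
    print("error ACF at lags 10/100/1000:",
          [round(acf(err, L), 3) for L in (10, 100, 1000)])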

  13. A hierarchical method for removal of baseline drift from biomedical signals: application in ECG analysis.

    PubMed

    Luo, Yurong; Hargraves, Rosalyn H; Belle, Ashwin; Bai, Ou; Qi, Xuguang; Ward, Kevin R; Pfaffenberger, Michael Paul; Najarian, Kayvan

    2013-01-01

    Noise can compromise the extraction of some fundamental and important features from biomedical signals and hence prohibit accurate analysis of these signals. Baseline wander in electrocardiogram (ECG) signals is one such example, which can be caused by factors such as respiration, variations in electrode impedance, and excessive body movements. Unless baseline wander is effectively removed, the accuracy of any feature extracted from the ECG, such as timing and duration of the ST-segment, is compromised. This paper approaches this filtering task from a novel standpoint by assuming that the ECG baseline wander comes from an independent and unknown source. The technique utilizes a hierarchical method including a blind source separation (BSS) step, in particular independent component analysis, to eliminate the effect of the baseline wander. We examine the specifics of the components causing the baseline wander and the factors that affect the separation process. Experimental results reveal the superiority of the proposed algorithm in removing the baseline wander.
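
    A toy illustration of the blind-source-separation idea (not the paper's full hierarchical algorithm) is sketched below: two synthetic channels mix an ECG-like impulse train with a slow drift, FastICA separates them, the drift component is identified by its low-frequency power, and the recording is rebuilt with that component suppressed. The scikit-learn FastICA class and all signal parameters are assumptions for this demonstration.

    import numpy as np
    from sklearn.decomposition import FastICA

    fs = 360.0
    t = np.arange(0, 10, 1 / fs)

    # Toy "ECG-like" source (smoothed impulse train) and a slow drift source.
    ecg = np.zeros_like(t)
    ecg[::int(fs)] = 1.0                        # one "beat" per second
    ecg = np.convolve(ecg, np.hanning(25), mode="same")
    drift = 0.8 * np.sin(2 * np.pi * 0.25 * t)  # respiration-like wander

    # Two observed channels = different mixtures of the sources.
    X = np.c_[ecg + drift, ecg + 0.4 * drift]

    ica = FastICA(n_components=2, random_state=0)
    S = ica.fit_transform(X)                    # estimated independent components

    # Identify the drift component as the one with most power below 0.7 Hz.
    def lowfreq_fraction(s):
        P = np.abs(np.fft.rfft(s)) ** 2
        f = np.fft.rfftfreq(len(s), 1 / fs)
        return P[f < 0.7].sum() / P.sum()

    drift_idx = int(np.argmax([lowfreq_fraction(S[:, k]) for k in range(2)]))

    # Reconstruct the observations with the drift component zeroed out.
    S_clean = S.copy()
    S_clean[:, drift_idx] = 0.0
    X_clean = ica.inverse_transform(S_clean)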

  14. Surface Electromyography Signal Processing and Classification Techniques

    PubMed Central

    Chowdhury, Rubana H.; Reaz, Mamun B. I.; Ali, Mohd Alauddin Bin Mohd; Bakar, Ashrif A. A.; Chellappan, Kalaivani; Chang, Tae. G.

    2013-01-01

    Electromyography (EMG) signals are becoming increasingly important in many applications, including clinical/biomedical applications, prosthesis or rehabilitation devices, human-machine interactions, and more. However, noisy EMG signals are the major hurdle to be overcome in order to achieve improved performance in the above applications. Detection, processing and classification analysis of electromyography (EMG) signals is very desirable because it allows a more standardized and precise evaluation of neurophysiological, rehabilitational and assistive technological findings. This paper reviews two prominent areas: first, the pre-processing methods for eliminating possible artifacts via appropriate preparation at the time of recording EMG signals, and second, a brief explanation of the different methods for processing and classifying EMG signals. This study then compares the numerous methods of analyzing EMG signals in terms of their performance. The crux of this paper is to review the most recent developments and research studies related to the issues mentioned above. PMID:24048337
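
    As a concrete example of the kind of time-domain processing and classification such reviews cover, the sketch below computes four standard EMG features (mean absolute value, root mean square, zero crossings, waveform length) on analysis windows and feeds them to a scikit-learn SVM. The synthetic "rest" versus "contraction" data are placeholders; with real recordings the same pipeline would follow band-pass filtering and artifact removal.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def emg_features(window):
        """Classic time-domain EMG features for one analysis window."""
        mav = np.mean(np.abs(window))                       # mean absolute value
        rms = np.sqrt(np.mean(window ** 2))                 # root mean square
        zc = np.sum(window[:-1] * window[1:] < 0)           # zero crossings
        wl = np.sum(np.abs(np.diff(window)))                # waveform length
        return [mav, rms, zc, wl]

    rng = np.random.default_rng(1)
    win = 200                          # samples per window (e.g. 200 ms at 1 kHz)

    def synth_emg(active, n_windows=100):
        """Placeholder EMG: Gaussian noise whose amplitude grows when active."""
        amp = 1.0 if active else 0.2
        return [amp * rng.standard_normal(win) for _ in range(n_windows)]

    windows = synth_emg(False) + synth_emg(True)
    X = np.array([emg_features(w) for w in windows])
    y = np.array([0] * 100 + [1] * 100)

    clf = SVC(kernel="rbf", C=1.0)
    print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())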

  15. Astronaut Kenneth Reightler processes biomedical samples in SPACEHAB

    NASA Image and Video Library

    1994-02-09

    STS060-301-003 (3-11 Feb 1994) --- Astronaut Kenneth S. Reightler, STS-60 pilot, processes biomedical samples in a centrifuge aboard the SPACEHAB module. Reightler joined four other NASA astronauts and a Russian cosmonaut for eight days of research aboard the Space Shuttle Discovery.

  16. Array signal processing

    SciTech Connect

    Haykin, S.; Justice, J.H.; Owsley, N.L.; Yen, J.L.; Kak, A.C.

    1985-01-01

    This is the first book to be devoted completely to array signal processing, a subject that has become increasingly important in recent years. The book consists of six chapters. Chapter 1, which is introductory, reviews some basic concepts in wave propagation. The remaining five chapters deal with the theory and applications of array signal processing in (a) exploration seismology, (b) passive sonar, (c) radar, (d) radio astronomy, and (e) tomographic imaging. The various chapters of the book are self-contained. The book is written by a team of five active researchers, who are specialists in the individual fields covered by the pertinent chapters.

  17. [Required procedure for nominal data files processing in biomedical research].

    PubMed

    Chambon-Savanovitch, C; Dubray, C; Albuisson, E; Sauvant, M P

    2001-12-01

    To date, biomedical research using nominal data files for data collection, data acquisition or data processing has had to comply with two French laws (the Law of December 20, 1988, as modified, relating to the protection of patients participating in biomedical research, and the Law of January 6, 1978, completed by the Law of July 1, 1994, no. 94-548, chapter V bis). This latter law not only dictates rules for the establishment of nominal data files, but also confers individual rights on the persons filed. These regulations concern epidemiological research, clinical trials, drug watch studies and health economics research. In this note, we describe the obligations and the specific general and simplified procedures required for conducting biomedical research. Included in the requirements is an information and authorization procedure with the local and national consultative committees on data processing in biomedical research (CCTIRS, Comité Consultatif sur le Traitement de l'Information en Recherche Biomédicale, and CNIL, Commission Nationale Informatique et Libertés).

  18. Telemetry Ranging: Signal Processing

    NASA Astrophysics Data System (ADS)

    Hamkins, J.; Kinman, P.; Xie, H.; Vilnrotter, V.; Dolinar, S.

    2016-02-01

    This article describes the details of the signal processing used in a telemetry ranging system in which timing information is extracted from the downlink telemetry signal in order to compute spacecraft range. A previous article describes telemetry ranging concepts and architecture, which are a slight variation of a scheme published earlier. As in that earlier work, the telemetry ranging concept eliminates the need for a dedicated downlink ranging signal to communicate the necessary timing information. The present article describes the operation and performance of the major receiver functions on the spacecraft and the ground --- many of which are standard tracking loops already in use in JPL's flight and ground radios --- and how they can be used to provide the relevant information for making a range measurement. It also describes the implementation of these functions in software, and performance of an end-to-end software simulation of the telemetry ranging system.

  19. RASSP signal processing architectures

    NASA Astrophysics Data System (ADS)

    Shirley, Fred; Bassett, Bob; Letellier, J. P.

    1995-06-01

    The rapid prototyping of application specific signal processors (RASSP) program is an ARPA/tri-service effort to dramatically improve the process by which complex digital systems, particularly embedded signal processors, are specified, designed, documented, manufactured, and supported. The domain of embedded signal processing was chosen because it is important to a variety of military and commercial applications as well as for the challenge it presents in terms of complexity and performance demands. The principal effort is being performed by two major contractors, Lockheed Sanders (Nashua, NH) and Martin Marietta (Camden, NJ). For both, improvements in methodology are to be exercised and refined through the performance of individual 'Demonstration' efforts. The Lockheed Sanders' Demonstration effort is to develop an infrared search and track (IRST) processor. In addition, both contractors' results are being measured by a series of externally administered (by Lincoln Labs) six-month Benchmark programs that measure process improvement as a function of time. The first two Benchmark programs are designing and implementing a synthetic aperture radar (SAR) processor. Our demonstration team is using commercially available VME modules from Mercury Computer to assemble a multiprocessor system scalable from one to hundreds of Intel i860 microprocessors. Custom modules for the sensor interface and display driver are also being developed. This system implements either proprietary or Navy owned algorithms to perform the compute-intensive IRST function in real time in an avionics environment. Our Benchmark team is designing custom modules using commercially available processor chip sets, communication submodules, and reconfigurable logic devices. One of the modules contains multiple vector processors optimized for fast Fourier transform processing. Another module is a fiberoptic interface that accepts high-rate input data from the sensors and provides video-rate output data to a

  20. Adaptive Signal Processing Testbed

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1991-09-01

    The design and implementation of a system for the acquisition, processing, and analysis of signal data is described. The initial application for the system is the development and analysis of algorithms for excision of interfering tones from direct sequence spread spectrum communication systems. The system is called the Adaptive Signal Processing Testbed (ASPT) and is an integrated hardware and software system built around the TMS320C30 chip. The hardware consists of a radio frequency data source, digital receiver, and an adaptive signal processor implemented on a Sun workstation. The software components of the ASPT consist of a number of packages including the Sun driver package; UNIX programs that support software development on the TMS320C30 boards; UNIX programs that provide the control, user interaction, and display capabilities for the data acquisition, processing, and analysis components of the ASPT; and programs that perform the ASPT functions including data acquisition, despreading, and adaptive filtering. The performance of the ASPT system is evaluated by comparing actual data rates against their desired values. A number of system limitations are identified and recommendations are made for improvements.

  1. Natural Language Processing Methods and Systems for Biomedical Ontology Learning

    PubMed Central

    Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.

    2010-01-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054

  2. Natural Language Processing methods and systems for biomedical ontology learning.

    PubMed

    Liu, Kaihong; Hogan, William R; Crowley, Rebecca S

    2011-02-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of Natural Language Processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies.

  3. Multipoint multirate signal processing

    NASA Astrophysics Data System (ADS)

    Claypoole, Roger L., Jr.

    1994-12-01

    This thesis provides a fundamentally new, systematic study of multipoint multirate signal processing systems. The multipoint multirate operators are analyzed via equivalent circuits comprised entirely of conventional multirate operators. Interconnections of the operators are demonstrated, and the multipoint noble identities are derived. The multipoint polyphase representation is presented, and the M channel multipoint multirate system with vector length N is presented as an MN channel multipoint polyphase system. The conditions sufficient for perfect reconstruction in the multipoint multirate system are derived. These conditions constrain the multipoint filter banks to be composed of comb filters generated from paraunitary sets of conventional filters. The perfect reconstruction multipoint multirate system is then combined with the multiresolution wavelet decomposition to form the generalized wavelet decomposition with varying vector decimation length at each level. The generalized wavelet decomposition is used as an algorithm to redistribute the energy of a signal throughout the levels of the decomposition. It is shown that, for band pass and high pass signals, significant improvements can be made in the energy distribution. It is recommended that this algorithm be studied as a front end to a vector quantizer for data compression applications.

  4. Packet loss mitigation for biomedical signals in healthcare telemetry.

    PubMed

    Garudadri, Harinath; Baheti, Pawan K

    2009-01-01

    In this work, we propose an effective application layer solution for packet loss mitigation in the context of Body Sensor Networks (BSN) and healthcare telemetry. Packet losses occur for many reasons, including excessive path loss, interference from other wireless systems, handoffs, congestion, system loading, etc. A call for action is in order, as packet losses can have an extremely adverse impact on many healthcare applications relying on BAN and WAN technologies. Our approach for packet loss mitigation is based on Compressed Sensing (CS), an emerging signal processing concept, wherein significantly fewer sensor measurements than the number suggested by the Shannon/Nyquist sampling theorem can be used to recover signals with arbitrarily fine resolution. We present simulation results demonstrating graceful degradation of performance with increasing packet loss rate. We also compare the proposed approach with retransmissions. The CS based packet loss mitigation approach was found to maintain up to 99% beat-detection accuracy at packet loss rates of 20%, with a constant latency of less than 2.5 seconds.
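
    To make the compressed-sensing idea concrete, the sketch below treats lost packets as randomly erased samples and recovers the signal from the survivors by assuming it is sparse in a DCT basis, using a small orthogonal-matching-pursuit solver. This is a generic CS erasure-recovery illustration under assumed parameters (sparsity level, loss pattern, basis), not the authors' BSN system.

    import numpy as np
    from scipy.fft import idct

    def omp(A, y, k):
        """Orthogonal Matching Pursuit: find a k-sparse c with y ~= A @ c."""
        residual, support = y.astype(float).copy(), []
        coef = np.zeros(0)
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ residual)))
            if j not in support:
                support.append(j)
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        c = np.zeros(A.shape[1])
        c[support] = coef
        return c

    rng = np.random.default_rng(0)
    N = 256
    Psi = idct(np.eye(N), axis=0, norm="ortho")      # signal = Psi @ (DCT coefficients)

    # A test signal that is exactly 8-sparse in the DCT basis.
    c_true = np.zeros(N)
    c_true[rng.choice(N, size=8, replace=False)] = 3.0 * rng.standard_normal(8)
    x = Psi @ c_true

    # Simulate 20% packet loss as randomly erased samples; keep the survivors.
    kept = np.sort(rng.choice(N, size=int(0.8 * N), replace=False))
    A = Psi[kept, :]                                 # effective sensing matrix

    c_hat = omp(A, x[kept], k=8)                     # sparse recovery from 80% of samples
    x_hat = Psi @ c_hat
    snr = 10 * np.log10(np.sum(x ** 2) / np.sum((x - x_hat) ** 2))
    print(f"reconstruction SNR from 80% of samples: {snr:.1f} dB")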

  5. Optical signal processing

    NASA Astrophysics Data System (ADS)

    Dorey, J.

    The theoretical principles, design, and application of optical signal-processing devices are examined in a general review and illustrated with diagrams, with an emphasis on their use in radar, sonar, and lidar systems. Topics discussed include Fourier and Fresnel transforms, coherent-light computer techniques (film, electrooptical, acoustooptical, and hybrid recording methods; processing of SLAR data; the convolution theorem in coherent optics; and the use of spatial or temporal integration in acoustooptic components), and incoherent-light techniques (the Mertz setup, mask correlation, elimination of spurious components, localization and imaging of EM or IR sources by a mobile-mask technique, and processing of vectors and matrices). The need to compress the output data of high-speed optical processors by detection, thresholding, or (possibly nonlinear) block-recognition functions related to extraction and decision-making processes is stressed, since otherwise digital processing of the output causes a bottleneck effect which negates the speed advantages of optical systems over all-digital solutions.

  6. Compression of biomedical signals with mother wavelet optimization and best-basis wavelet packet selection.

    PubMed

    Brechet, Laurent; Lucas, Marie-Françoise; Doncarli, Christian; Farina, Dario

    2007-12-01

    We propose a novel scheme for signal compression based on the discrete wavelet packet transform (DWPT) decomposition. The mother wavelet and the basis of wavelet packets were optimized and the wavelet coefficients were encoded with a modified version of the embedded zerotree algorithm. This signal-dependent compression scheme was designed by a two-step process. The first (internal optimization) was the best basis selection that was performed for a given mother wavelet. For this purpose, three additive cost functions were applied and compared. The second (external optimization) was the selection of the mother wavelet based on the minimal distortion of the decoded signal given a fixed compression ratio. The mother wavelet was parameterized in the multiresolution analysis framework by the scaling filter, which is sufficient to define the entire decomposition in the orthogonal case. The method was tested on two sets of ten electromyographic (EMG) and ten electrocardiographic (ECG) signals that were compressed with compression ratios in the range of 50%-90%. For 90% compression ratio of EMG (ECG) signals, the percent residual difference after compression decreased from (mean +/- SD) 48.6 +/- 9.9% (21.5 +/- 8.4%) with the discrete wavelet transform (DWT) using the wavelet leading to poorest performance to 28.4 +/- 3.0% (6.7 +/- 1.9%) with DWPT, with optimal basis selection and wavelet optimization. In conclusion, best basis selection and optimization of the mother wavelet through parameterization led to substantial improvement of performance in signal compression with respect to DWT and random selection of the mother wavelet. The method provides an adaptive approach for optimal signal representation for compression and can thus be applied to any type of biomedical signal.
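
    The percent residual difference (PRD) figures quoted above are easy to reproduce in spirit with a plain discrete wavelet transform and coefficient thresholding; the sketch below (using the PyWavelets package and arbitrary wavelet choices) keeps only the largest coefficients for a target compression ratio and reports the PRD. The paper's contribution, wavelet-packet best-basis selection plus mother-wavelet optimization and zerotree coding, goes well beyond this baseline.

    import numpy as np
    import pywt

    def compress_prd(x, wavelet="db4", level=5, compression=0.90):
        """Zero all but the largest (1 - compression) fraction of DWT coefficients
        and return the percent residual difference of the reconstruction."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        n_keep = max(1, int(round((1.0 - compression) * arr.size)))
        thresh = np.sort(np.abs(arr).ravel())[-n_keep]
        arr_c = np.where(np.abs(arr) >= thresh, arr, 0.0)
        rec = pywt.waverec(pywt.array_to_coeffs(arr_c, slices, output_format="wavedec"),
                           wavelet)[: len(x)]
        return 100.0 * np.linalg.norm(x - rec) / np.linalg.norm(x)

    # Toy ECG-like signal: periodic smoothed spikes plus baseline oscillation and noise.
    rng = np.random.default_rng(0)
    t = np.arange(4096) / 360.0
    x = np.zeros_like(t)
    x[::360] = 1.0
    x = np.convolve(x, np.hanning(15), mode="same") + 0.1 * np.sin(2 * np.pi * 0.3 * t)
    x += 0.01 * rng.standard_normal(x.size)

    for wavelet in ("db2", "db4", "sym8"):
        print(wavelet, "PRD at 90% compression:", round(compress_prd(x, wavelet), 2))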

  7. Powerline noise elimination in biomedical signals via blind source separation and wavelet analysis.

    PubMed

    Akwei-Sekyere, Samuel

    2015-01-01

    The distortion of biomedical signals by powerline noise from recording biomedical devices has the potential to reduce the quality and convolute the interpretations of the data. Usually, powerline noise in biomedical recordings is extinguished via band-stop filters. However, due to the instability of biomedical signals, the distribution of signals filtered out may not be centered at 50/60 Hz. As a result, self-correction methods are needed to optimize the performance of these filters. Since powerline noise is additive in nature, it is intuitive to model powerline noise in a raw recording and subtract it from the raw data in order to obtain a relatively clean signal. This paper proposes a method that utilizes this approach by decomposing the recorded signal and extracting powerline noise via blind source separation and wavelet analysis. The performance of this algorithm was compared with that of a 4th order band-stop Butterworth filter, empirical mode decomposition, independent component analysis, and a combination of empirical mode decomposition with independent component analysis. The proposed method was able to expel sinusoidal signals within the powerline noise frequency range with higher fidelity in comparison with the mentioned techniques, especially at low signal-to-noise ratio.

  8. Powerline noise elimination in biomedical signals via blind source separation and wavelet analysis

    PubMed Central

    2015-01-01

    The distortion of biomedical signals by powerline noise from recording biomedical devices has the potential to reduce the quality and convolute the interpretations of the data. Usually, powerline noise in biomedical recordings is extinguished via band-stop filters. However, due to the instability of biomedical signals, the distribution of signals filtered out may not be centered at 50/60 Hz. As a result, self-correction methods are needed to optimize the performance of these filters. Since powerline noise is additive in nature, it is intuitive to model powerline noise in a raw recording and subtract it from the raw data in order to obtain a relatively clean signal. This paper proposes a method that utilizes this approach by decomposing the recorded signal and extracting powerline noise via blind source separation and wavelet analysis. The performance of this algorithm was compared with that of a 4th order band-stop Butterworth filter, empirical mode decomposition, independent component analysis, and a combination of empirical mode decomposition with independent component analysis. The proposed method was able to expel sinusoidal signals within the powerline noise frequency range with higher fidelity in comparison with the mentioned techniques, especially at low signal-to-noise ratio. PMID:26157639

  9. On the application of the auto mutual information rate of decrease to biomedical signals.

    PubMed

    Escudero, Javier; Hornero, Roberto; Abasolo, Daniel; Lopez, Miguel

    2008-01-01

    The auto mutual information function (AMIF) evaluates the signal predictability by assessing linear and non-linear dependencies between two measurements taken from a single time series. Furthermore, the AMIF rate of decrease (AMIFRD) is correlated with signal entropy. This metric has been used to analyze biomedical data, including cardiac and brain activity recordings. Hence, the AMIFRD can be a relevant parameter in the context of biomedical signal analysis. Thus, in this pilot study, we have analyzed a synthetic sequence (a Lorenz system) and real biosignals (electroencephalograms recorded with eyes open and closed) with the AMIFRD. We aimed at illustrating the application of this parameter to biomedical time series. Our results show that the AMIFRD can detect changes in the non-linear dynamics of a sequence and that it can distinguish different physiological conditions.
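
    For readers who want to experiment with the measure, the sketch below estimates the auto mutual information at increasing lags with a simple histogram estimator, normalizes by the zero-lag value, and takes the slope of the initial decline as a rough rate of decrease. The exact normalization, bin count and fitting range used in the study are not specified here, so those are assumed choices.

    import numpy as np

    def mutual_information(a, b, bins=32):
        """Histogram-based mutual information (in nats) between two 1-D series."""
        pab, _, _ = np.histogram2d(a, b, bins=bins)
        pab /= pab.sum()
        pa = pab.sum(axis=1, keepdims=True)
        pb = pab.sum(axis=0, keepdims=True)
        nz = pab > 0
        return float(np.sum(pab[nz] * np.log(pab[nz] / (pa @ pb)[nz])))

    def amif_rate_of_decrease(x, max_lag=50, fit_lags=10):
        """Normalized AMIF over lags 0..max_lag and the slope of its early decline."""
        amif = [mutual_information(x[:len(x) - lag] if lag else x, x[lag:])
                for lag in range(max_lag + 1)]
        amif = np.array(amif) / amif[0]                    # normalize by lag-0 value
        slope = np.polyfit(np.arange(fit_lags + 1), amif[:fit_lags + 1], 1)[0]
        return amif, slope

    # Example: a structured signal versus white noise.
    rng = np.random.default_rng(0)
    t = np.arange(5000)
    signals = [("sine+noise", np.sin(2 * np.pi * t / 60) + 0.3 * rng.standard_normal(t.size)),
               ("white noise", rng.standard_normal(t.size))]
    for name, sig in signals:
        _, slope = amif_rate_of_decrease(sig)
        print(name, "AMIF initial slope:", round(slope, 4))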

  10. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4 satellites) in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data which would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.

  12. A synchronization system for the analysis of biomedical signals recorded with different devices from mechanically ventilated patients.

    PubMed

    Camacho, Alejandro; Hernández, A Mauricio; Londoño, Zulma; Serna, Leidy Y; Mañanas, Miguel A

    2012-01-01

    Conducting research associated with mechanically ventilated patients often requires the recording of several biomedical signals, so that multiple sources of information are available to perform a robust analysis. This is especially important in the analysis of the relationship between pressure, volume and flow, signals available from mechanical ventilators, and other biopotentials such as the electromyogram of respiratory muscles, which is intrinsically related to the ventilatory process but not commonly recorded in clinical practice. Despite the usefulness of recording signals from multiple sources, few medical devices include the possibility of synchronizing their data with data provided by different biomedical equipment, and some may use inaccurate sampling frequencies. Even though a varying or inaccurate sampling rate does not affect the monitoring of critical patients, it restricts the study of simultaneous related events useful in research on respiratory system activity. In this article a device for temporal synchronization of signals recorded from multiple biomedical devices is described, as well as its application in the study of patients undergoing mechanical ventilation for research purposes.

  13. Self-processing networks and their biomedical implications

    SciTech Connect

    Reggia, J.A.; Sutton, G.G. III

    1988-06-01

    Self-processing networks (connectionist models, neural networks, marker-passing systems, etc.) represent information as a network of interconnected nodes and process that information through the controlled spread of activation throughout the network. This paper characterizes the nature of self-processing networks developed as models of intelligent systems in neuroscience, cognitive science, and artificial intelligence, and contrasts them with more traditional information processing models. In spite of the different perspectives and goals of individuals in these three fields, it is seen that important common principles are being revealed by this multidisciplinary work. In addition to the emphasis on intelligence as an emergent "system property," these common themes include the importance of parallelism to intelligent systems and the notion of an active rather than passive memory. The historical evolution of these principles is emphasized and their potential biomedical significance is explored.

  14. Neural Network Communications Signal Processing

    DTIC Science & Technology

    1994-08-01

    This final technical report describes the research and development results of the Neural Network Communications Signal Processing (NNCSP) Program. ... The objectives of the NNCSP program are to: (1) develop and implement a neural network and communications signal processing simulation system for the ... purpose of exploring the applicability of neural network technology to communications signal processing; (2) demonstrate several configurations of the

  15. Hilbert-Huang transformation-based time-frequency analysis methods in biomedical signal applications.

    PubMed

    Lin, Chin-Feng; Zhu, Jin-De

    2012-03-01

    Hilbert-Huang transformation, wavelet transformation, and Fourier transformation are the principal time-frequency analysis methods. These transformations can be used to discuss the frequency characteristics of linear and stationary signals, the time-frequency features of linear and non-stationary signals, the time-frequency features of non-linear and non-stationary signals, respectively. The Hilbert-Huang transformation is a combination of empirical mode decomposition and Hilbert spectral analysis. The empirical mode decomposition uses the characteristics of signals to adaptively decompose them to several intrinsic mode functions. Hilbert transforms are then used to transform the intrinsic mode functions into instantaneous frequencies, to obtain the signal's time-frequency-energy distributions and features. Hilbert-Huang transformation-based time-frequency analysis can be applied to natural physical signals such as earthquake waves, winds, ocean acoustic signals, mechanical diagnosis signals, and biomedical signals. In previous studies, we examined Hilbert-Huang transformation-based time-frequency analysis of the electroencephalogram FPI signals of clinical alcoholics, and 'sharp I' wave-based Hilbert-Huang transformation time-frequency features. In this paper, we discuss the application of Hilbert-Huang transformation-based time-frequency analysis to biomedical signals, such as electroencephalogram, electrocardiogram signals, electrogastrogram recordings, and speech signals.
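
    The transformation chain described above (empirical mode decomposition into intrinsic mode functions, then a Hilbert transform of each IMF for instantaneous amplitude and frequency) can be prototyped in a few lines; the sketch below assumes the third-party PyEMD (EMD-signal) package for the decomposition and uses scipy for the Hilbert transform. It is a generic HHT skeleton on a synthetic signal, not the specific analyses of EEG, ECG, electrogastrogram or speech signals cited in the abstract.

    import numpy as np
    from scipy.signal import hilbert
    from PyEMD import EMD          # third-party package "EMD-signal"

    fs = 500.0
    t = np.arange(0, 4, 1 / fs)
    # Toy non-stationary signal: a chirp riding on a slow oscillation.
    x = np.sin(2 * np.pi * (5 * t + 4 * t ** 2)) + 0.5 * np.sin(2 * np.pi * 1.0 * t)

    imfs = EMD().emd(x)            # empirical mode decomposition -> IMFs

    for k, imf in enumerate(imfs):
        analytic = hilbert(imf)
        amplitude = np.abs(analytic)                       # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) * fs / (2 * np.pi)      # instantaneous frequency (Hz)
        print(f"IMF {k}: mean instantaneous frequency = {inst_freq.mean():.2f} Hz, "
              f"energy = {np.sum(amplitude ** 2):.1f}")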

  16. TIME INVARIANT MULTI ELECTRODE AVERAGING FOR BIOMEDICAL SIGNALS.

    PubMed

    Orellana, R Martinez; Erem, B; Brooks, D H

    2013-12-31

    One of the biggest challenges in averaging ECG or EEG signals is to overcome temporal misalignments and distortions, due to uncertain timing or complex non-stationary dynamics. Standard methods average individual leads over a collection of epochs on a time-sample by time-sample basis, even when multi-electrode signals are available. Here we propose a method that averages multi electrode recordings simultaneously by using spatial patterns and without relying on time or frequency.

  17. Optical Signal Processing

    DTIC Science & Technology

    1990-02-28

    compatible with the laser source. Thus, if the laser wavelength is approximately 850 nm, ... application in the on-line inspection of products such as hypodermic needles ... content for cw signals, short pulse signals, and evolving pulse signals -- the most difficult ones to analyze. We performed an extensive analysis on a ... agreement with the theory, and support our claims concerning the high performance level of our acousto-optic architecture. We recognized the opportunity to

  18. Semi-supervised Stacked Label Consistent Autoencoder for Reconstruction and Analysis of Biomedical Signals.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2016-11-22

    An autoencoder-based framework that simultaneously reconstructs and classifies biomedical signals is proposed. Previous work has treated reconstruction and classification as separate problems. This is the first work that proposes a combined framework to address the issue in a holistic fashion.
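
    The key point, reconstructing and classifying with one model, can be expressed as a single training objective that sums a reconstruction loss over all samples and a classification loss over the labelled ones. The PyTorch sketch below shows such a joint loss under assumed layer sizes and random stand-in data; the authors' stacked, label-consistent formulation differs in its details.

    import torch
    import torch.nn as nn

    class JointAutoencoder(nn.Module):
        """Encoder/decoder for reconstruction plus a classifier head on the code."""
        def __init__(self, n_in=256, n_code=32, n_classes=2):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(n_in, 128), nn.ReLU(),
                                         nn.Linear(128, n_code))
            self.decoder = nn.Sequential(nn.Linear(n_code, 128), nn.ReLU(),
                                         nn.Linear(128, n_in))
            self.classifier = nn.Linear(n_code, n_classes)

        def forward(self, x):
            code = self.encoder(x)
            return self.decoder(code), self.classifier(code)

    model = JointAutoencoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    mse, ce = nn.MSELoss(), nn.CrossEntropyLoss()

    x = torch.randn(64, 256)                    # a batch of signal segments
    y = torch.randint(0, 2, (64,))
    labelled = torch.rand(64) < 0.5             # only part of the batch has labels

    recon, logits = model(x)
    loss = mse(recon, x)                        # every sample drives reconstruction
    if labelled.any():                          # labelled samples also drive the classifier
        loss = loss + ce(logits[labelled], y[labelled])
    loss.backward()
    opt.step()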

  19. Compact biomedical pulsed signal generator for bone tissue stimulation

    DOEpatents

    Kronberg, J.W.

    1993-06-08

    An apparatus for stimulating bone tissue, to promote bone growth or treat osteoporosis, by applying directly to the skin of the patient an alternating-current electrical signal comprising waveforms known to simulate the piezoelectric constituents in bone. The apparatus may, by moving a switch, stimulate bone growth or treat osteoporosis, as desired. Based on low-power CMOS technology and enclosed in a moisture-resistant case shaped to fit comfortably, two astable multivibrators produce the desired waveforms. The amplitude, pulse width and pulse frequency, and the subpulse width and subpulse frequency of the waveforms are adjustable. The apparatus, preferably powered by a standard 9-volt battery, includes signal amplitude sensors and warning signals to indicate that an output is being produced and that the battery needs to be replaced.

  20. Compact biomedical pulsed signal generator for bone tissue stimulation

    SciTech Connect

    Kronberg, James W.

    1993-01-01

    An apparatus for stimulating bone tissue, to promote bone growth or treat osteoporosis, by applying directly to the skin of the patient an alternating-current electrical signal comprising waveforms known to simulate the piezoelectric constituents in bone. The apparatus may, by moving a switch, stimulate bone growth or treat osteoporosis, as desired. Based on low-power CMOS technology and enclosed in a moisture-resistant case shaped to fit comfortably, two astable multivibrators produce the desired waveforms. The amplitude, pulse width and pulse frequency, and the subpulse width and subpulse frequency of the waveforms are adjustable. The apparatus, preferably powered by a standard 9-volt battery, includes signal amplitude sensors and warning signals to indicate that an output is being produced and that the battery needs to be replaced.

  1. Multidimensional signal processing for ultrasonic signal classification

    NASA Astrophysics Data System (ADS)

    Kim, J.; Ramuhalli, P.; Udpa, L.; Udpa, S.

    2001-04-01

    Neural network based signal classification systems are being used increasingly in the analysis of large volumes of data obtained in NDE applications. One example is the interpretation of ultrasonic signals obtained from the inspection of welds, where signals can be due to porosity, slag, lack of fusion and cracks in the weld region. Standard techniques rely on differences in individual A-scans to classify the signals. This paper proposes an ultrasonic signal classification technique based on the information in a group of signals, examining the statistical characteristics of the signals. The method uses 2-dimensional signal processing algorithms to analyze the information in B- and B'-scan images. In this paper, 2-dimensional transform based coefficients of the images are used as features and a multilayer perceptron is used to classify them. These results are then combined to get the final classification for the inspected region. Results of applying the technique to data obtained from the inspection of welds are presented.

  2. Assignment of Empirical Mode Decomposition Components and Its Application to Biomedical Signals.

    PubMed

    Schiecke, K; Schmidt, C; Piper, D; Putsche, P; Feucht, M; Witte, H; Leistritz, L

    2015-01-01

    automated EMD-based processing concepts for biomedical signals.

  3. An Ultra-Low Power Turning Angle Based Biomedical Signal Compression Engine with Adaptive Threshold Tuning

    PubMed Central

    Zhou, Jun; Wang, Chao

    2017-01-01

    Intelligent sensing is drastically changing our everyday life including healthcare by biomedical signal monitoring, collection, and analytics. However, long-term healthcare monitoring generates tremendous data volume and demands significant wireless transmission power, which imposes a big challenge for wearable healthcare sensors usually powered by batteries. Efficient compression engine design to reduce wireless transmission data rate with ultra-low power consumption is essential for wearable miniaturized healthcare sensor systems. This paper presents an ultra-low power biomedical signal compression engine for healthcare data sensing and analytics in the era of big data and sensor intelligence. It extracts the feature points of the biomedical signal by window-based turning angle detection. The proposed approach has low complexity and thus low power consumption while achieving a large compression ratio (CR) and good quality of reconstructed signal. A near-threshold design technique is adopted to further reduce the power consumption on the circuit level. Besides, the angle threshold for compression can be adaptively tuned according to the error between the original signal and reconstructed signal to address the variation of signal characteristics from person to person or from channel to channel, to meet the required signal quality with optimal CR. For demonstration, the proposed biomedical compression engine has been used and evaluated for ECG compression. It achieves an average CR of 71.08% and a percentage root-mean-square difference (PRD) of 5.87% while consuming only 39 nW. Compared to several state-of-the-art ECG compression engines, the proposed design has significantly lower power consumption while achieving similar CR and PRD, making it suitable for long-term wearable miniaturized sensor systems to sense and collect healthcare data for remote data analytics. PMID:28783079

  4. An Ultra-Low Power Turning Angle Based Biomedical Signal Compression Engine with Adaptive Threshold Tuning.

    PubMed

    Zhou, Jun; Wang, Chao

    2017-08-06

    Intelligent sensing is drastically changing our everyday life including healthcare by biomedical signal monitoring, collection, and analytics. However, long-term healthcare monitoring generates tremendous data volume and demands significant wireless transmission power, which imposes a big challenge for wearable healthcare sensors usually powered by batteries. Efficient compression engine design to reduce wireless transmission data rate with ultra-low power consumption is essential for wearable miniaturized healthcare sensor systems. This paper presents an ultra-low power biomedical signal compression engine for healthcare data sensing and analytics in the era of big data and sensor intelligence. It extracts the feature points of the biomedical signal by window-based turning angle detection. The proposed approach has low complexity and thus low power consumption while achieving a large compression ratio (CR) and good quality of reconstructed signal. A near-threshold design technique is adopted to further reduce the power consumption on the circuit level. Besides, the angle threshold for compression can be adaptively tuned according to the error between the original signal and reconstructed signal to address the variation of signal characteristics from person to person or from channel to channel, to meet the required signal quality with optimal CR. For demonstration, the proposed biomedical compression engine has been used and evaluated for ECG compression. It achieves an average CR of 71.08% and a percentage root-mean-square difference (PRD) of 5.87% while consuming only 39 nW. Compared to several state-of-the-art ECG compression engines, the proposed design has significantly lower power consumption while achieving similar CR and PRD, making it suitable for long-term wearable miniaturized sensor systems to sense and collect healthcare data for remote data analytics.
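
    A minimal software model of window-based turning-angle compression is sketched below: a sample is kept only if the local direction of the waveform changes by more than a threshold angle at that point, the signal is rebuilt by linear interpolation between kept points, and the threshold is nudged down until the reconstruction error meets a target, mimicking the adaptive tuning described above. Thresholds, the PRD target and the toy signal are assumed values, not the authors' silicon implementation.

    import numpy as np

    def turning_points(x, dt=1.0, angle_thresh_deg=10.0):
        """Indices of samples where the local direction turns by more than a threshold.
        For a locally linear stretch the successive segment directions are nearly
        parallel (angle near 0 degrees), so the middle sample can be dropped."""
        keep = [0]
        for i in range(1, len(x) - 1):
            v1 = np.array([dt, x[i] - x[i - 1]])
            v2 = np.array([dt, x[i + 1] - x[i]])
            cosang = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
            if np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))) > angle_thresh_deg:
                keep.append(i)
        keep.append(len(x) - 1)
        return np.array(keep)

    def prd(x, xr):
        return 100.0 * np.linalg.norm(x - xr) / np.linalg.norm(x)

    def adaptive_compress(x, prd_target=6.0, thresh=20.0):
        """Crudely adapt the angle threshold until the PRD target is met."""
        for _ in range(20):
            idx = turning_points(x, angle_thresh_deg=thresh)
            xr = np.interp(np.arange(len(x)), idx, x[idx])
            if prd(x, xr) <= prd_target:
                break
            thresh *= 0.8                       # too lossy: keep more points
        cr = 100.0 * (1.0 - len(idx) / len(x))
        return xr, cr, prd(x, xr)

    # Toy trace: smoothed spikes (scaled to ADC-like counts) on a flat baseline.
    x = np.zeros(3600)
    x[::360] = 1.0
    x = 100.0 * np.convolve(x, np.hanning(25), mode="same")
    _, cr, err = adaptive_compress(x)
    print(f"compression ratio = {cr:.1f}%, PRD = {err:.2f}%")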

  5. Signal processing in SETI

    NASA Technical Reports Server (NTRS)

    Cullers, D. K.; Linscott, I. R.; Oliver, B. M.

    1985-01-01

    It is believed that the Galaxy might contain ten billion potential life sites. In view of the physical inaccessibility of extraterrestrial life on account of the vast distances involved, a logical first step in a search for extraterrestrial intelligence (SETI) appears to be an attempt to detect signals already being radiated. The characteristics of the signals to be expected are discussed together with the search strategy of a NASA program. It is pointed out that all presently planned searches will use existing radio-astronomy antennas. If no extraterrestrial intelligence signals are discovered, society will have to decide whether SETI justifies a dedicated facility of much greater collecting area. Attention is given to a multichannel spectrum analyzer, CW signal detection, pulse detection, the pattern detector, and details of SETI system operation.

  7. Optical Signal Processing.

    DTIC Science & Technology

    1986-10-31

    We let k be the ratio of the time base of the reference signal to that of the received signal. We could analyze this case for an ... decreases. The central lobe of the pattern just covers the region |l| <= 1 when k ... to the approximation usually stated in optics texts. We claimed ... Reference Waveforms for Heterodyne Spectrum Analyzers: We previously developed the use of a distributed local oscillator, generated by a reference wavefront

  8. Signal processing in SETI.

    PubMed

    Cullers, D K; Linscott, I R; Oliver, B M

    1985-11-01

    The development of a multi-channel spectrum analyzer (MCSA) for the SETI program is described. The spectrum analyzer is designed for both all-sky surveys and targeted searches. The mechanisms of the MCSA are explained and a diagram is provided. Detection of continuous wave signals, pulses, and patterns is examined.

  9. Compression of multidimensional biomedical signals with spatial and temporal codebook-excited linear prediction.

    PubMed

    Carotti, Elias S G; De Martin, Juan Carlos; Merletti, Roberto; Farina, Dario

    2009-11-01

    In this paper, we propose a model-based lossy coding technique for biomedical signals in multiple dimensions. The method is based on the codebook-excited linear prediction approach and models signals as filtered noise. The filter models short-term redundancy in time and the shape of the power spectrum of the signal; the residual noise, quantized using an algebraic codebook, is used for reconstruction of the waveforms. In addition to temporal redundancy, redundancy in the coding of the filter and residual noise across spatially related signals is also exploited, yielding better compression performance in terms of SNR for a given bit rate. The proposed coding technique was tested on sets of multichannel electromyography (EMG) and EEG signals as representative examples. For 2-D EMG recordings of 56 signals, the coding technique resulted in an SNR gain of 3.4 +/- 1.3 dB with respect to independent coding of the signals in the grid when the compression ratio was 89%. For EEG recordings of 15 signals and the same compression ratio as for EMG, the average gain in SNR was 2.4 +/- 0.1 dB. In conclusion, a method for exploiting both the temporal and spatial redundancy typical of multidimensional biomedical signals has been proposed and proved to be superior to previous coding schemes.
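
    At the heart of any codebook-excited linear prediction coder is the short-term predictor: an all-pole filter fitted to each frame, a residual obtained by inverse filtering, and a reconstruction driven by a (here crudely quantized) excitation. The sketch below shows that analysis/synthesis loop for one frame using the autocorrelation method; the algebraic codebook search and the spatial prediction across channels described above are omitted, and all parameters are assumed for the toy example.

    import numpy as np
    from scipy.signal import lfilter

    def lpc(frame, order=10):
        """All-pole coefficients a (with a[0] = 1) via the autocorrelation method."""
        r = np.correlate(frame, frame, mode="full")[len(frame) - 1:][:order + 1]
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        a = np.linalg.solve(R, -r[1:order + 1])     # normal equations
        return np.concatenate(([1.0], a))

    rng = np.random.default_rng(0)
    n = 512
    # Toy frame: filtered noise, i.e. exactly the model class CELP assumes.
    frame = lfilter([1.0], [1.0, -1.2, 0.6], rng.standard_normal(n))

    a = lpc(frame, order=10)
    residual = lfilter(a, [1.0], frame)             # inverse filtering (whitening)

    # Stand-in for the codebook: keep only the sign and RMS of the residual.
    excitation = np.sign(residual) * np.sqrt(np.mean(residual ** 2))
    reconstruction = lfilter([1.0], a, excitation)  # synthesis filter 1/A(z)

    snr = 10 * np.log10(np.sum(frame ** 2) / np.sum((frame - reconstruction) ** 2))
    print(f"frame SNR after crude excitation quantization: {snr:.1f} dB")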

  10. Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis.

    PubMed

    Azami, Hamed; Fernández, Alberto; Escudero, Javier

    2017-05-02

    Multiscale entropy (MSE) has been a prevalent algorithm to quantify the complexity of biomedical time series. Recent developments in the field have tried to alleviate the problem of undefined MSE values for short signals. Moreover, there has been a recent interest in using other statistical moments than the mean, i.e., variance, in the coarse-graining step of the MSE. Building on these trends, here we introduce the so-called refined composite multiscale fuzzy entropy based on the standard deviation (RCMFEσ) and mean (RCMFEμ) to quantify the dynamical properties of spread and mean, respectively, over multiple time scales. We demonstrate the dependency of the RCMFEσ and RCMFEμ, in comparison with other multiscale approaches, on several straightforward signal processing concepts using a set of synthetic signals. The results evidenced that the RCMFEσ and RCMFEμ values are more stable and reliable than the classical multiscale entropy ones. We also inspect the ability of using the standard deviation as well as the mean in the coarse-graining process using magnetoencephalograms in Alzheimer's disease and publicly available electroencephalograms recorded from focal and non-focal areas in epilepsy. Our results indicated that when the RCMFEμ cannot distinguish different types of dynamics of a particular time series at some scale factors, the RCMFEσ may do so, and vice versa. The results showed that RCMFEσ-based features lead to higher classification accuracies in comparison with the RCMFEμ-based ones. We also made freely available all the Matlab codes used in this study at http://dx.doi.org/10.7488/ds/1477 .
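
    The two ingredients, coarse-graining by the standard deviation instead of the mean and a fuzzy entropy computed on the coarse-grained series, can be prototyped as below. This is a plain multiscale fuzzy entropy with SD coarse-graining under common parameter choices (m = 2, r = 0.15 times the SD, exponential membership); the refined-composite averaging over shifted coarse-grainings used in the paper is left out for brevity.

    import numpy as np

    def coarse_grain_sd(x, scale):
        """Scale-tau series whose elements are the SDs of non-overlapping windows."""
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).std(axis=1)

    def fuzzy_entropy(x, m=2, r_factor=0.15, n_exp=2):
        """Fuzzy entropy with exponential membership exp(-(d**n)/r)."""
        x = np.asarray(x, dtype=float)
        r = r_factor * np.std(x)
        def phi(dim):
            templ = np.array([x[i:i + dim] - np.mean(x[i:i + dim])
                              for i in range(len(x) - dim)])
            d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
            mu = np.exp(-(d ** n_exp) / r)
            np.fill_diagonal(mu, 0.0)
            return mu.sum() / (len(templ) * (len(templ) - 1))
        return np.log(phi(m) / phi(m + 1))

    def mfe_sd(x, scales=range(2, 11)):
        # SD of a single sample is zero, so SD coarse-graining starts at scale 2.
        return [fuzzy_entropy(coarse_grain_sd(x, s)) for s in scales]

    # Compare white noise with a crude long-memory surrogate over scales 2..10.
    rng = np.random.default_rng(0)
    white = rng.standard_normal(3000)
    brown = np.cumsum(rng.standard_normal(3000))
    print("white:", np.round(mfe_sd(white), 2))
    print("brown:", np.round(mfe_sd(brown), 2))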

  11. Automatic Seismic Signal Processing

    DTIC Science & Technology

    1982-02-04

    Final technical report: Routine Automatic Seismic Analysis Technical Package (ARPA Order Number 4199; contractor: ENSCO, Inc.; contract number F08606-80-C-0021; effective date of contract 10...). ... developed and demonstrated. This timing detector algorithm times the start time of signals and their envelope peaks. It was designed to measure the size

  12. SIG. Signal Processing, Analysis, & Display

    SciTech Connect

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  13. SIG. Signal Processing, Analysis, & Display

    SciTech Connect

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  14. SIG. Signal Processing, Analysis, & Display

    SciTech Connect

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  15. Monofractal and multifractal approaches to complex biomedical signals

    NASA Astrophysics Data System (ADS)

    Stanley, H. E.; Amaral, L. A. N.; Goldberger, A. L.; Havlin, S.; Ivanov, P. Ch.; Peng, C.-K.

    2000-02-01

    Even under healthy, basal conditions, physiologic systems show erratic fluctuations resembling those found in dynamical systems driven away from an equilibrium state. Do such "nonequilibrium" fluctuations simply reflect the fact that physiologic systems are being constantly perturbed by external and intrinsic uncorrelated noise? Or, do these fluctuations actually contain "hidden" information about the underlying nonequilibrium control mechanisms? We report some recent attempts to understand the dynamics of complex physiologic fluctuations by adapting and extending concepts and methods developed very recently in statistical physics. Specifically, we focus on interbeat interval variability as an important quantity to help elucidate possibly nonhomeostatic physiologic variability because (i) the heart rate is under direct neuroautonomic control, (ii) interbeat interval variability is readily measured by noninvasive means, and (iii) analysis of these heart rate dynamics may provide important practical diagnostic and prognostic information not obtainable with current approaches. The analytic tools we discuss may be used on a wider range of physiologic signals. We first review recent progress using two analysis methods—detrended fluctuation analysis and wavelets—appropriate for quantifying monofractal structures. We then describe very recent work that quantifies multifractal features of interbeat interval series, and the discovery that the multifractal structure of healthy subjects is different from that of diseased subjects. We also discuss the application of fractal scaling analysis to the dynamics of heartbeat regulation, and report the recent finding that the scaling exponent α is smaller during sleep periods compared to wake periods.
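
    A minimal sketch of one of the monofractal tools mentioned above, detrended fluctuation analysis, is given below; the box sizes and the white-noise surrogate series are illustrative choices, and a real analysis would use measured interbeat intervals over a wider range of scales.

    ```python
    # Sketch of detrended fluctuation analysis (DFA) for a heartbeat interval
    # series; a simplified illustration of the monofractal analysis described above.
    import numpy as np

    def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
        y = np.cumsum(x - np.mean(x))          # integrated, mean-centered profile
        fluct = []
        for n in scales:
            n_boxes = len(y) // n
            rms = []
            for i in range(n_boxes):
                seg = y[i * n:(i + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            fluct.append(np.mean(rms))
        # scaling exponent alpha: slope of log F(n) versus log n
        return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

    rr = np.random.randn(4096)                 # surrogate interbeat series (white noise)
    print(dfa_alpha(rr))                       # about 0.5 for uncorrelated noise
    ```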

  16. Signal Processing Circuit Development.

    DTIC Science & Technology

    1986-07-01

    Excerpts (figure list and text): Simplified active filter circuit, 64; 42. Video output amplifier, 66; 43. 64/128 gated clock circuitry, 68; 44. Two pole Sallen-Key active filters, 71; 45. Switched... four quadrant multiplier, log compression, multiple pole active video filtering and black level control. In what follows in this report an attempt... chip is shown in Figure 5. This is the master synch chip which generates all of the control signals necessary for TV monitor presentation of video data

  17. Sparse approximation of long-term biomedical signals for classification via dynamic PCA.

    PubMed

    Xie, Shengkun; Jin, Feng; Krishnan, Sridhar

    2011-01-01

    Sparse approximation is a novel technique for event detection in long-term complex biomedical signals. It involves simplifying the extent of resources required to describe a large set of data sufficiently for classification. In this paper, we propose a multivariate statistical approach using dynamic principal component analysis along with a non-overlapping moving window technique to extract feature information from univariate long-term observational signals. Within the dynamic PCA framework, a few principal components plus the energy measure of the signal in the principal component subspace are highly promising for event detection in both stationary and non-stationary signals. The proposed method was first tested using synthetic databases which contain various representative signals. The effectiveness of the method is then verified with real EEG signals for the purpose of epilepsy diagnosis and epileptic seizure detection. This sparse method produces a 100% classification accuracy for both synthetic data and real single-channel EEG data.
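
    A hedged sketch of the windowed feature idea described above: lag-embed the univariate signal (the "dynamic" part of dynamic PCA), take non-overlapping windows, and keep a few principal components plus each window's energy in the principal-component subspace. The window length, number of lags, and number of components are illustrative assumptions, not the paper's settings.

    ```python
    # Windowed dynamic-PCA features: a few leading singular values plus the
    # energy of each non-overlapping window in the principal-component subspace.
    import numpy as np

    def window_pc_energy(x, win=256, lags=4, n_pc=3):
        feats = []
        for start in range(0, len(x) - win + 1, win):       # non-overlapping windows
            seg = x[start:start + win]
            # lag-embedded (dynamic) data matrix: rows are [x_t, x_{t-1}, ..., x_{t-lags}]
            emb = np.column_stack([seg[lags - k:len(seg) - k] for k in range(lags + 1)])
            emb = emb - emb.mean(axis=0)
            _, s, vt = np.linalg.svd(emb, full_matrices=False)
            scores = emb @ vt[:n_pc].T                       # projection onto first PCs
            energy = np.sum(scores ** 2)                     # energy in PC subspace
            feats.append(np.r_[s[:n_pc], energy])
        return np.array(feats)

    x = np.random.randn(2048)
    print(window_pc_energy(x).shape)                         # (8, 4): 8 windows, 3 SVs + energy
    ```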

  18. Software and Algorithms for Biomedical Image Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Lambert, James; Lam, Raymond

    2004-01-01

    A new software equipped with novel image processing algorithms and graphical-user-interface (GUI) tools has been designed for automated analysis and processing of large amounts of biomedical image data. The software, called PlaqTrak, has been specifically used for analysis of plaque on teeth of patients. New algorithms have been developed and implemented to segment teeth of interest from surrounding gum, and a real-time image-based morphing procedure is used to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The PlaqTrak system integrates these components into a single software suite with an easy-to-use GUI (see Figure 1) that allows users to do an end-to-end run of a patient s record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image. The automated and accurate processing of the captured images to segment each tooth [see Figure 2(a)] and then detect plaque on a tooth-by-tooth basis is a critical component of the PlaqTrak system to do clinical trials and analysis with minimal human intervention. These features offer distinct advantages over other competing systems that analyze groups of teeth or synthetic teeth. PlaqTrak divides each segmented tooth into eight regions using an advanced graphics morphing procedure [see results on a chipped tooth in Figure 2(b)], and a pattern recognition classifier is then used to locate plaque [red regions in Figure 2(d)] and enamel regions. The morphing allows analysis within regions of teeth, thereby facilitating detailed statistical analysis such as the amount of plaque present on the biting surfaces on teeth. This software system is applicable to a host of biomedical applications, such as cell analysis and life detection, or robotic applications, such

  19. Multidimensional Signal Processing

    DTIC Science & Technology

    1988-06-01

    Dept of Elec Engr & Computer Science, Hoboken NJ 07030; Griffiss AFB... product extends over all r_i (2.A1). Property 2.B9: If a is a polynomial and deg_i(a+a) < deg_i a for some i, then (a+a) must contain the factor z_i... information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics and propagation, and electronic reliability/maintainability and compatibility.

  20. Bluetooth telemedicine processor for multichannel biomedical signal transmission via mobile cellular networks.

    PubMed

    Rasid, Mohd Fadlee A; Woodward, Bryan

    2005-03-01

    One of the emerging issues in m-Health is how best to exploit the mobile communications technologies that are now almost globally available. The challenge is to produce a system to transmit a patient's biomedical signals directly to a hospital for monitoring or diagnosis, using an unmodified mobile telephone. The paper focuses on the design of a processor, which samples signals from sensors on the patient. It then transmits digital data over a Bluetooth link to a mobile telephone that uses the General Packet Radio Service. The modular design adopted is intended to provide a "future-proofed" system, whose functionality may be upgraded by modifying the software.

  1. Noninvasive evaluation of mental stress using by a refined rough set technique based on biomedical signals.

    PubMed

    Liu, Tung-Kuan; Chen, Yeh-Peng; Hou, Zone-Yuan; Wang, Chao-Chih; Chou, Jyh-Horng

    2014-06-01

    Evaluating and treating stress can substantially benefit people with health problems. Currently, mental stress is evaluated using medical questionnaires. However, the accuracy of this evaluation method is questionable because of variations caused by factors such as cultural differences and individual subjectivity. Measuring biomedical signals is an effective method for estimating mental stress that enables this problem to be overcome. However, the relationship between levels of mental stress and biomedical signals remains poorly understood. A refined rough set algorithm is proposed to determine the relationship between mental stress and biomedical signals; this algorithm combines rough set theory with a hybrid Taguchi-genetic algorithm and is called RS-HTGA. Two parameters were used for evaluating the performance of the proposed RS-HTGA method. A dataset obtained from a practice clinic comprising 362 cases (196 male, 166 female) was adopted to evaluate the performance of the proposed approach. The empirical results indicate that the proposed method can achieve acceptable accuracy in medical practice. Furthermore, the proposed method was successfully used to identify the relationship between mental stress levels and biomedical signals. In addition, a comparison between the RS-HTGA and a support vector machine (SVM) method indicated that both methods yield good results. The total averages for sensitivity, specificity, and precision were greater than 96%, indicating that both algorithms produced highly accurate results, but a substantial difference in discrimination existed for people with Phase 0 stress: the SVM algorithm achieved 89%, whereas the RS-HTGA achieved 96%. Therefore, the RS-HTGA is superior to the SVM algorithm. The kappa test results for both algorithms were greater than 0.936, indicating high accuracy and consistency. The areas under the receiver operating characteristic curve for both the RS-HTGA and the SVM method were greater than 0.77, indicating
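
    The performance measures quoted above (sensitivity, specificity, precision, Cohen's kappa, area under the ROC curve) can be computed for any binary stress classifier with a few standard calls; the sketch below uses scikit-learn and placeholder label/score arrays, not the study's data.

    ```python
    # Standard binary-classification metrics on hypothetical outputs.
    import numpy as np
    from sklearn.metrics import confusion_matrix, cohen_kappa_score, roc_auc_score

    y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])            # hypothetical labels
    y_pred = np.array([0, 0, 1, 1, 0, 0, 1, 0])            # hypothetical predictions
    y_score = np.array([.1, .2, .9, .8, .4, .3, .7, .2])   # hypothetical scores

    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    kappa = cohen_kappa_score(y_true, y_pred)
    auc = roc_auc_score(y_true, y_score)
    print(sensitivity, specificity, precision, kappa, auc)
    ```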

  2. Signal processing for semiconductor detectors

    SciTech Connect

    Goulding, F.S.; Landis, D.A.

    1982-02-01

    A balanced perspective is provided on the processing of signals produced by semiconductor detectors. The general problems of pulse shaping to optimize resolution with constraints imposed by noise, counting rate and rise time fluctuations are discussed.

  3. Snore related signals processing in a private cloud computing system.

    PubMed

    Qian, Kun; Guo, Jian; Xu, Huijie; Zhu, Zhaomeng; Zhang, Gongxuan

    2014-09-01

    Snore related signals (SRS) have been demonstrated in recent years to carry important information about the obstruction site and degree in the upper airway of Obstructive Sleep Apnea-Hypopnea Syndrome (OSAHS) patients. To make this acoustic signal analysis method more accurate and robust, processing of large volumes of SRS data is inevitable. As an emerging concept and technology, cloud computing has motivated numerous researchers and engineers to develop applications in both academia and industry, and it has the potential to support large-scale work in biomedical engineering. Considering the security and transfer requirements of biomedical data, we designed a system based on private cloud computing to process SRS. We then set up comparative experiments in which a 5-hour audio recording of an OSAHS patient was processed by a personal computer, a server, and the private cloud computing system to demonstrate the efficiency of the proposed infrastructure.

  4. Adaptive digital notch filter design on the unit circle for the removal of powerline noise from biomedical signals.

    PubMed

    Ferdjallah, M; Barr, R E

    1994-06-01

    This paper investigates adaptive digital notch filters for the elimination of powerline noise from biomedical signals. Since the distribution of the frequency variation of the powerline noise may or may not be centered at 60 Hz, three different adaptive digital notch filters are considered. For the first case, an adaptive FIR second-order digital notch filter is designed to track the center frequency variation. For the second case, the zeroes of an adaptive IIR second-order digital notch filter are fixed on the unit circle and the poles are adapted to find an optimum bandwidth to eliminate the noise to a pre-defined attenuation level. In the third case, both the poles and zeroes of the adaptive IIR second-order filter are adapted to track the center frequency variation within an optimum bandwidth. The adaptive process is considerably simplified by designing the notch filters by pole-zero placement on the unit circle using some suggested rules. A constrained least mean-squared (CLMS) algorithm is used for the adaptive process. To evaluate their performance, the three adaptive notch filters are applied to a powerline noise sample and to a noisy EEG as an illustration of a biomedical signal.
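
    A minimal sketch of the underlying second-order notch design by pole-zero placement on the unit circle: zeros exactly on the circle at the powerline frequency and poles at the same angle with radius r < 1 setting the notch bandwidth. This is the fixed, non-adaptive case only; the CLMS adaptation of center frequency and bandwidth described in the paper is not reproduced, and the sampling rate, pole radius, and test signal are made-up values.

    ```python
    # Second-order notch filter by pole-zero placement on the unit circle.
    import numpy as np
    from scipy.signal import lfilter

    fs = 500.0                      # sampling rate in Hz (assumed)
    f0 = 60.0                       # powerline frequency to remove
    r = 0.98                        # pole radius; 3 dB bandwidth roughly (1 - r) * fs / pi Hz
    w0 = 2 * np.pi * f0 / fs

    b = [1.0, -2.0 * np.cos(w0), 1.0]             # zeros on the unit circle at +/- w0
    a = [1.0, -2.0 * r * np.cos(w0), r ** 2]      # poles just inside the circle

    t = np.arange(0, 2.0, 1.0 / fs)
    eeg = np.random.randn(t.size) + 2.0 * np.sin(2 * np.pi * f0 * t)  # noisy "EEG"
    clean = lfilter(b, a, eeg)      # the 60 Hz component is strongly attenuated
    ```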

  5. Microsystem for signal processing applications

    NASA Astrophysics Data System (ADS)

    Frankenstein, B.; Froehlich, K.-J.; Hentschel, D.; Reppe, G.

    2005-05-01

    Acoustic monitoring of technological processes requires methods that eliminate noise as much as possible. Sensor-near signal evaluation can contribute substantially. Frequently, there is a further need to integrate the measuring technique into the monitored structure. The solution described contains components for analog preprocessing of acoustic signals, their digitization, algorithms for data reduction, and digital communication. The core component is a digital signal processor (DSP). Digital signal processors perform the algorithms necessary for filtering, down-sampling, FFT computation and correlation of spectral components particularly effectively. A compact, sensor-near signal processing structure was realized. It meets the Match-X standard, which was specified by the German Association for Mechanical and Plant Engineering (VDMA) for the development of micro-technical modules that can be combined into application-specific systems. The solution is based on Al2O3 ceramic components including different signal processing modules such as an ADC, as well as memory and power supply. An arbitrary waveform generator has been developed and combined with a power amplifier for piezoelectric transducers in a special module. A further module interfaces to these transducers. It contains a multi-channel preamplifier, high-pass filters for analog signal processing and an ADC driver. A Bluetooth communication chip for wireless data transmission and a DiscOnChip module are under construction. As a first application, the combustion behavior of safety-relevant contacts is monitored. A special waveform up to 5 MHz is produced and sent to the monitored object. The resulting signal form is evaluated with special algorithms, which extract significant parameters of the signal, and transmitted via CAN bus.

  6. Biomedical image and signal de-noising using dual tree complex wavelet transform

    NASA Astrophysics Data System (ADS)

    Rizi, F. Yousefi; Noubari, H. Ahmadi; Setarehdan, S. K.

    2011-10-01

    The dual tree complex wavelet transform (DTCWT) is a form of discrete wavelet transform which generates complex coefficients by using a dual tree of wavelet filters to obtain their real and imaginary parts. The purpose of de-noising is to reduce the noise level and improve the signal-to-noise ratio (SNR) without distorting the signal or image. This paper proposes a method for removing white Gaussian noise from ECG signals and biomedical images. The discrete wavelet transform (DWT) is very valuable in a large scope of de-noising problems. However, it has limitations such as oscillations of the coefficients at a singularity, lack of directional selectivity in higher dimensions, aliasing and consequent shift variance. The complex wavelet transform (CWT) strategy that we focus on in this paper is Kingsbury's and Selesnick's dual tree CWT (DTCWT), which outperforms the critically decimated DWT in a range of applications, such as de-noising. Each complex wavelet is oriented along one of six possible directions, and the magnitude of each complex wavelet has a smooth bell shape. In the final part of this paper, we present biomedical image and signal de-noising by means of thresholding the magnitude of the wavelet coefficients.
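
    The paper's transform is the dual-tree complex wavelet transform; as a simpler stand-in, the sketch below performs soft-threshold de-noising with the ordinary critically sampled DWT via PyWavelets. The wavelet, decomposition level, and universal-threshold rule are common defaults chosen for illustration, not the authors' settings.

    ```python
    # Soft-threshold wavelet de-noising (plain DWT used as a stand-in for DTCWT).
    import numpy as np
    import pywt

    def dwt_denoise(x, wavelet="db4", level=4):
        coeffs = pywt.wavedec(x, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate from finest scale
        thr = sigma * np.sqrt(2 * np.log(len(x)))            # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[:len(x)]

    t = np.linspace(0, 1, 1024)
    ecg_like = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)   # toy signal
    denoised = dwt_denoise(ecg_like)
    ```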

  7. Cognitive Algorithms for Signal Processing

    DTIC Science & Technology

    2011-03-18

    Excerpts (references, report number, and contents): [63] L. I. Perlovsky and R. Kozma, Eds., Neurodynamics of Higher-Level Cognition and Consciousness. Heidelberg, Germany: Springer-Verlag, 2007. [64]... AFRL-RY-HS-TR-2011-0013, Cognitive Algorithms for Signal Processing... in more detail in [46]... Abstract: Processes in the mind: perception, cognition

  8. A method for the time-varying nonlinear prediction of complex nonstationary biomedical signals.

    PubMed

    Faes, Luca; Chon, Ki H; Nollo, Giandomenico

    2009-02-01

    A method to perform time-varying (TV) nonlinear prediction of biomedical signals in the presence of nonstationarity is presented in this paper. The method is based on identification of TV autoregressive models through expansion of the TV coefficients onto a set of basis functions and on k-nearest neighbor local linear approximation to perform nonlinear prediction. The approach provides reasonable nonlinear prediction even for TV deterministic chaotic signals, which has been a daunting task to date. Moreover, the method is used in conjunction with a TV surrogate method to provide statistical validation that the presence of nonlinearity is not due to nonstationarity itself. The approach is tested on simulated linear and nonlinear signals reproducing both time-invariant (TIV) and TV dynamics to assess its ability to quantify TIV and TV degrees of predictability and detect nonlinearity. Applicative examples relevant to heart rate variability and EEG analyses are then illustrated.
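
    A minimal sketch of the nonlinear prediction building block mentioned above: predict the next sample from the successors of the k nearest neighbours of the current delay-embedded state. This is a local-constant simplification of the paper's local-linear scheme, without the time-varying model identification; the embedding dimension, delay, and k are illustrative.

    ```python
    # Nearest-neighbour prediction in a delay-embedded state space.
    import numpy as np

    def knn_predict_next(x, m=3, tau=1, k=5):
        # delay vectors [x_t, x_{t-tau}, ..., x_{t-(m-1)tau}] for all valid t
        idx = np.arange((m - 1) * tau, len(x) - 1)
        states = np.array([x[i - np.arange(m) * tau] for i in idx])
        query = x[len(x) - 1 - np.arange(m) * tau]            # current state
        d = np.linalg.norm(states - query, axis=1)
        nn = idx[np.argsort(d)[:k]]                           # k nearest past states
        return np.mean(x[nn + 1])                             # average of their successors

    x = np.sin(0.3 * np.arange(500)) + 0.05 * np.random.randn(500)
    print(knn_predict_next(x), np.sin(0.3 * 500))             # prediction vs. true next value
    ```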

  9. Secure information embedding into 1D biomedical signals based on SPIHT.

    PubMed

    Rubio, Oscar J; Alesanco, Alvaro; García, José

    2013-08-01

    This paper proposes an encoding system for 1D biomedical signals that allows embedding metadata and provides security and privacy. The design is based on an analysis of the requirements for secure and efficient storage, transmission and access to medical tests in an e-health environment. The approach uses the 1D SPIHT algorithm to compress 1D biomedical signals with clinical quality, metadata embedding in the compressed domain to avoid extra distortion, digital signatures to implement security and attribute-level encryption to support Role-Based Access Control. The implementation has been extensively tested using standard electrocardiogram and electroencephalogram databases (MIT-BIH Arrhythmia, MIT-BIH Compression and SCCN-EEG), demonstrating high embedding capacity (e.g. 3 KB in resting ECGs, 200 KB in stress tests, 30 MB in ambulatory ECGs), short delays (2-3.3 s in real-time transmission) and compression of the signal (by ≃3 in real-time transmission, by ≃5 in offline operation) despite the embedding of security elements and metadata to enable e-health services.

  10. Signal processor for processing ultrasonic receiver signals

    DOEpatents

    Fasching, George E.

    1980-01-01

    A signal processor is provided which uses an analog integrating circuit in conjunction with a set of digital counters controlled by a precision clock for sampling timing to provide an improved presentation of an ultrasonic transmitter/receiver signal. The signal is sampled relative to the transmitter trigger signal timing at precise times, the selected number of samples are integrated and the integrated samples are transferred and held for recording on a strip chart recorder or converted to digital form for storage. By integrating multiple samples taken at precisely the same time with respect to the trigger for the ultrasonic transmitter, random noise, which is contained in the ultrasonic receiver signal, is reduced relative to the desired useful signal.
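
    The patent's central idea, integrating many samples taken at the same delay after each transmitter trigger so that uncorrelated noise averages away, can be checked numerically; in the toy sketch below the echo waveform, noise level, and number of repetitions are made up, and the expected SNR gain is roughly the square root of the number of averaged shots.

    ```python
    # Toy check of triggered ensemble averaging: noise drops roughly as sqrt(K).
    import numpy as np

    K, n = 64, 400
    t = np.arange(n)
    echo = np.exp(-0.5 * ((t - 200) / 15.0) ** 2)            # repeatable receiver signal
    shots = echo + 1.0 * np.random.randn(K, n)               # K noisy triggered acquisitions
    averaged = shots.mean(axis=0)                            # integrate/average per time bin

    snr_single = echo.max() / (shots[0] - echo).std()
    snr_avg = echo.max() / (averaged - echo).std()
    print(snr_single, snr_avg, np.sqrt(K))                   # SNR gain close to sqrt(K)
    ```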

  11. Surface Characteristics of Titanium during ECM Process for Biomedical Applications

    SciTech Connect

    Dhobe, Shirish D.; Doloi, B.; Bhattacharyya, B.

    2011-01-17

    Electrochemical machining is described as the controlled removal of metal by anodic dissolution of the workpiece in an electrolytic cell. Titanium is extensively used in aerospace, defence, and biomedical applications. The human response to implanted titanium parts is strongly related to the implant surface conditions. The aim of this paper is to present an experimental investigation of the electrochemically machined surface characteristics obtained on titanium, utilizing a developed cross-flow electrolyte system. It is observed that applied voltage and electrolyte flow rate are among the influential parameters for attaining the desired characteristics on the machined surface. An attempt has been made to develop a surface with a self-generated oxide layer, which helps improve the corrosion and chemical resistance of titanium implants in biomedical applications. The surface roughness of the oxide-layered machined surface was within 2.4 µm to 2.93 µm, which is within the acceptable range for functional attachment between bone and implant.

  12. Acoustic signal processing toolbox for array processing

    NASA Astrophysics Data System (ADS)

    Pham, Tien; Whipps, Gene T.

    2003-08-01

    The US Army Research Laboratory (ARL) has developed an acoustic signal processing toolbox (ASPT) for acoustic sensor array processing. The intent of this document is to describe the toolbox and its uses. The ASPT is a GUI-based software that is developed and runs under MATLAB. The current version, ASPT 3.0, requires MATLAB 6.0 and above. ASPT contains a variety of narrowband (NB) and incoherent and coherent wideband (WB) direction-of-arrival (DOA) estimation and beamforming algorithms that have been researched and developed at ARL. Currently, ASPT contains 16 DOA and beamforming algorithms. It contains several different NB and WB versions of the MVDR, MUSIC and ESPRIT algorithms. In addition, there are a variety of pre-processing, simulation and analysis tools available in the toolbox. The user can perform simulation or real data analysis for all algorithms with user-defined signal model parameters and array geometries.
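
    As an example of the algorithm families ASPT covers, the following is a hedged sketch of a narrowband MVDR spatial-spectrum scan for a uniform linear array; it is not ARL's implementation, and the element count, spacing, source angle, and noise level are arbitrary test values.

    ```python
    # Narrowband MVDR direction-of-arrival scan for a uniform linear array.
    import numpy as np

    def mvdr_spectrum(X, d_over_lambda=0.5, angles=np.linspace(-90, 90, 181)):
        n_sensors = X.shape[0]
        R = X @ X.conj().T / X.shape[1]                       # sample covariance
        R_inv = np.linalg.inv(R + 1e-6 * np.eye(n_sensors))   # light diagonal loading
        p = []
        for theta in np.deg2rad(angles):
            a = np.exp(-2j * np.pi * d_over_lambda * np.arange(n_sensors) * np.sin(theta))
            p.append(1.0 / np.real(a.conj() @ R_inv @ a))     # MVDR spatial spectrum
        return angles, np.array(p)

    # simulate one narrowband source at 20 degrees on an 8-element array
    n, snaps = 8, 400
    a_src = np.exp(-2j * np.pi * 0.5 * np.arange(n) * np.sin(np.deg2rad(20)))
    X = np.outer(a_src, np.exp(2j * np.pi * 0.1 * np.arange(snaps)))
    X += 0.1 * (np.random.randn(n, snaps) + 1j * np.random.randn(n, snaps))
    ang, p = mvdr_spectrum(X)
    print(ang[np.argmax(p)])                                  # peak near 20 degrees
    ```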

  13. Techniques of EMG signal analysis: detection, processing, classification and applications

    PubMed Central

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694

  14. SAR processing using SHARC signal processing systems

    NASA Astrophysics Data System (ADS)

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.

    1998-09-01

    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized either day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

  15. BioSigPlot: an opensource tool for the visualization of multi-channel biomedical signals with Matlab.

    PubMed

    Boudet, Samuel; Peyrodie, Laurent; Gallois, Philippe; de l'Aulnoit, Denis Houzé; Cao, Hua; Forzy, Gérard

    2013-01-01

    This paper presents Matlab-based software (MathWorks Inc.) called BioSigPlot for the visualization of multi-channel biomedical signals, particularly the EEG. This tool is designed for researchers in both engineering and medicine who have to collaborate to visualize and analyze signals. It aims to provide a highly customizable interface for signal processing experimentation, in order to plot several kinds of signals while integrating the common tools for physicians. The main advantages compared to other existing programs are multi-dataset display, synchronization with video, and online processing. On top of that, this program uses object-oriented programming, so that the interface can be controlled by both graphic controls and command lines. It can be used as an EEGlab plug-in but, since it is not limited to EEG, it is distributed separately. BioSigPlot is distributed free of charge (http://biosigplot.sourceforge.net), under the terms of the GNU Public License for non-commercial use and open-source development.

  16. Interactive Processing and Visualization of Image Data forBiomedical and Life Science Applications

    SciTech Connect

    Staadt, Oliver G.; Natarjan, Vijay; Weber, Gunther H.; Wiley,David F.; Hamann, Bernd

    2007-02-01

    Background: Applications in biomedical science and life science produce large data sets using increasingly powerful imaging devices and computer simulations. It is becoming increasingly difficult for scientists to explore and analyze these data using traditional tools. Interactive data processing and visualization tools can help scientists overcome these limitations. Results: We show that new data processing tools and visualization systems can be used successfully in biomedical and life science applications. We present an adaptive high-resolution display system suitable for biomedical image data, algorithms for analyzing and visualizing protein surfaces and retinal optical coherence tomography data, and visualization tools for 3D gene expression data. Conclusion: We demonstrate that interactive processing and visualization methods and systems can support scientists in a variety of biomedical and life science application areas concerned with massive data analysis.

  17. Signal processing of anthropometric data

    NASA Astrophysics Data System (ADS)

    Zimmermann, W. J.

    1983-09-01

    The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data is very noisy and therefore requires the application of signal processing schemes. Moreover, it was not regarded as time series measurements but as positional information; hence, the data is stored as coordinate points defined by the motion of the human body. The accumulated data defines two groups or classes. Some of the data was collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data was collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data does not include time, this package does not include a time series element. Presently, processing is restricted to data obtained from those experiments designed to measure flexibility.

  18. Signal processing of anthropometric data

    NASA Technical Reports Server (NTRS)

    Zimmermann, W. J.

    1983-01-01

    The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data is very noisy and therefore requires the application of signal processing schemes. Moreover, it was not regarded as time series measurements but as positional information; hence, the data is stored as coordinate points defined by the motion of the human body. The accumulated data defines two groups or classes. Some of the data was collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data was collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data does not include time, this package does not include a time series element. Presently, processing is restricted to data obtained from those experiments designed to measure flexibility.

  19. Line scan CCD image processing for biomedical application

    NASA Astrophysics Data System (ADS)

    Lee, Choon-Young; Yan, Lei; Lee, Sang-Ryong

    2010-02-01

    Blood samples are frequently analyzed for blood disorders or other diseases in research and clinical applications. Most of the analyses are related to blood cell counts and blood cell sizes. In this paper, a line scan CCD imaging system is developed, based on Texas Instruments' TMS320C6416T (DSP6416), a high-performance digital signal processor, and Altera's Field Programmable Gate Array (FPGA) EP3C25F324. It is used to acquire and process images of blood cells in order to count, size, and position the cells. The cell image is captured by a line scan CCD sensor, and the digital image data converted by the Analogue Front-End (AFE) are transferred into the FPGA; after pre-processing, they are transferred into the DSP6416 through the First In First Out (FIFO) interface in the FPGA and the External Memory Interface (EMIF) of the DSP6416. The image data are then processed in the DSP6416. Experimental results show that this system is useful and efficient.

  20. Highly Parallel Modern Signal Processing.

    DTIC Science & Technology

    1982-02-28

    looked at the application of these techniques to systems with coherent speckle noise, such as synthetic aperture (SAR) imagery, coherent sonar and... partitioned matrix inversion, computation of cross-ambiguity functions, formation of outer products and skewed outer products, and multiplication of... operations are multiplication, inversion, and L-U decomposition. In signal processing such operations can be found in adaptive filtering, data

  1. Signal processing for smart cards

    NASA Astrophysics Data System (ADS)

    Quisquater, Jean-Jacques; Samyde, David

    2003-06-01

    In 1998, Paul Kocher showed that when a smart card computes cryptographic algorithms, for signatures or encryption, its power consumption or its radiation leaks information. The keys or the secrets hidden in the card can then be recovered using a differential measurement based on the intercorrelation function. A lot of silicon manufacturers use desynchronization countermeasures to defeat power analysis. In this article we detail a new resynchronization technique. This method can be used to facilitate the use of a neural network for code recognition, making it possible to reverse engineer a software code automatically. Using data and clock separation methods, we show how to optimize the synchronization using signal processing. Then we compare these methods with watermarking methods for 1D and 2D signals. The very latest watermarking detection improvements can be applied to signal processing for smart cards with very few modifications. Bayesian processing is one of the best ways to do Differential Power Analysis, and it is possible to extract a PIN code from a smart card in very few samples. So this article shows the need to continue to set up effective countermeasures for cryptographic processors. Although the idea of using advanced signal processing operators has been commonly known for a long time, no publication has explained what results can be obtained. The main idea of differential measurement is to use the cross-correlation of two random variables and to repeat consumption measurements on the processor to be analyzed. We use two processors clocked at the same external frequency and computing the same data. The applications of our design are numerous. Two measurements provide the inputs of a central operator. With the most accurate operator we can improve the signal-to-noise ratio, re-synchronize the acquisition clock with the internal one, or remove jitter. The analysis based on consumption or electromagnetic measurements can be improved using our structure. At first sight

  2. Refined Composite Multiscale Dispersion Entropy and its Application to Biomedical Signals.

    PubMed

    Azami, Hamed; Rostaghi, Mostafa; Abasolo, Dani; Escudero, Javier

    2017-03-08

    We propose a novel complexity measure to overcome the deficiencies of the widespread and powerful multiscale entropy (MSE), namely that MSE values may be undefined for short signals and that MSE is too slow for real-time applications. We introduce multiscale dispersion entropy (MDE) as a very fast and powerful method to quantify the complexity of signals. MDE is based on our recently developed dispersion entropy (DisEn), which has a computational cost of O(N), compared with O(N^2) for the sample entropy used in MSE. We also propose the refined composite MDE (RCMDE) to improve the stability of MDE. We evaluate MDE, RCMDE, and refined composite MSE (RCMSE) on synthetic signals and three biomedical datasets. The MDE, RCMDE, and RCMSE methods show similar results, although MDE and RCMDE are faster, lead to more stable results, and discriminate different types of physiological signals better than MSE and RCMSE. For noisy short and long time series, MDE and RCMDE are noticeably more stable than MSE and RCMSE, respectively. For short signals, MDE and RCMDE, unlike MSE and RCMSE, do not lead to undefined values. The proposed MDE and RCMDE are significantly faster than MSE and RCMSE, especially for long signals, and lead to larger differences between physiological conditions known to alter the complexity of the physiological recordings. MDE and RCMDE are expected to be useful for the analysis of physiological signals thanks to their ability to distinguish different types of dynamics. The Matlab codes used in this paper are freely available at http://dx.doi.org/10.7488/ds/1982.
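
    For orientation, a minimal single-scale dispersion entropy sketch following the general recipe (normal-CDF mapping to c classes, embedding vectors of length m, Shannon entropy of the dispersion-pattern histogram) is shown below; the parameter values are typical defaults and this is not the authors' released Matlab implementation.

    ```python
    # Single-scale, normalized dispersion entropy of a 1D signal.
    import numpy as np
    from scipy.stats import norm

    def dispersion_entropy(x, m=2, c=6, delay=1):
        y = norm.cdf(x, loc=np.mean(x), scale=np.std(x))       # map samples to (0, 1)
        z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)   # classes 1..c
        patterns = np.array([z[i:i + (m - 1) * delay + 1:delay]
                             for i in range(len(z) - (m - 1) * delay)])
        # encode each length-m pattern as one integer and histogram it
        codes = patterns @ (c ** np.arange(m))
        _, counts = np.unique(codes, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p)) / np.log(c ** m)         # normalized DisEn

    print(dispersion_entropy(np.random.randn(2000)))           # close to 1 for white noise
    ```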

  3. Signal processing in eukaryotic chemotaxis

    NASA Astrophysics Data System (ADS)

    Segota, Igor; Rachakonda, Archana; Franck, Carl

    2013-03-01

    Unlike inanimate condensed matter, living cells depend upon the detection of chemical signals for their existence. First, we experimentally determined the chemotaxis response of eukaryotic Dictyostelium cells to static folic acid gradients and show that they can respond to gradients as shallow as 0.2% across the cell body. Second, using Shannon's information theory, we showed that the information cells receive about the gradient exceeds the theoretically predicted information at the receptor-ligand binding step, resulting in the violation of the data processing inequality. Finally, we analyzed how eukaryotic cells can affect the gradient signals by secreting enzymes that degrade the signal. We analyzed this effect with a focus on a well described Dictyostelium cAMP chemotaxis system where cAMP signals are affected by an extracellular cAMP phosphodiesterase (PDE) and its inhibitor (PDI). Using a reaction-diffusion model of this set of interactions in the extracellular space, we show that cells can effectively sense much steeper chemical gradients than naively expected (up to a factor of 12). We also found that the rough estimates of experimental PDE and PDI secretion rates are close to the optimal values for gradient sensing as predicted by our model.

  4. Nanotubes for noisy signal processing

    NASA Astrophysics Data System (ADS)

    Lee, Ian Yenyin

    Nanotubes can process noisy signals. We present two central results in support of this general thesis and make an informed extrapolation that uses nanotubes to improve body armor. The first result is that noise can help nanotubes detect weak signals. The finding confirmed a stochastic-resonance theoretical prediction that noise can enhance detection at the nano-level. Laboratory experiments with nanotubes showed that three types of noise improved three measures of detection. Small amounts of Gaussian, uniform, and Cauchy additive white noise increased mutual-information, cross-correlation, and bit-error-rate measures before degrading them with further increases in noise. Nanotubes can apply this noise-enhancement and nanotube electrical and mechanical properties to improve signal processing. Similar noise enhancement may benefit a proposed nanotube-array cochlear-model spectral processing. The second result is that nanotube antennas can directly detect narrowband electromagnetic (EM) signals. The finding showed that nanotube and thin-wire dipoles are similar: They are resonant and narrowband and can implement linear-array designs if the EM waves in the nanotubes propagate at or near the free-space velocity of light. The nanotube-antenna prediction is based on a Fresnel-zone or near-zone analysis of antenna impedance using a quantum-conductor model. The analysis also predicts a failure to resonate if the nanotube EM-wave propagation is much slower than free-space light propagation. We extrapolate based on applied and theoretical analysis of body armor. Field experiments used a baseball comparison and statistical and other techniques to model body-armor bruising effects. A baseball comparison showed that a large caliber handgun bullet can hit an armored chest as hard as a fast baseball can hit a bare chest. Adaptive fuzzy systems learned to predict a bruise profile directly from the experimental data and also from statistical analysis of the data. Nanotube signal

  5. Nuclear sensor signal processing circuit

    DOEpatents

    Kallenbach, Gene A.; Noda, Frank T.; Mitchell, Dean J.; Etzkin, Joshua L.

    2007-02-20

    An apparatus and method are disclosed for a compact and temperature-insensitive nuclear sensor that can be calibrated with a non-hazardous radioactive sample. The nuclear sensor includes a gamma ray sensor that generates tail pulses from radioactive samples. An analog conditioning circuit conditions the tail-pulse signals from the gamma ray sensor, and a tail-pulse simulator circuit generates a plurality of simulated tail-pulse signals. A computer system processes the tail pulses from the gamma ray sensor and the simulated tail pulses from the tail-pulse simulator circuit. The nuclear sensor is calibrated under the control of the computer. The offset is adjusted using the simulated tail pulses. Since the offset is set to zero or near zero, the sensor gain can be adjusted with a non-hazardous radioactive source such as, for example, naturally occurring radiation and potassium chloride.

  6. Arguing effectiveness of biomedical signal acquisition devices using colored Petri Nets models and assurance cases in GSN: an ECG case study.

    PubMed

    Sobrinho, Alvaro; Cunha, Paulo; Dias da Silva, Leandro; Perkusich, Angelo; Cordeiro, Thiago; Segundo, Jarbas

    2016-08-01

    Reported cases of adverse events and product recalls expose limitations of biomedical signal acquisition devices. Approximately ninety percent of the 1,210 recalls reported by the US Food and Drug Administration (FDA) between 2006 and 2011 were of class 2 devices such as electrocardiography (ECG) devices. We show in this paper how manufacturers of biomedical signal acquisition devices can argue the effectiveness of these devices using Colored Petri Net (CPN) models and assurance cases in Goal Structuring Notation (GSN), by means of an ECG case study. We illustrate how CPN models are used to generate effectiveness evidence to be presented during certification. In this context, we use assurance cases in GSN to present the evidence arguing the effectiveness of the device. Based on the ECG case study, we conclude that the use of CPN models of devices can decrease costs and development time, since manufacturers can reuse them during the development and certification process.

  7. A 1V low power second-order delta-sigma modulator for biomedical signal application.

    PubMed

    Hsu, Chih-Han; Tang, Kea-Tiong

    2013-01-01

    This paper presents the design and implementation of a low-power delta-sigma modulator for biomedical applications in a standard 90 nm CMOS technology. The delta-sigma modulator is implemented as a 2nd-order feedforward architecture. A low quiescent current operational transconductance amplifier (OTA) is utilized to reduce power consumption. The delta-sigma modulator operates from a 1 V power supply and achieves a 64.87 dB signal-to-noise-and-distortion ratio (SNDR) over a 10 kHz bandwidth with an oversampling ratio (OSR) of 64. The power consumption is 17.14 µW, and the figure of merit (FOM) is 0.60 pJ/conv.
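
    The quoted figure of merit is consistent with the common definition FOM = P / (2 BW 2^ENOB), assuming that is the definition used; a worked check from the reported numbers:

    ```latex
    \mathrm{ENOB} = \frac{\mathrm{SNDR} - 1.76}{6.02}
                  = \frac{64.87 - 1.76}{6.02} \approx 10.48\ \text{bits},
    \qquad
    \mathrm{FOM} = \frac{P}{2\,\mathrm{BW}\cdot 2^{\mathrm{ENOB}}}
                 = \frac{17.14\ \mu\mathrm{W}}{2 \cdot 10\ \mathrm{kHz}\cdot 2^{10.48}}
                 \approx 0.60\ \mathrm{pJ/conversion}.
    ```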

  8. A new algorithm for quadratic sample entropy optimization for very short biomedical signals: application to blood pressure records.

    PubMed

    Cirugeda-Roldán, E M; Cuesta-Frau, D; Miró-Martínez, P; Oltra-Crespo, S; Vigil-Medina, L; Varela-Entrecanales, M

    2014-05-01

    This paper describes a new method to optimize the computation of the quadratic sample entropy (QSE) metric. The objective is to enhance its segmentation capability between pathological and healthy subjects for short and unevenly sampled biomedical records, like those obtained using ambulatory blood pressure monitoring (ABPM). In ABPM, blood pressure is measured every 20-30 min during 24 h while patients undergo normal daily activities. ABPM is indicated for a number of applications such as white-coat, suspected, borderline, or masked hypertension. Hypertension is a very important clinical issue that can lead to serious health implications, and therefore its identification and characterization is of paramount importance. Nonlinear processing of signals by means of entropy calculation algorithms has been used in many medical applications to distinguish among signal classes. However, most of these methods do not perform well if the records are not long enough and/or not uniformly sampled. That is the case for ABPM records. These signals are extremely short and scattered with outliers or missing/resampled data. This is why ABPM blood pressure signal screening using nonlinear methods is a quite unexplored field. We propose an additional stage for the computation of QSE independently of its parameter r and the input signal length. This enabled us to apply a segmentation process to ABPM records successfully. The experimental dataset consisted of 61 blood pressure records of control and pathological subjects with only 52 samples per time series. The entropy estimation values obtained led to the segmentation of the two groups, while other standard nonlinear methods failed.
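
    For reference, quadratic sample entropy is commonly defined as sample entropy plus ln(2r), which removes the direct dependence of the value on the tolerance r; the sketch below computes that quantity on a toy 52-sample series and does not reproduce the paper's additional optimization stage for r.

    ```python
    # Quadratic sample entropy sketch: SampEn(m, r) + ln(2r).
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        x = np.asarray(x, dtype=float)
        r = r * np.std(x)                                    # tolerance in signal units
        def count_matches(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            # count template pairs (i != j) within r in the Chebyshev sense
            return np.sum(d <= r) - len(templates)
        B, A = count_matches(m), count_matches(m + 1)        # assumes A, B > 0 for this toy data
        return -np.log(A / B), r

    bp = 100 + 10 * np.random.randn(52)                      # toy 52-sample ABPM-like series
    sampen, r_abs = sample_entropy(bp)
    qse = sampen + np.log(2 * r_abs)                         # quadratic sample entropy
    print(qse)
    ```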

  9. Digitally Controlled Analog Signal Processing

    DTIC Science & Technology

    1988-04-01

    A discussion of alternative state-of-the-art approaches to monolithic continuous-time signal processing can be... v_i(t) are the input and output samples which are simultaneously measured at time t. There are three unknowns in this expression; the maximum input and... Unfortunately the current OpAmp bandwidth of 30 MHz is near state-of-the-art limits. (2) The finite voltage-dependent on-resistance of S, distorted

  10. Signal Processing Expert Code (SPEC)

    SciTech Connect

    Ames, H.S.

    1985-12-01

    The purpose of this paper is to describe a prototype expert system called SPEC which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general-purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750 and consists of a backward-chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that expert systems can be used to control existing codes.

  11. Decomposition of biomedical signals in spatial and time-frequency modes.

    PubMed

    Gratkowski, M; Haueisen, J; Arendt-Nielsen, L; Cn Chen, A; Zanow, F

    2008-01-01

    The purpose of this paper is to introduce a new method for spatial-time-frequency analysis of multichannel biomedical data. We exemplify the method for data recorded with a 31-channel Philips biomagnetometer. The method creates approximations and decompositions of spatiotemporal signal distributions using elements (atoms) chosen from a very large and redundant set (dictionary). The method is based on the Matching Pursuit algorithm, but it uses atoms that are distributed both in time and space (instead of only time-distributed atoms as in standard Matching Pursuit). The time-frequency distribution of signal components is described by Gabor atoms and their spatial distribution is modeled by spatial modes. The spatial modes are created with the help of Bessel functions. Two versions of the method, differing in the definition of the spatial properties of the atoms, are presented. The technique was validated on simulated data and real magnetic field data. It was used for removal of powerline noise from multichannel magnetoencephalography data, extraction of high-frequency somatosensory evoked fields and for separation of partially overlapping T- and U-waves in magnetocardiography. The method allows for parameterization of multichannel data in the time-frequency as well as in the spatial domains. It can thus be used for signal-preserving filtering simultaneously in time, frequency, and space. Applications include, for example, the elimination of artifact components, extraction of components with biological meaning, and data exploration.
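
    The time-frequency core of the approach, a greedy matching-pursuit loop over a Gabor dictionary, can be sketched compactly; the spatial modes built from Bessel functions are not reproduced here, and the dictionary granularity and test signal below are arbitrary choices.

    ```python
    # Tiny matching pursuit over a discrete real Gabor dictionary (time only).
    import numpy as np

    def gabor_atom(n, center, freq, width):
        t = np.arange(n)
        g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
        return g / np.linalg.norm(g)

    def matching_pursuit(x, n_atoms=5):
        n = len(x)
        dictionary = [gabor_atom(n, c, f, w)
                      for c in range(0, n, n // 8)
                      for f in np.linspace(0.01, 0.4, 20)
                      for w in (8, 16, 32)]
        residual, parts = x.astype(float).copy(), []
        for _ in range(n_atoms):
            coeffs = [residual @ g for g in dictionary]
            best = int(np.argmax(np.abs(coeffs)))
            parts.append((coeffs[best], best))
            residual = residual - coeffs[best] * dictionary[best]   # greedy update
        return parts, residual

    x = 3 * gabor_atom(512, 256, 0.1, 16) + 0.1 * np.random.randn(512)
    parts, res = matching_pursuit(x)
    print(np.linalg.norm(res) / np.linalg.norm(x))    # residual energy fraction drops quickly
    ```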

  12. Photonic signal processing for biomedical and industrial ultrasonic probes

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.

    1996-12-01

    Ultrasonics has been widely used in medical, industrial, and scientific applications. In medical applications, ultrasonics is an essential diagnostic method in internal medicine, urology, and vascular surgery. High-Intensity Focussed Ultrasound (HIFU) and lithotripsy applications use relatively low ultrasonic frequencies (< 100 kHz), while a 5-15 MHz band is typically used in diagnostic external cavity imaging ultrasound. Today, with endoscopic applications in mind, a very high ultrasonic frequency probe, e.g., 100 MHz, with high (> 50%) instantaneous bandwidth is highly desirable, as higher frequencies give higher imaging resolution and smaller physical dimensions of the front-end intracavity transducer array. It is desirable to have an ultrasonic energy control system that, with minimal hardware change, is compact and can operate over wide tunable and instantaneous bandwidths, a requirement for different ultrasonic medical modes. Today, passive fiber optics (FOs) coupled with active photonic devices could lead to this multi-band, ultra-compact ultrasonic system. Hence, we have put forth perhaps the first proposal using photonic beamforming and fiber remoting of the front-end ultrasonic probe, for both narrowband [1] and wideband [2,3] ultrasonic arrays.

  13. Reengineering the biomedical-equipment procurement process through an integrated management information system.

    PubMed

    Larios, Y G; Matsopoulos, G K; Askounis, D T; Nikita, K S

    2000-01-01

    The design of each new hospital site is typically preceded by decisions on the most appropriate level of biomedical equipment, which significantly influences the layout of the hospital departments under construction. The most appropriate biomedical equipment should ideally be decided upon considering a series of demographic and social parameters of the hospital and international regulations and standards. This information should ultimately be distilled into proper technical specifications. This paper proposes a streamlined management process related to the procurement of biomedical equipment for new hospital sites or for those under expansion. The new management process aims to increase the efficiency of the experts involved in the definition of the most appropriate level of equipment and its technical specifications. It also addresses all aspects of the biomedical equipment-selection cycle, including the evaluation of the bids submitted by equipment suppliers. The proposed process is assisted by a management information system, which integrates all related data-handling operations. It provides extensive decision-support facilities to the expert and a platform for the support of knowledge re-use in the field of biomedical-equipment selection.

  14. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.

  15. Biomolecular filters for improved separation of output signals in enzyme logic systems applied to biomedical analysis.

    PubMed

    Halámek, Jan; Zhou, Jian; Halámková, Lenka; Bocharova, Vera; Privman, Vladimir; Wang, Joseph; Katz, Evgeny

    2011-11-15

    Biomolecular logic systems processing biochemical input signals and producing "digital" outputs in the form of YES/NO were developed for analysis of physiological conditions characteristic of liver injury, soft tissue injury, and abdominal trauma. Injury biomarkers were used as input signals for activating the logic systems. Their normal physiological concentrations were defined as logic-0 level, while their pathologically elevated concentrations were defined as logic-1 values. Since the input concentrations applied as logic 0 and 1 values were not sufficiently different, the output signals being at low and high values (0, 1 outputs) were separated with a short gap making their discrimination difficult. Coupled enzymatic reactions functioning as a biomolecular signal processing system with a built-in filter property were developed. The filter process involves a partial back-conversion of the optical-output-signal-yielding product, but only at its low concentrations, thus allowing the proper discrimination between 0 and 1 output values.

  16. Generation of open biomedical datasets through ontology-driven transformation and integration processes.

    PubMed

    Carmen Legaz-García, María Del; Miñarro-Giménez, José Antonio; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2016-06-03

    Biomedical research usually requires combining large volumes of data from multiple heterogeneous sources, which makes the integrated exploitation of such data difficult. The Semantic Web paradigm offers a natural technological space for data integration and exploitation by generating content readable by machines. Linked Open Data is a Semantic Web initiative that promotes the publication and sharing of data in machine-readable semantic formats. We present an approach for the transformation and integration of heterogeneous biomedical data with the objective of generating open biomedical datasets in Semantic Web formats. The transformation of the data is based on the mappings between the entities of the data schema and the ontological infrastructure that provides the meaning to the content. Our approach permits different types of mappings and includes the possibility of defining complex transformation patterns. Once the mappings are defined, they can be automatically applied to datasets to generate logically consistent content, and the mappings can be reused in further transformation processes. The results of our research are (1) a common transformation and integration process for heterogeneous biomedical data; (2) the application of Linked Open Data principles to generate interoperable, open, biomedical datasets; (3) a software tool, called SWIT, that implements the approach. In this paper we also describe how we have applied SWIT in different biomedical scenarios and some lessons learned. We have presented an approach that is able to generate open biomedical repositories in Semantic Web formats. SWIT is able to apply the Linked Open Data principles in the generation of the datasets, thus allowing their content to be linked to external repositories and creating linked open datasets. SWIT datasets may contain data from multiple sources and schemas, thus becoming integrated datasets.

  17. Toolkits and Software for Developing Biomedical Image Processing and Analysis Applications

    NASA Astrophysics Data System (ADS)

    Wolf, Ivo

    Solutions in biomedical image processing and analysis usually consist of much more than a single method. Typically, a whole pipeline of algorithms is necessary, combined with visualization components to display and verify the results as well as possibilities to interact with the data. Therefore, successful research in biomedical image processing and analysis requires a solid base to start from. This is the case regardless of whether the goal is the development of a new method (e.g., for segmentation) or the solution of a specific task (e.g., computer-assisted planning of surgery).

  18. A low power biomedical signal processor ASIC based on hardware software codesign.

    PubMed

    Nie, Z D; Wang, L; Chen, W G; Zhang, T; Zhang, Y T

    2009-01-01

    A low-power biomedical digital signal processor ASIC based on a hardware/software codesign methodology is presented in this paper. The codesign methodology was used to achieve higher system performance and design flexibility. The hardware implementation includes a low-power 32-bit RISC CPU (ARM7TDMI), a low-power AHB-compatible bus, and a scalable digital co-processor optimized for low-power Fast Fourier Transform (FFT) calculations. The co-processor can be scaled for 8-point, 16-point and 32-point FFTs, taking approximately 50, 100 and 150 clock cycles, respectively. The complete design was intensively simulated using the ARM DSM model and was emulated on the ARM Versatile platform before being committed to silicon. The multi-million-gate ASIC was fabricated using SMIC 0.18 µm mixed-signal CMOS 1P6M technology. The die measures 5,000 µm x 2,350 µm. The power consumption is approximately 3.6 mW at a 1.8 V power supply and a 1 MHz clock rate. The power consumption for FFT calculations is less than 1.5% compared with a conventional embedded software-based solution.

  19. Studies in statistical signal processing

    NASA Astrophysics Data System (ADS)

    Kailath, Thomas

    1990-06-01

    The primary objective of our research is to develop efficient and numerically stable algorithms for nonstationary signal processing problems by understanding and exploiting special structures, both deterministic and stochastic, in the problems. We also strive to establish and broaden links with related disciplines, such as cascade filter synthesis, scattering theory, numerical linear algebra, and mathematical operator theory for the purpose of cross fertilization of ideas and techniques. These explorations have led to new results both in estimation theory and in these other fields, e.g., to new algorithms for triangular and QR factorization of structured matrices, new techniques for root location and stability testing, new realizations for multiple-input/multiple-output (MIMO) transfer functions, and new recursions for orthogonal polynomials on the unit circle and the real line as well as on other curves.

  20. MEMD-enhanced multivariate fuzzy entropy for the evaluation of complexity in biomedical signals.

    PubMed

    Azami, Hamed; Smith, Keith; Escudero, Javier

    2016-08-01

    Multivariate multiscale entropy (mvMSE) has been proposed as a combination of the coarse-graining process and multivariate sample entropy (mvSE) to quantify the irregularity of multivariate signals. However, both the coarse-graining process and mvSE may not be reliable for short signals. Although the coarse-graining process can be replaced with multivariate empirical mode decomposition (MEMD), the relative instability of mvSE for short signals remains a problem. Here, we address this issue by proposing multivariate fuzzy entropy (mvFE) with a new fuzzy membership function. The results using white Gaussian noise show that mvFE leads to more reliable and stable results, especially for short signals, in comparison with mvSE. Accordingly, we propose MEMD-enhanced mvFE to quantify the complexity of signals. The characteristics of brain regions influenced by partial epilepsy are investigated using focal and non-focal electroencephalogram (EEG) time series. To this end, the proposed MEMD-enhanced mvFE and mvSE are employed to discriminate focal EEG signals from non-focal ones. The results demonstrate that the MEMD-enhanced mvFE values have a smaller coefficient of variation in comparison with those obtained by MEMD-enhanced mvSE, even for long signals. The results also show that the MEMD-enhanced mvFE performs better at quantifying focal and non-focal signals than multivariate multiscale permutation entropy.
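
    As a rough, univariate illustration of the fuzzy-membership idea (not the full multivariate, MEMD-enhanced method of the paper), the sketch below replaces the hard similarity count of sample entropy with a smooth exponential membership function; the embedding parameters and membership form are common choices assumed here for illustration.

```python
# Univariate fuzzy entropy sketch: similarity between embedded vectors is graded
# by exp(-(d**p)/r) instead of a hard threshold, which stabilizes short records.
import numpy as np

def fuzzy_entropy(x, m=2, r=0.2, p=2):
    x = np.asarray(x, dtype=float)
    r = r * np.std(x)

    def phi(dim):
        n = len(x) - dim
        # embedded vectors with their own mean removed (local baseline removal)
        vectors = np.array([x[i:i + dim] for i in range(n)])
        vectors = vectors - vectors.mean(axis=1, keepdims=True)
        # pairwise Chebyshev distances and fuzzy similarity degrees
        d = np.max(np.abs(vectors[:, None, :] - vectors[None, :, :]), axis=2)
        sim = np.exp(-(d ** p) / r)
        np.fill_diagonal(sim, 0.0)               # exclude self-matches
        return sim.sum() / (n * (n - 1))

    return np.log(phi(m)) - np.log(phi(m + 1))

rng = np.random.default_rng(0)
print(round(fuzzy_entropy(rng.standard_normal(300)), 3))
```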

  1. Computer-Based Clinical Instrumentation for Processing and Analysis of Mechanically Evoked Electromyographic Signals in the Upper Limb

    DTIC Science & Technology

    2007-11-02

    motor responses in Carpal Tunnel Syndrome (CTS) patients, it could equally well be modified to allow acquisition, processing and analysis of EMG...instrumentation; Biomedical signal processing; AMLAB; KIN-COM, Carpal Tunnel Syndrome, Movement analysis. I. INTRODUCTION Advances in understanding the... Carpal Tunnel Syndrome, it could equally well be modified to allow acquisition, processing, and analysis of EMG and other movement signals in

  2. Chaos-Based Signal Processing

    NASA Astrophysics Data System (ADS)

    Ogorzałek, Maciej J.

    2002-07-01

    Nonlinear systems exhibiting chaotic behavior can be considered as a source of a great variety of signals. Given a time series measured from a known or an unknown dynamical system, we address a series of problems, such as: section-wise approximation of the measured signal by pieces of trajectories from a chosen nonlinear dynamical system (model); signal restoration when the measured signal has been corrupted, e.g., by quantization; and signal coding and compression. The key to attacking these problems is estimation of the initial conditions for a dynamical system which is used as the generator of approximating waveforms.

  3. Texture in Biomedical Images

    NASA Astrophysics Data System (ADS)

    Petrou, Maria

    An overview of texture analysis methods is given and the merits of each method for biomedical applications are discussed. Methods discussed include Markov random fields, Gibbs distributions, co-occurrence matrices, Gabor functions and wavelets, Karhunen-Loève basis images, and local symmetry and orientation from the monogenic signal. Some example applications of texture to medical image processing are reviewed.
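
    As one concrete example of the texture descriptors listed above, the sketch below computes gray-level co-occurrence matrix (GLCM) features with scikit-image on a synthetic patch; in recent scikit-image releases the functions are named graycomatrix and graycoprops (older versions spell them greycomatrix/greycoprops), and the patch and parameters are illustrative.

```python
# Illustrative GLCM texture features on a synthetic patch with scikit-image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
patch = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)   # toy 6-bit "image"

glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                    levels=64, symmetric=True, normed=True)
print("contrast:   ", graycoprops(glcm, "contrast"))
print("homogeneity:", graycoprops(glcm, "homogeneity"))
```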

  4. Methods of processing biomedical image of retinal macular region of the eye

    NASA Astrophysics Data System (ADS)

    Pavlov, S. V.; Vassilenko, V. B.; Saldan, I. R.; Vovkotrub, D. V.; Poplavskaya, A. A.; Kuzin, O. O.

    2016-09-01

    The paper examines tomograms of the retina obtained with the STRATUS OCT 3000 coherent optical tomographic scanner. The efficiency of processing biomedical images of this class with the standard tomographic procedure was investigated, and a new approach to determining the macular area of the retina in the acquired tomograms was developed using a purpose-built program.

  5. New Windows based Color Morphological Operators for Biomedical Image Processing

    NASA Astrophysics Data System (ADS)

    Pastore, Juan; Bouchet, Agustina; Brun, Marcel; Ballarin, Virginia

    2016-04-01

    Morphological image processing is well known as an efficient methodology for image processing and computer vision. With the wide use of color in many areas, interest in color perception and processing has been growing rapidly. Many models have been proposed to extend morphological operators to the field of color images, dealing with new problems not present in the binary and gray-level contexts. These solutions usually deal with the lattice structure of the color space, or provide it with total orders, in order to define basic operators with the required properties. In this work we propose a new locally defined ordering, in the context of window-based morphological operators, for the definition of erosion-like and dilation-like operators, which provides the same desired properties expected from color morphology while avoiding some of the drawbacks of prior approaches. Experimental results show that the proposed color operators can be efficiently used for color image processing.

  6. Commercial applications in biomedical processing in the microgravity environment

    NASA Astrophysics Data System (ADS)

    Johnson, Terry C.; Taub, Floyd

    1995-01-01

    A series of studies have shown that a purified cell regulatory sialoglycopeptide (CeReS) that arrests cell division and induces cellular differentiation is fully capable of functionally interacting with target insect and mammalian cells in the microgravity environment. Data from several shuttle missions suggest that the signal transduction events that are known to be associated with CeReS action function as well in microgravity as in ground-based experiments. The molecular events known to be associated with CeReS include an ability to interfere with Ca2+ metabolism, the subsequent alkalinization of cell cytosol, and the inhibition of the phosphorylation of the nuclear protein product encoded by the retinoblastoma (RB) gene. The ability of CeReS to function in microgravity opens a wide variety of applications in space life sciences.

  7. Remembrance of time series past: simple chromatic method for visualizing trends in biomedical signals.

    PubMed

    Burykin, Anton; Mariani, Sara; Henriques, Teresa; Silva, Tiago F; Schnettler, William T; Costa, Madalena D; Goldberger, Ary L

    2015-07-01

    Analysis of biomedical time series plays an essential role in clinical management and basic investigation. However, conventional monitors streaming data in real-time show only the most recent values, not referenced to past dynamics. We describe a chromatic approach to bring the 'memory' of the physiologic system's past behavior into the current display window. The method employs the estimated probability density function of a time series segment to colorize subsequent data points. For illustrative purposes, we selected open-access recordings of continuous: (1) fetal heart rate during the pre-partum period, and (2) heart rate and systemic blood pressure from a critical care patient during a spontaneous breathing trial. The colorized outputs highlight changes from the 'baseline' reference state, the latter defined as the mode value assumed by the signal, i.e. the maximum of its probability density function. A colorization method may facilitate the recognition of relevant features of time series, especially shifts in baseline dynamics and other trends (including transient and longer-term deviation from baseline values) which may not be as readily noticed using traditional displays. This method may be applicable in clinical monitoring (real-time or off-line) and in research settings. Prospective studies are needed to assess the utility of this approach.
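
    A rough sketch of the chromatic idea, under assumptions not specified in the abstract (histogram-based density estimate, deviation-from-mode coloring, synthetic data), might look like this:

```python
# Colorize later samples of a toy heart-rate series by how far they sit from the
# mode of a "past" baseline segment's estimated probability density.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
hr = 140 + np.cumsum(rng.normal(0, 0.3, 2000))       # toy heart-rate-like series (bpm)

baseline = hr[:500]                                  # the "remembered" past segment
counts, edges = np.histogram(baseline, bins=50, density=True)
mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])

deviation = np.abs(hr[500:] - mode)                  # distance from the baseline mode
plt.scatter(np.arange(500, 2000), hr[500:], c=deviation, cmap="plasma", s=4)
plt.axhline(mode, color="gray", lw=1)
plt.xlabel("sample"); plt.ylabel("heart rate (bpm)")
plt.show()
```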

  8. Remembrance of time series past: simple chromatic method for visualizing trends in biomedical signals

    PubMed Central

    Burykin, Anton; Mariani, Sara; Henriques, Teresa; Silva, Tiago F.; Schnettler, William T.; Costa, Madalena D.; Goldberger, Ary L.

    2015-01-01

    Background Analysis of biomedical time series plays an essential role in clinical management and basic investigation. However, the monochromatic displays in conventional monitors show only short segments of the current time series, not referenced to past dynamics. We describe a chromatic approach to bring the “memory” of the physiologic system's past behavior into the current display window. Methods The method employs the estimated probability density function of a time series segment to colorize subsequent data points. Results For illustrative purposes, we selected open-access recordings of continuous: 1) fetal heart rate during the pre-partum period, and 2) heart rate and systemic blood pressure from a critical care patient during a spontaneous breathing trial. The colorized outputs highlight changes from the “baseline” reference state, defined as the mode value assumed by the signal, i.e., the maximum of its probability density function. Conclusions A colorization method may facilitate the recognition of relevant features of time series, especially shifts in baseline dynamics and other trends (including transient and longer-term deviation from baseline values) which may not be as readily noticed using traditional displays. This method may be applicable in clinical monitoring (real-time or off-line) and in research settings. Prospective studies are needed to assess the utility of this approach. PMID:26012777

  9. Training probabilistic VLSI models on-chip to recognise biomedical signals under hardware nonidealities.

    PubMed

    Jiang, P C; Chen, H

    2006-01-01

    VLSI implementation of probabilistic models is attractive for many biomedical applications. However, hardware non-idealities can prevent probabilistic VLSI models from modelling data optimally through on-chip learning. This paper investigates the maximum computational errors that a probabilistic VLSI model can tolerate when modelling real biomedical data. VLSI circuits capable of achieving the required precision are also proposed.

  10. Superconductive signal-processing circuits

    NASA Astrophysics Data System (ADS)

    Vanduzer, Theodore

    1994-08-01

    This work addresses new signal processing circuits using the special features of superconductivity. A novel flash-type analog-to-digital converter based on a comparator invented in the preceding contract period was demonstrated. The comparator was shown to be useful as a logic gate and an encoder was designed with it. A high-resolution delta-sigma analog-to-digital converter was devised with superconductive components in spite of the lack of an analog integrator in this technology. Positive theoretical results are being followed up experimentally. A simple flux-shuttle single-flux-quantum shift register was devised and several different readout schemes were studied. A six-bit-long version was successfully tested at 1 GHz. A decoder that takes in a five-bit word to select one of 32 output lines was completed. The design involved very tight limitations on current and power. The decoder was combined with a serial-to-parallel converter and operated at 2 GHz. To support the study of appropriate architectures for the various types of superconductive (Josephson) digital technology, an inductance-extraction program was developed.

  11. Antimicrobial thermoplastic materials for biomedical applications prepared by melt processing

    NASA Astrophysics Data System (ADS)

    Botta, L.; Scaffaro, R.; Ceraulo, M.; Gallo, G.

    2014-05-01

    In this work, thermoplastic polymers with antimicrobial properties were prepared by incorporating an antibiotic, i.e., ciprofloxacin (CFX), by melt processing. Two different polymers were used as matrices, i.e., polypropylene (PP) and poly(lactic acid) (PLA), and different concentrations of CFX were incorporated. The antimicrobial properties, the release kinetics and the mechanical performance of the prepared materials were evaluated.

  12. A Software Package For Biomedical Image Processing And Analysis

    NASA Astrophysics Data System (ADS)

    Goncalves, Joao G. M.; Mealha, Oscar

    1988-06-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than one and a half years by users with different applications. It proved to be an efficient tool for helping users adapt to the system, and for standardizing and exchanging software, while preserving the flexibility to accommodate users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  13. Evolutionary partial differential equations for biomedical image processing.

    PubMed

    Sarti, Alessandro; Mikula, Karol; Sgallari, Fiorella; Lamberti, Claudio

    2002-04-01

    We present a model for processing space-time image sequences and apply it to 3D echocardiography. The nonlinear evolutionary equations filter the sequence while preserving space-time coherent structures. They have been developed using ideas of regularized Perona-Malik anisotropic diffusion and geometrical diffusion of mean curvature flow type (Malladi-Sethian), combined with the Galilean-invariant movie multi-scale analysis of Alvarez et al. A discretization of the space-time filtering equations by means of the finite volume method is discussed in detail. Computational results in the processing of 3D echocardiographic sequences obtained by a rotational acquisition technique and by a real-time 3D echo volumetrics acquisition technique are presented. Quantitative error estimation is also provided.
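
    For orientation, a plain NumPy sketch of the (regularized) Perona-Malik anisotropic diffusion that such models build on is shown below; the edge-stopping function, parameters and periodic boundary handling are illustrative simplifications, not the paper's space-time finite-volume scheme.

```python
# 2D Perona-Malik anisotropic diffusion: smooth inside regions, preserve edges.
import numpy as np

def perona_malik(image, n_iter=20, kappa=30.0, dt=0.15):
    u = image.astype(float).copy()
    for _ in range(n_iter):
        # finite-difference gradients toward the four neighbors
        # (np.roll gives periodic boundaries, kept for brevity)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping conductance g(|grad u|) = exp(-(|grad u|/kappa)^2)
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

noisy = np.random.rand(128, 128)
print(noisy.std(), perona_malik(noisy).std())    # variance drops after diffusion
```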

  14. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time-continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f_s). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) development of the algorithms required to demodulate the BPSK signal; (2) simulation of the system; and (3) implementation of a BPSK receiver using digital signal processing hardware.
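
    The steps described above can be prototyped in a few lines of NumPy before moving to DSP hardware; the sketch below simulates BPSK generation, sampling and a coherent integrate-and-dump demodulator with illustrative rates and noise level (an assumption, not the project's actual implementation).

```python
# Toy BPSK link: modulate, add noise, then demodulate by coherent mixing
# followed by integrate-and-dump bit decisions.
import numpy as np

fs, fc, bit_rate = 8000, 1000, 250            # sample, carrier, bit rates (Hz)
spb = fs // bit_rate                          # samples per bit
rng = np.random.default_rng(0)

bits = rng.integers(0, 2, 64)
symbols = np.repeat(2 * bits - 1, spb)        # 0 -> -1, 1 -> +1, held for one bit
t = np.arange(symbols.size) / fs
tx = symbols * np.cos(2 * np.pi * fc * t)     # 0/180-degree phase keying
rx = tx + 0.3 * rng.standard_normal(tx.size)  # additive channel noise

mixed = rx * np.cos(2 * np.pi * fc * t)       # coherent mixing to baseband
decisions = mixed.reshape(-1, spb).sum(axis=1) > 0   # integrate and dump
print("bit errors:", np.count_nonzero(decisions.astype(int) != bits))
```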

  15. Korteweg-de Vries-Kuramoto-Sivashinsky filters in biomedical image processing

    NASA Astrophysics Data System (ADS)

    Arango, Juan C.

    2015-09-01

    The Kuramoto-Sivashinsky operator is applied to the two-dimensional solution of the Korteweg-de Vries equation, and the resulting filter is applied via convolution to image processing. The full procedure is implemented as an algorithm prototyped with the Maple package Image Tools. Some experiments were performed using biomedical images, infrared images obtained with smartphones, and images generated by photon diffusion. The results from these experiments show that the Kuramoto-Sivashinsky-Korteweg-de Vries filters are excellent tools for the enhancement of images, with interesting applications in image processing in general and biomedical image processing in particular. It is expected that the incorporation of the Kuramoto-Sivashinsky-Korteweg-de Vries filters into standard programs for image processing will lead to important improvements in various fields of optical engineering.

  16. A Processable Shape Memory Polymer System for Biomedical Applications.

    PubMed

    Hearon, Keith; Wierzbicki, Mark A; Nash, Landon D; Landsman, Todd L; Laramy, Christine; Lonnecker, Alexander T; Gibbons, Michael C; Ur, Sarah; Cardinal, Kristen O; Wilson, Thomas S; Wooley, Karen L; Maitland, Duncan J

    2015-06-24

    Polyurethane shape memory polymers (SMPs) with tunable thermomechanical properties and advanced processing capabilities are synthesized, characterized, and implemented in the design of a microactuator medical device prototype. The ability to manipulate glass transition temperature (Tg) and crosslink density in low-molecular weight aliphatic thermoplastic polyurethane SMPs is demonstrated using a synthetic approach that employs UV-catalyzed thiol-ene "click" reactions to achieve postpolymerization crosslinking. Polyurethanes containing varying C=C functionalization are synthesized, solution blended with polythiol crosslinking agents and photoinitiator and subjected to UV irradiation, and the effects of a number of synthetic parameters on crosslink density are reported. Thermomechanical properties are highly tunable, including glass transitions tailorable between 30 and 105 °C and rubbery moduli tailorable between 0.4 and 20 MPa. This new SMP system exhibits high toughness for many formulations, especially in the case of low crosslink density materials, for which toughness exceeds 90 MJ m⁻³ at select straining temperatures. To demonstrate the advanced processing capability and synthetic versatility of this new SMP system, a laser-actuated SMP microgripper device for minimally invasive delivery of endovascular devices is fabricated, shown to exhibit an average gripping force of 1.43 ± 0.37 N, and successfully deployed in an in vitro experimental setup under simulated physiological conditions. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.

  18. Hybrid photonic signal processing for radio frequency signals

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.

    2005-09-01

    Photonics has previously been used in the all-analog and all-digital domains for the processing of radio frequency (RF) signals. This paper highlights recent work by the Riza group on a new hybrid analog-digital approach to RF signal processing and controls. Specifically, novel work is described in the design of RF processing components such as fiber-optic attenuators, fiber-optic programmable delay lines, and optical transversal filters.

  19. Enhancement Of Optical Registration Signals Through Digital Signal Processing Techniques

    NASA Astrophysics Data System (ADS)

    Cote, Daniel R.; Lazo-Wasem, Jeanne

    1988-01-01

    Alignment and setup of lithography processes has largely been conducted on special test wafers. Actual product-level optimization has been limited to manual techniques such as optical verniers. This is especially time consuming and prone to inconsistencies when the registration characteristics of lithographic systems are being measured. One key factor obstructing the use of automated metrology equipment on product-level wafers is the inability to reliably discern metrology features from the background noise and variations in optical registration signals. This is often the case for metal levels such as aluminum and tungsten. This paper discusses methods for the enhancement of typical registration signals obtained from difficult semiconductor process levels. Brightfield and darkfield registration signals are obtained using a microscope and a 1024-element linear photodiode array. These signals are then digitized and stored on the hard disk of a computer. The techniques utilized include amplitude-selective filtering as well as adaptive and non-adaptive frequency-domain filtering. The effect of each of these techniques upon calculated registration values is analyzed by determining the positional variation of the center location of a two-line registration feature. Plots of raw and processed signals obtained are presented, as are plots of the power spectral density of ideal metrology feature signal and noise patterns. It is concluded that the proper application of digital signal processing (DSP) techniques to problematic optical registration signals greatly enhances the applicability of automated optical registration measurement techniques to difficult semiconductor process levels.

  20. Adaptive Signal Processing Testbed signal excision software: User's manual

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1992-05-01

    The Adaptive Signal Processing Testbed (ASPT) signal excision software is a set of programs that provide real-time processing functions for the excision of interfering tones from a live spread-spectrum signal as well as off-line functions for the analysis of the effectiveness of the excision technique. The processing functions provided by the ASPT signal excision software are real-time adaptive filtering of live data, storage to disk, and file sorting and conversion. The main off-line analysis function is bit error determination. The purpose of the software is to measure the effectiveness of an adaptive filtering algorithm to suppress interfering or jamming signals in a spread spectrum signal environment. A user manual for the software is provided, containing information on the different software components available to perform signal excision experiments: the real-time excision software, excision host program, file processing utilities, and despreading and bit error rate determination software. In addition, information is presented describing the excision algorithm implemented, the real-time processing framework, the steps required to add algorithms to the system, the processing functions used in despreading, and description of command sequences for post-run analysis of the data.

  1. Process Dissociation and Mixture Signal Detection Theory

    ERIC Educational Resources Information Center

    DeCarlo, Lawrence T.

    2008-01-01

    The process dissociation procedure was developed in an attempt to separate different processes involved in memory tasks. The procedure naturally lends itself to a formulation within a class of mixture signal detection models. The dual process model is shown to be a special case. The mixture signal detection model is applied to data from a widely…

  2. Large-scale combining signals from both biomedical literature and the FDA Adverse Event Reporting System (FAERS) to improve post-marketing drug safety signal detection

    PubMed Central

    2014-01-01

    Background Independent data sources can be used to augment post-marketing drug safety signal detection. The vast amount of publicly available biomedical literature contains rich side effect information for drugs at all clinical stages. In this study, we present a large-scale signal boosting approach that combines over 4 million records in the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) and over 21 million biomedical articles. Results The datasets are comprised of 4,285,097 records from FAERS and 21,354,075 MEDLINE articles. We first extracted all drug-side effect (SE) pairs from FAERS. Our study implemented a total of seven signal ranking algorithms. We then compared these different ranking algorithms before and after they were boosted with signals from MEDLINE sentences or abstracts. Finally, we manually curated all drug-cardiovascular (CV) pairs that appeared in both data sources and investigated whether our approach can detect many true signals that have not been included in FDA drug labels. We extracted a total of 2,787,797 drug-SE pairs from FAERS with a low initial precision of 0.025. The ranking algorithm combined signals from both FAERS and MEDLINE, significantly improving the precision from 0.025 to 0.371 for top-ranked pairs, representing a 13.8-fold elevation in precision. We showed by manual curation that drug-SE pairs that appeared in both data sources were highly enriched with true signals, many of which have not yet been included in FDA drug labels. Conclusions We have developed an efficient and effective drug safety signal ranking and strengthening approach. We demonstrate that combining information from FAERS and the biomedical literature at large scale can significantly contribute to drug safety surveillance. PMID:24428898

  3. Large-scale combining signals from both biomedical literature and the FDA Adverse Event Reporting System (FAERS) to improve post-marketing drug safety signal detection.

    PubMed

    Xu, Rong; Wang, QuanQiu

    2014-01-15

    Independent data sources can be used to augment post-marketing drug safety signal detection. The vast amount of publicly available biomedical literature contains rich side effect information for drugs at all clinical stages. In this study, we present a large-scale signal boosting approach that combines over 4 million records in the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) and over 21 million biomedical articles. The datasets are comprised of 4,285,097 records from FAERS and 21,354,075 MEDLINE articles. We first extracted all drug-side effect (SE) pairs from FAERS. Our study implemented a total of seven signal ranking algorithms. We then compared these different ranking algorithms before and after they were boosted with signals from MEDLINE sentences or abstracts. Finally, we manually curated all drug-cardiovascular (CV) pairs that appeared in both data sources and investigated whether our approach can detect many true signals that have not been included in FDA drug labels. We extracted a total of 2,787,797 drug-SE pairs from FAERS with a low initial precision of 0.025. The ranking algorithm combined signals from both FAERS and MEDLINE, significantly improving the precision from 0.025 to 0.371 for top-ranked pairs, representing a 13.8-fold elevation in precision. We showed by manual curation that drug-SE pairs that appeared in both data sources were highly enriched with true signals, many of which have not yet been included in FDA drug labels. We have developed an efficient and effective drug safety signal ranking and strengthening approach. We demonstrate that combining information from FAERS and the biomedical literature at large scale can significantly contribute to drug safety surveillance.

  4. Signals Intelligence - Processing - Analysis - Classification

    DTIC Science & Technology

    2009-10-01

    Example: Language identification from audio signals. In a certain mission, a set of languages seems important beforehand. These languages will – with a...tasks to be performed. • OCR: determine the text parts in an image – language dependent approach, quality depends on the language. • Steganography

  5. Morphological image processing for quantitative shape analysis of biomedical structures: effective contrast enhancement.

    PubMed

    Kimori, Yoshitaka

    2013-11-01

    Image processing methods significantly contribute to visualization of images captured by biomedical modalities (such as mammography, X-ray computed tomography, magnetic resonance imaging, and light and electron microscopy). Quantitative interpretation of the deluge of complicated biomedical images, however, poses many research challenges, one of which is to enhance structural features that are scarcely perceptible to the human eye. This study introduces a contrast enhancement approach based on a new type of mathematical morphology called rotational morphological processing. The proposed method is applied to medical images for the enhancement of structural features. The effectiveness of the method is evaluated quantitatively by the contrast improvement ratio (CIR). The CIR of the proposed method is 12.1, versus 4.7 and 0.1 for two conventional contrast enhancement methods, clearly indicating the high contrasting capability of the method.
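
    The paper's rotational morphological processing is not reproduced here, but for comparison the sketch below shows a standard morphological contrast-enhancement baseline (adding the white top-hat and subtracting the black top-hat) with scikit-image; the test image and structuring-element size are illustrative.

```python
# Standard top-hat contrast enhancement baseline (not the rotational method).
import numpy as np
from skimage import data, img_as_float
from skimage.morphology import disk, white_tophat, black_tophat

image = img_as_float(data.camera())              # stand-in grayscale image
selem = disk(5)                                  # structuring element size is illustrative

enhanced = np.clip(image + white_tophat(image, selem)
                   - black_tophat(image, selem), 0, 1)
print(image.std(), enhanced.std())               # simple contrast proxy before/after
```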

  6. An efficient sampling algorithm for uncertain abnormal data detection in biomedical image processing and disease prediction.

    PubMed

    Liu, Fei; Zhang, Xi; Jia, Yan

    2015-01-01

    In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space. Each dimension is a feature that can be used for disease diagnosis. We introduce a new concept of the top (k1,k2) outlier. It can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space. Some improvement techniques are used for acceleration. Experiments show our methods' high accuracy and high efficiency.

  7. Improved television signal processing system

    NASA Technical Reports Server (NTRS)

    Wong, R. Y.

    1967-01-01

    Digital system processes spacecraft television pictures by converting images sensed on a photostorage vidicon to pulses which can be transmitted by telemetry. This system can be applied in the processing of medical X ray photographs and in electron microscopy.

  8. Signal Processing on Finite Groups

    DTIC Science & Technology

    1990-02-27

    signal into another. Operation theory has been intensively developed in the last four decades, as has the more exotic theory of operator algebras ...algorithm for their evaluation whenever there is one for the associated group transform. If 4D(G) denotes the N-dimensional operator algebra of all group...Princeton. NJ: Princeton University Press (1955). 8. D. Gorenstein, "The enormous theorem," Sci. Am. 253, 104 (1985). 9. J.J. Rotman , The Theory of Groups

  9. Signal Processing Fault Detection System

    DTIC Science & Technology

    2007-07-13

    of strain sensor signals is wavelet analysis which is a linear mathematical analysis technique that can analyze discontinuities and edge effects...Real wavelets are suitable for identifying discontinuities and data compression. Analytic wavelets are suitable for capturing frequency content within a...function (i.e. the time series data captured from the sensors) and ψ*_{a,u} is identified as the complex conjugate of the mother wavelet. The variable t

  10. Bistatic SAR: Signal Processing and Image Formation.

    SciTech Connect

    Wahl, Daniel E.; Yocky, David A.

    2014-10-01

    This report describes the significant processing steps that were used to take the raw recorded digitized signals from the bistatic synthetic aperture RADAR (SAR) hardware built for the NCNS Bistatic SAR project to a final bistatic SAR image. In general, the processing steps herein are applicable to bistatic SAR signals that include the direct-path signal and the reflected signal. The steps include preprocessing, data extraction to form a phase history, and finally, image formation. Various plots and values are shown at most steps to illustrate the processing for a bistatic COSMO SkyMed collection gathered on June 10, 2013 at Kirtland Air Force Base, New Mexico.

  11. Signal processing in ultrasound. [for diagnostic medicine

    NASA Technical Reports Server (NTRS)

    Le Croissette, D. H.; Gammell, P. M.

    1978-01-01

    Signal is the term used to denote the characteristic in the time or frequency domain of the probing energy of the system. Processing of this signal in diagnostic ultrasound occurs as the signal travels through the ultrasonic and electrical sections of the apparatus. The paper discusses current signal processing methods, postreception processing, display devices, real-time imaging, and quantitative measurements in noninvasive cardiology. The possibility of using deconvolution in a single transducer system is examined, and some future developments using digital techniques are outlined.

  12. Synthesis, Analysis, and Processing of Fractal Signals

    DTIC Science & Technology

    1991-10-01

    fractal dimension of the underlying signal , when defined. Robust estimation of the fractal dimension of 1/f processes is important in a number of...modeling errors. The resulting parameter estimation algorithms, which compute both fractal dimension parameters and the accompanying signal and noise...Synthesis, Analysis, and Processing of Fractal Signals RLE Technical Report No. 566 Gregory W. Wornell October 1991 Research Laboratory of

  13. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2013-09-30

    Advanced Digital Signal Processing for Hybrid Lidar William D. Jemison Clarkson University Technical Objectives: The technical...objective of this project is the development and evaluation of various digital signal processing (DSP) algorithms that will enhance hybrid lidar ...algorithm as shown in Figure 1. (Figure 1: lidar DSP algorithm block diagram showing the hardware platform for algorithm implementation and the underwater channel characteristics.)

  14. Noise-processing by signaling networks.

    PubMed

    Kontogeorgaki, Styliani; Sánchez-García, Rubén J; Ewing, Rob M; Zygalakis, Konstantinos C; MacArthur, Ben D

    2017-04-03

    Signaling networks mediate environmental information to the cell nucleus. To perform this task effectively they must be able to integrate multiple stimuli and distinguish persistent signals from transient environmental fluctuations. However, the ways in which signaling networks process environmental noise are not well understood. Here we outline a mathematical framework that relates a network's structure to its capacity to process noise, and use this framework to dissect the noise-processing ability of signaling networks. We find that complex networks that are dense in directed paths are poor noise processors, while those that are sparse and strongly directional process noise well. These results suggest that while cross-talk between signaling pathways may increase the ability of signaling networks to integrate multiple stimuli, too much cross-talk may compromise the ability of the network to distinguish signal from noise. To illustrate these general results we consider the structure of the signaling network that maintains pluripotency in mouse embryonic stem cells, and find that an incoherent feedforward loop structure involving Stat3, Tfcp2l1, Esrrb, Klf2 and Klf4 is particularly important for noise-processing. Taken together, these results suggest that noise-processing is an important function of signaling networks and that they may be structured in part to optimize this task.

  15. Fundamental and applied studies in nanoparticle biomedical imaging, stabilization, and processing

    NASA Astrophysics Data System (ADS)

    Pansare, Vikram J.

    Nanoparticle carrier systems are gaining importance in the rapidly expanding field of biomedical whole animal imaging where they provide long circulating, real time imaging capability. This thesis presents a new paradigm in imaging whereby long wavelength fluorescent or photoacoustically active contrast agents are embedded in the hydrophobic core of nanocarriers formed by Flash NanoPrecipitation. The long wavelength allows for improved optical penetration depth. Compared to traditional contrast agents where fluorophores are placed on the surface, this allows for improved signal, increased stability, and molecular targeting capabilities. Several types of long wavelength hydrophobic dyes based on acene, cyanine, and bacteriochlorin scaffolds are utilized and animal results obtained for nanocarrier systems used in both fluorescent and photoacoustic imaging modes. Photoacoustic imaging is particularly promising due to its high resolution, excellent penetration depth, and ability to provide real-time functional information. Fundamental studies in nanoparticle stabilization are also presented for two systems: model alumina nanoparticles and charge stabilized polystyrene nanoparticles. Motivated by the need for stable suspensions of alumina-based nanocrystals for security printing applications, results are presented for the adsorption of various small molecule charged hydrophobes onto the surface of alumina nanoparticles. Results are also presented for the production of charge stabilized polystyrene nanoparticles via Flash NanoPrecipitation, allowing for the independent control of polymer molecular weight and nanoparticle size, which is not possible by traditional emulsion polymerization routes. Lastly, methods for processing nanoparticle systems are explored. The increasing use of nanoparticle therapeutics in the pharmaceutical industry has necessitated the development of scalable, industrially relevant processing methods. Ultrafiltration is particularly well suited for

  16. RSFQ Baseband Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Herr, Anna Yurievna

    The ultrafast switching speed of superconducting digital circuits enables the realization of digital signal processors with performance unattainable by any other technology. Based on rapid single flux quantum (RSFQ) logic, these integrated circuits are capable of delivering high computation capacity, up to 30 GOPS on a single processor, and very short latency of 0.1 ns. There are two main applications of such hardware in practical telecommunication systems: filters for superconducting ADCs operating on digital RF data, and recursive filters at baseband. The latter of these allows functions such as multiuser detection for 3G WCDMA, equalization and channel precoding for 4G OFDM MIMO, and general blind detection. The performance gain is an increase in cell capacity, quality of service, and transmitted data rate. The current status of the development of the RSFQ baseband DSP is discussed. Major components with an operating speed of 30 GHz have been developed. Designs, test results, and future development of the complete systems, including cryopackaging and the CMOS interface, are reviewed.

  17. Optical Profilometers Using Adaptive Signal Processing

    NASA Technical Reports Server (NTRS)

    Hall, Gregory A.; Youngquist, Robert; Mikhael, Wasfy

    2006-01-01

    A method of adaptive signal processing has been proposed as the basis of a new generation of interferometric optical profilometers for measuring surfaces. The proposed profilometers would be portable, hand-held units. Sizes could be thus reduced because the adaptive-signal-processing method would make it possible to substitute lower-power coherent light sources (e.g., laser diodes) for white light sources and would eliminate the need for most of the optical components of current white-light profilometers. The adaptive-signal-processing method would make it possible to attain scanning ranges of the order of decimeters in the proposed profilometers.

  18. Nonlinear and Nonstationary Signal Processing

    NASA Astrophysics Data System (ADS)

    Wunsch, Carl

    Stationary linear systems driven by Gaussian processes are the basic representations of time series used in the Earth sciences. A large body of literature has developed around these misleadingly simple models, which straddle statistics, optimization, control, probability theory, and related fields. That fundamental errors of inference are still made in the refereed literature is perhaps a testimony to the subtleties and confusion that arise when statistics meets the real geophysical world. A major journal devoted to modern climate studies recently felt compelled to publish a tutorial explaining the importance of avoiding aliasing errors when sampling meteorological variables; this subject was clearly understood 100 years ago.

  19. Digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.

    1980-01-01

    A microprocessor based digital signal processing unit has been proposed to replace analog sections of a microwave radiometer. A brief introduction to the radiometer system involved and a description of problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.

  20. Topics in digital signal processing

    NASA Astrophysics Data System (ADS)

    Narayan, S. S. R.

    Three discrete Fourier transform (DFT) algorithms, namely the fast Fourier transform (FFT), the prime factor algorithm (PFA) and the Winograd Fourier transform algorithm (WFTA), are analyzed and compared. A new set of short-length DFT algorithms well suited for special-purpose hardware implementations, employing monolithic multiplier-accumulators and microprocessors, is presented. Architectural considerations in designing DFT processors based on these algorithms are discussed. Efficient hardware structures for implementing the FFT and the PFA are presented. A digital implementation for performing linear-FM (LFM) pulse compression by using bandpass filter banks is presented. The concept of transform-domain adaptive filtering is introduced. The DFT and discrete cosine transform (DCT) domain adaptive filtering algorithms are considered. Applications of these in the areas of speech processing and adaptive line enhancers are discussed. A simple waveform coding algorithm capable of providing good-quality speech at about 1.5 bits per sample is presented.

  1. SignalPlant: an open signal processing software platform.

    PubMed

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g., filter, analyze, etc.) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proves significantly faster when displaying an image from a large number of samples (e.g., 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.

  2. Ideal filtering approach on DCT domain for biomedical signals: index blocked DCT filtering method (IB-DCTFM).

    PubMed

    Shin, Hang Sik; Lee, Chungkeun; Lee, Myoungho

    2010-08-01

    We propose the Index-Blocked Discrete Cosine Transform Filtering Method (IB-DCTFM) to design an ideal frequency-range filter in the DCT domain for biomedical signals, which are frequently exposed to specific-frequency noise such as motion artifacts and 50/60 Hz powerline interference. IB-DCTFM removes the unwanted frequency range from the time-domain signal by blocking specific DCT indices in the DCT domain. In the simulation, electrocardiography, electromyography and photoplethysmography signals are used as signal sources, and FIR, IIR and adaptive filters are used for comparison with the proposed IB-DCTFM. To evaluate filter performance, we calculated the signal-to-noise ratio and the correlation coefficient with respect to the clean signal for each signal and filtering method. As a result of the filter simulation, the average signal-to-noise ratio and correlation coefficient of IB-DCTFM are improved to about 75.8 dB/0.477, while the FIR, IIR and adaptive filtering results are 24.8 dB/0.130, 54.3 dB/0.440 and 29.5 dB/0.200, respectively.
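
    A rough sketch of the index-blocking idea, with an illustrative index-to-frequency mapping and band edges rather than the authors' exact IB-DCTFM procedure, could look like this using SciPy's DCT routines:

```python
# Zero the DCT indices corresponding to a 50 Hz interference band, then invert.
import numpy as np
from scipy.fft import dct, idct

fs, n = 500, 2000
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * 1.2 * t)                   # toy low-frequency "biosignal"
noisy = signal + 0.5 * np.sin(2 * np.pi * 50 * t)      # 50 Hz power-line interference

coeffs = dct(noisy, norm="ortho")
freqs = np.arange(n) * fs / (2 * n)                    # approximate DCT-II bin frequencies
coeffs[(freqs > 48) & (freqs < 52)] = 0.0              # block the offending indices
cleaned = idct(coeffs, norm="ortho")

print("residual RMS error:", np.sqrt(np.mean((cleaned - signal) ** 2)))
```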

  3. Optical Fiber Delay Line Signal Processing.

    NASA Astrophysics Data System (ADS)

    Newton, Steven Arthur

    The delay line transversal filter is a basic component in analog signal processing systems. Unfortunately, conventional delay line devices, such as those that use surface acoustic waves, are largely limited to operation at frequencies of several hundred megahertz and below. In this work, single-mode optical fiber has been used as a delay medium to make transversal filters that extend this kind of signal processing to frequencies of one gigahertz and above. Single-mode optical fiber is an excellent delay medium because it exhibits extremely low loss and dispersion. By efficiently collecting, weighting, and combining signals extracted from a fiber delay line, single-mode fiber can be used, not only to transmit broadband signals, but to process them as well. The goals of the work have been to study efficient tapping mechanisms, and to construct fiber transversal filters capable of performing some basic signal processing functions. Several different tapped and recirculating delay line prototypes have been fabricated using a variety of tapping techniques, including macrobending and evanescent field coupling. These devices have been used to demonstrate basic signal processing functions, such as code generation, convolution, correlation, and frequency filtering, at frequencies that exceed those possible using conventional delay line technologies. Fiber recirculating delay line loops have also been demonstrated as transient memories for the temporary storage of signals and as a means of time division multiplexing via data rate transformation. These devices are the building blocks that are necessary to make systems capable of performing complex signal processing functions. With the recent development of high speed optical sources and detectors to interface with fiber systems of this kind, the real time processing of signals having bandwidths of tens of gigahertz is envisioned.
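
    In digital form, the tapped delay line transversal filter described above is simply an FIR filter: delayed copies of the input are weighted by the taps and summed, as in the small NumPy sketch below (tap values are illustrative).

```python
# Tapped-delay-line (transversal/FIR) filter: weighted sum of delayed inputs.
import numpy as np

taps = np.array([0.2, 0.5, 0.2, -0.1])           # tap weights along the delay line
x = np.random.randn(32)                          # input signal samples

# Each output sample is a weighted sum of the current and previous inputs.
y = np.convolve(x, taps, mode="full")[:len(x)]
print(y[:5])
```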

  4. Signal feature recognition based on lightwave neuromorphic signal processing.

    PubMed

    Fok, Mable P; Deming, Hannah; Nahmias, Mitchell; Rafidi, Nicole; Rosenbluth, David; Tait, Alexander; Tian, Yue; Prucnal, Paul R

    2011-01-01

    We developed a hybrid analog/digital lightwave neuromorphic processing device that effectively performs signal feature recognition. The approach, which mimics the neurons in a crayfish responsible for the escape response mechanism, provides a fast and accurate reaction to its inputs. The analog processing portion of the device uses the integration characteristic of an electro-absorption modulator, while the digital processing portion employs optical thresholding in a highly Ge-doped nonlinear loop mirror. The device can be configured to respond to different sets of input patterns by simply varying the weights and delays of the inputs. We experimentally demonstrated the use of the proposed lightwave neuromorphic signal processing device for recognizing specific input patterns.

  5. Optical signal processing of phased array radar

    NASA Astrophysics Data System (ADS)

    Weverka, Robert T.

    This thesis develops optical processors that scale to very high processing speed. Optical signal processing is often promoted on the basis of smaller size, lower weight and lower power consumption as well as higher signal processing speed. While each of these requirements has applications, it is the ones that require processing speed beyond that available in electronics that are most compelling. Thirty years ago, optical processing was the only method fast enough to process Synthetic Aperture Radar (SAR), one of the more demanding signal processing tasks at the time. Since then, electronic processing speed has improved sufficiently to tackle that problem. We have sought out the problems that require significantly higher processing speed and developed optical processors that tackle these more difficult problems. The components that contribute to high signal processing speed are high input signal bandwidth, a large number of parallel input channels each with this high bandwidth, and a large number of parallel operations required on each input channel. Adaptive signal processing for phased array radar has all of these factors. The processors developed for this task scale well in three dimensions, which allows them to maximize parallelism for high speed. This thesis explores an example of a negative feedback adaptive phased array processor and an example of a positive feedback phased array processor. The negative feedback processor uses an array of inputs in up to two dimensions together with the time history of the signal in the third dimension to adapt the array pattern to null out incoming jammer signals. The positive feedback processor uses the incoming signals and assumptions about the radar scene to correct for position errors in a phased array. Discovery and analysis of these new processors are facilitated by an original volume holographic analysis technique developed in the thesis. The thesis includes a new acoustooptic Bragg cell geometry developed with
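
    A conventional digital analogue of the negative-feedback nulling idea is a reference-driven LMS adaptive beamformer; the NumPy sketch below is that standard algorithm under illustrative assumptions (uniform linear array, toy signals), not the optical processor developed in the thesis.

```python
# Illustrative LMS adaptive beamformer: complex weights adapt so the array output
# tracks a reference signal from broadside and, in doing so, nulls the jammer.
import numpy as np

rng = np.random.default_rng(0)
n_elem, n_snap, d = 8, 4000, 0.5                 # elements, snapshots, spacing (wavelengths)

def steering(theta_deg):
    theta = np.deg2rad(theta_deg)
    return np.exp(2j * np.pi * d * np.arange(n_elem) * np.sin(theta))

desired = np.exp(2j * np.pi * 0.01 * np.arange(n_snap))        # wanted signal from 0 deg
jammer = 3.0 * np.exp(2j * np.pi * 0.13 * np.arange(n_snap))   # strong interferer from 40 deg
x = (np.outer(steering(0), desired) + np.outer(steering(40), jammer)
     + 0.1 * (rng.standard_normal((n_elem, n_snap))
              + 1j * rng.standard_normal((n_elem, n_snap))))

w = steering(0) / n_elem                         # start from the conventional beamformer
mu = 1e-4                                        # LMS step size
for k in range(n_snap):
    y = np.vdot(w, x[:, k])                      # array output  y = w^H x
    err = desired[k] - y
    w = w + mu * np.conj(err) * x[:, k]          # complex LMS weight update

gain = lambda th: abs(np.vdot(w, steering(th)))
print("gain toward 0 deg:", round(gain(0), 3),
      "| gain toward 40 deg (jammer):", round(gain(40), 3))
```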

  6. Comparison between Hilbert-Huang transform and scalogram methods on non-stationary biomedical signals: application to laser Doppler flowmetry recordings.

    PubMed

    Roulier, Rémy; Humeau, Anne; Flatley, Thomas P; Abraham, Pierre

    2005-11-07

    A significant transient increase in laser Doppler flowmetry (LDF) signals is observed in response to a local and progressive cutaneous pressure application in healthy subjects. This reflex may be impaired in diabetic patients. This work presents a comparison between two signal processing methods that help clarify this phenomenon. Analyses by the scalogram and the Hilbert-Huang transform (HHT) of LDF signals recorded at rest and during a local and progressive cutaneous pressure application are performed on healthy and type 1 diabetic subjects. Three frequency bands, corresponding to myogenic, neurogenic and endothelial-related metabolic activities, are studied at different time intervals in order to take into account the dynamics of the phenomenon. The results show that both the scalogram and the HHT methods lead to the same conclusions concerning the comparisons of the myogenic, neurogenic and endothelial-related metabolic activities, during the progressive pressure and at rest, in healthy and diabetic subjects. However, the HHT shows more details that may be obscured by the scalogram. Indeed, the scalogram's lack of local adaptivity can remove some definition from the data. These results may improve knowledge of the above-mentioned reflex as well as of non-stationary biomedical signal processing methods.
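
    For the scalogram side of such a comparison, a continuous wavelet transform can be computed with PyWavelets (assumed available); the sketch below uses a synthetic slow-oscillation signal and illustrative scale and band choices rather than the LDF protocol of the study.

```python
# Illustrative scalogram via continuous wavelet transform with PyWavelets.
import numpy as np
import pywt

fs = 20.0                                        # Hz; LDF-like low sampling rate (assumed)
t = np.arange(0, 120, 1 / fs)
# toy signal with a ~0.1 Hz "myogenic-like" and a slower ~0.03 Hz component
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * np.sin(2 * np.pi * 0.03 * t)

scales = np.arange(8, 600)                       # illustrative scale range
coeffs, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
scalogram = np.abs(coeffs) ** 2                  # energy per (scale, time) cell

band = (freqs > 0.05) & (freqs < 0.15)           # crude "myogenic-like" band
print("mean energy in the 0.05-0.15 Hz band:", scalogram[band].mean())
```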

  7. Electron quantum optics as quantum signal processing

    NASA Astrophysics Data System (ADS)

    Roussel, B.; Cabart, C.; Fève, G.; Thibierge, E.; Degiovanni, P.

    2017-03-01

    The recent developments of electron quantum optics in quantum Hall edge channels have given us new ways to probe the behavior of electrons in quantum conductors. It has brought new quantities called electronic coherences under the spotlight. In this paper, we explore the relations between electron quantum optics and signal processing through a global review of the various methods for accessing single- and two-electron coherences in electron quantum optics. We interpret electron quantum optics interference experiments as analog signal processing converting quantum signals into experimentally observable quantities such as current averages and correlations. This point of view also gives us a procedure to obtain quantum information quantities from electron quantum optics coherences. We illustrate these ideas by discussing two mode entanglement in electron quantum optics. We also sketch how signal processing ideas may open new perspectives for representing electronic coherences in quantum conductors and understand the properties of the underlying many-body electronic state.

  8. Digital signal processor and programming system for parallel signal processing

    SciTech Connect

    Van den Bout, D.E.

    1987-01-01

    This thesis describes an integrated assault upon the problem of designing high-throughput, low-cost digital signal-processing systems. The dual prongs of this assault consist of: (1) the design of a digital signal processor (DSP) which efficiently executes signal-processing algorithms in either a uniprocessor or multiprocessor configuration, (2) the PaLS programming system which accepts an arbitrary algorithm, partitions it across a group of DSPs, synthesizes an optimal communication link topology for the DSPs, and schedules the partitioned algorithm upon the DSPs. The results of applying a new quasi-dynamic analysis technique to a set of high-level signal-processing algorithms were used to determine the uniprocessor features of the DSP design. For multiprocessing applications, the DSP contains an interprocessor communications port (IPC) which supports simple, flexible, dataflow communications while allowing the total communication bandwidth to be incrementally allocated to achieve the best link utilization. The net result is a DSP with a simple architecture that is easy to program for both uniprocessor and multi-processor modes of operation. The PaLS programming system simplifies the task of parallelizing an algorithm for execution upon a multiprocessor built with the DSP.

  9. Interpretation of the approximate entropy using fixed tolerance values as a measure of amplitude variations in biomedical signals.

    PubMed

    Sarlabous, Leonardo; Torres, Abel; Fiz, Jose A; Gea, J; Martinez-Llorens, J M; Morera, J; Jane, Raimon

    2010-01-01

    A new method for the quantification of amplitude variations in biomedical signals through moving approximate entropy is presented. Unlike the usual way of calculating approximate entropy (ApEn), in which the tolerance value (r) varies based on the standard deviation of each moving window, in this work ApEn has been computed using a fixed value of r. We call this method moving approximate entropy with fixed tolerance values, ApEn(f). The obtained results indicate that ApEn(f) allows amplitude variations in biomedical data series to be determined. These amplitude variations are better determined when intermediate values of tolerance are used. The study performed on diaphragmatic mechanomyographic signals shows that the ApEn(f) curve is more correlated with the respiratory effort than the standard RMS amplitude parameter. Furthermore, it has been observed that the ApEn(f) parameter is less affected by the presence of impulsive, sinusoidal, constant and Gaussian noise than the RMS amplitude parameter.
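
    A sketch of moving approximate entropy with a fixed tolerance r (the idea behind ApEn(f)) is shown below; the window length, embedding dimension and r are illustrative, and the synthetic amplitude-step signal stands in for the mechanomyographic data.

```python
# Moving ApEn with a fixed tolerance r, so the result tracks amplitude changes.
import numpy as np

def apen_fixed_r(x, m=2, r=0.25):
    x = np.asarray(x, dtype=float)

    def phi(dim):
        n = len(x) - dim + 1
        vectors = np.array([x[i:i + dim] for i in range(n)])
        d = np.max(np.abs(vectors[:, None, :] - vectors[None, :, :]), axis=2)
        c = (d <= r).sum(axis=1) / n             # templates within r (self-match included)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
signal = np.concatenate([0.5 * rng.standard_normal(600),   # low-amplitude segment
                         1.0 * rng.standard_normal(600)])  # higher-amplitude segment

window, step = 200, 50
moving_apenf = [apen_fixed_r(signal[i:i + window])
                for i in range(0, len(signal) - window + 1, step)]
print(np.round(moving_apenf, 3))   # with fixed r, values rise as the amplitude grows
```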

  10. Synthesis and biomedical applications of functionalized fluorescent and magnetic dual reporter nanoparticles as obtained in the miniemulsion process

    NASA Astrophysics Data System (ADS)

    Holzapfel, Verena; Lorenz, Myriam; Kilian Weiss, Clemens; Schrezenmeier, Hubert; Landfester, Katharina; Mailänder, Volker

    2006-09-01

    As superparamagnetic nanoparticles capture new applications and markets, the flexibility and modification of these nanoparticles become increasingly important. A series of magnetic polystyrene particles encapsulating magnetite nanoparticles (10-12 nm) in a hydrophobic poly(styrene-co-acrylic acid) shell was therefore synthesized by a three-step miniemulsion process. A high amount of iron oxide was incorporated by this process (typically 30-40% (w/w)). As a second reporter, a fluorescent dye was also integrated in order to obtain 'dual reporter particles'. Finally, polymerization of the monomer styrene yielded nanoparticles in the range 45-70 nm. By copolymerizing styrene with the hydrophilic acrylic acid, the amount of carboxyl groups on the surface was varied. Characterization of the latexes included dynamic light scattering, transmission electron microscopy, surface charge and magnetic measurements. For biomedical evaluation, the nanoparticles were incubated with different cell types. The introduction of carboxyl groups on the particle surfaces enabled the uptake of nanoparticles, as demonstrated by detection of the fluorescent signal by fluorescence-activated cell sorting (FACS) and laser scanning microscopy. However, the quantity of intracellular iron required for most biomedical applications (such as detection by magnetic resonance imaging) is significantly higher than can be achieved by the uptake of magnetite-loaded nanoparticles functionalized only with carboxyl groups. A further increase in uptake can be accomplished with transfection agents such as poly-L-lysine or other positively charged polymers. This functionality was also engrafted onto the nanoparticle surface by covalently coupling lysine to the carboxyl groups; the amount of iron delivered in this way was even higher than with nanoparticles to which a transfection agent was merely physically adsorbed. Furthermore, the subcellular localization of these

  11. [Signal Processing Methods in Anesthesia].

    PubMed

    Gu, Jiajun; Huang, Yan; Ye, Jilun; Wang, Kaijun; Zhang, Meimei

    2015-09-01

    Anesthesia plays an essential role in clinical operations. Guiding anesthesia by EEG signals is one of the most promising current approaches and has produced good results. Analysis and processing of the anesthesia EEG can provide a clean signal for further research. In this paper, a variance-threshold method was used to remove abrupt, large-amplitude interfering signals; a notch filter was used to remove power-frequency interference, a smoothing filter to remove baseline drift, and a Butterworth low-pass filter to remove high-frequency noise. In addition, a translation-invariant wavelet method was applied to the classically filtered signal to suppress residual noise while retaining the nonstationary characteristics needed for parameter calculation. Comparing the parameters computed from the processed signal with those from the unprocessed and reference signals, the standard deviation and correlation were improved, particularly for the key parameter BetaR, which provides a better signal for the multi-parameter estimation of a depth-of-anesthesia index.
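
    A minimal sketch of the classical-filter stage described above (the translation-invariant wavelet step is not reproduced). The sampling rate, 50 Hz mains frequency, cutoffs, window lengths, and thresholds are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

fs = 250.0  # assumed EEG sampling rate (Hz)

def reject_artifacts(x, win=int(fs), k=4.0):
    """Variance-threshold rejection: zero out windows whose variance exceeds
    k times the median window variance (abrupt, large-amplitude artifacts)."""
    x = x.copy()
    v = np.array([np.var(x[i:i + win]) for i in range(0, len(x) - win + 1, win)])
    bad = v > k * np.median(v)
    for j, b in enumerate(bad):
        if b:
            x[j * win:(j + 1) * win] = 0.0
    return x

def preprocess(x):
    x = reject_artifacts(x)                          # variance-threshold artifact removal
    b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)          # power-frequency notch (50 Hz assumed)
    x = filtfilt(b, a, x)
    x = x - np.convolve(x, np.ones(int(2 * fs)) / (2 * fs), mode='same')  # subtract smoothed baseline
    b, a = butter(4, 45.0, btype='low', fs=fs)       # Butterworth low-pass for high-frequency noise
    return filtfilt(b, a, x)

eeg = np.random.randn(10 * int(fs))                  # placeholder EEG segment
clean = preprocess(eeg)
```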

  12. Intelligent Signal Processing for Detection System Optimization

    SciTech Connect

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above. In addition, two of six were detected at levels 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  13. Intelligent Signal Processing for Detection System Optimization

    SciTech Connect

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.
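
    The sketch below shows only a generic wavelet-denoising front end with a simple threshold detector applied to a raw detector trace; the wavelet-neural network classifier of the cited work is not reproduced, and the wavelet family, decomposition level, and threshold rule are assumptions.

```python
import numpy as np
import pywt

def wavelet_denoise(trace, wavelet='sym8', level=5):
    """Soft-threshold wavelet denoising of a raw detector trace (front end only)."""
    coeffs = pywt.wavedec(trace, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(trace)))             # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(trace)]

def detect_peaks(trace, k=5.0):
    """Flag samples exceeding k robust standard deviations of the denoised trace."""
    d = wavelet_denoise(trace)
    sigma = np.median(np.abs(d - np.median(d))) / 0.6745
    return np.where(d > np.median(d) + k * sigma)[0]
```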

  14. Sonar signal processing using probabilistic signal and ocean environmental models.

    PubMed

    Culver, R Lee; Camin, H John

    2008-12-01

    Acoustic signals propagating through the ocean are refracted, scattered, and attenuated by the ocean volume and boundaries. Many aspects of how the ocean affects acoustic propagation are understood, such that the characteristics of a received signal can often be predicted with some degree of certainty. However, acoustic ocean parameters vary with time and location in a manner that is not, and cannot be, precisely known; some uncertainty will always remain. For this reason, the characteristics of the received signal can never be precisely predicted and must be described in probabilistic terms. A signal processing structure recently developed relies on knowledge of the ocean environment to predict the statistical characteristics of the received signal, and incorporates this description into the processor in order to detect and classify targets. Acoustic measurements at 250 Hz from the 1996 Strait of Gibraltar Acoustic Monitoring Experiment are used to illustrate how the processor utilizes environmental data to classify source depth and to underscore the importance of environmental model fidelity and completeness.

  15. Biomedical application of wavelets: analysis of electroencephalograph signals for monitoring depth of anesthesia

    NASA Astrophysics Data System (ADS)

    Abbate, Agostino; Nayak, A.; Koay, J.; Roy, R. J.; Das, Pankaj K.

    1996-03-01

    The wavelet transform (WT) has been used to study the nonstationary information in the electroencephalograph (EEG) as an aid in determining the anesthetic depth. A complex analytic mother wavelet is utilized to obtain the time evolution of the various spectral components of the EEG signal. The technique is utilized for the detection and spectral analysis of transient and background processes in the awake and asleep states. It can be observed that the response of both states before the application of the stimulus is similar in amplitude but not in spectral content, which suggests a background activity of the brain. The brain reacts to the external stimulus in two different modes depending on the state of consciousness of the subject. In the awake state there is an evident increase in response, while in the sleep state a reduction in this activity is observed. This analysis suggests that the brain has an ongoing background process that monitors external stimuli in both the sleep and awake states.
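
    A minimal sketch of a time-frequency decomposition with a complex (analytic) Morlet mother wavelet, of the kind used above to follow the spectral components of the EEG; the sampling rate, scale range, and the particular 'cmor1.5-1.0' wavelet are assumptions.

```python
import numpy as np
import pywt

fs = 128.0                                   # assumed EEG sampling rate (Hz)
t = np.arange(0, 8, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) * np.exp(-(t - 4) ** 2)    # placeholder 10 Hz burst

scales = np.geomspace(2, 200, 60)            # covers roughly 0.6-64 Hz for this wavelet
coefs, freqs = pywt.cwt(eeg, scales, 'cmor1.5-1.0', sampling_period=1 / fs)
power = np.abs(coefs) ** 2                   # time evolution of each spectral component
print(freqs[power.mean(axis=1).argmax()])    # dominant frequency of the burst (~10 Hz)
```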

  16. Overview of Digital Signal Processing Theory

    DTIC Science & Technology

    1975-05-20

    [Only OCR fragments of this report abstract are available: "... of digital integrated-circuit hardware elements along with their extremely high reliability, maintainability, and repeatability of performance have ..." and "... limited by large-signal-performance and power limitations of circuit components. In the implementation of digital signal processing systems there ..."]

  17. Optical signal processing for wireless transmission

    NASA Astrophysics Data System (ADS)

    Kawanishi, Tetsuya

    2012-01-01

    Millimeter-wave bands are attracting attention because of the wide bandwidth available for high-speed transmission. However, owing to the limited performance of electronic signal processing, it is difficult to modulate and demodulate millimeter-wave signals with high-speed baseband modulation. In this paper, we describe optical signal processing for high-speed modulation of millimeter waves, based on fast and precise lightwave control. In optical fiber communication systems, various modulation formats, such as quadrature amplitude modulation, have been reported to achieve high-speed transmission. Optical two-tone signals can be converted into millimeter-wave signals by using high-speed photodetectors. This technique can also be used for the distribution of stable reference signals in large-scale antenna arrays for radio astronomy. By combining the millimeter-wave signal generation technique with advanced optical modulation formats, we can achieve high-speed modulation of millimeter waves, where the carrier frequency and bit rate can exceed 90 GHz and 40 Gb/s, respectively.

  18. New Optical Methods for Signal Processing

    NASA Astrophysics Data System (ADS)

    Zhang, Yan

    This doctoral thesis studies optical implementations of various new algorithms and methods for large-bandwidth signal and image processing. Among the schemes studied are long data stream convolution/correlation, the Gabor and wavelet transforms, and their applications to system failure prediction, dense target signal processing, and image coding. Based on the Chinese remainder theorem, optically implementable algorithms are described which convert the convolution/correlation of long data streams into relatively small-scale linear operations such as a group of short-term vector-matrix multiplications or short-term convolutions/correlations. The proposed algorithms can be realized using existing optical analog data processors, and simulations were performed to prove their validity. Technical problems and fundamental limitations of the above schemes are studied. Following these time-domain operations, representations of signals in the joint time-frequency (scale) domain are considered. An opto-electronic Gabor coefficient processor is designed to perform the Gabor transform on short one-dimensional (1-D) signals in real time, and experimental results are presented to confirm the operational principle of the system. As an application of this processor, Gabor-transform-based transient signal detection is studied. Other schemes for implementing the Gabor transform of long 1-D signals, based on the long data stream convolver, and of 2-D signals are also investigated. Following the study of the Gabor transform, the newly suggested wavelet transform is considered for optical implementation. Using commercially available opto-electronic components, an optical wavelet processor is designed and built to perform wavelet transforms on short 1-D signals in real time. As an extension, architectures for 2-D optical wavelet transforms are also described and computer simulated with consideration of their technical problems of optical

  19. A unified approach to sparse signal processing

    NASA Astrophysics Data System (ADS)

    Marvasti, Farokh; Amini, Arash; Haddadi, Farzan; Soltanolkotabi, Mahdi; Khalaj, Babak Hossein; Aldroubi, Akram; Sanei, Saeid; Chambers, Janathon

    2012-12-01

    A unified view of the area of sparse signal processing is presented in tutorial form by bringing together various fields in which the property of sparsity has been successfully exploited. For each of these fields, various algorithms and techniques, which have been developed to leverage sparsity, are described succinctly. The common potential benefits of significant reduction in sampling rate and processing manipulations through sparse signal processing are revealed. The key application domains of sparse signal processing are sampling, coding, spectral estimation, array processing, component analysis, and multipath channel estimation. In terms of the sampling process and reconstruction algorithms, linkages are made with random sampling, compressed sensing, and rate of innovation. The redundancy introduced by channel coding in finite and real Galois fields is then related to over-sampling with similar reconstruction algorithms. The error locator polynomial (ELP) and iterative methods are shown to work quite effectively for both sampling and coding applications. The methods of Prony, Pisarenko, and MUltiple SIgnal Classification (MUSIC) are next shown to be targeted at analyzing signals with sparse frequency domain representations. Specifically, the relations of the approach of Prony to an annihilating filter in rate of innovation and ELP in coding are emphasized; the Pisarenko and MUSIC methods are further improvements of the Prony method under noisy environments. The iterative methods developed for sampling and coding applications are shown to be powerful tools in spectral estimation. Such narrowband spectral estimation is then related to multi-source location and direction of arrival estimation in array processing. Sparsity in unobservable source signals is also shown to facilitate source separation in sparse component analysis; the algorithms developed in this area such as linear programming and matching pursuit are also widely used in compressed sensing. Finally
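
    As a small worked example of the sparse-recovery machinery surveyed above, the following is a plain orthogonal matching pursuit (one of the greedy methods related to the iterative/ELP approaches mentioned in the tutorial) recovering a sparse vector from a few random measurements; the dimensions and sparsity are arbitrary illustrative choices.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from y = A x."""
    r, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))        # most correlated atom
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_s                             # update residual
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

# Toy compressed-sensing example: 30 random measurements of a 3-sparse length-100 signal
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100)) / np.sqrt(30)
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.0, -2.0, 0.5]
x_hat = omp(A, A @ x_true, k=3)
print(np.allclose(x_hat, x_true, atol=1e-6))
```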

  20. Nonlinear Real-Time Optical Signal Processing.

    DTIC Science & Technology

    1984-10-01

    [Report documentation page fragments only; recoverable bibliographic information: A.A. Sawchuk (Principal Investigator), T.C. Strand, and A.R. Tanguay, Jr., Image Processing Institute, Department of Electrical Engineering, University of Southern California, October 1984.]

  1. Bacteriorhodopsin Film For Processing SAR Signals

    NASA Technical Reports Server (NTRS)

    Yu, Jeffrey W.; Chao, Tien-Hsin; Margalit, Ruth; Cheng, Li-Jen

    1992-01-01

    "Instant" photographic film based on semisynthetic retinal pigment bacteriorhodopsin proposed for optical processing of synthetic-aperture-radar (SAR) signals. Input image recorded on film by laser operating at writing wavelength of bacteriorhodopsin, and output image recorded on computer by standard frame-grabber. Because it requires no chemical development, enables processing in nearly real time. Fast response and high resolution well suited for application. Film reusable, with concomitant reduction in cost of SAR processing.

  3. Engineering a material for biomedical applications with electric field assisted processing

    NASA Astrophysics Data System (ADS)

    Ahmad, Z.; Nangrejo, M.; Edirisinghe, M.; Stride, E.; Colombo, P.; Zhang, H. B.

    2009-10-01

    In this work, using multiple co-flows we demonstrate in-situ encapsulation of nano-particles, liquids and/or gases in different structural morphologies, which can also be deposited in a designated pattern by a direct write method and surface modification can be controlled to release encapsulated material. The range of possibilities offered by exposing a material solution to an applied electric field can result in a plethora of structures which can accommodate a whole host of biomedical applications from microfluidic devices (microchannels, loaded with various materials), printed 3D structures and patterns, lab-on-a-chip devices to encapsulated materials (capsules, tubes, fibres, dense multi-layered fibrous networks) for drug delivery and tissue engineering. The structures obtained in this way can vary in size from micrometer to the nanometer range and the processing is viable for all states of matter. The work shown demonstrates some novel structures and methodologies for processing a biomaterial.

  4. Hybrid Discrete Wavelet Transform and Gabor Filter Banks Processing for Features Extraction from Biomedical Images.

    PubMed

    Lahmiri, Salim; Boukadoum, Mounir

    2013-01-01

    A new methodology for automatic feature extraction from biomedical images and subsequent classification is presented. The approach exploits the spatial orientation of high-frequency textural features of the processed image as determined by a two-step process. First, the two-dimensional discrete wavelet transform (DWT) is applied to obtain the HH high-frequency subband image. Then, a Gabor filter bank is applied to the latter at different frequencies and spatial orientations to obtain new Gabor-filtered image whose entropy and uniformity are computed. Finally, the obtained statistics are fed to a support vector machine (SVM) binary classifier. The approach was validated on mammograms, retina, and brain magnetic resonance (MR) images. The obtained classification accuracies show better performance in comparison to common approaches that use only the DWT or Gabor filter banks for feature extraction.
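
    A hedged sketch of the two-step feature extraction described above: take the HH detail subband of a 2-D DWT, filter it with a Gabor bank at several frequencies and orientations, and compute entropy and uniformity of each filtered magnitude image before feeding an SVM. The wavelet, Gabor frequencies/orientations, histogram binning, and classifier settings are assumptions; `images` and `y` in the usage note are hypothetical.

```python
import numpy as np
import pywt
from skimage.filters import gabor
from sklearn.svm import SVC

def dwt_gabor_features(img, freqs=(0.1, 0.2, 0.3), thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Entropy and uniformity of Gabor-filtered versions of the HH (diagonal detail) subband."""
    _, (_, _, hh) = pywt.dwt2(img, 'haar')                 # HH high-frequency subband
    feats = []
    for f in freqs:
        for th in thetas:
            real, imag = gabor(hh, frequency=f, theta=th)
            mag = np.hypot(real, imag)
            hist, _ = np.histogram(mag, bins=32, density=True)
            p = hist / hist.sum()
            p = p[p > 0]
            feats += [-np.sum(p * np.log2(p)), np.sum(p ** 2)]   # entropy, uniformity
    return np.array(feats)

# Usage sketch (hypothetical training images and labels assumed available):
# X = np.vstack([dwt_gabor_features(im) for im in images])
# clf = SVC(kernel='rbf').fit(X, y)
```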

  5. Challenges in the characterization of plasma-processed three-dimensional polymeric scaffolds for biomedical applications.

    PubMed

    Fisher, Ellen R

    2013-10-09

    Low-temperature plasmas offer a versatile method for delivering tailored functionality to a range of materials. Despite the vast array of choices offered by plasma processing techniques, there remain a significant number of hurdles that must be overcome to allow this methodology to realize its full potential in the area of biocompatible materials. Challenges include issues associated with analytical characterization, material structure, plasma processing, and uniform composition following treatment. Specific examples and solutions are presented utilizing results from analyses of three-dimensional (3D) poly(ε-caprolactone) scaffolds treated with different plasma surface modification strategies that illustrate these challenges well. Notably, many of these strategies result in 3D scaffolds that are extremely hydrophilic and that enhance human Saos-2 osteoblast cell growth and proliferation, which are promising results for applications including tissue engineering and advanced biomedical devices.

  6. Influence of Packaging and Processing Conditions on the Decontamination of Laboratory Biomedical Waste by Steam Sterilization

    PubMed Central

    Ozanne, Gérard; Huot, Roger; Montpetit, Claude

    1993-01-01

    The conditions for optimal steam decontamination of polypropylene bags half loaded with laboratory biomedical waste were studied (276 bags were processed). Controls were single-closed bags without water added or incisions made in the top, standing freely in an autoclave set at 121°C. The average time required to reach 121°C at the load center was 46 min for controls. A significant increase in this time occurred following addition of water to bags without incisions (60 min), with double bagging (60 min), or when using vertical containers (82 min). A significant decrease occurred when bags were slashed (37 min) or processed at 123°C (32 min) or 132°C (19 min). Horizontal containers or addition of water to slashed bags had no significant effect. PMID:16349131

  7. Hybrid Discrete Wavelet Transform and Gabor Filter Banks Processing for Features Extraction from Biomedical Images

    PubMed Central

    Lahmiri, Salim; Boukadoum, Mounir

    2013-01-01

    A new methodology for automatic feature extraction from biomedical images and subsequent classification is presented. The approach exploits the spatial orientation of high-frequency textural features of the processed image as determined by a two-step process. First, the two-dimensional discrete wavelet transform (DWT) is applied to obtain the HH high-frequency subband image. Then, a Gabor filter bank is applied to the latter at different frequencies and spatial orientations to obtain new Gabor-filtered image whose entropy and uniformity are computed. Finally, the obtained statistics are fed to a support vector machine (SVM) binary classifier. The approach was validated on mammograms, retina, and brain magnetic resonance (MR) images. The obtained classification accuracies show better performance in comparison to common approaches that use only the DWT or Gabor filter banks for feature extraction. PMID:27006906

  8. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.

  9. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  10. Signal Processing Methods Monitor Cranial Pressure

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Dr. Norden Huang, of Goddard Space Flight Center, invented a set of algorithms (called the Hilbert-Huang Transform, or HHT) for analyzing nonlinear and nonstationary signals that developed into a user-friendly signal processing technology for analyzing time-varying processes. At an auction managed by Ocean Tomo Federal Services LLC, licenses of 10 U.S. patents and 1 domestic patent application related to HHT were sold to DynaDx Corporation, of Mountain View, California. DynaDx is now using the licensed NASA technology for medical diagnosis and prediction of brain blood flow-related problems, such as stroke, dementia, and traumatic brain injury.

  11. Array algebra estimation in signal processing

    NASA Astrophysics Data System (ADS)

    Rauhala, U. A.

    A general theory of linear estimators called array algebra estimation is interpreted in terms of multidimensional digital signal processing, mathematical statistics, and numerical analysis. The theory has emerged during the past decade from the new field of a unified vector, matrix, and tensor algebra called array algebra. The broad concepts of array algebra and its estimation theory span several modern computerized sciences and technologies, converting their established notations and terminology into one common language. Some concepts of digital signal processing are adopted into this language after a review of the principles of array algebra estimation and its predecessors in the mathematical surveying sciences.

  12. Processing oscillatory signals by incoherent feedforward loops

    NASA Astrophysics Data System (ADS)

    Zhang, Carolyn; Wu, Feilun; Tsoi, Ryan; Shats, Igor; You, Lingchong

    From the timing of amoeba development to the maintenance of stem cell pluripotency, many biological signaling pathways exhibit the ability to differentiate between pulsatile and sustained signals in the regulation of downstream gene expression. While networks underlying this signal decoding are diverse, many are built around a common motif, the incoherent feedforward loop (IFFL), where an input simultaneously activates an output and an inhibitor of the output. With appropriate parameters, this motif can generate temporal adaptation, where the system is desensitized to a sustained input. This property serves as the foundation for distinguishing signals with varying temporal profiles. Here, we use quantitative modeling to examine another property of IFFLs, the ability to process oscillatory signals. Our results indicate that the system's ability to translate pulsatile dynamics is limited by two constraints. The kinetics of IFFL components dictate the input range for which the network can decode pulsatile dynamics. In addition, a match between the network parameters and signal characteristics is required for optimal "counting". We elucidate one potential mechanism by which information processing occurs in natural networks with implications in the design of synthetic gene circuits for this purpose. This work was partially supported by the National Science Foundation Graduate Research Fellowship (CZ).
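
    A toy ordinary-differential-equation version of the incoherent feedforward loop described above, showing adaptation to a sustained input versus repeated responses to pulses; the specific equations, rate constants, and input waveforms are illustrative assumptions, not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def iffl(t, y, u, k=1.0, gamma=1.0):
    """Incoherent feedforward loop: the input u(t) activates both the output X and
    an inhibitor I of X, which produces temporal adaptation to sustained inputs."""
    I, X = y
    dI = k * u(t) - gamma * I
    dX = k * u(t) / (1.0 + I) - gamma * X
    return [dI, dX]

pulse = lambda t: 1.0 if (t % 20) < 2 else 0.0      # pulsatile input
step = lambda t: 1.0 if t > 5 else 0.0              # sustained input

for name, u in [("pulsatile", pulse), ("sustained", step)]:
    sol = solve_ivp(iffl, (0, 100), [0.0, 0.0], args=(u,), max_step=0.1)
    print(name, "peak X:", sol.y[1].max(), "final X:", sol.y[1][-1])
```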

  13. Wavelength-domain RF photonic signal processing

    NASA Astrophysics Data System (ADS)

    Gao, Lu

    This thesis presents a novel approach to RF-photonic signal processing applications based on wavelength-domain optical signal processing techniques using broadband light sources as the information carriers, such as femtosecond lasers and white light sources. The wavelength dimension of the broadband light sources adds an additional degree of freedom to conventional optical signal processing systems. Two novel wavelength-domain optical signal processing systems are presented and demonstrated in this thesis. The first wavelength-domain RF photonic signal processing system is a wavelength-compensated squint-free photonic multiple beam-forming system for wideband RF phased-array antennas. Such a photonic beam-forming system employs a new modulation scheme developed in this thesis, which uses traveling-wave tunable filters to modulate wideband RF signals onto broadband optical light sources in a frequency-mapped manner. The wavelength dimension of the broadband light sources provides an additional dimension in the wavelength-compensated Fourier beam-forming system for mapping the received RF frequencies to the linearly proportional optical frequencies, enabling true-time-delay beam forming, as well as other novel RF-photonic signal processing functions such as tunable filtering and frequency down conversion. A new slow-light mechanism, the SLUGGISH light, has also been discovered with an effective slow-light velocity of 86 m/s and a record time-bandwidth product of 20. Experimental demonstration of true-time-delay beam forming based on the SLUGGISH light effect has also been presented in this thesis. In the second wavelength-domain RF photonic signal processing system, the wavelength dimension increases the information carrying capacity by spectrally multiplexing multiple wavelength channels in a wavelength-division-multiplexing fiber-optic communication system. A novel ultrafast all-optical 3R (Re-amplification, Retiming, Re-shaping) wavelength converter based on

  14. Signal Processing Schemes for Doppler Global Velocimetry

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Lee, Joseph W.; Cavone, Angelo A.

    1991-01-01

    Two schemes for processing signals obtained from the Doppler global velocimeter are described. The analog approach is a simple, real-time method for obtaining an RS-170 video signal containing the normalized intensity image. Pseudo colors are added using a monochromatic frame grabber, producing a standard NTSC video signal that can be monitored and/or recorded. The digital approach is more complicated, but maintains the full resolution of the acquisition cameras with the capability to correct the signal image for pixel sensitivity variations and to remove background light. Prototype circuits for each scheme are described, and example results from the investigation of the vortical flow field above a 75-degree delta wing are presented.

  15. Liquid argon TPC signal formation, signal processing and reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Baller, B.

    2017-07-01

    This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain, and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions, which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.

  16. Processing of the laser Doppler velocimeter signals

    NASA Technical Reports Server (NTRS)

    Meyers, J. F.; Feller, W. V.

    1973-01-01

    The laser Doppler velocimeter (LDV) is a probeless technique that provides a remote measurement of mean and fluctuating velocities. The measurement is actually obtained from small particles embedded in the flow which scatter light from an illuminating laser beam interference pattern. A portion of this scattered light is collected by a photomultiplier which yields an electronic signal whose frequency is directly proportional to the velocity of the small particles. The purpose of this paper is to describe and critically compare three techniques most used to process this electronic signal. These techniques are: (1) spectrum analyzer - a frequency scanning filter (frequency domain instrument), (2) wide-band frequency tracker - a frequency lock loop (frequency domain instrument), and (3) high-speed frequency counter - an interval timer (time domain instrument). The study determines the ability of each technique to process the LDV signal and yield velocity data to be used in determining the flow characteristics.

  17. Signal processing for distributed sensor concept: DISCO

    NASA Astrophysics Data System (ADS)

    Rafailov, Michael K.

    2007-04-01

    The Distributed Sensor Concept (DISCO) is proposed for multiplying the capabilities of individual sensors through cooperative target engagement. DISCO relies on the ability of signal processing software to format, process, transmit, and receive sensor data and to exploit those data in a signal synthesis process. Each sensor's data are synchronized, formatted, signal-to-noise ratio (SNR) enhanced, and distributed within the sensor network. The signal processing technique used for DISCO is the Recursive Adaptive Frame Integration of Limited data (RAFIL) technique, initially proposed [1] as a way to improve the SNR, reduce the data rate, and mitigate FPA correlated noise in the digital video-signal processing of an individual sensor. In the Distributed Sensor Concept, RAFIL is used in a segmented way, with the constituent stages of the technique spatially and/or temporally separated between transmitters and receivers. Those stages include, though are not limited to, two thresholds - one tuned for optimum probability of detection, the other managing the required false alarm rate - with limited frame integration placed between the thresholds, as well as formatters, conventional integrators, and more. RAFIL allows a non-linear integration that, along with SNR gain, gives system designers more capability where cost, weight, or power considerations limit system data rate, processing, or memory capability [2]. The DISCO architecture allows flexible optimization of SNR gain, data rates, and noise suppression on the sensor side, and of limited integration, re-formatting, and the final threshold on the node side. DISCO with Recursive Adaptive Frame Integration of Limited data can have a flexible architecture that allows the hardware and software to be segmented to best suit specific DISCO applications and sensing needs - whether air- or space-based platforms, ground terminals, or the integration of sensor networks.
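
    One way to read the two-threshold, limited-frame-integration idea is sketched below: a per-frame detection threshold, integration of the limited (here binary) exceedance maps over a short history, and a final threshold that controls the false-alarm rate. This is an interpretation for illustration only; the actual RAFIL formatting, limiting, and allocation details are not reproduced, and all parameter values are placeholders.

```python
import numpy as np
from collections import deque

def rafil_sketch(frames, t1=2.5, n_integrate=8, t2=5):
    """Two-threshold limited frame integration over a stream of 2-D sensor frames."""
    buf = deque(maxlen=n_integrate)
    detections = []
    for frame in frames:
        buf.append((frame > t1).astype(np.int8))   # limit each frame to 0/1 exceedances
        acc = np.sum(np.stack(buf), axis=0)        # limited frame integration
        detections.append(acc >= t2)               # final false-alarm-controlling threshold
    return detections
```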

  18. Data-Driven Sampling Matrix Boolean Optimization for Energy-Efficient Biomedical Signal Acquisition by Compressive Sensing.

    PubMed

    Wang, Yuhao; Li, Xin; Xu, Kai; Ren, Fengbo; Yu, Hao

    2017-04-01

    Compressive sensing is widely used in biomedical applications, and the sampling matrix plays a critical role on both quality and power consumption of signal acquisition. It projects a high-dimensional vector of data into a low-dimensional subspace by matrix-vector multiplication. An optimal sampling matrix can ensure accurate data reconstruction and/or high compression ratio. Most existing optimization methods can only produce real-valued embedding matrices that result in large energy consumption during data acquisition. In this paper, we propose an efficient method that finds an optimal Boolean sampling matrix in order to reduce the energy consumption. Compared to random Boolean embedding, our data-driven Boolean sampling matrix can improve the image recovery quality by 9 dB. Moreover, in terms of sampling hardware complexity, it reduces the energy consumption by 4.6× and the silicon area by 1.9× over the data-driven real-valued embedding.
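
    A minimal sketch of Boolean compressive sampling: measurements are formed with a random {0,1} matrix, so the acquisition front end needs only additions rather than full multiply-accumulates. The data-driven optimisation of the Boolean entries described in the paper is not reproduced; the dimensions and sparse test signal are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 256, 64                                           # signal length, number of measurements

phi_bool = (rng.random((m, n)) < 0.5).astype(np.int8)    # random Boolean (0/1) sampling matrix

x = np.zeros(n)
x[[10, 100, 200]] = [2.0, -1.0, 0.7]                     # sparse test signal
y = phi_bool.astype(float) @ x                           # low-dimensional measurements (adds only)

# Any standard sparse solver (OMP, basis pursuit, ...) can reconstruct x from (y, phi_bool);
# replacing a real-valued Gaussian matrix with this Boolean one is what saves sampling energy.
print(y.shape, int(phi_bool.sum()))
```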

  19. Distributed radiofrequency signal processing using multicore fibers

    NASA Astrophysics Data System (ADS)

    Garcia, S.; Gasulla, I.

    2016-11-01

    Next generation fiber-wireless communication paradigms will require new technologies to address the current limitations to massive capacity, connectivity and flexibility. Multicore optical fibers, which were conceived for high-capacity digital communications, can bring numerous advantages to fiber-wireless radio access architectures. Besides radio over fiber parallel distribution and multiple antenna connectivity, multicore fibers can implement, at the same time, a variety of broadband processing functionalities for microwave and millimeter-wave signals. This approach leads to the novel concept of "fiber-distributed signal processing". In particular, we capitalize on the spatial parallelism inherent to multicore fibers to implement a broadband tunable true time delay line, which is the basis of multiple processing applications such as signal filtering, arbitrary waveform generation and squint-free radio beamsteering. We present the design of trench-assisted heterogeneous multicore fibers composed of cores featuring individual spectral group delays and chromatic dispersion profiles. Besides fulfilling the requirements for true time delay line operation, the MCFs are optimized in terms of higher-order dispersion, crosstalk and bend sensitivity. Microwave photonics signal processing will benefit from the performance stability, 2D operation versatility and compactness brought by the reported fiberintegrated solution.

  20. Displays, memories, and signal processing: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Articles on electronics systems and techniques are presented. The first section covers displays and other electro-optical systems; the second section is devoted to signal processing. The third section presents several new memory devices for digital equipment, including articles on holographic memories. The latest available patent information is also given.

  1. Signal processing aspects of windshear detection

    NASA Technical Reports Server (NTRS)

    Aalfs, David D.; Baxa, Ernest G., Jr.; Bracalente, Emedio M.

    1993-01-01

    Low-altitude windshear (LAWS) has been identified as a major hazard to aircraft, particularly during takeoff and landing. The Federal Aviation Administration (FAA) has been involved with developing technology to detect LAWS. A key element in this technology is high resolution pulse Doppler weather radar equipped with signal and data processing to provide timely information about possible hazardous conditions.

  2. A Virtual Laboratory for Digital Signal Processing

    ERIC Educational Resources Information Center

    Dow, Chyi-Ren; Li, Yi-Hsung; Bai, Jin-Yu

    2006-01-01

    This work designs and implements a virtual digital signal processing laboratory, VDSPL. VDSPL consists of four parts: mobile agent execution environments, mobile agents, DSP development software, and DSP experimental platforms. The network capability of VDSPL is created by using mobile agent and wrapper techniques without modifying the source code…

  4. Laser induced micro plasma processing of polymer substrates for biomedical implant applications

    NASA Astrophysics Data System (ADS)

    French, P. W.; Rosowski, A.; Murphy, M.; Irving, M.; Sharp, M. C.

    2015-07-01

    This paper reports experimental results for a new hybrid laser processing technique, Laser Induced Micro Plasma Processing (LIMP2). A transparent substrate is placed on top of a medium that interacts with the laser beam and creates a plasma. The plasma and laser beam act in unison to ablate material and create micro-structuring on the "backside" of the substrate. We report the results of a series of experiments in which the same laser-plasma interaction is used to micromachine structures into glass and polymer substrates on the "topside" of the substrate, and hence to machine non-transparent material. Micromachining of biomedical implants is proving to be an important enabling technology for controlling cell growth on a macro scale, and this paper discusses LIMP2 structuring of transparent substrates such as glasses and polymers for this application. Direct machining of these materials by lasers in the near infrared is at present impossible; LIMP2 allows lasers operating at 1064 nm to machine microstructures directly into these transparent substrates.

  5. Processing of New Materials by Additive Manufacturing: Iron-Based Alloys Containing Silver for Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Niendorf, Thomas; Brenne, Florian; Hoyer, Peter; Schwarze, Dieter; Schaper, Mirko; Grothe, Richard; Wiesener, Markus; Grundmeier, Guido; Maier, Hans Jürgen

    2015-07-01

    In the biomedical sector, production of bioresorbable implants remains challenging due to improper dissolution rates or deficient strength of many candidate alloys. Promising materials for overcoming the prevalent drawbacks are iron-based alloys containing silver. However, due to immiscibility of iron and silver these alloys cannot be manufactured based on conventional processing routes. In this study, iron-manganese-silver alloys were for the first time synthesized by means of additive manufacturing. Based on combined mechanical, microscopic, and electrochemical studies, it is shown that silver particles well distributed in the matrix can be obtained, leading to cathodic sites in the composite material. Eventually, this results in an increased dissolution rate of the alloy. Stress-strain curves showed that the incorporation of silver barely affects the mechanical properties.

  6. Methodology for Creating UMLS Content Views Appropriate for Biomedical Natural Language Processing

    PubMed Central

    Aronson, Alan R.; Mork, James G.; Névéol, Aurélie; Shooshan, Sonya E.; Demner-Fushman, Dina

    2008-01-01

    Given the growth in UMLS Metathesaurus content and the consequent growth in language complexity, it is not surprising that NLP applications that depend on the UMLS are experiencing increased difficulty in maintaining adequate levels of performance. This phenomenon underscores the need for UMLS content views which can support NLP processing of both the biomedical literature and clinical text. We report on experiments designed to provide guidance as to whether to adopt a conservative vs. an aggressive approach to the construction of UMLS content views. We tested three conservative views and two new aggressive views against two NLP applications and found that the conservative views consistently performed better for the literature application, but the most aggressive view performed best for the clinical application. PMID:18998883

  7. Mechanical properties of metals for biomedical applications using powder metallurgy process: A review

    NASA Astrophysics Data System (ADS)

    Dewidar, Montasser Marasy; Yoon, Ho-Chel; Lim, Jae Kyoo

    2006-06-01

    The use of biomaterials as implants in humans has been revolutionizing the biomedical field. During the past five decades, many implant materials made of metals have been put into practical use. Powder metallurgy techniques have been used to produce controlled porous structures, such as porous coatings applied to dental and orthopedic surgical implants, which allow bony tissue ingrowth within the implant surface, thereby improving fixation. This paper examines several important metals processed by powder metallurgy to produce elements of a total hip replacement: 316L stainless steel, Co-Cr-Mo alloy, and Ti-6Al-4V alloy. It also reviews current information on their mechanical properties, which are discussed as a function of the type of material and the fabrication process, and addresses the engineering advantages and disadvantages of each type of material.

  8. Impact of biomedical imaging and data visualization technology on the clinical development and regulatory review process

    NASA Astrophysics Data System (ADS)

    Conklin, James J.; Robbins, William L.

    1994-12-01

    The determination of whether a drug or medical device is safe and effective requires statistical proof based on valid clinical trial information. Quantitative biostatistical measures derived from anatomic and functional medical images now provide objective and reproducible measures of drug and device effects. These highly precise measures can be used to quantitatively analyze the efficacy, and occasionally the safety, of drugs and devices. Since medical imaging information is digital, or is readily digitized, it can be visualized and measured in a variety of ways to evaluate the validity of the data. Moreover, with advanced image processing and data visualization tools, this information can be electronically organized and submitted directly to the U.S. Food and Drug Administration for review. Biomedical imaging and computer-based data visualization technologies have the ability to substantially decrease the time required for clinical development and regulatory review while providing more valid data.

  9. Signal-to-noise analysis of biomedical photoacoustic measurements in time and frequency domains

    NASA Astrophysics Data System (ADS)

    Telenkov, Sergey; Mandelis, Andreas

    2010-12-01

    A sensitivity analysis of photoacoustic measurements is conducted using estimates of the signal-to-noise ratio (SNR) achieved under two different modes of optical excitation. Standard pulsed time-domain photoacoustic imaging is compared to its frequency-domain counterpart with a modulated optical source. The feasibility of high-SNR, continuous-wave, depth-resolved photoacoustics with a frequency-swept (chirp) modulation pattern is demonstrated. Utilization of chirped modulation waveforms achieves a dramatic SNR increase for the periodic signals while preserving axial resolution comparable to the time-domain method. Estimates of the signal-to-noise ratio were obtained using typical parameters of piezoelectric transducers and optical properties of tissue.
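
    The SNR gain of the frequency-swept (chirp) mode comes from correlating the received waveform with the transmitted chirp (pulse compression). The sketch below illustrates that step only; the sampling rate, chirp band, delay, and noise level are made-up illustrative values.

```python
import numpy as np
from scipy.signal import chirp, correlate

fs = 50e6                                   # assumed sampling rate (Hz)
T = 1e-3                                    # chirp duration (s)
t = np.arange(0, T, 1 / fs)
ref = chirp(t, f0=1e6, f1=5e6, t1=T)        # frequency-swept modulation waveform

delay = 2.0e-6                              # acoustic transit time to an absorber (s)
rx = np.roll(ref, int(delay * fs)) * 0.05 + 0.5 * np.random.randn(t.size)  # weak, noisy return

# Matched filtering (cross-correlation with the chirp) compresses the long waveform into a
# sharp peak, giving the SNR gain and depth resolution discussed above.
xc = correlate(rx, ref, mode='full')
lag = np.argmax(np.abs(xc)) - (len(ref) - 1)
print("estimated delay: %.2f us" % (lag / fs * 1e6))
```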

  10. Adaptive modeling and spectral estimation of nonstationary biomedical signals based on Kalman filtering.

    PubMed

    Aboy, Mateo; Márquez, Oscar W; McNames, James; Hornero, Roberto; Trong, Tran; Goldstein, Brahm

    2005-08-01

    We describe an algorithm to estimate the instantaneous power spectral density (PSD) of nonstationary signals. The algorithm is based on a dual Kalman filter that adaptively generates an estimate of the autoregressive model parameters at each time instant. The algorithm exhibits better PSD tracking performance on nonstationary signals than classical nonparametric methodologies, and does not assume local stationarity of the data. Furthermore, it provides better time-frequency resolution, and is robust to model mismatches. We demonstrate its usefulness with a sample application involving PSD estimation of intracranial pressure (ICP) signals from patients with traumatic brain injury (TBI).
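
    A simplified sketch of the idea: a single (not dual) Kalman filter tracks time-varying AR coefficients as a random-walk state and converts them to an instantaneous AR spectrum at each sample. The model order, process/observation noise values, and sampling rate are assumptions, and the joint estimation of the noise variance used in the cited dual-filter algorithm is omitted.

```python
import numpy as np

def kalman_ar_psd(x, p=6, q=1e-4, r=1.0, nfft=256, fs=125.0):
    """Track time-varying AR(p) coefficients with a Kalman filter and return the
    instantaneous AR power spectrum at each sample (observation noise r held fixed)."""
    n = len(x)
    a = np.zeros(p)                                  # AR coefficient state estimate
    P = np.eye(p)                                    # state covariance
    f = np.arange(nfft) / nfft * fs / 2              # frequency grid (Hz)
    psd = np.zeros((n, nfft))
    for k in range(p, n):
        h = x[k - p:k][::-1]                         # regressor: p most recent past samples
        P = P + q * np.eye(p)                        # random-walk prediction
        e = x[k] - h @ a                             # innovation
        S = h @ P @ h + r
        K = P @ h / S                                # Kalman gain
        a = a + K * e
        P = P - np.outer(K, h) @ P
        # Instantaneous AR spectrum: sigma^2 / |1 - sum_j a_j exp(-i 2 pi f j / fs)|^2
        A = 1 - np.exp(-2j * np.pi * np.outer(f / fs, np.arange(1, p + 1))) @ a
        psd[k] = r / np.abs(A) ** 2
    return f, psd
```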

  11. Complexity of EEG-signal in Time Domain - Possible Biomedical Application

    NASA Astrophysics Data System (ADS)

    Klonowski, Wlodzimierz; Olejarczyk, Elzbieta; Stepien, Robert

    2002-07-01

    Human brain is a highly complex nonlinear system. So it is not surprising that in analysis of EEG-signal, which represents overall activity of the brain, the methods of Nonlinear Dynamics (or Chaos Theory as it is commonly called) can be used. Even if the signal is not chaotic these methods are a motivating tool to explore changes in brain activity due to different functional activation states, e.g. different sleep stages, or to applied therapy, e.g. exposure to chemical agents (drugs) and physical factors (light, magnetic field). The methods supplied by Nonlinear Dynamics reveal signal characteristics that are not revealed by linear methods like FFT. Better understanding of principles that govern dynamics and complexity of EEG-signal can help to find `the signatures' of different physiological and pathological states of human brain, quantitative characteristics that may find applications in medical diagnostics.
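
    One widely used time-domain complexity measure of the kind referred to above is the Higuchi fractal dimension; the sketch below is a standard textbook implementation with an assumed kmax, not code from the cited work.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D series: slope of log curve length vs log(1/k)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    L = []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)
            lm = np.sum(np.abs(np.diff(x[idx]))) * (N - 1) / ((len(idx) - 1) * k)
            Lk.append(lm / k)
        L.append(np.mean(Lk))
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(L), 1)
    return slope

print(higuchi_fd(np.random.randn(1000)))   # white noise gives a value close to 2
```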

  12. An ultrasonic device for signal processing

    NASA Astrophysics Data System (ADS)

    Kulakov, S. V.; Leks, A. G.; Semenov, S. P.; Ulyanov, G. K.

    1985-11-01

    The invention concerns the field of radioengineering and can be used in analog processors of the signals of phased antenna arrays. There are familiar devices for processing the signals of phased antenna arrays. However these are large in size, structurally complicated, and contain expensive parts. In the proposed device, for the purpose of simplification and cheapening the design and reducing the dimensions, the counting system is in the form of a receiving acoustical array, the elements of which are hooked up to a television-type indicator.

  13. Invariance algorithms for processing NDE signals

    NASA Astrophysics Data System (ADS)

    Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William

    1996-11-01

    Signals that are obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw, but also reflect variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe-wall. The MFL signals contain information both about the permeability of the pipe-wall and the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters, while retaining defect related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.

  14. Digital signal processing for ionospheric propagation diagnostics

    NASA Astrophysics Data System (ADS)

    Rino, Charles L.; Groves, Keith M.; Carrano, Charles S.; Gunter, Jacob H.; Parris, Richard T.

    2015-08-01

    For decades, analog beacon satellite receivers have generated multifrequency narrowband complex data streams that could be processed directly to extract total electron content (TEC) and scintillation diagnostics. With the advent of software-defined radio, modern digital receivers generate baseband complex data streams that require intermediate processing to extract the narrowband modulation imparted to the signal by ionospheric structure. This paper develops and demonstrates a processing algorithm for digital beacon satellite data that will extract TEC and scintillation components. For algorithm evaluation, a simulator was developed to generate noise-limited multifrequency complex digital signal realizations with representative orbital dynamics and propagation disturbances. A frequency-tracking procedure is used to capture the slowly changing frequency component. Dynamic demodulation against the low-frequency estimate captures the scintillation. The low-frequency reference can be used directly for dual-frequency TEC estimation.

  15. Using rule-based natural language processing to improve disease normalization in biomedical text.

    PubMed

    Kang, Ning; Singh, Bharat; Afzal, Zubair; van Mulligen, Erik M; Kors, Jan A

    2013-01-01

    In order for computers to extract useful information from unstructured text, a concept normalization system is needed to link relevant concepts in a text to sources that contain further information about the concept. Popular concept normalization tools in the biomedical field are dictionary-based. In this study we investigate the usefulness of natural language processing (NLP) as an adjunct to dictionary-based concept normalization. We compared the performance of two biomedical concept normalization systems, MetaMap and Peregrine, on the Arizona Disease Corpus, with and without the use of a rule-based NLP module. Performance was assessed for exact and inexact boundary matching of the system annotations with those of the gold standard and for concept identifier matching. Without the NLP module, MetaMap and Peregrine attained F-scores of 61.0% and 63.9%, respectively, for exact boundary matching, and 55.1% and 56.9% for concept identifier matching. With the aid of the NLP module, the F-scores of MetaMap and Peregrine improved to 73.3% and 78.0% for boundary matching, and to 66.2% and 69.8% for concept identifier matching. For inexact boundary matching, performances further increased to 85.5% and 85.4%, and to 73.6% and 73.3% for concept identifier matching. We have shown the added value of NLP for the recognition and normalization of diseases with MetaMap and Peregrine. The NLP module is general and can be applied in combination with any concept normalization system. Whether its use for concept types other than disease is equally advantageous remains to be investigated.

  16. Processing Electromyographic Signals to Recognize Words

    NASA Technical Reports Server (NTRS)

    Jorgensen, C. C.; Lee, D. D.

    2009-01-01

    A recently invented speech-recognition method applies to words that are articulated by means of the tongue and throat muscles but are otherwise not voiced or, at most, are spoken sotto voce. This method could satisfy a need for speech recognition under circumstances in which normal audible speech is difficult, poses a hazard, is disturbing to listeners, or compromises privacy. The method could also be used to augment traditional speech recognition by providing an additional source of information about articulator activity. The method can be characterized as intermediate between (1) conventional speech recognition through processing of voice sounds and (2) a method, not yet developed, of processing electroencephalographic signals to extract unspoken words directly from thoughts. This method involves computational processing of digitized electromyographic (EMG) signals from muscle innervation acquired by surface electrodes under a subject's chin near the tongue and on the side of the subject's throat near the larynx. After preprocessing, digitization, and feature extraction, EMG signals are processed by a neural-network pattern classifier, implemented in software, that performs the bulk of the recognition task as described.
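
    A hedged sketch of the processing chain described above: frame the surface-EMG recording, extract simple time-domain features per frame, and train a neural-network pattern classifier on labelled word examples. The sampling rate, frame length, feature set, and classifier are assumptions, and `recordings`/`word_labels` in the usage note are hypothetical fixed-length training data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

fs = 2000.0   # assumed surface-EMG sampling rate (Hz)

def emg_features(frame):
    """Per-frame features: RMS, mean absolute value, zero crossings, waveform length."""
    return np.array([
        np.sqrt(np.mean(frame ** 2)),
        np.mean(np.abs(frame)),
        np.sum(np.diff(np.sign(frame)) != 0),
        np.sum(np.abs(np.diff(frame))),
    ])

def word_features(signal, frame_len=int(0.1 * fs)):
    frames = [signal[i:i + frame_len] for i in range(0, len(signal) - frame_len + 1, frame_len)]
    return np.concatenate([emg_features(f) for f in frames])

# Usage sketch with hypothetical labelled recordings of equal length:
# X = np.vstack([word_features(rec) for rec in recordings])
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000).fit(X, word_labels)
```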

  17. Female sexual responses using signal processing techniques.

    PubMed

    Rafiee, Javad; Rafiee, Mohammad A; Michaelsen, Diane

    2009-11-01

    An automatic algorithm for processing vaginal photoplethysmograph signals could benefit researchers investigating sexual behaviors by standardizing interlaboratory methods. Female sexual response does not co-vary consistently in the self-report and physiological domains, making the advancement of measurements difficult. Automatic processing algorithms would increase analysis efficiency. Vaginal pulse amplitude (VPA) is a method used to measure female sexual responses. However, VPA recordings are problematic because of the movement artifacts that impinge on the signal. This article suggests a real-time approach for automatic artifact detection in VPA signals. The stochastic changes (artifacts) of VPA are characterized mathematically in this research, and a method is presented to automatically extract the frequency of interest from VPA based on the autocorrelation function and wavelet analysis. Additionally, a calculation is presented for the vaginal blood flow change rate (VBFCR) during female sexual arousal using VPA signals. The primary aim is to investigate the experimental VPA measures based on theoretical techniques. In particular, the goal is to introduce an automatic monitoring system for female sexual behaviors, which may be helpful for experts in female sexuality. The methods in the research are divided into experimental and theoretical parts. The VPA of twenty women was measured by a common vaginal photoplethysmography system in two conditions: each subject was tested watching a neutral video followed by an erotic video. For the theoretical analysis, an approach based on the wavelet transform was applied to process the VPA. The main outcomes are the introduction of an automatic, real-time monitoring system for female sexual behaviors, automatic movement artifact detection, the VBFCR, and the first application of the wavelet transform and correlogram in VPA analysis. The natural and significant frequency information of VPA signals was extracted to automatically detect movement artifacts and to investigate the
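
    One ingredient of the approach described above, extracting the frequency of interest via the autocorrelation function, can be sketched as follows; the sampling rate and physiological search band are illustrative assumptions, and this is not the authors' full artifact-detection method.

```python
# Sketch of a single step mentioned above: estimate the dominant pulse
# frequency of a plethysmographic trace from its autocorrelation.
import numpy as np

def dominant_frequency(x, fs, f_min=0.7, f_max=3.0):
    """x: plethysmographic samples, fs: sample rate in Hz (illustrative values)."""
    x = x - np.mean(x)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]   # one-sided autocorrelation
    lag_min, lag_max = int(fs / f_max), int(fs / f_min)
    lag = lag_min + np.argmax(acf[lag_min:lag_max])      # first strong periodicity in band
    return fs / lag
```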

  18. Hot topics: Signal processing in acoustics

    NASA Astrophysics Data System (ADS)

    Candy, James

    2002-05-01

    Signal processing represents a technology that provides the mechanism to extract the desired information from noisy acoustical measurement data. The desired result can range from extracting a single number like sound intensity level in the case of marine mammals to the seemingly impossible task of imaging the complex bottom in a hostile ocean environment. Some of the latest approaches to solving acoustical processing problems including sophisticated Bayesian processors in architectural acoustics, iterative flaw removal processing for non-destructive evaluation, time-reversal imaging for buried objects and time-reversal receivers in communications as well as some of the exciting breakthroughs using so-called blind processing techniques for deconvolution are discussed. Processors discussed range from the simple to the sophisticated as dictated by the particular application. It is shown how processing techniques are crucial to extracting the required information for success in the underlying application.

  19. Nonlinear Cochlear Signal Processing and Phoneme Perception

    NASA Astrophysics Data System (ADS)

    Allen, Jont B.; Régnier, Marion; Phatak, Sandeep; Li, Feipeng

    2009-02-01

    The most important communication signal is human speech. It is helpful to think of speech communication in terms of Claude Shannon's information theory channel model. When thus viewed, it immediately becomes clear that the most complex part of the speech communication channel is the auditory system (the receiver). In my opinion, even after years of work, relatively little is known about how the human auditory system decodes speech. Given cochlear damage, speech scores are greatly reduced, even with tiny amounts of noise. The exact reasons for this SNR-loss presently remain unclear, but I speculate that its source must be cochlear outer hair cell temporal processing, not central processing. Specifically, "temporal edge enhancement" of the speech signal and forward masking could easily be modified in such ears, leading to SNR-loss. Whatever the reason, SNR-loss is the key problem that needs to be fully researched.

  20. An Ontology-Enabled Natural Language Processing Pipeline for Provenance Metadata Extraction from Biomedical Text (Short Paper).

    PubMed

    Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S

    2016-10-01

    Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of the biomedical domain and the lack of appropriate natural language processing (NLP) techniques. High-quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform as part of the Provenance for Clinical and Healthcare Research (ProvCaRe) project. The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of the ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata as compared to existing NLP pipelines such as MetaMap.

  1. FPGA implementation of a ZigBee wireless network control interface to transmit biomedical signals

    NASA Astrophysics Data System (ADS)

    Gómez López, M. A.; Goy, C. B.; Bolognini, P. C.; Herrera, M. C.

    2011-12-01

    In recent years, cardiac hemodynamic monitors have incorporated new technologies based on wireless sensor networks which can implement different types of communication protocols. More precisely, a recently developed digital conductance catheter system adds a wireless ZigBee module (IEEE 802.15.4 standard) to transmit cardiac signals (ECG, intraventricular pressure and volume), which would allow physicians to evaluate the patient's cardiac status in a noninvasive way. The aim of this paper is to describe a control interface, implemented in an FPGA device, to manage a ZigBee wireless network. ZigBee technology is used due to its excellent performance, including simplicity, low power consumption, short-range transmission and low cost. The FPGA internal memory stores 8-bit signals with which the control interface prepares the information packets. These data are sent to the ZigBee END DEVICE module, which receives them and transmits them wirelessly to the external COORDINATOR module. Using a USB port, the COORDINATOR sends the signals to a personal computer for display. Each functional block of the control interface was assessed by means of timing diagrams. Three biological signals, organized in packets and converted to the RS232 serial protocol, were successfully transmitted and displayed on a PC screen. For this purpose, a custom-made graphical software application was designed using LabView.

  2. An improvement of unsupervised hybrid biomedical signal classifiers by optimal feature extraction in system preliminary layer.

    PubMed

    Kostka, P; Tkacz, E J

    2004-01-01

    In this paper we place emphasis especially on the feature extraction stage of the classification procedure, where new feature vectors, obtained from a high-dimensional data space and best matched to the analysed classification task, are proposed. Based on multilevel Mallat wavelet decomposition, parameters obtained directly from the wavelet components as well as features resulting from energy and entropy analysis are tested. In the classifier part of the proposed hybrid systems, unsupervised learning systems with self-organizing maps (SOM) and adaptive resonance networks (ART2) are verified. Time-frequency (T-F) methods, and particularly wavelet analysis, were chosen as the feature extraction tool because of their ability to deal with non-stationary signals. It is important to take into consideration that the heart rate variability (HRV) signals classified in the elaborated systems are nonstationary and carry important parameters in both the time and frequency domains. The proposed structures were tested using a set of clinically characterized heart rate variability (HRV) signals from 62 patients with coronary artery disease of different severity. Additionally, a similar control group of healthy subjects was analyzed. The whole database was divided into learning and verifying sets. Results showed that the new HRV signal representation, obtained in the space created by the feature vector based on the Shannon entropy of the Mallat component energy distribution, gave the best classifier performance with the ART2 neural structure used in the classifier part of the described hybrid system.
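
    A minimal sketch of the feature-extraction stage described above is given below: a multilevel wavelet decomposition of an HRV series, the relative energy of each component, and the Shannon entropy of that energy distribution, concatenated into a feature vector. PyWavelets ('pywt'), the db4 mother wavelet and the decomposition depth are assumptions chosen for illustration, not the authors' exact settings.

```python
# Wavelet energy-distribution and Shannon-entropy features for an HRV series.
import numpy as np
import pywt

def wavelet_energy_entropy_features(hrv, wavelet="db4", level=5):
    """hrv: 1-D numpy array of RR-interval samples (hypothetical input)."""
    coeffs = pywt.wavedec(hrv, wavelet, level=level)    # multilevel decomposition
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()                       # relative energy distribution
    entropy = -np.sum(p * np.log(p + 1e-12))            # Shannon entropy of the distribution
    return np.concatenate([p, [entropy]])
```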

  3. A review of biocompatible metal injection moulding process parameters for biomedical applications.

    PubMed

    Hamidi, M F F A; Harun, W S W; Samykano, M; Ghani, S A C; Ghazalli, Z; Ahmad, F; Sulong, A B

    2017-09-01

    Biocompatible metals have been revolutionizing the biomedical field, predominantly in human implant applications, where these metals are widely used as substitutes for, or to restore the function of, degenerated tissues or organs. Powder metallurgy techniques, specifically the metal injection moulding (MIM) process, have been employed for the fabrication of controlled porous structures used for dental and orthopaedic surgical implants. A porous metal implant allows bony tissue ingrowth on the implant surface, thereby enhancing fixation and recovery. This paper elaborates a systematic classification of various biocompatible metals from the perspective of the MIM process as used in the medical industries. In this study, three biocompatible metal families are reviewed: stainless steels, cobalt alloys, and titanium alloys. The applications of MIM technology in biomedicine, focusing primarily on the MIM process setting parameters, are discussed thoroughly. This paper should be of value to investigators interested in the state of the art of metal powder metallurgy, particularly MIM technology for biocompatible metal implant design and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Signal Detection for Pareto Renewal Processes.

    DTIC Science & Technology

    1982-10-01

    [Only OCR fragments of the report are indexed here. Recoverable content: the report, from the Series in Statistics and Biostatistics, concerns signal detection for Pareto renewal processes (C. B. Bell); its references note that the Pareto distribution was introduced by Vilfredo Pareto (1848-1923) in Cours d'Economie Politique (Lausanne and Paris: Rouge and Cie, 1897), cited as Reference [22].]

  5. A Vector Signal Processing Approach to Color

    DTIC Science & Technology

    1992-01-01

    [Only OCR fragments of the report cover page are indexed here. Recoverable content: MIT Artificial Intelligence Laboratory Technical Report 1349 (AIM 1349, accession number AD-A259 499), "A Vector Signal Processing Approach to Color" by Kah-Kay Sun, 545 Technology Square, Cambridge, Massachusetts 02139; the report describes research done at the Artificial Intelligence Laboratory at MIT.]

  6. The role of multiple toll-like receptor signalling cascades on interactions between biomedical polymers and dendritic cells.

    PubMed

    Shokouhi, Behnaz; Coban, Cevayir; Hasirci, Vasif; Aydin, Erkin; Dhanasingh, Anandhan; Shi, Nian; Koyama, Shohei; Akira, Shizuo; Zenke, Martin; Sechi, Antonio S

    2010-08-01

    Biomaterials are used in several health-related applications ranging from tissue regeneration to antigen-delivery systems. Yet, biomaterials often cause inflammatory reactions suggesting that they profoundly alter the homeostasis of host immune cells such as dendritic cells (DCs). Thus, there is a major need to understand how biomaterials affect the function of these cells. In this study, we have analysed the influence of chemically and physically diverse biomaterials on DCs using several murine knockouts. DCs can sense biomedical polymers through a mechanism, which involves multiple TLR/MyD88-dependent signalling pathways, in particular TLR2, TLR4 and TLR6. TLR-biomaterial interactions induce the expression of activation markers and pro-inflammatory cytokines and are sufficient to confer on DCs the ability to activate antigen-specific T cells. This happens through a direct biomaterial-DC interaction although, for degradable biomaterials, soluble polymer molecules can also alter DC function. Finally, the engagement of TLRs by biomaterials profoundly alters DC adhesive properties. Our findings could be useful for designing structure-function studies aimed at developing more bioinert materials. Moreover, they could also be exploited to generate biomaterials for studying the molecular mechanisms of TLR signalling and DC activation aiming at fine-tuning desired and pre-determined immune responses. Copyright 2010 Elsevier Ltd. All rights reserved.

  7. Signal processing for ION mobility spectrometers

    NASA Technical Reports Server (NTRS)

    Taylor, S.; Hinton, M.; Turner, R.

    1995-01-01

    Signal processing techniques for systems based upon Ion Mobility Spectrometry will be discussed in the light of 10 years of experience in the design of real-time IMS. Among the topics to be covered are compensation techniques for variations in the number density of the gas - the use of an internal standard (a reference peak) or pressure and temperature sensors. Sources of noise and methods for noise reduction will be discussed together with resolution limitations and the ability of deconvolution techniques to improve resolving power. The use of neural networks (either by themselves or as a component part of a processing system) will be reviewed.
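
    One concrete form of the number-density compensation mentioned above is the standard reduced-mobility calculation used throughout ion mobility spectrometry, in which the measured drift time is normalized to 273.15 K and 760 Torr using pressure and temperature readings. The sketch below shows that textbook correction (not necessarily the authors' implementation), with illustrative instrument values.

```python
# Standard number-density compensation in ion mobility spectrometry: convert a
# measured drift time into a reduced mobility K0 referenced to 273.15 K / 760 Torr.
def reduced_mobility(drift_time_s, drift_length_m, drift_voltage_v,
                     pressure_torr, temperature_k):
    mobility = drift_length_m ** 2 / (drift_voltage_v * drift_time_s)   # K in m^2/(V*s)
    return mobility * (273.15 / temperature_k) * (pressure_torr / 760.0)

# Example with purely illustrative values: 7 cm drift tube, 2 kV, 12 ms drift time.
print(reduced_mobility(12e-3, 0.07, 2000.0, 745.0, 303.0))
```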

  8. C language algorithms for digital signal processing

    SciTech Connect

    Embree, P.M.; Kimble, B.

    1991-01-01

    The use of the C programming language to construct digital signal-processing (DSP) algorithms for operation on high-performance personal computers is described in a textbook for engineering students. Chapters are devoted to the fundamental principles of DSP, basic C programming techniques, user-interface and disk-storage routines, filtering routines, discrete Fourier transforms, matrix and vector routines, and image-processing routines. Also included is a floppy disk containing a library of standard C mathematics, character-string, memory-allocation, and I/O functions; a library of DSP functions; and several sample DSP programs. 83 refs.

  9. Using Seismic Signals to Forecast Volcanic Processes

    NASA Astrophysics Data System (ADS)

    Salvage, R.; Neuberg, J. W.

    2012-04-01

    Understanding the seismic signals generated during volcanic unrest allows scientists to more accurately predict and understand active volcanoes, since these signals are intrinsically linked to rock failure at depth (Voight, 1988). In particular, low-frequency long-period signals (LP events) have been related to the movement of fluid and the brittle failure of magma at depth due to high strain rates (Hammer and Neuberg, 2009), and so relate fundamentally to surface processes. However, there is currently no physical quantitative model for determining the likelihood of an eruption following precursory seismic signals, or the timing or type of eruption that will ensue (Benson et al., 2010). Since the beginning of its current eruptive phase, accelerating LP swarms (< 10 events per hour) have been a common feature at Soufriere Hills volcano, Montserrat, prior to surface expressions such as dome collapse or eruptions (Miller et al., 1998). The dynamical behaviour of such swarms can be related to accelerated magma ascent rates, since the seismicity is thought to be a consequence of magma deformation as it rises to the surface. In particular, acceleration rates can be successfully used, in combination with the inverse material failure law, a linear relationship against time (Voight, 1988), in the accurate prediction of volcanic eruption timings. Currently, this has only been investigated for retrospective events (Hammer and Neuberg, 2009). The identification of LP swarms on Montserrat and analysis of their dynamical characteristics allow a better understanding of the nature of the seismic signals themselves, as well as their relationship to surface processes such as magma extrusion rates. Acceleration and deceleration rates of seismic swarms provide insights into the plumbing system of the volcano at depth. The application of the material failure law to multiple LP swarms of data allows a critical evaluation of the accuracy of the method, which further refines current
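
    A minimal sketch of the forecasting idea referred to above: under Voight's material failure law the inverse event rate of an accelerating LP swarm falls roughly linearly with time, so a straight-line fit extrapolated to zero inverse rate gives an estimate of the failure (eruption) time. The binning and least-squares fit below are illustrative assumptions; the event times would be hypothetical inputs, not Montserrat data.

```python
# Inverse-rate extrapolation in the spirit of Voight's material failure law:
# bin the LP event times, fit 1/rate against time, and extrapolate to zero.
import numpy as np

def forecast_failure_time(event_times, bin_s=3600.0):
    """event_times: 1-D array of event times in seconds (hypothetical input)."""
    t0, t1 = event_times.min(), event_times.max()
    edges = np.arange(t0, t1 + bin_s, bin_s)
    counts, _ = np.histogram(event_times, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    keep = counts > 0
    inv_rate = bin_s / counts[keep]                    # seconds per event
    slope, intercept = np.polyfit(centers[keep], inv_rate, 1)
    return -intercept / slope                          # time at which 1/rate extrapolates to zero
```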

  10. PROBLEM-BASED LEARNING FOR PROFESSIONALISM AND ETHICS TRAINING OF BIOMEDICAL GRADUATE STUDENTS: PROCESS EVALUATION

    PubMed Central

    Jones, Nancy L.; Peiffer, Ann M.; Lambros, Ann; Eldridge, J. Charles

    2013-01-01

    Purpose A process evaluation was conducted to assess whether the newly developed Problem-Based Learning (PBL) curriculum designed to teach professionalism and ethics to biomedical graduate students was achieving its objectives. The curriculum was chosen to present realistic cases and issues in the practice of science, to promote skill development, and to acculturate students to the professional norms of science. Method The extent to which the objectives for the curriculum and courses were perceived to be met was assessed using 5-step Likert-scaled questions, open-ended questions, and interviews of students and facilitators. Results The process evaluation indicated that both facilitators and students perceived that course objectives were being met. For example, active learning was preferred over lectures; both faculty and students perceived that the curriculum increased their understanding of the norms, role obligations, and responsibilities of professional scientists; their ability to identify ethical situations was increased; and skills in moral reasoning and effective group work were developed. Conclusions The information gathered was used to improve course implementation and instructional material. For example, a negative perception of the course as an "ethics" course was addressed by redesigning case debriefing activities so that they reinforced learning objectives and important skills. Cases were refined to be more engaging and relevant for students, and facilitators were given more specific training and resources for each case. The PBL small-group strategy can stimulate an environment more aware of the ethical implications of science and increase socialization and open communication about professional behavior. PMID:20663754

  11. Surface Coating of Oxide Powders: A New Synthesis Method to Process Biomedical Grade Nano-Composites

    PubMed Central

    Palmero, Paola; Montanaro, Laura; Reveron, Helen; Chevalier, Jérôme

    2014-01-01

    Composite and nanocomposite ceramics have achieved special interest in recent years when used for biomedical applications. They have demonstrated, in some cases, increased performance, reliability, and stability in vivo with respect to pure monolithic ceramics. Current research aims at developing new compositions and architectures to further increase their properties. However, the ability to tailor the microstructure requires careful control of all steps of manufacturing, from the synthesis of composite nanopowders to their processing and sintering. This review aims at deepening understanding of the critical issues associated with the manufacturing of nanocomposite ceramics, focusing on the key role of the synthesis methods in developing homogeneous and tailored microstructures. In this frame, the authors have developed an innovative method, named the "surface-coating process", in which matrix oxide powders are coated with inorganic precursors of the second phase. The method is illustrated in two case studies: the former on Zirconia Toughened Alumina (ZTA) materials for orthopedic applications, and the latter on zirconia-based composites for dental implants, discussing the advances and the potential of the method, which can become a valuable alternative to the current synthesis process already used at a clinical and industrial scale. PMID:28788117

  12. An intelligent, onboard signal processing payload concept

    SciTech Connect

    Shriver, P. M.; Harikumar, J.; Briles, S. C.; Gokhale, M.

    2003-01-01

    Our approach to onboard processing will enable a quicker return and improved quality of processed data from small, remote-sensing satellites. We describe an intelligent payload concept which processes RF lightning signal data onboard the spacecraft in a power-aware manner. Presently, onboard processing is severely curtailed due to the conventional management of limited resources and power-unaware payload designs. Delays of days to weeks are commonly experienced before raw data is received, processed into a human-usable format, and finally transmitted to the end-user. We enable this resource-critical technology of onboard processing through the concept of Algorithm Power Modulation (APM). APM is a decision process used to execute a specific software algorithm, from a suite of possible algorithms, to make the best use of the available power. The suite of software algorithms chosen for our application is intended to reduce the probability of false alarms through postprocessing. Each algorithm however also has a cost in energy usage. A heuristic decision tree procedure is used which selects an algorithm based on the available power, time allocated, algorithm priority, and algorithm performance. We demonstrate our approach to power-aware onboard processing through a preliminary software simulation.
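
    A toy selector in the spirit of Algorithm Power Modulation as described above is sketched below: given the available energy and time budget, the highest-priority, best-performing algorithm from a suite of candidates that still fits the budget is chosen. The candidate table, costs and scores are entirely hypothetical.

```python
# Hypothetical power-aware algorithm selection (APM-style): pick the feasible
# candidate with the highest priority, breaking ties by expected performance.
CANDIDATES = [
    # name,            energy_J, runtime_s, priority, performance
    ("full_classifier",   12.0,     40.0,      3,        0.95),
    ("reduced_features",   6.0,     18.0,      2,        0.88),
    ("threshold_only",     1.5,      4.0,      1,        0.70),
]

def select_algorithm(available_energy_j, available_time_s):
    feasible = [c for c in CANDIDATES
                if c[1] <= available_energy_j and c[2] <= available_time_s]
    if not feasible:
        return None                                   # skip onboard processing this pass
    return max(feasible, key=lambda c: (c[3], c[4]))[0]

print(select_algorithm(available_energy_j=7.0, available_time_s=30.0))  # "reduced_features"
```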

  13. Radar transponder apparatus and signal processing technique

    SciTech Connect

    Axline, R.M. Jr.; Sloan, G.R.; Spalding, R.E.

    1994-12-31

    An active, phase-coded, time-grating transponder and a synthetic-aperture radar (SAR) and signal processor means, in combination, allow the recognition and location of the transponder (tag) in the SAR image and allow communication of information messages from the transponder to the SAR. The SAR is an illuminating radar having special processing modifications in an image-formation processor to receive an echo from a remote transponder, after the transponder receives and retransmits the SAR illuminations, and to enhance the transponder's echo relative to surrounding ground clutter by recognizing special transponder modulations from phase shifts in the transponder retransmissions. The remote radio-frequency tag also transmits information to the SAR through a single antenna that also serves to receive the SAR illuminations. Unique tag-modulation and SAR signal processing techniques, in combination, allow the detection and precise geographical location of the tag, through the reduction of interfering signals from ground clutter, and allow environmental and status information from said tag to be communicated to said SAR.

  14. Radar transponder apparatus and signal processing technique

    DOEpatents

    Axline, Jr., Robert M.; Sloan, George R.; Spalding, Richard E.

    1996-01-01

    An active, phase-coded, time-grating transponder and a synthetic-aperture radar (SAR) and signal processor means, in combination, allow the recognition and location of the transponder (tag) in the SAR image and allow communication of information messages from the transponder to the SAR. The SAR is an illuminating radar having special processing modifications in an image-formation processor to receive an echo from a remote transponder, after the transponder receives and retransmits the SAR illuminations, and to enhance the transponder's echo relative to surrounding ground clutter by recognizing special transponder modulations from phase shifts in the transponder retransmissions. The remote radio-frequency tag also transmits information to the SAR through a single antenna that also serves to receive the SAR illuminations. Unique tag-modulation and SAR signal processing techniques, in combination, allow the detection and precise geographical location of the tag through the reduction of interfering signals from ground clutter, and allow environmental and status information from said tag to be communicated to said SAR.

  15. Radar transponder apparatus and signal processing technique

    DOEpatents

    Axline, R.M. Jr.; Sloan, G.R.; Spalding, R.E.

    1996-01-23

    An active, phase-coded, time-grating transponder and a synthetic-aperture radar (SAR) and signal processor means, in combination, allow the recognition and location of the transponder (tag) in the SAR image and allow communication of information messages from the transponder to the SAR. The SAR is an illuminating radar having special processing modifications in an image-formation processor to receive an echo from a remote transponder, after the transponder receives and retransmits the SAR illuminations, and to enhance the transponder's echo relative to surrounding ground clutter by recognizing special transponder modulations from phase shifts in the transponder retransmissions. The remote radio-frequency tag also transmits information to the SAR through a single antenna that also serves to receive the SAR illuminations. Unique tag-modulation and SAR signal processing techniques, in combination, allow the detection and precise geographical location of the tag through the reduction of interfering signals from ground clutter, and allow environmental and status information from said tag to be communicated to said SAR. 4 figs.

  16. Biomedical signal acquisition with streaming wireless communication for recording evoked potentials.

    PubMed

    Thie, Johnson; Klistorner, Alexander; Graham, Stuart L

    2012-10-01

    Commercial electrophysiology systems for recording evoked potentials always connect patients to the acquisition unit via long wires. Wires guarantee timely transfer of signals for synchronization with the stimuli, but they are susceptible to electromagnetic and electrostatic interferences. Though wireless solutions are readily available (e.g. Bluetooth), they introduce high delay variability that will distort the evoked potential traces. We developed a complete wireless acquisition system with a fixed delay. The system supports up to 4 bipolar channels; each is amplified by 20,000× and digitized to 24 bits. The system incorporates the "driven-right-leg" circuit to lower the common noise. Data are continuously streamed using radio-frequency transmission operating at 915 MHz and then tagged with the stimulus SYNC signal at the receiver. The delay, noise level and transmission error rate were measured. Flash visual evoked potentials were recorded monocularly from both eyes of six adults with normal vision. The signals were acquired via wireless and wired transmissions simultaneously. The recording was repeated on some participants within 2 weeks. The delay was constant at 20 ms. The system noise was white and Gaussian (2 microvolts RMS). The transmission error rate was about one per million packets. The VEPs recorded with wireless transmission were consistent with those with wired transmission. The VEP amplitudes and shapes showed good intra-session and inter-session reproducibility and were consistent across eyes. The wireless acquisition system can reliably record visual evoked potentials. It has a constant delay of 20 ms and very low error rate.

  17. Unique portable signal acquisition/processing station

    SciTech Connect

    Garron, R.D.; Azevedo, S.G.

    1983-05-16

    At Lawrence Livermore National Laboratory, there are experimental applications requiring digital signal acquisition as well as data reduction and analysis. A prototype Signal Acquisition/Processing Station (SAPS) has been constructed and is currently undergoing tests. The system employs an LSI-11/23 computer with Data Translation analog-to-digital hardware. SAPS is housed in a roll-around cart which has been designed to withstand most subtle EMI/RFI environments. A user-friendly menu allows a user to access powerful data acquisition packages with a minimum of training. The software architecture of SAPS involves two operating systems, each being transparent to the user. Since this is a general purpose workstation with several units being utilized, an emphasis on low cost, reliability, and maintenance was stressed during conception and design. The system is targeted for mid-range frequency data acquisition; between a data logger and a transient digitizer.

  18. Analysis of biomedical signals by the lempel-Ziv complexity: the effect of finite data size.

    PubMed

    Hu, Jing; Gao, Jianbo; Principe, Jose C

    2006-12-01

    The Lempel-Ziv (LZ) complexity and its variants are popular metrics for characterizing biological signals. Proper interpretation of such analyses, however, has not been thoroughly addressed. In this letter, we study the effect of finite data size. We derive analytic expressions for the LZ complexity for regular and random sequences, and employ them to develop a normalization scheme. To gain further understanding, we compare the LZ complexity with the correlation entropy from chaos theory in the context of epileptic seizure detection from EEG data, and discuss advantages of the normalized LZ complexity over the correlation entropy.
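
    For reference, the quantity discussed above can be counted with the usual Kaspar-Schuster procedure, and a common finite-size normalization divides by the random-binary-sequence limit n/log2(n); the sketch below shows that standard form, not the specific normalization scheme derived in the letter.

```python
# Lempel-Ziv (1976) complexity of a binarized signal, counted with the common
# Kaspar-Schuster procedure, plus the standard n/log2(n) normalization.
import numpy as np

def lz76(sequence):
    """sequence: list of 0/1 symbols, length >= 2."""
    n = len(sequence)
    c, l, i, k, k_max = 1, 1, 0, 1, 1
    while True:
        if sequence[i + k - 1] == sequence[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k_max, k)
            i += 1
            if i == l:
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def normalized_lz(x):
    bits = (np.asarray(x) > np.median(x)).astype(int).tolist()   # binarize about the median
    n = len(bits)
    return lz76(bits) * np.log2(n) / n
```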

  19. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using (Field Programmable Gate Arrays) FPGAs.
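
    As a concrete software picture of the analysis/synthesis structure described above, the sketch below builds a two-channel quadrature-mirror pair with SciPy, decimates each branch by two, and recombines the subbands after interpolation. The filter length and prototype are arbitrary choices made for illustration; the paper's FPGA design and any perfect-reconstruction filter constraints are not reproduced here (a generic QMF pair only cancels aliasing, it is not perfect reconstruction).

```python
# Two-channel analysis/synthesis filterbank sketch: filter + decimate by 2,
# then upsample + filter + recombine. Illustrative only, not a PR design.
import numpy as np
from scipy.signal import firwin, lfilter

h0 = firwin(numtaps=32, cutoff=0.5)          # half-band lowpass prototype (Nyquist units)
h1 = h0 * (-1.0) ** np.arange(len(h0))       # quadrature-mirror highpass H1(z) = H0(-z)

def analysis(x):
    low = lfilter(h0, 1.0, x)[::2]           # lowpass branch, decimated by 2
    high = lfilter(h1, 1.0, x)[::2]          # highpass branch, decimated by 2
    return low, high

def synthesis(low, high):
    up0 = np.zeros(2 * len(low)); up0[::2] = low      # zero-stuff to the full rate
    up1 = np.zeros(2 * len(high)); up1[::2] = high
    # F0 = H0, F1 = -H1 cancels the aliasing terms of the QMF pair
    return 2 * (lfilter(h0, 1.0, up0) - lfilter(h1, 1.0, up1))
```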

  20. A complete data and power telemetry system utilizing BPSK and LSK signaling for biomedical implants.

    PubMed

    Sonkusale, Sameer; Luo, Zhenying

    2008-01-01

    In this paper, a prototype of a telemetry system for a battery-less biological implant is implemented, which demonstrates both wireless power delivery and duplex wireless data communication. BPSK (Binary Phase Shift Keying) modulation is used for data transmission from the external controller to the implant, and LSK (Load Shift Keying) modulation is used for the reverse data transmission from the implant to the external controller. Power is delivered wirelessly to the implant through the energy contained in the incoming BPSK data signal. The implant system contains a novel single-chip realization of a low-power BPSK demodulator architecture, which provides considerable power savings compared to prior art. The demodulator occupies an area of 0.1 mm(2) and consumes 5 mW from a 3.3 V power supply. A sensitive board-level LSK receiver for data transmitted from the implant to the external reader has been proposed. The external BPSK transmitter consists of a class-E power amplifier that serves the dual purpose of data transmission and wireless power delivery. In summary, a very low power bidirectional power and data telemetry system for biological implants based on BPSK and LSK signaling is proposed.

  1. Efficient audio signal processing for embedded systems

    NASA Astrophysics Data System (ADS)

    Chiu, Leung Kin

    As mobile platforms continue to pack on more computational power, electronics manufacturers start to differentiate their products by enhancing the audio features. However, consumers also demand smaller devices that can operate for longer times, hence imposing design constraints. In this research, we investigate two design strategies that allow us to efficiently process audio signals on embedded systems such as mobile phones and portable electronics. In the first strategy, we exploit properties of the human auditory system to process audio signals. We designed a sound enhancement algorithm to make piezoelectric loudspeakers sound "richer" and "fuller." Piezoelectric speakers have a small form factor but exhibit poor response in the low-frequency region. In the algorithm, we combine psychoacoustic bass extension and dynamic range compression to improve the perceived bass coming out of the tiny speakers. We also developed an audio energy reduction algorithm for loudspeaker power management. The perceptually transparent algorithm extends the battery life of mobile devices and prevents thermal damage in speakers. This method is similar to audio compression algorithms, which encode audio signals in such a way that the compression artifacts are not easily perceivable. Instead of reducing the storage space, however, we suppress the audio contents that are below the hearing threshold, therefore reducing the signal energy. In the second strategy, we use low-power analog circuits to process the signal before digitizing it. We designed an analog front-end for sound detection and implemented it on a field programmable analog array (FPAA). The system is an example of an analog-to-information converter. The sound classifier front-end can be used in a wide range of applications because programmable floating-gate transistors are employed to store classifier weights. Moreover, we incorporated a feature selection algorithm to simplify the analog front-end. A machine

  2. Empirical data on corpus design and usage in biomedical natural language processing

    PubMed Central

    Cohen, K. Bretonnel; Ogren, Philip V.; Fox, Lynne; Hunter, Lawrence

    2005-01-01

    This paper describes the designs of six publicly available biomedical corpora. We then present usage data for the six corpora. We show that corpora that are carefully annotated with respect to structural and linguistic characteristics and that are distributed in standard formats are more widely used than corpora that are not. These findings have implications for the design of the next generation of biomedical corpora. PMID:16779021

  3. Empirical data on corpus design and usage in biomedical natural language processing.

    PubMed

    Cohen, K Bretonnel; Fox, Lynne; Ogren, Philip V; Hunter, Lawrence

    2005-01-01

    This paper describes the design of six publicly available biomedical corpora. We then present usage data for the six corpora. We show that corpora that are carefully annotated with respect to structural and linguistic characteristics and that are distributed in standard formats are more widely used than corpora that are not. These findings have implications for the design of the next generation of biomedical corpora.

  4. Seismic signal processing on heterogeneous supercomputers

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas

    2015-04-01

    The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in recent years provide numerous computing nodes interconnected via high throughput networks, with every node containing a mix of processing elements of different architectures, such as several sequential processor cores and one or a few graphical processing units (GPUs) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers provide an opportunity for a manifold increase in application performance and are more energy-efficient; however, they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is the design of a prototype of such a library suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that
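
    The computational kernel of the ambient noise use case mentioned above is the long cross-correlation of station pairs; a frequency-domain version of that kernel, written here with NumPy only, is the usual starting point before any GPU offloading. Whitening and other preprocessing steps are omitted, and the array names are placeholders.

```python
# FFT-based cross-correlation of two ambient-noise records, truncated to the
# lag window of interest. This is the per-station-pair kernel, not the full
# heterogeneous (GPU-accelerated) pipeline discussed in the abstract.
import numpy as np

def cross_correlate(a, b, max_lag):
    n = len(a) + len(b) - 1
    nfft = 1 << (n - 1).bit_length()                         # next power of two
    A = np.fft.rfft(a, nfft)
    B = np.fft.rfft(b, nfft)
    cc = np.fft.irfft(A * np.conj(B), nfft)
    return np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))  # lags -max_lag .. +max_lag
```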

  5. A Wireless Biomedical Signal Interface System-on-Chip for Body Sensor Networks.

    PubMed

    Lei Wang; Guang-Zhong Yang; Jin Huang; Jinyong Zhang; Li Yu; Zedong Nie; Cumming, D R S

    2010-04-01

    Recent years have seen the rapid development of biosensor technology, system-on-chip design, wireless technology, and ubiquitous computing. When assembled into an autonomous body sensor network (BSN), the technologies become powerful tools in well-being monitoring, medical diagnostics, and personal connectivity. In this paper, we describe the first demonstration of a fully customized mixed-signal silicon chip that has most of the attributes required for use in a wearable or implantable BSN. Our intellectual-property blocks include a low-power analog sensor interface for temperature and pH, a data multiplexing and conversion module, a digital platform based around an 8-b microcontroller, data encoding for spread-spectrum wireless transmission, and an RF section requiring very few off-chip components. The chip has been fully evaluated and tested by connection to external sensors, and it satisfied typical system requirements.

  6. Microstructure and Texture Evolutions of Biomedical Ti-13Nb-13Zr Alloy Processed by Hydrostatic Extrusion

    NASA Astrophysics Data System (ADS)

    Ozaltin, K.; Panigrahi, A.; Chrominski, W.; Bulutsuz, A. G.; Kulczyk, M.; Zehetbauer, M. J.; Lewandowska, M.

    2017-08-01

    A biomedical β-type Ti-13Nb-13Zr (TNZ) (wt pct) ternary alloy was subjected to severe plastic deformation by means of hydrostatic extrusion (HE) at room temperature without intermediate annealing. Its effect on microstructure, mechanical properties, phase transformations, and texture was investigated by light and electron microscopy, mechanical tests (Vickers microhardness and tensile tests), and XRD analysis. Microstructural investigations by light microscopy and transmission electron microscopy showed that, after HE, significant grain refinement took place, together with high dislocation densities. Increases in strength of up to 50 pct occurred, while the elongation to fracture after HE remained at almost 9 pct. Furthermore, the Young's modulus of HE-processed samples showed slightly lower values than the initial state due to texture. Such mechanical properties combined with a lower Young's modulus are favorable for medical applications. Phase transformation analyses demonstrated that both initial and extruded samples consist of α' and β phases, but that the phase fraction of α' was slightly higher after two stages of HE.

  7. Principal Component Analysis in ECG Signal Processing

    NASA Astrophysics Data System (ADS)

    Castells, Francisco; Laguna, Pablo; Sörnmo, Leif; Bollmann, Andreas; Roig, José Millet

    2007-12-01

    This paper reviews the current status of principal component analysis (PCA) in the area of ECG signal processing. The fundamentals of PCA are briefly described, and the relationship between PCA and the Karhunen-Loève transform is explained. Aspects of PCA related to data with temporal and spatial correlations are considered, as is adaptive estimation of principal components. Several ECG applications are reviewed where PCA techniques have been successfully employed, including data compression, ST-T segment analysis for the detection of myocardial ischemia and abnormalities in ventricular repolarization, extraction of atrial fibrillatory waves for detailed characterization of atrial fibrillation, and analysis of body surface potential maps.
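
    As a minimal illustration of the basic operation reviewed above, the sketch below performs PCA on an ensemble of aligned ECG beats via the singular value decomposition and keeps the leading components, e.g. for compression or ST-T analysis. The beat matrix (one R-peak-aligned beat per row) is assumed to be prepared elsewhere.

```python
# PCA of an ensemble of aligned ECG beats via the SVD, keeping the leading
# principal components and forming a low-rank reconstruction.
import numpy as np

def pca_beats(beats, n_components=5):
    """beats: (n_beats, n_samples) array of equal-length, R-peak-aligned beats."""
    mean_beat = beats.mean(axis=0)
    centered = beats - mean_beat
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]                   # principal directions (basis waveforms)
    scores = centered @ components.T                 # per-beat coefficients
    reconstructed = mean_beat + scores @ components  # low-rank approximation
    return components, scores, reconstructed
```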

  8. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2014-03-31

    [Only OCR fragments of the report body are indexed here. Recoverable content: the PMT output was monitored on a multimeter to ensure that it remained within its linear operating regime; the AC-coupled signal was demodulated and digitized in the SDR receiver; the I and Q samples obtained by the SDR are transferred over an Ethernet cable to a PC, where the data are processed in custom LabVIEW software and used to compute range. The fragment ends with a reference to ranging results from the FDR experiments and RangeFinder simulations.]

  9. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2012-03-30

    [Only OCR fragments of the report are indexed here. Recoverable content: a progress report on the project "Advanced Digital Signal Processing" covering 1/1/2012 to 3/30/2012, submitted by William D. Jemison, Professor and Chair, PO Box 5720, Clarkson University, Potsdam, NY 13699-5720; the surviving figure residue (Figure 1, "Single delay line canceller") accompanies a discussion of applying this delay-line-canceller filtering approach underwater in turbid conditions.]

  10. Research on Superconductive Signal-Processing Devices.

    DTIC Science & Technology

    1984-11-30

    [Only OCR fragments of the report front matter are indexed here. Recoverable content: an annual report to the Air Force Office of Scientific Research on laboratory research on superconductive signal-processing devices; surviving table-of-contents entries include an Abstract, Introduction, Background, a summary of the early program, and a section on extension of time-bandwidth (title truncated).]

  11. Focal-plane architectures and signal processing

    NASA Astrophysics Data System (ADS)

    Jayadev, T. S.

    1991-11-01

    This paper discusses the relationship of focal plane architectures and signal processing functions currently used in infrared sensors. It then discusses the development of an algorithm derived from the models developed by biologists to explain the functions of insect eyes and the hardware realization of this algorithm using commercially available silicon chips. The conclusion of this study is that there are important lessons to be learned from the architecture of biological sensors, which may lead to new techniques in electro-optic sensor design.

  12. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2013-03-31

    project "Advanced Digital Signal Processing for Hybrid Lidar " covering the period of 1/1/2013-3/31/2013. 9LO\\SO^O’IH^’?’ William D. Jemison...Chaotic LIDAR for Naval Applications This document contains a Progress Summary for FY13 Q2 and a Short Work Statement for FY13 Progress Summary for...This technique has the potential to increase the unambiguous range of hybrid lidar -radar while maintaining reasonable range resolution. Proof-of

  13. Optical Computing and Nonlinear Optical Signal Processing

    NASA Astrophysics Data System (ADS)

    Peyghambarian, N.

    1987-01-01

    Employment of optical techniques in signal processing and in communication and computing systems has become a major research and development effort at many industrial, government, and university laboratories across the nation and in Europe and Japan. Implementation of optical computing concepts and the use of bistable etalons and non-linear logic devices in computing have gained a lot of support and enthusiasm from the optics community in recent years. The significance of this field and its potential importance in future technologies are evidenced by the large number of conferences, workshops, and special issues on the subject.

  14. Digital signal processing for radioactive decay studies

    SciTech Connect

    Miller, D.; Madurga, M.; Paulauskas, S. V.; Ackermann, D.; Heinz, S.; Hessberger, F. P.; Hofmann, S.; Grzywacz, R.; Miernik, K.; Rykaczewski, K.; Tan, H.

    2011-11-30

    The use of digital acquisition systems has been instrumental in the investigation of proton and alpha emitting nuclei. Recent developments extend the sensitivity and breadth of the applications. The digital signal processing capabilities, used predominantly by UT/ORNL for decay studies, include digitizers with decreased dead time, increased sampling rates, and new innovative firmware. Digital techniques and these improvements are furthermore applicable to a range of detector systems. Improvements in experimental sensitivity for measurements of alpha and beta-delayed neutron emitters, as well as the next generation of superheavy experiments, are discussed.

  15. Three-dimensional image signals: processing methods

    NASA Astrophysics Data System (ADS)

    Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru

    2010-11-01

    Over the years extensive studies have been carried out to apply coherent optics methods to real-time processing, communications, and image transmission. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. The recent progress in data-processing networks and communication systems has considerably increased the capacity of information exchange. We describe the results of a literature survey of processing methods for three-dimensional image signals. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. Images captured with an advanced 3D digital camera can be displayed on the screen of a 3D digital viewer with or without special glasses. This requires considerable processing power and memory to create and render the complex mix of colors, textures, and virtual lighting and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry, we can capture "digital holograms." These are holograms that can be stored on a computer and transmitted over conventional networks. We present some research methods to process "digital holograms" for Internet transmission, together with results.
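
    The phase-shift interferometry step mentioned above is most often done with the textbook four-frame formula: four interferograms recorded with reference-phase shifts of 0, pi/2, pi and 3pi/2 yield the object-wave phase directly. The sketch below is that standard formula, not necessarily the exact variant covered in this survey.

```python
# Standard four-step phase-shifting interferometry: recover the wrapped
# object-wave phase from four intensity frames with pi/2 reference shifts.
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """i1..i4: 2-D intensity frames of identical shape (hypothetical inputs)."""
    return np.arctan2(i4 - i2, i1 - i3)     # wrapped phase in (-pi, pi]
```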

  16. Inertial processing of vestibulo-ocular signals

    NASA Technical Reports Server (NTRS)

    Hess, B. J.; Angelaki, D. E.

    1999-01-01

    New evidence for a central resolution of gravito-inertial signals has been recently obtained by analyzing the properties of the vestibulo-ocular reflex (VOR) in response to combined lateral translations and roll tilts of the head. It is found that the VOR generates robust compensatory horizontal eye movements independent of whether or not the interaural translatory acceleration component is canceled out by a gravitational acceleration component due to simultaneous roll-tilt. This response property of the VOR depends on functional semicircular canals, suggesting that the brain uses both otolith and semicircular canal signals to estimate head motion relative to inertial space. Vestibular information about dynamic head attitude relative to gravity is the basis for computing head (and body) angular velocity relative to inertial space. Available evidence suggests that the inertial vestibular system controls both head attitude and velocity with respect to a gravity-centered reference frame. The basic computational principles underlying the inertial processing of otolith and semicircular canal afferent signals are outlined.

  18. Biomedical ground lead system

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design and verification tests for the biomedical ground lead system of Apollo biomedical monitors are presented. Major efforts were made to provide a low impedance path to ground, reduce noise and artifact of ECG signals, and limit the current flowing in the ground electrode of the system.

  19. Pre-earthquake signals: Underlying physical processes

    NASA Astrophysics Data System (ADS)

    Freund, Friedemann

    2011-06-01

    Prior to large earthquakes the Earth sends out transient signals, sometimes strong, more often subtle and fleeting. These signals may consist of local magnetic field variations, electromagnetic emissions over a wide range of frequencies, and a variety of atmospheric and ionospheric phenomena. Great uncertainty exists as to the nature of the processes that could produce such signals, both inside the Earth's crust and at the surface. The absence of a comprehensive physical mechanism has led to a patchwork of explanations, which are not internally consistent. The recognition that most crustal rocks contain dormant electronic charge carriers in the form of peroxy defects, O3Si/OO\SiO3, holds the key to a deeper understanding of these pre-earthquake signals from a solid state physics perspective. When rocks are stressed, peroxy links break, releasing electronic charge carriers, h·, known as positive holes. The positive holes are highly mobile and can flow out of the stressed subvolume. The situation is similar to that in a battery. The h· outflow is possible when the battery circuit closes. The h· outflow constitutes an electric current, which generates magnetic field variations and low-frequency EM emissions. When the positive holes arrive at the Earth's surface, they lead to ionization of air at the ground-air interface. Under certain conditions corona discharges occur, which cause RF emission. The upward expansion of ionized air may be the reason for perturbations in the ionosphere. Recombination of h· charge carriers at the surface leads to a spectroscopically distinct, non-thermal IR emission.

  20. Advances in biomedical engineering and biotechnology during 2013-2014.

    PubMed

    Liu, Feng; Wang, Ying; Burkhart, Timothy A; González Penedo, Manuel Francisco; Ma, Shaodong

    2014-01-01

    The 3rd International Conference on Biomedical Engineering and Biotechnology (iCBEB 2014), held in Beijing from the 25th to the 28th of September 2014, is an annual conference that intends to provide an opportunity for researchers and practitioners around the world to present the most recent advances and future challenges in the fields of biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, amongst others. The papers published in this issue are selected from this conference, which witnesses the advances in biomedical engineering and biotechnology during 2013-2014.

  1. Computational problems and signal processing in SETI

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. K.; Stauduhar, Richard

    1991-01-01

    The Search for Extraterrestrial Intelligence (SETI), currently being planned at NASA, will require that an enormous amount of data (on the order of 10 exp 11 distinct signal paths for a typical observation) be analyzed in real time by special-purpose hardware. Even though the SETI system design is not based on maximum entropy and Bayesian methods (partly due to the real-time processing constraint), it is expected that enough data will be saved to be able to apply these and other methods off line where computational complexity is not an overriding issue. Interesting computational problems that relate directly to the system design for processing such an enormous amount of data have emerged. Some of these problems are discussed, along with the current status on their solution.

  3. Signal processing for fiber optic gyroscope (FOG)

    NASA Astrophysics Data System (ADS)

    Tanaka, Ryuichi; Kurokawa, Akihiro; Sato, Yoshiyuki; Magome, Tsutomu; Hayakawa, Yoshiaki; Nakatani, Ichiro; Kawaguchi, Junichiro

    1994-11-01

    A fiber-optic gyroscope (FOG) is expected to be the next-generation gyroscope for guidance and control because of its various advantages. We have been developing the FOG Inertial Navigation and Guidance (ING) system for the M-V satellite launching rocket of the Institute of Space and Astronautical Science (ISAS) since 1990. The FOG-ING consists of an Inertial Measurement Unit (IMU) and a Central Processing Unit Assembly. At present, the proto-flight model FOG-IMU is under active development. A flight test of the FOG-ING was performed on February 20, 1993, aboard the M-3SII-7 satellite launching rocket at the ISAS test facilities in Uchinoura, Japan. This paper presents the signal processing technologies of our FOG that are used in the above FOG-ING.

  4. Signal processing and analyzing works of art

    NASA Astrophysics Data System (ADS)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
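
    A bare-bones version of the spectral measurement described above is sketched below: within a small patch of the canvas x-ray, the weave shows up as a peak in the 2-D spectrum, and the peak frequency (in cycles per centimetre, given the scan resolution) is the local thread density. The patch size, windowing and resolution are illustrative assumptions; the authors' short-space Fourier analysis and weave-matching correlation measure are more refined than this.

```python
# Estimate the local canvas thread density of an x-ray patch from the dominant
# peak of its windowed 2-D spectrum (illustrative sketch only).
import numpy as np

def thread_density(patch, pixels_per_cm):
    """patch: 2-D array of x-ray intensities; pixels_per_cm: scan resolution."""
    patch = patch - patch.mean()
    window = np.hanning(patch.shape[0])[:, None] * np.hanning(patch.shape[1])[None, :]
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(patch * window)))
    fy = np.fft.fftshift(np.fft.fftfreq(patch.shape[0], d=1.0 / pixels_per_cm))
    fx = np.fft.fftshift(np.fft.fftfreq(patch.shape[1], d=1.0 / pixels_per_cm))
    spectrum[patch.shape[0] // 2, patch.shape[1] // 2] = 0.0   # suppress the DC term
    iy, ix = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    return np.hypot(fy[iy], fx[ix])                            # dominant weave frequency, threads/cm
```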

  5. Regulation of cell behavior and tissue patterning by bioelectrical signals: challenges and opportunities for biomedical engineering.

    PubMed

    Levin, Michael; Stevenson, Claire G

    2012-01-01

    Achieving control over cell behavior and pattern formation requires molecular-level understanding of regulatory mechanisms. Alongside transcriptional networks and biochemical gradients, there functions an important system of cellular communication and control: transmembrane voltage gradients (V(mem)). Bioelectrical signals encoded in spatiotemporal changes of V(mem) control cell proliferation, migration, and differentiation. Moreover, endogenous bioelectrical gradients serve as instructive cues mediating anatomical polarity and other organ-level aspects of morphogenesis. In the past decade, significant advances in molecular physiology have enabled the development of new genetic and biophysical tools for the investigation and functional manipulation of bioelectric cues. Recent data implicate V(mem) as a crucial epigenetic regulator of patterning events in embryogenesis, regeneration, and cancer. We review new conceptual and methodological developments in this fascinating field. Bioelectricity offers a novel way of quantitatively understanding regulation of growth and form in vivo, and it reveals tractable, powerful control points that will enable truly transformative applications in bioengineering, regenerative medicine, and synthetic biology.

  6. Analysis of biomedical time signals for characterization of cutaneous diabetic micro-angiopathy

    NASA Astrophysics Data System (ADS)

    Kraitl, Jens; Ewald, Hartmut

    2007-02-01

    Photo-plethysmography (PPG) is frequently used in research on the microcirculation of blood. It is a non-invasive procedure and takes minimal time to carry out. Usually PPG time series are analyzed by conventional linear methods, mainly Fourier analysis. These methods may not be optimal for investigating nonlinear effects of the heart and circulation system, such as vasomotion, autoregulation, thermoregulation, breathing, heartbeat and vessel dynamics. Wavelet analysis of the PPG time series is a specific, sensitive nonlinear method for the in vivo identification of heart and circulation patterns and human health status. This nonlinear analysis of PPG signals provides additional information which cannot be detected using conventional approaches. The wavelet analysis has been used to study healthy subjects and to characterize the health status of patients with a functional cutaneous microangiopathy which was associated with diabetic neuropathy. The non-invasive in vivo method is based on the radiation of monochromatic light through an area of skin on the finger. A Photometrical Measurement Device (PMD) has been developed. The PMD is suitable for non-invasive continuous online monitoring of one or more biologic constituent values and blood circulation patterns.
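
    As an illustration of the wavelet analysis described above, the sketch below computes a Morlet continuous wavelet transform of a synthetic PPG-like signal with PyWavelets; the test signal, wavelet choice and frequency bands are assumptions for demonstration, not the authors' clinical processing chain.

```python
# Continuous wavelet transform of a PPG-like signal (minimal sketch).
# The synthetic signal, Morlet wavelet and scale range are illustrative.
import numpy as np
import pywt

fs = 100.0                                     # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
ppg = (np.sin(2 * np.pi * 1.2 * t)             # ~72 bpm cardiac component
       + 0.3 * np.sin(2 * np.pi * 0.25 * t)    # respiratory modulation
       + 0.1 * np.sin(2 * np.pi * 0.1 * t)     # slow vasomotion-like drift
       + 0.05 * np.random.randn(t.size))

# scales chosen so the pseudo-frequencies span roughly 0.05-2 Hz
freqs_target = np.linspace(0.05, 2.0, 80)
scales = pywt.central_frequency('morl') * fs / freqs_target
coeffs, freqs = pywt.cwt(ppg, scales, 'morl', sampling_period=1 / fs)

# |coeffs|^2 is a time-frequency energy map; band averages give simple markers
cardiac = np.abs(coeffs[(freqs > 0.8) & (freqs < 1.8)]).mean()
vasomotion = np.abs(coeffs[(freqs > 0.05) & (freqs < 0.15)]).mean()
print("cardiac-band magnitude:", round(cardiac, 3),
      " vasomotion-band magnitude:", round(vasomotion, 3))
```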

  7. GLAST Burst Monitor Signal Processing System

    SciTech Connect

    Bhat, P. Narayana; Briggs, Michael; Connaughton, Valerie; Paciesas, William; Preece, Robert; Diehl, Roland; Greiner, Jochen; Kienlin, Andreas von; Lichti, Giselher; Steinle, Helmut; Fishman, Gerald; Kouveliotou, Chryssa; Meegan, Charles; Wilson-Hodge, Colleen; Kippen, R. Marc; Persyn, Steven

    2007-07-12

    The onboard Data Processing Unit (DPU), designed and built by Southwest Research Institute, performs the high-speed data acquisition for GBM. The analog signals from each of the 14 detectors are digitized by a high-speed multichannel analog data acquisition architecture. The streaming digital values resulting from a periodic (period of 104.2 ns) sampling of the analog signal by the individual ADCs are fed to a Field-Programmable Gate Array (FPGA). Real-time Digital Signal Processing (DSP) algorithms within the FPGA implement functions like filtering, thresholding, time delay and pulse height measurement. The spectral data with a 12-bit resolution are formatted according to the commandable look-up-table (LUT) and then sent to the High-Speed Science-Data Bus (HSSDB, speed=1.5 MB/s) to be telemetered to ground. The DSP offers a novel feature of a commandable and constant event deadtime. The ADC non-linearities have been calibrated so that the spectral data can be corrected during analysis. The best temporal resolution is 2 μs for the pre-burst and post-trigger time-tagged events (TTE) data. The time resolution of the binned data types is commandable from 64 ms to 1.024 s for the CTIME data (8 channel spectral resolution) and 1.024 to 32.768 s for the CSPEC data (128 channel spectral resolution). The pulse pile-up effects have been studied by Monte Carlo simulations. For a typical GRB, the possible shift in the Epeak value at high count rates (~100 kHz) is ~1% while the change in the single power-law index could be up to 5%.

  8. GLAST Burst Monitor Signal Processing System

    NASA Astrophysics Data System (ADS)

    Bhat, P. Narayana; Briggs, Michael; Connaughton, Valerie; Diehl, Roland; Fishman, Gerald; Greiner, Jochen; Kippen, R. Marc; von Kienlin, Andreas; Kouveliotou, Chryssa; Lichti, Giselher; Meegan, Charles; Paciesas, William; Persyn, Steven; Preece, Robert; Steinle, Helmut; Wilson-Hodge, Colleen

    2007-07-01

    The onboard Data Processing Unit (DPU), designed and built by Southwest Research Institute, performs the high-speed data acquisition for GBM. The analog signals from each of the 14 detectors are digitized by a high-speed multichannel analog data acquisition architecture. The streaming digital values resulting from a periodic (period of 104.2 ns) sampling of the analog signal by the individual ADCs are fed to a Field-Programmable Gate Array (FPGA). Real-time Digital Signal Processing (DSP) algorithms within the FPGA implement functions like filtering, thresholding, time delay and pulse height measurement. The spectral data with a 12-bit resolution are formatted according to the commandable look-up-table (LUT) and then sent to the High-Speed Science-Data Bus (HSSDB, speed=1.5 MB/s) to be telemetered to ground. The DSP offers a novel feature of a commandable and constant event deadtime. The ADC non-linearities have been calibrated so that the spectral data can be corrected during analysis. The best temporal resolution is 2 μs for the pre-burst and post-trigger time-tagged events (TTE) data. The time resolution of the binned data types is commandable from 64 ms to 1.024 s for the CTIME data (8 channel spectral resolution) and 1.024 to 32.768 s for the CSPEC data (128 channel spectral resolution). The pulse pile-up effects have been studied by Monte Carlo simulations. For a typical GRB, the possible shift in the Epeak value at high count rates (~100 kHz) is ~1% while the change in the single power-law index could be up to 5%.

  9. Digital Signal Processing in the GRETINA Spectrometer

    NASA Astrophysics Data System (ADS)

    Cromaz, Mario

    2015-10-01

    Developments in the segmentation of large-volume HPGe crystals have enabled the development of high-efficiency gamma-ray spectrometers that can track the path of gamma rays scattering through the detector volume. This technology has been successfully implemented in the GRETINA spectrometer, whose high efficiency and ability to perform precise event-by-event Doppler correction have made it an important tool in nuclear spectroscopy. Tracking has required the spectrometer to employ a fully digital signal processing chain. Each of the system's 1120 channels is digitized by a 100 MHz, 14-bit flash ADC. Filters that provide timing and high-resolution energies are implemented in local FPGAs acting on the ADC data streams, while interaction-point locations and tracks, derived from the trace on each detector segment, are calculated in real time on a computing cluster. In this presentation we will give a description of GRETINA's digital signal processing system, the impact of design decisions on system performance, and a discussion of possible future directions as we look towards soon developing larger spectrometers such as GRETA with full 4 π solid angle coverage. This work was supported by the Office of Science in the Department of Energy under grant DE-AC02-05CH11231.
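
    The FPGA energy filters mentioned above are typically trapezoidal (moving-window deconvolution) filters; the sketch below shows a generic software version of such a filter applied to a synthetic preamplifier pulse. The rise/flat-top lengths, decay constant and test pulse are illustrative assumptions, not GRETINA's actual filter settings.

```python
# Generic trapezoidal energy filter of the kind commonly run in FPGAs for
# HPGe pulse height analysis (a sketch; parameters are assumptions).
import numpy as np

def trapezoidal_filter(trace, rise=100, flat=40, tau=5000.0):
    """Moving-window deconvolution (Jordanov-style) trapezoid."""
    k, m = rise, flat
    d = np.zeros_like(trace, dtype=float)
    for n in range(len(trace)):
        d[n] = (trace[n]
                - (trace[n - k] if n >= k else 0.0)
                - (trace[n - k - m] if n >= k + m else 0.0)
                + (trace[n - 2 * k - m] if n >= 2 * k + m else 0.0))
    # pole-zero correction for the exponential preamp decay, then integrate
    p = np.cumsum(d)
    s = np.cumsum(p + tau * d)
    return s / (tau * k)          # flat-top height ~ pulse amplitude

# synthetic preamplifier pulse: step of amplitude 1000 decaying with tau samples
n = np.arange(4000)
trace = np.where(n >= 500, 1000.0 * np.exp(-(n - 500) / 5000.0), 0.0)
trace += np.random.normal(0, 2.0, n.size)
energy = trapezoidal_filter(trace).max()
print("estimated pulse height:", round(energy, 1))
```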

  10. Biomedical ultrasonoscope

    NASA Technical Reports Server (NTRS)

    Lee, R. D. (Inventor)

    1979-01-01

    The invention combines "C" mode scan electronics with a portable, battery-powered biomedical ultrasonoscope having "A" and "M" mode scan electronics. The latter include a clock generator for generating clock pulses; a cathode ray tube having X, Y and Z axis inputs; a sweep generator connected between the clock generator and the X axis input of the cathode ray tube for generating a cathode ray sweep signal synchronized by the clock pulses; and a receiver adapted to be connected to the Z axis input of the cathode ray tube. The "C" mode scan electronics comprise a plurality of transducer elements arranged in a row and adapted to be positioned on the skin of the patient's body for converting a pulsed electrical signal to a pulsed ultrasonic signal, radiating the ultrasonic signal into the patient's body, picking up the echoes reflected from interfaces in the patient's body and converting the echoes to electrical signals; a plurality of transmitters, each coupled to a respective transducer, for transmitting a pulsed electrical signal thereto and for transmitting the converted electrical echo signals directly to the receiver; a sequencer connected between the clock generator and the plurality of transmitters and responsive to the clock pulses for firing the transmitters in cyclic order; and a staircase voltage generator connected between the clock generator and the Y axis input of the cathode ray tube for generating a staircase voltage having steps synchronized by the clock pulses.

  11. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembly and testing of a parallel processing system are described; the system will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described, followed by the installation procedure, research topics, and initial program development.

  12. Measurement of OH, NO, O and N atoms in helium plasma jet for ROS/RNS controlled biomedical processes

    NASA Astrophysics Data System (ADS)

    Yonemori, Seiya; Kamakura, Taku; Ono, Ryo

    2014-10-01

    Atmospheric-pressure plasmas are of emerging interest for new plasma applications such as cancer treatment, cell activation and sterilization. In these biomedical processes, reactive oxygen/nitrogen species (ROS/RNS) are thought to play a significant role: the active species impose oxidative stress and induce biomedical reactions. In this study, we measured OH, NO, O and N atoms using laser-induced fluorescence (LIF) and found that the voltage polarity affects particular ROS. When a negative high voltage was applied to the plasma jet, the O atom density tripled compared to the case of positive applied voltage, reaching about 3 × 10^15 cm^-3 at maximum. In contrast, the OH and NO densities did not depend on the polarity of the applied voltage and were measured to be on the order of 10^13 and 10^14 cm^-3 at maximum, respectively. ICCD imaging showed that the negative high voltage enhanced secondary emission during plasma bullet propagation, which can affect the effective production of particular ROS. Since the ROS/RNS dose can serve as a quantitative criterion for controlling plasma biomedical applications, these measurement results can be applied to in vivo and in vitro plasma biomedical experiments. This study is supported by the Grant-in-Aid for Science Research by the Ministry of Education, Culture, Sport, Science and Technology.

  13. Active voltammetric microsensors with neural signal processing.

    SciTech Connect

    Vogt, M. C.

    1998-12-11

    Many industrial and environmental processes, including bioremediation, would benefit from the feedback and control information provided by a local multi-analyte chemical sensor. For most processes, such a sensor would need to be rugged enough to be placed in situ for long-term remote monitoring, and inexpensive enough to be fielded in useful numbers. The multi-analyte capability is difficult to obtain from common passive sensors, but can be provided by an active device that produces a spectrum-type response. Such new active gas microsensor technology has been developed at Argonne National Laboratory. The technology couples an electrocatalytic ceramic-metallic (cermet) microsensor with a voltammetric measurement technique and advanced neural signal processing. It has been demonstrated to be flexible, rugged, and very economical to produce and deploy. Both narrow-interest detectors and wide-spectrum instruments have been developed around this technology. Much of this technology's strength lies in the active measurement technique employed. The technique involves applying voltammetry to a miniature electrocatalytic cell to produce unique chemical "signatures" from the analytes. These signatures are processed with neural pattern recognition algorithms to identify and quantify the components in the analyte. The neural signal processing allows for innovative sampling and analysis strategies to be employed with the microsensor. In most situations, the whole response signature from the voltammogram can be used to identify, classify, and quantify an analyte, without dissecting it into component parts. This allows an instrument to be calibrated once for a specific gas or mixture of gases by simple exposure to a multi-component standard rather than by a series of individual gases. The sampled unknown analytes can vary in composition or in concentration; the calibration, sensing, and processing methods of these active voltammetric microsensors can detect, recognize, and
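
    A hedged sketch of the general idea of neural pattern recognition on whole voltammogram signatures, using a small scikit-learn network on synthetic data; the toy signatures, analyte classes and network size are assumptions and do not reproduce the Argonne sensor's calibration.

```python
# Neural pattern recognition on whole voltammogram "signatures" (sketch).
# The synthetic signatures and gas classes are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
v = np.linspace(-1, 1, 200)                      # applied voltage sweep

def signature(peak_v, peak_w, amp):
    """Toy voltammogram: a redox peak plus baseline and noise."""
    return (amp * np.exp(-((v - peak_v) / peak_w) ** 2)
            + 0.1 * v + 0.02 * rng.standard_normal(v.size))

# three hypothetical analytes, distinguished by peak position/width/height
classes = {0: (-0.4, 0.10, 1.0), 1: (0.1, 0.15, 0.8), 2: (0.5, 0.08, 1.2)}
X = np.array([signature(*classes[c]) for c in range(3) for _ in range(100)])
y = np.repeat([0, 1, 2], 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=1)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```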

  14. Bio-swarm-pipeline: a light-weight, extensible batch processing system for efficient biomedical data processing.

    PubMed

    Cheng, Xi; Pizarro, Ricardo; Tong, Yunxia; Zoltick, Brad; Luo, Qian; Weinberger, Daniel R; Mattay, Venkata S

    2009-01-01

    A streamlined scientific workflow system that can track the details of the data processing history is critical for the efficient handling of fundamental routines used in scientific research. In the scientific workflow research community, the information that describes the details of data processing history is referred to as "provenance" which plays an important role in most of the existing workflow management systems. Despite its importance, however, provenance modeling and management is still a relatively new area in the scientific workflow research community. The proper scope, representation, granularity and implementation of a provenance model can vary from domain to domain and pose a number of challenges for an efficient pipeline design. This paper provides a case study on structured provenance modeling and management problems in the neuroimaging domain by introducing the Bio-Swarm-Pipeline. This new model, which is evaluated in the paper through real world scenarios, systematically addresses the provenance scope, representation, granularity, and implementation issues related to the neuroimaging domain. Although this model stems from applications in neuroimaging, the system can potentially be adapted to a wide range of bio-medical application scenarios.

  15. Bio-Swarm-Pipeline: A Light-Weight, Extensible Batch Processing System for Efficient Biomedical Data Processing

    PubMed Central

    Cheng, Xi; Pizarro, Ricardo; Tong, Yunxia; Zoltick, Brad; Luo, Qian; Weinberger, Daniel R.; Mattay, Venkata S.

    2009-01-01

    A streamlined scientific workflow system that can track the details of the data processing history is critical for the efficient handling of fundamental routines used in scientific research. In the scientific workflow research community, the information that describes the details of data processing history is referred to as “provenance” which plays an important role in most of the existing workflow management systems. Despite its importance, however, provenance modeling and management is still a relatively new area in the scientific workflow research community. The proper scope, representation, granularity and implementation of a provenance model can vary from domain to domain and pose a number of challenges for an efficient pipeline design. This paper provides a case study on structured provenance modeling and management problems in the neuroimaging domain by introducing the Bio-Swarm-Pipeline. This new model, which is evaluated in the paper through real world scenarios, systematically addresses the provenance scope, representation, granularity, and implementation issues related to the neuroimaging domain. Although this model stems from applications in neuroimaging, the system can potentially be adapted to a wide range of bio-medical application scenarios. PMID:19847314

  16. Regulation of STAT signalling by proteolytic processing.

    PubMed

    Hendry, Lisa; John, Susan

    2004-12-01

    Interaction of cytokines with their cognate receptors leads to the activation of latent transcription factors, the signal transducer and activator of transcription (STAT) proteins. Numerous studies have identified the critical roles played by STAT proteins in regulating cell proliferation, differentiation and survival. Consequently, the activity of STAT proteins is negatively regulated by a variety of different mechanisms, which include alternative splicing, covalent modifications, protein-protein interactions with negative regulatory proteins and proteolytic processing by proteases. Cleavage of STAT proteins by proteases results in the generation of C-terminally truncated proteins, called STATgamma, which lack the transactivation domain and behave as functional dominant-negative proteins. Currently, STATgamma isoforms have been identified for Stat3, Stat5a, Stat5b and Stat6 in different cellular contexts and biological processes. Evidence is mounting for the role of as yet unidentified serine proteases in the proteolytic processing of STAT proteins, although at least one cysteine protease, calpain, is also known to cleave these STATs in platelets and mast cells. Recently, studies of acute myeloid leukaemia and cutaneous T cell lymphoma patients have revealed important roles for the aberrant expression of Stat3gamma and Stat5gamma proteins in the pathology of these diseases. Together, these findings indicate that proteolytic processing is an important mechanism in the regulation of STAT protein biological activity and provides a fertile area for future studies.

  17. The interaction of domain knowledge and linguistic structure in natural language processing: interpreting hypernymic propositions in biomedical text.

    PubMed

    Rindflesch, Thomas C; Fiszman, Marcelo

    2003-12-01

    Interpretation of semantic propositions in free-text documents such as MEDLINE citations would provide valuable support for biomedical applications, and several approaches to semantic interpretation are being pursued in the biomedical informatics community. In this paper, we describe a methodology for interpreting linguistic structures that encode hypernymic propositions, in which a more specific concept is in a taxonomic relationship with a more general concept. In order to effectively process these constructions, we exploit underspecified syntactic analysis and structured domain knowledge from the Unified Medical Language System (UMLS). After introducing the syntactic processing on which our system depends, we focus on the UMLS knowledge that supports interpretation of hypernymic propositions. We first use semantic groups from the Semantic Network to ensure that the two concepts involved are compatible; hierarchical information in the Metathesaurus then determines which concept is more general and which more specific. A preliminary evaluation of a sample based on the semantic group Chemicals and Drugs provides 83% precision. An error analysis was conducted and potential solutions to the problems encountered are presented. The research discussed here serves as a paradigm for investigating the interaction between domain knowledge and linguistic structure in natural language processing, and could also make a contribution to research on automatic processing of discourse structure. Additional implications of the system we present include its integration in advanced semantic interpretation processors for biomedical text and its use for information extraction in specific domains. The approach has the potential to support a range of applications, including information retrieval and ontology engineering.
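
    The two knowledge-based checks described above can be sketched as follows, with tiny in-memory tables standing in for the UMLS Semantic Network and Metathesaurus; the CUIs and helper structures are illustrative stand-ins, not the real UMLS resources or the authors' system.

```python
# Sketch of the two checks for a candidate hypernymic proposition:
# (1) both concepts share a semantic group, (2) hierarchy decides which
# concept is more general.  The tables below are toy stand-ins for UMLS.
SEMANTIC_GROUP = {
    "C0004057": "Chemicals & Drugs",   # aspirin (illustrative CUIs)
    "C0003211": "Chemicals & Drugs",   # anti-inflammatory agents
    "C0015967": "Disorders",           # fever
}
# parent CUI -> set of descendant CUIs (toy fragment of a hierarchy)
DESCENDANTS = {"C0003211": {"C0004057"}}

def interpret_hypernymic(cui_a, cui_b):
    """Return (specific, general) if the pair forms a valid hypernymic relation."""
    if SEMANTIC_GROUP.get(cui_a) != SEMANTIC_GROUP.get(cui_b):
        return None                          # incompatible semantic groups
    if cui_a in DESCENDANTS.get(cui_b, set()):
        return (cui_a, cui_b)                # b is the more general concept
    if cui_b in DESCENDANTS.get(cui_a, set()):
        return (cui_b, cui_a)
    return None

# "aspirin ... an anti-inflammatory agent" -> (aspirin, anti-inflammatory agents)
print(interpret_hypernymic("C0004057", "C0003211"))
```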

  18. Ultrasonic signal processing and tissue characterization

    NASA Astrophysics Data System (ADS)

    Mu, Zhiping

    Ultrasound imaging has become one of the most widely used diagnostic tools in medicine. While it has advantages, compared with other modalities, in terms of safety, low-cost, accessibility, portability and capability of real-time imaging, it has limitations. One of the major disadvantages of ultrasound imaging is the relatively low image quality, especially the low signal-to-noise ratio (SNR) and the low spatial resolution. Part of this dissertation is dedicated to the development of digital ultrasound signal and image processing methods to improve ultrasound image quality. Conventional B-mode ultrasound systems display the demodulated signals, i.e., the envelopes, in the images. In this dissertation, I introduce the envelope matched quadrature filtering (EMQF) technique, which is a novel demodulation technique generating optimal performance in envelope detection. In ultrasonography, the echo signals are the results of the convolution of the pulses and the medium responses, and the finite pulse length is a major source of the degradation of the image resolution. Based on the more appropriate complex-valued medium response assumption rather than the real-valued assumption used by many researchers, a nonparametric iterative deconvolution method, the Least Squares method with Point Count regularization (LSPC), is proposed. This method was tested using simulated and experimental data, and has produced excellent results showing significant improvements in resolution. During the past two decades, ultrasound tissue characterization (UTC) has emerged as an active research field and shown potentials of applications in a variety of clinical areas. Particularly interesting to me is a group of methods characterizing the scatterer spatial distribution. For resolvable regular structures, a deconvolution based method is proposed to estimate parameters characterizing such structures, including mean scatterer spacing, and has demonstrated superior performance when compared to
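
    For context, the conventional envelope detection that the dissertation's EMQF technique improves upon can be sketched with the analytic signal (Hilbert transform); the simulated A-line below is an illustrative assumption, and the code does not implement EMQF or LSPC themselves.

```python
# Baseline envelope detection for an RF ultrasound A-line via the analytic
# signal.  The simulated pulse, scatterers and noise level are assumptions.
import numpy as np
from scipy.signal import hilbert

fs, f0 = 40e6, 5e6                           # sampling rate, transducer centre frequency
t = np.arange(0, 20e-6, 1 / fs)
pulse = np.exp(-((t - 1e-6) / 0.2e-6) ** 2) * np.cos(2 * np.pi * f0 * t)

rf = np.zeros_like(t)                        # sparse scatterers convolved with the pulse
for delay, amp in [(3e-6, 1.0), (7e-6, 0.4), (7.4e-6, 0.6)]:
    idx = int(delay * fs)
    rf[idx:] += amp * pulse[: t.size - idx]
rf += 0.01 * np.random.randn(t.size)

envelope = np.abs(hilbert(rf))               # B-mode style detected envelope
print("strongest echo at %.2f us, envelope peak %.2f"
      % (t[np.argmax(envelope)] * 1e6, envelope.max()))
```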

  19. System for monitoring non-coincident, nonstationary process signals

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.

    2005-01-04

    An improved system for monitoring non-coincident, non-stationary process signals is described. The mean, variance, and length of a reference signal are defined by an automated system, followed by the identification of the leading and falling edges of a monitored signal and the length of the monitored signal. The monitored signal is compared to the reference signal, and the monitored signal is resampled in accordance with the reference signal. The reference signal is then correlated with the resampled monitored signal such that the reference signal and the resampled monitored signal are coincident in time with each other. The resampled monitored signal is then compared to the reference signal to determine whether the resampled monitored signal is within a set of predesignated operating conditions.
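
    A generic sketch of the monitoring steps the patent describes (edge detection, resampling to the reference length, correlation alignment, and comparison); the edge threshold, tolerance band and test signals are illustrative assumptions rather than the patented implementation.

```python
# Generic sketch: detect the monitored signal's edges, resample it to the
# reference length, align by cross-correlation, compare to tolerance bands.
import numpy as np
from scipy.signal import resample, correlate

def compare_to_reference(monitored, reference, n_sigma=3.0, edge_frac=0.1):
    """Align a monitored signal with a reference and test it against tolerance bands."""
    ref_std = reference.std()
    # leading/falling edges: first and last crossings of a fraction of the peak
    thresh = edge_frac * np.max(np.abs(monitored))
    active = np.where(np.abs(monitored) > thresh)[0]
    segment = monitored[active[0]: active[-1] + 1]
    # resample the monitored segment to the reference length
    resampled = resample(segment, reference.size)
    # correlate to find any residual lag, then shift into coincidence
    lag = np.argmax(correlate(reference, resampled, mode="full")) - (reference.size - 1)
    aligned = np.roll(resampled, lag)
    # "in spec" if the aligned signal stays within n_sigma of the reference
    return bool(np.all(np.abs(aligned - reference) < n_sigma * ref_std))

t = np.linspace(0, 1, 500)
reference = np.sin(2 * np.pi * 5 * t)
# monitored: a delayed copy acquired at half the sampling rate, in a quiet record
monitored = np.concatenate([np.zeros(200), reference[::2], np.zeros(100)])
print("within operating conditions:", compare_to_reference(monitored, reference))
```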

  20. A Novel Approach for Adaptive Signal Processing

    NASA Technical Reports Server (NTRS)

    Chen, Ya-Chin; Juang, Jer-Nan

    1998-01-01

    Adaptive linear predictors have been used extensively in practice in a wide variety of forms. In the main, their theoretical development is based upon the assumption of stationarity of the signals involved, particularly with respect to the second-order statistics. On this basis, the well-known normal equations can be formulated. If high-order statistical stationarity is assumed, then the equivalent normal equations involve high-order signal moments. In either case, the cross moments (second or higher order) are needed, which renders the adaptive prediction procedure non-blind. A novel procedure for blind adaptive prediction has been proposed, and considerable implementation work has been carried out in our contributions over the past year. The approach is based upon a suitable interpretation of blind equalization methods that satisfy the constant modulus property and departs significantly from standard prediction methods. These blind adaptive algorithms are derived by formulating Lagrange equivalents from mechanisms of constrained optimization. In this report, other new update algorithms are derived from the fundamental concepts of advanced system identification to carry out the proposed blind adaptive prediction. The results of the work can be extended to a number of control-related problems, such as disturbance identification. The basic principles are outlined in this report and differences from other existing methods are discussed. The applications implemented are in speech processing, such as coding and synthesis. Simulations are included to verify the novel modelling method.
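
    The blind-equalization building block referred to above is the constant modulus property; the sketch below shows the classic constant modulus algorithm (CMA) stochastic-gradient update on a synthetic QPSK signal, not the report's specific Lagrange-constrained predictor. The channel, filter length and step size are assumptions.

```python
# Classic constant modulus algorithm (CMA) update (sketch).  The QPSK test
# signal, mild ISI channel and step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
symbols = (rng.choice([1, -1], 4000) + 1j * rng.choice([1, -1], 4000)) / np.sqrt(2)
channel = np.array([1.0, 0.4 + 0.2j, 0.1])
received = np.convolve(symbols, channel, mode="full")[: symbols.size]
received += 0.01 * (rng.standard_normal(symbols.size)
                    + 1j * rng.standard_normal(symbols.size))

n_taps, mu, r2 = 11, 1e-3, 1.0                       # CM radius for unit-power QPSK
w = np.zeros(n_taps, dtype=complex)
w[n_taps // 2] = 1.0                                 # centre-spike initialisation
for n in range(n_taps, received.size):
    x = received[n - n_taps: n][::-1]                # regressor (most recent first)
    y = np.vdot(w, x)                                # filter output (w^H x)
    e = y * (np.abs(y) ** 2 - r2)                    # CMA error term
    w -= mu * e.conjugate() * x                      # stochastic-gradient update
print("residual CM dispersion:", np.abs(np.abs(y) ** 2 - r2))
```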

  1. Nonlinear biochemical signal processing via noise propagation

    NASA Astrophysics Data System (ADS)

    Kim, Kyung Hyuk; Qian, Hong; Sauro, Herbert M.

    2013-10-01

    Single-cell studies often show significant phenotypic variability due to the stochastic nature of intra-cellular biochemical reactions. When the numbers of molecules, e.g., transcription factors and regulatory enzymes, are in low abundance, fluctuations in biochemical activities become significant and such "noise" can propagate through regulatory cascades in terms of biochemical reaction networks. Here we develop an intuitive, yet fully quantitative method for analyzing how noise affects cellular phenotypes based on identifying a system's nonlinearities and noise propagations. We observe that such noise can simultaneously enhance sensitivities in one behavioral region while reducing sensitivities in another. Employing this novel phenomenon we designed three biochemical signal processing modules: (a) A gene regulatory network that acts as a concentration detector with both enhanced amplitude and sensitivity. (b) A non-cooperative positive feedback system, with a graded dose-response in the deterministic case, that serves as a bistable switch due to noise-induced ultra-sensitivity. (c) A noise-induced linear amplifier for gene regulation that requires no feedback. The methods developed in the present work allow one to understand and engineer nonlinear biochemical signal processors based on fluctuation-induced phenotypes.
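
    The core mechanism, noise propagating through a nonlinearity and reshaping the effective dose-response, can be illustrated in a few lines; the Hill function and log-normal input noise below are assumptions for illustration, not the paper's specific gene circuits.

```python
# Noise through a nonlinearity: the average response <f(x)> under input
# fluctuations differs from the deterministic dose-response f(<x>), sharpening
# sensitivity in one region and flattening it in another (illustrative model).
import numpy as np

def hill(x, K=10.0, n=4):
    return x**n / (K**n + x**n)

rng = np.random.default_rng(0)
cv = 0.5                                  # coefficient of variation of the noisy input
sigma_log = np.sqrt(np.log(1 + cv**2))    # matching log-normal parameter

for x_mean in (5.0, 10.0, 20.0):
    # log-normal fluctuations with the requested mean and CV
    samples = x_mean * rng.lognormal(-sigma_log**2 / 2, sigma_log, 100_000)
    print(f"x = {x_mean:5.1f}   deterministic f = {hill(x_mean):.3f}   "
          f"noisy <f> = {hill(samples).mean():.3f}")
```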

  2. Single-step colloidal processing of stable aqueous dispersions of ferroelectric nanoparticles for biomedical imaging

    NASA Astrophysics Data System (ADS)

    Zribi, Olena; Garbovskiy, Yuriy; Glushchenko, Anatoliy

    2014-12-01

    The biomedical applications of ferroelectric nanoparticles rely on the production of stable aqueous colloids. We report an implementation of the high-energy ball milling method to produce and disperse ultrafine BaTiO3 nanoparticles in an aqueous medium in a single step. This technique is low-cost and environmentally friendly, and it offers the capability to control nanoparticle size and functionality through the milling parameters. As a result, ultrafine nanoparticles with sizes as small as 6 nm can be produced. These nanoparticles maintain ferroelectricity and can be used as second-harmonic-generating nanoprobes for biomedical imaging. The technique can be generalized to produce aqueous nanoparticle colloids of other imaging materials.

  3. Signal processing of aircraft flyover noise

    NASA Technical Reports Server (NTRS)

    Kelly, Jeffrey J.

    1991-01-01

    A detailed analysis of signal processing concerns for measuring aircraft flyover noise is presented. Development of a de-Dopplerization scheme for both corrected time history and spectral data is discussed, along with an analysis of motion effects on measured spectra. A computer code was written to implement the de-Dopplerization scheme. Input to the code is the aircraft position data and the pressure time histories. To facilitate ensemble averaging, a uniform level flyover is considered, but the code can accept more general flight profiles. The effects of spectral smearing and its removal are discussed. Using data acquired from an XV-15 tilt rotor flyover test, comparisons are made between the measured and corrected spectra. Frequency shifts are accurately accounted for by the method. It is shown that correcting for spherical spreading, Doppler amplitude, and frequency can give some idea about source directivity. The analysis indicated that smearing increases with frequency and is more severe on approach than on recession.
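
    A schematic version of such a de-Dopplerization correction is sketched below for a constant-speed level flyover: each reception time is mapped to its emission time and the pressure record is resampled onto a uniform emission-time grid, with an optional spherical-spreading correction. The flight parameters and stand-in pressure signal are assumptions, not the report's code.

```python
# Schematic de-Dopplerization for a uniform level flyover (illustrative).
# Each reception time t_r is mapped to its emission time t_e by solving
#     t_r = t_e + |x_ac(t_e) - x_mic| / c
# and the pressure record is resampled onto a uniform emission-time grid.
import numpy as np

c, fs = 340.0, 10_000.0                    # speed of sound (m/s), sample rate (Hz)
V, h = 80.0, 150.0                         # assumed aircraft speed (m/s) and altitude (m)
t_r = np.arange(0.0, 10.0, 1 / fs)         # reception times at the microphone
x_mic = np.array([0.0, 0.0, 0.0])

def aircraft_pos(t):                       # level flyover passing overhead at t = 5 s
    return np.stack([V * (t - 5.0), np.zeros_like(t), np.full_like(t, h)], axis=-1)

# fixed-point iteration for the emission time of every reception sample
t_e = t_r.copy()
for _ in range(20):
    r = np.linalg.norm(aircraft_pos(t_e) - x_mic, axis=-1)
    t_e = t_r - r / c

pressure = np.sin(2 * np.pi * 100.0 * t_r)            # stand-in for the measured record
t_uniform = np.linspace(t_e[0], t_e[-1], t_e.size)    # uniform emission-time grid
p_dedoppler = np.interp(t_uniform, t_e, pressure)     # de-Dopplerized time history
r_uniform = np.interp(t_uniform, t_e, r)
p_corrected = p_dedoppler * r_uniform / r_uniform.min()   # undo spherical spreading
```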

  4. Ultrasound perfusion signal processing for tumor detection

    NASA Astrophysics Data System (ADS)

    Kim, MinWoo; Abbey, Craig K.; Insana, Michael F.

    2016-04-01

    Enhanced blood perfusion in a tissue mass is an indication of neo-vascularity and a sign of a potential malignancy. Ultrasonic pulsed-Doppler imaging is a preferred modality for noninvasive monitoring of blood flow. However, the weak blood echoes and disorganized slow flow make it difficult to detect perfusion using standard methods without the expense and risk of contrast enhancement. Our research measures the efficiency of conventional power-Doppler (PD) methods at discriminating flow states by comparing measurement performance to that of an ideal discriminator. ROC analysis applied to the experimental results shows that power Doppler methods are just 30-50% efficient at perfusion flows less than 1 ml/min, suggesting an opportunity to improve perfusion assessment through signal processing. A new perfusion estimator is proposed by extending the statistical discriminator approach. We show that 2-D perfusion color imaging may be enhanced using this approach.
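
    The conventional power-Doppler estimate and the ROC comparison described above can be sketched as follows; the synthetic slow-time ensembles, mean-removal wall filter and flow amplitudes are illustrative assumptions, not the authors' experimental data or ideal-discriminator analysis.

```python
# Conventional power-Doppler estimation on a slow-time ensemble plus an ROC
# score for separating two flow states (synthetic data; parameters assumed).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def power_doppler(ensemble):
    """Mean power after a simple high-pass (mean-removal) wall filter."""
    wall_filtered = ensemble - ensemble.mean(axis=-1, keepdims=True)
    return np.mean(np.abs(wall_filtered) ** 2, axis=-1)

def slow_time_ensemble(n_pixels, flow_amp, n_pulses=16):
    """Clutter + weak fluctuating blood echo + electronic noise."""
    clutter = 10.0 * np.ones((n_pixels, n_pulses))
    blood = flow_amp * rng.standard_normal((n_pixels, n_pulses))
    noise = 0.5 * rng.standard_normal((n_pixels, n_pulses))
    return clutter + blood + noise

pd_no_flow = power_doppler(slow_time_ensemble(2000, flow_amp=0.0))
pd_flow = power_doppler(slow_time_ensemble(2000, flow_amp=0.3))

labels = np.r_[np.zeros(2000), np.ones(2000)]
scores = np.r_[pd_no_flow, pd_flow]
print("ROC area under curve:", round(roc_auc_score(labels, scores), 3))
```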

  5. Modulation and detection of the THz range signals using the highest harmonics of the fundamental frequency of the superlattice-based generator for biomedical applications

    NASA Astrophysics Data System (ADS)

    Makarov, Vladimir V.; Maksimenko, Vladimir A.; Ponomarenko, Vladimir I.; Khramova, Marina V.; Pavlov, Alexey N.; Prokhorov, Mikhail D.; Karavaev, Anatoly S.

    2016-04-01

    A data transmission method using the highest harmonics of a semiconductor superlattice-based microwave generator has been proposed for biomedical applications. A semiconductor superlattice operated in the charge-domain-formation regime is characterized by a power spectrum rich in high harmonics. Numerical modeling of the modulation and detection of THz-range signals using the highest harmonics of the fundamental frequency of the superlattice-based generator was carried out. We have shown the effectiveness of the proposed method and discussed possible applications.

  6. Detection and Processing Techniques of FECG Signal for Fetal Monitoring

    PubMed Central

    2009-01-01

    The fetal electrocardiogram (FECG) signal contains potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis lies in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advanced methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms for FECG signal detection and analysis, to provide efficient and effective ways of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper also discusses some of the hardware implementations using electrical signals for monitoring the fetal heart rate. This paper opens up a path for researchers, physicians, and end users to gain an excellent understanding of the FECG signal and its analysis procedures for fetal heart rate monitoring systems. PMID:19495912

  7. Gas phase laser synthesis and processing of calcium phosphate nanoparticles for biomedical applications

    NASA Astrophysics Data System (ADS)

    Bapat, Parimal V.

    Biochemical processes make pervasive use of calcium and phosphate ions. Calcium phosphate salts that are naturally nontoxic and bioactive have been used for several medical applications in the form of coatings and micropowders. Nanoparticle-based calcium phosphates have been shown to be internalized by living cells and to be effective in DNA transfection, drug delivery, and transport of fluorophores for imaging of intracellular processes. They are also expected to interact strongly with cell-adhesive proteins and are therefore promising elements in approaches to mimic the complex environment of the extracellular matrix of bone. Harnessing this biomedical potential requires the ability to control the numerous characteristics of nanophase calcium phosphates that affect biological response, including nanoparticle chemical composition, crystal phase, crystallinity, crystallographic orientation of exposed faces, size, shape, surface area, number concentration, and degree of aggregation. This dissertation focuses on the use of laser-induced gas-phase synthesis for the creation of calcium phosphate nanoparticles, and corresponding nanoparticle-based substrates that could offer new opportunities for guiding biological responses through well-controlled biochemical and topological cues. Gas-phase synthesis of nanoparticles has several characteristics that could enhance control over particle morphology, crystallinity, and surface area, compared to liquid-phase techniques. Synthesis from gas-phase precursors can be carried out at high temperatures and in high-purity inert or reactive gas backgrounds, enabling good control of chemistry, crystal structure, and purity. Moreover, the particle mean free path and number concentration can be controlled independently. This allows regulation of interparticle collision rates, which can be adjusted to limit aggregation. High-temperature synthesis of well-separated particles is therefore possible. In this work high power lasers are employed to

  8. Anodization of magnesium for biomedical applications - Processing, characterization, degradation and cytocompatibility.

    PubMed

    Cipriano, Aaron F; Lin, Jiajia; Miller, Christopher; Lin, Alan; Cortez Alcaraz, Mayra C; Soria, Pedro; Liu, Huinan

    2017-08-14

    using KOH electrolyte for modifying the surface of Mg-based materials, and the resulting surface, degradation, and biological properties for biomedical applications. This study reported critical considerations that are important for the repeatability of the anodization process, the homogeneity of surface microstructure and composition, and the in vitro evaluation of the degradation and biological properties of surface-treated Mg samples. The details of electrode preparation, anodization setup, annealing, and sample handling before and after surface treatment (e.g. re-embedding) reported in this article are valuable for studying a variety of electrochemical processes for surface treatment of Mg-based metals, because of enhanced reproducibility. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  9. Tunable signal processing through modular control of transcription factor translocation.

    PubMed

    Hao, Nan; Budnik, Bogdan A; Gunawardena, Jeremy; O'Shea, Erin K

    2013-01-25

    Signaling pathways can induce different dynamics of transcription factor (TF) activation. We explored how TFs process signaling inputs to generate diverse dynamic responses. The budding yeast general stress-responsive TF Msn2 acted as a tunable signal processor that could track, filter, or integrate signals in an input-dependent manner. This tunable signal processing appears to originate from dual regulation of both nuclear import and export by phosphorylation, as mutants with one form of regulation sustained only one signal-processing function. Versatile signal processing by Msn2 is crucial for generating distinct dynamic responses to different natural stresses. Our findings reveal how complex signal-processing functions are integrated into a single molecule and provide a guide for the design of TFs with "programmable" signal-processing functions.

  10. On the influence of time-series length in EMD to extract frequency content: simulations and models in biomedical signals.

    PubMed

    Fonseca-Pinto, R; Ducla-Soares, J L; Araújo, F; Aguiar, P; Andrade, A

    2009-07-01

    In this paper, fractional Gaussian noise (fGn) was used to simulate a homogeneously spreading broadband signal without any dominant frequency band, and to perform a simulation study of the influence of time-series length on the number of intrinsic mode functions (IMFs) obtained after empirical mode decomposition (EMD). In this context three models are presented. The first two models depend on the Hurst exponent H, and the last one is designed for small data lengths, in which the number of IMFs after EMD is obtained from the regularity of the signal and depends on an index measure of regularity. These models contribute to a better understanding of the EMD decomposition through the evaluation of its performance on fGn signals. Since an analytical formulation to evaluate the EMD performance is not available, using well-known signals allows for a better insight into the process. The last model presented is meant for application to real data. Its purpose is to predict, as a function of the signal regularity, the time-series length that should be used when one wants to divide the spectrum into a pre-determined number of modes, corresponding to different frequency bands, using EMD. This is the case, e.g., in heart rate and blood pressure signals, used to assess sympathovagal balance in the central nervous system.
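
    A quick empirical check of the IMF-count behaviour discussed above can be run with the PyEMD package, using white Gaussian noise (fGn with H = 0.5) so no dedicated fGn generator is needed; the record lengths are arbitrary choices, and the roughly log2(N) IMF count for broadband noise is the well-known dyadic filter-bank result.

```python
# Count IMFs from EMD of white Gaussian noise (fGn with H = 0.5) for several
# record lengths.  Requires the PyEMD package ("EMD-signal" on PyPI).
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(42)
emd = EMD()
for n in (256, 1024, 4096):
    fgn = rng.standard_normal(n)          # fGn with Hurst exponent H = 0.5
    imfs = emd.emd(fgn)
    print(f"N = {n:5d}:  {imfs.shape[0]} IMFs  (log2(N) = {np.log2(n):.0f})")
```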

  11. Advances in white-light optical signal processing

    NASA Technical Reports Server (NTRS)

    Yu, F. T. S.

    1984-01-01

    A technique is described that permits signal processing operations to be carried out with a white-light source. The method performs signal processing that follows the concepts of coherent optics rather than incoherent optics. Since the white-light source contains all the color wavelengths of visible light, the technique is very suitable for color signal processing.

  12. Biomedical informatics techniques for processing and analyzing web blogs of military service members.

    PubMed

    Konovalov, Sergiy; Scotch, Matthew; Post, Lori; Brandt, Cynthia

    2010-10-05

    Web logs ("blogs") have become a popular mechanism for people to express their daily thoughts, feelings, and emotions. Many of these expressions contain health care-related themes, both physical and mental, similar to information discussed during a clinical interview or medical consultation. Thus, some of the information contained in blogs might be important for health care research, especially in mental health where stress-related conditions may be difficult and expensive to diagnose and where early recognition is often key to successful treatment. In the field of biomedical informatics, techniques such as information retrieval (IR) and natural language processing (NLP) are often used to unlock information contained in free-text notes. These methods might assist the clinical research community to better understand feelings and emotions post deployment and the burden of symptoms of stress among US military service members. In total, 90 military blog posts describing deployment situations and 60 control posts of Operation Enduring Freedom/Operation Iraqi Freedom (OEF/OIF) were collected. After "stop" word exclusion and stemming, a "bag-of-words" representation and term weighting was performed, and the most relevant words were manually selected out of the high-weight words. A pilot ontology was created using Collaborative Protégé, a knowledge management application. The word lists and the ontology were then used within General Architecture for Text Engineering (GATE), an NLP framework, to create an automated pipeline for recognition and analysis of blogs related to combat exposure. An independent expert opinion was used to create a reference standard and evaluate the results of the GATE pipeline. The 2 dimensions of combat exposure descriptors identified were: words dealing with physical exposure and the soldiers' emotional reactions to it. GATE pipeline was able to retrieve blog texts describing combat exposure with precision 0.9, recall 0.75, and F-score 0

  13. Biomedical Informatics Techniques for Processing and Analyzing Web Blogs of Military Service Members

    PubMed Central

    Konovalov, Sergiy; Post, Lori; Brandt, Cynthia

    2010-01-01

    Introduction Web logs (“blogs”) have become a popular mechanism for people to express their daily thoughts, feelings, and emotions. Many of these expressions contain health care-related themes, both physical and mental, similar to information discussed during a clinical interview or medical consultation. Thus, some of the information contained in blogs might be important for health care research, especially in mental health where stress-related conditions may be difficult and expensive to diagnose and where early recognition is often key to successful treatment. In the field of biomedical informatics, techniques such as information retrieval (IR) and natural language processing (NLP) are often used to unlock information contained in free-text notes. These methods might assist the clinical research community to better understand feelings and emotions post deployment and the burden of symptoms of stress among US military service members. Methods In total, 90 military blog posts describing deployment situations and 60 control posts of Operation Enduring Freedom/Operation Iraqi Freedom (OEF/OIF) were collected. After “stop” word exclusion and stemming, a “bag-of-words” representation and term weighting was performed, and the most relevant words were manually selected out of the high-weight words. A pilot ontology was created using Collaborative Protégé, a knowledge management application. The word lists and the ontology were then used within General Architecture for Text Engineering (GATE), an NLP framework, to create an automated pipeline for recognition and analysis of blogs related to combat exposure. An independent expert opinion was used to create a reference standard and evaluate the results of the GATE pipeline. Results The 2 dimensions of combat exposure descriptors identified were: words dealing with physical exposure and the soldiers’ emotional reactions to it. GATE pipeline was able to retrieve blog texts describing combat exposure with

  14. Materials processing towards development of rapid prototyping technology for manufacturing biomedical implants

    NASA Astrophysics Data System (ADS)

    Pekin, Senol

    2000-10-01

    Materials processing towards development of the fused deposition of materials (FDM) method for manufacturing biomedical implants has been studied experimentally. The main processing steps consisted of thermoplastic binder development in the ethylene vinyl acetate (EVA)-microcrystalline wax system, feedstock extrusion, characterization and optimization of binder degradation, and sintering of calcium-deficient hydroxyapatite. Differential scanning calorimetry (DSC) revealed that the melting index (MI) of the copolymer affects the temperature location of the solidification exotherm, whereas the effect on the temperature location of the melting endotherm was negligible. Nonisothermal measurement of the viscosity of different blends as a function of the VA content of the EVA component revealed that the microcrystalline wax is compatible with 25-14% VA-containing EVA grades. Further DSC analysis revealed that co-crystallization leads to compatible EVA-microcrystalline wax blends. A typical binder formulation developed in the present work has a viscosity of about 700 cP at 140°C, a compressive yield strength of 6 MPa and an elastic modulus of about 600 MPa, and contained 15-20% EVA and 80-85% microcrystalline wax. Various filaments with a nominal diameter of 1.8 mm were extruded by using such a binder and calcium pyrophosphate powder that had a distribution modulus of about 0.37. Measurement of the physical dimensions of the filament revealed that a fluid state can be achieved in the filaments. Simultaneous thermal analysis of the degradation characteristics of the typical binder formulations revealed that the degradation sequence is oxidation of the hydrocarbons, evaporation of the hydrocarbons, degradation of the vinyl acetate, and degradation of the ethylene chain. A rate-controlled binder removal system was developed and used in order to optimize the binder removal schedule. Sintering of gel-cast calcium hydroxyapatite was studied by means of thermal analysis, XRD, mechanical

  15. Sensor Signal Processing for Ultrasonic Sensors Using Delta-Sigma Modulated Single-Bit Digital Signal

    NASA Astrophysics Data System (ADS)

    Hirata, S.; Kurosawa, M. K.; Katagiri, T.

    Ultrasonic distance measurement is based on determining the time of flight of an ultrasonic wave. The pulse compression technique, which computes the cross-correlation function between the received signal and the reference signal, is used to improve the resolution of distance measurement. The cross-correlation method requires high-cost digital signal processing. This paper presents a cross-correlation method using a delta-sigma modulated single-bit digital signal. Sensor signal processing composed of the cross-correlation between two single-bit signals and a post moving-average filter is proposed, which reduces the cost of the digital signal processing.
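
    A sketch of the proposed idea, pulse compression by cross-correlating two single-bit signals followed by a moving-average filter, is shown below; the chirp reference, the sign quantization standing in for true delta-sigma bitstreams, the noise level and the filter length are illustrative assumptions.

```python
# Pulse compression with single-bit signals (sketch).  Sign quantization
# stands in for real delta-sigma bitstreams; parameters are assumptions.
import numpy as np

fs = 1_000_000.0                                             # 1 MHz sampling
t = np.arange(0, 1e-3, 1 / fs)
chirp = np.sin(2 * np.pi * (40e3 * t + 0.5 * 40e6 * t**2))   # 40-80 kHz reference sweep

delay_samples = 2500                                         # unknown time of flight
echo = np.zeros(4000)
echo[delay_samples: delay_samples + chirp.size] += 0.2 * chirp
echo += 0.2 * np.random.randn(echo.size)

ref_bits = np.sign(chirp)                                    # crude 1-bit representations
echo_bits = np.sign(echo)

xcorr = np.correlate(echo_bits, ref_bits, mode="full")       # 1-bit x 1-bit correlation
xcorr = np.convolve(xcorr, np.ones(8) / 8, mode="same")      # post moving-average filter
est_delay = np.argmax(xcorr) - (ref_bits.size - 1)
print("estimated time of flight: %.1f us (true %.1f us)"
      % (est_delay / fs * 1e6, delay_samples / fs * 1e6))
```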

  16. Neutron coincidence counting with digital signal processing

    NASA Astrophysics Data System (ADS)

    Bagi, Janos; Dechamp, Luc; Dransart, Pascal; Dzbikowicz, Zdzislaw; Dufour, Jean-Luc; Holzleitner, Ludwig; Huszti, Joseph; Looman, Marc; Marin Ferrer, Montserrat; Lambert, Thierry; Peerani, Paolo; Rackham, Jamie; Swinhoe, Martyn; Tobin, Steve; Weber, Anne-Laure; Wilson, Mark

    2009-09-01

    Neutron coincidence counting is a widely adopted nondestructive assay (NDA) technique used in nuclear safeguards to measure the mass of nuclear material in samples. Nowadays, most neutron-counting systems are based on the original shift-register technology, like the (ordinary or multiplicity) Shift-Register Analyser. The analogue signal from the He-3 tubes is processed by an amplifier/single channel analyser (SCA) producing a train of TTL pulses that are fed into an electronic unit that performs the time-correlation analysis. Following the suggestion of the main inspection authorities (IAEA, Euratom and the French Ministry of Industry), several research laboratories have started to study and develop prototypes of neutron-counting systems with PC-based processing. Collaboration in this field among JRC, IRSN and LANL has been established within the framework of the ESARDA-NDA working group. Joint testing campaigns have been performed in the JRC PERLA laboratory, using different equipment provided by the three partners. One area of development is the use of high-speed PCs and pulse acquisition electronics that provide a time stamp (LIST-mode acquisition) for every digital pulse. The time stamp data can be processed directly during acquisition or saved on a hard disk. The latter method has the advantage that measurement data can be analysed with different values for parameters like predelay and gate width, without repeating the acquisition. Other useful diagnostic information, such as die-away time and dead time, can also be extracted from this stored data. A second area is the development of "virtual instruments." These devices, in which the pulse-processing system can be embedded in the neutron counter itself and sends counting data to a PC, can give increased data-acquisition speeds. Either or both of these developments could give rise to the next generation of instrumentation for improved practical neutron-correlation measurements. The paper will describe the
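
    The kind of off-line LIST-mode reprocessing described above can be sketched as a software shift-register analysis over a list of time stamps; the simulated pulse train, predelay, gate width and long delay below are illustrative assumptions, not a calibrated counter model.

```python
# Software shift-register analysis on LIST-mode time stamps (sketch).
# Pulse-train statistics and gate parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)

def shift_register(stamps, predelay, gate, long_delay):
    """Return (singles, R+A, A) counts for one choice of gate parameters."""
    stamps = np.sort(stamps)
    ra = acc = 0
    for t in stamps:
        # events in the (R+A) gate opened `predelay` after each trigger
        ra += (np.searchsorted(stamps, t + predelay + gate)
               - np.searchsorted(stamps, t + predelay))
        # an identical gate opened much later samples accidentals only
        acc += (np.searchsorted(stamps, t + long_delay + gate)
                - np.searchsorted(stamps, t + long_delay))
    return stamps.size, ra, acc

# toy pulse train: random background plus correlated pairs with a 50 us die-away
background = np.cumsum(rng.exponential(1 / 2e3, 20_000))        # ~2 kc/s singles
pairs = rng.uniform(0, background[-1], 2_000)
correlated = np.concatenate([pairs, pairs + rng.exponential(50e-6, pairs.size)])
stamps = np.concatenate([background, correlated])

singles, ra, acc = shift_register(stamps, predelay=4.5e-6, gate=64e-6, long_delay=4e-3)
print(f"Singles: {singles},  Doubles (R+A minus A): {ra - acc}")
```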

  17. Coherent signal processing in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Kulkarni, Manish Dinkarrao

    1999-09-01

    Optical coherence tomography (OCT) is a novel method for non-invasive sub-surface imaging of biological tissue micro-structures. OCT achieves high spatial resolution (~15 μm in three dimensions) using a fiber-optically integrated system which is suitable for application in minimally invasive diagnostics, including endoscopy. OCT uses an optical heterodyne detection technique based on white light interferometry. Therefore extremely faint reflections (~10 fW) are routinely detected with high spatial localization. The goal of this thesis is twofold. The first is to present a theoretical model for describing image formation in OCT, and attempt to enhance the current level of understanding of this new modality. The second objective is to present signal processing methods for improving OCT image quality. We present deconvolution algorithms to obtain improved longitudinal resolution in OCT. This technique may be implemented without increasing system complexity as compared to current clinical OCT systems. Since the spectrum of the light backscattered from bio-scatterers is closely associated with ultrastructural variations in tissue, we propose a new technique for measuring spectra as a function of depth. This advance may assist OCT in differentiating various tissue types and detecting abnormalities within a tissue. In addition to depth resolved spectroscopy, Doppler processing of OCT signals can also improve OCT image contrast. We present a new technique, termed color Doppler OCT (CDOCT). It is an innovative extension of OCT for performing spatially localized optical Doppler velocimetry. Micron-resolution imaging of blood flow in sub-surface vessels in living tissue using CDOCT is demonstrated. The fundamental issues regarding the trade- off between the velocity estimation precision and image acquisition rate are presented. We also present novel algorithms for high accuracy velocity estimation. In many blood vessels velocities tend to be on the order of a few cm

  18. Pedagogical reforms of digital signal processing education

    NASA Astrophysics Data System (ADS)

    Christensen, Michael

    The future of the engineering discipline is arguably predicated heavily upon appealing to the future generation, in all its sensibilities. The greatest burden in doing so, one might rightly believe, lies on the shoulders of the educators. In examining the causal means by which the profession arrived at such a state, one finds that the technical revolution, precipitated by global war, had, as its catalyst, institutions as expansive as the government itself to satisfy the demand for engineers, who, as a result of such an existential crisis, were taught predominantly theoretical underpinnings to address a finite purpose. By contrast, the modern engineer, having expanded upon this vision and adapted to an evolving society, is increasingly placed in the proverbial role of the worker who must don many hats: not solely a scientist, yet often an artist; not a businessperson alone, but neither financially naive; not always a representative, though frequently a collaborator. Inasmuch as change then serves as the only constancy in a global climate, therefore, the educational system - if it is to mimic the demands of the industry - is left with an inherent need for perpetual revitalization to remain relevant. This work aims to serve that end. Motivated by existing research in engineering education, an epistemological challenge is molded into the framework of the electrical engineer with emphasis on digital signal processing. In particular, it is investigated whether students are better served by a learning paradigm that tolerates and, when feasible, encourages error via a medium free of traditional adjudication. Through the creation of learning modules using the Adobe Captivate environment, a wide range of fundamental knowledge in signal processing is challenged within the confines of existing undergraduate courses. It is found that such an approach not only conforms to the research agenda outlined for the engineering educator, but also reflects an often neglected reality

  19. Signal Processing and Interpretation Using Multilevel Signal Abstractions.

    DTIC Science & Technology

    1986-06-01

    The record text for this report is garbled, consisting mainly of table-of-contents and figure-list fragments: Chapter 4, "Helicopter Pitch and Power Tracking using Signal Matching" (overview; architecture of the helicopter pitch and power tracking system; partially completed pitch tracks). The surviving abstract fragment notes that, by exploiting harmonic relationships between spectra at different times, pitch and power tracking come as a natural consequence, and that the generated pitch and power tracks are

  20. Spatial acoustic signal processing for immersive communication

    NASA Astrophysics Data System (ADS)

    Atkins, Joshua

    Computing is rapidly becoming ubiquitous as users expect devices that can augment and interact naturally with the world around them. In these systems it is necessary to have an acoustic front-end that is able to capture and reproduce natural human communication. Whether the end point is a speech recognizer or another human listener, the reduction of noise, reverberation, and acoustic echoes are all necessary and complex challenges. The focus of this dissertation is to provide a general method for approaching these problems using spherical microphone and loudspeaker arrays. In this work, a theory of capturing and reproducing three-dimensional acoustic fields is introduced from a signal processing perspective. In particular, the decomposition of the spatial part of the acoustic field into an orthogonal basis of spherical harmonics provides not only a general framework for analysis, but also many processing advantages. The spatial sampling error limits the upper frequency range with which a sound field can be accurately captured or reproduced. In broadband arrays, the cost and complexity of using multiple transducers is an issue. This work provides a flexible optimization method for determining the location of array elements to minimize the spatial aliasing error. The low frequency array processing ability is also limited by the SNR, mismatch, and placement error of transducers. To address this, a robust processing method is introduced and used to design a reproduction system for rendering over arbitrary loudspeaker arrays or binaurally over headphones. In addition to the beamforming problem, the multichannel acoustic echo cancellation (MCAEC) issue is also addressed. An MCAEC must adaptively estimate and track the constantly changing loudspeaker-room-microphone response to remove the sound field presented over the loudspeakers from that captured by the microphones. In the multichannel case, the system is overdetermined and many adaptive schemes fail to converge to

  1. User's manual SIG: a general-purpose signal processing program

    SciTech Connect

    Lager, D.; Azevedo, S.

    1983-10-25

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Many of the basic operations one would perform on digitized data are contained in the core SIG package. Out of these core commands, more powerful signal processing algorithms may be built. Many different operations on time- and frequency-domain signals can be performed by SIG. They include operations on the samples of a signal, such as adding a scalar to each sample, operations on the entire signal such as digital filtering, and operations on two or more signals such as adding two signals. Signals may be simulated, such as a pulse train or a random waveform. Graphics operations display signals and spectra.
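
    For illustration, the three classes of core operations described (sample-wise operations, whole-signal operations such as digital filtering, and operations on two signals) have direct modern equivalents; the NumPy/SciPy sketch below is not SIG itself, just an analogous sequence of operations.

```python
# Modern equivalents of the core operation classes described for SIG
# (illustrative only; this is not the SIG program).
import numpy as np
from scipy.signal import butter, lfilter

fs = 1000.0
t = np.arange(0, 1, 1 / fs)

sig_a = np.where((t * 10) % 1 < 0.1, 1.0, 0.0)      # simulated pulse train
sig_b = np.random.randn(t.size)                     # simulated random waveform

shifted = sig_a + 0.5                               # operation on each sample (add a scalar)
b, a = butter(4, 50 / (fs / 2))                     # operation on the whole signal:
filtered = lfilter(b, a, sig_b)                     #   4th-order low-pass digital filter
combined = sig_a + filtered                         # operation on two signals (add them)

spectrum = np.abs(np.fft.rfft(combined)) / t.size   # frequency-domain display data
freqs = np.fft.rfftfreq(t.size, 1 / fs)
```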

  2. Optimal Hamiltonian Simulation by Quantum Signal Processing

    NASA Astrophysics Data System (ADS)

    Low, Guang Hao; Chuang, Isaac L.

    2017-01-01

    The physics of quantum mechanics is the inspiration for, and underlies, quantum computation. As such, one expects physical intuition to be highly influential in the understanding and design of many quantum algorithms, particularly simulation of physical systems. Surprisingly, this has been challenging, with current Hamiltonian simulation algorithms remaining abstract and often the result of sophisticated but unintuitive constructions. We contend that physical intuition can lead to optimal simulation methods by showing that a focus on simple single-qubit rotations elegantly furnishes an optimal algorithm for Hamiltonian simulation, a universal problem that encapsulates all the power of quantum computation. Specifically, we show that the query complexity of implementing time evolution by a d-sparse Hamiltonian Ĥ for time-interval t with error ε is O(td‖Ĥ‖_max + log(1/ε)/log log(1/ε)), which matches lower bounds in all parameters. This connection is made through a general three-step "quantum signal processing" methodology, comprised of (i) transducing eigenvalues of Ĥ into a single ancilla qubit, (ii) transforming these eigenvalues through an optimal-length sequence of single-qubit rotations, and (iii) projecting this ancilla with near unity success probability.

  3. Optimal signal processing for continuous qubit readout

    NASA Astrophysics Data System (ADS)

    Ng, Shilin; Tsang, Mankei

    2014-08-01

    The measurement of a quantum two-level system, or a qubit in modern terminology, often involves an electromagnetic field that interacts with the qubit, before the field is measured continuously and the qubit state is inferred from the noisy field measurement. During the measurement, the qubit may undergo spontaneous transitions, further obscuring the initial qubit state from the observer. Taking advantage of some well-known techniques in stochastic detection theory, here we propose a signal processing protocol that can infer the initial qubit state optimally from the measurement in the presence of noise and qubit dynamics. Assuming continuous quantum-nondemolition measurements with Gaussian or Poissonian noise and a classical Markov model for the qubit, we derive analytic solutions to the protocol in some special cases of interest using Itō calculus. Our method is applicable to multihypothesis testing for robust qubit readout and relevant to experiments on qubits in superconducting microwave circuits, trapped ions, nitrogen-vacancy centers in diamond, semiconductor quantum dots, or phosphorus donors in silicon.
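
    For readers unfamiliar with the stochastic detection machinery invoked here, a minimal illustration (a textbook special case, not the paper's actual derivation) is the continuous-time test of a known signal s_t in white Gaussian noise: for a measurement record obeying dy_t = s_t dt + dW_t under hypothesis H1 and dy_t = dW_t under H0, Itō calculus gives the likelihood ratio

        \Lambda_T \;=\; \exp\!\left( \int_0^T s_t \, dy_t \;-\; \tfrac{1}{2} \int_0^T s_t^2 \, dt \right),

    and the optimal decision compares Λ_T to a threshold; when s_t itself follows hidden Markov dynamics, as for a spontaneously decaying qubit, the same idea generalizes to a filtered, posterior-weighted likelihood ratio.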

  4. Microwave photonic delay line signal processing.

    PubMed

    Diehl, John F; Singley, Joseph M; Sunderman, Christopher E; Urick, Vincent J

    2015-11-01

    This paper provides a path for the design of state-of-the-art fiber-optic delay lines for signal processing. The theoretical forms for various radio-frequency system performance metrics are derived for four modulation types: X- and Z-cut Mach-Zehnder modulators, a phase modulator with asymmetric Mach-Zehnder interferometer, and a polarization modulator with control waveplate and polarizing beam splitter. Each modulation type is considered to cover the current and future needs for ideal system designs. System gain, compression point, and third-order output intercept point are derived from the transfer matrices for each modulation type. A discussion of optical amplifier placement and fiber-effect mitigation is offered. The paper concludes by detailing two high-performance delay lines, built for unique applications, that exhibit performance levels an order of magnitude better than commercial delay lines. This paper should serve as a guide to maximizing the performance of future systems and offer a look into current and future research being done to further improve photonics technologies.

  5. Optical signal processing using photonic reservoir computing

    NASA Astrophysics Data System (ADS)

    Salehi, Mohammad Reza; Dehyadegari, Louiza

    2014-10-01

    As a new approach to recognition and classification problems, photonic reservoir computing has such advantages as parallel information processing, power efficiency, and high speed. In this paper, a photonic structure has been proposed for reservoir computing which is investigated using a simple, yet nontrivial, noisy time series prediction task. This study includes the application of a suitable topology with self-feedbacks in a network of SOAs, which lends the system a strong memory, and the adjustment of adequate parameters, resulting in perfect recognition accuracy (100%) for noise-free time series, which is a 3% improvement over previous results. For the classification of noisy time series, the rate of accuracy showed a 4% increase and amounted to 96%. Furthermore, an analytical approach was suggested to solve the rate equations, which led to a substantial decrease in the simulation time, an important parameter in the classification of large signals such as those in speech recognition, and better results were obtained compared with previous works.

  6. Integrated optical signal processing with magnetostatic waves

    NASA Technical Reports Server (NTRS)

    Fisher, A. D.; Lee, J. N.

    1984-01-01

    Magneto-optical devices based on Bragg diffraction of light by magnetostatic waves (MSW's) offer the potential of large time-bandwidth optical signal processing at microwave frequencies of 1 to 20 GHz and higher. A thin-film integrated-optical configuration, with the interacting MSW and guided-optical wave both propagating in a common ferrite layer, is necessary to avoid shape-factor demagnetization effects. The underlying theory of the MSW-optical interaction is outlined, including the development of expressions for optical diffraction efficiency as a function of MSW power and other relevant parameters. Bragg diffraction of guided-optical waves by transversely-propagating magnetostatic waves and collinear TE/TM mode conversion induced by MSW's have been demonstrated in yttrium iron garnet (YIG) thin films. Diffraction levels as large as 4% (7 mm interaction length) and a modulation dynamic range of approx 30 dB have been observed. Advantages of these MSW-based devices over the analogous acousto-optical devices include: much greater operating frequencies, tunability of the MSW dispersion relation by varying either the RF frequency or the applied bias magnetic field, simple broad-band MSW transducer structures (e.g., a single stripline), and the potential for very high diffraction efficiencies.

  7. Processing and Characterization of Functionally Graded Hydroxyapatite Coatings for Biomedical Implants

    NASA Astrophysics Data System (ADS)

    Bai, Xiao

    Hydroxyapatite [Ca10(PO4)6(OH)2, HA] has been widely applied as a coating on various biomedical bone/dental implants to improve biocompatibility and bioactivity. It has been observed that primary reasons leading to implantation failure of commercial HA coated implants processed by plasma spraying are the poor mechanical properties of coatings and infections accompanied by implantation. It has also been reported that an ideal coating should be able to stimulate new bone growth at the initial stage of implantation and stay stable both mechanically and chemically thereafter. This research has investigated a functionally graded hydroxyapatite (FGHA) coating that is capable of improving the stability of implants, facilitating recovery, and preventing infections after implantation. A series of FGHA coatings with incorporated Ag 0 ˜ 13.53 wt. % has been deposited onto Ti substrate using ion beam assisted deposition (IBAD) with in-situ heat treatment. The compositional, microstructural, mechanical, and biological properties of coatings have been analyzed via various tests. The relationship among processing parameters, coating properties and biological behaviors has been established and the processing parameters for processing FGHA coatings with/without incorporated Ag have been optimized. Microstructure observations of coating cross sections via transmission electron microscopy (TEM) and scanning transmission electron microscopy (STEM) for set temperature coatings deposited at 450°C ˜ 750°C reveal that in-situ substrate temperature is the primary factor controlling the crystallinity of the coatings. The microstructure observation of cross sections via TEM/STEM for both FGHA coatings with/without incorporated Ag has shown that coatings are dense and have a gradually decreased crystallinity from the substrate/coating interface to the top surface. In particular, the interface has an atomically intermixed structure; the region near the interface has a columnar grain structure whereas

  8. Processing Motion Signals in Complex Environments

    NASA Technical Reports Server (NTRS)

    Verghese, Preeti

    2000-01-01

    Motion information is critical for human locomotion and scene segmentation. Currently we have excellent neurophysiological models that are able to predict human detection and discrimination of local signals. Local motion signals are insufficient by themselves to guide human locomotion and to provide information about depth, object boundaries and surface structure. My research is aimed at understanding the mechanisms underlying the combination of motion signals across space and time. A target moving on an extended trajectory amidst noise dots in Brownian motion is much more detectable than the sum of signals generated by independent motion energy units responding to the trajectory segments. This result suggests that facilitation occurs between motion units tuned to similar directions, lying along the trajectory path. We investigated whether the interaction between local motion units along the motion direction is mediated by contrast. One possibility is that contrast-driven signals from motion units early in the trajectory sequence are added to signals in subsequent units. If this were the case, then units later in the sequence would have a larger signal than those earlier in the sequence. To test this possibility, we compared contrast discrimination thresholds for the first and third patches of a triplet of sequentially presented Gabor patches, aligned along the motion direction. According to this simple additive model, contrast increment thresholds for the third patch should be higher than thresholds for the first patch.The lack of a measurable effect on contrast thresholds for these various manipulations suggests that the pooling of signals along a trajectory is not mediated by contrast-driven signals. Instead, these results are consistent with models that propose that the facilitation of trajectory signals is achieved by a second-level network that chooses the strongest local motion signals and combines them if they occur in a spatio-temporal sequence consistent

  10. Spatiotemporal signal processing for blind separation of multichannel signals

    NASA Astrophysics Data System (ADS)

    Tugnait, Jitendra K.

    1996-06-01

    This paper is concerned with the problem of blind separation of independent signals (sources) from their linear convolutive mixtures. The problem consists of recovering the sources up to shaping filters from the observations of multiple-input multiple-output (MIMO) system output. The various signals are assumed to be linear but not necessarily i.i.d. (independent and identically distributed). The problem is cast into the framework of spatio-temporal equalization and estimation of the matrix impulse response function of MIMO channels (systems). An iterative, Godard cost based approach is considered for spatio-temporal equalization and MIMO impulse response estimation. Stationary points of the cost function are investigated and it is shown that all stable local minima correspond to desirable minima when doubly infinite equalizers are used. Analysis is also provided for the case when finite-length equalizers exist. The various input sequences are extracted and cancelled one-by-one. The matrix impulse response is then obtained by cross-correlating the extracted inputs with the observed outputs. Identifiability conditions are analyzed. Computer simulation examples are presented to illustrate the proposed approach.
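
    As a concrete, much-simplified illustration of a Godard-cost update (a single-channel constant-modulus equalizer rather than the paper's full MIMO spatio-temporal scheme; the step size, tap count, and toy channel are assumptions), consider the following sketch.

    import numpy as np

    def cma_equalize(x, num_taps=11, mu=1e-3, r2=1.0):
        # Single-channel constant-modulus (Godard, p = 2) equalizer.
        # r2 = E[|s|^4] / E[|s|^2] for the source constellation (1.0 for QPSK).
        w = np.zeros(num_taps, dtype=complex)
        w[num_taps // 2] = 1.0                     # center-spike initialization
        y = np.zeros(len(x), dtype=complex)
        for n in range(num_taps, len(x)):
            xn = x[n - num_taps:n][::-1]           # regressor, most recent first
            y[n] = np.vdot(w, xn)                  # y = w^H x
            e = y[n] * (np.abs(y[n]) ** 2 - r2)    # Godard error term
            w -= mu * xn * np.conj(e)              # stochastic-gradient step
        return y, w

    # Toy test: QPSK source through a short complex channel plus noise.
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, (2, 5000)) * 2 - 1
    s = (bits[0] + 1j * bits[1]) / np.sqrt(2)
    h = np.array([1.0, 0.4 + 0.3j, 0.2])
    x = np.convolve(s, h)[:len(s)] + 0.01 * (rng.standard_normal(len(s))
                                             + 1j * rng.standard_normal(len(s)))
    y, w = cma_equalize(x)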

  11. Intelligent, onboard signal processing payload concept, addendum :

    SciTech Connect

    Shriver, P. M.; Harikumar, J.; Briles, S. C.; Gokhale, M.

    2003-01-01

    This document addresses two issues in the original paper entitled 'An Intelligent, Onboard Signal Processing Payload Concept' submitted to the SPIE AeroSense 2003 Conference. Since the original paper submission, and prior to the scheduled presentation, a correction has been made to one of the figures in the original paper and an update has been performed to the software simulation of the payload concept. The figure, referred to as Figure 8. Simulation Results in the original paper, contains an error in the voltage versus the capacity drained chart. This chart does not correctly display the voltage changes experienced by the battery module due to the varying discharge rates. This error is an artifact of the procedure used to graph the data. Additionally, the original version of the simulation related the algorithm execution rate to the lightning event rate regardless of the number of events in the ring buffer. This feature was mentioned in section 5. Simulation Results of the original paper. A correction was also made to the size of the ring buffer. Incorrect information was provided to the authors that placed the number of possible events at 18,310. Corrected information has since been obtained that specifies the ring buffer can typically hold only 1,000 events. This has a significant impact on the APM process and the number of events lost when the size of the ring buffer is exceeded. Also, upon further analysis, it was realized that the simulation contained an error in the recording of the number of events in the ring buffer. The faster algorithms, LMS and ML, should have been able to process all events during the simulation time interval, but the initial results did not reflect this characteristic. The updated version of the simulation appropriately handles the number of algorithm executions and recording of events in the ring buffer as well as uses the correct size for the ring buffer. These improvements to the simulation and subsequent results are discussed in

  12. Issues in collecting, processing and storing human tissues and associated information to support biomedical research.

    PubMed

    Grizzle, William E; Bell, Walter C; Sexton, Katherine C

    2010-01-01

    The availability of human tissues to support biomedical research is critical to advance translational research focused on identifying and characterizing approaches to individualized (personalized) medical care. Providing such tissues relies on three acceptable models - a tissue banking model, a prospective collection model and a combination of these two models. An unacceptable model is the "catch as catch can" model in which tissues are collected, processed and stored without goals or a plan or without standard operating procedures, i.e., portions of tissues are collected as available and processed and stored when time permits. In the tissue banking model, aliquots of tissues are collected according to SOPs. Usually specific sizes and types of tissues are collected and processed (e.g., 0.1 gm of breast cancer frozen in OCT). Using the banking model, tissues may be collected that may not be used and/or do not meet specific needs of investigators; however, at the time of an investigator request, tissues are readily available as is clinical information including clinical outcomes. In the model of prospective collection, tissues are collected based upon investigator requests including specific requirements of investigators. For example, the investigator may request that two 0.15 gm matching aliquots of breast cancer be minced while fresh, put in RPMI media with and without fetal calf serum, cooled to 4°C and shipped to the investigator on wet ice. Thus, the tissues collected prospectively meet investigator needs, all collected specimens are utilized and storage of specimens is minimized; however, investigators must wait until specimens are collected, and if needed, for clinical outcome. The operation of any tissue repository requires well trained and dedicated personnel. A quality assurance program is required which provides quality control information on the diagnosis of a specimen that is matched specifically to the specimen provided to an investigator instead of an

  13. Biologically-based signal processing system applied to noise removal for signal extraction

    DOEpatents

    Fu, Chi Yung; Petrich, Loren I.

    2004-07-13

    The method and system described herein use a biologically-based signal processing system for noise removal for signal extraction. A wavelet transform may be used in conjunction with a neural network to imitate a biological system. The neural network may be trained using ideal data derived from physical principles or noiseless signals to determine how to remove noise from the signal.

  14. Advanced algorithms and architectures for signal processing III

    SciTech Connect

    Luk, F.T.

    1988-01-01

    This book covers proceedings on advanced algorithms and architectures for signal processing III. Topics covered include: Fast QR-based array-processing algorithm; Model for the analysis of fault-tolerant signal processing architecture; Parallel VLSI direction finding algorithm; and Self-calibration techniques for high-resolution array processing.

  15. Neural mechanisms of spatiotemporal signal processing

    NASA Astrophysics Data System (ADS)

    Khanbabaie Shoub, Shaban (Reza)

    We have studied the synaptic, dendritic, and network mechanisms of spatiotemporal signal processing underlying the computation of visual motion in the avian tectum. Such mechanisms are critical for information processing in all vertebrates, but have been difficult to elucidate in mammals because of anatomical limitations. We have therefore developed a chick tectal slice preparation, which has features that help us circumvent these limitations. Using single-electrode multi-pulse synaptic stimulation experiments, we found that the SGC-I cell responds to synaptic stimulation in a binary manner and its response is phasic in a time dependent probabilistic manner over large time scales. Synaptic inputs at two locations typically interact in a mutually exclusive manner when delivered within the "interaction time" of approximately 30 ms. We then constructed a model of the SGC-I cell and the retinal inputs to examine the role of the observed non-linear cellular properties in shaping the response of SGC-I neurons to assumed retinal representations of dynamic spatiotemporal visual stimuli. We found that, by these properties, SGC-I cells can classify different stimuli. In particular, without the phasic synaptic signal transfer, the model SGC-I cell fails to distinguish between static stationary stimuli and dynamic spatiotemporal stimuli. Based on the one-site synaptic response probability and the assumption of independent neighboring dendritic endings, we predicted the response probability of SGC-I cells to multiple synaptic inputs. We tested this independence-based model prediction and found that the independence assumption is not valid. The measured SGC-I response probability to multiple synaptic inputs does not increase with the number of synaptic inputs. The presence of GABAergic horizontal cells in layer 5 suggests an inhibitory effect of these cells on the SGC-I retino-tectal synaptic responses. In our experiment we found that the measured SGC-I response probability to multiple

  16. Statistical signal processing in sensor networks

    NASA Astrophysics Data System (ADS)

    Guerriero, Marco

    In this dissertation we focus on decentralized signal processing in Sensor Networks (SN). Four topics are studied: (i) Direction of Arrival (DOA) estimation using a Wireless Sensor Network (WSN), (ii) multiple target tracking in large SN, (iii) decentralized target detection in SN and (iv) decentralized sequential detection in SN with communication constraints. The first topic of this thesis addresses the problem of estimating the DOA of an acoustic wavefront using a WSN made of isotropic (hence individually useless) sensors. The WSN was designed according to the SENMA (SEnsor Network with Mobile Agents) architecture with a mobile agent (MA) that successively queries the sensors lying inside its field of view. We propose both fast/simple and optimal DOA-estimation schemes, and an optimization of the MA's observation management is also carried out, with the surprising finding that the MA ought to orient itself at an oblique angle to the expected DOA, rather than directly toward it. We also consider the extension to multiple sources; intriguingly, per-source DOA accuracy is higher when there is more than one source. In all cases, performance is investigated by simulation and compared, when appropriate, with asymptotic bounds; the latter are usually met after a moderate number of MA dwells. In the second topic, we study the problem of tracking multiple targets in large SN. While these networks hold significant potential for surveillance, it is of interest to address fundamental limitations in large-scale implementations. We first introduce a simple analytical tracker performance model. Analysis of this model suggests that scan-based tracking performance improves with increasing numbers of sensors, but only to a certain point beyond which degradation is observed. Correspondingly, we address model-based optimization of the local sensor detection threshold and the number of sensors. Next, we propose a two-stage tracking approach (fuse-before-track) as a possible

  17. Signal Processing in the Linear Statistical Model

    DTIC Science & Technology

    1994-11-04

    Covariance Bounds," Proc 27th Asilomar Conf on Signals, Systems, and Computers, Pacific Grove, CA (November 1993). [MuS91] C. T. Mullis and L. L. Scharf... "Transforms," Proc Asilomar Conf on Signals, Systems, and Computers, Asilomar, CA (November 1991). [SpS94] M. Spurbeck and L. L. Scharf, "Least Squares... McWhorter and L. L. Scharf, "Multiwindow Estimators of Correlation," Proc 28th Annual Asilomar Conf on Signals, Systems, and Computers, Asilomar, CA

  18. Microwave signal processing with photorefractive dynamic holography

    NASA Astrophysics Data System (ADS)

    Fotheringham, Edeline B.

    Have you ever found yourself listening to the music playing from the closest stereo rather than to the bromidic (uninspiring) person speaking to you? Your ears receive information from two sources but your brain listens to only one. What if your cell phone could distinguish among signals sharing the same bandwidth too? There would be no "full" channels to stop you from placing or receiving a call. This thesis presents a nonlinear optical circuit capable of distinguishing uncorrelated signals that have overlapping temporal bandwidths. This so called autotuning filter is the size of a U.S. quarter dollar and requires less than 3 mW of optical power to operate. It is basically an oscillator in which the losses are compensated with dynamic holographic gain. The combination of two photorefractive crystals in the resonator governs the filter's winner-take-all dynamics through signal-competition for gain. This physical circuit extracts what is mathematically referred to as the largest principal component of its spatio-temporal input space. The circuit's practicality is demonstrated by its incorporation in an RF-photonic system. An unknown mixture of unknown microwave signals, received by an antenna array, constitutes the input to the system. The output electronically returns one of the original microwave signals. The front-end of the system down converts the 10 GHz microwave signals and amplifies them before the signals phase modulate optical beams. The optical carrier is suppressed from these beams so that it may not be considered as a signal itself to the autotuning filter. The suppression is achieved with two-beam coupling in a single photorefractive crystal. The filter extracts the more intense of the signals present on the carrier-suppressed input beams. The detection of the extracted signal restores the microwave signal to an electronic form. The system, without the receiving antenna array, is packaged in a 13 x 18 x 6″ briefcase. Its power consumption equals that

  19. A comparison among different techniques for human ERG signals processing and classification.

    PubMed

    Barraco, R; Persano Adorno, D; Brai, M; Tranchina, L

    2014-02-01

    Feature detection in biomedical signals is crucial for deepening our knowledge about the involved physiological processes. To achieve this aim, many analytic approaches can be applied, but only a few are able to deal with signals whose time dependent features provide useful clinical information. Among the biomedical signals, the electroretinogram (ERG), which records the retinal response to a light flash, can improve our comprehension of the complex photoreceptoral activities. The present study is focused on the analysis of the early response of the photoreceptoral human system, known as the a-wave ERG component. This wave reflects the functional integrity of the photoreceptors, rods and cones, whose activation dynamics are not yet completely understood. Moreover, since in incipient photoreceptoral pathologies possible anomalies in the a-wave are not always detectable with a "naked eye" analysis of the traces, the possibility to discriminate pathologic from healthy traces, by means of appropriate analytical techniques, could help in clinical diagnosis. In the present paper, we discuss and compare the efficiency of various techniques of signal processing, such as Fourier analysis (FA), Principal Component Analysis (PCA), and Wavelet Analysis (WA), in recognising pathological traces from the healthy ones. The investigated retinal pathologies are Achromatopsia, a cone disease, and Congenital Stationary Night Blindness, affecting the photoreceptoral signal transmission. Our findings prove that both PCA and FA of conventional ERGs don't add clinical information useful for the diagnosis of ocular pathologies, whereas the use of a more sophisticated analysis, based on the wavelet transform, provides a powerful tool for routine clinical examinations of patients. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  20. Wavelet Signal Processing for Transient Feature Extraction

    DTIC Science & Technology

    1992-03-15

    Research was conducted to evaluate the feasibility of applying Wavelets and Wavelet Transform methods to transient signal feature extraction problems... Wavelet transform techniques were developed to extract low dimensional feature data that allowed a simple classification scheme to easily separate

  1. Adaptive Noise Suppression Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Kozel, David; Nelson, Richard

    1996-01-01

    A signal-to-noise ratio dependent adaptive spectral subtraction algorithm is developed to eliminate noise from noise-corrupted speech signals. The algorithm determines the signal-to-noise ratio and adjusts the spectral subtraction proportion appropriately. After spectral subtraction, low-amplitude signals are squelched. A single microphone is used to obtain both the noise-corrupted speech and the average noise estimate. This is done by determining whether the frame of data being sampled is a voiced or unvoiced frame. During unvoiced frames an estimate of the noise is obtained. A running average of the noise is used to approximate the expected value of the noise. Applications include the emergency egress vehicle and the crawler transporter.
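
    A minimal sketch of SNR-dependent spectral subtraction (an illustrative reconstruction, not the report's implementation; the frame window, over-subtraction rule, spectral floor, and squelch threshold are all assumed):

    import numpy as np

    def spectral_subtract(frame, noise_mag, floor=0.02):
        # One frame of SNR-dependent spectral subtraction.
        # frame: time-domain samples; noise_mag: running-average noise magnitude spectrum.
        spec = np.fft.rfft(frame * np.hanning(len(frame)))
        mag, phase = np.abs(spec), np.angle(spec)

        # Estimate the frame SNR and raise the subtraction proportion as SNR drops
        # (assumed rule; the report adjusts the proportion by its own criterion).
        snr_db = 10 * np.log10(np.sum(mag ** 2) / (np.sum(noise_mag ** 2) + 1e-12))
        alpha = np.clip(4.0 - 0.15 * snr_db, 1.0, 6.0)

        clean = np.maximum(mag - alpha * noise_mag, floor * mag)   # spectral floor
        out = np.fft.irfft(clean * np.exp(1j * phase), n=len(frame))

        # Squelch low-amplitude residue after subtraction (assumed threshold).
        if np.sqrt(np.mean(out ** 2)) < 1e-3:
            out[:] = 0.0
        return out

    # Usage: update noise_mag as a running average over frames classified as
    # unvoiced, then call spectral_subtract() on each incoming frame.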

  2. Wavelet Transform Signal Processing Applied to Ultrasonics.

    DTIC Science & Technology

    1995-05-01

    The wavelet transform is applied to the analysis of ultrasonic waves for improved signal detection and analysis of the signals. In instances where... the mother wavelet is well defined, the wavelet transform has relative insensitivity to noise and does not need windowing. Peak detection of... ultrasonic pulses using the wavelet transform is described and results show good detection even when large white noise was added. The use of the wavelet

  3. All-numerical noise filtering of fluorescence signals for achieving ultra-low limit of detection in biomedical applications.

    PubMed

    Dongre, Chaitanya; Pollnau, Markus; Hoekstra, Hugo J W M

    2011-03-21

    We present an all-numerical method for post-processing of the fluorescent signal as obtained from labeled molecules by capillary electrophoresis (CE) in an optofluidic chip, on the basis of data filtering in the Fourier domain. It is shown that the method outperforms the well-known lock-in amplification in experiments, reducing noise by a factor of √2. The method is illustrated using experimental data obtained during CE separation of molecules from a commercial DNA ladder with 17 fluorescently labeled molecules having different base-pair sizes. An improvement in signal-to-noise ratio by a factor of ∼10 is achieved, resulting in a record-low limit of detection of 210 fM.
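
    A hedged sketch of the kind of Fourier-domain filtering described (the modulation frequency, bandwidth, and synthetic electropherogram below are assumptions, not the paper's experimental values):

    import numpy as np
    from scipy.signal import hilbert

    def fourier_bandpass(x, fs, f0, bw):
        # Zero all Fourier components except those within bw/2 of the
        # modulation frequency f0, then return to the time domain.
        X = np.fft.rfft(x)
        f = np.fft.rfftfreq(len(x), d=1.0 / fs)
        X[np.abs(f - f0) > bw / 2.0] = 0.0
        return np.fft.irfft(X, n=len(x))

    # Toy trace: a weak, modulated analyte peak buried in white noise.
    fs, f0 = 10_000.0, 370.0                    # assumed sampling and chopper rates, Hz
    t = np.arange(0, 5.0, 1.0 / fs)
    peak = np.exp(-0.5 * ((t - 2.5) / 0.05) ** 2)
    rng = np.random.default_rng(3)
    raw = 0.05 * peak * np.sin(2 * np.pi * f0 * t) + rng.normal(scale=0.5, size=t.size)

    filtered = fourier_bandpass(raw, fs, f0, bw=20.0)
    envelope = np.abs(hilbert(filtered))        # demodulated estimate of the peak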

  4. Automatic detection of slight parameter changes associated to complex biomedical signals using multiresolution q-entropy.

    PubMed

    Torres, M E; Añino, M M; Schlotthauer, G

    2003-12-01

    It is well known that, from a dynamical point of view, sudden variations in physiological parameters which govern certain diseases can cause qualitative changes in the dynamics of the corresponding physiological process. The purpose of this paper is to introduce a technique that allows the automated temporal localization of slight changes in a parameter of the law that governs the nonlinear dynamics of a given signal. This tool takes, from the multiresolution entropies, the ability to show these changes as statistical variations at each scale. These variations are held in the corresponding principal component. Appropriately combining these techniques with a statistical changes detector, a complexity change detection algorithm is obtained. The relevance of the approach, together with its robustness in the presence of moderate noise, is discussed in numerical simulations and the automatic detector is applied to real and simulated biological signals.
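
    One plausible reading of the multiresolution q-entropy computation, sketched with PyWavelets (the wavelet, decomposition level, window length, and q value are assumptions; the paper's own estimator and change detector may differ):

    import numpy as np
    import pywt

    def tsallis_entropy(p, q=1.5):
        # Tsallis q-entropy of a discrete distribution p (q != 1).
        p = p[p > 0]
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def multiresolution_q_entropy(x, wavelet="db4", level=4, q=1.5):
        # One q-entropy value per wavelet scale, from the normalized energy
        # distribution of the coefficients within that scale.
        coeffs = pywt.wavedec(x, wavelet, level=level)
        entropies = []
        for c in coeffs:
            energy = c ** 2
            p = energy / (energy.sum() + 1e-12)
            entropies.append(tsallis_entropy(p, q))
        return np.array(entropies)

    def sliding_entropies(x, win=512, hop=128, **kwargs):
        # Entropy-versus-time matrix; a statistical change detector can then
        # flag jumps in each scale (or in their principal component).
        starts = range(0, len(x) - win + 1, hop)
        return np.array([multiresolution_q_entropy(x[s:s + win], **kwargs)
                         for s in starts])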

  5. An overview of synthetic aperture radar signal processing techniques

    NASA Astrophysics Data System (ADS)

    vant, M. R.

    The principles of synthetic aperture radar and its significance in a military and remote sensing context are reviewed. The signal processing operations required to convert the signal data into an image are described. These operations must accomplish two things: correction of the range migration, both walk and curvature; and compression or focusing of the cross-range portion of the signal. These operations include time-domain deramping, matched filtering, range-Doppler fast convolution, step transform processing, spotlight mode processing, and motion compensation. Processing problems specific to synthetic aperture radar signal processing are summarized and an example of synthetic aperture radar imagery is presented.

  6. Neural Networks for Signal Processing and Control

    NASA Astrophysics Data System (ADS)

    Hesselroth, Ted Daniel

    cortex by the application of lateral interactions during the learning phase. The organization of the mature network is compared to that found in the macaque monkey by several analytical tests. The capacity of the network to process images is investigated. By a method of reconstructing the input images in terms of V1 activities, the simulations show that images can be faithfully represented in V1 by the proposed network. The signal-to-noise ratio of the image is improved by the representation, and compression ratios of well over two-hundred are possible. Lateral interactions between V1 neurons sharpen their orientational tuning. We further study the dynamics of the processing, showing that the rate of decrease of the error of the reconstruction is maximized for the receptive fields used. Lastly, we employ a Fokker-Planck equation for a more detailed prediction of the error value vs. time. The Fokker-Planck equation for an underdamped system with a driving force is derived, yielding an energy-dependent diffusion coefficient which is the integral of the spectral densities of the force and the velocity of the system. The theory is applied to correlated noise activation and resonant activation. Simulation results for the error of the network vs time are compared to the solution of the Fokker-Planck equation.

  7. Multichannel heterodyning for wideband interferometry, correlation and signal processing

    DOEpatents

    Erskine, D.J.

    1999-08-24

    A method is disclosed of signal processing a high bandwidth signal by coherently subdividing it into many narrow bandwidth channels which are individually processed at lower frequencies in a parallel manner. Autocorrelation and correlations can be performed using reference frequencies which may drift slowly with time, reducing cost of device. Coordinated adjustment of channel phases alters temporal and spectral behavior of net signal process more precisely than a channel used individually. This is a method of implementing precision long coherent delays, interferometers, and filters for high bandwidth optical or microwave signals using low bandwidth electronics. High bandwidth signals can be recorded, mathematically manipulated, and synthesized. 50 figs.

  8. Multichannel heterodyning for wideband interferometry, correlation and signal processing

    DOEpatents

    Erskine, David J.

    1999-01-01

    A method of signal processing a high bandwidth signal by coherently subdividing it into many narrow bandwidth channels which are individually processed at lower frequencies in a parallel manner. Autocorrelation and correlations can be performed using reference frequencies which may drift slowly with time, reducing cost of device. Coordinated adjustment of channel phases alters temporal and spectral behavior of net signal process more precisely than a channel used individually. This is a method of implementing precision long coherent delays, interferometers, and filters for high bandwidth optical or microwave signals using low bandwidth electronics. High bandwidth signals can be recorded, mathematically manipulated, and synthesized.

  9. Statistical Signal Processing Methods in Scattering and Imaging

    NASA Astrophysics Data System (ADS)

    Zambrano Nunez, Maytee

    This Ph.D. dissertation project addresses two related topics in wave-based signal processing: 1) Cramer-Rao bound (CRB) analysis of scattering systems formed by pointlike scatterers in one-dimensional (1D) and three-dimensional (3D) spaces. 2) Compressive optical coherent imaging, based on the incorporation of sparsity priors in the reconstructions. The first topic addresses for wave scattering systems in 1D and 3D spaces the information content about scattering parameters, in particular, the targets' positions and strengths, and derived quantities, that is contained in scattering data corresponding to reflective, transmissive, and more general sensing modalities. This part of the dissertation derives the Cramer-Rao bound (CRB) for the estimation of parameters of scalar wave scattering systems formed by point scatterers. The results shed light on the fundamental difference between the approximate Born approximation model for weak scatterers and the more general multiple scattering model, and facilitate the identification of regions in parameter space where multiple scattering facilitates or obstructs the estimation of parameters from scattering data, as well as of sensing configurations giving maximal or minimal information about the parameters. The derived results are illustrated with numerical examples, with particular emphasis on the imaging resolution which we quantify via a relative resolution index borrowed from a previous paper. Additionally, this work investigates fundamental limits of estimation performance for the localization of the targets and the inverse scattering problem. The second topic of the effort describes a novel compressive-sensing-based technique for optical imaging with a coherent single-detector system. This hybrid opto-micro-electromechanical, coherent single-detector imaging system applies the latest developments in the nascent field of compressive sensing to the problem of computational imaging of wavefield intensity from a small number
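
    For reference, the Cramer-Rao bound invoked throughout is the standard one (stated generically here, not in the dissertation's notation): for an unbiased estimator of a parameter vector θ observed through data x with likelihood p(x; θ),

        \operatorname{cov}(\hat{\boldsymbol{\theta}}) \succeq \mathbf{I}(\boldsymbol{\theta})^{-1},
        \qquad
        [\mathbf{I}(\boldsymbol{\theta})]_{ij}
        = \mathbb{E}\!\left[
            \frac{\partial \ln p(\mathbf{x};\boldsymbol{\theta})}{\partial \theta_i}\,
            \frac{\partial \ln p(\mathbf{x};\boldsymbol{\theta})}{\partial \theta_j}
          \right],

    so the diagonal entries of the inverse Fisher information matrix lower-bound the variance with which scatterer positions and strengths can be estimated from the scattering data.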

  10. Optical Wavelet Signals Processing and Multiplexing

    NASA Astrophysics Data System (ADS)

    Cincotti, Gabriella; Moreolo, Michela Svaluto; Neri, Alessandro

    2005-12-01

    We present compact integrable architectures to perform the discrete wavelet transform (DWT) and the wavelet packet (WP) decomposition of an optical digital signal, and we show that the combined use of planar lightwave circuits (PLC) technology and multiresolution analysis (MRA) can add flexibility to current multiple access optical networks. We furnish the design guidelines to synthesize wavelet filters as two-port lattice-form planar devices, and we give some examples of optical signal denoising and compression/decompression techniques in the wavelet domain. Finally, we present a fully optical wavelet packet division multiplexing (WPDM) scheme where data signals are waveform-coded onto wavelet atom functions for transmission, and numerically evaluate its performances.

  11. [The alcoholization process in Latin America. Critical analysis of biomedical and sociological production, 1970-1980 (2)].

    PubMed

    Menéndez, E L

    1984-03-01

    This work analyses the bibliographical production in the sociological and biomedical fields on alcoholization generated within and for Latin America during the seventies. This production is characterized by a unilaterally "pathologizing" outlook, which contrasts with the outlook dominant in the socio-anthropological fields, analysed in a previous work. Empirical and factorial outlooks dominate in both theory and methodology, again stressing an approach whose serious limitations have already been shown. The dominant technical instruments, the sociological and epidemiological survey, continue to be used in spite of the many criticisms they have received. The data obtained not only have little relevance to the problem, but also stress facts at a level of depth which is neither justified by, nor justifies, the theoretical framework of analysis. Although unsystematized empirical data and specific research undertaken in other regional areas have pointed to a continual deficit on the part of the health team in the diagnosis and treatment of the alcoholization process, we have hardly any research which can throw light on the scientific and ideological limitations of medical and paramedical actions, nor a systematic analysis of alternative therapeutic strategies. All the bibliographical publications attribute the process of alcoholization, in a very biased way, to the lower population strata, without any critical reflection on that association. Although the biomedical dimension utilizes conceptions and viewpoints taken from anthropological and sociological production, this appropriation has entailed a modification which, in fact, has caused a split between the two dominant productions in Latin America: the biomedical and the anthropological one. This split replicates the conflict between models that operates within other national and international contexts.

  12. Optimizing signal and image processing applications using Intel libraries

    NASA Astrophysics Data System (ADS)

    Landré, Jérôme; Truchetet, Frédéric

    2007-01-01

    This paper presents optimized signal and image processing libraries from Intel Corporation. Intel Performance Primitives (IPP) is a low-level signal and image processing library developed by Intel Corporation to optimize code on Intel processors. Open Computer Vision library (OpenCV) is a high-level library dedicated to computer vision tasks. This article describes the use of both libraries to build flexible and efficient signal and image processing applications.
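
    IPP is called from C/C++, but the flavor of chaining a few optimized primitives can be illustrated with OpenCV's Python binding; the file name and filter parameters below are placeholders, not values from the article:

    import cv2
    import numpy as np

    # Hypothetical input image; falls back to random data if the file is absent.
    img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)
    if img is None:
        img = (np.random.default_rng(0).random((480, 640)) * 255).astype(np.uint8)

    blurred = cv2.GaussianBlur(img, (5, 5), 1.2)   # Gaussian smoothing (denoise)
    edges = cv2.Canny(blurred, 50, 150)            # Canny edge map
    cv2.imwrite("edges.png", edges)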

  13. Optical Architectures for Signal Processing - Part A

    DTIC Science & Technology

    2003-04-01

    [Figure residue: block diagram of a microwave signal input, optical source, passive optical device, and photodetector producing the output microwave signal.] ...integrated illumination with optical power of 2 mW. It can be concluded from Fig. 7 that the same switching performances can be observed whatever the way... [Figure 7 (frequency axis 1-10 GHz): Comparison of switching performances under 2 mW of optical power for the full integrated structure.]

  14. Digital Signal Processing Leveraged for Intrusion Detection

    DTIC Science & Technology

    2008-03-27

    threats including, but not limited to, pre-packaged or scripted attacks delivered by readily available tools such as Metasploit, WinNuke and others... The Metasploit project, 2008. [Rob04] Michael J. Roberts. Signals and Systems: Analysis Using Transform Methods and MATLAB. McGraw-Hill, Dubuque, Iowa

  15. Laser heterodyne interferometric signal processing method based on rising edge locking with high frequency clock signal.

    PubMed

    Zhang, Enzheng; Chen, Benyong; Yan, Liping; Yang, Tao; Hao, Qun; Dong, Wenjun; Li, Chaorong

    2013-02-25

    A novel phase measurement method composed of rising-edge locked signal processing and digital frequency mixing is proposed for the laser heterodyne interferometer. The rising-edge locked signal processing, which employs a high frequency clock signal to lock the rising edges of the reference and measurement signals, not only can improve the steepness of the rising edge, but also can eliminate the error counting caused by the multi-rising-edge phenomenon in fringe counting. The digital frequency mixing is realized by mixing the digital interference signal with a digital base signal, which is different from conventional frequency mixing with analogue signals. These signal processing steps can improve the measurement accuracy and enhance anti-interference and measurement stability. The principle and implementation of the method are described in detail. An experimental setup was constructed and a series of experiments verified the feasibility of the method in large displacement measurement with high speed and nanometer resolution.
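
    The rising-edge locking itself is a hardware step, but the digital frequency mixing that follows can be sketched in software: multiply the digitized interference signal by a digital base signal, low-pass filter to keep the difference frequency, and read the phase. The sample rate, heterodyne and base frequencies, and filter below are assumptions, not the paper's values.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs, f_het, f_base = 1_000_000.0, 80_000.0, 78_000.0   # assumed rates, Hz
    t = np.arange(0, 0.02, 1.0 / fs)

    # Simulated reference and measurement heterodyne signals; the measurement
    # carries a slowly varying interferometric phase phi(t).
    phi = 2 * np.pi * 3.0 * t
    ref = np.cos(2 * np.pi * f_het * t)
    meas = np.cos(2 * np.pi * f_het * t + phi)

    def mix_down(sig):
        # Digital frequency mixing: multiply by a digital base signal and keep
        # the difference-frequency component with a low-pass filter.
        i = sig * np.cos(2 * np.pi * f_base * t)
        q = -sig * np.sin(2 * np.pi * f_base * t)
        b, a = butter(4, 5_000.0, btype="low", fs=fs)
        return filtfilt(b, a, i) + 1j * filtfilt(b, a, q)

    # The phase difference between the mixed-down measurement and reference
    # signals recovers phi(t), the displacement-induced interferometric phase.
    dphi = np.unwrap(np.angle(mix_down(meas) * np.conj(mix_down(ref))))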

  16. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as: chip form, tool wear, and onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by application of chaotic characteristics.

  17. Development of an Ontology-Directed Signal Processing Toolbox

    SciTech Connect

    Stephen W. Lang

    2011-05-27

    This project was focused on the development of tools for the automatic configuration of signal processing systems. The goal is to develop tools that will be useful in a variety of Government and commercial areas and useable by people who are not signal processing experts. In order to get the most benefit from signal processing techniques, deep technical expertise is often required in order to select appropriate algorithms, combine them into a processing chain, and tune algorithm parameters for best performance on a specific problem. Therefore a significant benefit would result from the assembly of a toolbox of processing algorithms that has been selected for their effectiveness in a group of related problem areas, along with the means to allow people who are not signal processing experts to reliably select, combine, and tune these algorithms to solve specific problems. Defining a vocabulary for problem domain experts that is sufficiently expressive to drive the configuration of signal processing functions will allow the expertise of signal processing experts to be captured in rules for automated configuration. In order to test the feasibility of this approach, we addressed a lightning classification problem, which was proposed by DOE as a surrogate for problems encountered in nuclear nonproliferation data processing. We coded a toolbox of low-level signal processing algorithms for extracting features of RF waveforms, and demonstrated a prototype tool for screening data. We showed examples of using the tool for expediting the generation of ground-truth metadata, for training a signal recognizer, and for searching for signals with particular characteristics. The public benefits of this approach, if successful, will accrue to Government and commercial activities that face the same general problem - the development of sensor systems for complex environments. It will enable problem domain experts (e.g. analysts) to construct signal and image processing chains without

  18. Information processing in multi-step signaling pathways

    NASA Astrophysics Data System (ADS)

    Ganesan, Ambhi; Hamidzadeh, Archer; Zhang, Jin; Levchenko, Andre

    Information processing in complex signaling networks is limited by a high degree of variability in the abundance and activity of biochemical reactions (biological noise) operating in living cells. In this context, it is particularly surprising that many signaling pathways found in eukaryotic cells are composed of long chains of biochemical reactions, which are expected to be subject to accumulating noise and delayed signal processing. Here, we challenge the notion that signaling pathways are insulated chains, and rather view them as parts of extensively branched networks, which can benefit from a low degree of interference between signaling components. We further establish conditions under which this pathway organization would limit noise accumulation, and provide evidence for this type of signal processing in an experimental model of a calcium-activated MAPK cascade. These results address the long-standing problem of diverse organization and structure of signaling networks in live cells.

  19. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2013-12-31

    signal separation. Previous work was reported at the Oceans conference. Frequency Domain Reflectometry: In an attempt to overcome the unambiguous... respectively. The phase function of Maalox antacid was used as an input to the simulation program. In addition, the effect of integration time on... configuration as the simulations. The object position was fixed at the maximum distance while the turbidity was varied by dissolving Equate antacid into the

  20. Neural Networks Applied to Signal Processing

    DTIC Science & Technology

    1989-09-01

    [Report-form residue; keywords: neural network, backpropagation, conjugate gradient method, Fibonacci line search, nonlinear signal...; contents fragment: Calculation of the First Layer Gradients; Calculation of the Input Layer Gradient; Fibonacci Line Search Parameters...] ...conjugate gradient optimization method is presented and then applied to the neural network model. The Fibonacci line search method used in conjunction

  1. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2012-12-31

    in the sense that this is the first point at which the signal magnitude is maximum. On the other extreme, as c becomes large (high turbidity)... centimeter intervals in a 3.6 meter long water tank at varying turbidities [4]. In this preliminary experiment, the single delay line canceler was... in Through-the-Wall Radar Imaging," IEEE Transactions on Geoscience and Remote Sensing, vol. 47, no. 9, pp. 3192-3208, Sept. 2009. [3] Zege, E.P

  2. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  4. Camera systems in human motion analysis for biomedical applications

    NASA Astrophysics Data System (ADS)

    Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.

    2015-05-01

    The Human Motion Analysis (HMA) system has been one of the major interests among researchers in the fields of computer vision, artificial intelligence and biomedical engineering and sciences. This is due to its wide and promising biomedical applications, namely bio-instrumentation for human-computer interfacing, surveillance systems for monitoring human behaviour, and biomedical signal and image processing for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera system of HMA and its taxonomy, including camera types, camera calibration and camera configuration. The review focuses on evaluating camera system considerations of the HMA system specifically for biomedical applications. This review is important as it provides guidelines and recommendations for researchers and practitioners in selecting a camera system of the HMA system for biomedical applications.
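
    Camera calibration, one of the taxonomy items reviewed, is commonly performed with a planar checkerboard target; a minimal OpenCV sketch follows (the board size and image directory are placeholders, not from the paper):

    import glob
    import cv2
    import numpy as np

    board = (9, 6)                                # inner-corner count of an assumed checkerboard
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

    obj_points, img_points, img_size = [], [], None
    for path in glob.glob("calib_images/*.png"):  # placeholder image set
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, board)
        if found:
            obj_points.append(objp)
            img_points.append(corners)
            img_size = gray.shape[::-1]

    if img_points:
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, img_size, None, None)
        print("reprojection RMS:", rms)
        print("intrinsic matrix:\n", K)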

  5. Signal processing considerations for low signal to noise ratio laser Doppler and phase Doppler signals

    NASA Technical Reports Server (NTRS)

    Ibrahim, K. M.; Wertheimer, G. D.; Bachalo, William D.

    1991-01-01

    The relative performance of current methods used for estimating the phase and the frequency in LDV and phase Doppler applications in low signal to noise ratio conditions is analyzed. These methods include the Fourier analysis and the correlation techniques. Three methods that use the correlation function for frequency and phase estimations are evaluated in terms of accuracy and speed of processing. These methods include: (1) the frequency estimation using zero crossings counting of the auto-correlation function, (2) the Blackman-Tukey method, and (3) the AutoRegressive method (AR). The relative performance of these methods is evaluated and compared with the Fourier analysis method which provides the optimum performance in terms of the Maximum Likelihood (ML) criteria.
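
    The first of the correlation-based estimators mentioned, frequency estimation from zero crossings of the autocorrelation function, can be sketched as follows (the sample rate, burst model, and noise level are assumptions, not the paper's test conditions):

    import numpy as np

    def freq_from_autocorr_zero_crossings(x, fs, max_crossings=10):
        # Estimate the dominant frequency of a noisy Doppler burst from the
        # average spacing of the early zero crossings of its autocorrelation.
        x = x - np.mean(x)
        r = np.correlate(x, x, mode="full")[len(x) - 1:]   # one-sided autocorrelation
        crossings = np.nonzero(np.diff(np.sign(r)) != 0)[0][:max_crossings]
        if len(crossings) < 2:
            return np.nan
        half_period = np.mean(np.diff(crossings)) / fs     # crossings occur every T/2
        return 1.0 / (2.0 * half_period)

    # Toy LDV burst: Gaussian-windowed tone plus white noise at low SNR.
    fs, f0 = 10e6, 250e3
    t = np.arange(0, 100e-6, 1 / fs)
    burst = np.exp(-((t - 50e-6) / 20e-6) ** 2) * np.cos(2 * np.pi * f0 * t)
    noisy = burst + 0.5 * np.random.default_rng(0).standard_normal(t.size)
    print("estimated frequency (Hz):", freq_from_autocorr_zero_crossings(noisy, fs))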

  6. High-speed and reconfigurable all-optical signal processing for phase and amplitude modulated signals

    NASA Astrophysics Data System (ADS)

    Khaleghi, Salman

    Technology has empowered people in all walks of life to generate, store, and communicate enormous amounts of data. Recent technological advances in high-speed backbone data networks, together with the growing trend toward bandwidth-demanding applications such as data and video sharing, cloud computing, and data collection systems, have created a need for higher capacities in signal transmission and signal processing. Optical communication systems have long benefited from the large bandwidth of optical signals (beyond tera-hertz) to transmit information. Through the use of optical signal processing techniques, this Ph.D. dissertation explores the potential of very-high-speed optics to assist electronics in processing huge amounts of data at high speeds. Optical signal processing brings together various fields of optics and signal processing---nonlinear devices and processes, analog and digital signals, and advanced data modulation formats---to achieve high-speed signal processing functions that can potentially operate at the line rate of fiber optic communications. Information can be encoded in amplitude, phase, wavelength, polarization, and spatial features of an optical wave to achieve high-capacity transmission. Many advances in the key enabling technologies have led to recent research in optical signal processing for digital signals that are encoded in one or more of these dimensions. Optical Kerr nonlinearities have femto-second response times that have been exploited for fast processing of optical signals. Various optical nonlinearities and chromatic dispersions have enabled key sub-system applications such as wavelength conversion, multicasting, multiplexing, demultiplexing, and tunable optical delays. In this Ph.D. dissertation, we employ these recent advances in the enabling technologies for high-speed optical signal processing to demonstrate various techniques that can process phase- and amplitude-encoded optical signals at the line rate of optics. We use

  7. An Analog Circuit Approximation of the Discrete Wavelet Transform for Ultra Low Power Signal Processing in Wearable Sensor Nodes

    PubMed Central

    Casson, Alexander J.

    2015-01-01

    Ultra low power signal processing is an essential part of all sensor nodes, and particularly so in emerging wearable sensors for biomedical applications. Analog signal processing has an important role in these low power, low voltage, low frequency applications, and there is a key drive to decrease the power consumption of existing analog domain signal processing and to map more signal processing approaches into the analog domain. This paper presents an analog domain signal processing circuit which approximates the output of the Discrete Wavelet Transform (DWT) for use in ultra low power wearable sensors. Analog filters are used for the DWT filters and it is demonstrated how these generate analog domain DWT-like information that embeds information from Butterworth and Daubechies maximally flat mother wavelet responses. The Analog DWT is realised in hardware via gmC circuits, designed to operate from a 1.3 V coin cell battery, and provide DWT-like signal processing using under 115 nW of power when implemented in a 0.18 μm CMOS process. Practical examples demonstrate the effective use of the new Analog DWT on ECG (electrocardiogram) and EEG (electroencephalogram) signals recorded from humans. PMID:26694414

  8. An Analog Circuit Approximation of the Discrete Wavelet Transform for Ultra Low Power Signal Processing in Wearable Sensor Nodes.

    PubMed

    Casson, Alexander J

    2015-12-17

    Ultra low power signal processing is an essential part of all sensor nodes, and particularly so in emerging wearable sensors for biomedical applications. Analog signal processing has an important role in these low power, low voltage, low frequency applications, and there is a key drive to decrease the power consumption of existing analog domain signal processing and to map more signal processing approaches into the analog domain. This paper presents an analog domain signal processing circuit which approximates the output of the Discrete Wavelet Transform (DWT) for use in ultra low power wearable sensors. Analog filters are used for the DWT filters and it is demonstrated how these generate analog domain DWT-like information that embeds information from Butterworth and Daubechies maximally flat mother wavelet responses. The Analog DWT is realised in hardware via g(m)C circuits, designed to operate from a 1.3 V coin cell battery, and provide DWT-like signal processing using under 115 nW of power when implemented in a 0.18 μm CMOS process. Practical examples demonstrate the effective use of the new Analog DWT on ECG (electrocardiogram) and EEG (electroencephalogram) signals recorded from humans.
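
    To make the DWT-like decomposition above concrete in purely digital terms, here is a minimal Python sketch that uses an octave-spaced Butterworth band-pass filter bank as a stand-in for the wavelet detail bands; it illustrates the concept only and is not the paper's gm-C analog design (the sampling rate, filter order and band edges are assumptions).

```python
# Sketch only: octave-spaced Butterworth bands approximating DWT detail bands.
import numpy as np
from scipy import signal

fs = 256.0                      # Hz, assumed EEG/ECG-like sampling rate
t = np.arange(0, 4, 1 / fs)
test_signal = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)

bands = []
f_hi = 0.9 * fs / 2             # stay below the Nyquist frequency
for level in range(4):          # four octave-spaced "detail" bands
    f_lo = f_hi / 2
    sos = signal.butter(2, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
    bands.append(signal.sosfilt(sos, test_signal))
    f_hi = f_lo

for i, b in enumerate(bands, 1):
    print(f"level {i}: band energy = {np.sum(b**2):.1f}")
```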

  9. Nonlinear Real-Time Optical Signal Processing.

    DTIC Science & Technology

    1981-06-30

    ...bandwidth and space-bandwidth products. Real-time homomorphic and logarithmic filtering by halftone nonlinear processing has been achieved. A detailed analysis of degradation due to the finite gamma...

  10. Intracellular signalling proteins as 'smart' agents in parallel distributed processes.

    PubMed

    Fisher, M J; Paton, R C; Matsuno, K

    1999-06-01

    In eucaryotic organisms, responses to external signals are mediated by a repertoire of intracellular signalling pathways that ultimately bring about the activation/inactivation of protein kinases and/or protein phosphatases. Until relatively recently, little thought had been given to the intracellular distribution of the components of these signalling pathways. However, experimental evidence from a diverse range of organisms indicates that rather than being freely distributed, many of the protein components of signalling cascades show a significant degree of spatial organisation. Here, we briefly review the roles of 'anchor', 'scaffold' and 'adaptor' proteins in the organisation and functioning of intracellular signalling pathways. We then consider some of the parallel distributed processing capacities of these adaptive systems. We focus on signalling proteins, both as individual 'devices' (agents) and as 'networks' (ecologies) of parallel processes. Signalling proteins are described as 'smart thermodynamic machines' which satisfy 'gluing' (functorial) roles in the information economy of the cell. This combines two information-processing views of signalling proteins. Individually, they show 'cognitive' capacities and collectively they integrate (cohere) cellular processes. We exploit these views by drawing comparisons between signalling proteins and verbs. This text/dialogical metaphor also helps refine our view of signalling proteins as context-sensitive information processing agents.

  11. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
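
    The Allan deviation mentioned above can be computed from phase samples with the standard overlapping estimator; the short Python sketch below uses the textbook formula and simulated white frequency noise, not the analyzer's actual implementation (the sampling interval and noise level are assumptions).

```python
# Sketch only: overlapping Allan deviation from phase (time-error) samples.
import numpy as np

def allan_deviation(phase, tau0, m):
    """Overlapping Allan deviation at averaging time m*tau0.

    phase : array of time-error samples in seconds
    tau0  : sampling interval in seconds
    m     : averaging factor (integer >= 1)
    """
    x = np.asarray(phase, dtype=float)
    n = x.size
    if n < 2 * m + 1:
        raise ValueError("not enough samples for this averaging factor")
    d2 = x[2 * m:] - 2 * x[m:n - m] + x[:n - 2 * m]      # second differences
    avar = np.sum(d2 ** 2) / (2.0 * (m * tau0) ** 2 * d2.size)
    return np.sqrt(avar)

# White frequency noise: ADEV should fall roughly as 1/sqrt(tau)
tau0 = 1.0
freq_noise = 1e-12 * np.random.randn(100000)
phase = np.cumsum(freq_noise) * tau0
for m in (1, 10, 100):
    print(f"tau = {m * tau0:5.0f} s, ADEV = {allan_deviation(phase, tau0, m):.2e}")
```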

  12. Strengthening of biomedical Ni-free Co-Cr-Mo alloy by multipass "low-strain-per-pass" thermomechanical processing.

    PubMed

    Mori, Manami; Yamanaka, Kenta; Sato, Shigeo; Tsubaki, Shinki; Satoh, Kozue; Kumagai, Masayoshi; Imafuku, Muneyuki; Shobu, Takahisa; Chiba, Akihiko

    2015-12-01

    Further strengthening of biomedical Co-Cr-Mo alloys is desired, owing to the demand for improvements to their durability in applications such as artificial hip joints, spinal rods, bone plates, and screws. Here, we present a strategy-multipass "low-strain-per-pass" thermomechanical processing-for achieving high-strength biomedical Co-Cr-Mo alloys with sufficient ductility. The process primarily consists of multipass hot deformation, which involves repeated introduction of relatively small amounts of strain to the alloy at elevated temperatures. The concept was verified by performing hot rolling of a Co-28 Cr-6 Mo-0.13N (mass%) alloy and its strengthening mechanisms were examined. Strength increased monotonically with hot-rolling reduction, eventually reaching 1,400 MPa in 0.2% proof stress, an exceptionally high value. Synchrotron X-ray diffraction (XRD) line-profile analysis revealed a drastic increase in the dislocation density with an increase in hot-rolling reduction and proposed that the significant strengthening was primarily driven by the increased dislocation density, while the contributions of grain refinement were minor. In addition, extra strengthening, which originates from contributions of planar defects (stacking faults/deformation twins), became apparent for greater hot-rolling reductions. The results obtained in this work help in reconsidering the existing strengthening strategy for the alloys, and thus, a novel feasible manufacturing route using conventional hot deformation processing, such as forging, rolling, swaging, and drawing, is realized. The results obtained in this work suggested a novel microstructural design concept/feasible manufacturing route of high-strength Co-Cr-Mo alloys using conventional hot deformation processing. The present strategy focuses on the strengthening due to the introduction of a high density of lattice defects rather than grain refinement using dynamic recrystallization (DRX). The hot-rolled samples obtained by our

  13. Profile and Instrumentation Driven Methods for Embedded Signal Processing

    DTIC Science & Technology

    2015-01-01

    We propose and develop several profile- and instrumentation-based techniques that facilitate the design and maintenance of embedded signal processing systems: (1) we propose and develop a novel translation...

  14. Signal processing at the Poker Flat MST radar

    NASA Technical Reports Server (NTRS)

    Carter, D. A.

    1983-01-01

    Signal processing for Mesosphere-Stratosphere-Troposphere (MST) radar is carried out by a combination of hardware in high-speed, special-purpose devices and software in a general-purpose minicomputer/array processor. A block diagram of the signal processing system is presented, and the steps in the processing pathway are described. The current processing capabilities are given, and a system offering greater coherent-integration speed, which hinges upon a high-speed preprocessor, is proposed.
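
    The coherent integration step referred to above (and accelerated by the proposed preprocessor) amounts to summing successive complex samples before spectral analysis; the Python sketch below demonstrates the resulting SNR gain on a simulated slow echo (all parameters are illustrative assumptions).

```python
# Sketch only: SNR improvement from coherent integration of complex samples.
import numpy as np

def coherent_integrate(samples, n_coh):
    """Sum groups of n_coh successive complex samples (one range gate)."""
    usable = (len(samples) // n_coh) * n_coh
    return samples[:usable].reshape(-1, n_coh).sum(axis=1)

rng = np.random.default_rng(8)
n, n_coh = 4096, 64
echo = 0.1 * np.exp(1j * 2 * np.pi * 0.001 * np.arange(n))       # slow Doppler echo
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

snr_in = np.mean(np.abs(echo) ** 2) / np.mean(np.abs(noise) ** 2)
sig_out = coherent_integrate(echo, n_coh)
noise_out = coherent_integrate(noise, n_coh)
snr_out = np.mean(np.abs(sig_out) ** 2) / np.mean(np.abs(noise_out) ** 2)
print(f"SNR gain from coherent integration: {snr_out / snr_in:.1f}x (ideal {n_coh}x)")
```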

  15. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2012-01-31

    ...a sensitivity of 1 gives an attenuation of -12 dB and a sensitivity of 2 gives an attenuation of 0 dB. The signal is digitized using a Sigma-Delta ADC with a... MATLAB code that called the API C functions was used. The mean of the I data and the mean of the Q data were computed, and the amplitude was calculated... software-defined radio (SDR) modules. Clarkson has investigated several other SDR approaches, which include two open-source SDR platforms and one...
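
    The I/Q step mentioned in the fragments above (averaging the in-phase and quadrature data and converting them to amplitude and phase) can be written compactly as follows; this is an illustrative Python sketch with simulated data, not the project's MATLAB code.

```python
# Sketch only: mean I/Q values converted to amplitude and phase.
import numpy as np

def iq_to_amplitude_phase(i_samples, q_samples):
    i_mean = np.mean(i_samples)
    q_mean = np.mean(q_samples)
    amplitude = np.hypot(i_mean, q_mean)
    phase = np.arctan2(q_mean, i_mean)     # phase maps to range in hybrid lidar-radar
    return amplitude, phase

rng = np.random.default_rng(0)
true_phase, n = 0.8, 4096                  # assumed modulation phase buried in noise
i = np.cos(true_phase) + 0.5 * rng.standard_normal(n)
q = np.sin(true_phase) + 0.5 * rng.standard_normal(n)
amp, ph = iq_to_amplitude_phase(i, q)
print(f"amplitude ~ {amp:.2f}, phase ~ {ph:.2f} rad")
```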

  16. Innovative signal processing for Johnson Noise thermometry

    SciTech Connect

    Ezell, N. Dianne Bull; Britton, Jr, Charles L.; Roberts, Michael

    2016-07-01

    This report summarizes a newly developed algorithm that subtracts electromagnetic interference (EMI). EMI performance is very important to this measurement because any interference, in the form of pickup from external signal sources such as fluorescent lighting ballasts, motors, etc., can skew the measurement. Two methods of removing EMI were developed and tested at various locations. This report also summarizes the testing performed at different facilities outside Oak Ridge National Laboratory using both EMI removal techniques. The first EMI removal technique was reviewed in previous milestone reports; therefore, this report details the second method.

  17. Signal processing and electronic noise in LZ

    NASA Astrophysics Data System (ADS)

    Khaitan, D.

    2016-03-01

    The electronics of the LUX-ZEPLIN (LZ) experiment, the 10-tonne dark matter detector to be installed at the Sanford Underground Research Facility (SURF), consists of low-noise dual-gain amplifiers and a 100-MHz, 14-bit data acquisition system for the TPC PMTs. Pre-prototypes of the analog amplifiers and the 32-channel digitizers were tested extensively with simulated pulses that are similar to the prompt scintillation light and the electroluminescence signals expected in LZ. These studies are used to characterize the noise and to measure the linearity of the system. By increasing the amplitude of the test signals, the effect of saturating the amplifier and the digitizers was studied. The RMS ADC noise of the digitizer channels was measured to be 1.19± 0.01 ADCC. When a high-energy channel of the amplifier is connected to the digitizer, the measured noise remained virtually unchanged, while the noise added by a low-energy channel was estimated to be 0.38 ± 0.02 ADCC (46 ± 2 μV). A test facility is under construction to study saturation, mitigate noise and measure the performance of the LZ electronics and data acquisition chain.

  18. Signal Processing and Electronic Noise in LZ

    NASA Astrophysics Data System (ADS)

    Ashish Khaitan, Dev

    2015-11-01

    The electronics of the LUX-ZEPLIN (LZ) experiment, the 10-tonne dark matter detector to be installed at the Sanford Underground Research Facility (SURF), consists of low-noise dual-gain amplifiers and a 100-MHz, 14-bit data acquisition system for the TPC PMTs. Pre-prototypes of the analog amplifiers and the 32-channel digitizers were tested extensively with simulated pulses that are similar to the prompt scintillation light and the electroluminescence signals expected in LZ. These studies are used to characterize the noise and to measure the linearity of the system. By increasing the amplitude of the test signals, the effect of saturating the amplifier and the digitizers was studied. The RMS ADC noise of the digitizer channels was measured to be 1.19 ± 0.01 ADCC. When a high-energy channel of the amplifier is connected to the digitizer, the measured noise remained virtually unchanged, while the noise added by a low-energy channel was estimated to be 0.38 ± 0.02 ADCC (46 ± 2 μV). A test facility is under construction to study saturation, mitigate noise and measure the performance of the LZ electronics and data acquisition chain.

  19. [Biomedical informatics].

    PubMed

    Capurro, Daniel; Soto, Mauricio; Vivent, Macarena; Lopetegui, Marcelo; Herskovic, Jorge R

    2011-12-01

    Biomedical Informatics is a new discipline that arose from the need to incorporate information technologies into the generation, storage, distribution and analysis of information in the domain of biomedical sciences. This discipline comprises basic biomedical informatics and public health informatics. The development of the discipline in Chile has been modest and most projects have originated from the interest of individual people or institutions, without systematic and coordinated national development. Considering the unique features of the health care system of our country, research in the area of biomedical informatics is becoming an imperative.

  20. Biomedical Imaging,

    DTIC Science & Technology

    ...precision required from the task. This report details the technologies in surface and subsurface imaging systems for research and commercial applications. Keywords: biomedical imaging, anthropometry, computer imaging.

  1. Uncovering brain-heart information through advanced signal and image processing.

    PubMed

    Valenza, Gaetano; Toschi, Nicola; Barbieri, Riccardo

    2016-05-13

    Through their dynamical interplay, the brain and the heart ensure fundamental homeostasis and mediate a number of physiological functions as well as their disease-related aberrations. Although a vast number of ad hoc analytical and computational tools have been recently applied to the non-invasive characterization of brain and heart dynamic functioning, little attention has been devoted to combining information to unveil the interactions between these two physiological systems. This theme issue collects contributions from leading experts dealing with the development of advanced analytical and computational tools in the field of biomedical signal and image processing. It includes perspectives on recent advances in 7 T magnetic resonance imaging as well as electroencephalogram, electrocardiogram and cerebrovascular flow processing, with the specific aim of elucidating methods to uncover novel biological and physiological correlates of brain-heart physiology and physiopathology.

  2. Uncovering brain–heart information through advanced signal and image processing

    PubMed Central

    Toschi, Nicola; Barbieri, Riccardo

    2016-01-01

    Through their dynamical interplay, the brain and the heart ensure fundamental homeostasis and mediate a number of physiological functions as well as their disease-related aberrations. Although a vast number of ad hoc analytical and computational tools have been recently applied to the non-invasive characterization of brain and heart dynamic functioning, little attention has been devoted to combining information to unveil the interactions between these two physiological systems. This theme issue collects contributions from leading experts dealing with the development of advanced analytical and computational tools in the field of biomedical signal and image processing. It includes perspectives on recent advances in 7 T magnetic resonance imaging as well as electroencephalogram, electrocardiogram and cerebrovascular flow processing, with the specific aim of elucidating methods to uncover novel biological and physiological correlates of brain–heart physiology and physiopathology. PMID:27044995

  3. Is complex signal processing for bone conduction hearing aids useful?

    PubMed

    Kompis, Martin; Kurz, Anja; Pfiffner, Flurin; Senn, Pascal; Arnold, Andreas; Caversaccio, Marco

    2014-05-01

    To establish whether complex signal processing is beneficial for users of bone anchored hearing aids. Review and analysis of two studies from our own group, each comparing a speech processor with basic digital signal processing (either Baha Divino or Baha Intenso) and a processor with complex digital signal processing (either Baha BP100 or Baha BP110 power). The main differences between basic and complex signal processing are the number of audiologist accessible frequency channels and the availability and complexity of the directional multi-microphone noise reduction and loudness compression systems. Both studies show a small, statistically non-significant improvement of speech understanding in quiet with the complex digital signal processing. The average improvement for speech in noise is +0.9 dB, if speech and noise are emitted both from the front of the listener. If noise is emitted from the rear and speech from the front of the listener, the advantage of the devices with complex digital signal processing as opposed to those with basic signal processing increases, on average, to +3.2 dB (range +2.3 … +5.1 dB, p ≤ 0.0032). Complex digital signal processing does indeed improve speech understanding, especially in noise coming from the rear. This finding has been supported by another study, which has been published recently by a different research group. When compared to basic digital signal processing, complex digital signal processing can increase speech understanding of users of bone anchored hearing aids. The benefit is most significant for speech understanding in noise.

  4. [Dynamic Pulse Signal Processing and Analyzing in Mobile System].

    PubMed

    Chou, Yongxin; Zhang, Aihua; Ou, Jiqing; Qi, Yusheng

    2015-09-01

    In order to derive a dynamic pulse rate variability (DPRV) signal from a dynamic pulse signal in real time, a method for extracting the DPRV signal was proposed and a portable mobile monitoring system was designed. The system consists of a front end for collecting and wirelessly transmitting the pulse signal and a mobile terminal. The proposed method is employed to extract the DPRV from the dynamic pulse signal in the mobile terminal, and the DPRV signal is analyzed in the time domain, in the frequency domain and with non-linear methods in real time. The results show that the proposed method can accurately derive the DPRV signal in real time and that the system can be used for processing and analyzing the DPRV signal in real time.
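
    A minimal way to obtain a pulse-rate-variability series from a pulse waveform is peak detection followed by differencing of the peak times; the Python sketch below illustrates this idea with scipy.signal.find_peaks and a synthetic pulse-like waveform. The sampling rate, peak-distance and prominence thresholds are assumptions, and this is not the authors' real-time algorithm.

```python
# Sketch only: pulse-rate-variability intervals from peak detection.
import numpy as np
from scipy.signal import find_peaks

def pulse_rate_variability(pulse, fs):
    """Return successive peak-to-peak intervals (seconds) of a pulse signal."""
    # Require peaks at least 0.4 s apart (~150 bpm maximum, assumed)
    peaks, _ = find_peaks(pulse, distance=int(0.4 * fs), prominence=0.3)
    return np.diff(peaks) / fs

fs = 100.0
t = np.arange(0, 30, 1 / fs)
pulse = np.maximum(np.sin(2 * np.pi * 1.2 * t), 0.0) ** 3   # crude pulse-like waveform
intervals = pulse_rate_variability(pulse, fs)
print(f"mean interval: {intervals.mean():.3f} s, SDNN: {intervals.std():.3f} s")
```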

  5. High Speed Cascadable Signal Processing Circuits

    DTIC Science & Technology

    1988-04-29

    ...a general purpose data processor. The Vector Processing Hardware (VPH) is a speed-optimized architecture capable of processing vectors of complex data...

  6. Time reversal signal processing for communication.

    SciTech Connect

    Young, Derek P.; Jacklin, Neil; Punnoose, Ratish J.; Counsil, David T.

    2011-09-01

    Time-reversal is a wave focusing technique that makes use of the reciprocity of wireless propagation channels. It works particularly well in a cluttered environment with associated multipath reflection. This technique uses the multipath in the environment to increase focusing ability. Time-reversal can also be used to null signals, either to reduce unintentional interference or to prevent eavesdropping. It does not require controlled geometric placement of the transmit antennas. Unlike existing techniques it can work without line-of-sight. We have explored the performance of time-reversal focusing in a variety of simulated environments. We have also developed new algorithms to simultaneously focus at a location while nulling at an eavesdropper location. We have experimentally verified these techniques in a realistic cluttered environment.
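
    The focusing idea can be demonstrated in a few lines: retransmitting the time-reversed channel impulse response makes the field at the original source the autocorrelation of the channel, which peaks sharply. The Python sketch below uses a toy sparse multipath channel and is purely illustrative of the principle, not the authors' algorithms.

```python
# Sketch only: time-reversal focusing through a toy multipath channel.
import numpy as np

rng = np.random.default_rng(6)
h = np.zeros(200)                                  # sparse multipath impulse response
taps = rng.choice(200, size=8, replace=False)
h[taps] = rng.uniform(0.2, 1.0, size=8)

probe = np.zeros(200)
probe[0] = 1.0                                     # probe pulse sent from the focus point
recorded = np.convolve(probe, h)                   # waveform recorded at the array
retransmit = recorded[::-1]                        # time reversal
at_focus = np.convolve(retransmit, h)              # field back at the focus point

peak = np.abs(at_focus).max()
sidelobe = np.sort(np.abs(at_focus))[-2]
print(f"peak-to-sidelobe ratio at the focus: {peak / sidelobe:.1f}")
```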

  7. Statistical mechanics and visual signal processing

    NASA Astrophysics Data System (ADS)

    Potters, Marc; Bialek, William

    1994-11-01

    We show how to use the language of statistical field theory to address and solve problems in which one must estimate some aspect of the environment from the data in an array of sensors. In the field theory formulation the optimal estimator can be written as an expectation value in an ensemble where the input data act as an external field. Problems at low signal-to-noise ratio can be solved in perturbation theory, while high signal-to-noise ratios are treated with a saddle-point approximation. These ideas are illustrated in detail by an example of visual motion estimation which is chosen to model a problem solved by the fly's brain. The optimal estimator has a rich structure, adapting to various parameters of the environment such as the mean-square contrast and the correlation time of contrast fluctuations. This structure is in qualitative accord with existing measurements on motion-sensitive neurons in the fly's brain, and the adaptive properties of the optimal estimator may help resolve conflicts among different interpretations of these data. Finally we propose some crucial direct tests of the adaptive behavior.

  8. Processing electrophysiological signals for the monitoring of alertness

    NASA Technical Reports Server (NTRS)

    Lai, D. C.

    1974-01-01

    Mathematical techniques are described for processing EEG signals associated with varying states of alertness. Fast algorithms for implementing real-time computation of alertness estimates were developed. A realization of the phase-distortionless digital filter that approaches real-time filtering is presented, along with a transform for EEG signals; this transform provides information for the alertness estimates and can be performed in real time. A statistical test for stationarity in EEG signals is being developed that will provide a method for determining the duration of the EEG signals necessary for estimating the short-time power or energy spectra used in the nonstationary analysis of EEG signals.
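
    Phase-distortionless (zero-phase) filtering is nowadays most easily illustrated with forward-backward filtering; the Python sketch below applies a 1-30 Hz zero-phase band-pass to an EEG-like test signal. It is a modern stand-in for the filter realization described above, not the original algorithm, and the sampling rate and band edges are assumptions.

```python
# Sketch only: zero-phase (phase-distortionless) band-pass filtering.
import numpy as np
from scipy import signal

fs = 128.0                                   # Hz, assumed EEG sampling rate
t = np.arange(0, 10, 1 / fs)
eeg_like = np.sin(2 * np.pi * 10 * t) + 0.8 * np.random.randn(t.size)

# Forward-backward filtering cancels the filter's phase response entirely.
sos = signal.butter(4, [1, 30], btype="bandpass", fs=fs, output="sos")
filtered = signal.sosfiltfilt(sos, eeg_like)
print(f"input std: {eeg_like.std():.2f}, filtered std: {filtered.std():.2f}")
```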

  9. Neuromorphic opto-electronic integrated circuits for optical signal processing

    NASA Astrophysics Data System (ADS)

    Romeira, B.; Javaloyes, J.; Balle, S.; Piro, O.; Avó, R.; Figueiredo, J. M. L.

    2014-08-01

    The ability to produce narrow optical pulses has been extensively investigated in laser systems with promising applications in photonics such as clock recovery, pulse reshaping, and recently in photonics artificial neural networks using spiking signal processing. Here, we investigate a neuromorphic opto-electronic integrated circuit (NOEIC) comprising a semiconductor laser driven by a resonant tunneling diode (RTD) photo-detector operating at telecommunication (1550 nm) wavelengths capable of excitable spiking signal generation in response to optical and electrical control signals. The RTD-NOEIC mimics biologically inspired neuronal phenomena and possesses high-speed response and potential for monolithic integration for optical signal processing applications.

  10. Safety Assessment of Commonly Used Nanoparticles in Biomedical Applications: Impact on Inflammatory Processes

    NASA Astrophysics Data System (ADS)

    Alnasser, Yossef

    Nanotechnology offers great promise in the biomedical field. Current knowledge of nanoparticle (NP) safety and the possible mechanisms of toxicity of various particle types is insufficient. The roles of particle properties and of the route of particle administration in toxic reactions remain unexplored. In this thesis, we aimed to inspect the interrelationship between particle size, chemical composition and toxicological effects of four candidate NPs for drug delivery systems: gold (Au), chitosan, silica, and poly(lactide-co-glycolide) (PLGA). A mouse model was combined with an in vitro study to explore NP safety. We investigated mouse survival, weight, behavior, and pro-inflammatory changes. NF-kappaB induction was assessed in vitro using the Luciferase Assay System. As observed in mice, Au NPs had a higher toxicity profile at a shorter duration than the other NPs. This was significantly in concordance with pro-inflammatory changes, which may be the key route of Au NP toxicity. Although silica NPs induced NF-kappaB, they were less toxic to the mice than Au NPs and did not lead to pro-inflammatory changes. Chitosan NPs were toxic to the mice but failed to cause significant NF-kappaB induction and pro-inflammatory changes. These findings indicate that chitosan NPs might not have the same pathophysiologic mechanism as the Au NPs. Comparative analysis in this model demonstrated that PLGA NPs are the safest drug delivery candidate to be administered subcutaneously.

  11. New signal processing technique for density profile reconstruction using reflectometry

    SciTech Connect

    Clairet, F.; Bottereau, C.; Ricaud, B.; Briolle, F.; Heuraux, S.

    2011-08-15

    Reflectometry profile measurement requires an accurate determination of the plasma reflected signal. Along with a good resolution and a high signal to noise ratio of the phase measurement, adequate data analysis is required. A new data processing based on time-frequency tomographic representation is used. It provides a clearer separation between multiple components and improves isolation of the relevant signals. In this paper, this data processing technique is applied to two sets of signals coming from two different reflectometer devices used on the Tore Supra tokamak. For the standard density profile reflectometry, it improves the initialization process and its reliability, providing a more accurate profile determination in the far scrape-off layer with density measurements as low as 10^16 m^-3. For a second reflectometer, which provides measurements in front of a lower hybrid launcher, this method improves the separation of the relevant plasma signal from multi-reflection processes due to the proximity of the plasma.

  12. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2014-10-30

    A microcontroller board is used to control a laser scanner. A MATLAB program collects data at each pixel and constructs both amplitude and range images... development of signal processing algorithms for hybrid lidar-radar designed to improve detection performance... hardware implementation and underwater channel characteristics. Technical Approach: A significant challenge in hybrid lidar-radar is optical...

  13. Biomedical Imaging

    NASA Astrophysics Data System (ADS)

    MacPherson, Emma

    This chapter builds on the basic principles of THz spectroscopy and explains how they can be applied to biomedical systems as well as the motivation for doing so. Sample preparation techniques and measurement methods for biomedical samples are described in detail. Examples of medical applications investigated hitherto including breast cancer and skin cancer are also presented.

  14. Signal quality and Bayesian signal processing in neurofeedback based on real-time fMRI.

    PubMed

    Koush, Yury; Zvyagintsev, Mikhail; Dyck, Miriam; Mathiak, Krystyna A; Mathiak, Klaus

    2012-01-02

    Real-time fMRI allows analysis and visualization of the brain activity online, i.e. within one repetition time. It can be used in neurofeedback applications where subjects attempt to control an activation level in a specified region of interest (ROI) of their brain. The signal derived from the ROI is contaminated with noise and artifacts, namely with physiological noise from breathing and heart beat, scanner drift, motion-related artifacts and measurement noise. We developed a Bayesian approach to reduce noise and to remove artifacts in real-time using a modified Kalman filter. The system performs several signal processing operations: subtraction of constant and low-frequency signal components, spike removal and signal smoothing. Quantitative feedback signal quality analysis was used to estimate the quality of the neurofeedback time series and performance of the applied signal processing on different ROIs. The signal-to-noise ratio (SNR) across the entire time series and the group event-related SNR (eSNR) were significantly higher for the processed time series in comparison to the raw data. Applied signal processing improved the t-statistic increasing the significance of blood oxygen level-dependent (BOLD) signal changes. Accordingly, the contrast-to-noise ratio (CNR) of the feedback time series was improved as well. In addition, the data revealed increase of localized self-control across feedback sessions. The new signal processing approach provided reliable neurofeedback, performed precise artifacts removal, reduced noise, and required minimal manual adjustments of parameters. Advanced and fast online signal processing algorithms considerably increased the quality as well as the information content of the control signal which in turn resulted in higher contingency in the neurofeedback loop.
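
    A scalar Kalman filter with a random-walk state model captures the flavour of the online smoothing described above; the Python sketch below is illustrative only (the paper's filter is a modified version handling spikes and drift, and the process and measurement variances q and r here are arbitrary assumptions).

```python
# Sketch only: scalar Kalman filter for online smoothing of an ROI time series.
import numpy as np

def kalman_smooth(y, q=0.01, r=1.0):
    """Random-walk state model: x_t = x_{t-1} + w_t, y_t = x_t + v_t."""
    x_est, p = y[0], 1.0
    out = [x_est]
    for obs in y[1:]:
        p = p + q                         # predict (state covariance grows)
        k = p / (p + r)                   # Kalman gain
        x_est = x_est + k * (obs - x_est) # update with the new observation
        p = (1.0 - k) * p
        out.append(x_est)
    return np.array(out)

rng = np.random.default_rng(1)
bold = np.cumsum(0.05 * rng.standard_normal(200)) + rng.standard_normal(200)
print(kalman_smooth(bold)[:5])
```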

  15. A comb filter based signal processing method to effectively reduce motion artifacts from photoplethysmographic signals.

    PubMed

    Peng, Fulai; Liu, Hongyun; Wang, Weidong

    2015-10-01

    A photoplethysmographic (PPG) signal can provide very useful information about a subject's cardiovascular status. Motion artifacts (MAs), which usually deteriorate the waveform of a PPG signal, severely obstruct its applications in the clinical diagnosis and healthcare area. To reduce the MAs from a PPG signal, in the present study we present a comb filter based signal processing method. Firstly, wavelet de-noising was implemented to preliminarily suppress a part of the MAs. Then, the PPG signal in the time domain was transformed into the frequency domain by a fast Fourier transform (FFT). Thirdly, the PPG signal period was estimated from the frequency domain by tracking the fundamental frequency peak of the PPG signal. Lastly, the MAs were removed by the comb filter which was designed based on the obtained PPG signal period. Experiments with synthetic and real-world datasets were implemented to validate the performance of the method. Results show that the proposed method can effectively restore the PPG signals from the MA corrupted signals. Also, the accuracy of blood oxygen saturation (SpO2), calculated from red and infrared PPG signals, was significantly improved after the MA reduction by the proposed method. Our study demonstrates that the comb filter can effectively reduce the MAs from a PPG signal provided that the PPG signal period is obtained.
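
    The two key steps of the method, estimating the fundamental pulse frequency from the FFT peak and then applying a comb filter whose pass-bands sit at that fundamental and its harmonics, can be sketched as follows; the filter coefficients, search band and test signal are illustrative assumptions rather than the paper's design.

```python
# Sketch only: FFT-based fundamental estimation plus a feedback comb filter.
import numpy as np
from scipy.signal import lfilter

def comb_filter_ppg(ppg, fs, alpha=0.9):
    x = ppg - np.mean(ppg)
    # Fundamental = largest spectral peak in a plausible pulse band (0.5-3 Hz)
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    spec = np.abs(np.fft.rfft(x))
    band = (freqs > 0.5) & (freqs < 3.0)
    f0 = freqs[band][np.argmax(spec[band])]
    period = int(round(fs / f0))
    # Feedback comb filter: resonances at f0, 2*f0, 3*f0, ...
    b = [1.0 - alpha]
    a = np.zeros(period + 1)
    a[0], a[-1] = 1.0, -alpha
    return lfilter(b, a, x), f0

fs = 125.0
t = np.arange(0, 60, 1 / fs)
clean = np.sin(2 * np.pi * 1.3 * t)                  # ~78 bpm pulse component
motion = 0.8 * np.sin(2 * np.pi * 0.4 * t + 1.0)     # low-frequency motion artifact
filtered, f0 = comb_filter_ppg(clean + motion, fs)
print(f"estimated pulse frequency: {f0:.2f} Hz")
```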

  16. Nonlinear Real-Time Optical Signal Processing

    DTIC Science & Technology

    1990-09-01

    ...pattern recognition. Additional work concerns the relationship of parallel computation paradigms to optical computing and halftone screen techniques for implementing general nonlinear functions. A further section on nonlinear optical processing with halftones is concerned with degradation and compensation models...

  17. Parametric Techniques for Multichannel Signal Processing.

    DTIC Science & Technology

    1985-10-01

    Parameter Estimation," 7th IFAC Symposium on Identification and Sysstem Parameter Estimation, July 1985, York, United Kingdom. 21. B. Porat and B...1801 Page ill Road. Palo Alto, CA 94304. UrSA Boaz PORAT Department ot Electrical Engineering . Technwon, Israel Institute of Technology, Haifa...is a zero-mean white noise many engineering applications. A common model process with variance cr: , and v, is the observed for a wide-sense

  18. Synthetic biology in the analysis and engineering of signaling processes.

    PubMed

    Kämpf, Michael M; Weber, Wilfried

    2010-01-01

    Synthetic biology as the discipline of reconstructing natural and designing novel biological systems is gaining increasing impact in signaling science. This review article provides insight into synthetic approaches for analyzing and synthesizing signaling processes starting with strategies into how natural and pathological signaling pathways can be reconstructed in an evolutionary distant host to study their topology and function while avoiding interference with the original host background. In the second part we integrate synthetic strategies in the rewiring of signaling systems at the nucleic acid and protein level to reprogram cellular functions for biotechnological applications. The last part focuses on synthetic inter-cell and inter-species signaling devices and their integration into synthetic ecosystems to study fundamental mechanisms governing the co-existence of species. We finally address current bottlenecks in the (re-)design of signaling pathways and discuss future directions in signaling-related synthetic biology.

  19. The Signal Processing and Communications (SPC) toolbox, release 2

    NASA Astrophysics Data System (ADS)

    Brown, Dennis W.; Fargues, Monique P.

    1995-09-01

    The SPC (Signal Processing & Communications) toolbox is a software package designed to provide the user with a series of data manipulation tools that use MATLAB v.4 graphical user interface controls. SPC can be used in the classroom to illustrate and reinforce basic concepts in digital signal processing and communications. It frees users from having to write and debug their own code and gives them more time to understand the advantages and drawbacks of each technique included in the package. It can also be used as a basic analysis and modeling tool for research in signal processing.

  20. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks, including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop the hardware and software required for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing of noiselike microwave radiometer signals.

  1. Fault Tolerant Statistical Signal Processing Algorithms for Parallel Architectures.

    DTIC Science & Technology

    2014-09-26

    Fault Tolerant Statistical Signal Processing Algorithms for Parallel Architectures. Johns Hopkins University, Baltimore, MD, Department of Electrical Engineering. Keywords: fault tolerance, signal processing, parallel architecture.

  2. Conflicting interests involved in the process of publishing in biomedical journals.

    PubMed

    Igi, Rajko

    2015-01-01

    This short discussion on conflicting interests in publishing is designed to help all participants (authors, editors and peer reviewers) in the publication of biomedical papers. Authors who submit manuscripts to a journal are responsible for the overall quality and integrity of the paper. The main goal of the editor is to provide readers with the most relevant information by ensuring proper presentation and interpretation of scientific data. The editor informs readers of potential conflicting interests of the authors to enable the reader to judge a paper in a more informed way. However, the editor must also consider potential conflicting interests of peer reviewers. If a peer reviewer has a potential conflicting interest in evaluating a manuscript, he/she should not accept the job of reviewing it. If the editor or any member of the executive board has a similar conflict of interest for an article under consideration, including an editorial for this journal, such persons should not participate in the vote to endorse the article, and the journal should publish a note to that effect. When an article is published in the local language for a "small scientific community" there is always a risk that peer review could reflect personal relationships and animosities. Blinding the reviewer to the author(s) might eliminate a reviewer's conflict of interests, but this is not always possible or even desirable. A better solution would be to have the journal publish all scientific articles in English. This would provide both wider readership and a larger group of international reviewers. To gain better reviewers, the journal staff could educate young local investigators by publishing educational articles. Advantages and disadvantages of publishing a statement on conflicting interests are discussed.

  3. Physics-based signal processing algorithms for micromachined cantilever arrays

    DOEpatents

    Candy, James V; Clague, David S; Lee, Christopher L; Rudd, Robert E; Burnham, Alan K; Tringe, Joseph W

    2013-11-19

    A method of using physics-based signal processing algorithms for micromachined cantilever arrays. The methods utilize deflection of a micromachined cantilever that represents the chemical, biological, or physical element being detected. One embodiment of the method comprises the steps of modeling the deflection of the micromachined cantilever producing a deflection model, sensing the deflection of the micromachined cantilever and producing a signal representing the deflection, and comparing the signal representing the deflection with the deflection model.

  4. A Review on Sensor, Signal, and Information Processing Algorithms (PREPRINT)

    DTIC Science & Technology

    2010-01-01

    ...ratio, and tonality [66]. The number of speakers in the speech signals was determined by analyzing the modulation characteristics of the signals in... algorithms combined local decisions that were corrupted during the transmission process due to channel fading. Also, a new likelihood-ratio-based test was... maximizes the signal-to-noise ratio (SNR) at the fusion...

  5. Processing of physiological signals in automotive research.

    PubMed

    Dambier, Michael; Altmüller, Tobias; Ladstätter, Ulrich

    2006-12-01

    The development of innovative driver assistance systems requires the evaluation of hypotheses concerning, for example, acceptance and driving safety. For this purpose, experiments with end-users as subjects are necessary. Analysis and evaluation are based on the recording of numerous sensor values and system variables. Video, gaze and physiological data are recorded for the analysis of gaze distraction and emotional reactions of subjects to system behaviour. In this paper, a modular data streaming and processing architecture is suggested and a concept for this architecture is defined for consistent data evaluation, which integrates off-the-shelf products for data analysis and evaluation.

  6. Modeling laser velocimeter signals as triply stochastic Poisson processes

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1976-01-01

    Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
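
    The modelling idea above can be illustrated by drawing photoelectron counts from an inhomogeneous Poisson process whose rate follows a Doppler burst and comparing them with a Gaussian approximation of the same rate; all parameters in the Python sketch below are illustrative assumptions.

```python
# Sketch only: photon-limited LDV burst as an inhomogeneous Poisson process.
import numpy as np

rng = np.random.default_rng(7)
fs, f_doppler, n = 1e6, 60e3, 1000
t = np.arange(n) / fs
envelope = np.exp(-((t - t.mean()) / 150e-6) ** 2)                # burst envelope
rate = 0.5 * envelope * (1 + np.cos(2 * np.pi * f_doppler * t))   # photons per sample

poisson_signal = rng.poisson(rate)                        # photon-limited burst
gaussian_signal = rate + np.sqrt(rate) * rng.standard_normal(n)   # Gaussian approximation

print("Poisson total counts:", int(poisson_signal.sum()),
      "| Gaussian-model total:", round(float(gaussian_signal.sum()), 1))
```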

  7. The Behavioral Neuroscience of Anuran Social Signal Processing

    PubMed Central

    Wilczynski, Walter; Ryan, Michael J.

    2010-01-01

    Acoustic communication is the major component of social behavior in anuran amphibians (frogs and toads) and has served as a neuroethological model for the nervous system’s processing of social signals related to mate choice decisions. The male’s advertisement or mating call is its most conspicuous social signal, and the nervous system’s analysis of the call is a progressive process. As processing proceeds through neural systems, response properties become more specific to the signal and, in addition, neural activity gradually shifts from representing sensory (auditory periphery and brainstem) to sensorimotor (diencephalon) to motor (forebrain) components of a behavioral response. A comparative analysis of many anuran species shows that the first stage in biasing responses toward conspecific signals over heterospecific signals, and toward particular features of conspecific signals, lies in the tuning of the peripheral auditory system. Biases in processing signals are apparent through the brainstem auditory system, where additional feature detection neurons are added by the time processing reaches the level of the midbrain. Recent work using immediate early gene expression as a marker of neural activity suggests that by the level of the midbrain and forebrain, the differential neural representation of conspecific and heterospecific signals involves both changes in mean activity levels across multiple subnuclei, and in the functional correlations among acoustically active areas. Our data show that in frogs the auditory midbrain appears to play an important role in controlling behavioral responses to acoustic social signals by acting as a regulatory gateway between the stimulus analysis of the brainstem and the behavioral and physiological control centers of the forebrain. We predict that this will hold true for other vertebrate groups such as birds and fish that produce acoustic social signals, and perhaps also in fish where electroreception or vibratory sensing

  8. Industrial Applications Of Optical Signal Processing I

    NASA Astrophysics Data System (ADS)

    Javidi, Bahram

    1988-04-01

    Optical technology has emerged as a viable solution to the growing demand to increase the throughput of high speed processors and computers. Although higher speed and denser integrated circuits are being developed, it appears that faster switching speeds in digital circuits will not provide an adequate solution to the bottleneck problem of computing systems for such tasks as real-time distortion-invariant pattern recognition and associative memory. Even supercomputers using new computing architectures and subnanosecond gate delays do not have sufficient speed for such real-time operations. Optical systems offer the advantages of inherent parallelism and high speed with superior interconnection capability, which allow for the processing of millions of simultaneous operations. The lack of electromagnetic interference in optics is ideally suited for neural network based processors, which require a high degree of interconnectivity and global communications properties. Analog optical computers are particularly attractive for the processing of large stochastic data, while the more precise digital computers break down when confronted with such random problems. The immunity to electromagnetic interference can also be used advantageously in VLSI interconnections technology and board-to-board communications to reduce the pinout problem and to improve flexibility and performance. For these reasons, optical technology has become a major research and development effort at many industrial, government, and university laboratories both nationally and internationally.

  9. Understanding the effects of pre-processing on extracted signal features from gait accelerometry signals.

    PubMed

    Millecamps, Alexandre; Lowry, Kristin A; Brach, Jennifer S; Perera, Subashan; Redfern, Mark S; Sejdić, Ervin

    2015-07-01

    Gait accelerometry is an important approach for gait assessment. Previous contributions have adopted various pre-processing approaches for gait accelerometry signals, but none have thoroughly investigated the effects of such pre-processing operations on the obtained results. Therefore, this paper investigated the influence of pre-processing operations on signal features extracted from gait accelerometry signals. These signals were collected from 35 participants aged over 65 years: 14 of them were healthy controls (HC), 10 had Parkinson's disease (PD) and 11 had peripheral neuropathy (PN). The participants walked on a treadmill at preferred speed. Signal features in time, frequency and time-frequency domains were computed for both raw and pre-processed signals. The pre-processing stage consisted of applying tilt correction and denoising operations to the acquired signals. We first examined the effects of these operations separately, followed by an investigation of their joint effects. Several important observations were made based on the obtained results. First, the denoising operation alone had almost no effect in comparison to the trends observed in the raw data. Second, the tilt correction affected the reported results to a certain degree, which could lead to better discrimination between groups. Third, the combination of the two pre-processing operations yielded trends similar to those of the tilt correction alone. These results indicate that while gait accelerometry is a valuable approach for gait assessment, one has to carefully adopt pre-processing steps as they alter the observed findings.
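
    One plausible realization of the tilt-correction step discussed above is to rotate each record so that its mean acceleration (the gravity estimate) aligns with the vertical axis; the Python sketch below does this with Rodrigues' rotation formula. It is an assumption-laden illustration, not necessarily the authors' procedure.

```python
# Sketch only: tilt correction of tri-axial accelerometry via gravity alignment.
import numpy as np

def tilt_correct(acc):
    """acc: (N, 3) accelerometer samples in g. Returns rotated samples."""
    g = acc.mean(axis=0)
    g = g / np.linalg.norm(g)                  # unit gravity estimate
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(g, z)
    c = float(np.dot(g, z))
    if np.allclose(v, 0):
        return acc.copy()                      # already aligned with vertical
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    # Rodrigues' formula mapping the gravity estimate onto +z
    r = np.eye(3) + vx + vx @ vx * (1.0 / (1.0 + c))
    return acc @ r.T

tilted = np.tile([0.10, 0.05, 0.99], (100, 1)) + 0.01 * np.random.randn(100, 3)
print(tilt_correct(tilted).mean(axis=0))       # roughly [0, 0, ~1]
```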

  10. Advanced Integrated Optical Signal Processing Components.

    NASA Astrophysics Data System (ADS)

    Rastani, Kasra

    This research was aimed at the development of advanced integrated optical components suitable for devices capable of processing multi-dimensional inputs. In such processors, densely packed waveguide arrays with low crosstalk are needed to provide dissection of the information that has been partially processed. Waveguide arrays also expand the information in the plane of the processor while maintaining its coherence. Rib waveguide arrays with low loss, high mode confinement and highly uniform surface quality (660 elements, 8 μm wide, 1 μm high, and 1 cm long with 2 μm separations) were fabricated on LiNbO3 substrates through the ion beam milling technique. A novel feature of the multi-dimensional IO processor architecture proposed herein is the implementation of large area uniform outcoupling (with low to moderate outcoupling efficiencies) from rib waveguide arrays in order to access the third dimension of the processor structure. As a means of outcoupling, uniform surface gratings (2 μm and 4 μm grating periods, 0.05 μm high and 1 mm long) with low outcoupling efficiencies (of approximately 2-18%/mm) were fabricated on the nonuniform surface of the rib waveguide arrays. As a practical technique of modulating the low outcoupling efficiencies of the surface gratings, it was proposed to alter the period of the grating as a function of position along each waveguide. Large aperture (2.5 mm) integrated lenses with short positive focal lengths (1.2-2.5 cm) were developed through a modification of the titanium-indiffused proton exchanged (TIPE) technique. Such integrated lenses were fabricated by increasing the refractive index of the slab waveguides by the TIPE process while maintaining the refractive index of the lenses at the lower level of the Ti:LiNbO3 waveguide. By means of curvature reversal of the integrated lenses, positive focal length lenses have been fabricated while providing high mode confinement for the slab waveguide. The above elements performed as

  11. Multiwavelength micropulse lidar in atmospheric aerosol study: signal processing

    NASA Astrophysics Data System (ADS)

    Posyniak, Michal; Malinowski, Szymon P.; Stacewicz, Tadeusz; Markowicz, Krzysztof M.; Zielinski, Tymon; Petelski, Tomasz; Makuch, Przemyslaw

    2011-11-01

    A multiwavelength micropulse lidar (MML) designed for continuous optical sounding of the atmosphere is presented. A specific signal processing technique applying two-directional Kalman filtering is introduced in order to enhance the signal-to-noise ratio. Application of this technique is illustrated with profiles collected in the course of the COAST 2009 and WRNP 2010 research campaigns.

  12. HYMOSS signal processing for pushbroom spectral imaging

    NASA Technical Reports Server (NTRS)

    Ludwig, David E.

    1991-01-01

    The objective of the Pushbroom Spectral Imaging Program was to develop on-focal-plane electronics which compensate for detector array non-uniformities. The approach taken was to implement a simple two-point calibration algorithm on the focal plane which allows for offset and linear gain correction. The key on-focal-plane features which made this technique feasible were the use of a high-quality transimpedance amplifier (TIA) and an analog-to-digital converter for each detector channel. Gain compensation is accomplished by varying the feedback capacitance of the integrate-and-dump TIA. Offset correction is performed by storing offsets in a special on-focal-plane offset register and digitally subtracting the offsets from the readout data during the multiplexing operation. A custom integrated circuit was designed, fabricated, and tested on this program which proved that nonuniformity-compensated, analog-to-digital converting circuits may be used to read out infrared detectors. Irvine Sensors Corporation (ISC) successfully demonstrated the following innovative on-focal-plane functions that allow for correction of detector non-uniformities. Most of the circuit functions demonstrated on this program are finding their way onto future ICs because of their impact on reduced downstream processing, increased focal plane performance, simplified focal plane control, reduced number of dewar connections, as well as the noise immunity of a digital interface dewar. The potential commercial applications for this integrated circuit are primarily in imaging systems. These imaging systems may be used for security monitoring, manufacturing process monitoring, robotics, and spectral imaging when used in analytical instrumentation.
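
    The two-point calibration idea (per-detector gain and offset derived from two reference levels) can be written in a few lines; the Python sketch below uses synthetic cold and hot reference frames and is illustrative only, not the on-focal-plane circuit implementation.

```python
# Sketch only: two-point non-uniformity correction with cold/hot references.
import numpy as np

def two_point_correction(raw, cold_frame, hot_frame, t_cold=0.0, t_hot=1.0):
    """Map raw detector counts to a common response using two reference frames."""
    gain = (t_hot - t_cold) / (hot_frame - cold_frame)   # per-pixel gain correction
    offset = t_cold - gain * cold_frame                  # per-pixel offset correction
    return gain * raw + offset

rng = np.random.default_rng(2)
true_gain = 1.0 + 0.1 * rng.standard_normal((4, 4))      # per-pixel gain spread
true_offset = 5.0 * rng.standard_normal((4, 4))          # per-pixel offsets
scene = 0.6                                              # uniform scene level
cold = true_gain * 0.0 + true_offset
hot = true_gain * 1.0 + true_offset
raw = true_gain * scene + true_offset
print(two_point_correction(raw, cold, hot))              # ~0.6 everywhere
```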

  13. Novel sonar signal processing tool using Shannon entropy

    SciTech Connect

    Quazi, A.H.

    1996-06-01

    Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy or frequency domain quantities obtained using spectral analysis techniques. The object is to investigate an alternate approach which is entirely different than that of traditional signal processing. This alternate approach is to utilize the Shannon entropy as a tool for the processing of sonar signals with emphasis on detection, classification, and localization leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, and incoherently, depending upon the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique will be based on the concept of the entropy of the random process. Under a constant energy constraint, the entropy of a received process bearing a finite number of sample points is maximum when hypothesis H0 (that the received process consists of noise alone) is true and decreases when a correlated signal is present (H1). Therefore, the strategy used for detection is: (I) calculate the entropy of the received data; then, (II) compare the entropy with the maximum value; and, finally, (III) make a decision: H1 is assumed if the difference is large compared to a pre-assigned threshold and H0 is otherwise assumed. The test statistics will be different between entropies under H0 and H1. Here, we shall show the simulated results for detecting stationary and non-stationary signals in noise, and results on detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
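
    A minimal digital illustration of the detection strategy described above: estimate the Shannon entropy of the (energy-normalized) received samples from a histogram and declare a detection when it drops sufficiently below the noise-only value. The bin grid, signal level and threshold in the Python sketch below are illustrative assumptions.

```python
# Sketch only: entropy-based detection with histogram entropy estimates.
import numpy as np

def shannon_entropy(x, bins=np.linspace(-4.0, 4.0, 33)):
    x = (x - x.mean()) / x.std()              # constant-energy normalization
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
n = 4096
noise_only = rng.standard_normal(n)                               # hypothesis H0
tone = 3.0 * np.sin(2 * np.pi * 0.05 * np.arange(n))
signal_plus_noise = tone + rng.standard_normal(n)                 # hypothesis H1

h0 = shannon_entropy(noise_only)
h1 = shannon_entropy(signal_plus_noise)
print(f"entropy under H0: {h0:.2f} bits, under H1: {h1:.2f} bits")
print("declare H1 (entropy deficit above threshold):", (h0 - h1) > 0.1)
```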

  14. Optical Signal Processing: Poisson Image Restoration and Shearing Interferometry

    NASA Technical Reports Server (NTRS)

    Hong, Yie-Ming

    1973-01-01

    Optical signal processing can be performed in either digital or analog systems. Digital computers and coherent optical systems are discussed as they are used in optical signal processing. Topics include: image restoration; phase-object visualization; image contrast reversal; optical computation; image multiplexing; and fabrication of spatial filters. Digital optical data processing deals with restoration of images degraded by signal-dependent noise. When the input data of an image restoration system are the numbers of photoelectrons received from various areas of a photosensitive surface, the data are Poisson distributed with mean values proportional to the illuminance of the incoherently radiating object and background light. Optical signal processing using coherent optical systems is also discussed. Following a brief review of the pertinent details of Ronchi's diffraction grating interferometer, moire effect, carrier-frequency photography, and achromatic holography, two new shearing interferometers based on them are presented. Both interferometers can produce variable shear.

  15. An algorithm for signal processing in multibeam antenna arrays

    NASA Astrophysics Data System (ADS)

    Danilevskii, L. N.; Domanov, Iu. A.; Korobko, O. V.

    1980-09-01

    A signal processing method for multibeam antenna arrays is presented which can be used to effectively reduce discrete-phasing sidelobes. Calculations of an 11-element array are presented as an example.

  16. Signal processing techniques for synchronization of wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Lee, Jaehan; Wu, Yik-Chung; Chaudhari, Qasim; Qaraqe, Khalid; Serpedin, Erchin

    2010-11-01

    Clock synchronization is a critical component in wireless sensor networks, as it provides a common time frame to different nodes. It supports functions such as fusing voice and video data from different sensor nodes, time-based channel sharing, and sleep/wake-up scheduling. Early studies on clock synchronization for wireless sensor networks mainly focused on protocol design. However, the clock synchronization problem is inherently related to parameter estimation, and recently, studies of clock synchronization from the signal processing viewpoint have started to emerge. In this article, a survey of the latest advances in clock synchronization is provided from a signal processing viewpoint. We demonstrate that many existing and intuitive clock synchronization protocols can be interpreted through common statistical signal processing methods. Furthermore, the use of advanced signal processing techniques for deriving optimal clock synchronization algorithms under challenging scenarios is illustrated.
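
    The estimation viewpoint advocated above is easiest to see in the classic two-way message exchange, where the clock offset is estimated from four timestamps per round; the Python sketch below averages that estimator over several rounds (the delays, noise levels and number of rounds are assumptions).

```python
# Sketch only: clock-offset estimation from two-way timestamp exchanges.
import numpy as np

def estimate_offset(t1, t2, t3, t4):
    """Per-round clock-offset estimates.

    t1: send time at node A, t2: receive time at node B,
    t3: reply time at node B, t4: receive time at node A.
    """
    return ((t2 - t1) - (t4 - t3)) / 2.0

rng = np.random.default_rng(4)
rounds, true_offset, prop_delay = 50, 3.2e-3, 1.0e-3
t1 = np.arange(rounds) * 0.1
t2 = t1 + prop_delay + true_offset + 1e-4 * rng.standard_normal(rounds)
t3 = t2 + 0.01
t4 = t3 + prop_delay - true_offset + 1e-4 * rng.standard_normal(rounds)
print(f"estimated offset: {estimate_offset(t1, t2, t3, t4).mean():.4f} s")
```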

  17. Comparative analysis of genomic signal processing for microarray data clustering.

    PubMed

    Istepanian, Robert S H; Sungoor, Ala; Nebel, Jean-Christophe

    2011-12-01

    Genomic signal processing is a new area of research that applies advanced digital signal processing methodologies to enhance genetic data analysis. It has many promising applications in bioinformatics and the next generation of healthcare systems, in particular in the field of microarray data clustering. In this paper we present a comparative performance analysis of enhanced digital spectral analysis methods for robust clustering of gene expression across multiple microarray data samples. Three digital signal processing methods (linear predictive coding, wavelet decomposition, and fractal dimension) are studied to provide a comparative evaluation of their clustering performance on several microarray datasets. The results of this study show that the fractal approach provides the best clustering accuracy compared to the other digital signal processing methods and to well-known statistical methods.
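
    The sketch below illustrates one of the three feature types in a simplified form: a Katz fractal-dimension estimate computed per expression profile, followed by k-means clustering. It is not the authors' pipeline; the synthetic data, the particular fractal-dimension variant, and the cluster count are assumptions made for illustration.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def katz_fd(x):
        """Katz fractal dimension of a 1-D profile (one common variant,
        treating the profile as a planar curve with unit sample spacing)."""
        x = np.asarray(x, dtype=float)
        n = len(x) - 1                                     # number of steps along the curve
        dists = np.sqrt(1.0 + np.diff(x) ** 2)
        L = dists.sum()                                    # total curve length
        idx = np.arange(1, len(x))
        d = np.sqrt(idx ** 2 + (x[1:] - x[0]) ** 2).max()  # planar extent of the curve
        return np.log10(n) / (np.log10(n) + np.log10(d / L))

    # Hypothetical usage: each row of `expression` is one gene profile across samples.
    rng = np.random.default_rng(2)
    expression = np.vstack([rng.standard_normal((20, 60)),                       # "noisy" profiles
                            np.cumsum(rng.standard_normal((20, 60)), axis=1)])   # smoother profiles
    features = np.array([[katz_fd(g)] for g in expression])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    print(labels)    # the two profile types should fall into two clusters
    ```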

  18. 28. Perimeter acquisition radar building room #302, signal process and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. Perimeter acquisition radar building room #302, signal process and analog receiver room - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  19. Array signal processing in the NASA Deep Space Network

    NASA Technical Reports Server (NTRS)

    Pham, Timothy T.; Jongeling, Andre P.

    2004-01-01

    In this paper, we describe the benefits of arraying and its past as well as expected future use. The signal processing aspects of the array system are described. Field measurements from actual spacecraft tracking are also presented.

  20. Synthetic aperture radar signal processing: Trends and technologies

    NASA Technical Reports Server (NTRS)

    Curlander, John C.

    1993-01-01

    An overview of synthetic aperture radar (SAR) technology is presented in vugraph form. The following topics are covered: an SAR ground data system; SAR signal processing algorithms; SAR correlator architectures; and current and future trends.

  2. Functional description of signal processing in the Rogue GPS receiver

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1988-01-01

    Over the past year, two Rogue GPS prototype receivers have been assembled and successfully subjected to a variety of laboratory and field tests. A functional description is presented of signal processing in the Rogue receiver, tracing the signal from RF input to the output values of group delay, phase, and data bits. The receiver can track up to eight satellites, without time multiplexing among satellites or channels, simultaneously measuring both group delay and phase for each of three channels (L1-C/A, L1-P, L2-P). The Rogue signal processing described requires generation of the code for all three channels. Receiver functional design, which emphasized accuracy, reliability, flexibility, and dynamic capability, is summarized. A detailed functional description of signal processing is presented, including C/A-channel and P-channel processing, carrier-aided averaging of group delays, checks for cycle slips, acquisition, and distinctive features.

  3. Challenges to Control in Signal Processing and Communications

    DTIC Science & Technology

    1986-09-01

    September 1986, LIDS-P-1591. Challenges to Control in Signal Processing and Communications, Alan S. Willsky, Laboratory for Information and Decision... In my plenary address at the Dec. 1981 IEEE Conference on Decision and Control I presented my views on why I felt that the control community

  4. Waveguide Studies for Fiber Optics and Optical Signal Processing Applications.

    DTIC Science & Technology

    1980-04-01

    The beam expander is shown in Fig. 2-i. The beam, which is expanded to approximately 100 μm, can be deflected acousto-optically to make a spectrum analyzer... Contents include: Wave Beam Expansion; DBR Lasers for Fiber Optics and Optical Signal Processing Sources; Studies of LiNbO3...

  5. The Signal Processing Firmware for the Low Frequency Aperture Array

    NASA Astrophysics Data System (ADS)

    Comoretto, Gianni; Chiello, Riccardo; Roberts, Matt; Halsall, Rob; Adami, Kristian Zarb; Alderighi, Monica; Aminaei, Amin; Baker, Jeremy; Belli, Carolina; Chiarucci, Simone; D’Angelo, Sergio; De Marco, Andrea; Mura, Gabriele Dalle; Magro, Alessio; Mattana, Andrea; Monari, Jader; Naldi, Giovanni; Pastore, Sandro; Perini, Federico; Poloni, Marco; Pupillo, Giuseppe; Rusticelli, Simone; Schiaffino, Marco; Schillirò, Francesco; Zaccaro, Emanuele

    The signal processing firmware that has been developed for the Low Frequency Aperture Array component of the Square Kilometre Array (SKA) is described. The firmware is implemented on a dual-FPGA board that is capable of processing the streams from 16 dual-polarization antennas. Data processing includes channelization of the sampled data for each antenna, correction for instrumental response and for geometric delays, and formation of one or more beams by combining the aligned streams. The channelizer uses an oversampling polyphase filterbank architecture, allowing frequency-continuous processing of the input signal without discontinuities between spectral channels. Each board processes the streams from 16 antennas as part of a larger beamforming system, linked by standard Ethernet interconnections. There are envisaged to be 8192 of these signal processing platforms in the first phase of the SKA, so particular attention has been devoted to ensuring that the design is low cost and low power.
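
    A simplified, critically sampled polyphase filterbank channelizer is sketched below to illustrate the channelization step; the LFAA firmware uses an oversampled variant, and the channel count, tap count, and prototype filter here are illustrative assumptions.

    ```python
    import numpy as np

    def pfb_channelize(x, n_chan=16, n_taps=8):
        """Critically sampled polyphase filterbank channelizer (a simplified sketch;
        the LFAA firmware uses an oversampled variant to avoid gaps between channels).
        Returns an array of shape (n_blocks, n_chan) of complex channel samples."""
        m = n_chan * n_taps
        # Prototype low-pass filter: windowed sinc spanning n_taps blocks of n_chan samples.
        proto = np.sinc(np.arange(m) / n_chan - n_taps / 2) * np.hamming(m)
        proto = proto.reshape(n_taps, n_chan)

        n_blocks = len(x) // n_chan - n_taps + 1
        out = np.empty((n_blocks, n_chan), dtype=complex)
        for b in range(n_blocks):
            seg = x[b * n_chan:(b + n_taps) * n_chan].reshape(n_taps, n_chan)
            out[b] = np.fft.fft((seg * proto).sum(axis=0))
        return out

    # Toy input: a complex test tone that should appear concentrated in one output channel.
    fs, n_chan = 800.0, 16
    t = np.arange(16384) / fs
    spectra = pfb_channelize(np.exp(2j * np.pi * 150.0 * t), n_chan=n_chan)
    print(np.argmax(np.abs(spectra).mean(axis=0)))   # expected channel 3 (= 150 Hz / 50 Hz spacing)
    ```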

  6. Digital Processing of Weak Signals Buried in Noise

    NASA Astrophysics Data System (ADS)

    Emerson, Darrel

    This article describes the use of digital signal processing to pull the AMSAT AO-13 ZRO test signal out of the noise. In the ZRO tests, a signal is transmitted from the Oscar 13 satellite at progressively lower power levels, in 3 dB steps. The challenge is to decode successfully the weakest possible signal. The signal from the receiver audio was digitized using a Sound Blaster card, then filtered with a modified FFT routine. The modification was to allow the pre-detection filter to follow the slowly drifting signal. After using the matched, sliding filter before detection, the post-detection signal was passed through another matched filter. Finally, a cross-correlation technique comparing the detected, filtered signal with every possible combination of ZRO signal was applied, taking also into account a gradual drift of CW sending speed. The final, statistically most probable, solution turned out to be correct. This gave the only successful detection of level A transmissions from Oscar 13 so far (Aug 1996.) The extensive digital processing partly made up for the relatively poor receiving antenna; a 10-element 146 MHz Yagi, part of the Cushcraft AOP-1 combination.
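
    The sketch below illustrates the general idea of pulling a weak tone out of noise by incoherent averaging of short FFT power spectra; it is far simpler than the drift-tracking matched filters and cross-correlation described above, and the sample rate, tone level, and block sizes are illustrative.

    ```python
    import numpy as np

    # A minimal sketch: incoherent averaging of short FFT power spectra reveals a
    # weak CW tone (about -29 dB relative to the total noise power). This stands in
    # for, and is much simpler than, the sliding drift-tracking matched FFT filter
    # described above; all parameters are illustrative.
    rng = np.random.default_rng(3)
    fs, n_fft, n_blocks = 8000.0, 4096, 64
    t = np.arange(n_fft * n_blocks) / fs
    x = 0.05 * np.sin(2 * np.pi * 700.0 * t) + rng.standard_normal(len(t))

    psd = np.zeros(n_fft // 2)
    for b in range(n_blocks):
        seg = x[b * n_fft:(b + 1) * n_fft] * np.hanning(n_fft)
        psd += np.abs(np.fft.rfft(seg)[:n_fft // 2]) ** 2
    psd /= n_blocks

    peak_bin = np.argmax(psd)
    print(peak_bin * fs / n_fft)   # expected near 700 Hz once averaging has reduced the noise variance
    ```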

  7. Signal-processing theory for the TurboRogue receiver

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1995-01-01

    Signal-processing theory for the TurboRogue receiver is presented. The signal form is traced from its formation at the GPS satellite, to the receiver antenna, and then through the various stages of the receiver, including extraction of phase and delay. The analysis treats the effects of ionosphere, troposphere, signal quantization, receiver components, and system noise, covering processing in both the 'code mode' when the P code is not encrypted and in the 'P-codeless mode' when the P code is encrypted. As a possible future improvement to the current analog front end, an example of a highly digital front end is analyzed.

  8. Conjugate spectrum filters for eddy current signal processing

    SciTech Connect

    Stepinski, T.; Maszi, N. (Dept. of Technology)

    1993-07-01

    The paper addresses the problem of detection and classification of material defects during eddy current inspection. Digital signal processing algorithms for detection and characterization of flaws are considered and a new type of filter for classification of eddy current data is proposed. In the first part of the paper the signal processing blocks used in modern eddy current instruments are presented and analyzed in terms of information transmission. The processing usually consists of two steps: detection by means of amplitude-phase detectors and filtering of the detector output signals by means of analog signal filters. Distortion introduced by the signal filters is considered and illustrated using real eddy current responses. The nature of the distortion is explained and the way to avoid it is indicated. A novel method for processing the eddy current responses is presented in the second part of the paper. The method employs two-dimensional conjugate spectrum filters (CSFs) that are sensitive both to the phase angle and to the shape of the eddy current responses. First the theoretical background of the CSF is presented, and then two different modes of application, matched filters and orthogonal conjugate spectrum filters, are considered. The matched CSFs can be used to attenuate all signals with a phase angle different from that of the selected prototype. The orthogonal filters are able to completely suppress a specific interference, e.g., the response of a supporting plate when testing heat-exchanger tubes. The performance of the CSFs is illustrated by means of real and simulated eddy current signals.

  9. Signal and image processing for early detection of coronary artery diseases: A review

    NASA Astrophysics Data System (ADS)

    Mobssite, Youness; Samir, B. Belhaouari; Mohamad Hani, Ahmed Fadzil B.

    2012-09-01

    Today, biomedical signal- and image-based detection is a basic step in diagnosing heart diseases, in particular coronary artery disease. The goal of this work is to provide non-invasive early detection of coronary artery disease by analyzing images and ECG signals in a combined approach to extract features and then classify and quantify disease severity using a B-spline method, with the aim of creating a prototype for biomedical imaging-based screening of the coronary arteries that helps cardiologists decide on the kind of treatment needed to reduce or control the risk of heart attack.

  10. A survey of the economics of materials processing in space. [accenting biomedical materials

    NASA Technical Reports Server (NTRS)

    Miller, B. P.

    1975-01-01

    A survey of the economics of space materials processing has been performed with the objectives of identifying those areas of space materials processing that give preliminary indication of significant economic potential and of identifying possible approaches to quantifying that potential. It is concluded that limited economic studies have been performed to date, primarily in the area of the processing of inorganic materials, and that the economics of space processing of biological material has not received adequate attention. Specific studies are recommended to evaluate the economic impact of human lymphocyte subgroup separation on organ transplantation, and of the separation and concentration of urokinase-producing cells.

  12. UCMS - A new signal parameter measurement system using digital signal processing techniques. [User Constraint Measurement System

    NASA Technical Reports Server (NTRS)

    Choi, H. J.; Su, Y. T.

    1986-01-01

    The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.

  14. A comparison of signal processing techniques for Intrinsic Optical Signal imaging in mice.

    PubMed

    Turley, Jordan A; Nilsson, Michael; Walker, Frederick Rohan; Johnson, Sarah J

    2015-01-01

    Intrinsic Optical Signal imaging is a technique which allows the visualisation and mapping of activity-related changes within the brain with excellent spatial and temporal resolution. We analysed a variety of signal and image processing techniques applied to real mouse imaging data. The results were compared in an attempt to overcome the unique issues faced when performing the technique in mice and to improve understanding of the post-processing options available.

  15. Synthesis and characterization of Ti-27.5Nb alloy made by CLAD® additive manufacturing process for biomedical applications.

    PubMed

    Fischer, M; Laheurte, P; Acquier, P; Joguet, D; Peltier, L; Petithory, T; Anselme, K; Mille, P

    2017-06-01

    Biocompatible beta-titanium alloys such as Ti-27.5(at.%)Nb are good candidates for implantology and arthroplasty applications, as their particular mechanical properties, including a low Young's modulus, could significantly reduce the stress-shielding phenomenon that usually occurs after surgery. The CLAD® process is a powder-blown additive manufacturing process that allows the manufacture of patient-specific (i.e. custom) implants. Thus, the use of Ti-27.5(at.%)Nb alloy formed by the CLAD® process for biomedical applications, as a means of increasing cytocompatibility and mechanical biocompatibility, was investigated in this study. The microstructural properties of the CLAD-deposited alloy were studied with optical microscopy and electron back-scattered diffraction (EBSD) analysis. The mechanical properties of the Ti-27.5Nb material after the transformation steps (ingot, powder atomisation, CLAD) were verified with tensile tests and appear to remain close to those of the reference material. Cytocompatibility and subsequent cell viability tests showed that no cytotoxic elements are released into the medium and that viable cells proliferate well. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Reaching for the cloud: on the lessons learned from grid computing technology transfer process to the biomedical community.

    PubMed

    Mohammed, Yassene; Dickmann, Frank; Sax, Ulrich; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which led to the creation of the Grid. The inter-domain transfer process of this technology has hitherto been an intuitive process without in-depth analysis. Some difficulties facing the life science community in this transfer can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies that have achieved a certain stability. Grid and Cloud solutions are technologies that are still in flux. We show how Grid computing creates new difficulties in the transfer process that are not considered in Bozeman's model. We show why the success of healthgrids should be measured by the qualified scientific human capital and the opportunities created, and not primarily by the market impact. We conclude with recommendations that can help improve the adoption of Grid and Cloud solutions by the biomedical community. These results give a more concise explanation of the difficulties many life science IT projects face in the late funding periods, and show leveraging steps that can help overcome the "vale of tears".

  17. Removing Background Noise with Phased Array Signal Processing

    NASA Technical Reports Server (NTRS)

    Podboy, Gary; Stephens, David

    2015-01-01

    Preliminary results are presented from a test conducted to determine how well microphone phased-array processing software could pull an acoustic signal out of background noise. The array consisted of 24 microphones in an aerodynamic fairing designed to be mounted in-flow. The processing was conducted using Functional Beamforming software developed by OptiNav combined with cross-spectral matrix subtraction. The test was conducted in the free jet of the Nozzle Acoustic Test Rig at NASA GRC. The background noise was produced by the interaction of the free-jet flow with the solid surfaces in the flow. The acoustic signals were produced by acoustic drivers. The results show that the phased-array processing was able to pull the acoustic signal out of the background noise provided the signal was no more than 20 dB below the background noise level measured using a conventional single microphone equipped with an aerodynamic forebody.
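
    A minimal sketch of conventional frequency-domain beamforming with background cross-spectral-matrix subtraction is given below; it is not the Functional Beamforming algorithm used in the test, and the array geometry, analysis frequency, and source position are invented for illustration.

    ```python
    import numpy as np

    # Conventional frequency-domain beamforming with background cross-spectral-matrix
    # (CSM) subtraction; all geometry and levels are illustrative.
    rng = np.random.default_rng(4)
    c, f = 343.0, 2000.0                     # speed of sound (m/s), analysis frequency (Hz)
    mics = rng.uniform(-0.5, 0.5, (24, 2))   # 24 microphones in a 1 m x 1 m plane
    src = np.array([0.1, -0.2, 1.0])         # acoustic driver 1 m in front of the array

    def steering(mic_xy, grid_xy, z):
        """Steering vectors (n_grid, n_mics) for monopole sources at distance z."""
        dxy = grid_xy[:, None, :] - mic_xy[None, :, :]
        r = np.sqrt((dxy ** 2).sum(axis=2) + z ** 2)
        return np.exp(-2j * np.pi * f * r / c) / r

    # Simulated CSMs: incoherent background alone, and background plus the coherent source.
    a_src = steering(mics, src[None, :2], z=src[2])[0]
    csm_bg = np.diag(rng.uniform(1.0, 2.0, 24)).astype(complex)
    csm_tot = csm_bg + 10.0 * np.outer(a_src, a_src.conj())

    # Beamform over a scan grid after subtracting the background CSM.
    xs = np.linspace(-0.5, 0.5, 21)
    grid = np.array([(x, y) for x in xs for y in xs])
    A = steering(mics, grid, z=1.0)                         # (n_grid, n_mics)
    csm_clean = csm_tot - csm_bg
    power = np.real(np.einsum('gm,mn,gn->g', A.conj(), csm_clean, A))
    print(grid[np.argmax(power)])            # expected near the true source at (0.1, -0.2)
    ```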

  18. Common formalism for adaptive identification in signal processing and control

    NASA Astrophysics Data System (ADS)

    Macchi, O.

    1991-08-01

    The transversal and recursive approaches to adaptive identification are compared. ARMA modeling in signal processing and identification in the indirect approach to control are developed in parallel. For transversal identification, adaptivity succeeds because the estimate is a linear function of the variable parameters; control and signal processing can then be embedded in a unified, well-established formalism that guarantees convergence of the adaptive parameters. For recursive identification, the estimate is a nonlinear function of the parameters, possibly resulting in nonuniqueness of the solution, in wandering, and even in instability of adaptive algorithms. The requirement for recursivity originates in the structure of the signal (MA part) in signal processing; it is caused by the output measurement noise in control.
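
    The sketch below illustrates the transversal case with a least-mean-squares (LMS) adaptive filter identifying an unknown FIR system; the filter length, step size, and excitation are illustrative choices.

    ```python
    import numpy as np

    # Transversal (FIR) adaptive identification with the LMS algorithm: the estimate
    # is linear in the adapted parameters, which is the property credited above for
    # guaranteed convergence. The "unknown" system and step size are illustrative.
    rng = np.random.default_rng(5)
    n, order, mu = 20000, 8, 0.01
    h_true = rng.standard_normal(order)          # unknown FIR system to identify
    x = rng.standard_normal(n)                   # excitation
    d = np.convolve(x, h_true)[:n] + 0.01 * rng.standard_normal(n)   # noisy system output

    w = np.zeros(order)                          # adaptive transversal weights
    for k in range(order, n):
        u = x[k - order + 1:k + 1][::-1]         # current regressor (most recent sample first)
        e = d[k] - w @ u                         # a priori error
        w += mu * e * u                          # LMS update

    print(np.round(h_true, 3))
    print(np.round(w, 3))                        # should closely match h_true
    ```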

  19. High-speed optical coherence tomography signal processing on GPU

    NASA Astrophysics Data System (ADS)

    Li, Xiqi; Shi, Guohua; Zhang, Yudong

    2011-01-01

    The signal processing speed of spectral-domain optical coherence tomography (SD-OCT) has become a bottleneck in many medical applications. Recently, a time-domain interpolation method was proposed. This method not only achieves a better signal-to-noise ratio (SNR) but also a faster signal processing time for SD-OCT than the widely used zero-padding interpolation method. Furthermore, the re-sampled data are obtained by convolving the acquired data with the interpolation coefficients in the time domain, so many interpolations can be performed concurrently. This interpolation method is therefore well suited to parallel computing, and ultra-fast optical coherence tomography signal processing can be realized by using a graphics processing unit (GPU) with the compute unified device architecture (CUDA). This paper introduces the signal processing steps of SD-OCT on the GPU. An experiment was performed to acquire a frame of SD-OCT data (400 A-lines × 2048 pixels per A-line) and process the data in real time on the GPU. The results show that processing can be finished in 6.208 milliseconds, which is 37 times faster than on a central processing unit (CPU).
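
    A minimal CPU-side sketch of the core SD-OCT reconstruction chain (resampling to a uniform wavenumber grid followed by a Fourier transform) is shown below; simple linear interpolation stands in for the paper's convolution-based time-domain interpolation, no GPU/CUDA code is shown, and all parameters are illustrative.

    ```python
    import numpy as np

    # Resample the spectrum from (approximately) wavelength-linear to wavenumber-linear
    # sampling, then transform to depth to obtain the A-line. Parameters are illustrative.
    n_pix = 2048
    lam = np.linspace(800e-9, 880e-9, n_pix)          # spectrometer wavelength axis (m)
    k = 2 * np.pi / lam                               # nonuniform wavenumber axis
    k_lin = np.linspace(k.min(), k.max(), n_pix)      # uniform wavenumber grid

    # Synthetic interference spectrum for a single reflector at optical path z0.
    z0 = 0.3e-3
    spectrum = 1.0 + 0.5 * np.cos(2 * k * z0)

    # Resample to uniform k (np.interp needs an increasing x-axis, so flip k).
    spectrum_k = np.interp(k_lin, k[::-1], spectrum[::-1])

    # Remove the DC term and transform to depth; the peak location encodes z0.
    a_line = np.abs(np.fft.ifft(spectrum_k - spectrum_k.mean()))
    dz = np.pi / (k_lin.max() - k_lin.min())          # depth sampling interval
    peak = np.argmax(a_line[:n_pix // 2])
    print(peak * dz)                                  # expected close to z0 = 3e-4 m (0.3 mm)
    ```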

  20. Biomedical Telectrodes

    NASA Technical Reports Server (NTRS)

    Shepherd, C. K.

    1989-01-01

    Compact transmitters eliminate need for wires to monitors. Biomedical telectrode is small electronic package that attaches to patient in manner similar to small adhesive bandage. Patient wearing biomedical telectrodes moves freely, without risk of breaking or entangling wire connections. Especially beneficial to patients undergoing electrocardiographic monitoring in intensive-care units in hospitals. Eliminates nuisance of coping with wire connections while dressing and going to toilet.

  1. Signal processing method and system for noise removal and signal extraction

    DOEpatents

    Fu, Chi Yung; Petrich, Loren

    2009-04-14

    A signal processing method and system combining smooth level wavelet pre-processing together with artificial neural networks all in the wavelet domain for signal denoising and extraction. Upon receiving a signal corrupted with noise, an n-level decomposition of the signal is performed using a discrete wavelet transform to produce a smooth component and a rough component for each decomposition level. The nth-level smooth component is then inputted into a corresponding neural network pre-trained to filter out noise in that component by pattern recognition in the wavelet domain. Additional rough components, beginning at the highest level, may also be retained and inputted into corresponding neural networks pre-trained to filter out noise in those components also by pattern recognition in the wavelet domain. In any case, an inverse discrete wavelet transform is performed on the combined output from all the neural networks to recover a clean signal back in the time domain.
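
    The sketch below illustrates the wavelet-domain denoising idea with simple soft-thresholding standing in for the patent's pre-trained neural networks; the wavelet, decomposition depth, and threshold rule are illustrative assumptions.

    ```python
    import numpy as np
    import pywt

    # Decompose the noisy signal, clean the coefficients level by level, then invert
    # the transform. Soft-thresholding stands in for the trained neural networks.
    rng = np.random.default_rng(6)
    t = np.linspace(0, 1, 2048)
    clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sign(np.sin(2 * np.pi * 2 * t))
    noisy = clean + 0.4 * rng.standard_normal(t.size)

    coeffs = pywt.wavedec(noisy, 'db4', level=5)      # [smooth_5, rough_5, ..., rough_1]
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # noise estimate from the finest rough level
    thr = sigma * np.sqrt(2 * np.log(noisy.size))     # universal threshold
    denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
    denoised = pywt.waverec(denoised_coeffs, 'db4')[:noisy.size]

    print(np.std(noisy - clean), np.std(denoised - clean))   # error should drop after denoising
    ```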

  2. [The "peer-review" process in biomedical journals: characteristics of "Elite" reviewers].

    PubMed

    Alfonso, F

    2010-01-01

    The "peer-review" system is used to improve the quality of submitted scientific papers and provides invaluable help to the Editors in their decision-making process. The "peer-review" system remains the cornerstone of the scientific process and, therefore, its quality should be closely monitored. The profile of the "elite" reviewers has been described, but further studies are warranted to better identify their main characteristics. A major challenge, not only for Editors but also for medical scientific societies as a whole, is to continue to guarantee the excellence in the "peer-review" process and to ensure that it receives adequate academic recognition.

  3. Interhemispheric support during demanding auditory signal-in-noise processing.

    PubMed

    Stracke, Henning; Okamoto, Hidehiko; Pantev, Christo

    2009-06-01

    We investigated attentional effects on human auditory signal-in-noise processing in a simultaneous masking paradigm using magnetoencephalography. Test signal was a monaural 1000-Hz tone; maskers were binaural band-eliminated noises (BENs) containing stopbands of different widths centered on 1000 Hz. Participants directed attention either to the left or the right ear. In an "irrelevant visual attention" condition subjects focused attention on a screen. Irrespective of attention focus location, the signal appeared randomly either in the left or right ear. During auditory focused attention (left- or right-ear attention), the signal thus randomly appeared either in the attended ("relevant auditory attention" condition) or the nonattended ear ("irrelevant auditory attention" condition). Results showed that N1m source strength was overall increased in the left relative to the right hemisphere, and for right-ear versus left-ear stimulation. Moreover, when attention was focused on the signal ear (relevant auditory attention condition) and the BEN stopbands were narrow, the right-hemispheric N1m source strength was increased, relative to irrelevant auditory attention. Such increments were neither observed in the left hemisphere nor for wide BENs. These novel results indicate 1) left-hemispheric dominance and robustness during auditory signal-in-noise processing, and 2) right-hemispheric assistance during attentive and demanding auditory signal-in-noise processing.

  4. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. New frontiers in biomedical science and engineering during 2014-2015.

    PubMed

    Liu, Feng; Lee, Dong-Hoon; Lagoa, Ricardo; Kumar, Sandeep

    2015-01-01

    The International Conference on Biomedical Engineering and Biotechnology (ICBEB) is an international meeting held once a year. The fourth International Conference on Biomedical Engineering and Biotechnology (ICBEB2015) will be held in Shanghai, China, during August 18th-21st, 2015. This annual conference intends to provide an opportunity for researchers and practitioners at home and abroad to present the most recent frontiers and future challenges in the fields of biomedical science, biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, etc. The papers published in this issue were selected from this conference and witness the advances in biomedical engineering and biotechnology during 2014-2015.

  6. Assess sleep stage by modern signal processing techniques.

    PubMed

    Wu, Hau-tieng; Talmon, Ronen; Lo, Yu-Lun

    2015-04-01

    In this paper, two modern adaptive signal processing techniques, empirical intrinsic geometry and the synchrosqueezing transform, are applied to quantify different dynamical features of respiratory and electroencephalographic signals. We show that the proposed features are theoretically rigorously supported and capture the sleep information hidden inside the signals. The features are used as input to multiclass support vector machines with the radial basis function kernel to automatically classify sleep stages. The effectiveness of the classification based on the proposed features is shown to be comparable to human expert classification: the proposed classification of awake, REM, N1, N2, and N3 sleep stages based on the respiratory signal alone (respectively, the respiratory and EEG signals together) has an overall accuracy of 81.7% (respectively, 89.3%) in the relatively normal subject group. In addition, by examining the combination of the respiratory signal with the electroencephalographic signal, we conclude that the respiratory signal contains ample sleep information, which supplements the information stored in the electroencephalographic signal.
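
    The final classification step can be sketched as below with a multiclass RBF-kernel support vector machine; the per-epoch features here are random placeholders rather than the empirical-intrinsic-geometry and synchrosqueezing features used in the paper.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Multiclass SVM with an RBF kernel applied to per-epoch feature vectors.
    # The features below are synthetic placeholders for illustration only.
    rng = np.random.default_rng(7)
    stages = ['Awake', 'REM', 'N1', 'N2', 'N3']
    n_per_stage, n_feat = 200, 6
    X = np.vstack([rng.standard_normal((n_per_stage, n_feat)) + 1.5 * i
                   for i, _ in enumerate(stages)])          # separable toy clusters
    y = np.repeat(stages, n_per_stage)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0, gamma='scale'))
    clf.fit(X_tr, y_tr)
    print(clf.score(X_te, y_te))      # accuracy on the held-out epochs
    ```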

  7. Analog signal processing for low-power sensor systems

    NASA Astrophysics Data System (ADS)

    Hasler, Paul

    2006-05-01

    We present the potential of using programmable analog signal processing techniques to impact low-power portable applications such as imaging, audio processing, and speech recognition. The range of analog signal processing functions available results in many potential opportunities to combine these analog signal processing systems with digital signal processing systems for improved overall system performance. We describe our programmable analog technology based around floating-gate transistors that allow non-volatile storage as well as computation through the same device. We describe the basic concepts of floating-gate devices, capacitor-based circuits, and the basic charge-modification mechanisms that make this analog technology programmable, as well as the techniques used to program an array of floating-gate devices. We show experimental evidence for a factor of 1000 to 10,000 improvement in power efficiency for programmable analog signal processing compared to custom digital implementations in vector-matrix multipliers (VMMs), CMOS imagers with computation on the pixel plane with high fill factors, and large-scale field-programmable analog arrays (FPAAs), among others.

  8. Signal processing applied to photothermal techniques for materials characterization

    NASA Technical Reports Server (NTRS)

    Rooney, James A.

    1989-01-01

    There is a need to make noncontact measurements of material characteristics in the microgravity environment. Photothermal and photoacoustic techniques offer one approach to attaining this capability, since lasers can be used to generate the required thermal or acoustic signals. The perturbations in the materials that can be used for characterization can be detected by optical reflectance, infrared detection, or laser detection of photoacoustics. However, some of these laser techniques have the disadvantages of either high-energy pulsed excitation or low signal-to-noise ratio. Alternative signal processing techniques that have been developed can be applied to photothermal or photoacoustic instrumentation. One fully coherent spread-spectrum signal processing technique is called time delay spectrometry (TDS). With TDS, the system is excited in a combined frequency-time domain by employing a linear frequency-sweep excitation function. The processed received signal can provide frequency, phase, or improved time resolution. This signal processing technique was shown to outperform other time-selective techniques with respect to noise rejection and was recently applied to photothermal instrumentation. The technique yields the mathematical equivalent of pulses, yet the input irradiances are orders of magnitude less than those of pulses, with a concomitant reduction in perturbation of the sample; it can therefore increase the capability of photothermal methods for materials characterization.
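
    The sketch below illustrates the swept-excitation idea behind TDS in a simplified form: a linear chirp excites a toy two-echo system, and correlation against the transmitted sweep compresses the response into pulse-like peaks; the sweep parameters and echo model are illustrative and do not describe the instrument above.

    ```python
    import numpy as np
    from scipy.signal import chirp, fftconvolve, find_peaks

    # Linear frequency sweep excitation of a toy two-echo system, followed by
    # pulse compression (correlation with the transmitted sweep). All parameters
    # are illustrative.
    fs, T = 100e3, 0.05                                  # sample rate (Hz), sweep length (s)
    t = np.arange(int(fs * T)) / fs
    tx = chirp(t, f0=1e3, f1=40e3, t1=T, method='linear')

    h = np.zeros(600); h[100] = 1.0; h[450] = 0.4        # two reflections (toy impulse response)
    rx = fftconvolve(tx, h)[:len(tx)] + 0.1 * np.random.default_rng(8).standard_normal(len(tx))

    compressed = np.abs(fftconvolve(rx, tx[::-1]))       # matched-filter (pulse) compression
    lag0 = len(tx) - 1                                   # index corresponding to zero delay
    seg = compressed[lag0:lag0 + 600]
    peaks, _ = find_peaks(seg, height=0.3 * seg.max(), distance=20)
    print(peaks)                                         # expected near samples 100 and 450
    ```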

  9. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted. PMID:28287448

  10. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted.

  11. Real-time signal processing with FPS array processors

    SciTech Connect

    Higginbotham, R.E.; Wiley, P.G.

    1980-01-01

    The authors discuss real-time signal processing with the FPS programmable array processor (AP). This AP is designed to attach as a peripheral device to a host machine, either a minicomputer or mainframe. The basic AP consists of a host interface, multiple independent memories, parallel pipelined floating-point adder and multiplier, and a control/integer processing unit. For real-time signal processing applications in which the throughput rate exceeds the capability of the host computer, independent input/output (I/O) interfaces are available which provide direct access to the AP for external input and output devices. 2 references.

  12. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research directions in the fields of digital signal processing and modern control and estimation theory are discussed. Stability theory, linear prediction and parameter identification, system synthesis and implementation, two-dimensional filtering, decentralized control and estimation, and image processing are considered in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the disciplines.

  13. Biomedical nanoparticles modulate specific CD4+ T cell stimulation by inhibition of antigen processing in dendritic cells.

    PubMed

    Blank, Fabian; Gerber, Peter; Rothen-Rutishauser, Barbara; Sakulkhu, Usawadee; Salaklang, Jatuporn; De Peyer, Karin; Gehr, Peter; Nicod, Laurent P; Hofmann, Heinrich; Geiser, Thomas; Petri-Fink, Alke; Von Garnier, Christophe

    2011-12-01

    Understanding how nanoparticles may affect immune responses is an essential prerequisite to developing novel clinical applications. To investigate nanoparticle-dependent outcomes on immune responses, dendritic cells (DCs) were treated with model biomedical poly(vinylalcohol)-coated super-paramagnetic iron oxide nanoparticles (PVA-SPIONs). PVA-SPIONs uptake by human monocyte-derived DCs (MDDCs) was analyzed by flow cytometry (FACS) and advanced imaging techniques. Viability, activation, function, and stimulatory capacity of MDDCs were assessed by FACS and an in vitro CD4+ T cell assay. PVA-SPION uptake was dose-dependent, decreased by lipopolysaccharide (LPS)-induced MDDC maturation at higher particle concentrations, and was inhibited by cytochalasin D pre-treatment. PVA-SPIONs did not alter surface marker expression (CD80, CD83, CD86, myeloid/plasmacytoid DC markers) or antigen-uptake, but decreased the capacity of MDDCs to process antigen, stimulate CD4+ T cells, and induce cytokines. The decreased antigen processing and CD4+ T cell stimulation capability of MDDCs following PVA-SPION treatment suggests that MDDCs may revert to a more functionally immature state following particle exposure.

  14. A corpus of full-text journal articles is a robust evaluation tool for revealing differences in performance of biomedical natural language processing tools.

    PubMed

    Verspoor, Karin; Cohen, Kevin Bretonnel; Lanfranchi, Arrick; Warner, Colin; Johnson, Helen L; Roeder, Christophe; Choi, Jinho D; Funk, Christopher; Malenkiy, Yuriy; Eckert, Miriam; Xue, Nianwen; Baumgartner, William A; Bada, Michael; Palmer, Martha; Hunter, Lawrence E

    2012-08-17

    We introduce the linguistic annotation of a corpus of 97 full-text biomedical publications, known as the Colorado Richly Annotated Full Text (CRAFT) corpus. We further assess the performance of existing tools for performing sentence splitting, tokenization, syntactic parsing, and named entity recognition on this corpus. Many biomedical natural language processing systems demonstrated large differences between their previously published results and their performance on the CRAFT corpus when tested with the publicly available models or rule sets. Trainable systems differed widely with respect to their ability to build high-performing models based on this data. The finding that some systems were able to train high-performing models based on this corpus is additional evidence, beyond high inter-annotator agreement, that the quality of the CRAFT corpus is high. The overall poor performance of various systems indicates that considerable work needs to be done to enable natural language processing systems to work well when the input is full-text journal articles. The CRAFT corpus provides a valuable resource to the biomedical natural language processing community for evaluation and training of new models for biomedical full text publications.

  15. A corpus of full-text journal articles is a robust evaluation tool for revealing differences in performance of biomedical natural language processing tools

    PubMed Central

    2012-01-01

    Background We introduce the linguistic annotation of a corpus of 97 full-text biomedical publications, known as the Colorado Richly Annotated Full Text (CRAFT) corpus. We further assess the performance of existing tools for performing sentence splitting, tokenization, syntactic parsing, and named entity recognition on this corpus. Results Many biomedical natural language processing systems demonstrated large differences between their previously published results and their performance on the CRAFT corpus when tested with the publicly available models or rule sets. Trainable systems differed widely with respect to their ability to build high-performing models based on this data. Conclusions The finding that some systems were able to train high-performing models based on this corpus is additional evidence, beyond high inter-annotator agreement, that the quality of the CRAFT corpus is high. The overall poor performance of various systems indicates that considerable work needs to be done to enable natural language processing systems to work well when the input is full-text journal articles. The CRAFT corpus provides a valuable resource to the biomedical natural language processing community for evaluation and training of new models for biomedical full text publications. PMID:22901054

  16. Additive manufacturing of titanium alloys in the biomedical field: processes, properties and applications.

    PubMed

    Trevisan, Francesco; Calignano, Flaviana; Aversa, Alberta; Marchese, Giulio; Lombardi, Mariangela; Biamino, Sara; Ugues, Daniele; Manfredi, Diego

    2017-09-25

    The mechanical properties and biocompatibility of titanium alloy medical devices and implants produced by additive manufacturing (AM) technologies - in particular, selective laser melting (SLM), electron beam melting (EBM) and laser metal deposition (LMD) - have been investigated by several researchers demonstrating how these innovative processes are able to fulfil medical requirements for clinical applications. This work reviews the advantages given by these technologies, which include the possibility to create porous complex structures to improve osseointegration and mechanical properties (best match with the modulus of elasticity of local bone), to lower processing costs, to produce custom-made implants according to the data for the patient acquired via computed tomography and to reduce waste.

  17. Optimal and adaptive methods of processing hydroacoustic signals (review)

    NASA Astrophysics Data System (ADS)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals under multipath propagation and scattering are considered. The advantages and drawbacks of the classical adaptive algorithms (Capon, MUSIC, and Johnson) and of "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector based on classical or fast projection algorithms is analyzed, which estimates the background using median filtering or the method of bilateral spatial contrast.
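
    As an illustration of one of the classical adaptive algorithms mentioned (Capon), the sketch below computes the Capon/MVDR spatial spectrum for a simulated linear equidistant array; the array size, source bearings, SNR, and diagonal loading are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    # Capon (MVDR) spatial spectrum for a linear equidistant array; all values
    # below are illustrative.
    rng = np.random.default_rng(9)
    n_sensors, n_snap, d = 12, 400, 0.5          # d = element spacing in wavelengths
    angles_true = np.array([-20.0, 15.0])        # source bearings (degrees)

    def steering(theta_deg):
        k = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
        return np.exp(1j * k * np.arange(n_sensors))

    # Simulated snapshots: two uncorrelated narrowband sources plus sensor noise.
    S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
    A = np.column_stack([steering(a) for a in angles_true])
    X = A @ S + 0.3 * (rng.standard_normal((n_sensors, n_snap))
                       + 1j * rng.standard_normal((n_sensors, n_snap)))

    R = X @ X.conj().T / n_snap                            # sample covariance matrix
    R_inv = np.linalg.inv(R + 1e-3 * np.eye(n_sensors))    # small diagonal loading

    scan = np.arange(-90.0, 90.5, 0.5)
    p_capon = np.array([1.0 / np.real(steering(a).conj() @ R_inv @ steering(a))
                        for a in scan])
    pk, _ = find_peaks(p_capon, height=0.5 * p_capon.max())
    print(scan[pk])                              # expected near -20 and 15 degrees
    ```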

  18. SIG: a general-purpose signal processing program

    SciTech Connect

    Lager, D.; Azevedo, S.

    1986-02-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. It also accommodates other representations for data such as transfer function polynomials. Signal processing operations include digital filtering, auto/cross spectral density, transfer function/impulse response, convolution, Fourier transform, and inverse Fourier transform. Graphical operations provide display of signals and spectra, including plotting, cursor zoom, families of curves, and multiple viewport plots. SIG provides two user interfaces with a menu mode for occasional users and a command mode for more experienced users. Capability exists for multiple commands per line, command files with arguments, commenting lines, defining commands, automatic execution for each item in a repeat sequence, etc. SIG is presently available for VAX(VMS), VAX (BERKELEY 4.2 UNIX), SUN (BERKELEY 4.2 UNIX), DEC-20 (TOPS-20), LSI-11/23 (TSX), and DEC PRO 350 (TSX). 4 refs., 2 figs.

  19. BioC: a minimalist approach to interoperability for biomedical text processing.

    PubMed

    Comeau, Donald C; Islamaj Doğan, Rezarta; Ciccarese, Paolo; Cohen, Kevin Bretonnel; Krallinger, Martin; Leitner, Florian; Lu, Zhiyong; Peng, Yifan; Rinaldi, Fabio; Torii, Manabu; Valencia, Alfonso; Verspoor, Karin; Wiegers, Thomas C; Wu, Cathy H; Wilbur, W John

    2013-01-01

    A vast amount of scientific information is encoded in natural language text, and the quantity of such text has become so great that it is no longer economically feasible to have a human as the first step in the search process. Natural language processing and text mining tools have become essential to facilitate the search for and extraction of information from text. This has led to vigorous research efforts to create useful tools and to create humanly labeled text corpora, which can be used to improve such tools. To encourage combining these efforts into larger, more powerful and more capable systems, a common interchange format to represent, store and exchange the data in a simple manner between different language processing systems and text mining tools is highly desirable. Here we propose a simple extensible mark-up language format to share text documents and annotations. The proposed annotation approach allows a large number of different annotations to be represented including sentences, tokens, parts of speech, named entities such as genes or diseases and relationships between named entities. In addition, we provide simple code to hold this data, read it from and write it back to extensible mark-up language files and perform some sample processing. We also describe completed as well as ongoing work to apply the approach in several directions. Code and data are available at http://bioc.sourceforge.net/. Database URL: http://bioc.sourceforge.net/

  20. Review of Electrochemically Triggered Macromolecular Film Buildup Processes and Their Biomedical Applications.

    PubMed

    Maerten, Clément; Jierry, Loïc; Schaaf, Pierre; Boulmedais, Fouzia

    2017-08-30

    Macromolecular coatings play an important role in many technological areas, ranging from the car industry to biosensors. Among the different coating technologies, electrochemically triggered processes are extremely powerful because they allow in particular spatial confinement of the film buildup up to the micrometer scale on microelectrodes. Here, we review the latest advances in the field of electrochemically triggered macromolecular film buildup processes performed in aqueous solutions. All these processes will be discussed and related to their several applications such as corrosion prevention, biosensors, antimicrobial coatings, drug-release, barrier properties and cell encapsulation. Special emphasis will be put on applications in the rapidly growing field of biosensors. Using polymers or proteins, the electrochemical buildup of the films can result from a local change of macromolecules solubility, self-assembly of polyelectrolytes through electrostatic/ionic interactions or covalent cross-linking between different macromolecules. The assembly process can be in one step or performed step-by-step based on an electrical trigger affecting directly the interacting macromolecules or generating ionic species.

  1. BioC: a minimalist approach to interoperability for biomedical text processing

    PubMed Central

    Comeau, Donald C.; Islamaj Doğan, Rezarta; Ciccarese, Paolo; Cohen, Kevin Bretonnel; Krallinger, Martin; Leitner, Florian; Lu, Zhiyong; Peng, Yifan; Rinaldi, Fabio; Torii, Manabu; Valencia, Alfonso; Verspoor, Karin; Wiegers, Thomas C.; Wu, Cathy H.; Wilbur, W. John

    2013-01-01

    A vast amount of scientific information is encoded in natural language text, and the quantity of such text has become so great that it is no longer economically feasible to have a human as the first step in the search process. Natural language processing and text mining tools have become essential to facilitate the search for and extraction of information from text. This has led to vigorous research efforts to create useful tools and to create humanly labeled text corpora, which can be used to improve such tools. To encourage combining these efforts into larger, more powerful and more capable systems, a common interchange format to represent, store and exchange the data in a simple manner between different language processing systems and text mining tools is highly desirable. Here we propose a simple extensible mark-up language format to share text documents and annotations. The proposed annotation approach allows a large number of different annotations to be represented including sentences, tokens, parts of speech, named entities such as genes or diseases and relationships between named entities. In addition, we provide simple code to hold this data, read it from and write it back to extensible mark-up language files and perform some sample processing. We also describe completed as well as ongoing work to apply the approach in several directions. Code and data are available at http://bioc.sourceforge.net/. Database URL: http://bioc.sourceforge.net/ PMID:24048470

  2. Ablation processing of biomedical materials by ultrashort laser pulse ranging from 50 fs through 2 ps

    NASA Astrophysics Data System (ADS)

    Ozono, Kazue; Obara, Minoru; Sakuma, Jun

    2003-06-01

    In recent years, femtosecond laser processing of human hard and soft tissues has been studied. Here, we demonstrate ablation etching of hydroxyapatite. Hydroxyapatite (Ca10(PO4)6(OH)2) is a key component of human teeth and bone; human bone is mainly made of hydroxyapatite oriented along the collagen. Micromachining of hydroxyapatite is in high demand for orthopedics and dentistry. The important issue is to preserve the chemical properties of the ablated surface: if the chemical properties of hydroxyapatite are altered, the bone or tooth cannot grow again after laser processing. For nanosecond laser ablation (for example, excimer laser ablation), the relative content of calcium and phosphorus in Ca10(PO4)6(OH)2 is found to change after ablation. We used a pulsewidth-tunable output from 50 fs through 2 ps at 820 nm and 1 kpps, and we measured the calcium and phosphorus spectra of the ablated surface of hydroxyapatite by XPS. As a result, the chemical content of calcium and phosphorus is kept unchanged before and after 50-fs to 2-ps laser ablation. We also demonstrated ablation processing of human teeth with a Ti:sapphire laser, and precise ablation processing and microstructure fabrication were realized.

  3. Using acoustic emission signals for monitoring of production processes.

    PubMed

    Tönshoff, H K; Jung, M; Männel, S; Rietz, W

    2000-07-01

    The systems for in-process quality assurance offer the possibility of estimating the workpiece quality during machining. Especially for finishing processes like grinding or turning of hardened steels, it is important to control the process continuously in order to avoid rejects and refinishing. This paper describes the use of on-line monitoring systems with process-integrated measurement of acoustic emission to evaluate hard turning and grinding processes. The correlation between acoustic emission signals and subsurface integrity is determined to analyse the progression of the processes and the workpiece quality.

  4. Signal processing of heart signals for the quantification of non-deterministic events

    PubMed Central

    2011-01-01

    Background Heart signals represent an important way to evaluate cardiovascular function and often what is desired is to quantify the level of some signal of interest against the louder backdrop of the beating of the heart itself. An example of this type of application is the quantification of cavitation in mechanical heart valve patients. Methods An algorithm is presented for the quantification of high-frequency, non-deterministic events such as cavitation from recorded signals. A closed-form mathematical analysis of the algorithm investigates its capabilities. The algorithm is implemented on real heart signals to investigate usability and implementation issues. Improvements are suggested to the base algorithm including aligning heart sounds, and the implementation of the Short-Time Fourier Transform to study the time evolution of the energy in the signal. Results The improvements result in better heart beat alignment and better detection and measurement of the random events in the heart signals, so that they may provide a method to quantify nondeterministic events in heart signals. The use of the Short-Time Fourier Transform allows the examination of the random events in both time and frequency allowing for further investigation and interpretation of the signal. Conclusions The presented algorithm does allow for the quantification of nondeterministic events but proper care in signal acquisition and processing must be taken to obtain meaningful results. PMID:21269508
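
    A minimal sketch of the suggested time-frequency quantification is given below: the Short-Time Fourier Transform of a synthetic heart-sound recording is used to track the high-frequency energy added by a brief, cavitation-like event; the signal model, band edge, and window settings are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import stft

    # Track high-frequency energy in the STFT of a synthetic heart-sound recording;
    # a brief wideband burst stands in for a non-deterministic (cavitation-like) event.
    fs = 8000
    t = np.arange(2 * fs) / fs
    rng = np.random.default_rng(10)

    heart = 0.8 * np.sin(2 * np.pi * 60 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.95)  # low-frequency "beats"
    burst = np.zeros_like(t)
    burst[5000:5200] = rng.standard_normal(200)          # brief wideband event
    x = heart + 0.3 * burst + 0.01 * rng.standard_normal(t.size)

    f, tt, Z = stft(x, fs=fs, nperseg=256, noverlap=192)
    hf_energy = (np.abs(Z[f > 1000.0, :]) ** 2).sum(axis=0)   # energy above 1 kHz per frame
    print(tt[np.argmax(hf_energy)])                      # expected near 0.625 s (= 5000/fs)
    ```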

  5. Benefits of Software GPS Receivers for Enhanced Signal Processing

    DTIC Science & Technology

    2000-01-01

    Published in GPS Solutions 4(1), Summer 2000, pages 56-66. Benefits of Software GPS Receivers for Enhanced Signal Processing, Alison Brown... In this paper the architecture of a software GPS receiver is described, and an analysis is included of the performance of a software GPS receiver when tracking the GPS signals in challenging environments. Results are...

  6. GNSS-based bistatic SAR: a signal processing view

    NASA Astrophysics Data System (ADS)

    Antoniou, Michail; Cherniakov, Mikhail

    2013-12-01

    This article presents signal processing algorithms used as a new remote sensing tool, that is passive bistatic SAR with navigation satellites (e.g. GPS, GLONASS or Galileo) as transmitters of opportunity. Signal synchronisation and image formation algorithms are described for two system variants: one where the receiver is moving and one where it is fixed on the ground. The applicability and functionality of the algorithms described is demonstrated through experimental imagery that ultimately confirms the feasibility of the overall technology.

  7. Higher-Dimensional Signal Processing via Multiscale Geometric Analysis

    DTIC Science & Technology

    2010-02-10

    ...compression, classification, and segmentation. Despite the remarkable advantages of wavelets for analyzing and processing 1-D signals, a surprising... perpendicular to β. Our goal in this section is to analyze the phase (θ1, θ2, θ3) of the quaternion wavelet coefficient (e.g., dV for the vertical... wavelet transforms suitable for signals containing low-dimensional manifold structures [22]. The QWT developed here could play an interesting rôle in

  8. Optical signal processing - Fourier transforms and convolution/correlation

    NASA Astrophysics Data System (ADS)

    Rhodes, William T.

    The application of Fourier techniques and linear-systems theory to the analysis and synthesis of optical systems is described in a theoretical review, and Fourier-based optical signal-processing methods are considered. Topics examined include monochromatic wave fields and their phasor representation, wave propagation, Fourier-transform and spectrum analysis with a spherical lens, coherent and incoherent imaging and spatial filtering, and a channelized spectrum analyzer (using both spherical and cylindrical lenses) for multiple one-dimensional input signals.
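
    A minimal numerical sketch of the coherent spatial-filtering idea reviewed here, with the spherical lens modeled as a 2-D Fourier transform in a 4f arrangement; the aperture, pupil-stop radius and grid size are arbitrary illustrative choices, not taken from the article.

      # Sketch: coherent spatial filtering, with the spherical lens modeled as a 2-D
      # Fourier transform (4f geometry): FT -> pupil mask -> inverse FT.
      import numpy as np

      N = 256
      x = np.arange(N) - N // 2
      X, Y = np.meshgrid(x, x)
      obj = ((np.abs(X) < 40) & (np.abs(Y) < 40)).astype(float)   # square aperture

      spectrum = np.fft.fftshift(np.fft.fft2(obj))       # field in the Fourier plane
      mask = (X ** 2 + Y ** 2) < 20 ** 2                 # low-pass pupil stop
      image = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))

      print(f"input energy : {np.sum(obj ** 2):.1f}")
      print(f"output energy: {np.sum(np.abs(image) ** 2):.1f}")  # energy removed by the stop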

  9. Digital signal processing for fiber-optic thermometers

    SciTech Connect

    Fernicola, V.; Crovini, L.

    1994-12-31

    A digital signal processing scheme for the measurement of exponentially decaying signals, such as those found in fluorescence-lifetime-based fiber-optic sensors, is proposed. The instrument uses a modified digital phase-sensitive-detection technique with the phase locked to a fixed value and the modulation period tracking the measured lifetime. Typical resolution of the system is 0.05% for slow decays (>500 μs) and 0.1% for fast decays.
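
    For comparison, a minimal sketch of lifetime estimation from an exponentially decaying record by a log-linear least-squares fit; this is a generic stand-in for the phase-sensitive scheme described above, and the sampling rate, lifetime and noise level are assumed.

      # Sketch: estimate the lifetime of an exponentially decaying sensor signal by a
      # log-linear least-squares fit (a simple stand-in for the phase-sensitive scheme).
      import numpy as np

      fs = 1e6                                  # 1 MS/s (assumed)
      tau_true = 600e-6                         # 600 us decay
      t = np.arange(0, 5 * tau_true, 1 / fs)
      sig = np.exp(-t / tau_true) + 1e-3 * np.random.randn(t.size)

      use = sig > 0.05                          # keep samples well above the noise floor
      slope, _ = np.polyfit(t[use], np.log(sig[use]), 1)
      print(f"estimated lifetime: {-1.0 / slope * 1e6:.1f} us (true 600 us)")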

  10. Application of homomorphic signal processing to stress wave factor analysis

    NASA Technical Reports Server (NTRS)

    Karagulle, H.; Williams, J. H., Jr.; Lee, S. S.

    1985-01-01

    The stress wave factor (SWF) signal, which is the output of an ultrasonic testing system where the transmitting and receiving transducers are coupled to the same face of the test structure, is analyzed in the frequency domain. The SWF signal generated in an isotropic elastic plate is modelled as the superposition of successive reflections. The reflection which is generated by the stress waves which travel p times as a longitudinal (P) wave and s times as a shear (S) wave through the plate while reflecting back and forth between the bottom and top faces of the plate is designated as the reflection with p, s. Short-time portions of the SWF signal are considered for obtaining spectral information on individual reflections. If the significant reflections are not overlapped, short-time Fourier analysis is used. A summary of the relevant points of homomorphic signal processing, which is also called cepstrum analysis, is given. Homomorphic signal processing is applied to short-time SWF signals to obtain estimates of the log spectra of individual reflections for cases in which the reflections are overlapped. Two typical SWF signals generated in aluminum plates (overlapping and non-overlapping reflections) are analyzed.
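
    A small sketch of the cepstral idea behind homomorphic processing: for a pulse plus an overlapping delayed echo, the echo delay appears as a peak in the real cepstrum. The pulse shape, delay and sampling rate below are invented, and the code illustrates the principle rather than the SWF analysis itself.

      # Sketch: real cepstrum of a pulse plus a delayed echo; the echo delay shows up
      # as a peak in the cepstrum (the core idea of homomorphic/cepstrum analysis).
      import numpy as np

      fs = 1_000_000
      t = np.arange(0, 2e-3, 1 / fs)
      pulse = np.exp(-((t - 2e-4) / 3e-5) ** 2) * np.sin(2 * np.pi * 2e5 * t)
      delay = 3e-4                                       # 300 us between reflections
      echo = 0.6 * np.roll(pulse, int(delay * fs))
      x = pulse + echo

      spectrum = np.fft.fft(x)
      cepstrum = np.fft.ifft(np.log(np.abs(spectrum) + 1e-12)).real
      q = np.arange(x.size) / fs                         # quefrency axis (seconds)
      search = (q > 1e-4) & (q < 1e-3)
      print(f"echo delay estimate: {q[search][np.argmax(cepstrum[search])] * 1e6:.0f} us")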

  11. Signal processing by its coil zipper domain activates IKKγ

    PubMed Central

    Bloor, Stuart; Ryzhakov, Grigor; Wagner, Sebastian; Butler, P. Jonathan G.; Smith, David L.; Krumbach, Rebekka; Dikic, Ivan; Randow, Felix

    2008-01-01

    NF-κB activation occurs upon degradation of its inhibitor I-κB and requires prior phosphorylation of the inhibitor by I-κB kinase (IKK). Activity of IKK is governed by its noncatalytic subunit IKKγ. Signaling defects due to missense mutations in IKKγ have been correlated to its inability to either become ubiquitylated or bind ubiquitin noncovalently. Because the relative contribution of these events to signaling had remained unknown, we have studied mutations in the coil-zipper (CoZi) domain of IKKγ that either impair signaling or cause constitutive NF-κB activity. Certain signaling-deficient alleles neither bound ubiquitin nor were they ubiquitylated by TRAF6. Introducing an activating mutation into those signaling-impaired alleles restored their ubiquitylation and created mutants constitutively activating NF-κB without repairing the ubiquitin-binding defect. Constitutive activity therefore arises downstream of ubiquitin binding but upstream of ubiquitylation. Such constitutive activity reveals a signal-processing function for IKKγ beyond that of a mere ubiquitin-binding adaptor. We propose that this signal processing may involve homophilic CoZi interactions as suggested by the enhanced affinity of CoZi domains from constitutively active IKKγ. PMID:18216269

  12. Adaptive digital signal processing for X-ray spectrometry

    NASA Astrophysics Data System (ADS)

    Lakatos, T.

    1990-05-01

    A real-time, fully digital signal processing and analyzing system based on a new concept has been developed for high-count-rate, high-resolution spectrometry. The principle has been realized with digital filtering of the preamplifier output signals. The system's unique features are the maximum theoretically possible throughput rate at high resolution and adaptive noise filtering for nearly loss-free measurements. In adaptive mode the maximum output rate is about 20 times higher than in the case of semi-Gaussian shaping, with low degradation of energy resolution. All parameters of the signal processor are software controllable.
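
    A toy sketch of digital pulse shaping of a step-like preamplifier output: a delayed difference followed by a moving average (a simple trapezoid-like shaper), with the pulse height read from the shaped waveform. This is a generic illustration, not the adaptive processor described above, and all constants are assumed.

      # Toy sketch of digital pulse shaping for a step-like preamplifier output:
      # delayed difference followed by a moving average (a simple trapezoid-like
      # shaper), with the pulse height read from the shaped waveform.
      import numpy as np

      rng = np.random.default_rng(1)
      n, amp = 4000, 1.0
      x = np.zeros(n)
      x[1000:] += amp                              # ideal charge step at sample 1000
      x += 0.05 * rng.standard_normal(n)           # electronic noise

      k, m = 200, 100                              # difference gap and averaging length
      diff = x[k:] - x[:-k]                        # delayed difference (flat top ~ amp)
      shaped = np.convolve(diff, np.ones(m) / m, mode="same")

      print(f"estimated pulse height: {shaped.max():.3f} (true {amp})")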

  13. Fatigue properties of a biomedical 316L steel processed by surface mechanical attrition

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Chemkhi, M.; Kanoute, P.; Retraint, D.

    2014-08-01

    This work deals with the influence of surface mechanical attrition treatment (SMAT) on the fatigue properties of a medical grade 316L stainless steel. Metallurgical parameters governed by SMAT, such as micro-hardness and the nanocrystalline layer, are characterized using different techniques. Low cycle fatigue tests are performed to investigate the fatigue properties of untreated and SMAT-processed samples. The results show that the stress amplitude of SMAT-processed samples with two different treatment intensities is significantly enhanced compared to untreated samples, while the fatigue strength represented by the number of cycles to failure is not improved in the investigated strain range. The enhancement in the stress amplitude of treated samples can be attributed to the influence of the SMAT-affected layer.

  14. Application of the Ornstein-Uhlenbeck equations for biomedical image processing

    NASA Astrophysics Data System (ADS)

    Mesa López, Juan P.

    2014-05-01

    The purpose of this paper is to demonstrate the implementation of a new kind of image filter, in which the equations of the Ornstein-Uhlenbeck process are used in image processing to detect the edges of computerized images taken with magnetic resonance imaging (MRI) scanners, as a generalization of the standard Gaussian filter. This new filter will be called the Ornstein-Uhlenbeck filter (OUF). One result obtained after applying the filter to the image while varying the parameter σ is an effective differentiation of the various organs that lie next to each other in each of the resulting images. In addition, the edges of the internal body parts and organs are highlighted efficiently and differentiated from their surroundings.
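
    One plausible numerical reading of such a filter, for illustration only: smooth the image with a separable exponential (Ornstein-Uhlenbeck-type) kernel of scale σ instead of a Gaussian, then take the gradient magnitude as an edge map. The kernel construction, σ value and test image are assumptions rather than the paper's exact formulation.

      # Sketch: smooth an image with a separable exponential (Ornstein-Uhlenbeck
      # covariance) kernel of scale sigma instead of a Gaussian, then take the
      # gradient magnitude as an edge map. Illustrative kernel, synthetic image.
      import numpy as np

      def ou_kernel(sigma, radius=None):
          radius = radius or int(4 * sigma)
          t = np.arange(-radius, radius + 1)
          k = np.exp(-np.abs(t) / sigma)           # exp(-|t|/sigma), OU-type decay
          return k / k.sum()

      def ou_filter(img, sigma):
          k = ou_kernel(sigma)
          rows = np.apply_along_axis(np.convolve, 1, img, k, mode="same")
          return np.apply_along_axis(np.convolve, 0, rows, k, mode="same")

      img = np.zeros((128, 128))
      img[32:96, 32:96] = 1.0                       # bright "organ" on a dark background
      smooth = ou_filter(img, sigma=3.0)
      gy, gx = np.gradient(smooth)
      edges = np.hypot(gx, gy)
      print(f"strongest edge response: {edges.max():.3f}")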

  15. Algebraic signal processing theory: 2-D spatial hexagonal lattice.

    PubMed

    Püschel, Markus; Rötteler, Martin

    2007-06-01

    We develop the framework for signal processing on a spatial, or undirected, 2-D hexagonal lattice for both an infinite and a finite array of signal samples. This framework includes the proper notions of z-transform, boundary conditions, filtering or convolution, spectrum, frequency response, and Fourier transform. In the finite case, the Fourier transform is called discrete triangle transform. Like the hexagonal lattice, this transform is nonseparable. The derivation of the framework makes it a natural extension of the algebraic signal processing theory that we recently introduced. Namely, we construct the proper signal models, given by polynomial algebras, bottom-up from a suitable definition of hexagonal space shifts using a procedure provided by the algebraic theory. These signal models, in turn, then provide all the basic signal processing concepts. The framework developed in this paper is related to Mersereau's early work on hexagonal lattices in the same way as the discrete cosine and sine transforms are related to the discrete Fourier transform-a fact that will be made rigorous in this paper.

  16. All-optical signal processing technique for secure optical communication

    NASA Astrophysics Data System (ADS)

    Qian, Feng-chen; Su, Bing; Ye, Ya-lin; Zhang, Qian; Lin, Shao-feng; Duan, Tao; Duan, Jie

    2015-10-01

    Secure optical communication technologies are an important means of providing physical-layer security for optical networks. We present a secure optical communication system based on all-optical signal processing. The scheme consists of three parts: an all-optical signal processing unit, an optical key sequence generator, and a synchronous control unit. The key technology is an all-optical exclusive-OR (XOR) gate based on the optical cross-gain modulation effect, which offers a wide dynamic range of the input optical signal and a simple structure. The all-optical XOR gate, composed of two semiconductor optical amplifiers (SOAs), has a symmetrical structure. By controlling the injection current, input signal power, delay and filter bandwidth, the extinction ratio of the XOR gate can be made greater than 8 dB. Finally, some performance parameters are calculated and the results are analyzed. The simulation and experimental results show that the proposed method achieves encryption and decryption of optical signals at over 10 Gbps, is simple and easy to implement, and is free of error diffusion.
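
    To illustrate the logical principle behind the scheme (independent of the optical implementation), the sketch below XORs a bit stream with a key stream and shows that a second XOR with the same key recovers the data exactly; the key generation and bit length are placeholders.

      # Sketch of the logical principle behind XOR-based encryption: XOR with a key
      # stream scrambles the data, and a second XOR with the same key recovers it.
      # Key generation and bit length are placeholders, not the optical implementation.
      import numpy as np

      rng = np.random.default_rng(7)
      data = rng.integers(0, 2, size=32, dtype=np.uint8)     # plaintext bit stream
      key = rng.integers(0, 2, size=32, dtype=np.uint8)      # stand-in for the optical key sequence

      cipher = np.bitwise_xor(data, key)                     # encryption (XOR gate)
      recovered = np.bitwise_xor(cipher, key)                # decryption (same gate, same key)

      assert np.array_equal(recovered, data)
      print("bit errors after decryption:", int(np.sum(recovered != data)))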

  17. Processing and Development of Nano-Scale HA Coatings for Biomedical Application

    DTIC Science & Technology

    2005-01-01

    thickness of the film has been processed and tested as a more effective orthopedic/dental implant coating. The present study aims to increase the service life of an orthopedic/dental implant by creating materials that form a strong, long-lasting bond with the Ti substrate as well as the juxtaposed bone... dental replacement surgery may quickly return to a normal active lifestyle. Cross-sectional transmission electron microscopy analysis displayed that the...

  18. Research on mud pulse signal data processing in MWD

    NASA Astrophysics Data System (ADS)

    Tu, Bing; Li, De Sheng; Lin, En Huai; Ji, Miao Miao

    2012-12-01

    Wireless measurement while drilling (MWD) transmits data using a mud pulse signal; the ground decoding system collects the mud pulse signal and then decodes and displays the down-hole parameters according to the designed encoding rules. The correct detection and recognition of the received mud pulse signal by the ground decoding system is one of the key technologies of MWD. This paper introduces the Manchester encoding used to transmit data and the format of the wireless transmission of down-hole data, and develops a ground decoding system. The ground decoding algorithm uses FIR (finite impulse response) digital filtering to de-noise the mud pulse signal, then adopts the related base-value modulation algorithm to eliminate the pump pulse base value of the de-noised mud pulse signal, and finally analyzes the mud pulse waveform shape of the selected Manchester encoding over three-bit cycles and applies a pattern-similarity recognition algorithm to mud pulse signal recognition. The field experiment results show that the developed device correctly extracts and recognizes the mud pulse signal with a simple and practical decoding process and meets the requirements of engineering applications.
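
    A compact sketch of the first two front-end steps described above: FIR low-pass de-noising of a Manchester-coded, mud-pulse-like waveform and a simple bit decision made by comparing the two halves of each bit period. The bit period, filter design and noise level are assumptions, and the pump base-value removal and pattern-similarity stages are omitted.

      # Sketch: FIR low-pass de-noising of a mud-pulse-like waveform followed by a
      # simple Manchester bit decision (compare the two halves of each bit period).
      import numpy as np
      from scipy.signal import firwin, filtfilt

      fs, bit_period = 1000, 200                        # Hz, samples per bit (assumed)
      bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])

      # Manchester convention here: '1' -> high/low halves, '0' -> low/high halves
      symbols = [np.r_[np.ones(bit_period // 2), np.zeros(bit_period // 2)] if b
                 else np.r_[np.zeros(bit_period // 2), np.ones(bit_period // 2)]
                 for b in bits]
      signal = np.concatenate(symbols) + 0.4 * np.random.randn(bit_period * bits.size)

      taps = firwin(101, 20 / (fs / 2))                 # FIR low-pass, 20 Hz cutoff
      clean = filtfilt(taps, 1.0, signal)               # zero-phase de-noising

      decoded = []
      for k in range(bits.size):
          first = clean[k * bit_period: k * bit_period + bit_period // 2].mean()
          second = clean[k * bit_period + bit_period // 2: (k + 1) * bit_period].mean()
          decoded.append(1 if first > second else 0)

      print("sent:   ", bits.tolist())
      print("decoded:", decoded)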

  19. Biomedical digital assistant for ubiquitous healthcare.

    PubMed

    Lee, Tae-Soo; Hong, Joo-Hyun; Cho, Myeong-Chan

    2007-01-01

    The concept of ubiquitous healthcare service, which emerged as one of the measures to solve healthcare problems in an aged society, means that patients can receive services such as prevention, diagnosis, therapy and prognosis management at any time and in any place with the help of advanced information and communication technology. This service requires not only a biomedical digital assistant that can continuously monitor the patients' health condition regardless of time and place, but also wired and wireless communication devices and telemedicine servers that provide doctors with data on the patients' present health condition. In order to implement a biomedical digital assistant that is portable and wearable by patients, the present study developed a device that minimizes size, weight and power consumption, measures ECG and PPG signals, and monitors the state of moving patients. The biomedical sensor with wireless communication capability was designed to be highly portable and wearable, to operate for 24 hours on small batteries, and to monitor the subject's heart rate, step count and respiratory rate in daily life. The biomedical signal receiving device was implemented in two forms, a PDA and a cellular phone. The movement monitoring device embedded in the battery pack of a cellular phone has no problem operating for 24 hours, but the real-time biomedical signal receiving device implemented with a PDA operated for up to 6 hours due to the limited battery capacity of the PDA. This problem is expected to be solved by reducing the wireless communication load through improving the processing and storage functions of the sensor. The developed device can transmit a message on the patient's emergency to the remote server through the cellular phone network, and is expected to play a crucial role in the health management of chronic and aged patients in their daily life.

  20. Synthesizing oncogenic signal-processing systems that function as both "signal counters" and "signal blockers" in cancer cells.

    PubMed

    Liu, Yuchen; Huang, Weiren; Zhou, Dexi; Han, Yonghua; Duan, Yonggang; Zhang, Xiaoyue; Zhang, Hu; Jiang, Zhimao; Gui, Yaoting; Cai, Zhiming

    2013-07-01

    RNA-protein interaction plays a significant role in regulating eukaryotic translation. This phenomenon raises questions about the ability of artificial biological systems to take advantage of protein-RNA interaction. Here, we designed an oncogenic signal-processing system expressing both a Renilla luciferase reporter gene controlled by RNA-protein interaction in its 5'-untranslated region (5'-UTR) and a Firefly luciferase normalization gene. To test the ability of the designed system, we then constructed vectors targeting the nuclear factor-κB (NF-κB) or the β-catenin signal. We found that the inhibition (%) of luciferase expression was correlated with the targeted protein content, allowing quantitative measurement of oncogenic signal intensity in cancer cells. The systems inhibited the expression of genes downstream of the oncogenic signals and induced proliferation inhibition and apoptosis in bladder cancer cells without affecting normal urothelial cells. Compared to traditional methods (ELISA and quantitative immunoblotting), the bio-systems provided highly accurate, consistent, and reproducible quantification of protein signals and were able to discriminate between cancerous and non-cancerous cells. In conclusion, the synthetic systems function as both "signal counters" and "signal blockers" in cancer cells. This approach provides a synthetic biology platform for oncogenic signal measurement and cancer treatment.

  1. Ultralow-power electronics for biomedical applications.

    PubMed

    Chandrakasan, Anantha P; Verma, Naveen; Daly, Denis C

    2008-01-01

    The electronics of a general biomedical device consist of energy delivery, analog-to-digital conversion, signal processing, and communication subsystems. Each of these blocks must be designed for minimum energy consumption. Specific design techniques, such as aggressive voltage scaling, dynamic power-performance management, and energy-efficient signaling, must be employed to adhere to the stringent energy constraint. The constraint itself is set by the energy source, so energy harvesting holds tremendous promise toward enabling sophisticated systems without straining user lifestyle. Further, once harvested, efficient delivery of the low-energy levels, as well as robust operation in the aggressive low-power modes, requires careful understanding and treatment of the specific design limitations that dominate this realm. We outline the performance and power constraints of biomedical devices, and present circuit techniques to achieve complete systems operating down to power levels of microwatts. In all cases, approaches that leverage advanced technology trends are emphasized.

  2. Multichannel optical signal processing using sampled fiber Bragg gratings

    NASA Astrophysics Data System (ADS)

    Zhang, Guiju; Wang, Chinhua; Zhu, Xiaojun

    2008-12-01

    Sampled and linearly chirped fiber Bragg gratings provide multiple wavelength responses and linear group delays (constant dispersions) within each of the wavelength channels. We show that sampled and chirped fiber Bragg gratings can be used to perform multiwavelength signal processing. In particular, we demonstrate, by numerical simulation, their use for performing real-time Fourier transform (RTFT) and pulse repetition rate multiplication (PRRM) simultaneously over multiple wavelength channels. To demonstrate how sampled fiber Bragg gratings perform multichannel optical signal processing, a 9-channel sampled fiber grating with 100 GHz channel spacing was designed, and the effect of ripples in both the amplitude and the group delay of each channel on the performance of the signal processing was examined and discussed.

  3. Statistical signal processing for an implantable ethanol biosensor.

    PubMed

    Han, Jae-Joon; Doerschuk, Peter C; Gelfand, Saul B; O'Connor, Sean J

    2006-01-01

    The understanding of drinking patterns leading to alcoholism has been hindered by an inability to unobtrusively measure ethanol consumption over periods of weeks to months in the community environment. Signal processing is described for an implantable MEMS ethanol biosensor under simultaneous development; together, the sensor and signal processing system will provide a novel approach to this need. For safety and user-acceptability reasons, the sensor will be implanted subcutaneously and therefore measure peripheral-tissue ethanol concentration. A statistical signal processing system based on detailed models of the physiology, using extended Kalman filtering and dynamic programming tools, is described which determines ethanol consumption and kinetics in other compartments from the time course of peripheral-tissue ethanol concentration.
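
    A much-simplified sketch in the same spirit: a scalar Kalman filter tracking tissue ethanol concentration under first-order elimination kinetics from noisy sensor readings. The kinetic constants and noise variances are invented, and the published system uses an extended Kalman filter over a detailed physiological model rather than this toy scalar case.

      # Much-simplified sketch: a scalar Kalman filter tracking tissue ethanol
      # concentration with first-order elimination kinetics and noisy sensor readings.
      import numpy as np

      rng = np.random.default_rng(3)
      dt, k_elim = 60.0, 1e-4                 # 1-min steps, elimination rate (1/s), assumed
      a = np.exp(-k_elim * dt)                # discrete-time decay factor
      q, r = 1e-4, 0.05 ** 2                  # process and measurement noise variances

      # simulate a "true" concentration and noisy sensor readings (arbitrary units)
      n = 240
      true = np.empty(n); true[0] = 0.8
      for k in range(1, n):
          true[k] = a * true[k - 1] + rng.normal(0, np.sqrt(q))
      meas = true + rng.normal(0, np.sqrt(r), size=n)

      # scalar Kalman filter: predict with the kinetic model, correct with the sensor
      x, p = 0.0, 1.0
      est = np.empty(n)
      for k in range(n):
          x, p = a * x, a * a * p + q                       # predict
          gain = p / (p + r)                                # Kalman gain
          x, p = x + gain * (meas[k] - x), (1 - gain) * p   # update
          est[k] = x

      print(f"RMS error, raw sensor : {np.sqrt(np.mean((meas - true) ** 2)):.3f}")
      print(f"RMS error, Kalman est.: {np.sqrt(np.mean((est - true) ** 2)):.3f}")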

  4. Parallel Signal Processing and System Simulation using aCe

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2003-01-01

    Recently, networked and cluster computation have become very popular for both signal processing and system simulation. This new language is ideally suited for parallel signal processing applications and system simulation since it allows the programmer to explicitly express the computations that can be performed concurrently. In addition, the new C-based parallel language for architecture-adaptive programming, aCe C, allows programmers to implement algorithms and system simulation applications on parallel architectures with the assurance that future parallel architectures will be able to run their applications with a minimum of modification. In this paper, we focus on some fundamental features of aCe C and present a signal processing application (FFT).

  5. Detectors and signal processing for high-energy physics

    SciTech Connect

    Rehak, P.

    1981-01-01

    Basic principles of the particle detection and signal processing for high-energy physics experiments are presented. It is shown that the optimum performance of a properly designed detector system is not limited by incidental imperfections, but solely by more fundamental limitations imposed by the quantum nature and statistical behavior of matter. The noise sources connected with the detection and signal processing are studied. The concepts of optimal filtering and optimal detector/amplifying device matching are introduced. Signal processing for a liquid argon calorimeter is analyzed in some detail. The position detection in gas counters is studied. Resolution in drift chambers for the drift coordinate measurement as well as the second coordinate measurement is discussed.
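
    A small sketch of the optimal-filtering concept mentioned above: correlating a noisy detector trace with the known pulse template (a matched filter) maximizes the signal-to-noise ratio for white noise, and the correlation peak marks the event time. The pulse shape, noise level and event position are arbitrary illustrative choices.

      # Sketch of the optimal-filtering idea: correlate a noisy detector trace with
      # the known pulse template (matched filter); the correlation peak marks the
      # event time and maximizes SNR for white noise.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 2000
      t = np.arange(200)
      template = (t / 20.0) * np.exp(1 - t / 20.0)          # known detector pulse shape

      trace = 0.5 * rng.standard_normal(n)                  # white electronic noise
      t0 = 1200
      trace[t0:t0 + template.size] += 0.8 * template        # buried event at sample 1200

      matched = np.correlate(trace, template, mode="valid") # matched filtering
      print("estimated event time:", int(np.argmax(matched)), "(true 1200)")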

  6. Bicoid signal extraction with a selection of parametric and nonparametric signal processing techniques.

    PubMed

    Ghodsi, Zara; Silva, Emmanuel Sirimal; Hassani, Hossein

    2015-06-01

    The maternal segmentation coordinate gene bicoid plays a significant role during Drosophila embryogenesis. The gradient of Bicoid, the protein encoded by this gene, determines most aspects of head and thorax development. This paper seeks to explore the applicability of a variety of signal processing techniques for extracting the bicoid expression signal, and whether these methods can outperform the current model. We evaluate the use of six different powerful and widely used models, representing both parametric and nonparametric signal processing techniques, to determine the most efficient method for extracting the bicoid signal. The results are evaluated using both real and simulated data. Our findings show that the Singular Spectrum Analysis technique proposed in this paper outperforms the synthesis diffusion degradation model for filtering the noisy protein profile of bicoid, whilst the exponential smoothing technique was found to be the next best alternative, followed by the autoregressive integrated moving average.
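
    A compact sketch of basic Singular Spectrum Analysis as applied to a noisy one-dimensional profile: Hankel embedding, SVD, and reconstruction from the leading components by diagonal averaging. The window length, component count and the synthetic "profile" are illustrative and not the bicoid data or the authors' exact settings.

      # Compact sketch of basic Singular Spectrum Analysis (SSA): Hankel embedding,
      # SVD, and reconstruction from the leading components by diagonal averaging.
      import numpy as np

      rng = np.random.default_rng(2)
      x = np.linspace(0, 1, 200)
      profile = np.exp(-x / 0.2)                      # smooth, gradient-like signal
      noisy = profile + 0.05 * rng.standard_normal(x.size)

      L = 40                                          # SSA window length
      K = noisy.size - L + 1
      traj = np.column_stack([noisy[i:i + L] for i in range(K)])   # L x K Hankel matrix
      U, s, Vt = np.linalg.svd(traj, full_matrices=False)

      r = 2                                           # keep the leading components
      approx = (U[:, :r] * s[:r]) @ Vt[:r, :]

      # diagonal averaging (Hankelization) back to a 1-D series
      recon = np.zeros(noisy.size)
      count = np.zeros(noisy.size)
      for i in range(L):
          for j in range(K):
              recon[i + j] += approx[i, j]
              count[i + j] += 1
      recon /= count

      print(f"RMS error, noisy : {np.sqrt(np.mean((noisy - profile) ** 2)):.4f}")
      print(f"RMS error, SSA   : {np.sqrt(np.mean((recon - profile) ** 2)):.4f}")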

  8. Why optics students should take digital signal processing courses and why digital signal processing students should take optics courses

    NASA Astrophysics Data System (ADS)

    Cathey, W. Thomas, Jr.

    2000-06-01

    This paper is based on the claim that future major contributions in the field of imaging systems will be made by those who have a background in both optics and digital signal processing. As the introduction of Fourier transforms and linear systems theory to optics had a major impact on the design of hybrid optical/digital imaging systems, the introduction of digital signal processing into optics programs will have a major impact. Examples are given of new hybrid imaging systems that have unique performance. By jointly designing the optics and the signal processing in a digital camera, a new paradigm arises where aberration balancing takes into consideration not only the number of surfaces and indices of refraction, but also the processing capability.

  9. Distributed Signal Processing for Wireless EEG Sensor Networks.

    PubMed

    Bertrand, Alexander

    2015-11-01

    Inspired by ongoing evolutions in the field of wireless body area networks (WBANs), this tutorial paper presents a conceptual and exploratory study of wireless electroencephalography (EEG) sensor networks (WESNs), with an emphasis on distributed signal processing aspects. A WESN is conceived as a modular neuromonitoring platform for high-density EEG recordings, in which each node is equipped with an electrode array, a signal processing unit, and facilities for wireless communication. We first address the advantages of such a modular approach, and we explain how distributed signal processing algorithms make WESNs more power-efficient, in particular by avoiding data centralization. We provide an overview of distributed signal processing algorithms that are potentially applicable in WESNs, and for illustration purposes, we also provide a more detailed case study of a distributed eye blink artifact removal algorithm. Finally, we study the power efficiency of these distributed algorithms in comparison to their centralized counterparts in which all the raw sensor signals are centralized in a near-end or far-end fusion center.

  10. Influence of process parameters on plasma electrolytic surface treatment of tantalum for biomedical applications

    NASA Astrophysics Data System (ADS)

    Sowa, Maciej; Woszczak, Maja; Kazek-Kęsik, Alicja; Dercz, Grzegorz; Korotin, Danila M.; Zhidkov, Ivan S.; Kurmaev, Ernst Z.; Cholakh, Seif O.; Basiaga, Marcin; Simka, Wojciech

    2017-06-01

    This work aims to quantify the effect of the anodization voltage and electrolyte composition used during DC plasma electrolytic oxidation (PEO), operated as a 2-step process, on the surface properties of the resulting oxide coatings on tantalum. The first step consisted of galvanostatic anodization (150 mA cm⁻²) of the tantalum workpiece up to several limiting voltages (200, 300, 400 and 500 V). After attaining the limiting voltage, the process was switched to voltage control, which resulted in a gradual decrease of the anodic current density. The anodic treatment was realized in a 0.5 M Ca(H2PO2)2 solution, which was then modified by the addition of 1.15 M Ca(HCOO)2 as well as 1.15 M and 1.5 M Mg(CH3COO)2. Increasing the anodization voltage led to the formation of thicker coatings, with larger pores and greater enrichment with electrolyte species. The solutions containing HCOO⁻ and CH3COO⁻ ions caused the formation of coatings that were slightly hydrophobic (high contact angle). In the case of the samples anodized up to 500 V, scattered crystalline deposits were observed. Bioactive phases, such as hydroxyapatite, were detected in the treated oxide coatings by XRD and XPS.

  11. ISLE (Image and Signal Processing LISP Environment) reference manual

    SciTech Connect

    Sherwood, R.J.; Searfus, R.M.

    1990-01-01

    ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person developing image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which there is a need to develop a processing algorithm in an interactive manner. It is straightforward to develop an algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than developing a command interpreter. A COMMON LISP interpreter was selected for the command interpreter because it already has the features desired in a command interpreter, it supports dynamic loading of modules for customization purposes, it supports run-time parameter and argument type checking, it is very well documented, and it is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. The full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.

  12. SAR image enhancement via post-correlation signal processing

    NASA Technical Reports Server (NTRS)

    Matthews, N. D.; Kaupp, V. H.; Waite, W. P.; Macdonald, H. C.

    1984-01-01

    Seventeen interpreters ranked sets of computer-generated radar imagery to assess the value of post-correlation processing on the interpretability of SAR (synthetic aperture radar) imagery. The post-correlation processing evaluated amounts to a nonlinear mapping of the signal exiting a digital correlator and allows full use of signal bandwidth for improving the spatial resolution or for noise reduction. The results indicate that it is reasonable to hypothesize an optimal SAR presentation format for specific applications even though this study was too limited to be specific.

  13. Smart Microfluidic Electrochemical DNA Sensors with Signal Processing Circuits

    NASA Astrophysics Data System (ADS)

    Sawada, Kazuaki; Oda, Chigusa; Takao, Hidekuni; Ishida, Makoto

    2007-05-01

    A smart microfluidic DNA sensor with an integrated signal-processing circuit for electrochemical analysis has been successfully fabricated. The sensor comprises an integrated electrochemical sensing electrode, a microfluidic channel-type reactor, and operational amplifiers for electrochemical measurement. The microfluidic reactor employs a laminar flow principle. Generally, a relatively large and expensive system is necessary for electrochemical measurement. In the fabricated smart chip, signal-processing circuits for measuring cyclic-voltammogram characteristics are integrated, permitting cyclic-voltammograms to be successively measured, using only two simple sources of electrical power.

  14. The Savant Hypothesis: is autism a signal-processing problem?

    PubMed

    Fabricius, Thomas

    2010-08-01

    Autism is being investigated through many different approaches. This paper suggests that the genetic, perceptual, cognitive, and histological findings ultimately manifest themselves as variations of the same signal-processing problem of defective compression. The Savant Hypothesis is formulated from first principles of both mathematical signal processing and primary neuroscience to reflect the failure of compression. The Savant Hypothesis is applied to the problem of autism in a surprisingly straightforward way. The enigma of the autistic savant becomes intuitive when observed from this approach. Copyright 2010 Elsevier Ltd. All rights reserved.

  15. Using image processing techniques on proximity probe signals in rotordynamics

    NASA Astrophysics Data System (ADS)

    Diamond, Dawie; Heyns, Stephan; Oberholster, Abrie

    2016-06-01

    This paper proposes a new approach to processing proximity probe signals in rotordynamic applications. It is argued that the signal can be interpreted as a one-dimensional image. Existing image processing techniques can then be used to gain information about the object being measured. Some results from one application are presented. Rotor blade tip deflections can be calculated by localizing phase information in this one-dimensional image. It is experimentally shown that the newly proposed method performs more accurately than standard techniques, especially where the sampling rate of the data acquisition system is inadequate by conventional standards.

  16. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

    Jackson, G. P.; Elliott, A.

    1992-01-01

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  18. Signal Processing For Chemical Sensing: Statistics or Biological Inspiration

    NASA Astrophysics Data System (ADS)

    Marco, Santiago

    2011-09-01

    Current analytical instrumentation and continuous sensing can provide huge amounts of data. Automatic signal processing and information evaluation are needed to avoid drowning in data. Today, statistical techniques are typically used to analyse and extract information from continuous signals. However, it is very interesting to note that biology (insects and vertebrates) has found alternative solutions for chemical sensing and information processing. This is a brief introduction to the developments in the European FP7 project NEUROCHEM: Biologically Inspired Computation for Chemical Sensing (Bio-ICT, grant no. 216916), which is devoted to biomimetic olfactory systems.

  19. Formal ontology for natural language processing and the integration of biomedical databases.

    PubMed

    Simon, Jonathan; Dos Santos, Mariana; Fielding, James; Smith, Barry

    2006-01-01

    The central hypothesis underlying this communication is that the methodology and conceptual rigor of a philosophically inspired formal ontology can bring significant benefits in the development and maintenance of application ontologies [A. Flett, M. Dos Santos, W. Ceusters, Some Ontology Engineering Procedures and their Supporting Technologies, EKAW2002, 2003]. This hypothesis has been tested in the collaboration between Language and Computing (L&C), a company specializing in software for supporting natural language processing especially in the medical field, and the Institute for Formal Ontology and Medical Information Science (IFOMIS), an academic research institution concerned with the theoretical foundations of ontology. In the course of this collaboration L&C's ontology, LinKBase, which is designed to integrate and support reasoning across a plurality of external databases, has been subjected to a thorough auditing on the basis of the principles underlying IFOMIS's Basic Formal Ontology (BFO) [B. Smith, Basic Formal Ontology, 2002. http://ontology.buffalo.edu/bfo]. The goal is to transform a large terminology-based ontology into one with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase are standardized in the framework of first-order logic. In this paper we describe how this principles-based standardization has led to a greater degree of internal coherence of the LinKBase structure, and how it has facilitated the construction of mappings between external databases using LinKBase as translation hub. We argue that the collaboration here described represents a new phase in the quest to solve the so-called "Tower of Babel" problem of ontology integration [F. Montayne, J. Flanagan, Formal Ontology: The Foundation for Natural Language Processing, 2003. http://www.landcglobal.com/].

  20. Biomedical research

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Biomedical problems encountered by man in space which have been identified as a result of previous experience in simulated or actual spaceflight include cardiovascular deconditioning, motion sickness, bone loss, muscle atrophy, red cell alterations, fluid and electrolyte loss, radiation effects, radiation protection, behavior, and performance. The investigations and the findings in each of these areas were reviewed. A description of how biomedical research is organized within NASA, how it is funded, and how it is being reoriented to meet the needs of future manned space missions is also provided.